Apply for, replace or renew a tachograph driver card/company card beta assessment

The report for the Apply for, replace or renew a tachograph driver card/company card beta assessment on 28 April 2021

Service Standard assessment report

Apply for, replace or renew a tachograph driver card/company card

From: Central Digital & Data Office (CDDO)
Assessment date: 28/04/2021
Stage: Beta
Result: Met
Service provider: Driver and Vehicle Licensing Agency

Previous assessment reports

Apply for, replace or renew a tachograph driver card/company card alpha assessment

Service Description

The service will allow users to apply for a first driver tachograph card online, to complement the existing paper and telephony channels. Customers will also be able to use the service to renew driver tachograph cards that are about to expire, or to replace driver cards that have been lost, damaged or stolen, or that have malfunctioned.

Overview of service users

  • Bus and/or coach drivers
  • HGV/Lorry drivers

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team employed a range of methods for gathering user insights throughout the development of the service
  • the team strove to be as inclusive as possible in their user research, including testing with users with accessibility needs and across a range of ages and geographic locations
  • the team considered assisted digital needs and used a digital inclusion scale to ensure their research was as inclusive as possible
  • the team used personas and refined them since the Alpha phase
  • the whole team was involved in user research and analysis

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to test with, and gather feedback from, a wider group of users throughout Public Beta
  • continue to refine the personas for use throughout further iterations
  • continue to deepen their understanding of the accessibility needs of the users of this service

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team considered alternatives to creating a service by starting their exploration from the underlying user need for a physical card
  • the team demonstrated iterative design with refinements based on user feedback
  • the team considered the wider journey and the paper counterpart to this service

What the team needs to explore

Before their next assessment, the team needs to:

  • demonstrate a detailed understanding of the service landscape through a service landscape map, showing everything from both the user’s and the organisation’s perspectives
  • demonstrate how their KPIs relate to user needs and user experience
  • work with a content designer to refine error messages and test these with users

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the service has made use of established alternative mechanisms for users to contact DVLA in the event of a problem, e.g. webchat. This reduces demand on the telephony channel and can improve the overall user experience, depending on user preference
  • the team tested and refined the forms associated with this service
  • the team considered the end-to-end journey and ensured that internal staff have easy access to customer and order information so that they can best assist users with any enquiries

What the team needs to explore

Before their next assessment, the team needs to:

  • demonstrate a detailed understanding of the end-to-end user journey across channels. Provide an end-to-end user journey map detailing all channels for the service, including pain points and opportunities
  • regularly work with a content designer to ensure that the language throughout the journey is user-centred and consistent - especially error messages
  • consider managing user expectations by communicating time frames and next steps to users as they are making their application and also in the confirmation email
  • consider their approach to consent and being more transparent about how users’ data will be used across the end-to-end journey
  • ensure all links are logical and take users to the right place - especially when assistance is needed (e.g. when no advisor is available, the contact link takes users to the GOV.UK tachograph page, where the contact information is buried within the page)
  • identify any remaining opportunities to allow users to complete the service journey digitally rather than requiring the user to post a form. The panel acknowledges that forms may be required in journeys where, for instance, a user must send back a tachograph by post

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has built an internal service for staff to use that reuses designs and common patterns, and is clearly laid out so that minimal training is required
  • the team has used Google Analytics to identify pain points within the service, and user satisfaction surveys across channels for further feedback
  • design system patterns and components have been used within the service, e.g. GOV.UK table for listing licence details, date input for date of birth, and the details component for help information
  • user research was undertaken to understand how a user may find the service, including search terms which may be entered on external search engines or on GOV.UK
  • while some content and design issues were identified, these are largely minor, and the team demonstrated an effective ability to iterate the service and correct identified problems rapidly

What the team needs to explore

Before their next assessment, the team needs to:

  • consider all findings of the content review and implement the ‘Must implement’ recommendations before scaling out the service further
  • further explore and user test the start page content, service guidance, and GOV.UK routing into the service. The start page should include details of what the user will need to have ready when starting their application
  • further explore GDS patterns and components that could be designed into the service, for example the ‘Contact a department or service team’ pattern where contact details are needed, and the payment page
  • consider bringing drop out page content headers in line with the GDS style guide (the GDS panel component is noted as not being suitable for highlighting important information within the body content)
  • consider using the standard GOV.UK body paragraph style rather than xsmall on the ‘I need to talk to an advisor’ summary text
  • consider providing more information to users about why certain requests cannot be processed online and therefore require a paper form
  • review the content to ensure that they are accounting for incorrect data entry in the error pages. As an example, a user may reach a page which informs them that they do not have a valid GB licence entitlement - it would be worth confirming whether such pages could be reached through accidental errors in the data submitted and, if so, the user should be asked to check the information they have provided
  • avoid acronyms where possible and ensure that any acronyms that are used are explained to the user, including country codes, e.g. GB
  • ensure that all elements of the service follow GOV.UK design patterns. The panel noted that some or all of the paper forms are hosted on a Department for Transport branded site and this may be confusing to users

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has tested designs against a range of personas, and with users with a range of accessibility needs
  • user testing and design iterations have been carried out across offline channels as well as within the online service
  • the service has been user tested on a range of devices, across features and iterations

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that users don’t require a channel switch to complete a transaction within the service. Users should not be excluded from completing a transaction because they lack access to a printer
  • clarify the contact process for users who can’t complete the service online; clearly surface the contact details in guidance and at all drop out points within the service, including operating hours for both telephone and webchat
  • implement the findings of the Digital Accessibility Audit to the point of at least full WCAG AA accessibility compliance prior to scaling the service further (an illustrative sketch of one automated checking approach follows this list)
  • ensure the web chat functionality is accessibility audited and any remaining defects preventing WCAG AA accessibility compliance are resolved prior to scaling the service further
  • provide all users with links to accessible versions of the paper forms within the service prior to scaling the service further
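
The report does not name the audit tooling the team used. As an illustration only, the sketch below shows how automated WCAG 2.0/2.1 A and AA rule checks could be run in the browser with the open source axe-core library; the library choice and the page under test are assumptions, not a description of the team’s audit.

```typescript
// Illustrative sketch only: automated WCAG A/AA rule checks with axe-core.
// The library choice is an assumption and does not reflect the team's tooling.
import axe from "axe-core";

async function checkAccessibility(): Promise<void> {
  // Run only the rules tagged as WCAG 2.0/2.1 level A and AA.
  const results = await axe.run(document, {
    runOnly: { type: "tag", values: ["wcag2a", "wcag2aa", "wcag21a", "wcag21aa"] },
  });

  // Log each violation and the elements it affects, so defects can be triaged.
  for (const violation of results.violations) {
    console.log(`${violation.id} (${violation.impact}): ${violation.help}`);
    for (const node of violation.nodes) {
      console.log(`  affected element: ${node.target.join(", ")}`);
    }
  }
}

checkAccessibility();
```

Automated rule checks only cover a subset of WCAG success criteria, so they complement rather than replace the manual audit findings referenced above.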

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is, for the most part, made up of the digital roles which would be expected for the development of a digital service using agile methodologies
  • the team has support from a wide range of subject matter experts from other disciplines which are crucial to developing an effective Government service but which may not always merit full-time membership of the team, e.g. policy and legal
  • the team has sought an appropriate level of senior support for the service and has the buy-in from key external stakeholders

What the team needs to explore

Before their next assessment, the team needs to:

  • consider bringing a content designer on board to review content at future major iteration points
  • ensure that the team remains sufficiently scaled to manage the Public Beta phase and the possible increase in user need identification which will result from this

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • an effective squad structure has been established
  • decision making appears to be effectively devolved from senior leadership, with the necessary buy-in from leadership to clear blockers
  • the team has embedded the use of an appropriate set of agile delivery tooling such as Jira

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • there is clear, regular iteration in the service through the established agile ways of working
  • the team have demonstrated a clear improvement flow between the findings of their Private Beta and the iteration which has resulted in the service

What the team needs to explore

Before their next assessment, the team needs to:

  • consider more deeply embedding the learnings they acquire from performance analytics into their iteration flows. This will be particularly important as the service enters Public Beta

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have conducted recent CHECK penetration testing of the service and have remedied critical and high importance recommendations made during that exercise to achieve a successful retest
  • the team will repeat periodic, formal health check assessments, and will undertake full penetration testing in response to any major architectural change
  • the team has a dedicated internal cybersecurity function available, which carries out continuous checks to ensure the service remains secure
  • the team implements a range of vulnerability scanning tooling, including static code analysis, container vulnerability scanning, and OWASP compliance testing to ensure that any code deployed is secure
  • the team has a well-established and clear escalation policy for dealing with security incidents
  • the team are very clear about the risks associated with fraudulent misuse of the tachograph system and have long-standing procedures in place for mitigating those risks

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that the intended modernisation of the ‘Identity’ utility, which provides authentication services for the tachograph service, is thoroughly tested so that it does not introduce any new vulnerabilities to any associated services

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had identified a good set of Key Performance Indicators (KPIs) beyond the mandatory four
  • the team showed how changes to the KPIs could be used to identify pain points and potentially make improvements to the service
  • the team used existing service monitoring data sources to help decide on the KPIs
  • the team has carefully considered the KPIs needed, using requirements gathered from service managers and other data stakeholders
  • the team is using Google Analytics to build usage funnels that help identify pain points on the service
  • the team ensures no Personally Identifiable Information (PII) is collected and passed to Google. They have also anonymised the IP address on the Google Analytics tags (an illustrative configuration sketch follows this list)
  • the Performance Analytics team have provided an excellent suite of reporting tools, ensuring the team can self-serve their reporting needs and carry out more proactive analysis
  • there are regular touchpoints between the service and analytics teams to flag insights, and there is also an opportunity to share insights on an ad hoc basis
  • the team’s cookie policy does not collect any personal data. They are in a good position to understand the effect of using an opt-in cookie compliance policy, as they have back-end data which can be compared with Google Analytics front-end behaviour data - this has not been implemented yet but is in the pipeline. The team worked with their information assurance team to ensure they are compliant with all relevant legislation with regard to cookies
  • the team’s Senior Information Risk Owner (SIRO) has also signed off on using Google Analytics as the analytics tool
  • the organisation already has an account on data.gov.uk and the team have had discussions with them to understand their responsibilities for providing the mandatory KPI data on data.gov.uk
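
As an illustration of the IP anonymisation mentioned above, the sketch below shows how it is typically enabled in a gtag.js page tag. The measurement ID is a placeholder and the team’s actual tag configuration is not described in this report.

```typescript
// Illustrative sketch only: enabling IP anonymisation in a gtag.js page tag.
// "UA-XXXXXXX-X" is a placeholder, not the service's real property ID.
declare function gtag(...args: unknown[]): void;

gtag("js", new Date());
gtag("config", "UA-XXXXXXX-X", {
  // Ask Google Analytics to anonymise the sender's IP address before it is stored.
  anonymize_ip: true,
});
```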

What the team needs to explore

Before their next assessment, the team needs to:

  • consider that, while they have a very good set of KPIs, it is not clear how these map to understanding whether the service is meeting its user needs, and how they feed into using data to make improvements to the service. It would be good if the team produced a performance framework to help map KPIs to user needs (an illustrative sketch follows this list)
  • consider if the performance framework referenced above would also help to identify benchmarks for the team
  • consider that a performance framework could also help the team identify additional user needs which could then inform future iteration of the service
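
As a purely illustrative sketch of what such a performance framework could record, the structure below maps each KPI to the user need it evidences, a benchmark and a data source. The KPI names, needs and targets are hypothetical examples, not the team’s framework.

```typescript
// Purely illustrative sketch: mapping KPIs to user needs and benchmarks.
// The entries below are hypothetical examples, not the team's actual KPIs.
interface KpiMapping {
  kpi: string;        // what is measured
  userNeed: string;   // the user need this KPI evidences
  benchmark: string;  // the target or baseline to compare against
  dataSource: string; // where the measurement comes from
}

const performanceFramework: KpiMapping[] = [
  {
    kpi: "Online completion rate",
    userNeed: "Apply for or renew a card without posting a paper form",
    benchmark: "At or above the rate observed during Private Beta",
    dataSource: "Google Analytics funnel",
  },
  {
    kpi: "Calls to the telephony channel per 1,000 applications",
    userNeed: "Resolve problems without switching channel",
    benchmark: "Downward trend across Public Beta",
    dataSource: "Contact centre reporting",
  },
];
```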

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has an established technology stack in use across their digital services that ensures they will continue to have the right skills and experience in place as the service evolves
  • the team intends to revisit their technology choices to ensure they remain the correct choices. An example given is their intention to explore the use of TypeScript within the service
  • the team have an extremely clear vision for this service and other services across the DVLA digital estate that emphasises both modularity and reuse, meaning future services should be relatively easy to deploy and integrate, and existing services should be simpler to keep modern, particularly now that the migration from a legacy monolith system is complete

What the team needs to explore

Before their next assessment, the team needs to:

  • conduct periodic reviews of their technology stack to ensure that their choices remain fit for purpose

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have a clear steer and set of criteria for determining whether, and which, new repositories can be made open source, whilst operating in a Critical National Infrastructure environment which means it may not always be appropriate to code in the open
  • the team have made efforts to identify and open-source certain repositories, for example their DVLA Report Portal work: https://github.com/dvla/dvla-reportportal-ruby

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that the policy and criteria for determining whether new repositories can be made open source are subject to regular review, to ensure that any concerns or constraints are still valid and current
  • continue to work with GDS and the National Cyber Security Centre (NCSC) to ensure that they are striking the right balance between their necessary code security constraints, and the benefits that open source coding can bring

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have made extensive use of the components of the GOV.UK Design System
  • the team has supported the Ruby community by contributing code to a Ruby configuration gem
  • the team intends to share their experience of migrating from a monolithic architecture to a more flexible microservice based architecture with the wider development community
  • the team are using a contract-first approach to API design based on OpenAPI (an illustrative consumer-side sketch follows this list)
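
As an illustration of what a contract-first approach can look like from a consumer’s point of view, the sketch below hand-writes the types an OpenAPI contract would define and builds a small typed client against them. The endpoint path, fields and base URL are hypothetical and do not describe the team’s actual API.

```typescript
// Illustrative sketch only: consuming a contract-first API. These types stand
// in for types generated from an OpenAPI document; the endpoint path, fields
// and base URL are hypothetical, not the team's actual API.
interface CardApplication {
  applicationId: string;
  cardType: "driver" | "company";
  status: "received" | "processing" | "dispatched";
}

// A thin typed client: the contract, rather than the server code, drives the
// types the consumer compiles against.
async function getCardApplication(
  baseUrl: string,
  applicationId: string
): Promise<CardApplication> {
  const response = await fetch(`${baseUrl}/applications/${applicationId}`, {
    headers: { Accept: "application/json" },
  });
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`);
  }
  return (await response.json()) as CardApplication;
}
```

Because both the server implementation and clients like this are written against the same contract, a change to the OpenAPI document surfaces as a compile-time or contract-test failure rather than a runtime surprise.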

What the team needs to explore

Before their next assessment, the team needs to:

  • explore additional opportunities to share their experiences with communities across Government and beyond, given the success of the legacy migration work
  • continue to be mindful of opportunities to contribute to open standards initiatives

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • high user satisfaction was reported during Private Beta
  • the team has an established and proven scale-up capability, managed by their dedicated Cloud function, that should ensure the service can handle volumes of use beyond those expected
  • the team are cognisant of the service’s criticality to users’ livelihoods and maintain multi-channel avenues so users can still complete their journey in the event of an extended period of downtime to the service
  • the team has ensured that legally required information is shared with the European Union and vice versa. The team has also done well to ensure that this information sharing is clearly explained to the user
  • the team has developed plans for how to respond should a significant issue affecting the service be detected including a clear communications strategy

What the team needs to explore

Before their next assessment, the team needs to:

  • seek to encourage user input into satisfaction surveys during Public Beta and monitor these statistics closely, given the low number of users who responded during Private Beta

Updates to this page

Published 23 August 2022