Pay a Clean Air Zone charge alpha assessment report

The report for Defra's Pay a Clean Air Zone charge alpha assessment on 8 October 2019.

From: Central Digital and Data Office
Assessment date: 08/10/2019
Stage: Alpha
Result: Met
Service provider: Defra

Service description

Clean Air Zones (CAZ) are physical areas that a number of local authorities will implement to improve air quality, enhance health benefits and drive economic growth. A number of these local authorities will be implementing ‘Charging’ Clean Air Zones as a measure to meet these goals.

The end-to-end service is the simplest way for vehicle users affected by Clean Air Zones to understand their responsibilities and be encouraged to change their behaviour; or make payment for driving through a zone if their vehicle fails to meet emission standards.

Service users

Primary users

Individual Owner Driver (IOD)

  • Clear, consistent, trusted communications; what is a CAZ, how it impacts me
  • Check whether I need to pay a charge
  • Understand CAZ boundaries
  • Understand how to avoid a penalty charge
  • Get customer support, if and when needed, to comply with CAZ scheme
  • Receive proof of payment

IOD – Frequent user

  • Efficient process for frequent payment of CAZ charge
  • Easy to self-manage; means to verify if I have a charge to pay
  • Easily track which trips I’ve paid for
  • Easily find evidence of how my payment to the CAZ is continually making a difference to air quality

IOD – Multi CAZ

  • Clear communication of rules and requirements of CAZs used
  • Streamlined process to make payment
  • Easily track which trips I’ve paid for

IOD – Infrequent

  • Awareness of CAZ and how to comply to avoid penalty charge notices (PCN)
  • Check CAZ boundary offline/on the go if not connected
  • Get offline support when needed if English is not my first language

Assisted Digital vehicle driver

  • Check my vehicle or pay a charge with assistance throughout the process
  • Be notified by text when charged, as I might forget to check whether I have been in a zone
  • Be able to call someone to verify if a charge is outstanding
  • Be notified of a charge owing 12 hours before the window to pay closes

Fleet Manager

  • Trusted, up-to-date information to make fleet compliant
  • Avoid paying a penalty charge if a vehicle makes an unplanned entry into a zone
  • Single payment across multiple CAZs
  • Flexibility to swap in a hire or replacement vehicle if one breaks down
  • Clearly sign-posted and simple refund process
  • Accurate payment reports with filters or API to integrate with internal systems

Secondary users

Partners

  • Control over local CAZ-related communications and content, with a seamless link between local and central government content online (including the options available)
  • Clear division of labour between local and central government for support (for example, local government supporting local CAZ issues, like mitigations and tariffs, and central government dealing with digital service queries and assisted digital) to be able to manage budgets
  • Keeping all stakeholders satisfied; press, businesses and private users (be clear what is expected of them to successfully comply)
  • Access to overall scheme performance – key data as well as detailed analytics
  • Control over the flexibility of the scheme (suspension, geographic parameters and updating whitelists, for example) and its link to local strategy
  • Visibility of the payment process, but not actually managing it
  • Understanding the implications of GDPR – who owns the data

Service Owner

  • Ability to answer and resolve public queries about a user's vehicle, including inaccuracies in data stored with the DVLA
  • Direct users to the offline service should the system be unavailable.

1. Understand user needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team identified and engaged with a range of users impacted by the service
  • they recognised that some user needs are not in line with policy, for example, fleet users and the payment period
  • the team appreciates the need to undertake further work and has a plan for researching their most risky assumptions early in the next stage
  • they are working closely with social researchers to influence policy decisions as appropriate
  • the team have been learning from other schemes, including similar schemes in London, to understand the potential effect they can expect their service to have on user behaviour
  • the team were able to show how their understanding of their users had evolved as research continued, including how the increasing maturity of their understanding of users who need assistance with using online services (AD users) has allowed them to develop a non-digital support model.

What the team needs to explore

Before their next assessment, the team needs to:

  • consider how they will research with users with a wide range of accessibility needs. The team explained that they planned to undertake an element of accessibility research every other sprint (although the plan shown to the panel seemed to focus on accessibility audits) and to undertake assisted digital research more regularly. The panel is confident that the team understand the difference between assisted digital users and users with accessibility needs, and that both need to be researched with, but reminds the team that accessibility audits will not be sufficient
  • reconsider the overarching user needs they have identified. One of these is policy driven rather than a need that users will come to the service with, or that the service will ever be able to address
  • think about how to balance ease of use of the service while still enabling the service to promote a change in driving behaviour. Striking the right balance will be a challenge, especially as there is outside pressure for the behavioural influence levers in use to be strong. The panel hope the team will be able to balance these pressures by continuing to champion the needs of their users.

2. Do ongoing user research

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • a range of appropriate user research methods have been used
  • the team have shown that, from a usability perspective, users can get through the vehicle checker and payment services and broadly understand the content.

What the team needs to explore

Before their next assessment, the team needs to:

  • undertake more research, including testing alternative hypotheses in prototype form, on the end-to-end journey from the user’s perspective. Acknowledging the challenges of this being a shared service with local authorities, the user’s journey is likely to start with awareness of boundaries and vehicle charges and considering/applying for exemption. This experience will impact how users get on with the products the team are developing
  • present a plan showing detailed activities they intend to undertake in the private beta phase. This should include workstreams covering work with real beta users, the continuing development of the prototype from the backlog and research with assisted digital and accessibility users
  • better demonstrate their understanding of the limits of hypothetical research and how they are guarding against this in their design. For example, the best predictor of future behaviour is past behaviour, so the team could explore how users behave on comparable services e.g. paying a parking fine. The team’s beta plans also need to show how they will rigorously test their designs and assumptions in the next phase
  • address the panel's concerns about some of the assumptions the team had made about user behaviour around the policy in private beta - for example, users not being eligible for a refund after the date paid for has passed, and users having a small timeframe to make a payment after travelling. These could create huge pain points in beta and must be properly impact assessed. This is particularly important in light of the team's finding that users associate the scheme with their local authority. Are Leeds and Birmingham prepared for the number of enquiries, media interest and so on that these situations could result in?

3. Have a multidisciplinary team

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is co-located between JAQU (Defra, DfT), DVLA and Informed Solutions
  • the standard skill sets and team members are included in the day-to-day delivery of the service
  • the team members from the various departments and agencies attend the stand ups and show and tells at the end of the sprint
  • the team has grown and flexed, and there are plans to bring in more civil servants - something currently lacking due to multiple challenges
  • for beta, the team will bring in a security specialist and a test manager - ideally the test manager will be resourced from within DDTS, subject to other departmental priorities such as EU Exit commitments and deliverables.

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that when reporting to the programme board the details are filtered accordingly, and that the challenges and blockers identified are raised at that level, so that the team receives the board's support.

4. Use agile methods

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a good structure for confirming priorities from their backlog, using tools such as Confluence to track stories and progress - for example, changes based on iterative user research
  • the agile ceremonies seem to be followed correctly and help the team manage the various needs, supporting prioritisation, communication and delivery
  • the team uses the ceremonies to communicate with stakeholders and to gather the information needed for that kind of reporting.

What the team needs to explore

Before their next assessment, the team needs to:

  • establish a more stable and concrete way of confirming priorities in cases where the various stakeholders' or parties' priorities conflict.

5. Iterate and improve frequently

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the payment page has gone through iterations which partially covered the payments for multiple local authorities
  • the compliance checker has gone through a number of iterations with users, which have resulted in changes to the journey and design of the pages.

What the team needs to explore

Before their next assessment, the team needs to:

  • make testing the service the default option, as it will give the team insights into what good looks like and what works for users.

6. Evaluate tools and systems

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team used Ruby and Java (Spring Boot)
  • the team took a cloud-first approach using the AWS platform, Docker and Kubernetes
  • the team used RESTful API services for integration (see the sketch after this list)
  • the service was aligned with the DVLA architecture framework and principles
  • both NoSQL and SQL databases were examined before concluding that an SQL approach based on PostgreSQL was preferable
  • after careful consideration, serverless technology was settled on as opposed to a server-based architecture
  • the technical architect produced clear explanatory slides to explain the technical architecture, with the right balance between clarity and detail to be understood by the panel (several of whom were not technical architects).
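
To make the RESTful, Spring Boot and PostgreSQL approach concrete, the sketch below shows what a minimal compliance-check endpoint might look like. It is an illustration only: the endpoint path, table and column names are assumptions, not the team's actual design.

```java
// Hypothetical sketch: a RESTful compliance-check endpoint in Spring Boot,
// reading from a PostgreSQL table via JdbcTemplate. The endpoint path, table
// and column names are illustrative, not taken from the team's codebase.
import org.springframework.http.ResponseEntity;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/v1/compliance")
public class ComplianceController {

    private final JdbcTemplate jdbc;

    public ComplianceController(JdbcTemplate jdbc) {
        this.jdbc = jdbc;
    }

    // GET /v1/compliance/AB12CDE -> {"registration":"AB12CDE","chargeable":true}
    @GetMapping("/{registration}")
    public ResponseEntity<ComplianceResult> check(@PathVariable String registration) {
        // Assumed table: vehicle(registration text primary key, chargeable boolean)
        var rows = jdbc.query(
                "SELECT chargeable FROM vehicle WHERE registration = ?",
                (rs, rowNum) -> rs.getBoolean("chargeable"),
                registration.toUpperCase());
        return rows.isEmpty()
                ? ResponseEntity.notFound().build()
                : ResponseEntity.ok(new ComplianceResult(registration.toUpperCase(), rows.get(0)));
    }

    // Serialised to JSON by Spring's Jackson integration.
    public record ComplianceResult(String registration, boolean chargeable) {}
}
```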

7. Understand security and privacy issues

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • a security threat model was used, including mapped-out attack vectors and APIs protected following NCSC guidance
  • AWS WAF (Web Application Firewall), Internet Gateway and Cognito were included in the security design for the service (see the sketch after this list)
  • data impact assessments have been carried out, which the service passed
  • data sharing agreements are being devised, with elements of data sharing to be finalised.
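
As a hedged illustration of how APIs behind the WAF might validate Cognito-issued tokens, the sketch below uses Spring Security's OAuth2 resource-server support. The AWS region and user pool ID in the JWKS URI are placeholders; this is not the team's actual configuration.

```java
// Hypothetical sketch: requiring a valid JWT issued by AWS Cognito on every
// API request, using Spring Security's OAuth2 resource-server support. The
// region and user pool ID in the JWKS URI are placeholders.
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.web.SecurityFilterChain;

@Configuration
public class ApiSecurityConfig {

    @Bean
    public SecurityFilterChain apiFilterChain(HttpSecurity http) throws Exception {
        http
            // Every request must carry a valid bearer token.
            .authorizeHttpRequests(auth -> auth.anyRequest().authenticated())
            // Verify token signatures and expiry against the Cognito user
            // pool's published JSON Web Key Set.
            .oauth2ResourceServer(oauth2 -> oauth2.jwt(jwt -> jwt.jwkSetUri(
                "https://cognito-idp.eu-west-2.amazonaws.com/"
                    + "<user-pool-id>/.well-known/jwks.json")));
        return http.build();
    }
}
```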

8. Make all new source code open

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team plans to make private beta code open source
  • the team has a private GitHub repository for code.

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure they are able to work in the open, sharing their code for alpha and subsequent beta developments publicly (eg using GitHub).

9. Use open standards and common platforms

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team used an open-standards RESTful API (with JSON as the data format)
  • the service makes use of GOV.UK Design System patterns
  • the service uses GOV.UK Pay (see the sketch after this list)
  • the service uses GOV.UK Notify.
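
To show what the GOV.UK Pay integration involves, the sketch below creates a payment through Pay's public REST API, which takes an amount in pence, a reference, a description and a return_url, and responds with a next_url to send the user to. The amount, reference, return URL and API key shown are placeholder values, not the service's real ones.

```java
// Hypothetical sketch: creating a CAZ charge payment through the GOV.UK Pay
// public REST API. The amount (in pence), reference, return URL and the
// PAY_API_KEY environment variable are placeholders.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class CreateCazPayment {

    public static void main(String[] args) throws Exception {
        String body = """
                {
                  "amount": 900,
                  "reference": "CAZ-AB12CDE-2019-10-08",
                  "description": "Clean Air Zone daily charge",
                  "return_url": "https://example.gov.uk/caz/payment-complete"
                }
                """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://publicapi.payments.service.gov.uk/v1/payments"))
                .header("Authorization", "Bearer " + System.getenv("PAY_API_KEY"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // A 201 response includes the payment ID and a next_url to redirect
        // the user to, where they complete the payment on GOV.UK Pay's pages.
        System.out.println(response.statusCode() + "\n" + response.body());
    }
}
```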

10. Test the end-to-end service

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the technical architect was able to present a coherent and logical plan to explain to the panel how the service works from end to end from a technological perspective
  • the solution overview slide was very helpful to illustrate the end-to-end journey
  • the technical architect was able to explain how the central (Defra) architecture integrated with component architectures of the local authorities, making good use of the technical architecture slides.

11. Make a plan for being offline

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • a failure pattern for message queuing was implemented, and the team is still working on transaction volumes
  • offline capabilities are still being decided upon. Policy as it stands allows payment up to 7 days in advance or up to midnight the next day (see the sketch below). A phone-based alternative will be set up; the operational readiness team is defining this.
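
As a simple illustration of the stated policy, the sketch below encodes the payment window using java.time, assuming "the next day" means the day after travel. The class and method names are hypothetical.

```java
// Illustrative sketch: the payment window described above. A charge for a
// given travel date can be paid from 7 days before that date until midnight
// at the end of the following day.
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.LocalTime;

public class PaymentWindow {

    // True if 'now' falls inside the payment window for the given travel date.
    public static boolean isPayable(LocalDate travelDate, LocalDateTime now) {
        LocalDateTime opens = travelDate.minusDays(7).atStartOfDay();
        LocalDateTime closes = travelDate.plusDays(1).atTime(LocalTime.MAX);
        return !now.isBefore(opens) && !now.isAfter(closes);
    }

    public static void main(String[] args) {
        LocalDate travel = LocalDate.of(2019, 10, 8);
        System.out.println(isPayable(travel, LocalDateTime.of(2019, 10, 2, 12, 0)));  // true
        System.out.println(isPayable(travel, LocalDateTime.of(2019, 10, 10, 9, 0))); // false
    }
}
```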

12. Make sure users succeed first time

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has tested multiple times to improve the completion rate, so that 100% of participants completed the payment journey at the first attempt
  • the team described how they had worked closely with Leeds and Birmingham local authorities during the project
  • the team has developed a central support model to make support easier for users to access
  • there is work with mapping providers to include CAZ in maps and driving directions
  • work has started to develop solutions specific to fleet users and managers
  • there is a communications plan for the build up to CAZ implementation
  • the team has explored a step-by-step approach to the user journey
  • the service has moved to a 2 week window for payment (1 week before, 1 week after)
  • the service will send out warnings rather than PCNs for either a time period or for first offences.

What the team needs to explore

Before their next assessment, the team needs to:

  • make it clearer to users how to gain an exemption on the central CAZ information and service
  • investigate multi-CAZ journeys and how users will navigate them and pay, and the content and support needed
  • look into the needs around user accounts and the best ways to implement them
  • investigate different mapping components and make sure they are accessible - whether in-service or on external mapping sites. Mapping solutions must be accessible and included in the accessibility testing and statement
  • carry out further iteration on the vehicle details provided, presenting the emissions of the vehicle as well as its make and model, so users can make longer-term decisions about their vehicles
  • work through duplicate charge scenarios, surfacing information about previous payments and how users can easily be compensated. Look at providing mobile/text receipts as well as email receipts, and investigate automatic mitigation/refunds
  • write and publish an accessibility statement for all parts of the service
  • make sure the service works end-to-end on mobile devices, including local authority content and external mapping sites
  • work closely with new local authorities to ensure each CAZ implementation is learnt from and improved, and multi-CAZ journeys are understood and make sense to users
  • investigate novel touchpoints, such as DVLA letters, envelopes and services to ensure wide communication about CAZs.

13. Make the user experience consistent with GOV.UK

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has used GOV.UK design system patterns throughout the majority of the journeys presented at assessment.

What the team needs to explore

Before their next assessment, the team needs to:

  • conform to GOV.UK design system patterns and content style. For example, the red error box used to confirm that a vehicle is subject to a charge, instead of the panel component, and the check box to indicate the user does not have an exemption. If the team persist with non-standard elements, they must have sufficient research to back these decisions up, and feed the components and research back into the design system. During private beta there should be a full content review adhering to the GDS content style guide to ensure consistency
  • talk to GOV.UK to discuss the content surrounding the service and the start page(s)
  • talk to GDS about a possible short URL to put on signage and so on
  • continue to iterate and test the end-to-end service including content on GOV.UK and local authority websites
  • do more work and testing on the content design around exemptions throughout the service - the current call-to-action / charge panel and the negative tick box in the payment flow are not clear or easy for users to get right.

14. Encourage everyone to use the digital service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has compared digital and non-digital journeys, feeding the findings back into making the digital service better and easier to use
  • the team has iterated and improved the service in response to user research
  • the team has a robust communications strategy in place to address winning hearts and minds.

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to iterate the service based on user research
  • demonstrate that thoroughly researched assisted digital support is in place that meets user needs.

15. Collect performance data

Decision

The service met point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • the type of vehicle, the type of fuel it uses and emissions data will be gathered and used to produce more detailed reports and improve data quality to support decision making.

What the team needs to explore

Before their next assessment, the team needs to:

  • gain a better understanding of the type of data the organisation already collects (for example, air quality analysis data) and clearly define what new data this service will collect to inform:

  • service performance
  • user satisfaction
  • policy updates.

16. Identify performance indicators

Decision

The service met point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • key metrics such as cost per transaction, user satisfaction, completion rate and digital take up have been identified
  • the team has plans in place to monitor and create the various metrics and reports based on the data.

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure there is a clear way of collecting and managing the above metrics for both online and offline versions of the service
  • consider ways of comparing the relative success or failure of service journeys across local authority sites (are some local authority services more effective than others?)
  • consider ways of comparing single CAZ journeys with journeys across multiple CAZs
  • explore what performance metrics will inform how the service has influenced and changed driver behaviour
  • explore how the service can access performance data from local authorities.

17. Report performance data on the Performance Platform

Decision

The service met point 17 of the Standard.

What the team has done well

The panel was impressed that:

  • the team plans to create the various reports based on the data it collects.

What the team needs to explore

Before their next assessment, the team needs to:

  • identify what reports need to be produced and what data is required to produce them - the delivery team needs to plan this in more detail at this stage. There are no concrete plans in place yet, though it may be too early for the full details.

18. Test with the minister

Point 18 of the Standard does not apply at this stage.

Published 29 October 2019