Apply for a phytosanitary certificate beta reassessment report

The report for Defra's Apply for a phytosanitary certificate service beta reassessment on 18 August 2021

Service Standard Reassessment report

Apply for a phytosanitary certificate

From: Government Digital Service
Assessment date: 18/08/2021
Stage: Beta
Result: Met
Service provider: Defra

Service description

The Apply for a phytosanitary certificate service gives users in Great Britain a simple, online way to apply for a phytosanitary certificate. Once an application is received, the Animal and Plant Health Agency (APHA) certifies the goods by conducting a physical inspection or by commissioning a lab test. If the goods pass these checks, the inspector issues a phytosanitary certificate to the exporter.

Service users

Apply for a phytosanitary certificate caters for the following users:

  1. Exporters, including:
  • growers, producers or traders selling their goods overseas
  • retailers moving goods from Great Britain to their stores outside GB
  • agents, who apply for phytosanitary certificates on the exporter’s behalf
  2. Inspectors, including:
  • “office inspectors”, who conduct casework checks on applications
  • “field inspectors”, who conduct physical inspections of the goods and issue the certificates

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team provided evidence that they had engaged with internal inspectors across the various food-producing regions of England and Wales
  • inspectors’ user needs have since been well documented, and further user research is planned in this area
  • user research with inspectors with accessibility needs had increased from one user to six, and the team conducted a second Digital Accessibility Centre (DAC) audit across the entire user journey
  • the team presented different personas and needs, and has now collated top-level user needs to play back in further user research sessions
  • the team is making the effort to identify and validate the internal user journey with assisted digital users

What the team needs to explore

Before their next assessment, the team needs to:

  • increase user uptake, which appears quite low even allowing for the service being invite-only during private beta. Moving into public beta, Defra must make a concerted effort to reach inspectors in the busiest regions and sell the benefits of this system over the legacy applications eDomero and PEACH
  • develop a detailed plan to mitigate the issues highlighted by the user research with internal staff, such as the frustrations with the Dynamics solution, and to address the implications of the assisted digital research, for example where internal staff choose to complete the application on behalf of users
  • remediate, within public beta testing, the usability issues that multiple field inspectors and internal staff have identified with the Microsoft Dynamics back end, primarily navigation on smaller devices, along with the lack of training or guidance
  • for internal users, articulate actual user needs, pain points and their proposed mitigation (through design or otherwise) more clearly and comprehensively
  • for the internal user personas in development, show clearly how these personas were arrived at (research data and methodology), then weight them to show which are the core or primary personas. The panel understands that field inspectors and CIT are two, but are there any more? Discuss and articulate the rationale for why these two are the primary personas, and whether any are secondary or deprioritised. Consider showing how these internal personas fit into the internal user journey service map
  • when presenting future research findings, consider developing a simple, easy-to-follow narrative framed by the actors (the personas), their needs, wants, goals, pain points and frustrations as they go through the journey. This narrative should then be bounded by the real-world constraints of the system being developed, including platform limitations such as MS Dynamics, and other considerations such as assisted digital and accessibility needs, which can be used to highlight and discuss the trade-offs for users and their needs

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated evidence of a re-focus towards the Inspector user experience
  • the team also showed how they are establishing connections with other government departments and facilitating knowledge sharing to create more efficient processes across the entire user journey
  • the team is utilising existing platform functionality to increase efficiency for repeat users, such as a copy-and-paste function to speed up repeat applications

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to work with other departments to make sure that the service draws on best practice across the entire user journey
  • establish a connection with EU colleagues to gain efficiencies by sharing data standards, and to understand where they can improve working processes and policy
  • understand and explore how best to optimise the service so that it is usable by inspectors and other users “in the field” on mobile devices
  • consider users’ potential need for reusable template applications to save time, and cater for seasonal peaks and other variations in usage

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have established a champions network to help promote the service and facilitate digital uptake among their users
  • they had researched device usage with inspectors and gathered evidence of the as-is use of Dynamics on these devices

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to review the internal processes in the wider team and make improvements to remove any processes that sit outside the service, for example the use of spreadsheets for data recording
  • further develop the user support model and continue to iterate on the offline channels in line with wider service updates
  • conduct user research with the 28% of inspectors still using the paper-based processes, to understand in greater detail their potential barriers to using the new digital system

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • since the last assessment the team had made changes to the inspectors’ journey based on user feedback
  • they had prioritised their work based on the frequency and impact of the changes made
  • the updates they had made were having a positive impact, reducing mistakes and reliance on the training module
  • they could make improvements within the technical constraints of Dynamics

What the team needs to explore

Before their next assessment, the team needs to:

  • continue listening to their users and iterating on their designs; they should be clear about the expected improvement that each update will deliver and record its impact
  • make sure that sight of the core KPIs is not lost in the changes being delivered
  • continue to make regular iterative updates, based on user research, to the usability of both the internal and external service, as well as the online and offline parts
  • explore in more detail the use of mobile devices and any opportunities for process improvement they might present
  • explore the limits of the changes that can be made to improve the usability of Dynamics, based on their research findings
  • explore more ways to make it easier for busy inspectors to provide feedback at each stage of the user journey

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have completed DAC and accessibility audits, and work to bring the system up to WCAG 2.1 AA is underway and planned in
  • since the last assessment they have liaised with the fees and charges group and legal teams. This has allowed them to build the service so that users aren’t charged the mandatory paper application fee for using an assisted digital route
  • they are working towards the legal statute setting out the application fees being changed in the 2022 spending review
  • training is being updated to include the accessibility findings and shared with other export services

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure the planned work to make the service WCAG 2.1 AA compliant happens for both the internal and external parts of the service, and address any new accessibility issues raised by making updates and improvements
  • further explore the assisted digital routes and make sure that these are iterated in line with the other service improvements being made
  • appoint an assisted digital lead within the team to make sure these routes stay a priority and that users aren’t excluded from the service

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a clearly defined set of KPIs developed from user needs and personas
  • the team has shown clear examples of using data to improve the service, including using search and feedback data to name the service
  • the team has developed robust, statistically sound methods to validate changes measured in the new service against benchmarks
  • the team has a clearly defined process for feeding insights into the wider team, ensuring that data is used to drive improvements
  • the team used a wide range of data sources and has focused on using only the data they need
  • the team is using the free version of Google Analytics and Google Tag Manager, as most of the data comes from other sources. They have anonymised the IP address, have ad tracking switched off and have had SIRO sign-off (a sketch of this kind of configuration follows this list)
  • the team has a compliant cookie opt-in mechanism
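
To illustrate the kind of configuration described above, the sketch below shows one way a consent-gated Google Analytics (gtag.js) setup with IP anonymisation on and ad tracking switched off might look. It is a hypothetical example rather than the team's actual implementation: the measurement ID and the 'analytics_consent' cookie name are placeholder assumptions.

```typescript
// Hypothetical sketch: load Google Analytics (gtag.js) only after the user
// opts in via the cookie banner, with IP anonymisation on and ad tracking off.
// 'GA_MEASUREMENT_ID' and 'analytics_consent' are placeholders, not the
// service's real values.
export {};

declare global {
  interface Window { dataLayer: unknown[]; }
}

function gtag(..._args: unknown[]): void {
  // gtag.js expects the raw `arguments` object rather than an array
  // eslint-disable-next-line prefer-rest-params
  window.dataLayer.push(arguments);
}

function hasAnalyticsConsent(): boolean {
  // Assumes the cookie banner sets this cookie on opt-in
  return document.cookie.includes('analytics_consent=true');
}

function loadAnalytics(measurementId: string): void {
  window.dataLayer = window.dataLayer || [];
  gtag('js', new Date());
  gtag('config', measurementId, {
    anonymize_ip: true,          // strip part of the IP address before storage
    allow_google_signals: false, // keep advertising features switched off
  });

  // Inject the gtag.js script only once consent is confirmed
  const script = document.createElement('script');
  script.async = true;
  script.src = `https://www.googletagmanager.com/gtag/js?id=${measurementId}`;
  document.head.appendChild(script);
}

if (hasAnalyticsConsent()) {
  loadAnalytics('GA_MEASUREMENT_ID');
}
```

Gating the script injection on the consent cookie keeps the opt-in mechanism authoritative: no analytics requests are made before the user opts in.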

What the team needs to explore

Before their next assessment, the team needs to:

  • publish their data on the 4 mandatory KPIs on data.gov.uk, since the new GDS data platform on which the team plans to publish is not ready for use at this time. Further information is available at https://www.gov.uk/service-manual/measuring-success/data-you-must-publish

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had championed the need for open-sourcing and subsequently open-sourced the microservices at https://github.com/DEFRA/trade-phes
  • the team has addressed secrets management as part of the workflow, and developers are utilising git-secrets prior to publishing code
  • the team are using shared components and contributing to DEFRA platforms
  • the prototype code is open-sourced and the team have worked to produce wireframes of the MS Dynamics user interface

What the team needs to explore

Before their next assessment, the team needs to:

  • look to address the difficulties of open-sourcing MS Dynamics code and configuration, and share that code openly too
  • integrate the secrets and vulnerability scanning into the CI and code publish pipeline (an illustrative sketch follows this list)
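
To illustrate the recommendation above, the sketch below shows a minimal secret-scanning step that a CI pipeline could run before code is published. It is a simplified, hypothetical example: in practice the team's existing git-secrets tooling (or a dedicated scanner) would supply these checks, and the branch name and regex patterns shown here are illustrative assumptions.

```typescript
// Hypothetical CI step: fail the build if changed files contain secret-like
// patterns. Illustrative only; a real pipeline would typically invoke
// git-secrets or a dedicated scanner with a maintained rule set.
import { execSync } from 'node:child_process';
import { existsSync, readFileSync } from 'node:fs';

const SECRET_PATTERNS: RegExp[] = [
  /AKIA[0-9A-Z]{16}/,                                          // AWS access key ID
  /-----BEGIN (RSA|EC|OPENSSH) PRIVATE KEY-----/,              // private key material
  /(password|secret|api[_-]?key)\s*[:=]\s*['"][^'"]{8,}['"]/i, // hard-coded credentials
];

// Files changed relative to the main branch (branch name is an assumption).
const changedFiles = execSync('git diff --name-only origin/main...HEAD', {
  encoding: 'utf8',
})
  .split('\n')
  .filter((file) => file && existsSync(file));

const findings: string[] = [];
for (const file of changedFiles) {
  const content = readFileSync(file, 'utf8');
  for (const pattern of SECRET_PATTERNS) {
    if (pattern.test(content)) {
      findings.push(`${file}: matches ${pattern}`);
    }
  }
}

if (findings.length > 0) {
  console.error(`Possible secrets found:\n${findings.join('\n')}`);
  process.exit(1); // non-zero exit fails the pipeline step
}
```

Running a check like this as a required pipeline step, alongside dependency vulnerability scanning, means secrets are caught before code reaches the public repository rather than relying on each developer's local setup.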
Published 9 September 2021