Import of Products, Animals, Food and Feed System (IPAFFS) plant journey

IPAFFS enables Defra to ensure that imports of animals, animal products, high-risk food and feed, and now plants and plant products, are inspected before release into free circulation within the UK.

Service Standard assessment report

IPAFFS (Import of Products, Animals, Food and Feed System) – Plant journey

From: Government Digital Service
Assessment date: 13/07/2021
Stage: Alpha
Result: Met
Service provider: Defra
Service Manager: Stuart Combrinck
GDS assurance lead: Dominic Coleman

Service description

IPAFFS enables Defra, as the competent authority, to ensure that imports of animals, animal products, high-risk food and feed, and now plants and plant products, are inspected before release into free circulation within the UK. This ensures bio-security and quality measures are met, thus safeguarding the UK’s food chain and ecosystem.

Defra’s legacy systems do not provide the functionality needed for the expected volume, nor for the new requirements that follow the UK’s departure from the EU. Due to this expected volume, a new service needs to be in public beta by the end of 2021.

Service users

This service is for…

  • importers – have a legal obligation to pre-notify Defra of any controlled commodities being imported into the UK
  • agents – help the importer with administering the process
  • inspectors – internal users; these could be APHA or Port Health Authority operational staff who are responsible for ensuring all documentation is in order, as well as undertaking any ID and physical checks if necessary
  • Food Standards Agency (FSA) – monitors the import of food, ensures there is full traceability, and has the ability to enforce ID and physical checks

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had detailed personas of users, differentiating between internal and external users and between small, medium and large-scale traders
  • the team has used user research to test the team’s assumptions, to identify the main user needs, and to inform design decisions
  • the team has used a variety of user research techniques
  • the team has a good relationship with users and so has a good participant base
  • the team had tested with geographically diverse users across the UK
  • the team has a plan for testing in private beta. The team has already identified three early adopter organisations and ten inspectors as participants for this stage, and has outlined what is likely to take place
  • the team is aware of gaps (for example, the need to do more user research with assisted digital groups) and has a plan to address these

What the team needs to explore

Before their next assessment, the team needs to:

  • explore the feasibility and desirability of an API with users through user research. The panel did not believe this was explored as much as it could have been before it was eliminated as an option
  • use the private beta opportunity to continue some alpha testing, for example assisted digital, accessibility, API
  • present to the beta assessment panel the high level details of how many rounds of research have been conducted, how many participants have been talked to, the type of research methods used, the different demographics/characteristics of the participants spoken to, the plans for research in the future and to highlight any limitations the team is aware of. The panel believes the team has this information available, and the user research panelist will be interested in knowing more about this
  • talk in the beta assessment about the user research done on testing speed improvements to the user experience and how the service compares to the legacy platform, as the core user need is to make the user experience for the service faster
  • if possible, do or highlight user research with international traders with whom the team does not have a prior relationship. This is recommended to help provide a sense check on the user journey and design with completely new participants. The panel understands that this may be difficult to achieve

2. Solve a whole problem for users

Decision

The service did not meet point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated strong subject matter knowledge and understands the real-world impact of the service
  • the team demonstrated the relationship between the service and the parameters of rules and regulations
  • the team showed how someone would interact with the service from a Google search starting point and explained the scope of their work

What the team needs to explore

Before their next assessment, the team needs to:

  • explore the relationship of the service with agents, including a dedicated persona view. The panel heard that 95% of applications originate from import agents
  • see if they can map the end-to-end journey or service landscape, including any elements out of their scope so they can demonstrate the whole service and constraints out of their control
  • present a geographical representation of users and how their needs differ (if applicable), whilst acknowledging that many users work from home
  • look further at the API feature experience in their private beta plan

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the external to internal user journey was demonstrated and well thought out
  • the team recognises that the legacy services have been digital only since 2010 and that they need to do more on inclusive design, including assisted digital and engaging with users with accessibility needs
  • the team has established links with HMRC and other actors in the ecosystem

What the team needs to explore

Before their next assessment, the team needs to:

  • do more on inclusive design, as the service team recognised, including assisted digital and engaging with users with accessibility needs
  • ensure internal and external assisted digital and accessibility journeys are recognised, articulated and included within their private beta plan
  • articulate in a visual the internal and external user components (and those in and out of project scope) and the multi-channel journey

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated a well-defined and thorough design process for refining their assumptions, ideating and testing them
  • the team is conducting frequent usability testing and iterating on this feedback to make their service as simple to use as possible, focusing in particular on the time spent on the user’s task
  • the team has worked hard to ensure consistency with the GOV.UK Design System

What the team needs to explore

Before their next assessment, the team needs to:

  • consider whether they could do more to understand and improve the earlier stages of a user’s journey through the design of guidance
  • consider how they could ideate around the concept of unhappy paths and ensure there are no dead ends. It was suggested that if a user was to struggle to complete their task they might return to the portal. Consider whether this is the preferred outcome, or whether it would be better to direct them towards human support or other content, for example
  • ensure that any content has been reviewed and well researched to meet user needs (within any legal constraints) and that it follows the GOV.UK style guide for all text

5. Make sure everyone can use the service

Decision

The service did not meet point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had conducted an accessibility audit, conducted automated testing, and created an accessibility statement
  • the team was aware of a number of issues that require fixing in order to meet WCAG 2.1 accessibility principles
  • the team was aware of the need to create an assisted digital support model for their service and had plans in place to start recruiting users who could help them explore this further

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to work towards making their service compliant with WCAG 2.1
  • take a broad approach to defining their assisted digital users and identifying their pain points by talking to a range of users, including those with low levels of digital confidence, given the service has been digital-first for a number of years
  • start thinking about how they will design the assisted digital support model based on these pain points. During their beta the team should then focus on recruiting these users to test their assumptions
  • ensure that all aspects of the user journey have been tested for and meet accessibility requirements, including any internal-facing systems, such as Microsoft Dynamics, SSCL and any common platforms, such as Defra ID

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the product owner is a subject matter expert in her field and articulated the subject area and the wider eco-system clearly. She showed empathy with the user pain points
  • the team demonstrated they work across multiple services and departments to interact and link with teams that relate to the customer journeys (for example HMRC, APHA imports)

What the team needs to explore

Before their next assessment, the team needs to:

  • drive to bring in civil servants for sustainability and to manage risk. There is relatively little civil service involvement, with at least 80% of the team being contractors from two contractor organisations
  • have civil service staff on the technical side of the team to take part in quality assurance and ensure sustainable outcomes. The development team is purely external, comprising 16 developers across four teams
  • plan how to manage the product effectively within private beta and public beta usage including transition plans to civil servant ownership and management
  • show how the developers are part of the team rather than a separate team. No software engineers other than Dynamics users were listed within the team maps

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team uses a common daily agile cadence
  • there are endeavours to build synergy across the four teams, including a daily scrum of scrums
  • there was evidence that backlog refinement and user research are not solely user researcher activities
  • ideation techniques are used, including lean UX canvases
  • the team looked at the riskiest approaches, reframing them as user-centred problems to solve

What the team needs to explore

Before their next assessment, the team needs to:

  • demonstrate how the whole team is actively involved in the agile process, for example in the development of a complex deliverable like the API, given the size of the team
  • show further details of strong agile ways of working

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated a user-centred approach where they iterated and enhanced the user experience, talking through solving a problem related to uploading a large volume of data for bulk commodities and exploring solutions for different types of users
  • there’s a clear, high-level, holistic plan for private beta, starting with three external organisations participating initially with six users, and with 10 internal inspectors

What the team needs to explore

Before their next assessment, the team needs to:

  • show evidence of contributing patterns to Defra and the wider cross-government community
  • demonstrate a relationship between the analytical data and private beta use to improve the service
  • show cadence with user research and design in their private beta plan
  • ensure accessibility and assisted digital is included in their private beta plans

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the project’s systems have been running and publicly available for half a year, and all security incidents have been handled with no data compromise or loss of service
  • a DPIA assessment has been carried out
  • a security pen test has been carried out on the project’s existing systems
  • security has been taken into account in the design, development and deployment processes
  • the team has worked with the departmental security function to ensure security standards have been followed, reviewed and passed by the design authority
  • the team pioneered use of the new departmental customer identity service

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure the DPIA assessment is updated with any changes introduced by their new service
  • carry out a pen test on the new service before publicly testing the service

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • there is clear thinking around the mandatory KPIs including thinking about the impact of saving drafts on their completion rate KPI
  • the team has timed legacy service use as a stepping stone to benchmarking

What the team needs to explore

Before their next assessment, the team needs to:

  • set up tools to track internal and external service performance data, as planned
  • organise an end of service satisfaction survey or similar
  • benchmark further against their KPIs and iterate their initial thinking

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has made a solid set of technology choices using tools that are commonly used and supported throughout government projects
  • the team has begun to work with the business as usual team and plan handover

What the team needs to explore

Before their next assessment, the team needs to:

  • be able to demonstrate progress with the preparation for transfer to business as usual
  • document how the systems can be built and maintained in the future, after the development stage of the project is completed. This could be combined with the recommendations in points 12 and 13
  • invite the business as usual team to participate in the next assessment
  • make more progress on the API in terms of its specification, testing, security and user involvement

12. Make new source code open

Decision

The service did not meet point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is aware of the need to open source their code and the recent departmental decision that has now made this possible
  • coding standards are already in place to ensure secrets and personal information are excluded from the codebase

What the team needs to explore

Before their next assessment, the team needs to:

  • agree a policy on what and how code should be open sourced
  • ensure that open source licences are in place and that all code files contain the correct copyright notice and licence details
  • investigate the use of third party code in the codebase and ensure that its licences allow open sourcing and that all licence restrictions and notices are in place
  • test that the open sourced code can be built. This is a key requirement if the team chooses to publish from an internal working repository to a public repository
  • ensure that sufficient documentation is in place to make it practical for others to make use of the code

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the main languages used in the code base, Node.js and Java, are open standards and widely used in government
  • the user interface is based on the GOV.UK Design System
  • the team has worked with other teams within their department to reuse components and patterns in areas such as identity validation, build and deployment and security

What the team needs to explore

Before their next assessment, the team needs to:

  • make an active contribution to the components and standards that they are using
  • continue to develop their own API and promote its use within their user community
  • use the open sourcing task as an opportunity to share and promote their achievements, for example by writing blog posts describing interesting and innovative aspects of their system and the solutions they developed
  • encourage all members of the team to actively participate in cross-government forums, such as by making contributions to the GOV.UK Design System
  • ensure that contributing to open standards, components and patterns is a goal for the whole team, record those contributions and present them to the next assessment

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • there has been no unscheduled downtime on the existing animal service since January 2021
  • as noted under point 9, cyber attacks have been stopped before they penetrated the system
  • the service is talked about as a single thing that both internal and external users use
  • the system is hosted with a cloud provider, with the benefits of service scaling and redundancy, which provides the basis of a stable environment
  • monitoring, logging and auditing systems are in place to enable the reliability of the system to be measured and for a timely response to incidents to be made

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that the introduction of the new service takes full advantage of the existing infrastructure and deployment procedures, without introducing new problems
  • explain clearly their service operating model and how it benefits from building on existing procedures
  • use examples from the live running of the animal service to evidence their model
  • use service performance tools on their existing system to prove they can demonstrate service performance going forward
  • continue to make preparations for the transfer to this being a business as usual process. This will be an area of increased focus in the next assessment
Published 28 July 2021