Comply with UK REACH (Chemicals) beta service assessment report

The report for Defra's beta assessment of Comply with UK REACH on 10 November 2020.

Digital Service Standard assessment report

Comply with UK REACH - Chemicals

From: Central Digital and Data Office
Assessment date: 10/11/2020
Stage: Beta
Result: Met
Service provider: Defra

Service description

Comply with UK REACH is the service introduced to allow industry users in the UK to register and manage the chemicals they handle. It allows the Health and Safety Executive (HSE) to operate as regulator of the chemicals industry. The service facilitates the registration, regulation and monitoring of chemicals trade and manufacture in the UK.

Comply with UK REACH supports the development, storage and trade of chemicals in the UK. UK users will be legally required to use this service instead of the existing EU service.

Service users

The users of this service are:

  • Policy: Defra
  • Regulator: HSE and the Environment Agency
  • Industry: Manufacturers, Importers, Consultants, Downstream Users, Only Representatives (ORs), Third Party Representatives (TPRs), Formulators, etc.

1. Understand user needs

User research assessor

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team’s use of Model Office exercises seems like a very good example of maximising opportunities to learn about the service’s users
  • the team responded well to the issues raised in the mock assessment, and seem to be building up a better understanding of the needs of novice users
  • the team provided clear and thorough profiles of their different user groups capturing the various needs and pain points, and have clearly done a large amount of research with manufacturers and consultants.

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to prioritise user research with novice users of the service to build on the team’s understanding of their needs. This should help flag things like where simplification and clarification of language is needed
  • try to do further user research with participants with access needs, for example dyslexia or visual impairments. This feels particularly important given the complexity of the service. To clarify, this is separate from an accessibility audit or the use of professional accessibility testers like DAC, which the team made reference to doing
  • ensure the needs of downstream users are well understood.

2. Do ongoing user research

User research assessor

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has built up a strong cadence of doing regular user research that is feeding into the team’s work and the service’s design
  • the team’s schedule shows they are carrying out a variety of research with different users, with focus shifting depending on upcoming priorities.

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to keep up a good rhythm of user research activities before and after launch.

3. Have a multidisciplinary team

Lead assessor

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • despite the team being made up of people from a number of organisations, roles are clearly set out and understood, and the team works well together across policy and delivery and across the various departmental responsibilities

  • the team is working well together despite the challenges of working remotely

  • the operational team and the product team work together under an empowered service manager
  • the team have thought about how they will operate in a live service environment whilst continuing to iterate and improve the service.

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that there is sufficient handover from the supplier to the service team
  • consider bringing user research and service design in house.

4. Use agile methods

Lead assessor

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is working in two-week sprints
  • the team have a clear governance process across all levels, from fortnightly senior meetings to day-to-day working practices
  • the team use online collaboration tools to work efficiently from different locations
  • clear problem statements have been developed to inform their backlog prioritisation and sprint planning.

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that there is a plan in place to continue to improve the service
  • ensure that user research is prioritised
  • continue to collaborate with other service teams to ensure a joined up user journey for users.

5. Iterate and improve frequently

Lead assessor

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have iterated the dashboard after identifying needs around navigation
  • the team are using collaborative tools to critique and iterate journeys
  • the team have plans to develop the regulators’ journey.

What the team needs to explore

Before their next assessment, the team needs to:

  • further explore navigation issues and provide a simpler, more joined up user journey
  • continue with plans to develop email notifications
  • make sure they are taking into account novice users’ needs when designing journeys, and use this opportunity to rethink how the system works. For example, the service uses technical acronyms, which should be avoided
  • simplify the home page, which the team have iterated after identifying navigation issues. The panel would encourage the team to simply link to activities users can do, or to guidance, under headings. This could improve the page hierarchy and allow the grey backgrounds to be removed
  • consider implementing task list patterns to help users see when actions are pending and to understand next steps for substances
  • avoid using step-by-step patterns in the service, as these are normally used by GOV.UK to bring together disconnected services in government.

6. Evaluate tools and systems

Tech assessor

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has made good choices of tools and systems, matching service needs, Defra capabilities and the existing landscape for transition services (a high-level summary is given below this list)

  • the team has put in place processes enabling the service to adapt quickly to industry changes. For example, they are now part of the EU beta testing programme for new software releases of IUCLID (International Uniform Chemical Information Database), which provides the required window to make service changes

High-level summary:

  • collaboration tools including Confluence, Jira, Trello, SharePoint and Mural (visual collaboration)
  • an Azure tech stack across environments: Azure SQL Database, Blob Storage (storing dossiers) and Azure Application Insights (monitoring performance)
  • a Node.js app (JavaScript server-side scripting) serving all of the interface pages, with a service layer of Java apps (backend microservices) also using open-source Redis (REmote DIctionary Server, an in-memory database)
  • all Java and Node.js apps deployed as Docker containers in a VNet, using Azure App Service Environments (ASEs) for isolation and secure network access
  • Jenkins for continuous integration and Aqua for container vulnerability scanning, with security processes and tools built into the development process and pipelines (scanning Docker images and containers with Anchore and Trivy).
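
As an illustration of the stack described above, the sketch below shows how the Node.js layer might serve an interface page by calling a backend Java microservice, with Redis as a read-through cache. This is a minimal TypeScript sketch assuming the Hapi framework mentioned under point 9; the route, the dossier-service URL, the cache key and the 5-minute expiry are hypothetical illustrations, not the team’s actual implementation.

// Hypothetical sketch only: a Hapi (Node.js) front-end route that fetches a
// dossier from a backend Java microservice, caching responses in Redis.
import Hapi from "@hapi/hapi";
import Redis from "ioredis";

const redis = new Redis(process.env.REDIS_URL ?? "redis://localhost:6379");
// Placeholder URL for the Java service layer; not the real service name.
const DOSSIER_API = process.env.DOSSIER_API ?? "http://dossier-service:8080";

async function getDossier(id: string): Promise<unknown> {
  const cached = await redis.get(`dossier:${id}`);
  if (cached) return JSON.parse(cached); // cache hit: skip the backend call

  const res = await fetch(`${DOSSIER_API}/dossiers/${id}`); // Node 18+ global fetch
  if (!res.ok) throw new Error(`backend returned ${res.status}`);
  const dossier = await res.json();

  // Cache for 5 minutes so repeated page loads avoid the backend round trip
  await redis.set(`dossier:${id}`, JSON.stringify(dossier), "EX", 300);
  return dossier;
}

async function init(): Promise<void> {
  const server = Hapi.server({ port: 3000 });
  server.route({
    method: "GET",
    path: "/dossiers/{id}",
    handler: (request) => getDossier(request.params.id),
  });
  await server.start();
}

init();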

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to explore remaining dependencies on the existing service team, and then socialise tools, systems and technology choices with Defra resources to maintain and support services post-launch
  • continually evaluate systems and tools based on UK industry user needs and HSE requirements.

7. Understand security and privacy issues

Tech Assessor, lead assessor

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • industry fraud vectors, security controls, and protection and privacy threats are well understood by the service team, and measures have been taken to mitigate risks and support traceability, including virus scanning of file/dossier uploads before they are ingested into the service

  • the cookie policy shows what each of the cookies does, and the privacy policy provides details of data protection in line with GDPR (including Data Protection Officer contact details)
  • the service is leading the way for other transition services with integration into Defra GIO and Defra SOC for the monitoring of environments and running services

  • formal gates are in place requiring sign-off by the Senior Risk Owner (SRO) and stakeholders, in line with the Defra security accreditation policy, before the service can go live.

What the team needs to explore

Before their next assessment, the team needs to:

  • work on the assumption that a data breach will take place at some point soon. The best way to prepare for a breach is to set up and test the service’s disaster-recovery policy with a real-life simulated exercise run collaboratively by Defra and HSE.

8. Make all new source code open

Tech assessor

Decision

The service met point 8 of the Standard with conditions.

What the team has done well

The panel was impressed that:

  • the team is using a GitLab code repository, which offers development (and DevOps) teams additional tools and workflows.

The code should be open and reusable. This point is met on the proviso that Defra and HSE are committed to following guidance post-launch and ensure that source code can be reused across Defra.

What the team needs to explore

Before their next assessment, the team needs to:

  • make the source code available on GitHub, GitLab or another code repository and keep it open: see ‘Making source code open and reusable’ (and the ‘Related guides’ links on that page) and ‘When code should be open or closed’.

9. Use open standards and common platforms

Tech assessor

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using Defra common platform components (Defra.Integration and Defra.Identity), RESTful interfaces built with the Hapi framework, and government common components including GOV.UK Notify, alongside other Defra-centric standards and platforms integrated into Defra (a minimal Notify sketch follows this list)

  • the team has confirmed no additional releases are required for public beta, including for the Defra common platform components.
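
To illustrate the GOV.UK Notify integration mentioned above, the sketch below sends a confirmation email through Notify’s Node.js client (notifications-node-client). The template ID, personalisation fields and reference are placeholders, not the service’s real values.

// Hypothetical sketch only: sending a confirmation email via GOV.UK Notify.
import { NotifyClient } from "notifications-node-client";

const notifyClient = new NotifyClient(process.env.NOTIFY_API_KEY ?? "");

async function sendRegistrationConfirmation(email: string, substanceName: string): Promise<void> {
  // sendEmail(templateId, emailAddress, options) is the documented client call
  await notifyClient.sendEmail(
    "11111111-2222-3333-4444-555555555555", // placeholder Notify email template ID
    email,
    {
      personalisation: { substance_name: substanceName }, // must match the template's fields
      reference: `reach-registration-${Date.now()}`, // traceable client reference
    }
  );
}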

What the team needs to explore

Before their next assessment, the team needs to:

  • when using any of the Defra common platforms, ensure change control processes are in place to meet the needs of your service. This includes ensuring that any changes or new features to these components are well integrated with your end-to-end service. This is especially important with regard to Defra Identity, where this common platform component is also dependent on the HMRC Government Gateway Secure Credentials Platform (SCP).

10. Test the end-to-end service

Tech assessor

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • all required environments are built, deployment pipelines are in place for service readiness and the team can create new environments quickly and easily
  • the service team walked through environment-specific testing from Sandpit, Test and Pre-Production through to Production, with integrated cyber security tools built into the development process and pipelines
  • non-functional requirements are in place, including the required service availability. Transaction load testing has been completed for this service, confirming the required ability to scale without core development

  • the team have tested the digital journeys using the latest release and presented them at regular show and tells
  • an independent final IT Health Check and penetration test found no outstanding critical or high issues, and a plan is in place for regular testing and remediation.

What the team needs to explore

Before their next assessment, the team needs to:

  • continue end-to-end testing with industry users, obtaining feedback and continuing to make the service intuitive
  • continue with accessibility reviews and testing, including the obligation for the service to meet UK legal requirements
  • test with HSE Admin internal users for their respective digital journeys and iterate the service to meet their needs.

11. Make a plan for being offline

Lead assessor, tech assessor

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has detailed service and business continuity plans in place, with prevention and recovery tested, and regular, repeatable out-of-hours releases as per their route-to-live pipelines
  • service ownership between Defra and the regulator, HSE, is understood, and processes are in place for the worst-case scenario(s).

What the team needs to explore

Before their next assessment, the team needs to:

  • continually analyse user and industry behaviours in terms of business hours, review their support mechanisms and improve contingency measures for being offline.

12. Make sure users succeed first time

Design assessor

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have conducted 4 rounds of research with users with accessibility needs
  • the team have had an external accessibility audit.

What the team needs to explore

Before their next assessment, the team needs to:

  • explore and take into account the journey for users who need support
  • focus on making a simple journey that works for new users.

13. Make the user experience consistent with GOV.UK

Design assessor

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the service is consistent with GOV.UK and the team are working with GOV.UK to make sure that users can easily find the service
  • there is a comms plan in place and the team are engaging with businesses.

14. Encourage everyone to use the digital service

Lead assessor

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have identified key stakeholders and groups for their services
  • although the users are already using a digital service, user research has looked at users at the lower end of the digital inclusion scale
  • work is continuing on the start page to ensure that users without a Government Gateway account know what to do
  • a comprehensive and continuous communications plan is in place
  • business engagement is ongoing.

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that novice users are considered in further research
  • ensure that a user research plan is put in place so that the service continues to be improved and everyone can use it.

15. Collect performance data

Performance analyst assessor

Decision

The service met point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • the service has implemented appropriate tracking for their service. The team have decided to use Google Analytics and were able to explain why they chose this tool. The team recognised the limitations of Google Analytics data due to cookie consent (see the sketch after this list) and explained how they are using alternative sources of data and site performance data to improve their KPIs
  • the team have mapped and created funnels on a dashboard to visualise different parts of the service journey
  • the team have set up a way to collect user feedback using SmartSurvey and are using supplementary questions to create a meaningful satisfaction measure.
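
To illustrate the cookie-consent limitation described above, the sketch below only loads Google Analytics once a user has accepted analytics cookies. The consent cookie name, its JSON format and the measurement ID are assumptions for illustration, not the service’s actual configuration.

// Hypothetical sketch only: load Google Analytics solely after the user has
// consented to analytics cookies.
const CONSENT_COOKIE = "cookies_policy"; // assumed consent cookie name
const GA_MEASUREMENT_ID = "G-XXXXXXX";   // placeholder measurement ID

function hasAnalyticsConsent(): boolean {
  const match = document.cookie.match(new RegExp(`${CONSENT_COOKIE}=([^;]+)`));
  if (!match) return false; // no recorded choice: default to no tracking
  try {
    // Assumed format: a JSON value such as {"analytics": true}
    return JSON.parse(decodeURIComponent(match[1])).analytics === true;
  } catch {
    return false;
  }
}

function loadGoogleAnalytics(): void {
  // Inject the standard gtag.js loader script
  const script = document.createElement("script");
  script.async = true;
  script.src = `https://www.googletagmanager.com/gtag/js?id=${GA_MEASUREMENT_ID}`;
  document.head.appendChild(script);

  const w = window as any;
  w.dataLayer = w.dataLayer || [];
  function gtag(..._args: unknown[]) {
    w.dataLayer.push(arguments); // gtag.js expects the arguments object, not an array
  }
  gtag("js", new Date());
  gtag("config", GA_MEASUREMENT_ID);
}

if (hasAnalyticsConsent()) {
  loadGoogleAnalytics(); // tracking runs only with consent
}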

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to work with a performance analyst in the next phase and ensure that performance data and analysis of real users is fed into sprint planning and future iterations.

16. Identify performance indicators

Performance analyst assessor

Decision

The service met point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has worked out how to calculate the three of the four mandatory KPIs that are relevant to their service. They also provided evidence that they have identified other secondary metrics that are important to the service
  • the team has created visual dashboards and reports of the user journey that are well presented and are ready for the next phase. They were also able to explain how these reports will fit into the service MI reporting and will be shared with stakeholders.

What the team needs to explore

Before their next assessment, the team needs to:

  • identify how the communication and engagement plan will be measured, and consider how this plan will be included in the larger service journey measurement. Identifying users and encouraging them to use the service seems to be a key part of the journey, so this should be included in measuring how the service performs.

17. Report performance data on the Performance Platform

Point 17 of the Standard no longer applies.

18. Test with the minister

Lead assessor

Decision

The service met point 18 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have tested the service with the SRO
  • sessions have been offered with the Minister
  • demos have been done with the Board and senior officials.

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that the service has been tested with the Minister.
Published 26 November 2020