Digital Waste Tracking
Service Standard assessment report: DEFRA's Digital Waste Tracking alpha assessment
From: GDS
Assessment date: 25/06/2025
Stage: Alpha
Result: Amber
Service provider: DEFRA
Service description
- the aim of the waste tracking service is to be an enabler in a complex problem space. The main problem we’re helping to solve is waste crime, which costs the UK economy an estimated £1 billion annually, with associated environmental harm. We aim to help solve this problem by creating a digital service that can capture data about waste collected and received throughout the UK. Giving environmental enforcement and investigation teams access to this data will help to enhance enforcement action in the UK waste sector; currently they only have routine visibility of hazardous waste movements and a proportion of exports. For the majority of non-hazardous waste, there is only quarterly aggregated site-level reporting, and there are significant gaps in the data reported. The digital service, and the data it will collect, will also help inform future DEFRA policy by providing new insights into waste movements in the UK. We will also help policy teams understand what waste exists in the UK and the impact this could have on a circular economy.
Service users
This service is for:
- the alpha is focussed on enabling operators of waste receiving sites to meet their legislative requirements, which come into force from October 2026 (the legislation mirrors the staggered onboarding in the digital service). This has been the focus for alpha because our riskiest assumptions identified in discovery were all around ensuring we could get accurate, timely data from waste receivers.
Things the service team has done well:
- the team have developed and designed in the open through the use of GitHub Pages, which has served a dual purpose: documentation, and an artefact used in research sessions with users across the journey. The overall approach of using alpha to address the riskiest assumptions has helped the team identify and prioritise changes to policy, reducing friction and potential issues that could come up when rolling out the new legislative policies.
- the team refined and narrowed their scope in a justified way, while recognising the larger range of users and the wider service needs they will need to accommodate to solve the whole problem for users. The wider user groups (carriers and brokers in particular) are to be dealt with by a distinct team, in close collaboration with this team, and potentially with some of these team members splitting off to join it.
- the approach to technology is unified at Department level through a Common Delivery Platform. This allows the service to make effective use of existing patterns in the development, deployment, and support of the service from the early stages.
- the team demonstrated a robust understanding of the problem space and a focussed, targeted approach to solving the problem without being discouraged by the scope and complexity of waste tracking.
1. Understand users and their needs
Decision
The service was rated amber for point 1 of the Standard.
During the assessment, we didn’t see evidence of:
- understanding what percentage of the 15,000 receivers are already using software, and who therefore might be able to use the primary solution being developed. Current figures were based on around 150 respondents, which show users adopting software just over 50% of the time (though much less often amongst smaller receivers). Market research is essential for the team to understand how widely adopted the service will be, and how many users will be relying on the spreadsheet option versus the API. The results of that market research do not have to be interpreted as undermining the API solution; rather, they would help to put the various aspects of the service in their appropriate light.
- this information is also important for measuring success.
- we appreciate it is very difficult to get access to users who are sceptical of, or nervous about, collaborating with a regulator. Nevertheless, further effort should be made to include hard-to-reach people in the private beta, especially as they are likely to be non-software users. Incentives should be considered.
2. Solve a whole problem for users
Decision
The service was amber for point 2 of the Standard.
Optional advice to help the service team continually improve the service:
- the team should discuss with the service assessment team how to ensure the onboarding and payment parts of their service meet the alpha requirements. This should happen before the service goes into private beta. The team could also consider additional internal critiques and reviews within Defra to help them make use of existing knowledge and design patterns.
- the team have narrowed their scope in line with the narrowing of the regulation, to focus first on waste receivers, who are the first user group to be brought under this legislation. They have provided options for a) users of waste tracking software, b) users who do (or can) manage their obligations via a spreadsheet, and c) users who need to be exempted from the service and who will continue submitting paper quarterly returns.
3. Provide a joined-up experience across all channels
Decision
The service was rated green for point 3 of the Standard.
4. Make the service simple to use
Decision
The service was rated amber for point 4 of the Standard.
During the assessment, we didn’t see evidence of:
- what impact changes to data requirements will have, if any, on the offline journeys. The team should consider how the different data will affect regulators when auditing or taking action against a receiver. They should also attempt to find and work with likely users of the offline processes to better understand their needs and any potential behavioural change that could come about.
- for users relying on third-party software, the ease of use of the service will come down to whether they can enter any extra information into the software they are currently using, and whether registration and payment are difficult. Much of this is outside the control of the waste tracking service team.
- for users relying on the spreadsheet option, this needs to be tested thoroughly in private beta, to ensure the majority of users can set this up correctly. The service team were aware of the importance of testing this.
5. Make sure everyone can use the service
Decision
The service was rated amber for point 5 of the Standard.
Optional advice to help the service team continually improve the service
- the team should prioritise research and design time for developing the onboarding and payment journeys for the new service, taking learnings from the work the previous team have done. They should explore whether offline users are likely to need to register and pay, and what this looks like for them.
- by the beta assessment, the team needs to show that the majority of users can configure their online spreadsheet correctly, understand how registration works, understand the use of API keys, and understand how payment works.
- the user researchers reported speaking with some participants who could not read or write. Working with the widest range of possible users will continue to be important going forward, so we avoid making a government service that is only accessible to market players of a given size or sophistication.
6. Have a multidisciplinary team
Decision
The service was rated green for point 6 of the Standard.
Optional advice to help the service team continually improve the service
- ensure there is more user research and design involvement from the core Defra team, both to assure retention of knowledge and to have space to capture any big-picture opportunities for transformation and avoid duplication across the whole service line
7. Use agile ways of working
Decision
The service was rated amber for point 7 of the Standard.
During the assessment, we didn’t see evidence of:
- the team considering full user journeys for the service that will be going into beta. The panel understood that work on the registration and payment elements took place prior to this work and was parked for this alpha, with plans to restart it during private beta. Before the service proceeds to its next assessment, these parts of the journey need to be alpha assessed, as they include some key transactions for users and cover user groups that will have different user needs. The team needs to work with the GDS assessment team to agree what this will look like.
8. Iterate and improve frequently
Decision
The service was rated amber for point 8 of the Standard.
During the assessment, we didn’t see evidence of:
- the team getting access to enough real users to have strong confidence that the final prototype they are taking into private beta will work for the majority of users
9. Create a secure service which protects users’ privacy
Decision
The service was rated green for point 9 of the Standard.
Optional advice to help the service team continually improve the service
Before their Beta assessment the team should:
- schedule an ITHC, as discussed during the tech pre-call
- develop in full their data recovery approach with the third parties involved
10. Define what success looks like and publish performance data
Decision
The service was rated green for point 10 of the Standard.
11. Choose the right tools and technology
Decision
The service was rated green for point 11 of the Standard.
12. Make new source code open
Decision
The service was rated green for point 12 of the Standard.
Optional advice to help the service team continually improve the service
- the service team should ensure that the other components that will be developed as part of this service (e.g. registration, payments) are released as open source code.
13. Use and contribute to open standards, common components and patterns
Decision
The service was rated green for point 13 of the Standard.
Optional advice to help the service team continually improve the service
- the service team, before beta, should have clear plans for the possible transition of the DEFRA ID component to OneLogin, and explain how any service disruption will be minimised and how the migration will impact service users.
14. Operate a reliable service
Decision
The service was rated green for point 14 of the Standard.
Optional advice to help the service team continually improve the service
- ahead of beta, the service team should identify the impact on service reliability of the different components of the service that were not part of this assessment (e.g. registration, payments), and show how that impact is minimised.
Next Steps
This service can now move into a private beta phase, subject to addressing the amber points within four to six months and, if applicable, GDS spend approval. Specifically for standard point 7, we understand that the team may need more time; this condition should be closed before coming to the private beta assessment, and the specifics should be agreed with the GDS assessment team.
To get the service ready to launch on GOV.UK the team needs to:
- get a GOV.UK service domain name
- work with the GOV.UK content team on any changes required to GOV.UK content