Find Information about Products and Services

DfE's Find Information about Products and Services beta assessment report


Assessment date 15/12/2025
Assessment stage Beta
Assessment type Assessment
Service provider Find Information about Products and Services
Result Red

Service description

This service aims to help DDT users access product and service information, reducing duplication and speeding up delivery in line with our strategic ambitions.

In the longer term, the service aims to provide a centralised, structured source of truth for all services and products across DfE, enabling reuse, reducing duplication and speeding up delivery.

Service users

This service is for DDT users.

Things the service team has done well:


  • It was great to meet the team and see how personally invested they were in creating this product - it was clear they believe it can make a real difference within DfE, enabling collaboration and savings.
  • Overall, while we thought the service is on the right track, the assessment came too early: there is clear evidence that only limited iteration has taken place so far, several use cases outlined as priorities have not been followed up on, and some governance (including technical governance) is not as advanced as it should be at this stage.
  • The assessment panel thought the somewhat staggered progress was typical of services built as part-time initiatives by groups of passionate individuals, but without the necessary mandate to make the work their priority (and to unlock collaboration from the rest of the organisation).
  • This was especially clear from the difficulty with data collection (currently at c.40%), which in turn affects the take-up from users and the team’s ability to both learn and prove the value proposition of the service.

1. Understand users and their needs

Assessed by: User research assessor (and design assessor when relevant)

Decision

The service was rated amber for point 1 of the Standard.

This is amber because: 

  • The team must conduct research with people using the service in the real world. This will answer outstanding questions about whether the service meets their needs and what scenarios people are using it in. This should include people updating the service and those searching it.
  • The team must use their newly installed data analytics to drive the research. Understanding what is happening will enable them to focus on understanding the why.
  • The team must include people with additional needs in every research round.
  • The team must test the support model.
  • The team must research with users who will be onboarded during the public beta phase e.g. policy colleagues. It is important the team understand their needs before they begin to use the service.

2. Solve a whole problem for users

Assessed by: Design assessor (with input from research, and lead assessor where relevant)

Decision

The service was rated amber for point 2 of the Standard.

This is amber because:

  • The team must understand the journey pre- and post-service, how people navigate to the service, and their expectations.
  • The team must define the start page for the service.

3. Provide a joined-up experience across all channels

Assessed by: Design assessor (with input from research, and lead assessor where relevant)

Decision

The service was rated amber for point 3 of the Standard.

This is amber because:

  • The team must define how users will find the service, and create clear entry points and a start page.
  • The team must establish performance metrics and monitor dropouts.
  • The team must identify a sustainable route for support for the service once live.

4. Make the service simple to use

Assessed by: Design assessor


Decision

The service was rated amber for point 4 of the Standard.

This is amber because:

  • The team must integrate performance analytics into the service, monitor them, and use them to support continuous improvement.
  • The team must undertake user research to understand any identified performance issues.

5. Make sure everyone can use the service 

Assessed by: Design assessor with user research assessor input   

Decision

The service was rated amber for point 5 of the Standard.

This is amber because:

  • The team must ensure that all identified accessibility issues that can be resolved by the project team are resolved.
  • The team must test with users who have additional barriers to access, and users of assistive technologies.
  • The team must continue to work with Microsoft to resolve accessibility issues.

6. Have a multidisciplinary team

Assessed by: Lead assessor

Decision

The service was rated amber for point 6 of the Standard.

This is amber because:

  • While it is clear that the team is highly skilled and bought into the importance of this work, the speed of progress in some areas is symptomatic of products developed part-time and without the necessary mandate. If the product is as key to broader transformation within DfE as indicated it should be prioritised and resourced appropriately, and the messaging to teams about data completion should be strengthened.

7. Use agile ways of working

Assessed by: Lead assessor

Decision

The service was rated green for point 7 of the Standard.

8. Iterate and improve frequently

Assessed by: Lead assessor with input from user research, design and performance analyst when relevant 

Decision

The service was rated amber for point 8 of the Standard.

This is amber because:

  • The team must take more time to iterate the service. The team has not yet been able to validate several of the user needs they presented to us during private beta, for example the desire to find services that meet similar needs, use the same components, or rely on the same technology. Without this, users must rely either on product names being sufficiently descriptive, which by the team's own admission is often not the case, or on the free-text (but unsearchable) description field.

9. Create a secure service which protects users’ privacy

Assessed by: Tech assessor with performance analyst input when relevant

Decision

The service was rated amber for point 9 of the Standard.

This is amber because:

  • A threat analysis has been performed, but a more thorough threat model, together with the necessary mitigation plan, has not yet been completed and would benefit the service.

10. Define what success looks like and publish performance data

Assessed by: Lead assessor at alpha. Performance analyst at beta and live

Decision

The service was rated green for point 10 of the Standard.

11. Choose the right tools and technology

Assessed by: Tech assessor 

Decision

The service was rated green for point 11 of the Standard.

12. Make new source code open

Assessed by: Tech assessor

Decision

The service was rated green for point 12 of the Standard.

13. Use and contribute to open standards, common components and patterns

Assessed by: Tech assessor for open standards, components and design assessor for design system

Decision

The service was rated green for point 13 of the Standard.

14. Operate a reliable service

Assessed by: Tech assessor with input from lead and design assessors

Decision

The service was rated red for point 14 of the Standard.

This is red because:

  • The team does not have a fully agreed operating model and there is no clear plan for how the service will be maintained, supported and improved.
  • The delivery team lacks dedicated technical resources, which are urgently needed for the service to be properly supported, maintained, iterated and improved. While the delivery team has demonstrated they have done everything they can within the existing constraints, and their can-do attitude is commendable, there must be a long-term plan detailing how the service will be maintained and supported within the organisation; technical resources are key to this.
  • DfE must now decide whether the service should be prepared for handover to a business-as-usual function, or whether it will be maintained and supported within the delivery team, given the expertise they hold. If the latter option is chosen, dedicated technical resources must be provided.

Updates to this page

Published 10 April 2026