Find Education and Training

DfE's Find Education and Training (FEAT) beta assessment report


Assessment date: 10/02/2026
Assessment stage: Beta
Assessment type: Assessment
Service provider: Find Education and Training
Result: Amber

Service description

The service’s mission is to help all learners make informed choices when finding and applying for education and training opportunities that are right for them, ultimately improving their quality of life. The service’s vision is to surface all government-funded education and training opportunities for 16 to 18-year-old and adult learners in one place, and to enable them to assess which are most suitable for their unique circumstances.

Things the service team has done well:

  • The team presented their work, including its scope and constraints, clearly. However, the lack of a decision on where this work will sit made the service difficult to assess. 
  • The panel have noted recommendations that will be easily resolved once a decision is made, but that are not yet in place should FEAT remain standalone. 
  • The panel were impressed by the team’s culture, which shone through, and by the work they’ve done to remove friction for users in their search for their next step. 
  • It was also good to see research being shared internally (for example, from the care leavers work, to understand the barriers) and the work being showcased externally, including at SCS conferences and with GDS AI chat and the MOD Skills Passport.
  • The tech assessor was particularly keen to give a big thumbs up for the team’s use of GitHub as a live repo, and their commitment to doing real open source coding in the truest sense.

1. Understand users and their needs

Assessed by: User research assessor (and design assessor when relevant)

Decision

The service was rated green for point 1 of the Standard.

Optional advice to help the service team continually improve the service:

  • The panel were impressed with the strength of understanding of the user needs for the primary user groups: young people, parents and adult learners. The mindset spectrum added a dynamic layer and nuance to each user group, providing a richer picture of how each behaves in reality. The team also demonstrated an expanded repertoire of research methods, supplementing usability testing with unmoderated testing and survey feedback, and effectively demonstrated the pain points shared across groups.

2. Solve a whole problem for users

Assessed by: Design assessor (with input from research, and lead assessor where relevant)

Decision

The service was rated green for point 2 of the Standard.

Optional advice to help the service team continually improve the service:

  • While the service team presented strong evidence of how they have improved the search functionality, it remained unclear how users will successfully use the search results. The team shared evidence of users writing things down and moving to a different website to continue the journey. The panel wondered if there is an outstanding need around shortlisting and saving results that could be explored.

3. Provide a joined-up experience across all channels

Assessed by: Design assessor (with input from research, and lead assessor where relevant)

Decision

The service was rated green for point 3 of the Standard.

4. Make the service simple to use

Assessed by: Design assessor


Decision

The service was rated green for point 4 of the Standard.

5. Make sure everyone can use the service 

Assessed by: Design assessor with user research assessor input   

Decision

The service was rated amber for point 5 of the Standard.

This is amber because:

  • The team must confirm where the service will be hosted, as the lack of a decision means they cannot yet show that the final version will meet accessibility requirements. Internal accessibility reviews are useful, but they do not replace the need for a full accessibility audit of the beta service in the environment where users will actually access it.
  • The team must complete a full accessibility audit of the beta service and provide sufficient evidence of testing with users with access needs. These were clear expectations set at alpha and must be met before the service goes live in public beta.
  • The team confirmed that the National Careers Service (NCS) contact centre will provide user support, including handling queries and carrying out live searches on behalf of the user. If the final decision is for the service to operate as a standalone product, or if a combination of access routes is used, the team must ensure that NCS support options are clearly visible on the service.

6. Have a multidisciplinary team

Assessed by: Lead assessor

Decision

The service was rated amber for point 6 of the Standard.

This is amber because:

  • The team have had all the skills needed in their multidisciplinary team as they have developed FEAT. The panel notes the decision not to use full-time roles during private beta where they were not needed, and the commitment to using T-shaped people to cover these roles. However, the team confirmed that the National Careers Service (NCS) contact centre will provide user support, including handling queries and carrying out live searches on behalf of the user. If the final decision is for the service to operate as a standalone product, or if a combination of access routes is used, the team must ensure that NCS support options are clearly visible on the service.

7. Use agile ways of working

Assessed by: Lead assessor

Decision

The service was rated green for point 7 of the Standard.

Optional advice to help the service team continually improve the service:

  • The panel were pleased to note reflective practice and a strong team culture.

8. Iterate and improve frequently

Assessed by: Lead assessor with input from user research, design and performance analyst when relevant

Decision

The service was rated green for point 8 of the Standard.

Optional advice to help the service team continually improve the service:

  • The panel noted significant design iterations as the service has developed. The service team provided good examples of improvements to, and testing of, the semantic search function. As language evolves, the team must have a plan for continuous testing to ensure data quality and relevance ranking.

9. Create a secure service which protects users’ privacy

Assessed by: Tech assessor with performance analyst input when relevant

Decision

The service was rated green for point 9 of the Standard.

10. Define what success looks like and publish performance data

Assessed by: Lead assessor at alpha. Performance analyst at beta and live

Decision

The service was rated green for point 10 of the Standard.

Optional advice to help the service team continually improve the service:

  • The team gave a clear account of how analytics is set up for the service, and the level and range of data they have access to. They gave good examples of how data has been used to make tangible improvements to the service, and they outlined some of their plans for developing their analytics capability in the future.

11. Choose the right tools and technology

Assessed by: Tech assessor 

Decision

The service was rated green for point 11 of the Standard.

12. Make new source code open

Assessed by: Tech assessor

Decision

The service was rated green for point 12 of the Standard.

13. Use and contribute to open standards, common components and patterns

Assessed by: Tech assessor for open standards, components and design assessor for design system

Decision

The service was rated green for point 13 of the Standard.

14. Operate a reliable service

Assessed by: Tech assessor with input from lead and design assessors

Decision

The service was rated green for point 14 of the Standard.

Updates to this page

Published 10 April 2026