Standards and Testing Agency (STA) service Beta assessment

Service Standard assessment report

Standards and Testing Agency (STA) service


Assessment date: 17/10/2024
Stage: Beta
Type: Assessment
Result: Amber
Service provider: Standards and Testing Agency (STA)

Service description

This service is housed within GOV.UK. It lets practitioners administer primary school assessments, streamlining how assessments are developed and administered and how results are received, and making the end-to-end process more efficient and sustainable while maintaining security and high assessment standards.

The service is made up of 3 parts which serve different functions: Start, Manage and Develop. 2 of these are relevant for the GDS assessment:

  • Manage: where practitioners register pupils, manage registered pupils and submit a headteacher’s declaration form
  • Start: where practitioners pair devices in preparation for an assessment, complete training, read the administration guidance, and administer the assessment

1. Understand users and their needs

Decision

The service was rated green for point 1 of the Standard.

Things the service team has done well:

  • Differentiated user needs reflecting the different user groups had been developed, maintained, validated and iterated throughout the private beta
  • User research samples were large, varied and representative, ensuring the needs of all users had been taken into account
  • User personas had been developed to capture and communicate the different goals, behaviours, needs and pain points of the user groups
  • User needs had been prioritised using a team-based approach, considering the goals of the service and the needs of users

Optional advice to help the service team continually improve the service:

  • Continue to test the service with users most at risk of being disadvantaged by a digital service due to digital poverty and digital exclusion. This includes schools and organisations with financial limitations that make digital upkeep and maintenance challenging, and those in areas with poor connectivity.

2. Solve a whole problem for users

Decision

The service was rated green for point 2 of the Standard.

3. Provide a joined-up experience across all channels

Decision

The service was rated green for point 3 of the Standard.

4. Make the service simple to use

Decision

The service was rated amber for point 4 of the Standard.

During the assessment, we didn’t see evidence of:

  • a review of the use of non-GOV.UK components and patterns to ensure there is a valid reason for each deviation, for example the use of ‘Next’ rather than ‘Continue’ in green buttons, and the use of capitalised text in tags (‘COMPLETED’). The team should set up a design system to organise and explain the use of these components (a sketch of the standard components is shown after this list).
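
As a reference point, here is a minimal sketch of the standard GOV.UK Design System markup for the two components mentioned above, wrapped in illustrative TypeScript constants (the constants and module shape are assumptions; the component classes come from the published Design System):

```typescript
// Illustrative only: standard govuk-frontend markup for a primary button and a
// status tag. The Design System's default button label is "Continue", and tags
// use sentence case (for example "Completed") rather than all caps.
export const continueButton: string = `
  <button type="submit" class="govuk-button" data-module="govuk-button">
    Continue
  </button>`;

export const completedTag: string = `
  <strong class="govuk-tag govuk-tag--green">
    Completed
  </strong>`;
```

Where the team decides to keep ‘Next’ or capitalised tags, recording the rationale in the proposed design system documentation would make each deviation easier to review.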

5. Make sure everyone can use the service

Decision

The service was rated amber for point 5 of the Standard.

During the assessment, we didn’t see evidence of:

  • The team needs to redouble its efforts to recruit headteachers and practitioners with access needs, including users of assistive technology, into future user research and testing, as well as finding proxy users
  • The team needs to bring the service up to level AA of the WCAG 2.2 standard during the next phase of work (a sketch of an automated check that could support this is shown after this list).
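
Automated checks will not replace a manual audit or research with users of assistive technology, but they can track progress towards WCAG 2.2 AA between audits. A minimal sketch, assuming Playwright with the @axe-core/playwright package and a placeholder page URL:

```typescript
// Minimal sketch: run axe-core against a page and fail the test on any
// detectable WCAG 2.2 AA violations. The URL is a placeholder, not the real service.
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('pupil registration page has no detectable WCAG 2.2 AA violations', async ({ page }) => {
  await page.goto('https://example.service.gov.uk/register-pupils'); // hypothetical URL

  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa', 'wcag21aa', 'wcag22aa']) // A and AA rule sets
    .analyze();

  expect(results.violations).toEqual([]);
});
```

Axe-based checks only detect a subset of WCAG failures, so the recruitment of users with access needs remains the priority.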

6. Have a multidisciplinary team

Decision

The service was rated green for point 6 of the Standard.

Things the service team has done well:

  • The panel felt there was a good spread of profession groups, and regular, consistent communication provided well-informed transparency.
  • The panel felt that the core team engaged well and that links to other functions outside it (such as data teams) were well maintained.
  • The panel felt that stakeholders were well informed and there was a clear path to escalate issues and risks.

Optional advice to help the service team continually improve the service:

  • Ensure the core team is maintained as part of the transition period.
  • Continue to work with policy teams to ensure alignment with business need.

7. Use agile ways of working

Decision

The service was rated green for point 7 of the Standard.

Things the service team has done well:

  • The panel was impressed that scrum ceremonies were taking place at the right cadence.
  • The panel felt that regular show and tells for the business, alongside internal sprint reviews, were well executed and delivered a good level of information to the differing audiences.
  • The panel felt that the cadence for each iteration was appropriate and captured the right level of feedback for design teams.

Optional advice to help the service team continually improve the service:

  • Ensure the team is sustainable in its current form to move into public beta and subsequent live deployment.

8. Iterate and improve frequently

Decision

The service was rated green for point 8 of the Standard.

Things the service team has done well:

  • The panel felt that the level of iteration was good and offered enough feedback to design teams.
  • The panel was impressed with the feedback and changes made, and with the communication provided as justification for the work to continue.
  • The panel felt there was a good level of adaptation to changing circumstances and environments.
  • The team provided evidence of iterative improvement driven by data insights, including identifying pain points from file validation errors and adjusting the CSV templates accordingly (an illustration of this kind of validation is sketched after this list).
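
As an illustration of the kind of feedback loop described above, the sketch below validates rows from an uploaded pupil CSV and collects row-level error messages that could both help practitioners and be analysed to refine the template. The column names and rules are hypothetical, not the service's actual template:

```typescript
// Hypothetical pupil-registration CSV columns; the real template will differ.
export interface PupilRow {
  forename: string;
  surname: string;
  dateOfBirth: string; // expected as YYYY-MM-DD in this sketch
}

export interface RowError {
  row: number;     // 1-based row number in the uploaded file
  message: string; // plain-English message suitable for showing to practitioners
}

export function validateRows(rows: PupilRow[]): RowError[] {
  const errors: RowError[] = [];
  rows.forEach((row, index) => {
    const rowNumber = index + 1;
    if (!row.forename.trim()) {
      errors.push({ row: rowNumber, message: 'Enter a forename' });
    }
    if (!row.surname.trim()) {
      errors.push({ row: rowNumber, message: 'Enter a surname' });
    }
    if (!/^\d{4}-\d{2}-\d{2}$/.test(row.dateOfBirth)) {
      errors.push({ row: rowNumber, message: 'Date of birth must be in the format YYYY-MM-DD' });
    }
  });
  return errors;
}
```

Counting which messages occur most often across uploads gives the team the same kind of data insight it has already used to adjust the CSV templates.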

Optional advice to help the service team continually improve the service:

  • Ensure iterations can continue and the team is in place moving into public beta.
  • Continue to focus on refining metrics and using them to measure future iterations; this will be key to driving iterative improvement as the service scales up.

9. Create a secure service which protects users’ privacy

Decision

The service was rated green for point 9 of the Standard.

Things the service team has done well:

  • The team have a clear understanding of the sensitivity of the data they are handling, have designed an architecture for their service that is appropriate for this, and have engaged with the relevant organisational processes.

10. Define what success looks like and publish performance data

Decision

The service was rated green for point 10 of the Standard.

Things the service team has done well:

  • The team has demonstrated an appropriate approach to measuring success by focusing on both the mandatory KPIs for publication on GOV.UK and more granular success measures, including the number of non-engaging schools, the number of training modules completed per school and the length of time spent on specific tasks. This will allow for a deeper understanding of user engagement.
  • It is also positive to see that user profiles have been considered when defining measurement KPIs.
  • A variety of tools are being used to track KPIs and insight, including Azure Application Insights, GOV.UK page analytics, questionnaires and help desk data.
  • The team has shown they have considered the difference in behaviour across the two different device types used. The ideation session held to address device flipping issues is an effective step towards improving the user experience.

Optional advice to help the service team continually improve the service:

  • The team could consider collating all their KPIs into a performance framework document (goal, success and failure points, along with theoretical metrics) for each user group (a sketch is shown after this list). This will ensure KPIs can be refined and reviewed when required. It will also help to keep track of the measurement strategy in place and provide a central place for the team to record it.
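
A minimal sketch of what one entry in such a framework could look like as structured data, with entirely hypothetical values for a single user group; the fields follow the goal, success and failure points, and metrics shape suggested above:

```typescript
// Hypothetical performance framework entry for one user group.
// The values below are illustrative, not the team's actual KPIs.
interface PerformanceFrameworkEntry {
  userGroup: string;
  goal: string;
  successPoint: string;
  failurePoint: string;
  metrics: string[]; // the theoretical metrics that would evidence the goal
}

const practitionerEntry: PerformanceFrameworkEntry = {
  userGroup: 'Practitioners',
  goal: 'Register all eligible pupils before the assessment window opens',
  successPoint: 'Registration completed without contacting the help desk',
  failurePoint: 'Registration abandoned or completed only after help desk intervention',
  metrics: [
    'Proportion of schools completing registration on time',
    'Number of file validation errors per upload',
    'Help desk tickets tagged as registration issues',
  ],
};
```

Keeping entries like this in one document makes it straightforward to review which metrics are still earning their place as the service scales.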

11. Choose the right tools and technology

Decision

The service was rated green for point 11 of the Standard.

Things the service team has done well:

  • The team have a good understanding of how availability of devices and network connectivity will impact users of their service and have made good technical decisions to support this.
  • The team have tested different technical approaches to their “two device” requirement and have iterated this based upon evidence from testing.

12. Make new source code open

Decision

The service was rated amber for point 12 of the Standard.

During the assessment, we didn’t see evidence of:

  • The team have a plan for making their code open source but have not done so yet. The team must release their code in an open-source repository under an appropriate open-source licence, except where there is an articulated reason for being unable to do so.

13. Use and contribute to open standards, common components, and patterns

Decision

The service was rated green for point 13 of the Standard.

14. Operate a reliable service

Decision

The service was rated amber for point 14 of the Standard.

During the assessment, we didn’t see evidence of:

  • The team gather a large amount of log and telemetry data but have not yet identified which metrics are important to monitor and alert on to have confidence that the service is working well. The team must identify these metrics and configure appropriate alerting on them (a sketch of possible alert rules is shown after this list).
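
A minimal sketch of the kind of alert rules the team could define once the important metrics are chosen. The metric names and thresholds are assumptions for illustration; in practice these would more likely be configured in the monitoring tooling that sits over the telemetry already collected (for example Azure Monitor alerts on Application Insights data) than in application code:

```typescript
// Illustrative alert rules over service health metrics. Metric names and
// thresholds are assumptions, not the team's actual monitoring configuration.
interface AlertRule {
  metric: string; // name of the metric in the telemetry store
  threshold: number;
  comparison: 'above' | 'below';
  description: string;
}

const alertRules: AlertRule[] = [
  { metric: 'check_submission_failure_rate', threshold: 0.02, comparison: 'above',
    description: 'More than 2% of assessment submissions failing' },
  { metric: 'device_pairing_success_rate', threshold: 0.95, comparison: 'below',
    description: 'Fewer than 95% of device pairing attempts succeeding' },
  { metric: 'api_p95_response_time_ms', threshold: 2000, comparison: 'above',
    description: '95th percentile response time over 2 seconds' },
];

// Return the rules that should fire given the latest value of each metric.
function rulesToAlert(latest: Record<string, number>): AlertRule[] {
  return alertRules.filter((rule) => {
    const value = latest[rule.metric];
    if (value === undefined) return false;
    return rule.comparison === 'above' ? value > rule.threshold : value < rule.threshold;
  });
}
```

Starting from a small set of user-facing failure modes like these keeps the alerting focused on whether the service is working for schools, rather than on everything the logs happen to contain.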

Next Steps

This service can now move into a public beta phase, subject to addressing the amber points within three months and to CDDO spend approval.

This service now has permission to launch on a GOV.UK service domain with a beta banner. These instructions explain how to set up your *.service.gov.uk domain.

The service must pass a Live assessment before:

  • turning off a legacy service
  • reducing the team’s resources to a ‘business as usual’ level, or
  • removing the ‘beta’ banner from the service

Updates to this page

Published 1 October 2025