Civil Service Apprenticeships - Beta Assessment

The report from the beta assessment of Civil Service Resourcing's Civil Service Apprenticeships service on 21 January 2016.

Stage: Beta
Result: Met
Service provider: Civil Service Resourcing (CSR)

Outcome of service assessment

The assessment panel has concluded the Civil Service Apprenticeships service has shown sufficient progress and evidence of meeting the Digital Service Standard criteria and should proceed to launch as a Beta service on a domain.

Overall, the panel was impressed by the significant amount of progress that had been made since the alpha assessment.

The team demonstrated that they have done a significant amount of good user research during alpha and have a much deeper understanding of the different user groups. The team were able to clearly articulate the user needs and provided many examples of how the service had been iterated to meet them. It was great to see the way the team have challenged internally, in response to feedback from users, which has resulted in a much clearer and simpler way to request diversity information from users.

The team have tested the full end-to-end journey, including the initial ‘homepage’. More research and iteration is needed, but the team demonstrated a good understanding of how users are likely to find the service and choose an apprenticeship. The team also have good evidence that users are able to successfully complete the end-to-end journey first time.

The team also demonstrated that they now have a much better understanding of assisted digital users and their needs. As a result, they have put in place support through the Post Office and Job Centres to help people use the digital service. We would encourage the team to keep testing and iterating this in beta.

The team demonstrated they were working in an agile way and had the ability to quickly respond to user needs. There is a sustainable team in place with a service manager who is empowered to make decisions to improve the service.

The transaction is consistent with the GOV.UK design patterns and style guide. Some minor details are raised in the Design and Content Annex.

From a technical standpoint, the team demonstrated a sound understanding of their relationship with the HMRC Tax Platform and how this impacts both the management and development of their application.

The team understood that data stored in the platform is of a sensitive nature and had taken steps to control and limit access to it.

Currently, the code is not open. The team are aware of the open sourcing policy at HMRC and are aiming to complete the open sourcing of most of the repositories immediately. The panel would like to see this work completed.

The team have a clear idea of how they plan to measure the performance of the service. The team need to continue working with the performance dashboard team to ensure data is being uploaded and published to the performance dashboard.

Recommendations

  • Complete work with GOV.UK to use the campaign format for the ‘homepage’ rather than the existing organisation format

  • Complete the design and content recommendations raised in the Design and Content Annex

  • Continue working with the GDS performance platform team to ensure that performance data is being published

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met

Published 22 December 2016