Apply for a Job in the Civil Service alpha assessment

Service Standard assessment report

Apply for a Job in the Civil Service

From: Central Digital & Data Office (CDDO)
Assessment date: 02/02/2022
Stage: Alpha
Result: Met
Service provider: Cabinet Office

Service description

The digital recruitment platform has been in use since 2010 and is moving into its fourth iteration. It uses a Software as a Service partner for cost and time efficiency reasons.

In the last 12 months, Civil Service Jobs handled four million users, one million job applications and 195,000 job advertisements. The current contract for the platform lasts until 2025. This gives the Government Business Services team an opportunity to renew the service offer and ensure that it still aligns with user needs. Their discovery highlighted a need to simplify the user experience in key areas and improve hiring times throughout the recruitment journey.

In scope: a front-end interface for candidates, where they can search for and apply for jobs, and a back-end interface, where department users (vacancy holders and recruiters) can advertise jobs, sift candidates and complete background checks.

Not in scope: specialised recruitment services such as the Fast Stream graduate programme, Apprenticeships, or public appointments. Separate services like Online Test and Candidate Assessments and Background Checks will use the new platform, but they are not the focus of this assessment.

Service users

This service is for:

  • Candidates - users searching and applying for Civil Service jobs
  • Vacancy holders - users who have a job they need to fill
  • Recruiters - users who manage and advise on recruitment campaigns, helping both vacancy holders and candidates

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated a good understanding of their users, recognising the different needs their users have and recording these clearly in research artefacts such as journey maps and personas.
  • the team used a range of methodologies to gather user insight, making good use of both quantitative and qualitative research to uncover user needs and pain points.
  • evidence from user research was used to develop the service design so that it aligns with user needs.
  • user journey maps were clear and helped to demonstrate the end-to-end journey for the user.

What the team needs to explore

Before the next assessment, the team needs to:

  • ensure research artefacts clearly and accurately differentiate between user needs and user wants so that the team understands what should be prioritised.
  • write user need statements that include the reason behind the need, to help build empathy and understanding with the user. For example, the statement ‘I want to start my new role as soon as possible’ should include detail on why this is important to the user.
  • continue to develop the user journey maps. The team identified that different user groups within the 3 main personas (candidate, vacancy holder, recruiter) may have different needs and pain points - use journey maps to demonstrate how the journey differs for these groups to ensure the needs of all users are addressed.

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the start and end of the service have been well thought through.
  • relevant parts of this service can all be brought together in a way that makes sense to users - adverts, application information, tests, background checks, etc.
  • the team clearly recognises how the vacancy holder work impacts on the applicant’s experience and has identified ways to improve this - including templates, word limits and reducing the ‘wall of content’ that candidates can face.
  • the team recognises that candidates may apply for many different roles and found ways to reuse information candidates have already submitted.
  • the team has thought about how the shared services programme is an opportunity for reducing data double entry and is keeping connected to this work.
  • routes into the existing live service and volumes are understood.

What the team needs to explore

Before the next assessment, the team needs to:

  • show how it’s going to work with communities to build ‘civil service templates’ and how these will be maintained without becoming overbearing to either the Cabinet Office team or vacancy holders.
  • show how they have found a supplier that is not only capable of implementing what the team has designed so far but can also continue to improve the service after beta.
  • show whether their work has influenced departments not currently using the service to adopt the future version.
  • show how the job templates, word limits and other interventions such as ‘nudges’ or support to vacancy holders have led to simpler and clearer job adverts for candidates.
  • show it has continued to align to the shared services programme and can demonstrate information can move from the recruitment service into HR systems.

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team understands how the actions of vacancy holders and recruiters impact on the experience of candidates and is designing with this in mind.
  • the team runs Show and Tells where the current live service team and recruiters can hear about the team’s findings and work.
  • the team has experimented with various ways for staff to complete pre-employment check work and how their tasks and messages to applicants can all be brought together as part of the service.

What the team needs to explore

Before the next assessment, the team needs to:

  • show which journeys they have designed and tested and which haven’t yet been designed.
  • explain how offline applications will work with the new service.
  • do more testing of the pre-employment check work, especially from a workflow perspective, as the alpha prototype seems to have focused on a few tasks for one candidate.

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • designs have been iterated based on research insights.
  • the team has used appropriate government design systems and components to produce their prototypes and will require this of their supplier.
  • designs consider that users may be unfamiliar with the process or infrequent users of the service.
  • the team has identified how existing data (previous applications, assessments and security checks) can be used to make things faster and easier for users.
  • pain points from the live service, such as the lack of a CV upload, have been identified and designs tested.
  • content improvements have been identified from user research to make it easier for users to do what they need to do.

What the team needs to explore

Before the next assessment, the team needs to:

  • show that the service will work on a range of different devices.
  • show how users in private beta have found the whole end-to-end process easier.
  • test how vacancy holders deal with choosing from a variety of different templates.

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • barriers to people using the Reasonable Adjustments option have been identified.
  • the team has considered the needs of users with access needs and conducted research to ensure the service is accessible to all.
  • the specific needs of users with access needs have been identified, recognised and responded to.
  • the team can explain how users will get support, with different levels of support available depending on the problem.
  • the team has tested the supplier market for compliance with accessibility requirements. WCAG compliance will be part of the procurement process and there are private beta plans for further testing.

What the team needs to explore

Before the next assessment, the team needs to:

  • demonstrate how they’ve found ways to encourage more people who are eligible for Reasonable Adjustments to use the option.
  • ensure research is conducted with adequate representation of users with access needs across each of the user groups.
  • show that they have procured a supplier that can both make the service accessible and continue to iterate and improve it in the future.

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has an appropriate mix of digital, data and technology (DDaT) Civil Servants and external contractors.
  • the team has strong subject matter experts who have provided guidance and support on their alpha work. This means they can get access to specialist help when needed.
  • the team’s technical architect will work closely with the chosen supplier.

What the team needs to explore

Before the next assessment, the team needs to:

  • show the knowledge transfer plan for when they bring in the external supplier, to ensure the ways of working are sustainable.

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • journeys that need the most improvement have been prioritised.
  • the team has started with design hypotheses and proved or disproved these through research and design.

What the team needs to explore

Before the next assessment, the team needs to:

  • explore their ways of working with the Live Services team and the sharing of best practice.

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has continually iterated prototypes during alpha.
  • the team has considered the ability to iterate at the configuration level of the supplier during private beta.

What the team needs to explore

Before the next assessment, the team needs to:

  • ensure that the team will be able to iterate frequently in private beta, including when the supplier needs to make changes to their system to support requirements.

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is considering security in their choice of supplier.
  • the team is working with the National Cyber Security Centre (NCSC).
  • the team is considering the main threats against the service.
  • the team is planning to follow security-related best practices in private beta.

What the team needs to explore

Before the next assessment, the team needs to:

  • ensure that the use of tag managers for analytics purposes is tightly controlled, both in terms of who has access and what those tag managers are able to do. At a minimum, the site should have a strict Content Security Policy (see the illustrative sketch below).
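
A strict Content Security Policy limits where scripts can load from and what they can do, which constrains the impact of a compromised or misconfigured tag manager. The sketch below is a minimal illustration only: the framework (Flask), the directive values and the allowed analytics domains are assumptions, not details of the team's chosen stack.

    # Minimal sketch: applying a strict Content Security Policy to every response.
    # Flask, the directive values and the analytics domains are illustrative
    # assumptions; the real policy must reflect the tag manager and analytics
    # tools actually in use.
    from flask import Flask

    app = Flask(__name__)

    CSP = (
        "default-src 'self'; "
        "script-src 'self' https://www.googletagmanager.com; "   # first-party scripts and the tag manager only
        "connect-src 'self' https://www.google-analytics.com; "  # analytics beacons only
        "img-src 'self' https://www.google-analytics.com; "
        "style-src 'self'; "
        "frame-ancestors 'none'; "                                # the service cannot be embedded elsewhere
        "base-uri 'self'"
    )

    @app.after_request
    def set_security_headers(response):
        # Attach the policy to every page the service serves.
        response.headers["Content-Security-Policy"] = CSP
        return response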

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has developed a comprehensive performance framework mapping KPIs to user needs and desirable outcomes.
  • they are using a wide range of data sources, including Qualtrics, Civil Service Jobs, Salesforce, Celonis, Google Analytics and RapidSpike, which should help deliver rich data sets and reports.
  • they are using both survey data and machine-generated data to produce their reports.

What the team needs to explore

Before the next assessment, the team needs to:

  • make greater use of statistics from the live service, which have great potential to help inform and shape future project phases. The panel would like this data to be more prominent in the next service assessment (beta) and to inform other areas of the Service Standard.
  • show dashboard prototypes at the next service assessment (beta); as a minimum, these should indicate how their KPIs and performance data will be presented.
  • articulate more of their data architecture, integration and governance thinking, given the quantity of data and the prominence of this service.

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is evaluating how well suppliers would be able to make production versions of the alpha prototypes.
  • the team is considering how the supplier will integrate into new and existing systems.
  • the team is considering non-operational reporting components.

What the team needs to explore

Before the next assessment, the team needs to:

  • be aware of the lower-level details of the chosen supplier’s stack.

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the source code of the alpha prototypes has been made open.

What the team needs to explore

Before the next assessment, the team needs to:

  • ensure that as much source code as possible is made open in private beta.

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has used the GOV.UK Prototype Kit during alpha.
  • the team plans to use the GOV.UK Design System in private beta.

What the team needs to explore

Before the next assessment, the team needs to:

  • explore using GOV.UK Notify and GOV.UK One Login (an illustrative Notify sketch follows below).
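
GOV.UK Notify is the cross-government platform for sending emails, text messages and letters, with a published API and client libraries. The sketch below shows roughly what sending a candidate email through the Notify Python client could look like; the API key, template ID and personalisation fields are placeholders, not values from this service.

    # Illustrative sketch only: sending a candidate email via GOV.UK Notify's
    # Python client (notifications-python-client). The API key, template ID and
    # personalisation fields are placeholders, not values from Civil Service Jobs.
    from notifications_python_client.notifications import NotificationsAPIClient

    notifications_client = NotificationsAPIClient("an-api-key-issued-by-notify")

    def send_application_received_email(email_address, candidate_name, job_title):
        # Templates are created and managed in the Notify web interface; the
        # personalisation keys must match the placeholders in the template.
        return notifications_client.send_email_notification(
            email_address=email_address,
            template_id="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",  # placeholder template ID
            personalisation={
                "candidate_name": candidate_name,
                "job_title": job_title,
            },
            reference="application-received",  # optional identifier for tracking
        )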

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is aware of best practices, e.g. maximum load times for pages.
  • the team intends to have security monitoring in place.
  • the supplier will be held accountable for the reliability of the service.

What the team needs to explore

Before the next assessment, the team needs to:

  • ensure that components can function as far as possible without other services running. For example, if the planned Central Services Integration Hub were to go offline or be degraded, the operational user-facing service must continue to function (see the illustrative sketch below).
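
One common way to keep a user-facing service working when a dependency is degraded is to call it with a short timeout and fall back to a safe default, such as queueing the update to retry later. The sketch below is a generic illustration of that pattern under assumed names: the hub URL, endpoint and in-memory queue are placeholders, not part of the team's design.

    # Generic sketch of graceful degradation when a downstream dependency (such
    # as an integration hub) is unavailable. The URL and in-memory retry queue
    # are illustrative assumptions only; a real service would use durable storage.
    import logging
    import queue

    import requests

    HUB_URL = "https://integration-hub.example.gov.uk/api/updates"  # placeholder
    retry_queue = queue.Queue()

    def send_update_to_hub(payload):
        """Try to send an update to the hub without blocking the user journey."""
        try:
            response = requests.post(HUB_URL, json=payload, timeout=2)  # short timeout
            response.raise_for_status()
            return True
        except requests.RequestException as exc:
            # Hub offline or degraded: log it, queue the update for a later retry
            # and carry on, so candidates and recruiters can keep using the service.
            logging.warning("Integration hub unavailable, queuing update: %s", exc)
            retry_queue.put(payload)
            return False
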
Published 22 January 2024