Take An Online Test When I Apply For A Job in the Civil Service

The ‘Take An Online Test When I Apply For A Job in the Civil Service’ service, also known as Online Tests and Assessments (OTAP), is part of the Government Recruitment Service (GRS) and provides a SaaS platform for developing psychometric tests for recruitment.

Digital Service Standard assessment report

From: Central Digital and Data Office
Assessment date: 25 March 2020
Stage: Beta Assessment
Result: Not met
Service provider: Cabinet Office

Previous assessment reports

● Alpha assessment report: Met (13 August 2018)

Service description

The ‘Take An Online Test When I Apply For A Job in the Civil Service’ service, also known as Online Tests and Assessments (OTAP), is part of the Government Recruitment Service (GRS) and provides a SaaS platform for developing psychometric tests for recruitment. The tests are developed in-house by the Tests and Assessments (OTAP) team, which includes GRS Occupational Psychologists and a multidisciplinary digital team. Tests are made available through integration with CSJobs, a wider SaaS platform where recruiters set up vacancies and manage candidate applications, and where job seekers can find and apply for jobs in the Civil Service.

When they advertise a vacancy, recruiters and vacancy holders have the opportunity to add our Online Tests to their selection process, alongside other methods such as a CV and personal statement. For applicants, this means that if a job is set up to require an Online Test, they will need to complete it as part of their application. The service allows applicants to complete tests that accurately reflect their ability in a fair and inclusive way, contributing to a more efficient and transparent recruitment journey.

Service users

● applicants - users who are applying for a job in government. They want the role and have often already invested time and effort into researching and applying

● vacancy holders - users who have a vacancy in their team or area that needs to be filled with a good candidate

● recruiters - users who are specialists in the Government Recruitment Service (GRS) or department recruitment who advise and support vacancy holders through the recruitment process

● test creators - specialists in GRS or department Human Resources who develop tests and test content (items), typically Occupational Psychologists and Psychometricians

1. Understand user needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has adopted a range of methodologies to reach users and understand their needs
  • the team tested the service with users who rely on assistive technology to go online, and with users across the assisted digital spectrum
  • the team ensured its research was inclusive and reached a large number of users

What the team needs to explore

Before their next assessment, the team needs to:

  • continue researching and testing the service with users of assistive technology. It would be good to see how users access the service using their own devices, in their own environment
  • continue to understand the needs of assisted digital users
  • research and understand unhappy paths, such as why users drop out or fail to complete tests

2. Do ongoing user research

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team understands the need for ongoing research, and has a plan in place for this and for ongoing user support

What the team needs to explore

Before their next assessment, the team needs to:

  • bring a user researcher into the team
  • ensure there are different avenues for gathering user feedback to inform iterations, such as moderated research, remote unmoderated research and analytics

3. Have a multidisciplinary team

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has good morale and collaborates well, using standard modern tools to enable communication, prioritisation and sharing of responsibilities, delivering the right functionality within each committed sprint
  • the right people were involved for the majority of the delivery of this service

What the team needs to explore

Before their next assessment, the team needs to:

  • have a full-time user researcher, interaction designer and content designer in place; the latter two roles are currently being recruited, which is a positive step towards completing the team

4. Use agile methods

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a very good understanding of agile methods, with the right rituals in place
  • sprints are well structured, with a good understanding of their purpose, commitments and the teams involved in delivering the agreed user stories
  • there are good plans for mid-sprint planning if needed
  • retrospectives are held after each sprint, every two weeks

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that the supplier teams have the same understanding of the agile ways of working as the service team
  • work with supplier teams on further alignment with the service team, especially on aligning deliverables and on understanding and communicating the dependencies between the two teams

5. Iterate and improve frequently

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • improvements have been made by iterating the service, such as to the buttons on pages and to the content describing the practice tests and the real tests
  • the tests were tested with users to improve and simplify the content explaining them, and very good progress has been made in distinguishing the mock test from the real test
  • the team ensured there is good flexibility to edit tests and processes, and to create new tests that meet both policy requirements and hiring teams’ individual requirements

What the team needs to explore

Before their next assessment, the team needs to:

  • put analytics in place to provide a clearer view of how to iterate; no Google Analytics is in place at this stage, partially due to the third-party partner

6. Evaluate tools and systems

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • a Software as a Service (SaaS) platform was selected after careful consideration of the options available. Building a custom solution would not have been cost effective and open source options did not provide a sufficient level of functionality or maturity
  • the team have helped avoid ‘lock in’ by procuring the software with a short contract duration
  • although the software is a SaaS model, the contract enables the service team to iterate the service to meet evolving needs
  • the team have a strong and healthy relationship with the SaaS provider and work with existing release schedules to plan work
  • the team has a tried and tested support model for the service which functions well
  • the SaaS platform provider monitors the health of the platform at infrastructure and application levels and runs frequent smoke tests on production, ensuring common journeys are working correctly (a sketch of this style of smoke test follows this list). Any anomalies trigger notifications and alerts which feed into established incident processes
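
The provider’s actual checks were not shown at the assessment; the following is only a minimal sketch in Python of the style of production smoke test described, with invented URLs. A real setup would feed any failures into the alerting and incident process mentioned above.

    # Hypothetical smoke test in the spirit described above; URLs are invented.
    import urllib.request

    JOURNEY_URLS = [
        "https://example-test-platform.invalid/health",
        "https://example-test-platform.invalid/start-test",
    ]

    def smoke_test(urls):
        """Return the URLs that did not respond with HTTP 200."""
        failures = []
        for url in urls:
            try:
                with urllib.request.urlopen(url, timeout=10) as response:
                    if response.status != 200:
                        failures.append(url)
            except OSError:
                failures.append(url)
        return failures

    # In the provider's setup, a non-empty result would trigger an alert
    # into the established incident process.
    print(smoke_test(JOURNEY_URLS))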

7. Understand security and privacy issues

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have considered the threats and the potential for misuse and fraud within the service. Mitigations are in place where possible, including unique sessions and broad question sets so that no two tests are the same (a sketch of this kind of randomised selection follows this list). Honesty statements are presented to applicants to help ensure they are aware of the repercussions of inappropriately asking others for help
  • the SaaS platform is regularly penetration tested and automated security testing is run during the deployment of any changes including dependency updates and security patches
  • the team have worked with security specialists in the broader Cabinet Office team to help evaluate the security of the service
  • the SaaS platform provider’s security processes adhere to ISO 27001 and the NCSC Cyber Essentials Scheme. The SaaS platform has also been evaluated against the NCSC Cloud Security Principles
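
The service’s actual item-selection logic was not shared at the assessment, so the following is only a minimal sketch in Python, assuming a flat item bank and per-session seeding; every name and value in it is invented. It illustrates the mitigation above: drawing each applicant’s questions from a broad bank so that no two generated tests are the same, while keeping each draw reproducible.

    # Hypothetical sketch of randomised item selection; not the service's real code.
    import random

    def assemble_test(item_bank, test_length, session_id):
        """Draw a per-session set of items from the bank, without replacement."""
        # Seeding with the unique session ID keeps each applicant's draw
        # reproducible for auditing while still differing between sessions.
        rng = random.Random(session_id)
        return rng.sample(item_bank, k=test_length)

    # Two sessions receive different (but individually reproducible) tests.
    bank = [f"item-{n:03d}" for n in range(200)]
    print(assemble_test(bank, 5, "session-aaa"))
    print(assemble_test(bank, 5, "session-bbb"))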

What the team needs to explore

Before their next assessment, the team needs to:

  • conduct a Data Protection Impact Assessment (DPIA). The SaaS platform in scope for this assessment does not store or process any personally identifiable information (PII) beyond an ID that links to an application instance and the applicant’s IP address. It is recommended that a DPIA is carried out to ensure the team are aware of where this data is stored and how long it is retained for

8. Make all new source code open

Decision

N/A for point 8 of the Standard.

This is not applicable because the service uses a SaaS platform, so it is not possible for the code to be open sourced. Custom code has been written to ensure the application tracking system can integrate with the SaaS testing platform, but this is also specific to the SaaS platform and hence not appropriate to open source.

9. Use open standards and common platforms

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • although the SaaS platform is proprietary, it makes use of the IMS Question and Test Interoperability (QTI) specification for storing question content. This should make it easier for the team to move content onto another platform if needed at a later date (a minimal sketch of a QTI item follows this list)
  • the API that provides the integration between the SaaS platform and the application tracking system uses the SOAP messaging protocol specification
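
The assessment did not include example question content, but QTI is an open XML specification, which is what makes the content portable. The following is an illustrative sketch only: an invented QTI 2.1-style multiple-choice item, parsed with Python’s standard library to show that the format can be read without the vendor’s tooling.

    # Parse an invented QTI 2.1-style item; nothing here is real service content.
    import xml.etree.ElementTree as ET

    QTI_NS = {"qti": "http://www.imsglobal.org/xsd/imsqti_v2p1"}

    SAMPLE_ITEM = """<assessmentItem xmlns="http://www.imsglobal.org/xsd/imsqti_v2p1"
        identifier="choice-1" title="Example item" adaptive="false" timeDependent="false">
      <responseDeclaration identifier="RESPONSE" cardinality="single" baseType="identifier">
        <correctResponse><value>ChoiceA</value></correctResponse>
      </responseDeclaration>
      <itemBody>
        <choiceInteraction responseIdentifier="RESPONSE" maxChoices="1">
          <prompt>Which option is correct?</prompt>
          <simpleChoice identifier="ChoiceA">Option A</simpleChoice>
          <simpleChoice identifier="ChoiceB">Option B</simpleChoice>
        </choiceInteraction>
      </itemBody>
    </assessmentItem>"""

    root = ET.fromstring(SAMPLE_ITEM)
    print(root.find(".//qti:prompt", QTI_NS).text)
    for choice in root.findall(".//qti:simpleChoice", QTI_NS):
        print(choice.get("identifier"), choice.text)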

10. Test the end-to-end service

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the SaaS platform uses Continuous Integration and multiple environments to test changes and patches. A variety of different tests are run including unit tests and integration tests. Test environments use scrubbed production data to ensure the results are as accurate as possible
  • the SaaS platform publishes a release schedule in advance so the service team are aware of when changes will be made and can test features in a UAT environment linked to the application tracking platform before they are made live
  • the service team understand their capacity requirements and have confidence that the SaaS platform can comfortably handle the expected demand
  • the service team test user journey changes on a wide range of browsers, operating systems and assistive technologies

11. Make a plan for being offline

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has plans in place for when the platforms they depend on are unavailable. This includes communication plans with recruiters and notifications for applicants. Manual workarounds can be used if absolutely necessary
  • the SaaS platform has an established incident process in the event of any outages with a support rota running 24/7. In the event of a serious incident, communications will be sent to all customers through the account management function
  • the SaaS platform architecture has been designed to be resilient to the loss of two data centres (Availability Zones). A second environment is also provisioned in a separate region for a manual restoration of service in the event of a regional failure. This operates to a Recovery Time Objective (RTO) of 24 hours. Data is regularly replicated to the second region

12. Make sure users succeed first time

Decision

The service did not meet point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • design iterations have clearly been informed by extensive user research, considered content design and test design best practice, ensuring the service reduces bias and is understandable by all users
  • the service team were clear in their presentation of the challenges they have faced and how they have tried to overcome them

What the team needs to explore

Before their next assessment, the team needs to:

  • while the team demonstrated very strong qualitative user research and quantitative methods, such as the questionnaire issued to users who have successfully completed the test, the gap in live usage data cannot be overlooked. Without data such as funnels showing where users might be struggling or dropping out (a sketch of this kind of funnel follows this list), we cannot be sure whether users do in fact succeed the first time. Due to this lack of evidence, the current outcome is not met. As highlighted in point 1, further work to understand unhappy paths should be done prior to reassessment
  • once analytics on user journeys and behaviour are implemented, we would like to see the team make data-driven decisions on prioritising different areas of the design. For example, it was discussed that some elements of the design are less usable on mobile; without data it is hard to judge whether this is problematic
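
To illustrate what such funnel data could show, here is a minimal sketch with invented step names and applicant data; the real service would derive the same counts from analytics events once tagging is in place.

    # Illustrative funnel: how many applicants reach each step, and where they drop out.
    FUNNEL_STEPS = ["invited", "started_practice", "started_test", "completed_test"]

    # Hypothetical data: the furthest step each applicant reached.
    furthest_step = {
        "a1": "completed_test", "a2": "started_test", "a3": "invited",
        "a4": "completed_test", "a5": "started_practice", "a6": "started_test",
    }

    step_index = {step: i for i, step in enumerate(FUNNEL_STEPS)}
    total = len(furthest_step)
    for i, step in enumerate(FUNNEL_STEPS):
        # An applicant "reached" this step if their furthest step is at or past it.
        reached = sum(1 for s in furthest_step.values() if step_index[s] >= i)
        print(f"{step}: {reached}/{total} applicants ({reached / total:.0%})")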

13. Make the user experience consistent with GOV.UK

Decision

N/A to point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has made considerable progress since their last assessment by improving the overall usability of their service
  • while the team may be exempt from aligning with GOV.UK because the platform and service are not on GOV.UK, it’s clear the team is trying their best to apply rigorously tested existing patterns. The team is slightly constrained by their software supplier, but it was impressive to see these challenges being overcome, and the collaboration between the parties seems strong

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to resolve accessibility issues in the elements of the current design which don’t align with the GOV.UK design system - for example error messaging

14. Encourage everyone to use the digital service

Decision

The service did not meet point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is mindful that the digital service must be inclusive to all. This was demonstrated by their use of DAC, their neurodiversity-focused workshops, testing their service with the GDS empathy lab, and their accessibility audit

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to address the points raised by their accessibility audit; at the time of the assessment, several accessibility issues remained unresolved. The team should be able to demonstrate this point is fully met via a new accessibility audit confirming that the recent changes have been successful

15. Collect performance data

Decision

The service did not meet point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • although it’s challenging that the team doesn’t have full end-to-end performance data showing users as they move through the service, the team is making the best of the data they have and have been able to set some performance expectations based on baseline data
  • the portion of the service on Civil Service Jobs feeds data into a central data lake for the team to access, enabling analysis of high-level metrics including the number of vacancies using tests and “time to hire”
  • some Google Analytics data is available for the Civil Service Jobs part of the journey, and the team is following best practice by anonymising data
  • where web analytics data isn’t available (or as a supplement where it is), the team is utilising qualitative data, such as post-test completion surveys, to enhance their understanding of the service, and the team were able to highlight some of the constraints with these responses

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to work with their third-party partner to ensure analytics tagging is added to the product roadmap (as outlined in their tender process), so that they can collect additional performance data and demonstrate how it is used to iterate. As mentioned in point 12 above, the panel cannot overlook the lack of visibility of the full user journey

16. Identify performance indicators

Decision

The service did not meet point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has run a “performance framework” session where they identified KPIs and the data sources to measure these
  • the team has a strong plan in place for when Google Analytics is applied across the service, covering what they want to measure, and has clearly outlined how this could feed into the backlog and help them improve the service

What the team needs to explore

Before their next assessment, the team needs to:

  • adapt the performance framework to make it clearer how performance data can be paired with user needs to drive improvements. The part of the framework presented at the assessment was quite high level; the team would benefit from linking the KPIs back to user needs, rather than focusing only on the strategic objectives. There is an example of this approach from GDS and the UK Parliament website

17. Report performance data on the Performance Platform

Decision

N/A to point 17 of the Standard.

This is not applicable as the service is not transactional and therefore does not need a dashboard on the Performance Platform.

18. Test with the minister

Decision

The service met point 18 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has previously tested with the Chief Executive or a Permanent Secretary for the overarching service
  • the team has, to an extent, a view of future programmes covering the Civil Service recruitment platforms, which will go through this testing

What the team needs to explore

Before their next assessment, the team needs to:

  • explore the options to test this with the minister, the Chief Executive or a Permanent Secretary, as Online Tests and Assessments has made a significant change to the overarching user journey, added additional steps to it, and introduced new teams (both internal and external)
  • it is also important, and recommended, that this testing covers the entire journey, to give the minister or equivalent the opportunity to understand where the Tests and Assessments (OTA) part specifically fits into the entire user journey

Published 9 June 2020