Apply for a basic criminal record check alpha assessment

The report from the alpha assessment of the Disclosure and Barring Service's 'Apply for a basic criminal record check' service on 15 May 2017.

From: Government Digital Service
Assessment date: 15 May 2017
Stage: Alpha
Result: Met
Service provider: Disclosure and Barring Service

The service met the Standard because:

  • Understanding user needs and iterating the service accordingly
  • Having a sustainable, multidisciplinary team in place
  • Building using agile, iterative and user-centred methods

About the service


The service enables individuals to apply for a basic criminal record check online. The output is a paper certificate which will include any unspent conviction information.

Service users

Any member of the public above the age of 16 is eligible to apply.


User needs

The team has a clear set of user needs and the panel was pleased to see concrete examples of where user research had influenced service design; for example, removing questions that caused users confusion and did not add value to the algorithm. The service team is clearly fully engaged in the user research process, and the panel was pleased to see that team members have been proactive in attending and observing research sessions.

User research undertaken to date has been limited in both scale and scope and the panel understands that there have been mitigating factors for this outside of the team’s control. The first of the four rounds of user research undertaken was led by external consultants and was not methodologically sound; whilst the most recent was limited in scope by the arrival of purdah. Nevertheless, despite these limitations - as noted above - there is a focus on user needs, and design changes in response to user research are clearly in evidence.

The team is aware that it has been focusing on the MVP and that user needs for the service beyond this are not fully understood. Its plan is to cover these in future rounds of research, with topics including: a face-to-face solution for those unable to use the digital service; employer payment solutions; non-UK citizens; those without a biographic and/or financial footprint; and the end-to-end journey including Verify. The panel expects these to be fully evidenced in private beta and strongly recommends that the volume and reach of user research be increased significantly during this stage.

Since the second round, user research has been led by a contractor embedded in the team, and the plan is for this individual to be replaced by an in-house researcher who is currently being upskilled. The panel strongly recommends that the new user researcher undertakes formal GDS training in addition to the planned coaching.

Team

The team has clearly established strong agile ways of working, incorporating a flat, co-located structure with a balance of permanent civil servants and contractors. The team has a strong core of a Service Manager, Product Owner and Tech Lead, and is using recognised agile tools and processes (such as scrum and JIRA).

The team has been engaging with other digital teams across central government and the wider public sector to find best practice for delivering digital services.

There is a clear demonstration of appropriate governance, with the Service Manager having access to senior decision makers within the organisation who are bought into the service and its delivery approach. The team has organised show and tells for senior people and key stakeholders within the organisation to aid regular decision making.

Technology

The service has evolved quickly in the last six months, utilising relevant, mature technologies and patterns appropriate to the service being delivered.

The panel was pleased by the high utilisation of Government as a Platform (GaaP) services and the willingness to explore future use of the GaaP suite as it expands.

The team's agile delivery capability was well demonstrated by the use of behaviour-driven development, a mature sprint cycle and a build pipeline for fast team-wide feedback. The consideration given to security is notable at this stage, with penetration testing tools integrated into the build pipeline, early threat modelling and appropriate consideration for the security of personal data stored in the public cloud.

The team acknowledged that the dependency on rigid interface definitions into the backend systems was of some concern, but it had been able to challenge some assumptions and inputs to those endpoints without adversely affecting the end goal of the service. It was positive to hear that there is a plan to further investigate the structure of those interfaces when the backend services are delivered.

The team has done admirable work on extending open-source libraries and intends to contribute that work back to the community. It would be sensible for the team to ensure that there is a plan to open-source more of the service as they move to beta.

Design

The team has tested and iterated four prototypes during alpha. It has done good work to remove unnecessary stages from the application, getting agreement from senior stakeholders in the organisation. The legacy backend imposes some limitations on this process, and it's encouraging to see that the team intends to keep simplifying the service despite this.

A wider range of design approaches could have been tested at alpha stage. The panel recommends looking further into ideas already discussed, such as allowing users to save and return, and the option of a fully paper-free certificate.

The design and content are consistent with GDS design patterns.

There are competing background checking services available online, which rank higher on Google because they buy ad space. The team should have a clearer plan for mitigating this: the new service should be cheaper, faster and more secure, and it would be a shame for users not to use it.

Analytics

The team has been considering the types of metrics needed to manage the performance of the digital service, as well as the end-to-end service as a whole. It is important for the team to focus on establishing meaningful metrics that will aid the iteration and improvement of the service for both the service team and the organisation as a whole, while avoiding setting and reporting on metrics for their own sake.

Recommendations

To pass the next assessment, the service team must:

  • Make code available as open source
  • Define KPIs and establish performance benchmarks
  • Manage data, security levels, legal responsibilities, privacy issues and risks

The service team should also:

  • Establish a plan for private beta which incorporates both straightforward and complex applications and has key measurable outcomes.
  • Ensure that new user researchers undertake formal GDS training in addition to the planned coaching.
  • Ensure that there is a plan to open-source more of the service as they move to beta.
  • During private beta, develop plans for handling the service being taken temporarily offline before it goes into public beta.

Next steps

Pass: the service can proceed to beta.

You should follow the recommendations made in this report before arranging your next assessment.

Get advice and guidance

The team can get advice and guidance on the next stage of development.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Not Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 24 July 2018