Official Statistics

Quality statement

Updated 13 July 2023

Applies to England and Wales

Introduction

These statistics are published in compliance with the Ministry of Justice (MoJ) quality strategy for statistics, which states that information should be provided as to how the bulletin meets user needs.

The MoJ aims to provide a high quality and transparent statistical service covering the whole of the justice system to promote understanding and trust. This statement sets out our policies for producing quality statistical outputs and the information we will provide to maintain our users’ understanding and trust.

Quality management and assurance

A number of quality assurance processes are carried out to ensure that the published statistics are fit for purpose. These include:

  • For judiciary data:
    • a series of checks is carried out to ensure that all variable values are within expected ranges (e.g. age is plausible given the retirement ages for judicial appointments); an illustrative sketch of these checks is given after this list. Where issues are identified, these are fed back to the HR team and investigated; where changes are required, data are re-extracted and re-checked.
    • a consistency check of data for leavers, new entrants and those in post is made (for example, leavers should not also be counted as holding appointments; new entrants should not appear in the previous year’s data).
    • a high-level sense check is undertaken against the previous year’s data, and with the judicial appointments team, to check that any changes are plausible in light of recent appointments.
  • For judicial appointments data:
    • data are independently extracted from the applications system by two statisticians, and any discrepancies resolved. Figures for each exercise are also cross-checked with those held by the JAC programme office.
  • For legal professions data:
    • data are provided by professional or regulatory bodies in aggregated form; basic consistency checks are carried out (e.g. checking row and column totals). Figures are checked for plausibility against data published by the professional bodies based on other sources.
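
To illustrate the nature of these checks, the sketch below shows in outline how the range and consistency checks described for judiciary data might be implemented. It is illustrative only: the field names (judge_id, age), thresholds and data structures are assumptions and do not reflect the actual systems or processing code used.

```python
# Illustrative sketch only: field names, thresholds and structures are assumed.
import pandas as pd


def range_check(in_post: pd.DataFrame, min_age: int = 21, max_age: int = 75) -> pd.DataFrame:
    """Flag records whose age falls outside a plausible range for judicial office."""
    return in_post[(in_post["age"] < min_age) | (in_post["age"] > max_age)]


def consistency_checks(in_post: pd.DataFrame, leavers: pd.DataFrame,
                       new_entrants: pd.DataFrame, prev_year: pd.DataFrame) -> dict:
    """Cross-check leavers, new entrants and those in post against each other."""
    issues = {
        # Leavers should not also be counted as holding appointments
        "leavers_still_in_post": set(leavers["judge_id"]) & set(in_post["judge_id"]),
        # New entrants should not appear in the previous year's data
        "new_entrants_in_prev_year": set(new_entrants["judge_id"]) & set(prev_year["judge_id"]),
    }
    # Any non-empty set would be fed back to the HR team and investigated;
    # where changes are required, data are re-extracted and re-checked.
    return {name: ids for name, ids in issues.items() if ids}
```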

Dimensions of quality

The following considers the judicial diversity statistics against the different dimensions of statistical quality, as outlined by the Government Statistical Service.

1. Principle 1: Relevance

Relevance covers the degree to which statistical information meets user needs.

These statistics present information on the diversity of the judiciary, judicial appointments and the legal professions that make up the eligible pool for legal judicial roles. They allow users to assess how diverse the judiciary is at different levels, and at different stages of the appointments process.

Information on known and assumed users and uses of these statistics is given in the guidance which accompanies this release. While we believe that the statistics currently meet these needs to a sufficient degree, we have identified a number of areas where we hope to develop them further, based on user feedback. One of the main areas is, where required, to expand the coverage of characteristics collected and to improve the declaration rates for those characteristics already collected whose rates are as yet too low for analysis.

We welcome feedback from users and will use this to develop a better understanding of user satisfaction and any unmet needs.

2. Principle 2: Accuracy and reliability

Accuracy refers to the closeness of estimates to the true values they are intended to measure.

Reliability refers to the closeness of initial estimated values to subsequent estimates.

Overall, these statistics are considered to be sufficiently accurate for their use. Data are derived from various administrative systems. While extensive validation of the data is undertaken to ensure figures are accurate, as with any large-scale administrative database, there may be some inherent degree of inaccuracy within the figures presented.

Coverage errors

The operational uses of the data – for judicial HR, managing judicial appointments or maintenance of membership lists – mean it is likely that overall numbers are close to the true values. However, as figures represent a snapshot at one point in time and there can be a lag between changes occurring (e.g. a judge leaving) and the system being updated, it is unlikely that the figures will be exactly accurate. Data for this bulletin are extracted to represent the position as at 1 April in each year. This snapshot is taken some time after the reference date to enable updates to be made, better reflecting the true position as at the reference date.

For judicial appointments, the JAC relies on the information held in the JARS database (for pre-2020) and on the JAC digital platform (for more recent data) for operational purposes, and so has a clear incentive to ensure that information is highly accurate. In addition, the data presented in the Official Statistics are also subject to quality assurance procedures to ensure internal consistency and consistency with other records relating to the selection exercise.

For the judiciary, the extent of discrepancies can vary by appointment. It is believed that figures for judges are a close approximation to the actual number, though there can be cases where, for example, individuals hold more than one appointment and these are not correctly linked on the system. However, for magistrates, a data reconciliation exercise carried out in 2020 found that previous years’ totals had been overstated by as many as 1,000, as a number of leavers had not been correctly removed from the HR system.

As noted above, it is unlikely that the eligible pool figures presented here will exactly correspond to those eligible for judicial appointment (as there are some – likely to be a relatively small number – who are eligible but are not included in the pool as defined); however, it is considered that the pools presented are sufficiently useful as a guide and comparator. Further work to analyse the pool is planned prior to the next publication.

Measurement errors

For the diversity characteristics, while age is usually reliable (as it is based on date of birth), other characteristics, including ethnicity, are based on self-declaration by individuals. An individual’s perception of, for example, their ethnicity may not align with what others would consider it to be, and in some systems it is possible for individuals to change their information at any time. An analysis based on data for judicial appointments suggests that the impact of this is minimal, but not non-existent; for example, there were up to 10 cases (across several thousand applications) where inconsistencies were identified, such as the same individual recording a different social mobility status.

For judicial appointments, age is recorded at the time of the close of applications. Accordingly, it is possible that age group distributions at the shortlist and recommendation stages may deviate slightly from the age groups presented. Such differences, if any, would be very small and non-material.

Non-response errors

Where a diversity characteristic is self-declared and non-mandatory, there will invariably be a proportion of individuals who have not declared, meaning their status is unknown for that characteristic. When calculating percentages of representation, an assumption is therefore made that there is no bias in whether individuals choose not to declare. As a result, there is a level of uncertainty around the figures that increases in relation to the proportion of unknowns, and so where the declaration rates (included alongside figures in the data tables) are below 60%, calculations such as representation rates are not shown.

Given the high declaration rates (typically 90% or better) for those characteristics included in the publication, we consider that the overall impact is likely to be small; however, for smaller subgroups – e.g. particular appointments – it can be larger.
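
As a simple illustration of the declared-only calculation and the 60% suppression threshold described above, the sketch below shows how a representation rate might be derived. The function name and example figures are illustrative assumptions, not the actual processing code.

```python
# Minimal sketch of the declared-only calculation described above;
# the function name and example figures are illustrative assumptions.
def representation_rate(declared_in_group: int, declared_total: int,
                        total_individuals: int, threshold: float = 0.60):
    """Return a representation rate, or None if the declaration rate
    falls below the suppression threshold (60% in the publication)."""
    declaration_rate = declared_total / total_individuals
    if declaration_rate < threshold:
        return None  # suppressed: too many unknowns for a reliable estimate
    # Percentages are calculated over those who declared, i.e. assuming
    # non-declaration is not biased towards any particular group.
    return declared_in_group / declared_total


# Example: 120 of 500 individuals who declared are in the group of interest,
# out of 550 in post overall (declaration rate ~91%, representation 24%).
print(representation_rate(120, 500, 550))
```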

In particular, the self-declared fields (including ethnicity and professional background) held for the judiciary may be incomplete because (a) judicial office holders are asked to provide the information on a voluntary basis and, to a lesser extent, (b) such details have only been collected since October 1991. Further ethnicity data were collected from judicial office holders in post through a diversity survey undertaken by the Judicial Office in 2007. In May 2009, the Judicial Office began collecting ethnicity data from all new judicial appointees.

Processing errors

These are believed to be minimal, following the quality assurance process outlined above. We have started to further streamline data processing through the use of a Reproducible Analytical Pipeline (RAP) approach. The approach to data revision is described in section 7 below.

Sampling errors

As these data are from administrative systems, there is no sampling error as such. However, for judicial appointments, where Relative Rate Indices (RRIs) are calculated, these are accompanied by confidence intervals to illustrate the natural variability of calculations based on small numbers.
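
The guidance accompanying the report sets out how RRIs and their confidence intervals are calculated. As a rough illustration only, the sketch below computes a ratio of two rates with an approximate interval based on a standard log-normal approximation; the figures and the exact interval method are assumptions and may differ from the published methodology.

```python
# Rough illustration of a rate ratio with an approximate 95% confidence
# interval (Katz log-normal method); not the publication's exact methodology.
import math


def rate_ratio_with_ci(success_a: int, total_a: int,
                       success_b: int, total_b: int, z: float = 1.96):
    """Ratio of success rates for group A relative to comparator group B."""
    ratio = (success_a / total_a) / (success_b / total_b)
    # Standard error of log(ratio) for two independent proportions
    se_log = math.sqrt(1 / success_a - 1 / total_a + 1 / success_b - 1 / total_b)
    lower = ratio * math.exp(-z * se_log)
    upper = ratio * math.exp(z * se_log)
    return ratio, (lower, upper)


# Illustrative numbers only: 12 of 80 applicants recommended in group A
# versus 30 of 150 in the comparator group gives a ratio of 0.75.
print(rate_ratio_with_ci(12, 80, 30, 150))
```

The wider the interval, the more the calculated ratio could be driven by chance variation in small numbers rather than a genuine difference between groups.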

3. Principle 3: Timeliness and punctuality

Timeliness refers to the time gap between the publication date and the reference period for the statistics. Punctuality is the time lag between the actual and planned dates of publication.

In 2020, data relating to 1 April 2020 (or the year ending 31 March 2020) were published in September, in order to allow sufficient time to develop the new combined publication. In 2021, the publishing date was moved to July, i.e. around 3 months after the period to which the data relate. Given that data for all judicial selection exercises completed within a year are published annually, there can be a longer period between the completion of an individual exercise and the publication of data for it, but this delay is considered justifiable to allow the orderly publication of annual statistics[footnote 1].

The 2022 publication is the third time a combined publication of judicial diversity statistics has been produced; to date, all publications have been released on the scheduled date.

4. Principle 4: Coherence and comparability

Coherence refers to the extent to which statistics produced by different statistical processes may be used in combination. Comparability refers to coherence across different time periods and geographical regions.

Geographical comparability

Figures in this publication relate to England and Wales[footnote 2]; Scotland and Northern Ireland are separate jurisdictions which publish their own statistics on judicial diversity. Comparisons between jurisdictions should be made with caution – for example, roles and the appointments process are different. Links to published statistics for other jurisdictions are given below, but direct comparison is not made within the publication.

For the judiciary, a regional breakdown is presented. In making comparisons between regions, the diversity of the local population and the mix of judicial roles within the region should both be taken into consideration.

Comparability over time

For the data relating to the judiciary, comparisons over time should generally be reliable, although the HR system and the way that diversity data are recorded have changed several times in recent years. In the publication, comparisons are made back to 2014, a period over which the main types of judicial appointment have been relatively consistent (any notable changes are noted via footnotes in the data tables)[footnote 3]. In addition, levels of representation within specific groups on these diversity characteristics may change year on year due to staffing movements, including flows in and out (e.g. recruitment, resignations and retirements) and internal moves (e.g. promotions). The recruitment exercises run in recent years by the Judicial Appointments Commission (JAC) may also impact on diversity.

For judicial appointments, caution is needed in making comparisons between years, as the selection exercises run will vary from year to year and this will impact on comparability. In the data tables, time series figures are presented for specific appointments which can be compared more reliably, though in the report the focus is on the latest year only. However, data relating to exercises that occurred prior to the release of this information as Official Statistics (in 2010) may not have been subject to the same level of quality assurance.

The data and associated representation percentages can be sensitive to when exercises are formally closed and how this compares to the report’s cut-off date of 1 April in each year. For example, there are often three large fee-paid exercises run on an annual basis (Recorder, Deputy District Judge, and Fee-paid Judge of the First-tier Tribunal and Employment Tribunal). The latter two exercises were still open on 1 April 2021 and so were not included in the 2020/21 report but have been included in this year’s statistics (for 2021/22). For this reason, we advise further caution when comparing judicial appointment statistics from one year to the next. It is worth noting that an ad hoc experimental statistical report, ‘Statistical analysis of candidate progression through judicial selection tools’, was published in December 2021. It is based on data from judicial selection exercises concluding over an extended period between 1 April 2015 and 31 March 2021.

Currently only one year’s data is presented for legal professions; we will explore whether it is possible to include a time series in future.

Coherence

The combined publication brings together data from different sources to present a picture of diversity in the judiciary, judicial selection and in the legal professions.

We consider that overall there is sufficient coherence between the different sources. While they are based on different systems, the key diversity characteristics (gender, ethnicity and age for example) are coded in a similar way. With effect from December 2011, the JAC has shared diversity data on candidates recommended for immediate appointment with the Judicial Office where the individual confirmed they were content for the information to be shared.

However, figures on new appointments (new entrants or promotions) for the judiciary do not directly match the recommendations for appointment made by the JAC over the same period. The reasons for this include:

  • there are a small number of recommendations that will either not be accepted by the Judicial Office or will be withdrawn by the applicant
  • not all those recommended will be appointed in the same year; appointments are made depending on demand and on applicants accepting the appointment and location offered
  • JAC statistics will include applicants who are already in the judiciary and those applying from outside the judiciary, whereas the new entry tables exclude those who already had a judicial appointment, and the promotion tables only include judicial office holders who are changing their primary appointment and entering a salaried role.

Information on judicial appointments is also included within the JAC Annual Report, and figures should be broadly comparable. However, while the JAC Annual Report presents the number of applications for financial accounting reasons, this bulletin counts the number of applicants within selection exercises for diversity purposes. As a result, the number of applications, applicants and recommendations within selection exercises may differ slightly. Furthermore, when counting recommendations, the number of people who were recommended is counted, rather than the number of full-time equivalent vacant posts to which the recommendations refer. If a recommendation is for one individual for a part-time post, the recommendation counts as one person, not as a fraction of a post.

Similarly, figures presented for the legal professions may not exactly match equivalent figures presented by the professional bodies in their own reports, though the broad patterns shown should be similar. This is a result of the definition of the professions used (as given above), in particular the focus on those practising.

For example, the diversity data about solicitors included in the report, and also published in The Law Society’s Annual statistics report, are taken from the individual accounts that each solicitor has with the SRA and cover solicitors with a practising certificate. This is different from the diversity data that the SRA publishes in its firm diversity data tool, which are based on the data that law firms report to the SRA every two years and cover solicitors and others working in law firms, not solicitors working in other roles.

5. Principle 5: Accessibility and clarity

Accessibility refers to the ease with which users can access the statistics and data. Clarity refers to the quality and sufficiency of the commentary, illustrations, accompanying advice and technical details.

These statistics are freely available from the gov.uk website, in an accessible format:

  • The statistical report is available in HTML format
  • The data tables are published in Open Document Spreadsheet (ODS) format

The report has been reviewed to ensure that the commentary, which is written by professional statisticians, is clear and impartial, and that statistical terms such as Relative Rate Index (RRI) and practical significance are clearly explained, though feedback is always welcome (see section 15 of the report).

Cost and burden

The additional burden on individuals (legal professionals, members of the judiciary or those seeking judicial appointment) as a result of providing information for these statistics is considered minimal, as the data are drawn from administrative systems and are already collected for diversity monitoring purposes.

The cost of producing these statistics is therefore restricted largely to the cost of staff time for the statistical team who compile the statistics, together with data providers who extract and help to quality assure the data, and colleagues within the different organisations who have helped to inform the development of the new combined report.

Confidentiality

This is covered in the section on ‘confidentiality and disclosure’ within section 3 of the guidance accompanying this quality statement.


  1. Previous publications for the JAC were on a 6-monthly basis, but the frequency was reduced to annual following consultation with users. 

  2. However, the Employment Tribunal Scotland is also covered.

  3. A new HR system (e-HR) was introduced for the judiciary in 2016, which rationalised a number of existing systems that contained HR and training data. However, it is not considered that this unduly affects the time series comparisons made within this publication.