Guidance

Statement of compliance

Updated 21 April 2020

Background

About the Race Disparity Audit

The then Prime Minister Theresa May announced the Race Disparity Audit in August 2016. The objective was to publish government data on disparities between ethnic groups in areas like education, health, crime and policing, and housing.

This work was led by the Race Disparity Unit (RDU), working with the Cabinet Office, the Ministry of Housing, Communities and Local Government, Government Digital Service, and the Office for National Statistics (ONS).

The original Race Disparity Audit was published in October 2017 as a report which accompanied the Ethnicity facts and figures website. When the website was launched it included 104 measures (web pages of data about a particular topic). Since then the RDU has continued to add measures to the website. The website presents the data and commentary in a user-friendly way, and was developed as the tool to disseminate data for the Audit, following talks with a wide range of external partners to understand potential uses of the data and to test demand for a website.

The RDU has also published a number of analytical reports using data from the website (for example, the Chinese ethnic group summary), and blog posts setting out the Unit’s future and highlighting methodological and quality issues with collecting and analysing ethnicity data.

This statement of compliance intends to broadly cover all aspects of the Audit since 2016 – that is, the original report, data and commentary on the website, plus associated analytical reports and blog posts. This statement replaces the previous version, published at the time of launch in October 2017.

About the Code of Practice for Statistics

The Code provides producers of official statistics with the detailed practices they must commit to when producing and releasing official statistics.

It ensures that the statistics published by the government serve the public. When producers of official statistics comply with the Code, it gives users of statistics and citizens confidence that published government statistics are of public value, are high quality and are produced by people and organisations that are worthy of trust.

The framework for the Code of Practice for Statistics is based on 3 pillars:

  • trustworthiness is about having confidence in the people and organisations that produce statistics and data
  • quality is about using data and methods that produce assured statistics
  • value is about producing statistics that support society’s needs for information

Each pillar contains a number of principles and detailed practices that producers should commit to when producing and releasing official statistics.

Why we have a statement of compliance

Ethnicity facts and figures includes a mixture of official statistics (some of which are National Statistics), management information, and analyses of existing statistics that haven’t been published by government departments. We are planning to add local authorities’ data (subject to consideration of quality).

Accordingly, we have not considered the website to be an official statistics product, but the RDU has decided to apply the Code of Practice for Statistics on a voluntary basis. (We accept that an argument can be made that the website is, in fact, a set of official statistics - but regardless of this debate, our intention is to follow the Code as closely as possible).

This statement of compliance serves 3 purposes:

  • it shows how we are applying each of the 14 principles in the Code
  • it provides transparency for users by recording the way that we approach the ongoing work of the Race Disparity Audit
  • it identifies areas where compliance with the Code can be enhanced in the future

To achieve the purposes of this statement of compliance we have described our approach to the 14 principles, drawing on the detailed practices as appropriate. This document ends with a summary of actions.

For convenience we have included the Statistics Authority’s description of each element of the Code.

Trustworthiness

Confidence in the people and organisations that produce statistics and data.

Trustworthiness is a product of the people, systems and processes within organisations that enable and support the production of statistics and data. Trustworthiness comes from the organisation that produces statistics and data being well led, well managed and open, and the people who work there being impartial and skilled in what they do.

All of our work on the Audit has been guided by the public interest. The decisions we make – for example, about the range of data to be presented, the choice of time periods, quality assurance processes, and topics for analytical reports – have been guided by a clear commitment to transparency, accessibility and objectivity and continue to be shaped by extensive user testing.

By ‘public interest’ we are referring to the Audit’s capacity to demonstrate in an accessible way how people of different ethnicities are treated by public services.

Honesty and integrity

“People in organisations that release statistics should be truthful, impartial and independent, and meet consistent standards of behaviour that reflect the wider public good.” (Code of Practice T1: Honesty and integrity.)

Any appearance of biased or partial presentation of data might challenge the ongoing trustworthiness of the Audit and the data on the website that underpins it.

We work closely with data suppliers in government departments to ensure that analysis and commentary identifies the salient points, and presents them clearly.

Independent decision making and leadership

“Organisations should assign a Chief Statistician/Head of Profession for Statistics who upholds and advocates the standards of the Code, strives to improve statistics and data for the public good, and challenges their inappropriate use.” (Code of Practice T2: Independent decision making and leadership.)

The analytical team in the RDU is currently led by the Chief Statistician of the Cabinet Office. This role has the lead responsibility for ensuring objectivity of different parts of the Audit, challenging the incorrect use of statistics, advocating the use of the 3 pillars of the Code of Practice with respect to the Audit, and encouraging collaboration. This last point is already an important part of the work of the Audit, as data is collected from other departments and republished on the website and in summary reports.

However, reflecting the origins and nature of the Audit – a relatively new, high profile statistical exercise with policy implications – we have accepted that decisions about content and timing of analytical reports will be influenced by Ministers. Indeed, we consider that Ministers’ commitment to the Audit, and their engagement in its development and in shaping the policy direction, have been crucial to the delivery of such a large cross-governmental project.

We anticipate that in future we will be able to enhance compliance with this aspect of the Code, while ensuring that Ministers and relevant policy colleagues – as users of the statistics – continue to have an input to the coverage of the Audit.

Orderly release

“Organisations should commit to releasing their statistics in an open and transparent manner that promotes public confidence.” (Code of Practice T3: Orderly release.)

The considerations around orderly release are particularly important when new statistics are released. The website draws together previously published statistics provided by government departments (or statistics that are otherwise in the public domain, even if not published by departments explicitly). However, we believe that we have complied with the spirit of the Code in many ways. For example:

  • the data on the website is updated as soon as possible once we have received and quality assured data from government departments
  • we release data and reports on a regular and timely basis
  • the topics for analytical reports and blog posts are informed by identified user needs both within the Cabinet Office and outside
  • we are clear and transparent on corrections we have made to outputs
  • we ensure separation of statistical and policy statements (and policy announcements are listed on GOV.UK)

We think that we could enhance compliance in the future by:

  • pre-announcing the publication of updates well in advance
  • ensuring we publish at 9:30am
  • naming a lead analyst on ethnicity summaries and topic reports

We consider that release arrangements that follow established best practice will enhance trust in the Audit.

Transparent processes and management

“Organisations should have effective business processes and appropriate resources to support their statistical functions and be open about their plans, priorities and progress.” (Code of Practice T4: Transparent processes and management.)

While the Audit was a No.10 commission, from the earliest stages of work the RDU voluntarily set the ambition to comply with:

  • the Code of Practice – in practical terms this was realised by close engagement with the UK Statistics Authority’s Office for Statistics Regulation, the ONS Good Practice Team, and the secondment of analytical staff to the RDU from across the Government Statistical Service
  • the Government Digital Service’s (GDS) service standard – a set of criteria to help the government create and run good digital services.

All public facing transactional services must meet the standard. It is used by departments and the GDS to check whether a service is good enough for public use. The first of the criteria is around understanding user needs.

Clearly defined processes and procedures are important aspects of good governance.

Among these, we developed (with input from our technical working group) a set of statistical principles for the Audit, covering:

  • its scope and coverage
  • types of data
  • responsible analysis and meaningful comparisons
  • commentary and presentation
  • quality and quality assurance
  • identifying gaps and harmonisation
  • ethnicity (classifications)
  • geography
  • specific statistical issues such as the presentation of data about income, and disclosure control

We also:

  • used the standard Government Digital Service (GDS) development model, including testing alpha and beta versions of the web tool with users
  • used Google Analytics to understand users’ journeys through the web tool
  • commissioned gateway reviews at the alpha and beta stages, and implemented the recommendations

While the RDU has detailed and extensive quality assurance and management processes in place, we can enhance our compliance with the Code by making the detail of these processes more transparent. These processes include:

  • the agreements we have with departments to supply data, and the quality assurance checks we expect from them
  • the internal checks we carry out on data that has been supplied for each measure, and the flat files
  • the process for checking and clearing commentary with departments
  • checks that are done during and after content design to the commentary, charts and tables

Professional capability

“People producing statistics should be appropriately skilled, trained and supported in their roles and professional development.” (Code of Practice T5: Professional capability.)

In the early days of the Audit, analytical staff in the RDU were seconded from analytical professions in government and already had appropriate skills and experience.

More recently RDU has moved towards a mixed model of secondments plus permanent staff. Recruitment is done by fair and open competition using the relevant analytical competency framework.

The analysis team currently consists of economists, social researchers and statisticians, and this mix brings a wealth of different skills and experiences to the unit. As noted in a previous section, the team is currently led by the Chief Statistician of the Cabinet Office. Continuous professional development is prioritised to ensure that the capability of the analytical unit remains high.

This includes development of analytical skills, Civil Service-wide training on (for example) managing information securely, and representing the unit on relevant wider analytical networks such as the GSS Harmonisation Champions Network, the GSS Quality Champions Network, and the [GSS Geography Champions Network](https://gss.civilservice.gov.uk/about-us/champion-networks/geography-champions/). These groups (and others) also provide an invaluable source of support to the analytical team. The analytical team is also supported by the digital team in RDU who administer the website.

As the analytical team moves towards more innovative solutions to collating, analysing and presenting data (such as the use of Reproducible Analytical Pipelines), we need to make sure that development continues to reflect these new techniques, for example new coding skills.

Data governance

“Organisations should look after people’s information securely and manage data in ways that are consistent with relevant legislation and serve the public good.” (Code of Practice T6: Data governance.)

The RDU does not collect personal, sensitive data about individuals using surveys, or derive it from administrative data sources. However, aggregated statistical data from departments is held on the Ethnicity facts and figures website infrastructure, which has been built and subsequently developed using GOV.UK service standards.

For people who contact the team with queries, the website has a clear privacy statement on how their information will be used.

For statistical outputs, we have taken advice from the Office for National Statistics and also from the statistical experts in each department that has supplied data to us on how to protect the confidentiality of individuals. For each measure we have described the particular approach taken to suppress data that might otherwise breach confidentiality.

In the future, the RDU will move towards linking (or commissioning linked) datasets from different sources to develop more sophisticated analyses, or improve the quality of ethnicity data (for example for ethnic groups with small populations). This will potentially increase the risk of disclosure of sensitive information pertaining to individuals, so we will discuss with the Office for Statistics Regulation whether we need to take additional steps to ensure that we comply with the Code.

Quality

Data and methods that produce assured statistics.

Quality means that statistics fit their intended uses, are based on appropriate data and methods, and are not materially misleading. Quality requires skilled professional judgement about collecting, preparing, analysing and publishing statistics and data in ways that meet the needs of people who want to use the statistics.

It is helpful to distinguish between the quality of each part of the Audit (including individual measures, the website and summary reports) and the quality of the Audit as a whole.

In prioritising the data to be included in the Audit initially, the criteria we adopted included ‘quality’ and ‘relevance’. We are guided by statistical experts across government about the quality of particular sources, and where possible we have drawn on official (including National) statistics for data on the website, and for summary reports.

For each measure on the website we have included background sections covering a brief summary of noteworthy aspects of the data sources and the associated methodology, and relevant web links. Quality assurance arrangements have been adopted that make the most of the expertise of departmental statisticians.

At the level of the Audit as a whole, we have enhanced quality by ensuring that data and reports cover a broad range of topic areas, and use established presentational frameworks and statistical classifications.

We also included in the overarching analytical report a clear summary description of some important considerations of quality:

  • that there are some areas of public services where there is little or no data about ethnicity
  • that where relevant data is collected, a common challenge is having insufficient numbers of cases to study in the ethnic minority groups – inevitably this limits the degree to which firm conclusions can be made about differences between ethnic groups, and the ability to take account of other factors in analysis in addition to ethnicity
  • that the quality of data on the ethnicity of individuals varies and is generally better when reported by people themselves, as it is in surveys and the Census – administrative data (such as is collected from service users) can suffer high levels of non-recording of ethnicity and overuse of ‘other’ categories, undermining the ability to identify differences in how people in each ethnic group are treated

RDU has published its Quality Improvement Plan, which outlines the way that we will work with departments to improve the quality of ethnicity data. It will cover areas such as:

  • increasing harmonisation (based on the 2021 Census classifications)
  • improving the quality of data for ethnic groups with small populations
  • innovating to improve processes
  • filling data gaps
  • data linking to increase the amount of data available, and data quality

This will further help to enhance our compliance with the Code in the areas of quality.

Suitable data sources

“Statistics should be based on the most appropriate data to meet intended uses. The impact of any data limitations for use should be assessed, minimised and explained.” (Code of Practice Q1: Suitable data sources.)

The scope of the Audit is UK government data. At the launch of the Audit, there was an initial review that required all government departments to identify what data they held that could be analysed by ethnicity. The review identified a vast amount of information, some of which was already published and some that had not yet been analysed for ethnic groups.

The data identified by the Audit was very varied in quality and depth. It spanned the Census, published official statistics, numerous government surveys and departments’ own administrative records.

Some datasets were prioritised for inclusion in the first release of the Audit. The initial criteria for prioritising data reflected its quality, readiness, manageability and relevance to concerns identified by users of the data, including members of the public, NGOs, public services and government departments themselves.

The emphasis was on opening up data to the public where it was reasonably reliable, with caveats as necessary. In prioritising we were aiming to:

  • cover subject areas that matter most to our users – for example, health, education, work
  • provide more granular ethnicity breakdowns – that is, favouring a higher number of different ethnicities than merely ‘white or non-white’
  • include breakdowns by geography, income and gender
  • provide more granular breakdowns by local area or business unit – for example, school, prison, hospital

The prioritised list for launch was agreed with government departments and No.10. Since then many more datasets (known as ‘measures’ on the Ethnicity facts and figures website) have been published, and there are now more than 170 measures available.

The RDU has concentrated both on the addition of new measures, and broadening the scope of existing ones to provide further disaggregated data such as by ethnicity and gender, or by ethnicity and local authority. We are also formalising the data supply with government departments by setting up agreements between the RDU and data suppliers.

Measures are still being updated, but we are now adding relatively fewer new measures to the website. We are increasingly focusing on providing more summary analyses of existing data (in the form of topic reports and ethnic group profiles).

The RDU has also written a number of blog posts highlighting issues associated with collecting particular datasets, and more general issues that arise when collecting or analysing ethnicity data.

We are also planning a series of methods and quality reports. The reports fulfil 3 objectives.

First, they provide a way of making users aware of methodological issues that are relevant to their interpretation of data and RDU’s work to improve the quality of ethnicity data. They will complement the ‘Things you need to know’ section included on each web page. For example, the first report about ‘relative likelihoods’ emphasises the importance of considering whether they are statistically significant, as well as how large they are – and explains how to go about doing so.

Second, they will help us improve our website. Where our methodological work points to improved ways of deriving, presenting or explaining data, we will make the necessary improvements. For example, our report on the ‘Mixed and Other ethnic groups’ will allow us to draw users’ attention to the variation within these broad groups where we present information for these groups on the website.

And third, they will enable us to reach out to statisticians and other analysts who might be working on similar issues (either related to ethnicity, or technical statistical issues), to develop the community and to develop good practice.

We also recognise that a primary objective for the Unit going forward is to open up the opportunity for more analysis by the linking of datasets.

Sound methods

“Producers of statistics and data should use the best available methods and recognised standards, and be open about their decisions.” (Code of Practice Q2: Sound methods.)

To the greatest extent possible the Audit brings together official statistics (and in many cases National Statistics) because the methodologies underpinning these are demonstrably robust.

The RDU has taken the advice of experts across the Government Statistical Service about the strengths and limitations of different sources. In some cases, for example, we have aggregated more than one year of data to provide sufficiently reliable data.

We continue to collaborate with departmental statisticians on methods and standards through bilaterals and more formally through the Data and Digital Group of RDU staff and cross-government experts. This group meets 3 times a year to discuss data issues.

Each measure on the website includes a summary of the aspects of methodology, prepared in an accessible form in a section called ‘Things you need to know’. This includes details of the data source, the approach taken to data suppression (confidentiality protection), rounding, and whether there are any changes in methods between different years in the data (discontinuity in the data). In this way, we are being transparent about methods used, and the limitations of the datasets.

The main challenge to coherence has resulted from the different categorisations of ethnicity adopted across different data sets, and also as a result of changes to the categorisation over time. We have included a dashboard listing the ethnicity classifications used for each measure, and a description of the 2011 harmonised classifications for ethnicity.

The geographical coverage of the Audit is the UK, and we have included data about the position in Wales, Scotland and Northern Ireland where the data has been provided to us by (UK) government departments. We engaged with all the devolved administrations throughout the Audit to discuss the processes and systems involved and the scope to include data about devolved topics.

In order to benefit from best practice, others’ expertise, and rigorous public consultation and testing, we referenced the measurement framework used by the Equality and Human Rights Commission (EHRC). The EHRC framework measures and monitors progress on equality and human rights in England, Scotland and Wales and consists of a number of domains, indicators and measures.

It aims to “describe the central and valuable things in life that are important to people and provide them with opportunities to flourish such as enjoying an adequate standard of living, being healthy, having good opportunities for learning and education, having legal security, and being free from crime and the fear of crime.”

There are a number of parallels between the EHRC domains and the areas that the government is responsible for. In their final public consultation in early 2017, EHRC identified education, work, housing, health, living standards and social care, security and justice, participation and private life as their 7 proposed domains. This helped guide the 6 domains we now use, which give structure to the underlying ethnicity facts and figures:

  • community and community action
  • crime, justice and the law
  • education
  • health
  • housing and living standards
  • work

EHRC have a framework model using domains, indicators and measures:

  • domains reflect the things or areas in life that are important to people and enable them to flourish
  • indicators are intended to capture and define the underlying concept that we are trying to measure
  • measures capture and define the specific statistics that we are using to measure the underlying concept

The RDU framework for the website has a parallel structure in the form of topics, sub-topics and measures:

  • topics reflect the areas within the public sector that are important to people
  • sub-topics group together similar subject areas within a topic
  • measures capture specific facts and figures on a subject e.g. victims of crime, unemployment

We did not base our framework on government structures or departments. Instead, the framework is based on people’s perceptions and how they group subjects together. The website is structured around this framework and has been thoroughly tested to check that users can navigate easily and find the content that they are looking for.

Assured quality

“Producers of statistics and data should explain clearly how they assure themselves that statistics and data are accurate, reliable, coherent and timely.” (Code of Practice Q3: Assured quality.)

The RDU has a multi-stage quality assurance process. In using statistics held (and in most cases already published) by other government bodies, we can make use of the quality assurance processes used by those responsible for collecting and compiling the data originally.

The RDU’s processes have themselves emphasised quality assurance. With initial input from the technical working group we have developed templates so that data, metadata and important points, provided by departments, are standardised.

Data provided by departments for the website is checked, and any apparent problems discussed with the suppliers. The commentary about the statistics is based on material provided by the departmental expert and checked by the RDU’s analysts and content management team in order to ensure transparency, objectivity and consistency.

Each measure on the website includes information about relevant aspects of quality and methodology. As noted above, we are also clear and transparent on corrections made to published pages.

Value

Statistics that support society’s needs for information.

Value means that the statistics and data are useful, easy to access, remain relevant, and support understanding of important issues. Value includes improving existing statistics and creating new ones through discussion and collaboration with stakeholders, and being responsible and efficient in the collection, sharing and use of statistical information.

We consider that the rationale for and nature of the Audit are absolutely aligned with the principle of securing public value.

Relevance to users

“Users of statistics and data should be at the centre of statistical production; their needs should be understood, their views sought and acted on, and their use of statistics supported.” (Code of Practice V1: Relevance to users.)

We have put considerable effort into engaging with users about the scope of the Audit, the ongoing delivery through the Ethnicity facts and figures website and the content of analytical reports.

We have spoken to hundreds of users since the project started. This has helped us understand:

  • who our users are
  • what their different needs are
  • how those needs might already be met elsewhere
  • what problems they had that we could solve

The different groups of users included:

  • members of the public from diverse ethnicities and backgrounds
  • policy and programme officials and analysts from central and local government
  • organisations outside of government
  • academics
  • public service managers from sectors such as education, employment and health – for example, headteachers, job centre managers

Their needs vary greatly, as does their comprehension of statistical data. We have to present our content in a clear and meaningful way to non-experts in statistics and data. We also need to make sure it is accessible to people with disabilities.

Users with more expertise in statistics and data manipulation needed access to the data and richer background information to give context on how it was collected and analysed. Our users came from a variety of ethnic backgrounds, locations and demographic profiles to reflect the diversity of the website’s user base.

We released alpha and beta versions of the website (showing only data that had previously been published) to selected users and stakeholders, prior to launch.

RDU underwent a live assessment of the site on 18 June 2019 (that is, moving from the public beta phase to the live phase), and was notified on 3 July 2019 that it had passed.

Because the site was developed before the updated Service Standard came into force, we were assessed against the older, 18-point version of the Digital Service Standard.

Accessibility

“Statistics and data should be equally available to all, not given to some people before others. They should be published at a sufficient level of detail and remain publicly available.” (Code of Practice V2: Accessibility.)

We have been guided by the statistical experts across government about the quality of particular sources, and where possible we have drawn on official (including National) statistics. We have viewed this expert guidance through the lens of accessibility and we are committed to ensuring that the different products in the Audit meet the needs and capture the attention of a wide range of users, including non-specialists.

So, for example, in many places we have presented estimates in a rounded form, rather than showing decimal places. Our judgement has been that, in the context of the Audit, the slight loss of precision is outweighed by the benefit of making the data and the messages easier to absorb.

As a web-based statistical service, the information about the measures included on Ethnicity facts and figures is freely available. Each measure includes metadata about data sources and aspects of the underlying statistical methodologies, for example, which we have sought to make as accessible as possible to a wide range of users. We have also provided links to more technical information produced by the department which provided the data, and the contact details of the department’s statistical expert.

Content designers review each website measure to make sure the content:

  • is written in plain English
  • avoids jargon and explains technical terms clearly and simply
  • is consistent across the website
  • highlights important information, caveats, or inconsistencies

The RDU’s Digital Team has also developed a style guide for the website. It describes writing principles that have been developed to make sure that content is clear, meaningful and trustworthy to all users.

To make the website accessible to people with disabilities, we have always used the GOV.UK Design System for layouts, templates, headers, footers and other aspects of design. The components in the system have been thoroughly tested, so the website achieved a good level of accessibility from launch.

The charts, data tables and table of contents were developed using a combination of good practice guidance and automated accessibility testing.

Before the live assessment, RDU commissioned the Digital Accessibility Centre (DAC) to carry out an accessibility audit of the website and content management system. This involved automated tests against checkpoints in the Web Content Accessibility Guidelines (WCAG) 2.1, and manual testing with assistive technologies (using the list specified in the GOV.UK Service Manual). The testers were disabled users who use these assistive technologies in their everyday lives, and we were able to visit DAC and interview them while they were testing the site.

The audit report gave RDU a list of improvements which we are currently working through. The objective is to incorporate them, re-test and publish an accessibility statement shortly.

Clarity and insight

“Statistics and data should be presented clearly, explained meaningfully and provide authoritative insights that serve the public good.” (Code of Practice V3: Clarity and insight.)

The purpose of the Audit is to describe, using statistics, the extent of differences in outcomes or treatment for people of different ethnicities. It is important to emphasise that the Audit is not seeking to explain why differences exist – in preparing commentary about the different measures we have avoided attributing causes for disparities.

The website and reports have been designed to meet the needs of a wide range of user personas. The statistics, data and explanatory material are presented in a clear and accessible way, whether on the website or in accompanying summary reports. For example, on the website, both summary charts and tables are presented for each measure, and the reports use a mix of charts and maps designed using digital best practice.

Innovation and improvement

“Statistics producers should be creative and motivated to improve statistics and data, recognising the potential to harness technological advances for the development of all parts of the production and dissemination process.” (Code of Practice V4: Innovation and improvement.)

As far as we are aware, the Audit is unprecedented in scope and transparency. We consider that it is built upon well-established approaches to bringing statistics together from different sources – the ONS Social Trends publication for example – but the truly innovative aspects of the Audit are rooted in the exploitation of technology to provide users not only with descriptive statistics but also the underlying data to support secondary analysis.

The provision of information about the measures in open, reusable formats means that others can take the data and use it for secondary analysis.

The RDU is now taking forward further innovations for the website. These include expanding the amount of local authority data and exploring more geographic functionality to give users easier access to this data. A recent blog post highlighted the importance of more local authority data to the RDU.

More generally, there is an emerging area of work on intersectionality – that is, data disaggregated by ethnicity and other variables such as geography, gender, disability and age – to help inform policy and decision-making.

More innovation will come in making use of Reproducible Analytical Pipelines (RAPs) to reduce burdens on data suppliers, increase efficiency and improve quality assurance processes. We are focusing here on a pilot project to improve the flow of data from departments.

Efficiency and proportionality

“Statistics and data should be published in forms that enable their reuse. Producers should use existing data wherever possible and only ask for more where justified.” (Code of Practice V5: Efficiency and proportionality.)

All of the information presented on the website and in our summary reports is sourced from existing survey and administrative data held by government departments. There have been no costs of new data collection. We have sought to balance the burden on departments of providing us with data with our need for common formats and data structures, and we have worked closely with government analysts in developing our templates and specifications.

The RDU is currently reviewing the importance and popularity of the measures on the website using Google Analytics. This review, along with other forms of user engagement such as user laboratories and meetings with academics and local authorities, allows us to understand where we might develop more analysis of topics of greatest user interest, and to identify where there might be evidence gaps. It may also reduce the data supply burden on some suppliers.

The provision of the underlying data in CSV format has enabled others to explore the data in full. We encourage feedback from users that will help us to enhance the service that the website represents.

Summary of actions

We intend to undertake the following actions to improve compliance with the Code of Practice:

  1. Pre-announcing the publication of updates well in advance.

  2. Ensuring we publish at 9:30am on the day of publication.

  3. Naming a lead analyst on ethnicity summaries and topic reports.

  4. Making the detail of quality assurance processes more transparent by publishing them on the website.

  5. Discussing with the Office for Statistics Regulation whether we need to take additional steps to comply with the Code, since data linking will potentially increase the risk of disclosure of sensitive information about individuals.

  6. We will undertake the actions in the RDU’s Quality Improvement Plan that will increase the quality of the data that we publish. Actions include:

  • increasing harmonisation (based on the 2021 Census classifications)
  • improving the quality of data for ethnic groups with small populations
  • innovating to improve processes
  • filling data gaps
  • data linking to increase the amount of data available, and data quality
  7. We will improve the accessibility of the Ethnicity facts and figures website and publish an accessibility statement.

  8. We will take forward work on geography and intersectionality to increase the value of the data for our users.

  9. We will publish a series of ongoing methods and quality reports to help users understand and interpret our data more effectively.

  10. We will continue to publish a series of user-friendly ethnicity summaries and topic summaries to provide our users with additional value.

  11. We will use Reproducible Analytical Pipelines (RAPs) to reduce burdens on data suppliers, increase efficiency and improve quality assurance processes. This will focus initially on a pilot project to improve the flow of data from departments.

  12. We will review the measures on the website to understand where we might develop more analysis of topics of greatest user interest, and to identify evidence gaps. Identifying measures that are not used may also reduce the data supply burden on some suppliers.