Guidance

Quality Assurance of Administrative Data report for Benefit Sanctions statistics

Published 14 May 2024

The latest release of sanctions statistics can be found in the collection of benefit sanctions statistics.

1. Introduction

1.1 Background

This report contains information on the Benefit Sanction statistics administrative data sources used by the Department for Work and Pensions (DWP), as well as quality assessments on each source.

The UK Statistics Authority has published a regulatory standard, including a Quality Assurance of Administrative Data (QAAD) toolkit. The toolkit was developed to help assure the quality of administrative data, in recognition of the increasing role that such data plays in the production of Official Statistics.

1.2 List of Administrative Datasets

The administrative datasets used in the production of Benefit Sanction Official Statistics, and how they are used, are briefly described below.

Universal Credit Full Service (UCFS)

Universal Credit Full Service (UCFS Build) is the main data source for the administration of Universal Credit claims. It is used to determine who is on Universal Credit for the Benefit Sanctions Statistics, and the details of the various elements of Universal Credit.

Work Services Platform (WSP)

The Work Services Platform was used in Jobcentre Plus and contained information on live claimants of Universal Credit Live Service (UCLS), allowing work coaches to record doubts. A doubt is raised when a work coach believes a claimant may have broken their claimant commitment. Decision Makers also used the platform to record the sanction decisions they made.

Other Administrative Datasets

An evidence manager database was used in Contact Centres to gather evidence for new claimants. This database sent weekly snapshots of UCLS information.

A database of decision maker and appeals case recorder is a system for Decision Makers, across all benefits, to record sanction decisions such as Mandatory Reconsiderations and Appeals.

A payment manager system is used to calculate a claimant’s benefit entitlement.

A customer information system is the master personal details data store for the Department for Work and Pensions and is used to verify identities. Data includes Personal Details (Name, Address, DoB, Contact Details), Corporate Organisation Records, Benefit Awards, Personal Relationships and Benefit Interests.

2. Quality assurance of administrative data (QAAD) assessment

2.1 UK Statistics Authority QAAD toolkit

The assessment of the Benefit Sanction administrative data sources has been carried out in accordance with the QAAD toolkit.

The QAAD toolkit sets out four levels of quality assurance that may be required of a dataset:

A0 – no assurance

A1 – basic assurance

A2 – enhanced assurance

A3 – comprehensive assurance

The UK Statistics Authority states that the A0 level is not compliant with the Code of Practice for Statistics. The assessment of the assurance level is in turn based on a combination of assessments of data quality risk and public interest. The toolkit sets out the levels of assurance required as follows:

Level A1 – basic assurance

The statistical producer has reviewed and published a summary of the administrative data quality assurance (QA) arrangements.

Level A2 – enhanced assurance

The statistical producer has evaluated the administrative data QA arrangements and published a fuller description of the assurance.

Level A3 – comprehensive assurance

The statistical producer has investigated the administrative data QA arrangements, identified the results of an independent audit and published detailed documentation about the assurance and audit.

To determine which assurance level is appropriate for a statistics publication it is necessary to take a view of the level of risk of quality concerns and the public interest profile of the statistics.

Each administrative data source has been evaluated according to the toolkit's risk and profile matrix (Table 1), reflecting the level of risk to data quality and the public interest profile of the statistics.

Table 1: UK Statistics Authority quality assurance of administrative data (QAAD) risk and profile matrix

Risk level | Lower public interest profile | Medium public interest profile | Higher public interest profile
Low level of risk of quality concerns | Statistics of lower quality concern and lower public interest [A1] | Statistics of low quality concern and medium public interest [A1 or A2] | Statistics of low quality concern and higher public interest [A1 or A2]
Medium level of risk of quality concerns | Statistics of medium quality concern and lower public interest [A1 or A2] | Statistics of medium quality concern and medium public interest [A2] | Statistics of medium quality concern and higher public interest [A2 or A3]
High level of risk of quality concerns | Statistics of higher quality concern and lower public interest [A1 or A2 or A3] | Statistics of higher quality concern and medium public interest [A3] | Statistics of higher quality concern and higher public interest [A3]

Source: Office for Statistics Regulation

2.2 Assessment and justification against the QAAD risk and profile matrix

The risk of data quality concern and the public interest profile of the Benefit Sanction Statistics are rated by assessing:

(a) the possibility of quality concerns arising in the administrative data that may affect the statistics’ quality.

The Benefit Sanction data is regarded as carrying a medium risk of data quality concern. While every effort is made to collect data to the highest quality, as with all administrative data its quality depends on the accuracy of the information entered into the system. Checks are made throughout the process, from collection of the data to production of the statistics, but some data entry or processing errors may filter through.

(b) the nature of the public interest served by the statistics.

The Benefit Sanction statistics are regarded as higher public interest due to substantial media coverage of Benefit Sanction policies and their highly sensitive nature, the use of the statistics in Select Committee hearings and the impact of sanctions on the labour market.

Therefore, as defined by the risk and profile matrix (Table 1), the combination of a medium level of data quality risk and a higher public interest profile indicates that enhanced assurance [A2] is the minimum level required for Benefit Sanction statistics.

The QAAD toolkit outlines four specific areas for assurance, and the rest of this report focuses on these areas in turn. These are:

  • operational context and administrative data collection
  • communication with data supply partners
  • quality assurance principles, standards and checks applied by data suppliers
  • producer’s quality assurance investigations and documentation

Each of the four practice areas is evaluated separately, and the respective level of assurance is stated. This approach enables an in-depth investigation of the areas of particular risk or interest to users.

The overall level of assurance for Benefit Sanction Statistics is outlined in the summary section.

3. Areas of quality assurance of administrative data (QAAD)

Table 2: Summary table of QAAD matrix scores for the Benefit Sanction Statistics

Task | Low (A1) | Medium (A2) | High (A3)
Operational context and administrative data collection | | A2: Enhanced assurance |
Communication with data supply partners | | A2: Enhanced assurance |
Quality assurance principles, standards and checks by data supplier | | A2: Enhanced assurance |
Producers' quality assurance investigations and documentation | | A2: Enhanced assurance |

3.1 Operational context and administrative data collection (QAAD matrix score A2)

Universal Credit Live Service (UCLS) Data Collection

UCLS was initially rolled out to a select group in certain regions from 2013, before becoming available in every Jobcentre (excluding Northern Ireland) by May 2016. New claims to UCLS ceased in January 2018, and the remaining UCLS cases were gradually migrated to UC Full Service (UCFS) by April 2019. For more information on Universal Credit as a whole, including the difference between live service and full service and the timeline of Universal Credit, see the separate Universal Credit background information and methodology document.

Data was collected from UCLS using the Work Services Platform (WSP). The WSP was used by Work Coaches to record doubts and by Decision Makers to record the sanction decisions made for UCLS. Eventually UCLS volumes became so small that there was little to no change in the statistics, and it was decided to freeze them. All UCLS data in the Benefit Sanctions Statistics was frozen, with the final update published on 10 November 2020, covering data up to 1 April 2019.

Universal Credit Full Service (UCFS) Data Collection

The data which is used in the production of the Benefit Sanction Statistics publication is mainly collected from the administrative system (UCFS Build) used to operate Universal Credit.

The UCFS Build is a web-based system, where information can be entered by agents (Work Coaches and Decision Makers) as well as claimants and automated procedures. The primary risk to data quality at this stage is data being entered incorrectly or incompletely. Data from the UCFS system feeds into a central database, where identifying details are masked.

Other systems

A customer information system is used to verify identities. As some information cannot be fully obtained from the UCFS and UCLS administrative systems, a customer information system is used to validate and quality assure existing data and to supply missing information.

A database of decision maker and appeals case recorder is used across a number of benefits to record Mandatory Reconsiderations (MRs) and Appeal decisions. A sanction decision is made once the claimant is referred to the Decision Maker (DM). Evidence and information are collected, and the DM uses these to decide whether or not to apply a sanction.

A payment manager system is used to calculate a claimant’s benefit entitlement. The data is used to identify a drop in payment due to a sanction.

When an individual is sanctioned

There are six conditionality regimes for those on UC. Four of these conditionality regimes can be subject to a sanction; these are:

  • “searching for work”
  • “planning for work”
  • “preparing for work”
  • “unknown”

When a claimant first claims UC, a Work Coach will set out what is required of them in the Claimant Commitment. If the claimant fails to meet any of the responsibilities within their Claimant Commitment, they may have a sanction applied.

There are 4 sanction levels for Universal Credit:

  • Lowest Level
  • Low Level
  • Medium Level
  • High Level

Once a Work Coach raises a doubt that a claimant has failed to meet their claimant conditions, the claimant is referred to a Decision Maker (DM). The decision-making process for Universal Credit has four stages: Referral, Original Decision, Mandatory Reconsideration and Appeal. During these stages, four decision types can be made: Adverse, Non-adverse, Reserved or Cancelled.

If an adverse decision is made, the amount deducted is calculated daily as a percentage of the claimant's standard allowance and is dependent on their current personal circumstances and conditionality regime (see Table 1 in Annex 2).
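
As an illustration only (not the Department's actual calculation), the sketch below shows how such a daily deduction might be computed. The conversion of a monthly standard allowance to a daily rate and the £400 allowance figure are assumptions made for this example; the percentages are those in Table 1 of Annex 2.

```python
# Illustrative sketch only - not the Department's actual calculation.
# Assumes the monthly standard allowance is converted to a daily rate
# (monthly x 12 / 365) and multiplied by the percentage in Annex 2 Table 1.

def sanction_deduction(monthly_allowance: float, rate: float, days: int) -> float:
    """Return the total amount deducted for a sanction lasting `days` days."""
    daily_allowance = monthly_allowance * 12 / 365
    return round(daily_allowance * rate * days, 2)

# Example: a single claimant in the "Searching for Work" regime (100% rate)
# sanctioned for 7 days, with an illustrative monthly allowance of £400.
print(sanction_deduction(400.00, 1.00, 7))  # approximately £92.05
```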

Should you wish to read more about sanctions, please see the Benefit sanction statistics background and methodology.

Strengths

  • DWP staff follow detailed guidance and undergo training to minimise errors during data entry. Further automatic validation checks are carried out after this, including checking personal details against a customer information system and rule-based checks at different stages of a claim’s lifecycle
  • Information is updated in a timely manner.

Weaknesses

  • The primary risks of error are data being entered incorrectly or incompletely into the systems. There is also the risk of potential fraud. Processes rely on information entered by claimants and agents
  • Developments to the UCFS system can impact on data consistency
  • There can be retrospective changes to the data

3.2 Communication with data supply partners (QAAD matrix score A2)

High level process flow of Benefit Sanction Statistics

UCFS and UCLS data are owned by DWP and are provided to analysts as a business requirement for management information, policy analysis and statistics.

Internal teams are responsible for providing data to the analytical community within DWP to agreed specifications, and for communicating with data owners on behalf of analysts in respect of data issues and proposed system changes.

Overview of Official Statistics

UCFS sends daily snapshots of unstructured data, which are stored, processed and then fed into an extract, transform and load (ETL) process to form a data model. The relational tables in the data model are linked together, aggregated and summarised to produce statistical output files to an agreed monthly schedule. Official UCFS data is processed by our internal team, while one dataset, which the Benefit Sanction Statistics (BSS) team use to denote decisions made in error, is processed by the respective team on an internal platform.
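
A minimal sketch of the kind of link-and-aggregate step described above is shown below, using pandas. The table and column names (claims, sanctions, claim_id, decision_month) are hypothetical and used only to illustrate the idea; the real data model, variables and schedule are managed by the internal teams.

```python
# Minimal illustrative sketch of linking relational tables and aggregating
# them into a monthly summary output. All names are hypothetical.
import pandas as pd

claims = pd.DataFrame({
    "claim_id": [1, 2, 3],
    "conditionality_regime": ["Searching for work", "Planning for work", "Searching for work"],
})
sanctions = pd.DataFrame({
    "claim_id": [1, 3],
    "decision_month": ["2024-01", "2024-01"],
    "decision_type": ["Adverse", "Non-adverse"],
})

# Link the tables, then aggregate and summarise to produce a statistical output.
linked = sanctions.merge(claims, on="claim_id", how="left")
monthly_output = (
    linked.groupby(["decision_month", "decision_type"])
          .size()
          .reset_index(name="count")
)
print(monthly_output)
```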

Once output files are generated, the data warehouse team responsible for transferring statistical data extracts to the statistical producers in DWP is notified. The processed data is then transferred securely, uploaded onto a secure server and made available to the Benefit Sanction Statistics Team through analytical software.

To summarise, some of the key responsibilities of our internal and data warehouse teams are:

  • to collect, store and transfer data securely and lawfully
  • to ensure the data is accurate and updated in a timely manner
  • to store special category data (e.g. ethnicity, health, religion info) more securely
  • to comply with internal policy to mask personal identifiers to ensure that individuals cannot be identified

Communication with internal teams

Our internal teams have an established communication channel and communicate regularly via email with stakeholders.

They maintain communication with UCFS on behalf of analysts in respect of data issues and proposed system changes.

Since data for UCLS is frozen, there are no ongoing active communication channels regarding UCLS data.

A database of decision maker and appeals case recorder is used across benefits, except UCFS, to identify MRs and Appeals. Data is uploaded to a small service platform by the internal DWP team who maintain operational small systems. This team is responsible for developing the build and checking for any errors. Every month data is downloaded by the data warehouse team, who inform stakeholders. Monthly reports are then generated for senior leadership. Biannual reports are generated for a systems information asset management team, who ensure compliance with internal policy and GDPR.

Communication with the data warehouse team

The data warehouse team continuously inform our internal teams of any issues with the data model and provide relevant feedback where required.

When the data warehouse team receive notification that the data model has been updated, data processing can begin. Data is processed into masked datasets for analysts to use.

For the database of decision maker and appeals case recorder, the data warehouse team update the statistics team monthly when the management information is available to access. The data is downloaded from a small systems platform that provides management information. A report is generated, which the data warehouse team then load via a data processing platform into the Data Warehouse, where it can be accessed by the Benefit Sanction Statistics Team through analytical software. Should issues arise, the data warehouse team can contact the team who maintain operational small systems to investigate.

Data processing to create masked UCFS datasets starts when the data warehouse team receive a notification that the data model has been updated. Prior to starting, the data warehouse team instruct all analysts not to access the existing datasets until the data refresh is complete to minimise the risk of data issues. On completion, statistics producers are notified to confirm that the updated datasets are available.

Any issues posing a security risk are reported to the DWP security advice centre.

Data warehouse team sign-off arrangements for the UCFS datasets are to:

  • ensure datasets match the specifications on the quality control sheet
  • report data quality issues to data suppliers for further investigation
  • ensure any identified issues are resolved

Communication with operational teams

Communication with operational and other analytical teams ensures an understanding of UCFS system changes that impact on the official statistics. This is particularly important for Universal Credit because, as a relatively new benefit, the administrative systems are constantly developing. The BSS team hold regular meetings with operational teams to keep up to date with ongoing developments.

The team responsible for maintaining small operational systems have regular meetings with stakeholders to provide updates and ensure changes to the build are trialled with individual stakeholders before implementation.

Strengths

  • Stakeholders receive regular communication from data supply partners
  • Data supply partners receive feedback during the data transformation process from analysts (experts in the data), which increases the likelihood of issues being identified and resolved
  • Awareness of UCFS system changes contributes to increased data accuracy

Weaknesses

  • The statistics teams are not involved in UCFS system design and have no direct engagement with the system designers. However, queries about the system build and design can be raised through policy colleagues.
  • There is a small production window to create the data model and UCFS datasets and issues are likely to cause delays

3.3 Quality assurance principles, standards and checks by data supplier (QAAD matrix score A2)

Internal teams QA checks

The internal teams perform sense checks on the outputs and create internal data quality reports to compare counts, naming conventions and formats, and to highlight any potential discrepancies to analysts for additional scrutiny.
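
A simplified sketch of this kind of month-on-month comparison is shown below; the dataset names, counts and the 5% tolerance are illustrative assumptions rather than the actual thresholds used.

```python
# Simplified sketch of a data quality report comparing counts between months;
# dataset names, counts and the 5% tolerance are illustrative only.
import pandas as pd

previous = pd.Series({"ucfs_claims": 100_000, "ucfs_sanctions": 2_500})
current = pd.Series({"ucfs_claims": 101_200, "ucfs_sanctions": 1_900})

report = pd.DataFrame({"previous": previous, "current": current})
report["pct_change"] = (report["current"] - report["previous"]) / report["previous"] * 100
report["flag_for_analysts"] = report["pct_change"].abs() > 5  # highlight discrepancies
print(report)
```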

They use feedback from analysts to identify and investigate potential issues and ensure any identified issue is promptly resolved.

Finally, with enhanced digital capability and continuous feedback from the analytical community, processes are continually evolving and improvements are made to existing QA processes.

Data warehouse team QA checks

The data warehouse team use the UCFS quality control sheet to check the specification of all variables in each of the UCFS datasets. Any inconsistencies are investigated to identify whether the issue is with the datasets or the data model. Notifications that the database of decision maker and appeals case recorder has been updated are sent directly to the Benefit Sanction Statistics Team.
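
The sketch below illustrates one way such a specification check could work; the variable names and expected types are hypothetical examples rather than the contents of the actual quality control sheet.

```python
# Illustrative sketch of checking a dataset against a specification of
# expected variables and types; names and types are hypothetical.
import pandas as pd

spec = {"claim_id": "int64", "decision_date": "datetime64[ns]", "decision_type": "object"}

dataset = pd.DataFrame({
    "claim_id": [1, 2],
    "decision_date": pd.to_datetime(["2024-01-03", "2024-01-10"]),
    "decision_type": ["Adverse", "Non-adverse"],
})

# Report missing variables or unexpected types for further investigation.
for column, expected_type in spec.items():
    if column not in dataset.columns:
        print(f"Missing variable: {column}")
    elif str(dataset[column].dtype) != expected_type:
        print(f"Unexpected type for {column}: {dataset[column].dtype}")
```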

For UCLS data, code will fail should errors in variable names occur. This would be reported back to the DWP analyst who created the dataset, who can then communicate the change to the data warehouse team.

QA checks by the team who maintains operational small systems

The team who maintains operational small systems check for and remove duplicate and inactive cases. If a case is created with just demographic information, they contact the case maker to advise that the case will be deleted after 7 days if it is not elaborated further.
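
A minimal sketch of this kind of clean-up is shown below; the column names and the way an inactive case is identified are assumptions for illustration.

```python
# Illustrative sketch of removing duplicate and inactive cases;
# column names and the "inactive" flag are hypothetical.
import pandas as pd

cases = pd.DataFrame({
    "case_id": [101, 101, 102, 103],
    "status": ["active", "active", "inactive", "active"],
})

cleaned = (
    cases.drop_duplicates(subset="case_id")  # remove duplicate cases
         .query("status == 'active'")        # remove inactive cases
)
print(cleaned)
```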

Strengths

  • Consistent checks every month allow an understanding of observed differences
  • Quality assurance checks are continuously improved, with stakeholder feedback used to shape them
  • In-depth understanding of data and access to more detailed data ensure issues are identified and resolved efficiently

Weaknesses

  • Missing historic data cannot be identified from the quality report the internal team produce

3.4 Producers' quality assurance investigations and documentation (QAAD matrix score A2)

Pre-processing checks

The statistics team generates observation counts for each dataset to ensure missing data is identified at the earliest opportunity and rectified before data processing starts. The back series counts for each UCFS dataset are expected to match those recorded in previous months with relatively small changes due to retrospection. If there are significant changes, this may indicate missing data and is reported for further investigation.
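
A minimal sketch of this kind of back-series comparison is shown below; the counts and the 2% tolerance for retrospection are illustrative assumptions.

```python
# Illustrative sketch of comparing back-series observation counts against the
# previous month's run; counts and the 2% tolerance are assumptions.
import pandas as pd

previous_run = pd.Series({"2023-11": 20_150, "2023-12": 19_870})
current_run = pd.Series({"2023-11": 20_160, "2023-12": 19_900})

pct_change = (current_run - previous_run).abs() / previous_run * 100
suspect = pct_change[pct_change > 2]  # months flagged for further investigation
print(suspect if not suspect.empty else "Back series counts within expected tolerance")
```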

The statistics team check if previous data is available for comparative QA purposes. Space is maintained to securely store new data for the required retention period.

Code is used to identify free text fields in the data which state that sanction decisions have been applied in error.
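
The sketch below shows how such a free text scan might look; the column name and search phrases are assumptions for illustration, not the actual production code.

```python
# Illustrative sketch of scanning a free-text field for wording indicating a
# sanction decision was applied in error; names and phrases are hypothetical.
import pandas as pd

decisions = pd.DataFrame({
    "decision_id": [1, 2, 3],
    "notes": ["Sanction applied in error, please disregard",
              "Adverse decision upheld",
              "Decision made in error"],
})

error_pattern = r"(?:applied|made) in error"
flagged = decisions[decisions["notes"].str.contains(error_pattern, case=False, regex=True)]
print(flagged)
```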

The BSS team compare the database of decision maker and appeals case recorder to previous months with the expectation of an increased number of decisions as data is cumulative. Any errors would be fed back to the data warehouse team and investigated by our team.

Since UCLS data is frozen there should be no retrospective changes to the data.

The variable formats in each dataset are compared to previous months to identify changes and therefore reduce the risk of errors during data processing. If there is a change, it is resolved by modifying the code in relevant data processing projects.

The statistics team work closely and communicate regularly with policy teams, ensuring any changes which could impact on the statistics are noted pre-emptively.

Data processing checks

The statistics team check that each stage of data processing has completed as intended. Generally, the checks ensure that all expected variables are populated and that total observation counts are comparable to previous months. Data is sense checked and abnormal results investigated.

An important stage of data processing is checking that all claimants have a Jobcentre office code. In the case of missing office codes, if the claimant was previously present in the dataset during the same claim (for example, in the preceding month), and at that time they were associated with an office, then this office is used.
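
A minimal sketch of this kind of imputation is shown below; the column names are hypothetical and the real rules may differ in detail.

```python
# Illustrative sketch of filling a missing Jobcentre office code using the
# claimant's most recent earlier record on the same claim; names are hypothetical.
import pandas as pd

records = pd.DataFrame({
    "claim_id": [1, 1, 2],
    "month": ["2024-01", "2024-02", "2024-02"],
    "office_code": ["ABC01", None, "XYZ09"],
})

records = records.sort_values(["claim_id", "month"])
# Carry the last known office code forward within each claim.
records["office_code"] = records.groupby("claim_id")["office_code"].ffill()
print(records)
```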

UCFS datasets contain claimant data that covers the UK. However, during data processing, all claims registered at a Northern Ireland office are removed to create a Great Britain (GB) dataset only. There are quality checks in place to ensure that claims with a Northern Ireland postcode or Jobcentre office code are excluded from the GB dataset.
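
A minimal sketch of this kind of exclusion is shown below. The BT postcode area does correspond to Northern Ireland, but the office code prefix used here is an assumption for illustration rather than the actual coding used in production.

```python
# Illustrative sketch of excluding Northern Ireland claims to form a GB-only
# dataset; the "NI" office code prefix is a hypothetical convention.
import pandas as pd

claims = pd.DataFrame({
    "claim_id": [1, 2, 3],
    "postcode": ["BT1 1AA", "SW1A 1AA", "CF10 1AA"],
    "office_code": ["NI001", "LDN042", "CDF007"],
})

is_northern_ireland = (
    claims["postcode"].str.upper().str.startswith("BT")
    | claims["office_code"].str.upper().str.startswith("NI")
)
gb_claims = claims[~is_northern_ireland]
print(gb_claims)
```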

Output validation

Once the statistical outputs have been created, a series of robust quality assurance checks is carried out by the statistics team. This includes the checks listed below (one example check is sketched after the list):

  • checking observation counts compared to the back series for several breakdowns
  • checking that revisions to provisional and historic data are reasonable
  • checking monthly percentage changes and proportions compared to recent trends
  • investigating unexpected trends and sharp changes
  • checking outputs match across all destinations (Benefit Sanction Statistics, Data tables produced alongside the publication, Stat-Xplore)
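
As an illustration, the sketch below shows one such check: comparing the latest monthly change against the recent trend. The counts and the three-standard-deviation threshold are assumptions for illustration only.

```python
# Illustrative sketch of checking the latest monthly change against recent
# trends; counts and the 3-standard-deviation threshold are assumptions.
import pandas as pd

monthly_counts = pd.Series(
    [5100, 5230, 5180, 5300, 5260, 6900],
    index=pd.period_range("2023-08", periods=6, freq="M"),
)

pct_change = monthly_counts.pct_change().dropna() * 100
recent, latest = pct_change.iloc[:-1], pct_change.iloc[-1]
if abs(latest - recent.mean()) > 3 * recent.std():
    print(f"Investigate: latest change of {latest:.1f}% is outside the recent trend")
```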

In addition, the statistical outputs are compared to Universal Credit management information data. Corroboration with management information data allows comparison of expected trends and observation counts.

Strengths

  • A pre-emptive approach ensures that issues are addressed prior to data processing. Resources within the statistics team are therefore optimised to complete other time sensitive data processing tasks
  • Pre-processing checks are in place to identify missing historic data
  • Extensive quality assurance throughout the statistical production process

Weaknesses

  • Quality assurance is mostly manual, so human error is a risk factor
  • The identification of a data issue during pre-processing checks may cause a delay in data processing (and possibly on timely publication) especially if re-creation of UCFS data by data supply partners is required

4. Summary

The Department for Work and Pensions (DWP) considers the main strengths of the data used by the Benefit Sanction Official Statistics to be that:

  • UCFS system information is updated in a timely manner
  • information is verified either independently or by documentary evidence provided by the claimant
  • the communication and feedback between data supply partners during the data transformation process increases the likelihood of issues being identified and resolved
  • data supply partners continuously improve their quality assurance checks and use stakeholder feedback to shape them
  • the statistics team carry out extensive quality assurance throughout the statistical production process

The current limitations are that:

  • there are potential risks to the consistency of data collection due to continued developments in the UCFS system
  • there is the potential for fraud and administrative error as the system relies on the information submitted by claimants and verification by operational teams
  • the statistics team are not involved in the UCFS system design and have no direct engagement with the system designers

In constantly seeking to improve Benefit Sanction Official Statistics, steps will be taken to mitigate the limitations identified in this report, and progress will be communicated to users in the next Quality Assurance of Administrative Data assessment.

Benefit Sanction Official Statistics are assessed as being assured to level A2 (enhanced assurance) as outlined by the UK Statistics Authority QAAD toolkit.

If you are of the view that this report does not adequately provide this level of assurance, or you have any other feedback, please contact us via email at: epass.team@dwp.gov.uk with your concerns.

Annex 1: public interest

How do I work out the level of public interest of the statistics?

The first thing to note is that it is the public interest or value of the official statistics that matters, as opposed to that of the individual data source. There are some questions that you can ask yourself when deciding whether the public interest profile is lower, medium or higher:

  • What use is made of the statistics?
  • What decisions do they impact? (eg spending by government)
  • What is the reputation risk attached to the statistics?
  • How broad is the user base?
  • Is there political interest in the statistics? (eg government commitments)
  • Are they required by legislation?
  • Are the statistics used to hold the government to account?

Level of public interest or value | Example criteria
Lower | Always likely to be politically neutral subject – Interest limited to a niche user base – Not economically sensitive – Limited media interest
Medium | Wider range of use for decision making – Moderate economic and/or political sensitivity – Moderate media coverage
Higher | Legal requirement, for example, for Eurostat – Economically important/market sensitive – Used to inform decisions about resource allocation, eg by government – High political sensitivity, for example reflected by Select Committee hearings – Substantial media coverage of policies and statistics – Substantial public health concern

How do I decide on the level of data quality concern?

Consider the characteristics of the data collection and the reasons for the data:

  • Where are the data from?
  • Why are they collected?
  • How are the data entered? eg manually vs automatically
  • How many organisations are involved? eg one vs many
  • Why does it matter to my users if the data I am using is of poor quality?
  • Have the data collection or processing systems changed or are they changing?
  • Are the data supplied under contract?
  • Does the contract contain key quality indicators or standards?
  • Have there been policy changes that have changed the data collection requirements?
  • How many errors, delays, incomplete submissions or resubmissions of data have I had?
  • Do I pay for the data?

Level of data quality concern | Example criteria
Lower | Single data supplier – Simple data collection process – Well-defined classifications – Clear coding frame – Clear instructions for recording – Validation checks built into data collection system – Validation checks built into statistical producer's system – No performance regime or use of targets – International standards for measurement
Medium | Combination of factors from lower and higher levels with safeguards to moderate the outcomes: – More complex data collection – Use of data for payment by results offset by operational checks – Audit established: internal, financial, clinical, sample/statistical – External oversight, for example by regulators – Multiple data providers offset by use of quality indicators
Higher | Multiple data supply and/or collection bodies – Complex data collection – Subjective recording of variables – Manual recording and/or coding – Lack of consistency in coding – Lack of clarity in classification systems – No audit of administrative system within operational processes – Over-reliance on system checks – Performance management regime/use of targets – Lack of external oversight

Source: QAAD – FAQs for Statistical Producers

Annex 2: percentage deducted

Table 1: Percentage deducted

Claimant | Deduction (%)
Single claimant in either the Searching for Work, Working – with requirements, Working No Requirements[footnote 1] or Preparing for work Conditionality Regime | 100%
Claimant in a couple in either the Searching for Work, Working – with requirements, Working No Requirements or Preparing for work Conditionality Regime | 50%
Single claimant in the Planning for Work regime or in No Work requirements regime on the grounds of childcare responsibilities, adoption or pregnancy | 40%
Claimant in a couple in the Planning for Work regime or in No Work requirements regime on the grounds of childcare responsibilities, adoption or pregnancy | 20%
Claimant in No Work requirements regime with limited capability for work related activities | 0%