
Quality statement: fraud and error in the benefit system statistics

Updated 13 May 2021

1. Introduction

This report assesses the quality of the fraud and error in the benefit system national statistics using the European Statistics System Quality Assurance Framework. This is the method recommended by the Government Statistical Service Quality Strategy. Statistics are of good quality when they are fit for their intended use.

The European Statistics System Quality Assurance Framework measures the quality of statistical outputs against the dimensions of:

  • relevance
  • accuracy and reliability
  • timeliness and punctuality
  • comparability and coherence
  • accessibility and clarity

The Government Statistical Service also recommends assessment against 3 other principles in the European Statistics System Quality Assurance Framework. These are:

  • trade-offs between output quality and components
  • balance between performance, cost and respondent burden
  • confidentiality, transparency and security

These dimensions and principles cut across the three pillars of trustworthiness, quality and value in the Code of Practice for Statistics.

The background information and methodology note provides more information on fraud and error in the benefit system and the methodology used to produce these statistics.

2. Relevance

Relevance is the degree to which statistics meet the current and potential needs of users.

The Department for Work and Pensions (DWP) fraud and error in the benefit system statistics provide estimates of fraud and error for benefits administered by the DWP and local authorities.

The series has been developed to provide information to various users for policy development, monitoring and accountability, as well as providing academics, journalists and the general public with data to aid informed public debate.

The statistics:

  • include DWP benefits and those administered by local authorities
  • are the primary DWP indicator for levels of fraud and error in the benefit system
  • are in the DWP business plan
  • are important for DWP assurance on the impact of anti-fraud and error activity across the business

The publication is essential for providing our stakeholders with:

  • a consistent time series for assessing fraud and error trends over time
  • data to assess current DWP fraud and error policy and evaluate recent changes to these or business processes
  • the evidence base for assessing the potential effect of future fraud and error policy options and programmes
  • robust data to inform future measurement options
  • estimates of fraud and error for the DWP annual report and accounts
  • data to measure government performance relating to objective 5 of the DWP single departmental plan: transform our services and work with the devolved administrations to deliver an effective welfare system for citizens when they need it, while reducing costs and achieving value for money for taxpayers. Read the latest plan (correct at the time of publication of this document, May 2021)
  • estimates that feed into the annual HM Revenue and Customs National Insurance Fund Accounts

We recognise that our users have different needs and we use a range of methods to contact them. We frequently meet internal DWP users to discuss their requirements. Among external stakeholders, we are in frequent contact with the National Audit Office and occasional contact with HM Revenue and Customs and the Cabinet Office.

Engagement with other external users is usually through the DWP statistical pages of this website where we:

  • invite users to share their comments or views about our National Statistics, or to simply advise us how they use our statistics
  • advise users of updates and changes to our statistics through the future statistics release calendars and our fraud and error in the benefit system collection page
  • consult with customers on developments and changes to our statistical methodologies, publications or publication processes. We last carried out a consultation in the summer of 2018

3. Accuracy and Reliability

Accuracy is the closeness between an estimated result and the unknown true value. Reliability is the closeness of early estimates to subsequent estimated values.

The statistics are calculated from the results of a survey sample, which are recorded on an internal DWP database. The survey combines data collated from DWP administrative systems and local authority owned Housing Benefit systems, with data collected from the claimant during an interview.

The estimates obtained are subject to various sources of error, which can affect their accuracy. Both sampling and non-sampling error are considered in producing the statistics.

Sampling error arises because the statistics are based on a survey sample. The survey data is used to draw conclusions about the whole benefit caseload. Sampling error reflects the fact that, if a different sample were chosen, it would give different sample estimates. The range of these different sample estimates expresses the sample variability. Confidence intervals are calculated to indicate the variability for each of the estimates. More detail on central estimates and confidence intervals is provided in the background information and methodology note.
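
To make the idea of sampling variability concrete, the sketch below shows one simple way a central estimate and an approximate 95% confidence interval could be derived from a set of reviewed cases, using bootstrap resampling. It is a minimal illustration only, assuming a simple random sample and invented figures; it is not the DWP production methodology, which uses a stratified sample design and further adjustments, as set out in the background information and methodology note.

```python
# Minimal illustrative sketch only: not the DWP production methodology, which
# uses a stratified sample design and further adjustments (see the background
# information and methodology note).
import random

def overpayment_rate(cases):
    """Central estimate: total overpaid as a share of total amount paid."""
    total_paid = sum(paid for paid, overpaid in cases)
    total_overpaid = sum(overpaid for paid, overpaid in cases)
    return total_overpaid / total_paid

def bootstrap_confidence_interval(cases, replications=10_000, level=0.95, seed=1):
    """Approximate confidence interval for the overpayment rate, obtained by
    resampling reviewed cases with replacement (simple random sample assumed)."""
    rng = random.Random(seed)
    estimates = sorted(
        overpayment_rate([rng.choice(cases) for _ in cases])
        for _ in range(replications)
    )
    lower = estimates[int((1 - level) / 2 * replications)]
    upper = estimates[int((1 + level) / 2 * replications) - 1]
    return lower, upper

# Invented reviewed cases: (amount paid, overpayment found), in pounds.
sample = [(1200, 0), (800, 60), (950, 0), (1100, 110), (700, 0), (1300, 45)]
print(f"central estimate: {overpayment_rate(sample):.1%}")
low, high = bootstrap_confidence_interval(sample)
print(f"95% confidence interval: {low:.1%} to {high:.1%}")
```

The wider the interval, the greater the sampling variability around the central estimate; in practice the published intervals also reflect the quantified non-sampling uncertainties described below.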

Sources of non-sampling error are difficult to measure. However, where possible, these uncertainties have been quantified and combined with the sampling uncertainties to produce the final estimates. Quality assurance processes are undertaken to protect against particular types of error (for example, data entry error).

Possible sources of non-sampling error that may occur in the production of the fraud and error statistics include:

  • Data entry error – the survey data is recorded on a database by DWP staff. Data may be transcribed incorrectly, be incomplete, or entered in a format that cannot be processed. This is minimised by internal validation checks incorporated into the database, which can prevent entry of incorrect data and warn staff when an unusual value has been input. Analysts undertake further data consistency checks that are not covered by the internal database validations
  • Measurement error – the survey data collected from the benefit reviews are used to categorise an outcome for each case. The correct categorisation is not always obvious and this can be recorded incorrectly, particularly for complex cases. To reduce any inaccuracies, a team of expert checkers reassess a selection of completed cases before any statistical analysis is carried out. This evidence is used as a feedback mechanism for the survey sample staff and also for the statistical analysis
  • Processing error – errors caused by a data or programming mistake can occur during processing. These can be detected by a set of detailed quality assurance steps that are completed at the end of each processing stage. Outputs are compared at each stage to identify any unexpected results, which can then be rectified
  • Non-response error – missing or incomplete data can arise during the survey. Supporting evidence to complete the benefit review may not be provided, or the claimant may not engage in the review process at all. In other cases, the benefit review may not have been completed in time for the analysis and production of results. An outcome is imputed or estimated in these cases, by different methods that are detailed in the background information and methodology note
  • Coverage error – not all of the benefit caseload can be captured by the sampling process. There is a delay between the sample selection and the claimant interview, and also a delay due to the processing of new benefit claims, which excludes the newest cases from being reviewed. An adjustment is applied to ensure that the duration of benefit claims within the sample accurately reflects the durations within the whole caseload (a simple illustrative sketch follows this list)
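
As a purely hypothetical illustration of the kind of adjustment described under coverage error, the sketch below reweights sampled cases so that their claim-duration mix matches that of the whole caseload. The duration bands, shares and weighting approach are invented for illustration; the actual adjustment used in production is described in the background information and methodology note and may differ substantially.

```python
# Hypothetical illustration only: a simple post-stratification style reweighting
# by claim duration band. The actual adjustment is described in the background
# information and methodology note and may differ substantially.
from collections import Counter

def duration_weights(sample_durations, caseload_shares):
    """Return a weight per duration band so that the weighted sample matches the
    known duration mix of the whole caseload.

    sample_durations: list of duration bands, one per sampled case.
    caseload_shares: dict mapping duration band -> share of the whole caseload.
    """
    counts = Counter(sample_durations)
    n = len(sample_durations)
    return {
        band: caseload_shares[band] / (counts[band] / n)
        for band in counts
    }

# Invented figures: the newest claims are under-represented in the sample.
sample = ["0-6 months"] * 10 + ["6-24 months"] * 50 + ["over 24 months"] * 40
caseload = {"0-6 months": 0.20, "6-24 months": 0.45, "over 24 months": 0.35}

weights = duration_weights(sample, caseload)
for band, weight in weights.items():
    print(f"{band}: weight {weight:.2f}")
```

In this invented example, cases in the newest duration band receive a weight above 1 because they are under-represented in the sample, while longer-duration cases are weighted down.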

The list above is not exhaustive and there are further uncertainties that occur due to assumptions made when using older measurements for benefits that have not been reviewed this year. For example, the last full fraud and error review of State Pension in Financial Year Ending (FYE) 2006 was based on a pilot sample of a limited number of benefit processing centres, and was not a true random sample. There are also some benefit-specific adjustments that are part of the data processing.

More detailed information about the quality of the statistics can be found in the background information and methodology note. This includes discussion of the limitations of the statistics, possible sources of bias and error, and elements of fraud and error that are omitted from the estimates.

4. Timeliness and Punctuality

Timeliness refers to the time gap between the publication date and the reference period for the statistics. Punctuality is the time lag between the actual and planned dates of publication for the statistics.

The fraud and error in the benefit system report is usually published around 8 months after the main reference period (for example, the report covering October 2018 to September 2019 data was published in May 2020). The Coronavirus (COVID-19) pandemic has affected the reviews underpinning the estimates, with Universal Credit reviews for the May 2021 report not ending until November 2020 (for more information please see the background information and methodology note).

Due to the time taken to undertake the interviews and gather follow-up information, final data from the reviews is not made available to analysts until 4 to 5 months after the start date of the last interviews. Producing the statistics and tables, and completing the associated clearance processes, then takes the analytical team about 2 months. Improvements over the last few years have reduced this from 3 months.

DWP pre-announce the date of release of the fraud and error in the benefit system report 4 weeks in advance on this website and the UK Statistics Authority publication hub, in accordance with the Code of Practice for Statistics.

The statistics are published at 9.30am on the day that is pre-announced. The release calendar is updated at the earliest opportunity to inform users of any change to the date of the statistical release and will include a reason for the change. All statistics will be published in compliance with the release policies in the Code of Practice for Statistics.

5. Comparability and Coherence

Comparability is the degree to which data can be compared over time, region or another domain. Coherence is the degree to which the statistical processes that generate two or more outputs use the same concepts and harmonised methods.

Our publication provides information on the estimates over time. Where breaks in the statistical time series are unavoidable, users are informed within the report by a text explanation, with clear sectioning within the time series tables and detailed footnotes.

Any changes made to the DWP or local authority administrative system data are assessed in terms of their impact on fraud, error and debt strategy and policy. Their impact on the fraud and error measurement review process is then assessed and communicated to our internal users and the National Audit Office through our change of methodology log. The same applies to any changes made to business guidance, processes and review methodology, as well as to our own calculation methodology.

We agree some methodology changes in advance with internal stakeholders using change request and change notification procedures.

External users are notified of any changes to methodology in the ‘Methodology Changes’ section of the fraud and error in the benefit system report. Substantial changes to the report structure or content will be announced in advance on the fraud and error in the benefit system collection.

The fraud and error in the benefit system statistics form the definitive set of estimates for Great Britain. They are underpinned by reviews of benefit claimants in England, Wales and Scotland.

The benefit expenditure figures used in the publication also include people resident overseas who are receiving United Kingdom benefits. With the exception of Over 75 TV Licences and Financial Assistance Scheme payments, which also cover Northern Ireland, benefit expenditure on residents of Northern Ireland is the responsibility of the Northern Ireland Executive. The benefit expenditure figures do not include amounts devolved to the Scottish Government (which totalled £3.2 billion in FYE 2021). Reporting the levels of fraud and error in this benefit expenditure is the responsibility of Social Security Scotland. Their estimates for FYE 2020 (which only related to Carer’s Allowance) were published as part of their annual report.

Northern Ireland fraud and error statistics are comparable to the Great Britain statistics within this report, as their approach to collecting the measurement survey data and to calculating the estimates and confidence intervals is very similar. Northern Ireland fraud and error in the benefit system high-level statistics are published within the Department for Communities annual reports.

HM Revenue and Customs produce statistics on error and fraud in Tax Credits. Again, these estimates can be compared with those in this report to form a view across the whole benefit system.

6. Accessibility and Clarity

Accessibility is the ease with which users can access the statistics and data. It is also about the format in which data are available and the availability of supporting information. Clarity refers to the quality and sufficiency of the commentary, illustrations, accompanying advice and technical details.

The reports and supplementary tables can be accessed on the statistics pages on this website and the UK Statistics Authority publication hub.

Fraud and error in the benefit system statistics follow best practice and guidance from the Government Digital Service and Government Statistical Service, in publishing statistics that give equality of access to all users.

For data protection reasons, the underlying datasets are not available outside DWP. However, the additional tables published alongside the report provide detailed estimates, giving a breakdown of overpayments and underpayments into the different types of fraud and error, for the benefits measured in that year. The tables are available in both standard and accessible formats.

Technical language is avoided where possible within the report. To help users, the report contains definitions of key terms such as Fraud, Official Error and Claimant Error. A more extensive glossary of terms and error types is included in the background information and methodology note.

Contact details are provided for further information on the statistics, guidance on using the statistics, data sources, coverage, data limitations and other relevant information needed to enable users of the data to interpret and apply the statistics correctly.

7. Trade-offs Between Output Quality and Components

Trade-offs are the extent to which different dimensions of quality are balanced against each other.

The main trade-off for these statistics is timeliness against accuracy. We assess the right balance taking into account fitness for purpose, and fully explain any compromises in accuracy made for improved timeliness.

As detailed in the Timeliness and Punctuality section, we wait a considerable amount of time for the data to be as complete as possible before our publication process begins, to ensure that the estimates are based on data which is as final and robust as possible. This means that we usually publish data around 8 months after the main reference period.

8. Balance Between Performance, Cost and Respondent Burden

The DWP fraud and error in the benefit system statistics are produced from survey data which has a high respondent burden. A compulsory interview, lasting between approximately 30 minutes and 2 hours, is required for all cases sampled for claimant error and fraud checking.

The total DWP cost for production of these statistics is approximately 150 staff (full-time equivalent). DWP are continuously looking at more cost-effective and efficient options for sourcing and collecting data, reducing the burden on respondents, and producing the estimates.

9. Confidentiality, Transparency and Security

All of our data is handled, stored and accessed in a manner which complies with Government and Departmental standards regarding security and confidentiality, and fully meets the requirements of the Data Protection Act 2018.

Access to this data is controlled by a system of passwords and strict business-need access controls.

Any revisions to our publications are handled in accordance with the department’s revisions policy.

10. Contacts

Feedback on the content, relevance, accessibility and timeliness of these statistics and any non-media enquiries should be directed to:

Statistician: Louise Blake

Email: caxtonhouse.femaenquiries@dwp.gov.uk

For media enquiries on these statistics, please contact the DWP press office.

ISBN: 978-1-78659-331-3