
Methodological note: Claimant service and experience survey 2018 to 2019

Published 20 July 2020

1 Background

1.1 Rationale for the survey

The Claimant Service and Experience Survey (CSES) allows the Department for Work and Pensions (DWP) to monitor claimant satisfaction with the services provided by DWP and to enable claimant views to inform improvements to the delivery of benefits and services.

The survey includes a range of measures on the four areas of the DWP customer charter:[footnote 1]

  • ease of access

  • getting it right

  • keeping you informed

  • right treatment

The Charter provides a standard and focus for driving improvements in customer service standards and service delivery. It supports the Department’s ambition to become an exemplar of effective service delivery. As well as collecting a measure of claimants’ overall satisfaction with DWP services, the survey quantifies DWP’s performance on satisfaction with claimants’ transactions and Jobcentre Plus services and explores the underlying measures that make up the four areas of the Customer Charter. It also measures the use and effectiveness of different channels of communication, with specific interest in the customer journey.

The survey focuses on ten key benefits:

  • State Pension (SP)
  • Pension Credit (PC)
  • Attendance Allowance (AA)
  • Carer’s Allowance (CA)
  • Disability Living Allowance for children (DLAc)
  • Personal Independence Payment (PIP)
  • Employment and Support Allowance (ESA)
  • Income Support (IS)
  • Jobseeker’s Allowance (JSA)
  • Universal Credit (UC)

1.2 Survey overview

Kantar conducted a Computer Assisted Telephone Interview (CATI) survey with 15,000 claimants[footnote 2] between 29 August 2018 and 3 May 2019. Fieldwork was conducted across four quarterly waves, each lasting just over one month. To be eligible for the survey, a claimant must have been in receipt of a benefit administered by DWP and have been in contact with the Department regarding that benefit in the three months prior to the date on which the sample was drawn. This ensured that claimants’ responses to the survey were informed by recent experience of interacting with DWP.

1.3 Policy changes

1.3.1 Personal Independence Payment

The composition of the population of Disability Living Allowance claimants continues to change as working age claimants are phased on to Personal Independence Payment. Remaining working age Disability Living Allowance claimants who had not migrated to Personal Independence Payment were not included in the 2018/19 survey.

In August 2018, there were just over 1.9m Personal Independence Payment claims in payment (and around 0.7m Disability Living Allowance working-age claims in payment), with just under 1.9m Personal Independence Payment claims in payment by January 2019 (0.61m Disability Living Allowance working-age claims in payment).

In 2016/17 the Disability Living Allowance benefit group sample included both Disability Living Allowance Working Age and Disability Living Allowance Child claimants. In 2017/18 and 2018/19 Disability Living Allowance samples only included those receiving Disability Living Allowance Child. These changes to the Disability Living Allowance benefit group sample composition mean that results from 2017/18 and 2018/19 are not directly comparable with 2016/17.

1.3.2 Universal Credit

In April 2013, Universal Credit Live Service was introduced for claimants within certain geographic areas. It has been available across Great Britain for single people claiming income-based Jobseeker’s Allowance since March 2016 but stopped accepting brand new claims in January 2018. Universal Credit Full Service was introduced in a small number of areas in November 2014. DWP has continued to roll out the Universal Credit Full Service more widely since May 2016.

In August 2018, there were just over 1 million Universal Credit claims in payment. This rose to just under 1.5 million Universal Credit claims in payment by January 2019.

In 2017/18, the survey included samples for both Universal Credit Full Service as well as Universal Credit Live Service. Findings for both types of Universal Credit were presented together.

As Universal Credit Full Service has now replaced Universal Credit Live Service across the country, the 2018/19 Universal Credit sample only includes Universal Credit Full Service claimants.

These changes to the Universal Credit benefit group sample composition mean that results from 2018/19 are not directly comparable with previous years.

1.4 Code of Practice for Statistics

Although this report is not an Official Statistic or National Statistic, we have made an active choice to apply the Code of Practice for Statistics that underpins these standards.

The code is built around three main concepts, or pillars:

  • Trustworthiness: Having confidence in the people and organisations that publish statistics

  • Quality: Using data and methods that produce assured statistics

  • Value: Publishing statistics that support society’s needs for information.

We have applied the pillars of the Code in a proportionate way, as follows.

1.4.1 Trustworthiness (confidence in the people and organisations that produce statistics and data)

This report is published annually, following the completion of fieldwork. The research was conducted independently by Kantar.

The survey has robust processes to protect data confidentiality and to ensure legal obligations are met. Kantar abides by the Data Protection Act 2018 and the General Data Protection Regulation (GDPR), which is enshrined within it. Kantar also adheres to the Market Research Society and ESOMAR professional codes of conduct, and is accredited to ISO 20252, the international market research quality standard, and ISO 9001, the international standard for quality management systems.

Sample files containing the contact details of sampled claimants are kept separately and securely from the survey data and can only be accessed by the research team at Kantar. Only anonymised responses are delivered as the survey output. DWP cannot identify who from the original drawn sample took part in the survey.

1.4.2 Quality (data and methods that produce assured statistics)

We seek through this report to be transparent about the methods and data used as well as any sources of uncertainty. Where appropriate, we note any limitations in data sources used. We also note any issues in data comparability over time, changes to the benefit landscape which change comparability as well as changes to methodological approaches (for instance, sampling).

The data contained within this report has been scrutinised and approved both within DWP and by the independent contractor Kantar (a full list of checks is included in section 5.1).

Once the data contained within the report (including data tables) has been through Kantar’s quality assurance processes, DWP quality assure the data through a rigorous process of checking the SPSS syntax used and further reproduction of data tables, highlighting and investigating any discrepancies.

It should be noted that the sample population is those who have contacted DWP during the three months prior to the sample being drawn. This is referred to as the ‘contacting population’ and is quite distinct from the overall DWP claimant population, many of whom may have had no contact, or only infrequent contact, with the Department.

The sampling approach is intended to be as representative as possible of this ‘contacting population’, but there are some key limitations:

  • For some benefits, the sample sizes are relatively small and therefore findings are less robust (for further details see section 2).

  • This is not a random probability sample. Target sample numbers are set for each benefit group. Therefore, strictly speaking, statistical significance tests that assume a random probability sample should not be applied. However, we have conducted indicative tests on the data described in the main report to help identify likely substantive trends (an illustrative sketch of such a test follows this list).
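
This note does not specify which indicative tests were applied. Purely as an illustration, a common indicative check when comparing two satisfaction percentages is a two-proportion z-test; the sketch below (in Python, with invented figures) shows the form such a check might take, and its result should be read as indicative rather than formally inferential.

    from math import sqrt

    def two_proportion_z(p1, n1, p2, n2):
        """Indicative two-proportion z-test for comparing two satisfaction rates.

        |z| > 1.96 suggests a difference unlikely to be due to sampling variation
        alone at the conventional 95 per cent level, but with a quota sample this
        should be treated as indicative only.
        """
        pooled = (p1 * n1 + p2 * n2) / (n1 + n2)   # pooled proportion under the null
        se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        return (p1 - p2) / se

    # Invented figures for illustration: 82% satisfied among 3,000 interviews
    # versus 79% satisfied among 750 interviews.
    print(round(two_proportion_z(0.82, 3000, 0.79, 750), 2))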

1.4.3 Value (statistics that support society’s need for information)

The data contained within this report is used to report on claimant satisfaction with DWP services across the ten main benefits and ultimately to inform delivery improvements and measure performance. The figures provide an overview of claimant satisfaction and associated measures relating to the DWP Customer Charter (ease of access, right treatment, getting it right, and keeping you informed) so the public can see the levels of claimant satisfaction with DWP services across a number of different measures. Results can also be broken down by equality characteristics.

It is used by Ministers and other senior staff to inform decisions on the best ways to provide high quality services in the most efficient manner to claimants.

The report is often used as a basis for further ad-hoc requests from external stakeholders, for example in the form of Parliamentary Questions or Freedom of Information Requests.

1.4.4 Collaboration, coherence and transparency

We seek to adhere to the cross-cutting themes which do not fit within just one pillar of the Code of Practice for Statistics but feature in each of the three pillars of Trustworthiness, Quality and Value.

For instance, we have worked collaboratively with others to ensure data sources are suitable and that the data provided is useful.

We similarly follow a process to ensure patterns are found, examined and reported on, so that understanding of our data is maximised. We also seek to use methods that are sound and embedded in scientific principles.

The data provided in this report is intended to be clear and open. We are clear on the strengths and limitations of the data and how data is used and protected. We also actively listen and act on feedback.

2 Sampling

2.1 Sampling undertaken by DWP

2.1.1 Sample eligibility

In order to provide useful information about the current state of service, the survey sampling strategy was designed to provide a representative cross-section of DWP claimants who have had recent contact with the department[footnote 3]. A claimant was defined as somebody currently claiming (or having recently claimed) one of the following:

  • State Pension

  • Pension Credit

  • Attendance Allowance

  • Carer’s Allowance

  • Disability Living Allowance (child)

  • Personal Independence Payment

  • Employment and Support Allowance

  • Income Support

  • Jobseeker’s Allowance

  • Universal Credit

As discussed in section 1.3, the survey must keep pace with structural changes in the benefit landscape. The 2016/17 survey covered only claimants of Universal Credit Live Service while the 2017/18 report incorporated weighted data for both Universal Credit Live Service and Universal Credit Full Service claimants, providing a combined Universal Credit measure. As Universal Credit Full Service has now been introduced across the country the 2018/19 survey does not include Universal Credit Live Service claimants.

Further, the size and composition of the population of Disability Living Allowance claimants were changing as working age claimants were migrated on to Personal Independence Payment. The 2017/18 and 2018/19 surveys therefore only include a sample for Disability Living Allowance for children (under 16).

As well as those receiving benefits directly, appointees[footnote 4] were defined as DWP claimants, as were parents or guardians of children under 16 claiming Disability Living Allowance for children.

All ‘professional’ claimant representatives were excluded from the research (for example, Citizens Advice advisers, solicitors making contact on behalf of a client, or MPs making contact on behalf of a constituent). These parties were considered likely to make contact on behalf of a number of different people, so their responses would reflect an ‘average’ of all their contact with DWP rather than a specific transaction.

2.1.2 Sample frame

The sample frame was established by including instances of contact that can be measured using available administrative data. Three different contact types are identified in the sample, namely new claims, changes of circumstances and contact. DWP used timely administrative workload data to stratify the sample by these contact types, to ensure that the proportions of claimants by contact group in the sample are the same as those in the population. This addressed the principal shortcoming of the 2016/17 sampling process: that the proportions of claimants in the sample by contact type were not representative of those in the population.
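
The exact stratification procedure is not reproduced in this note. As a minimal sketch of the general idea, the snippet below (Python, with invented workload counts and a hypothetical function name) allocates one benefit’s interview target across the three contact types in proportion to the administrative workload counts.

    def proportional_allocation(population_counts, total_sample):
        """Allocate a benefit's interview target across contact types in
        proportion to population workload counts (largest-remainder rounding)."""
        total_pop = sum(population_counts.values())
        raw = {k: total_sample * v / total_pop for k, v in population_counts.items()}
        alloc = {k: int(v) for k, v in raw.items()}
        # Hand any rounding shortfall to the strata with the largest fractional parts
        shortfall = total_sample - sum(alloc.values())
        for k in sorted(raw, key=lambda k: raw[k] - alloc[k], reverse=True)[:shortfall]:
            alloc[k] += 1
        return alloc

    # Invented workload counts for a single benefit (not actual DWP figures)
    workload = {"new claim": 40_000, "change of circumstances": 25_000, "contact": 35_000}
    print(proportional_allocation(workload, 750))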

The sample was drawn from the aforementioned DWP administrative workload datasets that are themselves derived from operational management information. All datasets can be used to identify claimants who have made a new claim. Similarly, all datasets – apart from Universal Credit Full Service[footnote 5] – were used to identify claimants who have been in contact with DWP to report a change of circumstance (bank details, address, marital status etc.).

Working age benefits require claimants to keep in regular contact with the Department, usually through an appointment at the local Jobcentre. All Jobseeker’s Allowance claimants who appeared in DWP administrative data as claiming these benefits during the preceding three-month period were therefore included in the sample frame, as were those in receipt of Employment and Support Allowance and Income Support. The frequency and mandating of contact for ESA and IS claimants varies according to the conditionality requirements of the claimant.

This served as an effective sample frame, allowing for the identification of claimants who had contact with DWP during the designated three-month contact period.

Some claimants were excluded from taking part in the survey for ethical reasons in line with Government Social Research practice. Claimants were excluded from being selected for the sample if they were identified as one or more of the following:

  • terminally ill

  • over 90 years old

  • potentially violent

  • requested not to be contacted for any DWP survey

  • a sensitive case (claimants whose details had been screened out due to severe implications if their private data fell into the public domain)

  • a prisoner

  • a person whose address is registered as the local Jobcentre (considered to be homeless)

  • someone previously sampled in a DWP survey in the last three years.

To create the final sample, duplicate cases were removed, for instance in circumstances where a claimant had claimed more than one benefit, had switched between benefits within the three-month period, or where there were data quality issues that created multiple cases for one claimant.

Further cases were removed where relevant contact details were not available, for instance a claimant’s name, address or telephone number.

In line with 2017/18 methodology, the CSES excludes disallowed claims at the point of drawing the sample, although there is scope for disallowed claims to be included in fieldwork due to the time-lag between drawing the sample and interviewing claimants.

Historically, claimant satisfaction was measured through separate surveys of each DWP benefit group. The Disability Living Allowance and Attendance Allowance survey included disallowed claimants while surveys of other benefit groups did not. When these separate surveys were grouped into the CSES, the sample of Disability Living Allowance, Attendance Allowance and Personal Independence Payment claimants continued to include disallowed cases, and it was therefore decided to revert to a consistent approach across all benefits from 2015/16. As a result, the survey results for Personal Independence Payment and Employment and Support Allowance are comparable between 2015/16, 2016/17, 2017/18 and 2018/19.

However, after the sample is drawn and before the fieldwork takes place (which can extend up to three months), a claimant may withdraw their claim or have it disallowed for various reasons, including a failed medical assessment. Therefore, some people with disallowed claims may be interviewed.

2.1.3 Sample selection

Once the sample contact population had been defined, the sample itself was broken down by benefit into smaller sample datasets dependent on a number of factors, such as the number of live claims for that benefit, the frequency of internal reporting and policy interest. For instance, the interview quota for key working-age benefits (Jobseeker’s Allowance, Employment and Support Allowance and Universal Credit Full Service) was set at higher levels than for disability or pension-age benefits (Attendance Allowance, State Pension, Pension Credit, Personal Independence Payment, Disability Living Allowance) to ensure the number of interviews for those benefits allows for robust quarterly and regional analysis. A weighting adjustment is used to correct for this differential sampling probability when reporting at whole sample level (see section 5.3).

As Disability Living Allowance Working Age continues to be replaced by Personal Independence Payment for almost all claimants, the Disability Living Allowance quota now only includes Disability Living Allowance Child claimants and no longer includes Disability Living Allowance Working Age claimants. The survey has never included Disability Living Allowance claimants who are aged 65+. These changes to the Disability Living Allowance benefit group sample composition mean that results from 2018/19 and 2017/18 are not directly comparable with the previous years.

The number of interviews for each benefit can be found in Table 4.1 (under ‘Achieved Interviews’).

2.2 Sampling undertaken by Kantar

The sample provided by DWP was reviewed and cleaned by Kantar before any telephone interviewing took place.

2.2.1 Sample checks

Kantar also conducted checks on the sample. These checks included:

  • Comparison between the agreed sample numbers and total numbers for each benefit group delivered by DWP, to ensure they matched.

  • Checks that all sample variables within each benefit were fully populated. These variables were later used for weighting.

  • Comparisons between the demographic categories for each benefit against the claimant contacting population to ensure they were broadly representative. For example, the proportion of males and females should be broadly similar in both the total contacting population and sample. Discrepancies were queried with DWP before proceeding.

2.2.2 Sample cleaning

Kantar cleaned the sample file provided by DWP to create a final sample file to load into the NIPO Computer Assisted Telephone Interview (CATI) server for fieldwork. The following processes were undertaken (a minimal illustrative sketch follows the list):

  • any records with a missing or invalid telephone number were removed

  • where duplicate telephone numbers, contact names or addresses existed, one record was selected at random to leave one unique record for each sampled claimant
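
As a minimal sketch of this cleaning step only (the actual sample file layout and processing tools are not published, and the file and column names below are hypothetical), the two rules above could be expressed in pandas as follows:

    import pandas as pd

    # Hypothetical file and column names; the real DWP sample layout is not published.
    df = pd.read_csv("sample_extract.csv", dtype=str)

    # Remove records with a missing or (illustratively) invalid telephone number
    df = df.dropna(subset=["telephone"])
    df = df[df["telephone"].str.replace(" ", "").str.match(r"^0\d{9,10}$")]

    # Where duplicate telephone numbers, contact names or addresses exist,
    # keep one randomly selected record per sampled claimant
    df = df.sample(frac=1, random_state=42)  # shuffle so that 'first' is random
    df = df.drop_duplicates(subset=["telephone"], keep="first")
    df = df.drop_duplicates(subset=["claimant_name", "address"], keep="first")

    df.to_csv("cati_load_file.csv", index=False)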

2.2.3 Final sample files

Two files were then created:

  • an advance letter dispatch file (see section 4.1.2)

  • a file for the Telephone Unit to load into the NIPO CATI server

3 Questionnaire

3.1 Questionnaire structure

The core modules of questions were broadly consistent with the 2017/18 survey. The survey was structured into four sections: introduction, collection of key information, transaction-related questions and general questions. The latter two sections collected information for DWP’s four customer charter metrics. The content of these sections is described below. A flowchart of the questionnaire structure is included in figure 3.1.

3.1.1 Introduction

This section introduced the survey, confirmed the claimant’s benefit group as defined by the sample, and confirmed that the claimant had been in contact with DWP in the three months prior to interview. Claimants who self-reported no contact with DWP in the previous three months were deemed ineligible and excluded from the survey at this point.

3.1.2 Collection of key information

Establishing the type of transaction

This section collected information to determine the routing for the remainder of the questionnaire. The first part asked a question to establish the type of transaction undertaken with DWP within the past three months. Claimants have often had multiple contacts with DWP within the three months prior to interview, so the question was structured using a selection list. The list consisted of three sections, with answer codes randomised within each section. The list was then read out in the order that appeared on screen until the claimant identified a transaction they had undertaken (for a complete list and randomisations, see Appendix A).
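
The survey scripting itself is not reproduced here. As an illustration only, the snippet below (Python, using an abridged subset of the Appendix A list) shows the read-out logic described above: answer codes are shuffled within each of the three sections while the section order is preserved.

    import random

    # Abridged, illustrative subsets of the Appendix A transaction list
    sections = [
        ["Applied for a benefit", "Reported a change of circumstances to DWP"],
        ["Stopped a claim", "Conducted a work search review"],
        ["Made a complaint", "Requested a form"],
    ]

    def read_out_order(sections, seed=None):
        """Randomise answer codes within each section, keeping the section order."""
        rng = random.Random(seed)
        order = []
        for block in sections:
            shuffled = block[:]      # copy so the master list is left untouched
            rng.shuffle(shuffled)
            order.extend(shuffled)
        return order

    # The interviewer reads this order until the claimant identifies a transaction
    print(read_out_order(sections, seed=1))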

Communication channel

The second part of this section included questions to identify the modes of contact used to contact the Department both initially and in any subsequent contact, and similarly to identify the modes of contact and experience of claimants when contacted by the Department.

In this section, claimants were asked for more detail about their experience of contact with DWP for their transaction, according to the mode of contact used. The types of contact covered were:

  • online

  • telephone (calling DWP)

  • telephone (receiving a call from DWP)

  • written contact (post, email and text message)

  • face-to-face

  • online account

3.1.3 Transaction-related questions

If a claimant’s transaction had involved contact with a member of DWP staff, they were asked further questions on responsiveness and the outcome of the transaction. This covered topics related to the DWP Customer Charter, including whether staff were helpful, polite, understanding, fair and sympathetic, and whether the information provided by DWP staff was correct.

3.1.4 General questions

To gather more information about the effectiveness of different modes of contact, a series of more general questions about points of contact with DWP were asked at the end of the questionnaire. These questions were not related to the claimant’s recent transaction. Claimants were asked questions on the following:

  • the accuracy and clarity of information provided by DWP

  • whether problems or complaints were resolved

  • different communication channels used for contacts outside of the transaction and their overall experiences

  • claimant’s experience, customer journey, failure demand and avoidable contact

The survey then concluded with a question asking for the claimant’s overall satisfaction with DWP services before collecting demographic details.

3.2 Changes to existing questions

The overall structure of the 2018/19 questionnaire was largely unchanged.

Minor updates and changes to the questionnaire were made to reflect changing categories and definitions, and in response to feedback on the operation of the questionnaire from interviewers and on areas of policy interest from stakeholders.

Some questions underwent substantial changes in the 2018/19 survey, either due to routing changes, the addition of new questions, or changes to the response lists. These changes are described below under the relevant sections.

Establishing the type of transaction

Some codes were added or removed from the question that establishes transaction type (see ‘Establishing the type of transaction’ in section 3.1.2).

The question format was also changed from a single ordered list, to a list consisting of three sections with answer codes being randomised within each section (for a complete list and randomisations, see Appendix A).

General questions

There were routing and structural changes on:

  • generic questions relating to payments received by the respondent and any problems they experienced with DWP

  • questions on online usage, with new questions also added to this section this year

Figure 3.1 Questionnaire structure

1. Introduction

  • Introduction, establishing if appointee and confirming benefit group

2. Getting key information: establishing transaction type

  • Establishing most recent transaction

3. Getting key information: communication channel

  • Modes of contact
  • Channel specific sections

4. Transaction-related questions

  • Satisfaction with transaction

5. General questions

  • Generic and complaints

  • Channel specific sections

  • Overall satisfaction

  • Demographics

4 Fieldwork and response

4.1 Fieldwork

4.1.1 Interviewer training

Many of Kantar’s interviewers had experience of working on the survey in previous years. Nevertheless, all interviewers attended an internal briefing conducted by Telephone Unit managers. This was supplemented by a further briefing by the research teams at Kantar and DWP to explain the purpose of the survey and the questionnaire structure. Kantar’s quality control exceeds the standards prescribed by ISO 20252, with at least seven per cent of completed interviews monitored. Call introductions were monitored throughout the fieldwork period and interviewers regularly received feedback and coaching.

4.1.2 Advance letters and opt outs

All claimants or, if applicable, their appointees in the supplied sample were sent an advance letter two weeks before the start of fieldwork. This letter explained the purpose of the study, reasons for their inclusion in the sample and how the survey would take place. In Wales, letters were produced in both English and Welsh.

The letters included a Freepost address and Freephone number for claimants to contact if they did not wish to be contacted further (opt out) or if they required help or further information about the study. Those who contacted Kantar to opt out before the start of fieldwork were removed before the final sample was delivered to the Telephone Unit. After that point, if a claimant opted out, their record was removed from the Telephone Unit sample file directly.

4.1.3 Main fieldwork period

In total, 15,000 interviews were carried out during four quarterly waves of fieldwork as detailed below:

  • Quarter 1: 29 August 2018 to 3 October 2018

  • Quarter 2: 25 October 2018 to 29 November 2018

  • Quarter 3: 21 January 2019 to 25 February 2019

  • Quarter 4: 29 March 2019 to 3 May 2019

4.2 Response

DWP set quota targets by benefit for each quarter to allow for analysis.

Table 4.1 summarises the interviews achieved by benefit across each fieldwork period.

Table 4.1: breakdown of achieved interviews by sampled benefit and quarter, 2018/19

Benefit Quarter 1 Quarter 2 Quarter 3 Quarter 4 Total
State Pension 113 113 113 111 450
Pension Credit 377 374 377 372 1,500
Attendance Allowance 113 113 113 111 450
Carer’s Allowance 114 113 114 109 450
DLAc 113 113 113 111 450
Personal Independence Payment 756 750 750 744 3,000
Employment and Support Allowance 750 752 750 748 3,000
Income Support 113 113 113 111 450
Jobseeker’s Allowance 190 188 188 184 750
Universal Credit 1,125 1,126 1,125 1,124 4,500
Total 3,764 3,755 3,756 3,725 15,000

In total, Kantar interviewed 25 per cent of the final resolved eligible sample[footnote 6].

The definition of the final resolved eligible sample includes claimants who were eligible for an interview and where a final fieldwork outcome was recorded[footnote 7]. This could be a completed interview or a refusal to take part, either prior to fieldwork or when contacted by an interviewer[footnote 8]. The definition of final resolved eligible sample excludes:

  • claimants who died

  • claimants with invalid or incorrect telephone numbers

  • cases where a named claimant was unknown at the telephone number recorded in the sample

  • ineligible sample members - claimants who were not in contact with DWP in the three months prior to interview or could not recall contact

This figure should not be considered a response rate. DWP set quotas for the number of interviews required by benefit group. Once this quota is met, Kantar cease attempting to contact claimants in this benefit group.

The overall conversion rate was 17 per cent. This is calculated as the number of interviews achieved as a proportion of the overall fieldwork sample (based on the number of advance letters sent out). This figure includes ineligible claimants.

Table 4.2 provides further information on conversion rate calculations.

Table 4.2: Fieldwork conversion rates, 2018/19

SP PC AA CA DLAc PIP ESA IS JSA UC Total
Advance letters sent 5,117 10,268 4,822 2,902 2,749 12,653 15,320 2,774 3,263 30,362 90,230
Final resolved eligible sample 3,374 7,176 3,719 1,877 2,009 9,093 10,821 2,044 2,465 19,187 61,765
Achieved telephone interviews 450 1,500 450 450 450 3,000 3,000 450 750 4,500 15,000
Eligible sample Conversion rate * 13% 21% 12% 24% 22% 33% 28% 22% 30% 23% 24%
Overall conversion rate 9% 15% 9% 16% 16% 24% 20% 16% 23% 15% 17%

* Eligible sample Conversion rate[footnote 9]
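
As a worked example of the two calculations behind Table 4.2, the short sketch below (Python; the function name is illustrative) reproduces the Personal Independence Payment column: 3,000 interviews from a final resolved eligible sample of 9,093 and 12,653 advance letters.

    def conversion_rates(interviews, resolved_eligible, letters_sent):
        """Eligible-sample and overall conversion rates as defined in section 4.2."""
        return interviews / resolved_eligible, interviews / letters_sent

    # Personal Independence Payment column of Table 4.2
    eligible, overall = conversion_rates(3_000, 9_093, 12_653)
    print(f"{eligible:.0%}, {overall:.0%}")   # prints "33%, 24%"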

Table 4.3 and table 4.4 provide further detail on the fieldwork figures and the proportion of the total DWP contacting population compared with achieved interviews.

Table 4.3 Fieldwork figures, 2018/19

SP PC AA CA DLAc PIP ESA IS JSA UC Total
Prior to fieldwork                      
Sample provided by DWP 5,225 10,373 4,873 2,926 2,772 12,738 15,433 2,783 3,278 30,492 90,893
Advance letters sent 5,117 10,268 4,822 2,902 2,749 12,653 15,320 2,774 3,263 30,362 90,230
Calls to office to opt-out before fieldwork 363 597 457 47 11 241 366 -24 36 199 2,293
Sample loaded into CATI 4,754 9,671 4,365 2,855 2,738 12,412 14,954 2,798 3,227 30,163 87,937
During fieldwork                      
Final resolved eligible sample 1,367 1,958 1,473 784 823 3,344 3,564 863 829 6,514 21,519
Opt-outs during fieldwork 25 53 46 7 5 29 30 0 6 21 222
Refusals (including proxy refusals) ** 161 325 113 83 58 607 519 68 109 584 2,627
Incomplete interviews 13 52 14 9 3 81 66 6 15 92 351
Ineligible *** 504 1,223 277 245 154 1,146 1,156 162 64 440 5,371
Deadwood **** 1,239 1,868 826 780 586 2,417 3,345 568 734 10,736 23,099
Unresolved sample 1,194 3,121 1,339 596 725 2,508 3,891 755 850 7,974 22,953
                       
Achieved telephone interviews 450 1,500 450 450 450 3,000 3,000 450 750 4,500 15,000

** Refusals (including proxy refusals)[footnote 10]
*** Ineligible[footnote 11]
**** Deadwood[footnote 12]

Table 4.4: The proportion of the total DWP contacting population compared with achieved interviews, 2018/19

SP PC AA CA DLAc PIP ESA IS JSA UC Total
Number of contacts made in 2018/19 1,375,851 248,603 445,864 691,080 381,640 3,124,089 3,024,426 1,088,075 6,401,929 11,092,088 27,873,646
Proportion of total DWP contacting population 4.9% 0.9% 1.6% 2.5% 1.4% 11.2% 10.9% 3.9% 23.0% 39.8% 100%
2018/19 interviews 450 1,500 450 450 450 3,000 3,000 450 750 4,500 15,000
Proportion of total survey population 3.0% 10.0% 3.0% 3.0% 3.0% 20.0% 20.0% 3.0% 5.0% 30.0% 100.0%

5 Data management, coding and weighting

5.1 Data management

Kantar produced datasets on a quarterly basis. From Q2 onwards, each data delivery was accompanied by a cumulative dataset of interviews to date, ending with the annual dataset delivered after Q4 fieldwork.

The dataset was checked and cleaned each quarter. This included:

  • routing checks on questionnaire variables

  • checks on all sample variables

  • cleaning of variable names, variable labels and value labels

  • comparison checks on quarterly datasets

  • sense checks on key variables

Derived variables were also created for analytical purposes.

5.2 Coding

The CSES contains many pre-coded answer lists developed over the years that the survey has been running. Some questions allow open ended responses to ‘other specify’ which are reviewed each year to consider if new answer codes should be added. If so, codes are generated manually through analysis of the text and agreed with the DWP research team. In 2018/19 no new codes were created.

5.3 Weighting

5.3.1 Overview

Kantar generated weights for analysis at national, regional and benefit group level. These were derived using a multi-stage process beginning with calculation of design weights and then using these to help construct non-response weights. Design weights compensate for variations in sampling probability; non-response weights compensate for variations in response probability. The final weight incorporates both elements.

5.3.2 Design weighting

The design weight compensates for variations in sampling probability within each contacting claimant population. The design weight for each individual is equal to one divided by his/her sampling probability. Consequently, the larger the design weight the lower the sampling probability. For contacting claimants of Attendance Allowance, Disability Living Allowance, Carer’s Allowance, Income Support, Pension Credit, State Pension, Employment and Support Allowance, Jobseeker’s Allowance and Personal Independence Payment there was some variation in sampling probability by region (former Government Office Regions). There was also variation in sampling probability for Universal Credit by UC region. The sampling probability was higher in less populous regions than in more populous regions. The design weight compensates for this variation.

The sampling probability is calculated as follows for each case:

P(sampled) = Number of cases sampled in the region ÷ Total contacting claimant population in the region

The design weight is then calculated as:

Design weight = 1 ÷ P(sampled)

An example, based on Universal Credit in the fourth quarter of the 2018/19 survey, is provided in Table 5.1. The same number of cases was sampled in London and Essex as in Wales. However, the contacting claimant population is much smaller in Wales. As a result, claimants who live in Wales are around four times more likely to be selected for the survey (sampling probability of 0.0220) than claimants who live in London and Essex (sampling probability of 0.0057). To compensate for this, claimants who live in London and Essex are given a larger design weight than those who live in Wales.

Table 5.1: Universal Credit Full Service sampling probabilities and design weights for London and Essex, and Wales, Quarter 4 2018/19

Region Number of cases sampled (A) Contacting claimant population (B) Sampling probability (C = A÷B) Design weight (D = 1÷C)
London and Essex 693 121,687 0.0057 175.6
Wales 693 31,519 0.0220 45.5
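
The figures in Table 5.1 follow directly from the two formulas above; the short sketch below (Python) reproduces them.

    # (cases sampled, contacting claimant population) from Table 5.1, Quarter 4 2018/19
    regions = {
        "London and Essex": (693, 121_687),
        "Wales": (693, 31_519),
    }

    for region, (sampled, population) in regions.items():
        p_sampled = sampled / population       # sampling probability
        design_weight = 1 / p_sampled          # compensates for unequal selection probability
        print(f"{region}: p = {p_sampled:.4f}, design weight = {design_weight:.1f}")
    # London and Essex: p = 0.0057, design weight = 175.6
    # Wales: p = 0.0220, design weight = 45.5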

5.3.3 Non-response weighting

Non-response weighting is required to compensate for systematic non-response – controlling for the fact that contacting claimants with certain characteristics may have a lower propensity to participate in the survey.

DWP provided benchmark population statistics for each contacting claimant population (by age, gender, region and contact type). These were used as benchmarks for judging the representativeness of the respondent dataset (after design weights have been applied).

However, during fieldwork, some sampled individuals were found not to be eligible for the survey because they did not contact DWP in the three months prior to interview. The proportion of claimants who were found to be ineligible for this reason varies greatly by benefit group and by demographic type. Therefore, the benchmark data itself is not a perfect representation of the contacting claimant population. This problem is overcome by comparing the combination of eligible and ineligible claimants – not just the eligible ones – to the relevant DWP database profile.

A method called ‘rim weighting’ (sometimes also called raking or iterative proportional fitting) is used to match this combined eligible and ineligible sample to the database profile in a statistically efficient manner and allocate all of the claimants a weight. Once this is completed, the ineligible claimants are deleted from the dataset. This leaves only eligible claimants (those that completed a full interview) with rim weights that ensure the interview dataset is representative of the eligible population[footnote 13].
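
The survey’s actual rim weighting specification (weighting variables, categories and software) is not reproduced in this note. The sketch below (Python/pandas, with hypothetical weighting variables, design weights and benchmark margins) illustrates the raking algorithm itself: weights start from the design weights and are repeatedly adjusted, one variable at a time, until the weighted profile matches the benchmark margins.

    import pandas as pd

    def rim_weight(df, targets, start="design_weight", max_iter=25, tol=1e-6):
        """Minimal rim weighting (raking / iterative proportional fitting).

        df      : case-level data (eligible and ineligible cases combined)
        targets : {variable: {category: population share}} benchmark margins
        start   : column holding the design weight, used as the starting weight
        """
        w = df[start].astype(float).to_numpy()
        for _ in range(max_iter):
            max_shift = 0.0
            for var, margin in targets.items():
                total = w.sum()
                for category, share in margin.items():
                    mask = (df[var] == category).to_numpy()
                    factor = share * total / w[mask].sum()
                    w[mask] *= factor
                    max_shift = max(max_shift, abs(factor - 1.0))
            if max_shift < tol:      # weighted profile now matches the benchmarks
                break
        return pd.Series(w, index=df.index, name="rim_weight")

    # Hypothetical cases and benchmark margins, for illustration only
    cases = pd.DataFrame({
        "gender":        ["F", "F", "M", "M", "F", "M"],
        "age_band":      ["16-34", "35-54", "16-34", "55+", "55+", "35-54"],
        "design_weight": [1.0, 1.2, 0.8, 1.0, 1.1, 0.9],
    })
    benchmarks = {
        "gender":   {"F": 0.55, "M": 0.45},
        "age_band": {"16-34": 0.30, "35-54": 0.40, "55+": 0.30},
    }
    cases["rim_weight"] = rim_weight(cases, benchmarks)
    # Ineligible cases would then be dropped, leaving weighted eligible interviews.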

5.3.4 Analytical weighting

The rim weight that is produced following the two stages above is used as the input for a series of other weights that are more convenient for analysis purposes. In general, these weights are simply re-scaled versions of the rim weight described above. A scaled weight is simply the rim weight multiplied by a constant.

Benefit weight

  • This weight is used when analysing each benefit group separately and is used for most parts of the annual report.

  • The weights of each benefit group are scaled so that the sum of weights equals the sample size for that benefit group.

  • This weight should not be used for any cross-benefit analysis.

DWP weight

  • This weight is designed for analysis of the total sample, most notably when producing the overall claimant satisfaction figure.

  • It is equivalent to the general rim weight, scaled so that the sum of weights equals the total sample size.

  • Details of the proportions for each benefit group within the total contacting population can be found in Table 4.4.

Annual weight

The Benefit and DWP weights are created each quarter. However, while the sample size per contacting population is approximately the same for each quarter, the population size varies considerably. Consequently, for the purpose of an Annual Weight, each quarterly sample is scaled so that in the Annual dataset, it takes its proper share. For example, in Q4 of the 2018/19 survey, the number of eligible Universal Credit claimants in the population was around 1.5 times higher than in Q1. Therefore, the Q4 data share in the Annual Universal Credit dataset should be 1.5 times that of Q1. The Annual weight ensures this happens without changing the relationship between the weights within each quarter.
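
The weighting production code itself is not published; the sketch below (Python/pandas, with hypothetical column names and figures) simply illustrates the two rescaling steps described above: scaling weights so they sum to the sample size, and giving each quarter its population share in the annual dataset.

    import pandas as pd

    def scale_to_sample_size(weights):
        """Benefit / DWP weight: rescale rim weights so they sum to the sample size."""
        return weights * len(weights) / weights.sum()

    def annual_weight(df, pop_sizes, col="rim_weight"):
        """Annual weight: scale each quarter so its share of the annual dataset
        matches its share of the eligible population, without changing the
        relationship between weights within a quarter."""
        total_pop = sum(pop_sizes.values())
        out = df[col].astype(float).copy()
        for quarter, grp in df.groupby("quarter"):
            target_sum = len(df) * pop_sizes[quarter] / total_pop
            out.loc[grp.index] = grp[col] * target_sum / grp[col].sum()
        return out

    # Hypothetical illustration: equal quarterly samples, Q4 population ~1.5x Q1
    df = pd.DataFrame({"quarter": ["Q1"] * 4 + ["Q4"] * 4, "rim_weight": [1.0] * 8})
    df["benefit_weight"] = scale_to_sample_size(df["rim_weight"])   # sums to 8
    df["annual_weight"] = annual_weight(df, {"Q1": 1_000_000, "Q4": 1_500_000})
    # Q4 rows now carry 1.5 times the weight of Q1 rows, as described above.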

Appendix A

The order of selected transactions in the questionnaire was as follows:


Section 1 – randomised

1. Applied for a benefit

2. Had a reassessment of your entitlement following a change of benefit (for example from Employment and Support Allowance to Jobseeker’s Allowance)

3. Received a decision following a medical assessment

4. Had an interview or review meeting (for example at a Jobcentre)

5. Reported a change of circumstances to DWP (including changes in your employment status)

Section 2 – randomised

6. Asked DWP to reconsider or appeal a decision concerning the benefit you applied for

7. Discussed jobs or training opportunities with someone at the Jobcentre

8. Looked for job vacancies

9. Stopped a claim

10. Conducted a work search review

Section 3 – randomised

11. Reported problems regarding accessing or using your online account

12. Reported issues or changes to your housing costs

13. Reported self-employment earnings

14. Received notification of a sanction

15. Reported problems with a benefit you are receiving

16. Received notification of a change to benefit payment (for example from monthly to fortnightly)

17. Tried to get information about a benefit

18. Made a complaint

19. Made an appointment (e.g. for an interview at a Job Centre) or asked for an appointment to be changed.

20. Requested a form

21. No contact with DWP in the last 3 months [ineligible for survey]

  1. https://www.gov.uk/government/publications/our-customer-charter/our-customer-charter 

  2. Or the claimant’s nominated appointee. Full details of eligibility are given in Section 2.1.1 

  3. ‘Recent contact’ is defined as contact between DWP and a DWP claimant taking place in up to three calendar months before the sample was drawn by DWP (but no earlier). Contact is defined as any time a DWP claimant has phoned, written a letter, emailed, visited a jobcentre, filled in an online form, or got in touch with a DWP representative. Contact is also any time someone from DWP has phoned, emailed, texted or sent a letter to a DWP claimant. 

  4. https://www.gov.uk/become-appointee-for-someone-claiming-benefits 

  5. Due to the limitation of the UC datasets, it was not possible to accurately extract and identify claimants with a ‘change of circumstances’. 

  6. For an introduction to the computation of response and non-response rates see ‘Chapter 6: Non response in sample surveys’ in Groves, R. M. et al. (2009) Survey Methodology, 2nd ed. Hoboken: Wiley. 

  7. A fieldwork outcome is recorded for all cases where the Telephone Unit have made contact with a respondent (e.g. appointment made, bad numbers). These outcomes remain interim until there is no reason for the Telephone Unit to make further contact, e.g. if an interview is achieved, or if a respondent has a ‘bad number’ (meaning they are not contactable through the phone number provided in the sample), at which point they are coded as a final outcome. At the end of the fieldwork period some cases remain unresolved as the Telephone Unit are either unable to make contact with the respondent despite multiple attempts or have been told to call back. Those who reported having had no contact with DWP in the last three months prior to the start of fieldwork are screened out as ineligible. 

  8. This figure includes claimants who have contacted Kantar before or during fieldwork to opt out of the survey. In these cases it was not possible to identify whether the claimant was ineligible or refused to take part in the survey. 

  9. The rate of eligible sample conversion will vary by benefit, as the calculation is dependent on the number of interviews achieved, the amount of final resolved eligible sample and how much sample is left unresolved. 

  10. Please note that there is a difference between an ‘opt-out’ and a ‘refusal’. Respondents are coded as an ‘opt-out’ if they make contact with Kantar to opt out of survey participation prior to being called. Respondents are coded as a ‘refusal’ if they refuse to take part in the survey during a phone call with a Telephone Unit interviewer. 

  11. Those who reported having had no contact with DWP in the last three months prior to the start of fieldwork are screened out as ineligible. 

  12. Deadwood includes cases where the telephone number was bad or incorrect, or where a respondent has died. 

  13. A rim weighting algorithm works by matching profiles one variable at a time, updating the weights each time, and then repeating the sequence until all respondent profiles match the population profile (typically after four or five iterations) or otherwise cannot be improved. For a discussion of the approach see Sharot, T. (1986) ‘Weighting survey results’. Journal of the Market Research Society, 28 (3), pp. 269–284.