Evaluation of the Warm Home Discount scheme 2022-23 to 2023-24: annexes (HTML)
Published 30 April 2026
Applies to England and Wales
Annex 1: Evaluation questions
The following evaluation questions were agreed with the department at the evaluation planning stage.
How is the recipient population of WHD structured?
Sub-questions:
- What were the characteristics of WHD rebate recipients (between and within rebate groups)?
- Were there any key differences between Core Group 1 and Core Group 2 recipients?
- Did the characteristics of rebate recipients match expected characteristics of those at risk of fuel poverty? Are there any groups who were systematically underrepresented?
- How many WHD rebate recipients have received support from other government fuel poverty schemes (such as the Energy Company Obligation (ECO) scheme)?
How effective was the implementation and delivery of the WHD rebate?
Sub-questions:
- How did rebate recipients hear about the WHD scheme? What sources of information did they draw on? Were these accurate?
- What were rebate recipients’ experiences of using the online eligibility checker? How, if at all, did experiences differ among recipients?
- What were rebate recipients’ experiences of using the helpline? How, if at all, did experiences differ among recipients?
- How effective and accessible was the helpline for people providing missing information or challenging the Core Group 2 decisions? How, if at all, did experiences differ among recipients?
- For those rebate recipients who were previously eligible, how, if at all, did their experiences of the reformed scheme differ?
- How well understood was the process around providing missing information or challenging a property’s energy score? Were there any common issues? How well did households understand why they were eligible, and what the eligibility criteria were?
- What were the energy suppliers’ experiences of the delivery of WHD rebates? How, if at all, did experiences differ?
- How did suppliers (who were involved in previous scheme years) perceive the administration of ‘Core Group 2’ compared with the previous scheme’s ‘Broader Group’? How, if at all, did experiences differ among energy suppliers?
- What were energy suppliers’ experiences of the helpline?
- From the perspective of energy suppliers, how did the delivery of the England and Wales WHD compare with the delivery of Scotland’s WHD?
- How did the differences in the WHD scheme between England and Wales and Scotland affect delivery in England and Wales?
- What has been the effect of the scheme reform on administrative costs for suppliers?
How effective was the implementation and delivery of Industry Initiatives?
Sub-questions:
- What were energy suppliers’ experiences of delivering Industry Initiatives? How, if at all, were there differences between suppliers?
- Was there sufficient capacity within energy suppliers to deliver Industry Initiatives in-house or with third party organisations?
- What Industry Initiative measures did energy suppliers provide/offer households?
- How effectively were energy suppliers able to support their fuel poor customers through Industry Initiative measures?
- What was the geographical distribution of Industry Initiative support across England and Wales?
What outcomes have been achieved through providing WHD to recipients?
Sub-questions:
- How has the rebate been used by recipients? Did recipients increase their energy consumption as a result of receiving WHD? Was this influenced by high energy prices?
- What outcomes were achieved in relation to energy consumption, thermal comfort, and recipient well-being?
- How effective was the rebate at targeting those at risk of fuel poverty?
- Were there any differences in outcomes between recipient groups?
What are the wider lessons from the reformed WHD scheme?
Sub-questions:
- Were there any unexpected considerations?
- What learning can the department apply to future energy bill relief schemes or future iterations of the WHD scheme?
Annex 2: Development of theory of change, customer journey map and energy supplier process map
This annex provides details of the process by which a scheme theory of change, customer journey map and energy supplier process map were developed. These were developed by the evaluation team following a detailed review of project documentation and consultation with key stakeholders.
Purpose of the theory of change, customer journey map and energy supplier process maps
A theory of change (ToC) of the Warm Home Discount scheme was developed to inform the evaluation plan, including the development of fieldwork materials and sampling approaches. A ToC is a tool for building a coherent understanding of the complexities of a policy and for assessing its effectiveness. It sets out how an intervention is expected to lead to a chain of results and impacts, whether intended or observed, and is produced by synthesising existing evidence on causal pathways. A ToC is an important tool in policy evaluation: it supports the development of core evaluation questions, helps identify key indicators for monitoring and gaps in available data, and provides a structure for data analysis and reporting.
A customer journey map was also developed to account for the fact that different cohorts of Warm Home Discount recipients access the rebate in different ways – for example, matched recipients receive the discount automatically while others need to use the helpline to access the rebate. The customer journey map provided a detailed account of the different routes and processes involved for different cohorts of Warm Home Discount recipients to access the rebate. The customer journey map was used to develop and confirm the evaluation team’s understanding of customer journeys to access the rebate, and to inform the development of fieldwork materials that were tailored to specific cohorts.
An energy supplier process map was developed to provide a clear account of the processes involved for energy suppliers in managing and issuing both Warm Home Discount rebates and Industry Initiatives. The energy supplier process map was used to inform the development of the energy supplier interview topic guides.
Process taken to develop the theory of change, customer journey map and energy supplier process maps
In the initial scoping stage of the evaluation, the evaluation team developed a draft ToC through a detailed document review and scoping interviews with relevant DESNZ, Ofgem and Department for Work and Pensions (DWP) policy and analyst colleagues. Evidence reviewed included:
- documentation on policy context and related policies, including the Domestic Energy Affordability Scheme, and existing secondary evidence relating to these policies
- DESNZ’s draft theory of change
- scheme monitoring data (Ofgem WHD Annual Report and Management Information data, WHD Annual Official Statistics, Helpline Provider KPIs)
- existing evaluation and appraisals of the WHD, including evaluations of previous scheme iterations and the impact assessment for the refined scheme
- international evaluations of similar policies, including bill support packages
The development of the ToC was structured around the following 7 core headings, in 3 categories:
- Context, inputs and mechanisms: key market, policy and social factors underpinning the rationale of the Warm Home Discount as ‘context’, while all activities and resources that enable the scheme to operate were considered as ‘inputs’. The ‘mechanisms’ related to how reforms are intended to achieve stated outputs and outcomes
- Activities and outputs: the activities detailed how the reformed Warm Home Discount supports fuel-poor and low-income households across England and Wales. All immediate and measurable results derived from these activities were recognised as ‘outputs’
- Outcomes and impacts: subsequent changes/benefits that resulted from outputs were classified as ‘outcomes’. These captured anticipated changes within the market and among recipients. Ultimate benefit realisations of the Warm Home Discount were identified in the ToC as ‘impacts’ and captured the final intended changes which justify the intervention’s rationale
The draft ToC was tested with key stakeholders including DESNZ, Ofgem and DWP policy and analyst colleagues during a theory-building workshop, following which the ToC was refined to provide a final version that was used to inform evaluation planning.
The customer journey map and the energy supplier process map were developed concurrently with the ToC, ‘zooming in’ on the activities involved for different cohorts of customers in accessing Warm Home Discount rebates, and on the activities of energy suppliers in distributing rebates and Industry Initiatives. Draft versions of both maps were shared with key stakeholders with detailed knowledge of the processes involved, following which they were further developed and finalised.
Annex 3: Survey methodology
Sampling frame and sample design
The sampling frame used for the survey was official government records of WHD rebates issued during the 2023/24 winter.
To achieve the target of 4,000 completed surveys, a stratified random sample of 32,042 addresses was selected and invited to participate in the survey.[footnote 1] Households invited to participate were drawn at random from within each of the 6 main WHD recipient subgroups. These subgroups are described in Table 1 of the main report.
The sample employed a disproportionate sampling approach, where subgroups were not targeted strictly in proportion to their share of the full population of 2023/24 WHD recipients. Instead, some smaller subgroups were oversampled, relative to their true size, in order to achieve minimum sample sizes for subgroup analysis. Table A3.1 below shows the subgroups of recipients and their respective shares of the 2023/24 rebates issued. If a strictly proportionate sampling approach had been taken, the expected sub-sample sizes for Core Group 1 Unmatched, Core Group 2 Other and Core Group 2 No Letter would have been too small for meaningful statistical analysis.
Table A3.1 Shares of 2023/24 WHD Cohorts in England & Wales[footnote 2]
| Cohort | Rebates issued (Count) | Share of rebates issued |
|---|---|---|
| Core Group 1 Matched | 840,000 | 27% |
| Core Group 1 Unmatched | 30,000 | 1% |
| Core Group 2 Matched | 2,000,000 | 64% |
| Core Group 2 Unmatched, Core Group 2 Other, Core Group 2 No Letter | 220,000 | 7% (note the majority are Unmatched) |
| Total | 3,100,000 | - |
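The arithmetic behind this decision can be sketched as follows. This is an illustrative reconstruction, not the evaluation team’s own code; the shares are taken from Table A3.1 and the 4,000-response target from the text above.

```python
# Expected subgroup sample sizes if the 4,000-response target were
# allocated strictly in proportion to each cohort's share of 2023/24
# rebates issued (shares from Table A3.1).
shares = {
    "Core Group 1 Matched": 0.27,
    "Core Group 1 Unmatched": 0.01,
    "Core Group 2 Matched": 0.64,
    "CG2 Unmatched / Other / No Letter": 0.07,
}
target_total = 4_000

expected = {cohort: round(target_total * share) for cohort, share in shares.items()}

# Core Group 1 Unmatched would yield only ~40 responses, and the three
# smallest Core Group 2 routes would share ~280 between them - too few
# for meaningful subgroup analysis, hence the oversampling.
for cohort, n in expected.items():
    print(f"{cohort}: {n}")
```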
Following the pilot fieldwork (see below), it became clear that households from the Core Group 1 matched subgroup – the second largest of the subgroups – were likely to have a substantially lower response rate than the other subgroups. Consequently, when the remainder of the sample was selected for the mainstage fieldwork, the overall target for the Core Group 1 matched subgroup was slightly reduced, and the targets for some of the higher-performing smaller subgroups were somewhat increased beyond the original oversampling proposed. When making these changes, the evaluation team ensured that the shift in strata targets would not unduly affect the precision of estimates made drawing on the survey.
The number of households selected to receive survey invitations in the mainstage fieldwork was based on response rates observed during the pilot. However, mainstage response rates could not be predicted precisely from the pilot. This meant that final individual stratum numbers came close to, but did not precisely match, their pre-defined targets, and the overall number of people who responded slightly exceeded the original 4,000 target, at 4,014.
Table A3.2 shows the shares of the target sample number for each subgroup compared to the sample numbers achieved. It also provides details on how many households were contacted to achieve each subgroup’s end number and the corresponding response rate for each subgroup.
Table A3.2 Sample achieved and response rates
| Subgroup | Households contacted | Target | End sample | Response rate |
|---|---|---|---|---|
| Core Group 1 matched | 6,635 | 600 | 593 | 9% |
| Core Group 1 unmatched | 1,440 | 200 | 169 | 12% |
| Core Group 2 matched | 17,818 | 2,225 | 2,202 | 12% |
| Core Group 2 unmatched | 4,024 | 675 | 723 | 18% |
| Core Group 2 Other | 1,351 | 150 | 189 | 14% |
| Core Group 2 No Letter | 774 | 150 | 138 | 18% |
| Total | 32,042 | 4,000 | 4,014 | 13% |
Questionnaire design and cognitive testing
Questionnaire design
When designing the questionnaire, established questions from previous national surveys were used, where possible, in order to reduce measurement error. These were supplemented by survey questions specific to the Warm Home Discount scheme, developed by the evaluation team. A summary of the topics covered by the questionnaire is shown in Table A3.3 below.
Table A3.3: Survey topics
| Main Topic | Subtopics |
|---|---|
| Understanding of the scheme | – Familiarity with the Warm Home Discount Scheme – Recollection of 2023/24 rebate |
| Experiences of the scheme delivery and use of the rebate | – Eligibility route – Communication with/from energy provider and/or government – Way WHD is received – Use of the WHD payments – Use of new customer helpline and online eligibility checker – Perceived outcomes due to WHD |
| Scheme outcomes | – Financial security – Health – Well-being – Thermal comfort – Sense of agency over energy bills / costs |
| Targeting of the scheme | – Household data: Household size and composition, household income, disability status, entitlement to benefits – Dwelling data: Housing tenure, fuel type, energy bill payment mode, number of bedrooms – Data for household reference person: employment status, ethnicity and age – Receipt of other energy support |
| Further research | – Willingness to take part in further research – Contact details |
Cognitive testing
Eighteen cognitive interviews were carried out with participants from the target groups for the survey. Cognitive interviewing is a well-established method for pre-testing survey questionnaires. It involves using qualitative interviewing techniques to understand whether, in practice, survey questions function as intended. For example, conducting cognitive interviews can help identify when a question or option is being misinterpreted in a way not previously anticipated. The approach can also be helpful for highlighting any questions that assume a level of knowledge or recall of a subject that the survey participants do not have in practice.
These interviews were largely carried out over the phone because many participants were not familiar or comfortable with video calling and screen-sharing technology. The interviews used concurrent probing: interviewees were first asked the survey question exactly as it would be administered in the telephone mode of the survey, before being asked a series of additional questions to elicit how they had interpreted it.
Some additional questions (probes) were specific to individual questions, with other more general probes being asked for a number of questions, for example:
- How easy or difficult did you find this question?
- How well do you remember this?
- What do you understand by the term “[word or phrase used in question]”?
- Can you tell me why you selected that option?
- Were you able to find a response option from the list that captures how you would answer this question?
The main aim of the cognitive interviews was to pre-test the new survey questions that were bespoke to the Warm Home Discount survey. However, the cognitive testing protocol also included some of the established questions as, although these questions had previously been tested, they often came from surveys conducted with a broader demographic than the Warm Home Discount survey.
The changes recommended following the cognitive testing involved minor revisions rather than the removal or substantial redesign of questions. For example, explanations were added in brackets where terms were not consistently understood, and options that participants felt were missing were added to pre-defined lists.
Data collection model
The Warm Home Discount survey was designed as a mixed-mode survey that could be completed either online, by telephone, or via a self-completed paper survey. Survey recruitment took an online first or ‘push-to-web’ approach. Sampled households were sent a letter inviting them to take part in the survey. The letter also contained a link and QR code to access the online survey, along with a unique username and password.
For those who were less able or comfortable with the online format, the letter also contained details of a phone number and email address that respondents could contact to request a telephone interview or a paper questionnaire to self-complete at home and return via post. The vast majority of respondents (96%) opted to complete the survey online, with just 3% of respondents completing the survey via the phone and 1% completing a paper questionnaire.
Most sampled households who did not complete the survey after the initial invite received a follow-up invitation in the post 1 to 2 weeks after the initial invitation. There were 2 exceptions to this: 1) households who had requested to opt out of further communication after the initial invitation, and 2) households from smaller subgroups where the response rate to the initial invitation letter was high enough to reach the target without a reminder letter. Table A3.4 below shows the number of initial invitation letters and reminder letters sent to each of the 6 subgroups.
Table A3.4 Invitation and reminder letters sent
| Subgroup | Invitation letters sent | 1st reminders | 2nd reminders (Batch 1 of pilot only) | Total |
|---|---|---|---|---|
| Core Group 1 Matched | 6,635 | 6,410 | 474 | 13,519 |
| Core Group 1 Unmatched | 1,440 | 1,380 | 67 | 2,887 |
| Core Group 2 Matched | 17,818 | 16,982 | 926 | 35,726 |
| Core Group 2 Unmatched | 4,024 | 3,714 | 184 | 7,922 |
| Core Group 2 Other | 1,351 | 660 | 26 | 2,037 |
| Core Group 2 No Letter | 774 | 86 | 36 | 896 |
| Total | 32,042 | 29,232 | 1,713 | 62,987 |
Piloting
As the questionnaire primarily drew on pre-existing survey questions and had also been pre-tested via the cognitive interviews, no significant changes to the questionnaire were expected to follow the pilot. Consequently, rather than a stand-alone pilot, the survey used a soft-launch approach where a small proportion of the overall mainstage target was collected in advance of the main period of fieldwork to test assumptions about likely response rates and the logistics of administering the CAWI (online), CATI (telephone) and paper survey modes.
The pilot had an initial target of recruiting 300 responses (7.5% of the overall sample target) and a conservative assumption of a 10% response rate. Accordingly, 3,000 households were invited to participate during the pilot phase. Although the end-of-pilot sample far exceeded the 300 target, yielding 491 valid survey responses, there were substantial variations in response rates between the 6 recipient cohorts:
- 2 subgroups achieved or exceeded the 10% response rate after a single invitation letter
- 4 subgroups were below 10% after a single letter but exceeded 10% after a single reminder letter
- 1 subgroup (Core Group 1 matched) was still below 10% after 1 reminder letter and only exceeded 10% after a second reminder
This response-rate data from the pilot was used to forecast the number of letters and reminders that would be required to reach the overall target of 4,000 responses during the mainstage fieldwork.
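The forecasting logic described above can be sketched as follows. This is a hypothetical simplification (the `letters_needed` helper is invented for illustration): in practice the team also had to account for reminders and opt-outs.

```python
import math

def letters_needed(target_responses: int, response_rate: float) -> int:
    """Invitations required to reach a response target at a given rate."""
    return math.ceil(target_responses / response_rate)

# e.g. at the pilot's conservative 10% assumption, the 300-response
# pilot target implies 3,000 invitations, as described above
print(letters_needed(300, 0.10))

# applying a subgroup's observed pilot rate to its mainstage target
# gives the invitations needed for that stratum, e.g. a 2,225 target
# at a 12% response rate
print(letters_needed(2_225, 0.12))
```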
Information on each cohort’s response rate in the pilot also led to refinements to the sample design. In the original design, oversampling was used to achieve minimum sizes for smaller subgroups, but following the pilot, this oversampling was increased slightly to compensate for lower response rates among certain subgroups.
Data processing and weighting of survey responses
Weighting
The survey adopted a disproportionate stratified sampling design, where smaller recipient subgroups were oversampled relative to their true share of all rebates issued in Winter 2023/24. Left unweighted, this would introduce bias into the survey results reported for the whole sample, as these estimates would overrepresent these smaller groups. Consequently, whenever whole-sample results were estimated, population weights were applied to bring the distribution of recipient cohorts in line with the most recent available government data about the number of rebates issued to each subgroup in 2023/24. Table A3.5 below provides details of how the unweighted survey sample share of each subgroup differs from the weighted sample share.
Table A3.5. Sample shares vs. true population shares of rebates issued
| Subgroup | Sample size | Share of sample | Total WHD rebates issued | Share of total rebates[footnote 3] |
|---|---|---|---|---|
| Core Group 1 Matched | 593 | 15% | 840,000 | 27% |
| Core Group 1 Unmatched | 169 | 4% | 30,000 | 1% |
| Core Group 2 Matched | 2,202 | 55% | 2,000,000 | 64% |
| Core Group 2 Unmatched, Core Group 2 Other, Core Group 2 No Letter | 1,050 | 26% | 220,000 | 7% |
| Total | 4,014 | - | 3,100,000 | - |
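The design-weight calculation implied by Table A3.5 can be sketched as follows. This is an illustrative reconstruction, not the published weighting code, and ignores any further non-response adjustments the evaluation team may have applied.

```python
# Rebates issued and achieved sample sizes, from Table A3.5
population = {
    "CG1 Matched": 840_000,
    "CG1 Unmatched": 30_000,
    "CG2 Matched": 2_000_000,
    "CG2 Unmatched/Other/No Letter": 220_000,
}
sample = {
    "CG1 Matched": 593,
    "CG1 Unmatched": 169,
    "CG2 Matched": 2_202,
    "CG2 Unmatched/Other/No Letter": 1_050,
}
N = sum(population.values())  # all rebates issued
n = sum(sample.values())      # all survey responses

# weight = (population share) / (sample share); a weight above 1 means
# the subgroup was under-represented in the sample and is weighted up
weights = {g: (population[g] / N) / (sample[g] / n) for g in population}
for g, w in weights.items():
    print(f"{g}: {w:.2f}")
```

Applying these weights, Core Group 1 Matched (15% of the sample but 27% of rebates) is weighted up, while the oversampled smaller routes are weighted down.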
Data processing
Coding of verbatim text
For some questions, respondents could enter verbatim text into an ‘Other, please describe’ option. These verbatim responses were analysed by the evaluation team and, where appropriate, back-coded into existing question options. New codes were also created for common verbatim themes that each accounted for over 10% of the responses given in an ‘other’ option. These 2 coding processes improved the usefulness of the survey variables for analysis by reducing the number of responses left in the residual ‘other’ category.
Inconsistent responses
One of the early questions in the survey was important for routing respondents to particular follow-up questions. This question asked respondents ‘How did you find out that you were entitled to the Warm Home Discount this winter?’
Some respondents selected responses to this question that were inconsistent with government records about their rebate route. For example, some people who had received the rebate automatically selected an option indicating that they first needed to call the helpline to determine their eligibility. The subgroup with the largest share of inconsistent responses was the ‘No Letter’ subgroup, with 44% of respondents from this group picking a response that was inconsistent with their official rebate route. Other subgroups were less affected, ranging from 8% to 26%.
These inconsistent responses had the potential to introduce measurement error into follow-up questions that asked about respondents’ experiences of a given rebate route. For example, anyone selecting an option indicating that they used the helpline would be shown a follow-up question, which would ask whether the helpline staff had explained why they were eligible in a way that they could understand (even if official records indicated they received the rebate without having to contact the helpline to determine their eligibility).
To avoid these inconsistent cases adding noise to follow-up questions, the evaluation team opted to separate them out. Answers from anyone who had been routed to a follow-up question after selecting an option inconsistent with official records were reassigned to an additional ‘suppressed’ category. The length of time between receipt of the rebate (October 2023 to March 2024) and the survey fieldwork (September to November 2024) is likely to have contributed to these inconsistencies, making it harder for respondents to recall their eligibility route, particularly if they had received the rebate via other routes in previous winters.
Analysis and reporting
Tabulations of the survey responses were created to inform further analysis and triangulation with qualitative fieldwork findings. For all questions, overall results, weighted by cohort, were presented along with unweighted results disaggregated by recipient cohort.
Where appropriate, tabulations of responses were further disaggregated by the following subgroups:
- Household composition
- Age band of the oldest household member
- Disability status
- Ethnicity
- Work status
- Tenure type
- Age band of the property
The tabulations by cohort and other subgroup classifications also included a column proportions test. This test considers each row of a table independently, comparing pairs of columns to test whether the proportion of respondents in one column differs significantly from the proportion in the other.
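A column proportions test of this kind is, at its core, a two-proportion z-test. The sketch below is a minimal illustration of that idea; the tabulation software’s exact implementation may differ (for example, in multiple-comparison adjustments), and the counts used here are invented.

```python
import math

def two_prop_z(x1: int, n1: int, x2: int, n2: int) -> float:
    """z statistic for H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled under H0
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # standard error
    return (p1 - p2) / se

# e.g. 120 of 600 in one column vs 300 of 2,200 in another (invented);
# |z| > 1.96 indicates a difference significant at the 5% level
z = two_prop_z(120, 600, 300, 2_200)
print(abs(z) > 1.96)  # prints True
```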
In addition, the evaluation team conducted logistic regression analysis on specific question responses. Logistic regression is a statistical technique used to identify which factors (attributes of a survey respondent) are associated with a binary outcome (yes/no), while holding constant the influence of other factors that could also affect the outcome. For example, it can estimate the influence of living in a single-person household on the likelihood that a respondent reported using the discount “to reduce future gas and electricity bills”, relative to other factors. The logistic regression models included the following explanatory variables:
- Recipient sub-group
- Household Type
- Tenure
- Ethnicity
- Presence of any physical/mental health conditions
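The single-person-household example above can be illustrated with a minimal sketch. The dataset is invented and deliberately tiny, and the hand-rolled gradient-ascent fit stands in for the standard statistical software the evaluation team would have used; it is illustrative only, not the evaluation’s model.

```python
import math

def fit_logit(xs, ys, lr=0.5, steps=20_000):
    """Fit intercept b0 and slope b1 by gradient ascent on the log-likelihood."""
    b0 = b1 = 0.0
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1 / (1 + math.exp(-(b0 + b1 * x)))  # predicted probability
            g0 += y - p
            g1 += (y - p) * x
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1

# Invented toy data: x = 1 for single-person households, y = 1 if the
# respondent reported using the discount to reduce future bills
xs = [1] * 10 + [0] * 10
ys = [1] * 7 + [0] * 3 + [1] * 3 + [0] * 7

b0, b1 = fit_logit(xs, ys)
# exp(b1) is the odds ratio: how much receiving the outcome is more
# likely (in odds terms) for single-person households
print(round(math.exp(b1), 2))
```

With several explanatory variables, as in the models listed above, each coefficient is interpreted the same way while the other variables are held constant.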
Annex 4: Qualitative data collection methodology
Sampling approach
Recipient interviews
In the first phase of the evaluation, 2 waves of recipient interviews were conducted. The first wave covered experiences of the 2022/23 scheme period. 955 interview invites were sent and 26 interviews were conducted. The second wave covered experiences of the 2023/24 scheme period. 381 interview invites were sent and 52 interviews were conducted.
Topic guides were developed to cover topics relevant to the evaluation questions. Key topics of focus included knowledge of the scheme and referral routes, use of the online eligibility checker and helpline, key processes, use of the rebate, changes in energy consumption, and perceived outcomes on thermal comfort, health and well-being.
Sampling for the first wave drew on recipient contact data provided by DESNZ to RSM. DESNZ provided a sub-sample of individual records, including postal address, Core Group (1 or 2) and the data matching cohort (matched, unmatched etc.). A further secondary sampling frame drawn from helpline recipients who had consented to be contacted for research purposes was also used to target unmatched cohorts. Purposive sampling was used to ensure coverage of all recipient types, rather than aiming for a representative sample (the majority of which would be formed by matched customers in CG1 and CG2), due to the evaluation’s focus on key processes including use of the helpline.
Table A4.1: Recipient interviews wave 1: Interview sampling per cohort
| Cohort | Invites sent | Target sample size | Sample size achieved | Response rate |
|---|---|---|---|---|
| Core Group 1 Matched | 135 | 7–9 | 2 | 1% |
| Core Group 1 Unmatched | 202 | 7–9 | 5 | 2% |
| Core Group 2 Matched | 135 | 7–9 | 2 | 1% |
| Core Group 2 Unmatched | 203 | 7–9 | 7 | 3% |
| Core Group 2 Other | 194 | 8–10 | 5 | 3% |
| Core Group 2 No Letter | 55 | 4–5 | 5 | 9% |
| Unknown | 31 | 0 | 0 | N/A |
| Total | 955 | 50 | 26 | 3% |
Table A4.1 shows the number of invites sent and sample size achieved by cohort for wave 1. The invites for wave 1 saw a low response rate of 3%, meaning that only 26 interviews were conducted, possibly because invites were sent nearly a year after the 2022/23 rebate had been received. Additionally, 4 recipients reported not having received the Warm Home Discount during the 2022/23 period, possibly due to recall bias or confusion with other energy-related support schemes such as Energy Price Guarantee or Energy Bill Support Scheme. While the sample was considered sufficient to generate findings for each evaluation question, it limited confidence in cross-group comparisons and highlighted a reliance on reaching the ‘point of saturation’ in qualitative interviews.
Sampling for wave 2 drew on the survey. At the end of the survey participants were asked whether they consented to be contacted for further research, and those that consented formed the sampling frame for the second wave of recipient interviews. Purposive sampling was used to ensure coverage of all recipient types. Table A4.2 shows the number of invitations sent, sample sizes achieved and response rates.
Table A4.2: Recipient interviews wave 2: Interview sampling per cohort
| Sub-group | Invitations sent | Target sample size | Sample size achieved | Response rate |
|---|---|---|---|---|
| Core Group 1 Matched | 139 | 10 | 10 | 7% |
| Core Group 1 Unmatched | 47 | 10 | 7 | 15% |
| Core Group 2 Matched | 83 | 10 | 12 | 14% |
| Core Group 2 Unmatched | 42 | 10 | 11 | 26% |
| Core Group 2 Other | 51 | 5 | 8 | 16% |
| Core Group 2 No Letter | 20 | 5 | 4 | 20% |
| Total | 381 | 50 | 52 | 14% |
Recipient topic guides were developed to cover topics relevant to the evaluation questions. Interviews were semi-structured, to allow coverage of all key areas whilst also allowing space to surface new insights. Separate topic guides were developed for matched and unmatched recipients to take account of the different processes involved in receiving WHD.
Energy supplier representative interviews
Interview invites were sent to 18 energy supplier representatives and 10 interviews were conducted, a response rate of 56%. All obligated energy suppliers were invited to participate, ensuring coverage of all suppliers in the Ofgem WHD annual report.[footnote 4] Interviews were based on topic guides developed to cover topics relevant to the evaluation questions, including experiences of administrative processes and costs, experiences of Industry Initiatives, and the targeting of the most fuel-poor customers. All 10 interviewees were involved in Industry Initiatives delivery, as these were made compulsory in scheme year 2022/23 as part of wider WHD reforms.
Table A4.3: Invitations and interviews with energy suppliers categorised by customer base
| UK customer accounts (approx.) | Number of suppliers invited | Response rate | Interviews completed |
|---|---|---|---|
| 7 million + | 3 | 100% | 3 |
| 5-7 million | 2 | 0% | 0 |
| 3-5 million | 1 | 100% | 1 |
| 1-3 million | 3 | 66% | 2 |
| 0-1 million | 6 | 50% | 3 |
| Unknown | 3 | 33% | 1 |
| Total | 18 | 56% | 10 |
Analysis and synthesis
Interviews were recorded, transcribed, and sample-checked for accuracy. For analysis, the recipient interview transcripts were imported into NVivo 15.[footnote 5] The energy supplier interview transcripts were imported into Excel, given the small number of transcripts. The following processes were employed to analyse the interview evidence:
- Development of coding frameworks: Separate coding frameworks were designed for recipient interviews (see A4.4 below) and energy supplier interviews (see A4.5 below), ensuring alignment with key research questions and topic guides
- Thematic coding and synthesis: Interviews were coded to identify key emerging themes, which were then synthesised into descriptive accounts of broader themes. Where feasible, themes used in qualitative analysis were aligned with those in quantitative analysis to facilitate integration
- Comparative analysis: Demographic criteria (such as gender, location, age) were applied to enable comparisons across different demographic groups
- Explanatory analysis: Thematic accounts were used to develop explanations underpinning the descriptive findings
- Case study development: A selection of recipients who had not previously received support were examined as case studies to illustrate key findings regarding their experiences and outcomes
Table A4.4: Recipient interviews analytical framework
Primary code: Introduction
| Secondary codes: |
|---|
| Whether they understand WHD and how it differs from other schemes (yes/no) |
| How they found last winter in terms of keeping their home warm |
| Whether they heard about WHD before the interview (yes/no) |
| How they heard about the WHD |
| Where they looked for WHD information |
| Usefulness of WHD advice |
| When they received the rebate in their account |
| Whether they received the rebate in previous years (yes/no) |
| How the experience of receiving the rebate was different before |
| Whether applied for other energy support (yes/no) |
| Other energy support received |
Primary code: Government letter
| Secondary codes: |
|---|
| Whether they received a letter |
| Whether they expected to receive the discount |
| Understanding of eligibility |
| How they developed understanding of eligibility |
| What they understood from the letter |
| Usefulness of the letter |
Primary code: Helpline
| Secondary codes: |
|---|
| Whether they were aware of the helpline |
| Whether they called the helpline (yes/no) |
| Why they called the helpline |
| Advice given via the helpline |
| Understanding of the advice given |
| Whether helpline was successful in resolving the issue (yes/no) |
| How the helpline could be improved |
| Whether they challenged their energy score (yes/no) |
| Understanding of the process for challenging energy score |
| How easy the challenging energy score process was |
| Issues experienced whilst challenging the energy score |
Primary code: Eligibility checker
| Secondary codes: |
|---|
| Whether they were aware of the eligibility checker (yes/no) |
| Whether they used the checker (yes/no) |
| Usefulness of the eligibility checker |
| When they accessed the checker (before/after the letter) |
| How checker could be improved |
Primary code: Receipt of other energy bills support
| Secondary codes: |
|---|
| Whether they received other energy bills support (yes/no) |
| Which other energy bills support they received |
Primary code: Energy and heating decisions
| Secondary codes: |
|---|
| How often they think about heating their home |
| Type of heating they have |
| Whether they struggle to heat their home (yes/no) |
| How they pay their energy bills |
| What they do on a particularly cold day |
| How rising energy prices affected them |
Primary code: Outcomes
| Secondary codes: |
|---|
| What they did differently because of the discount |
| When they started doing something differently |
| How receiving the discount affected their finances |
| Whether they felt they had more money to spend on anything else |
| Impact of the discount on energy consumption |
| Impact of being able to heat their home |
| How they spent the rebate |
| Impact of the rebate on other spending |
| How the scheme can be improved |
| Lasting benefits from the discount |
Table A4.5: Energy supplier interviews analytical framework
Primary code: Introduction
| Secondary codes: |
|---|
| Role |
| Responsibilities |
| Company’s engagement with the WHD |
| Length of delivery of WHD |
| Whether affected by supplier participation threshold reduction (yes/no) |
| Examples of how affected by supplier participation threshold reduction |
| If newly included, whether had signed up voluntarily previously (yes/no) |
| How customer base aligns with other energy suppliers |
| How customer demographics differ from other suppliers |
| Examples of any other demographic differences |
| Whether customer base is split proportionally in England/Scotland (yes/no) |
| Details of disproportionate split |
Primary code: Implementation and delivery
| Secondary codes: |
|---|
| Examples of positive delivery experiences |
| Examples of delivery challenges |
| Whether scheme delivered differently across regions (yes/no) |
| Examples of how the scheme is delivered differently across regions |
| Examples of how the scheme can be improved across regions |
| Descriptions of reporting arrangements with Ofgem |
| Reporting arrangements challenges |
| Whether they have experienced compliance-related challenges (yes/no) |
| Examples of compliance-related challenges |
| Whether there is a process for customers who slipped through the gaps (yes/no) |
| Examples of processes for customers who slipped through the gaps |
Primary code: Scheme reform
| Secondary codes: |
|---|
| Whether involved in previous years (yes/no) |
| Challenges to administering CG2 |
| Ways challenges differ from administering Broader Group |
| Whether scheme reform has affected administrative costs (yes/no) |
| Ways scheme reform has affected administrative costs |
| Other ways the reforms affected them |
Primary code: Recipient experience
| Secondary codes: |
|---|
| Whether information or guidance was provided (yes/no) |
| Ways the information or guidance provided has changed |
| Whether customers sufficiently understand why they’re eligible (yes/no) |
| How well customers understand the process for challenging energy cost score |
| Under-represented groups in need of support |
Primary code: Industry initiatives
| Secondary codes: |
|---|
| Whether they provide Industry Initiatives (yes/no) |
| Which Industry Initiatives they provide |
| Reasons for selecting specific Industry Initiatives |
| How the reform affected the Industry Initiatives provided |
| Unintended consequences from the reform |
| Experience with delivering Industry Initiatives |
| Challenges with delivering Industry Initiatives |
| Industry Initiatives that were particularly hard to deliver |
| Delivering initiatives in house/via external providers |
| Customers targeted with the initiatives |
| Effectiveness of the initiatives |
| Data collected to showcase effectiveness/impact |
| Experience with reporting to Ofgem |
| Geographic locations of initiative delivery |
| Reasons for geographic locations of initiative delivery |
- 3,000 of these addresses were selected for the initial pilot, followed by a further 29,042 addresses for the mainstage fieldwork. ↩
- DESNZ (2024) ‘Warm Home Discount statistics, 2023 to 2024’ ↩
- DESNZ (2024) ‘Warm Home Discount statistics, 2023 to 2024’ ↩
- Ofgem (2024) ‘Warm Home Discount Annual Report, Scheme Year 12’. At the time of writing, Ofgem had not yet released data on the 2023/24 scheme year. ↩
- NVivo 15 is computer software used to organise and analyse information such as interview transcripts. ↩