Home Energy Advice Tool (HEAT)
Service report for DESNZ's Home Energy Advice Tool (HEAT) alpha assessment
Service Standard assessment report
Home Energy Advice Tool
Assessment date: 14/05/2025
Stage: Alpha
Type: Assessment
Result: Amber
Service provider: Department for Energy Security and Net Zero
Service description
This service aims to solve the problem of:
- information on home energy improvement measures and finance is often fragmented and poorly signposted, making it challenging for consumers to find the information they need. The new Home Energy Advice Tool (HEAT) will provide a GOV.UK service centred on tailored advice, bringing information together for consumers in one place. This will enable an effective digital consumer journey in which consumers, including those who do not qualify for public grants, can identify and choose which home retrofit improvements to make, find a trusted installer, apply for existing public funding and learn more about sources of green finance.
Service users
This service is for:
General End-User Group:
- Lower Income Owner Occupier – those who do not have the money to consider all the options but want to improve home heating. They may be unsure how to find advice and information, and who to trust.
- Able to Pay Owner Occupier – those who do not know the right solution for their house type and are not sure who to trust for advice and information.
- Private Tenant – those who are not in control of the situation and are easily put off by information overload.
- Social Renter – those who do not know where to start in terms of options and do not know what they are eligible for as a social tenant.
- Professional Landlord – those who understand grant entitlement per house or housing stock but find it challenging to confirm reliable, proven options and are sceptical of the return on investment of low carbon home improvements. They do, however, understand the impact on home resale value.
Things the service team has done well:
- the team have a clear vision for the service and have collaborated with other teams in the ecosystem to learn lessons.
- the team demonstrated good evidence of using data from the current, clunky service to generate insights that informed the design of the new digital service.
- the team demonstrated strong relationships with policy and delivery teams across the organisation to understand interdependencies and share knowledge.
- the team have navigated legacy implementation and fast-paced policy change.
1. Understand users and their needs
Decision
The service was rated amber for point 1 of the Standard.
During the assessment, we didn’t see evidence of:
- conducting moderated research to understand the contexts of the most sensitive user groups.
- conducting moderated research to understand the contexts of service users with accessibility needs.
- working with existing teams within the Department to leverage opportunities for engaging with these two dimensions.
The service team can use the existing unmoderated research to refine the samples for the next rounds of research. The team can also build on its existing work with internal departmental teams, bearing in mind that research into sensitive contexts and accessibility should not be delegated: the service team itself needs to understand those key user contexts to build the service. We recognise the team is in a good position to do this in the Beta phase and has made progress on all of the items mentioned above.
This can be achieved in the next phase (Beta). The team have identified this in their plan, so the panel is confident that they are well placed to achieve a green rating at the Beta service assessment.
2. Solve a whole problem for users
Decision
The service was rated amber for point 2 of the Standard.
The expectation is that the team provide a more detailed roadmap than the one shown on slide 57, to give confidence that they have a plan to address these gaps by April 2026, and that they share the findings of the survey.
During the assessment, we didn’t see evidence of:
- a detailed roadmap showing how the project team will address and identify installers, the range of local authorities, and the other user groups identified but left out of scope for Alpha.
- analysis from the survey beyond a screenshot of a Mural board. We don’t know what questions were asked, what findings were gathered, or how the survey has shaped the research plan going into and continuing through Beta. Given that the team have completed only 3 rounds of moderated usability testing, knowingly focused on limited personas and user groups, more emphasis is needed on the survey and what the team have learned from it.
3. Provide a joined-up experience across all channels
Decision
The service was rated amber for point 3 of the Standard.
During the assessment, we didn’t see evidence of:
- exploring how to join up the user experience across all channels. The team described the Department’s existing offline support teams; however, users’ experience of government interactions is not structured along departmental lines. For Beta, this should include a plan to address assisted digital support journeys and the challenges users are likely to face.
- the challenges faced by front-line operations within DESNZ and local authorities.
4. Make the service simple to use
Decision
The service was rated amber for point 4 of the Standard.
During the assessment, we didn’t see evidence of:
- enough rounds of usability testing: findings from only two rounds were demonstrated at the assessment, while a third round was still under analysis.
- an understanding of current device usage (including browsers and operating systems) on the existing services that will be absorbed by HEAT, and how the team plan to address any issues on the chosen platform.
- consistency with GOV.UK styling: the landing page in the Figma prototype currently includes full-width banners with imagery and videos. Further research is required to understand how other GOV.UK services have targeted audiences to advise on, and support applications for, grants and services.
- during Beta, more robust evidence will be needed to show that the use of imagery and videos meets user needs. For example, videos previously published on GOV.UK showed little engagement and high drop-off rates, and caused frustration due to static content.
5. Make sure everyone can use the service
Decision
The service was rated amber for point 5 of the Standard.
During the assessment, we didn’t see evidence of:
- sufficient research on accessibility (see point 1). The team conducted a survey to derive a high-level segmentation of users’ self-reported access requirements. However, this approach should not replace including disabled people and people with other legally protected characteristics in research, especially as key user segments in this service are likely to face sensitive living conditions.
- research with low income (LIVA) and vulnerable users, and those facing other barriers to installation (low confidence, language barriers, digital poverty), and how they can be supported.
6. Have a multidisciplinary team
Decision
The service was rated green for point 6 of the Standard.
The team now have a dedicated performance analyst and a permanent resource is planned for Beta. The team also have a test engineer scheduled to join them for Beta.
7. Use agile ways of working
Decision
The service was rated green for point 7 of the Standard.
Optional advice to help the service team continually improve the service
- it was clear in the assessment that DESNZ use contractors on their development team and draw on a pool of UCD (user-centred design) resource. This approach has not prevented the excellent collaboration that happens on this product, and it is important that these collaborative calls continue across the team during Beta to ensure knowledge is shared.
8. Iterate and improve frequently
Decision
The service was rated green for point 8 of the Standard.
9. Create a secure service which protects users’ privacy
Decision
The service was rated amber for point 9 of the Standard.
During the assessment, we didn’t see evidence of:
- threat modelling, where specific risks and vulnerabilities to the service have been identified so mitigation plans can be formulated and tested in Beta.
10. Define what success looks like and publish performance data
Decision
The service was rated green for point 10 of the Standard.
The team now have a dedicated performance analyst and a permanent resource is planned for Beta.
11. Choose the right tools and technology
Decision
The service was rated green for point 11 of the Standard.
12. Make new source code open
Decision
The service was rated green for point 12 of the Standard.
- the team have not developed any new components during this phase. They have confirmed they will publish all new code in a GitHub repository in the next development phases.
13. Use and contribute to open standards, common components and patterns
Decision
The service was rated green for point 13 of the Standard.
14. Operate a reliable service
Decision
The service was rated amber for point 14 of the Standard.
During the assessment, we didn’t see evidence of:
- how users would be affected if the online part of the service had technical problems.
- analysis to understand any specific support needs for this service. Relying heavily on the existing support and maintenance plan might not ensure the service is reliable.