Energy Performance of Buildings Register Live Assessment

The report for the Energy Performance of Buildings Register Live Assessment on 9 December 2021

Service Standard assessment report

From: Central Digital & Data Office (CDDO)
Assessment date: 09/12/2021
Stage: Live
Result: Met
Service provider: Department for Levelling Up, Housing and Communities

Service description

The Energy Performance of Buildings Register holds over 24 million energy assessments of domestic and non-domestic buildings.

Find An Energy Certificate is where users can view the following energy assessments:

  • Energy Performance Certificates (EPCs) - domestic and non-domestic
  • Display Energy Certificates (DECs) and their advisory reports
  • Air Conditioning Inspection Certificates and reports

The service helps users ensure that they are complying with regulatory requirements when selling or renting buildings and provides information about the energy efficiency of their buildings with advice about how to improve energy efficiency and links to other relevant services.

Get A New Energy Certificate is where users can search for energy assessors to undertake domestic or non-domestic energy assessments of the types listed above.

Energy assessors send completed assessments in XML format via software managed by six energy accreditation schemes; only energy assessors who are accredited to one of the six schemes can send assessments to the register. This information is used to create certificates and reports as stipulated by regulations.
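
As a purely illustrative sketch of the kind of payload involved, an assessment lodgement could be modelled and serialised as below. The real lodgement schemas are defined by the accreditation scheme software, so every field and element name here is invented for illustration:

```typescript
// Illustrative only: the register's real lodgement schemas are defined by
// the accreditation schemes, so these field and element names are invented.
interface Lodgement {
  assessorId: string; // the assessor's accreditation scheme ID
  uprn: string; // Unique Property Reference Number of the building
  assessmentType: "EPC" | "DEC" | "AC-REPORT";
  currentEnergyRating: string; // e.g. "C"
}

// Serialise a lodgement into the kind of XML that scheme software might send.
function toLodgementXml(lodgement: Lodgement): string {
  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    "<Lodgement>",
    `  <AssessorId>${lodgement.assessorId}</AssessorId>`,
    `  <Uprn>${lodgement.uprn}</Uprn>`,
    `  <AssessmentType>${lodgement.assessmentType}</AssessmentType>`,
    `  <CurrentEnergyRating>${lodgement.currentEnergyRating}</CurrentEnergyRating>`,
    "</Lodgement>",
  ].join("\n");
}

console.log(
  toLodgementXml({
    assessorId: "EES/000000",
    uprn: "100000000000",
    assessmentType: "EPC",
    currentEnergyRating: "C",
  })
);
```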

Service users

  • Citizen users: homeowners, tenants and domestic landlords
  • Commercial users: owners and tenants of commercial buildings and public buildings
  • Energy assessors
  • Data users

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated a strong understanding of the different user groups and their needs. They continue to learn about users as the world and context change, and update their artefacts and ways of working to reflect this
  • the team make good use of the feedback from calls to the service desk as an indicator of how well the service is performing
  • the team provided good examples of research and design working together to solve problems and improve the user experience
  • the team clearly had a good working relationship with policy colleagues and used this to push back on requests and maintain a user-centred focus in their work
  • the team provided evidence of how they had addressed their recommendations from the beta assessment report

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure any research undertaken by team members is of good quality and abides by the department’s ethical and data management standards. The team should seek guidance and oversight from other researchers in the community
  • consider a broader range of research methods, including in-person research as Covid restrictions allow. The panel felt that although the team had a good research rhythm, the approach had become a little stale. Recruiting a new User Researcher is a good opportunity for the team to inject some more creativity into their ways of working and ensure no one is being excluded
  • consider alternative ways of addressing the question of whether the service works for everyone. The team had good examples of meeting the needs of accessibility and assisted digital users, but are there other factors which impact how well users can meet their aims or complete a task? For example, do new and inexperienced landlords and assessors have the same level of success as those who’ve been doing it for years?

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • all service users identified by the team appear to be able to meet their immediate need of obtaining an EPC
  • the team are working with GDS, BEIS and others to ensure signposting and metadata are in a good place to direct users to the correct service
  • there are plans in place to explore how this service can be extended to meet the future needs of its users and the related industries in relation to climate policy and Net Zero aims and obligations

What the team needs to explore

Before their next assessment, the team needs to:

  • consider how the service can influence the policy and design of artefacts such as the Energy Efficiency Rating (EER) chart. This may be the most prominent location for this information, alongside the certificate itself, and continuing to develop the content around this data could help users in their journey to meet their property needs in relation to energy in general
  • continue to keep the discussion with the policy team open in order to influence the wider journeys this service is part of, and to see this service as a potential point of education for users
  • to that end, challenge the scope of the service. This includes thinking more critically about the edges of the transactional service and how it leads to policy outcomes; in many ways, this service feels like an enabler for those outcomes. If the intended policy outcome is to make housing in the UK more energy efficient, how does the service contribute to that? This beta recommendation has not been clearly addressed, so it could be taken into the next phase

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are carrying out qualitative and quantitative research into issues and queries that come through the support desk and the user satisfaction feedback survey, and using it to improve the digital journey
  • assisted digital needs appear to have been met by the support desk, which often goes beyond the call of duty with the practical help it provides. As long as the volume of calls remains manageable, this seems an appropriate solution for users

What the team needs to explore

Before their next assessment, the team needs to:

  • reconsider how the “print” functionality is positioned, and explain why printing might be a less desirable way of sharing the certificate. The example given of the MOT email containing a link to the certificate could be investigated within this service
  • make sure the support desk staff understand the online part of the service. This could mean working with them so that they can encourage users to complete tasks online themselves rather than relying on the support desk

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the service does what it sets out to do: it provides certificates based on a basic set of information provided by users
  • the team were able to demonstrate how they have iterated and improved the content used in the ‘get’ and ‘find’ user journeys and that these designs had been validated with user research
  • linking from search engines, GOV.UK, etc works well and users don’t appear to have any issues accessing the service
  • the service utilises the components and patterns present in the GOV.UK Design System, such as radio groups, warning text and the experimental accordion

What the team needs to explore

Before their next assessment, the team needs to:

  • consider how costs are presented in the “recommendations” section, as there do not seem to be any indicators that the costs are “correct as of” a certain date. To stay relevant and simple for users to understand, this feature will need to be updated at regular intervals; giving data a timeframe or context helps users understand it and better meets their needs
  • continue to question data and indicators which users may not find easy to understand, such as letter and number ratings. Maintaining an open dialogue with policy teams and relevant departments can build trust and enable usability improvements to be discussed and made more easily

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the service is functional without JavaScript as it employs progressive enhancement (see the sketch after this list)
  • the team have researched with users with accessibility needs and are iterating based on those findings
  • the service meets WCAG accessibility guidelines, and the team obviously take this seriously and are working to make the service as usable and inclusive as possible
  • using GOV.UK design patterns and components means the service is device-agnostic, with mobile currently the predominant device used
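
As an illustration of the progressive enhancement mentioned in the first point above, the sketch below shows the general pattern: a plain HTML form keeps working when JavaScript is unavailable, and script only layers an in-page refinement on top. The element IDs, endpoint and markup are hypothetical, not the service's real front end:

```typescript
// A minimal progressive enhancement sketch. Without JavaScript the form
// submits normally and the server renders the results page; with JavaScript,
// results are fetched and shown in place. IDs and markup are hypothetical.
const form = document.querySelector<HTMLFormElement>("#find-certificate");
const results = document.querySelector<HTMLElement>("#results");

if (form && results) {
  form.addEventListener("submit", async (event) => {
    event.preventDefault(); // only reached when JavaScript is running
    const query = new URLSearchParams(
      Array.from(new FormData(form), ([name, value]) => [name, String(value)])
    );
    try {
      const response = await fetch(`${form.action}?${query}`);
      if (!response.ok) throw new Error(`HTTP ${response.status}`);
      results.innerHTML = await response.text();
    } catch {
      form.submit(); // fall back to an ordinary full-page submission
    }
  });
}
```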

What the team needs to explore

Before their next assessment, the team needs to:

  • engage with the wider community to ensure that the EER chart is as usable as possible for everyone. Currently the colour contrast is insufficient for users with red-green colour blindness (deuteranopia); see the contrast check sketch after this list
  • take care not to take a helpful support desk for granted. The team have heard that some users were having certificates printed out for them by staff; whilst this is helpful, it may not be sustainable behaviour if volumes increase. The team could consider adding content or using interaction patterns to promote copying the link or sending it via email rather than the “traditional” way of printing
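
The contrast issue can be checked programmatically against the WCAG 2.1 definitions of relative luminance and contrast ratio. A minimal sketch follows; the colours used are placeholders, not the EER chart's actual palette:

```typescript
// WCAG 2.1 relative luminance and contrast ratio, usable for checking the
// EER chart's colour pairs. Example colours are placeholders only.
function relativeLuminance(hex: string): number {
  // Expects "#rrggbb"; linearise each sRGB channel per WCAG 2.1.
  const [r, g, b] = [1, 3, 5].map((i) => {
    const channel = parseInt(hex.slice(i, i + 2), 16) / 255;
    return channel <= 0.03928
      ? channel / 12.92
      : Math.pow((channel + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrastRatio(colourA: string, colourB: string): number {
  const lighter = Math.max(relativeLuminance(colourA), relativeLuminance(colourB));
  const darker = Math.min(relativeLuminance(colourA), relativeLuminance(colourB));
  return (lighter + 0.05) / (darker + 0.05);
}

// WCAG AA asks for at least 4.5:1 for normal text and 3:1 for large text
// and graphical objects such as chart bands.
console.log(contrastRatio("#ffffff", "#2e8b57").toFixed(2));
```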

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the entire team is in-house and built from permanent staff

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure a permanent User Researcher is appointed to the team

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the range of ‘agile’ processes and ceremonies shows a good understanding of agile principles

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team releases to production up to 10 times a day thanks to trunk-based development. This shows the maturity of the team and good stakeholder management (one common enabler of this way of working is sketched below)
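
Releasing from trunk at this frequency generally relies on keeping unfinished work dark in production, most commonly behind feature flags. The sketch below shows that companion pattern only as an assumption: the flag name, environment-variable source and render functions are hypothetical, and this is not necessarily the team's actual mechanism:

```typescript
// A minimal feature-flag sketch: unfinished work merges to trunk behind a
// flag and only activates once the flag is switched on. The flag name and
// environment-variable source are hypothetical.
const flags: Record<string, boolean> = {
  "new-recommendations-layout": process.env.FLAG_NEW_RECOMMENDATIONS === "on",
};

function isEnabled(flag: string): boolean {
  return flags[flag] ?? false; // unknown flags default to off
}

function renderNewLayout(): string {
  return "<section>new layout</section>"; // in-progress path, dark in production
}

function renderCurrentLayout(): string {
  return "<section>current layout</section>"; // the live path users see today
}

function renderRecommendations(): string {
  return isEnabled("new-recommendations-layout")
    ? renderNewLayout()
    : renderCurrentLayout();
}

console.log(renderRecommendations());
```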

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is focusing its skills on application stack security and delegating platform security to GOV.UK PaaS
  • an Open Web Application Security Project (OWASP) security scanner and dependency vulnerability monitoring are part of the development process
  • the team has built a good relationship with the cyber team

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has published performance metrics (including on data.gov.uk) so KPIs can be accessed. KPIs and performance metrics are also shared with policy and board directors regularly
  • data is used throughout the service development and design including user feedback
  • the team have developed relevant and considered KPIs for their service

What the team needs to explore

Before their next assessment, the team needs to:

  • hire a performance analyst to work on the team permanently rather than on a temporary basis, which will help them develop their KPIs and methodology
  • consider changing how the satisfaction score is calculated and presented on the public-facing dashboard so that shifts in the score are clearly visible (see the sketch after this list)
  • extend the accessibility review to cover the public-facing and internal dashboards, to ensure that data visualisation is accessible
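
On presenting shifts in the satisfaction score: one option is to chart a rolling rate rather than a single all-time average, which flattens change. A minimal sketch follows, assuming monthly response counts; the input shape and the three-month window are illustrative choices, not the team's actual methodology:

```typescript
// A minimal sketch of a rolling satisfaction rate, so shifts show up instead
// of being flattened into an all-time average. Input shape and the
// three-month window are illustrative.
interface MonthlyFeedback {
  month: string; // e.g. "2021-11"
  satisfied: number; // count of satisfied responses
  total: number; // count of all responses
}

function rollingSatisfaction(
  data: MonthlyFeedback[],
  window = 3
): { month: string; rate: number }[] {
  return data.map((point, i) => {
    const slice = data.slice(Math.max(0, i - window + 1), i + 1);
    const satisfied = slice.reduce((sum, m) => sum + m.satisfied, 0);
    const total = slice.reduce((sum, m) => sum + m.total, 0);
    return { month: point.month, rate: total === 0 ? 0 : satisfied / total };
  });
}

console.log(
  rollingSatisfaction([
    { month: "2021-09", satisfied: 80, total: 100 },
    { month: "2021-10", satisfied: 70, total: 100 },
    { month: "2021-11", satisfied: 60, total: 100 },
  ])
);
```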

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team was empowered to choose the right technology
  • the service is cloud native, built on a modern tech stack with open source dependencies, and uses GOV.UK GaaP components, notably GOV.UK PaaS
  • the team outsourced all commodity requirements to SaaS products and focused on the business's unique needs

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has developed in the open from day one (see: https://github.com/communitiesuk?q=epb)
  • releasing open data is built into the policy

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team chose to rely on GOV.UK PaaS instead of building their own bespoke hosting platform
  • the service roadmap includes further improvements to the service's resilience against scraping
  • the service employs progressive enhancement, which additionally increases service resilience

Published 24 March 2022