Energy Performance of Buildings Register beta assessment

The report for DLUHC's Energy Performance of Buildings Register beta assessment on 19 November 2020.

Digital Service Standard assessment report

Energy Performance of Buildings Register

From: Government Digital Service
Assessment date: 19/11/2020
Stage: Beta
Result: Met
Service provider: Department for Levelling Up, Housing & Communities

Service description

The Energy Performance of Buildings Register holds over 20 million energy assessments of domestic and non-domestic buildings.

Find an energy certificate – this is where users can view the following energy assessments:

  • Energy Performance Certificates (EPCs) – domestic and non-domestic
  • Display Energy Certificates (DECs) and their advisory reports
  • Air Conditioning Inspection Certificates and reports

The service helps users comply with regulatory requirements when selling or renting buildings. It also provides information about the energy efficiency of their buildings, with advice on how to improve it and links to other relevant services.

Get a new energy certificate – this is where users can search for energy assessors to undertake domestic or non-domestic energy assessments of the types listed above.

Energy assessors send completed assessments in XML format via software managed by 6 energy accreditation schemes – only energy assessors who are accredited to one of the 6 schemes can send assessments to the register. This information is used to create certificates and reports as stipulated by regulations.
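
To give a flavour of this flow, the sketch below shows how a lodgement handler might parse an incoming XML assessment into the fields a certificate is built from. This is a minimal illustration in Python, not the register’s actual code (the service itself is built in Ruby, as noted under point 6), and the element names and the parse_lodgement helper are hypothetical rather than taken from the register’s real schemas.

```python
import xml.etree.ElementTree as ET

# Minimal, illustrative lodgement. The element names (Assessment, RRN,
# AssessorId, EnergyRating, ...) are hypothetical; the real register
# defines its own XML schemas for each assessment type.
SAMPLE_LODGEMENT = """\
<Assessment type="domestic-epc">
  <RRN>0000-0000-0000-0000-0000</RRN>
  <AssessorId>EXMP001234</AssessorId>
  <Address>1 Example Street, Exampletown</Address>
  <EnergyRating>64</EnergyRating>
</Assessment>
"""

def parse_lodgement(xml_text: str) -> dict:
    """Extract the fields a certificate would be built from."""
    root = ET.fromstring(xml_text)
    return {
        "type": root.get("type"),
        "rrn": root.findtext("RRN"),
        "assessor_id": root.findtext("AssessorId"),
        "address": root.findtext("Address"),
        "energy_rating": int(root.findtext("EnergyRating")),
    }

print(parse_lodgement(SAMPLE_LODGEMENT))
```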

Service users

  • Citizen users – homeowners, tenants and domestic landlords
  • Commercial users – owners and tenants of commercial buildings and public buildings
  • Energy assessors
  • Data users

1. Understand user needs

Decision

The service met point 1 of the Standard.

What the team has done well

The team demonstrated a sound understanding of their service’s users and their needs. The team has carried out a good amount of research using an appropriate range of methods, for example workshops with data users and usability testing with end users. This is especially impressive given the impact of the pandemic, which affected both the research itself and the researcher’s ability to be the voice of the user for a team working remotely.

The panel was particularly impressed that:

  • the team has tested with 215 users from October 2019 to August 2020, of which 34 had additional accessibility needs, 8 had low digital literacy and 1 had limited fine motor skills
  • the team has considered and tested with colourblind users and put particular thought into ensuring that colour is not the only differentiator in the energy performance chart
  • key knowledge and insight has clearly been retained within the team even though their current user researcher has only been with the team since August

What the team needs to explore

The team’s research suggests that commercial users needing a specific type of assessment can use the service, but might benefit from being able to filter search results to only display assessors accredited to carry out that type of assessment.

The team explained that since April 2020 domestic landlords have been unable to rent out properties with an EPC rating below E. They also mentioned that a possible policy change might raise the minimum EPC rating to C or above, which could significantly increase demand for new EPCs.

Although the service provides good advice on how to improve a property’s energy performance, the process that landlords go through to get a new assessment hasn’t been researched in detail. In particular, it might not be clear to landlords whether they need to book a reassessment, update the EPC or get a new EPC (the correct action).

Before their next assessment, the team needs to:

  • research with commercial users looking to get specific types of EPC to ensure the service meets their needs - and identify possible usability improvements
  • carry out research with landlords who need to get a new EPC in order to let out a property, to ensure their needs are understood and met by the service

2. Do ongoing user research

Decision

The service met point 2 of the Standard.

The team has reacted well to research challenges, for example identifying data users to participate in research and then understanding their data import needs. The team has a research plan and has commissioned the Digital Accessibility Centre (DAC) to carry out an accessibility audit. Additional accessibility research is planned for December 2020.

What the team has done well

The panel was impressed that:

  • the team has conducted research with local authorities (data users) who are using EPC data to identify areas with low EPC rating and take action, including policy changes
  • the team has split the original single EPC service into separate ‘get’ and ‘find’ EPC service journeys which meet these separate user needs
  • the team has researched whole user journeys, including how users can complain to both assessors and schemes if dissatisfied with an EPC

What the team needs to explore

Before their next assessment, the team needs to:

  • research if, how and when people might be encouraged to check a property’s EPC as part of wider life events, such as buying, selling or renting a property
  • consider the needs of property portals – is there a way to encourage them to link to EPC data on GOV.UK as the authoritative, canonical register, instead of generating their own EPC graphs for display in property listings?
  • consider doing more research with users who are looking to get an EPC and are faced with a list of possible assessors – although the team cannot be seen to give a competitive advantage to particular assessors or schemes, is there any way to help users make an informed choice?

3. Have a multidisciplinary team

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team was made up of people with a diverse mix of skills and expertise, with a plan for an ongoing civil servant service team
  • the Service Owner was fully empowered and governance was enabling the team to deliver for users

What the team needs to explore

Before their next assessment, the team needs to:

  • consider bringing in an interaction or service designer as a more integral part of the team. This would make it easier for the team to explore the wider journeys this service is part of, including those that feature services offered by the private sector, and to challenge the scope of the transaction

4. Use agile methods

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is working well in an agile way, learning and adapting as they go
  • the team has adapted well to home working, using more online tools to ensure that COVID-19 has minimal to no impact on progress
  • the team has a good approach to planning and road-mapping, prioritising well as a team

5. Iterate and improve frequently

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • although the service did not formally go through a private beta phase, the team showed good evidence of how they have iterated and improved it to meet user needs
  • they have good plans in place to create a sustainable service team to ensure there is ongoing, continuous improvement
  • from a technology perspective, the team makes 30 to 40 small deployments to the live service on a typical day. This high frequency of deployment is associated with high-performing teams, and has been achieved through good development practices, including trunk-based development, pair programming, a quick-running deployment pipeline and extensive automated testing (one such safeguard is sketched after this list)
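
To give a flavour of the kind of automated check that supports this deployment frequency, here is a minimal sketch of a post-deployment smoke test that polls a health endpoint before a pipeline step passes. It is a generic illustration rather than the team’s actual pipeline code, and the URL, retry budget and timings are placeholder assumptions.

```python
import time
import urllib.request

# Illustrative post-deployment smoke test: poll a health endpoint until
# it returns HTTP 200, failing the pipeline step if it never does.
# The URL and retry budget are placeholders, not the team's real values.
HEALTH_URL = "https://service.example.gov.uk/healthcheck"

def smoke_test(url: str, attempts: int = 10, delay_seconds: float = 3.0) -> bool:
    for attempt in range(1, attempts + 1):
        try:
            with urllib.request.urlopen(url, timeout=5) as response:
                if response.status == 200:
                    print(f"healthy after {attempt} attempt(s)")
                    return True
        except OSError:
            pass  # service not reachable yet; retry after a short pause
        time.sleep(delay_seconds)
    return False

if __name__ == "__main__":
    # Exit non-zero so a CI/CD pipeline treats an unhealthy deploy as a failure.
    raise SystemExit(0 if smoke_test(HEALTH_URL) else 1)
```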

6. Evaluate tools and systems

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the tech stack is predominantly open source software which is well known and production grade, including Ruby, PostgreSQL and ELK logging. Proprietary software is restricted to supporting cloud services, such as the CDN and CI/CD
  • managed services have been used where possible to reduce the burden of operation and maintenance. This includes PaaS hosting, the database, CI/CD and logging
  • Infrastructure as Code, combined with continuous deployment, gives the team flexible and reliable infrastructure

What the team needs to explore

Before their next assessment, the team needs to:

  • retire the infrastructure for the data pipelines from the legacy version of the system. The team is nearly at the point where it is redundant, and trimming it will reduce the amount of technology to maintain. (The remaining ETL for AddressBase may be better hosted in PaaS.)

7. Understand security and privacy issues

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a good understanding of the security and privacy issues, and has put appropriate measures in place
  • a security review was carried out by an independent technical architect in advance of public beta. In addition, an independent penetration test is now complete

8. Make all new source code open

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

9. Use open standards and common platforms

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the design of the API reflects government API standards: it is RESTful, provides OpenAPI documentation and a schema, and makes use of the Postman API platform (an illustrative client call is sketched after this list)
  • for hosting, the team is using GOV.UK PaaS, which minimises what the team have to manage themselves and makes use of government-wide services
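
As an illustration of consuming a RESTful register API of this shape, the sketch below fetches a certificate by its reference number and reads the JSON response. The base URL, path, bearer-token authentication and response field names are assumptions made for the example, not the register’s documented contract – the service’s OpenAPI documentation defines the real one.

```python
import json
import urllib.request

# Illustrative client for a RESTful register API. The base URL, path,
# auth scheme and response fields are assumptions for this sketch; the
# real contract is defined by the service's OpenAPI documentation.
BASE_URL = "https://api.example.gov.uk"

def fetch_certificate(rrn: str, token: str) -> dict:
    """Fetch one certificate as a dict, keyed by the assumed field names."""
    request = urllib.request.Request(
        f"{BASE_URL}/energy-certificates/{rrn}",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/json",
        },
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return json.load(response)

# Example use (placeholder values):
# certificate = fetch_certificate("0000-0000-0000-0000-0000", "TOKEN")
# print(certificate["currentEnergyRating"])
```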

10. Test the end-to-end service

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the end-to-end service has been tested on a variety of devices, browsers and with a representative sample of users, including those with accessibility needs
  • load testing has given the team confidence that they can react quickly to surges in demand
  • monitoring is in place for typical service level metrics
  • runbooks have been developed so that common tasks can be carried out efficiently
  • when there was an incident with an exceptional deployment, the incident was written up, including a root-cause analysis, and shared with customers. In addition, a runbook was added to avoid similar problems in future

What the team needs to explore

Before their next assessment, the team needs to:

  • plan how to monitor key indicators of system health and reliability (often described as ‘service level objectives’), so that the team can get actionable alerts and manage investment in maintenance and improvement (a minimal worked example follows this list)
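
As a minimal worked example of such an objective, the sketch below evaluates a 30-day availability SLO and reports how much of the error budget remains. The 99.9% target and the request counts are placeholder assumptions, not figures from this service.

```python
# Minimal sketch of evaluating an availability SLO and its error budget.
# The 99.9% target and the request counts below are placeholders, not
# figures from this service; real inputs would come from monitoring.
SLO_TARGET = 0.999  # 99.9% of requests succeed over a 30-day window

def error_budget_remaining(total_requests: int, failed_requests: int) -> float:
    """Fraction of the error budget still unspent (negative means the SLO is breached)."""
    allowed_failures = total_requests * (1 - SLO_TARGET)
    return 1 - (failed_requests / allowed_failures)

# Example: 10 million requests in the window, 4,000 of which failed.
# The budget allows 10,000 failures, so 60% of the budget remains.
print(f"{error_budget_remaining(10_000_000, 4_000):.0%}")
```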

11. Make a plan for being offline

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a good understanding of the impact of being unavailable, including obligations under legislation and potential implications for the housing market
  • the team can describe their data recovery strategy and how it is tested. The service and its infrastructure are in good shape to redeploy at speed, with runbooks also detailing how to recover the databases from snapshots

What the team needs to explore

Before their next assessment, the team needs to:

  • test their data recovery strategy on a regular basis

12: Make sure users succeed first time

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team tested the service with all user groups, some of which have varying levels of expertise and familiarity with the task at hand. This was a slightly unusual private beta, in that the service has been available on GOV.UK throughout, but the team has dealt with that well and made sure the service works for everyone who needs it

What the team needs to explore

Before their next assessment, the team needs to:

  • influence the wider journeys this service is part of. When searching for an energy performance certificate, users are likely to be in the middle of a journey that goes beyond this service. To ensure users can do what they’re trying to do, this needs more investigation, including looking at services offered by the private sector, like helping users find homes to rent
  • go further in challenging the scope of the service. This includes thinking more critically about the edges of the transactional service and how it leads to policy outcomes. In many ways, this service is an enabler for that wider work: it will allow the team to learn more, and to learn more quickly. This is a chance to improve how the underlying policy is measured and iterated, to make sure it works well for users. If the intended policy outcome is to make housing in the UK more energy efficient, how does the service contribute to that?

13. Make the user experience consistent with GOV.UK

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team uses the GOV.UK Design System and the patterns it offers with consideration
  • the team goes beyond the patterns in the GOV.UK Design System when necessary for their service. When this happens, it is done really well – for the EPC graph, the team demonstrated frequent iteration and consideration for accessibility by, amongst other things, including alternative text for those who struggle with its visual representation

What the team needs to explore

Before their next assessment, the team needs to:

  • think about how to iterate the ‘legacy’ design patterns, like the EPC graph. The team explained that this has been in use for a very long time, is recognised by users, and can often be seen in print. Whilst this makes sense, this new digital service offers a chance to learn about how effective this pattern is, and the team should not be afraid to challenge it
  • consider further how the wider service is represented on GOV.UK, perhaps by using a GOV.UK step-by-step pattern. This would require working more closely with other parts of government that are part of this journey, which some teams have done successfully using service communities

14. Encourage everyone to use the digital service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team showed a good understanding of all user journeys and has incorporated content about the service into related pages, such as ‘Buying or selling your home’ on GOV.UK

15. Collect performance data

Decision

The service met point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • they have worked hard to gather data for the metrics
  • their metrics should help ensure the service remains cost neutral - critical in giving value for money
  • this is a significant improvement on the previous system

What the team needs to explore

Before their next assessment, the team needs to:

  • develop automated processes that require less manual input
  • investigate processes that do not rely so heavily on Google Analytics, given that users can opt out. Logging tools may be able to assist with this (one possible approach is sketched after this list)
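
One possible approach is to derive headline usage counts from the service’s own structured request logs, which are unaffected by analytics opt-outs. The sketch below assumes JSON-lines access logs with path and status fields; the field names and example paths are illustrative assumptions, not the service’s real log format.

```python
import json
from collections import Counter

# Sketch of counting page requests from structured (JSON-lines) access
# logs instead of client-side analytics. The log fields and paths are
# assumptions for illustration, not the service's real log format.
SAMPLE_LOG_LINES = [
    '{"path": "/find-a-certificate", "status": 200}',
    '{"path": "/find-a-certificate", "status": 200}',
    '{"path": "/getting-a-new-certificate", "status": 200}',
]

def count_requests_by_path(lines) -> Counter:
    """Tally successful (HTTP 200) requests per path."""
    counts = Counter()
    for line in lines:
        entry = json.loads(line)
        if entry.get("status") == 200:
            counts[entry["path"]] += 1
    return counts

print(count_requests_by_path(SAMPLE_LOG_LINES))
# Counter({'/find-a-certificate': 2, '/getting-a-new-certificate': 1})
```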

16. Identify performance indicators

Decision

The service met point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • they have worked hard to improve the KPIs from their previous system
  • they deliver the requested KPIs
  • they have a good list of bespoke KPIs they can provide

What the team needs to explore

Before their next assessment, the team needs to:

  • consider measuring cost per user as well as cost per certificate – this could show the true value the service is providing
  • investigate whether they can have a dashboard showing their KPIs, which updates automatically and includes some of their bespoke KPIs

17. Report performance data on the Performance Platform

Decision

Point 17 of the Standard no longer applies.

18. Test with the minister

Decision

The service met point 18 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has tested the service with the Minister

Published 24 February 2022