Renewable Electricity Register beta reassessment report

The report for Ofgem's Renewable Electricity Register beta reassessment on 21 January 2022.

Service Standard reassessment report

Renewable Electricity Register

From: Central Digital & Data Office (CDDO)
Assessment date: 21/01/2022
Stage: Beta
Result: Met
Service provider: Ofgem

Previous assessment reports

Service description

A service aimed at supporting users in the administration of their obligations/activities within three government environmental schemes:

  • Renewables Obligation (RO)
  • Renewable Energy Guarantee of Origin (REGO)
  • ROO-FIT

To provide this service, we need to support users by:

  • Maintaining a record of renewable electricity generating stations accredited within these schemes
  • Validating data submissions for generated electricity
  • Issuing RO certificates (ROCs) and Renewable Energy Guarantees of Origin (REGOs) for eligible renewable energy generated
  • Ensuring compliance by obligated energy suppliers

Service users

RO, REGO and ROO-FIT scheme participants, as well as Ofgem internal users who administer the schemes:

  • Generators who own/operate renewable energy generating stations and claim support under the relevant schemes.
  • Agents who act on behalf of generators to claim support, completing all their tasks and interactions with Ofgem
  • Traders who buy and sell ROCs
  • Licensed electricity suppliers, who are obliged to present ROCs to Ofgem annually to demonstrate that a certain proportion of the electricity they have sold was generated by renewable means

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated they had conducted more research with users since the first assessment and have research plans in place to allow this to continue. They also more clearly surfaced the research conducted in previous phases. They outlined the ways in which assets from previous research are stored and shared via an internal wiki
  • a plan for ongoing usability testing to test additional features when moving into live was outlined, and this sits alongside a plan to gather analytics
  • the previous feedback on the lack of a dedicated user researcher has been addressed and is part of a wider plan to support the ongoing resourcing of the team

What the team needs to explore

Before their next assessment, the team needs to:

  • make a clearer distinction between user research activities (such as semi-structured interviews or usability testing) and wider engagement activities (such as workshops and questionnaires), to give a more detailed picture of how user needs are being evidenced. This distinction was much clearer in the supplementary materials, which helpfully broke down the four rounds of usability testing during beta

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team challenged feedback from the last assessment that parts of the end-to-end journey were out of scope, have conducted more research to address gaps across the journey, and have plans to do more
  • the team demonstrated a clear understanding of all the constraints that might impact their service and how they have worked to challenge them
  • the team have conducted some research into whether the service would be better broken down into multiple services and received feedback that would indicate their current approach works for users

What the team needs to explore

Before their next assessment, the team needs to:

  • consider how breaking the service down into multiple smaller services, each focused on meeting the needs of a single user type or role, would compare with the current design. The evidence gathered so far appears to relate to the grouping of organisations, whereas the panel’s feedback focused on whether grouping tasks by role could have led to a simpler design for users. The panel appreciates that exploring this may no longer be feasible at this point, and given the positive feedback the team has gathered on the current design, it is no longer a point of concern

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has held off requirements that would slow the design and research cycle, allowing insights obtained through research to be integrated into the service as quickly as possible
  • the team have started to address gaps in research relating to new users and touchpoints outside of the digital service, and have plans to continue researching these going forward
  • the team have recognised that some aspects of the dashboard design were confusing for users and have redesigned parts of it to improve the layout based on user feedback

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure they complete all the planned research to address the gaps identified from the first assessment
  • continue working in rapid design and research cycles, ensuring that any new features go through multiple rounds of iteration

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are working to remove the requirement for JavaScript and are aware of how this will impact the user experience
  • the team have developed a new research plan which aims to understand the needs of users with low digital literacy
  • the team are doing additional research on how support channels will work for all users and have plans to test this as well

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that any recommendations or requirements that result from research focusing on support channels and users with lower digital literacy are prioritised for development in beta to ensure their service remains accessible and inclusive

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • there is now access to a user researcher in the team. This is a very valuable addition to the team and it was great to hear so much from them in the reassessment

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure the team retains, at a minimum, its user-focused members. There is a large development team, and its value should be maximised through continued user research to ensure the service continues to reflect user needs
  • continue with regular show and tells to inform the wider BEIS/Ofgem community about ongoing development and to share best practice

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • work had been carried out to improve the service beyond what was seen at the first beta assessment. The panel was shown concepts for the login page and saw how user feedback from the first beta assessment has been actioned to iteratively improve the process
  • a permanent user researcher being in place made a difference to the amount of evidence available to show that users’ needs are driving front-end development

What the team needs to explore

Before their next assessment, the team needs to:

  • explore how they can add narrative to their evidence so that assessors can see a clear path from hypothesis to initial design, to user research findings, and on to how those findings feed into development that further improves the service. While it became clear that this process is being followed, the service team needs to articulate it better. Sticking to a regular cadence of show and tells will help the team improve these skills so that they are better able to articulate their work at service assessments, and more broadly as Ofgem develops more digital services

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have aligned technology choices with the wider Ofgem technical stack and organisational expertise, and have helped guide Ofgem’s cloud strategy to ensure it meets the requirements of the service
  • existing services and platforms are being re-used where possible, and new microservice components are being developed with organisational re-use in mind
  • the team will implement a “magic link” sign-in flow that works around the limitations of Microsoft’s Azure AD B2C identity management solution to allow users to log in to the service without JavaScript
  • there is an agreed “emergency release” process to enable urgent code changes to be released to production quickly if need be

What the team needs to explore

Before their next assessment, the team needs to:

  • explore whether the “magic link” login flow can be used more widely, either for all users of this service or more generally across the organisation, to get better value for money from the significant investment in its development
  • build all new pages and functionality in the service using progressive enhancement, to ensure that all users can access the core functionality
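To illustrate the progressive enhancement recommendation above on the server side: the core journey is a plain HTML form POST that works with no JavaScript, while an optional client-side enhancement can submit the same data and receive JSON. This is a hedged sketch with hypothetical field names and responses, not the service’s real code.

```python
# Sketch of a form handler supporting progressive enhancement: the no-JS
# journey uses standard form POSTs and redirects; an enhanced client asking
# for JSON gets the same validation results inline. Names are illustrative.
from urllib.parse import parse_qs


def handle_submission(method: str, accept: str, body: str) -> tuple[int, str]:
    """Return (status_code, response) for a form submission."""
    if method != "POST":
        return 405, "Method Not Allowed"
    fields = {k: v[0] for k, v in parse_qs(body).items()}
    if not fields.get("station_name"):
        # Without JavaScript the user sees a re-rendered page with an error
        # summary; the enhanced client can show the same error inline.
        if "application/json" in accept:
            return 400, '{"errors": {"station_name": "Enter a station name"}}'
        return 400, "re-rendered page with error summary"
    if "application/json" in accept:
        return 200, '{"saved": true}'
    # POST-redirect-GET keeps the no-JS journey safe to refresh.
    return 303, "/confirmation"
```

The key property is that JavaScript only ever adds convenience: every outcome the enhanced path can reach is also reachable through the plain HTML path.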

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is to be congratulated on working with Ofgem’s security team to bring about a change of attitude to open-sourcing code across the entire organisation. The work put into achieving this has already benefited the Green Gas Support Scheme, and will benefit all Ofgem teams working to the Service Standard and Technology Code of Practice in future
  • the code will be published in a publicly accessible repository next week

What the team needs to explore

Before their next assessment, the team needs to:

  • explore genuinely coding in the open, that is, committing directly to the open repository. The open-sourcing process described involves publishing a sanitised version of the codebase, with squashed commits and a separate review process before publication, which means the “open” version of the code will lag behind and have a different commit history from the live running code. Much of the benefit of source control comes from developers being able to read the commit history to understand why the code is the way it is; coding in the open would also remove the overhead of the review-squash-publish workflow

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the new “magic link” sign-in flow will provide a fallback to allow users to reliably access the service without JavaScript
  • monitoring dashboards have been built out that allow the service team to see activity in the system and health of the infrastructure, along with automated alerting into a Teams channel for events that might indicate issues with the service
  • the team understand the backup and restore options available from the Azure SQL managed service chosen

What the team needs to explore

Before their next assessment, the team needs to:

  • rehearse and document the process of restoring data from a backup of the managed service
  • consider whether there would be a benefit to keeping additional backups outside of the Azure tenancy
Published 3 February 2022