Renewable Electricity Register beta assessment report

The report for Ofgem's Renewable Electricity Register beta assessment on 1 September 2021.

Service Standard assessment report

Renewable Electricity Register

From: Central Digital & Data Office (CDDO)
Assessment date: 01/09/2021
Stage: Beta
Result: Not Met
Service provider: Ofgem

Service description

A service aimed at supporting users in the administration of their obligations/activities within three government environmental schemes:

  • Renewables Obligation (RO)
  • Renewable Energy Guarantees of Origin (REGO)
  • ROO-FIT

To provide this service, we need to support users by:

  • Maintaining a record of renewable electricity generating stations accredited within these schemes
  • Validating data submissions for generated electricity
  • Issuing Renewables Obligation Certificates (ROCs) and Renewable Energy Guarantees of Origin (REGOs) for eligible renewable energy generated
  • Ensuring compliance by obligated energy suppliers

Service users

RO, REGO and ROO-FIT scheme participants, as well as Ofgem internal users who administer the schemes:

  • Generators who own or operate renewable energy generating stations and claim support under the relevant schemes
  • Agents who act on behalf of generators to claim support, completing all their tasks and interactions with Ofgem
  • Traders who buy and sell ROCs
  • Licensed electricity suppliers, who have an annual obligation to present ROCs to Ofgem demonstrating that a certain proportion of the electricity they have sold was generated by renewable means

1. Understand users and their needs

Decision

The service did not meet point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team understood the space very well. They appeared very close to their expert users, and had done a great deal of work to prepare for future data migration, including side-by-side testing with existing users
  • the team has considered accessibility needs, tested with users, conducted reviews and has generally good insight into how issues may affect their user group
  • the team had done intensive research at certain points, including at the start of Beta, and in the period before the assessment
  • the team are planning for a slow release of the service to new groups of users, so that they can monitor performance carefully

What the team needs to explore

  • this feels like a very tech-heavy and tech-led project, very much dominated by the central problem of data migration. It would be good to see more strategic thinking about design decisions and user journeys
  • the lack of a dedicated user researcher for much of private beta has meant a limited capacity to test and iterate designs. The team are requesting research and design support for the next stage, but this is not yet agreed. This is a risky gap for such a complex service
  • where research has been done, it was not clear that the team had responded to results by redesigning and re-testing; for example, testing of the registration journey had identified a long list of potential improvements, but these have not yet been implemented
  • it would be good to see some clearer examples of design iteration (for example, in thinking about content, or language, or specific user journeys). Edited to add: in follow-up conversations, the team did share some examples of how they had changed their thinking about various elements of the design in response to user feedback
  • the team would benefit from using multiple methods (qualitative, quantitative, analytics) to evaluate the service

2. Solve a whole problem for users

Decision

The service did not meet point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had a good knowledge of the constraints faced by the service and had raised these with Ofgem
  • the team demonstrated an understanding of the other services offered by Ofgem and the wider service landscape
  • the team had done a lot of work to migrate existing data and avoid users having to provide the same information to government again

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure the service is scoped to how users think. A lot of the end-to-end journey, including routes into the service and the use of guidance, was considered out of scope for the service team. This limits the team’s ability to fully solve a problem for users and to evaluate how well their assisted digital support model will work
  • explore whether tasks would be better solved as single transactions. The team have combined a number of tasks within their service, which has created confusion for some users (as evidenced by the poor performance of the initial ‘dashboard’ page)
  • revisit whether there is any capacity to push back on the large number of constraints identified by the team. Legislative constraints have informed a lot of the service and interaction design, and technical and budgetary constraints have hampered the team’s ability to properly ideate and identify better solutions for users

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • frontline staff and subject matter experts have been involved in the team’s decision-making process throughout beta
  • the team have conducted user research with frontline staff
  • the team have used data and insights gained from pre-existing support channels to inform the design of their service

What the team needs to explore

Before their next assessment, the team needs to:

  • conduct more testing of any offline aspects of the user journey. The team acknowledged that this is an area of weakness and plan to do more to address this gap

4. Make the service simple to use

Decision

The service did not meet point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had identified a large number of pain points and user needs from the previous iteration of the service and were using these to inform their designs
  • the team had conducted usability testing and identified potential changes to the design of their service that could improve the user experience
  • despite not being a requirement, the team had chosen to use a variant of the GOV.UK Design System and GOV.UK Frontend to build the majority of their service

What the team needs to explore

Before their next assessment, the team needs to:

  • start working in more frequent research and design cycles. The team presented little evidence that they were regularly iterating on their designs and testing how they performed with users. To achieve this, the team should also increase the pace at which they are able to test designs with users, which at the moment is slowed by the requirement to build any changes into the live service before they can be tested
  • conduct more testing with less-experienced users. As things stand, the service requires a high level of assumed knowledge and experience, which will present a barrier to any users who lack this. Although two of the three schemes are currently closed to new users, the team should reflect on other scenarios where a user might lack this experience and accommodate their needs accordingly
  • test all the parts of the service that the user interacts with - online parts and offline parts. In particular, the team should be designing and testing their guidance and email content, which are key touchpoints that have been mapped by the team to multiple stages of their user journey

5. Make sure everyone can use the service

Decision

The service did not meet point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have already conducted an accessibility audit and are working towards having an accessibility statement ready for when the service goes live to users
  • the team has a plan in place to address the issues raised in the audit prior to the service going live to users
  • the team have conducted testing with users who have access needs

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that a user can use their service without the need for JavaScript
  • test their assisted digital support with users, and make sure users who need assisted digital support know that it’s available and how to access it

6. Have a multidisciplinary team

Decision

The service did not meet point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • despite the team being a mix of civil servants and contractors, there was a real sense of team cohesion and joint ownership

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that there are sufficient team members in user research (UR), service design and content design. The proposed team post-Softwire is well resourced for developers, with a lot of QA support too. However, the commitment to continuous iterative improvement will not be met without adequate insight from a user researcher. The team needs to ensure that this key role in particular is filled

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team used a two-week sprint cadence and held regular agile ceremonies
  • the fortnightly show and tells with a large invited group of stakeholders were a great touchpoint for governance and for others interested in learning more about the service

What the team needs to explore

Before their next assessment, the team needs to:

  • consider different ways of working once the service is live and Softwire support has ended. For example, a sprint approach may not be appropriate for running a complex live service. Lean/Kanban approaches should be investigated
  • explore ways of presenting project artefacts that match agile project management. Move away from Gantt-based presentation towards thinking in terms of roadmaps

8. Iterate and improve frequently

Decision

The service did not meet point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • key findings from UR were tagged clearly to show user satisfaction with various facets of the service
  • simple changes to the service were made to provide clear data at a glance in response to user feedback

What the team needs to explore

Before their next assessment, the team needs to:

  • investigate the user journey with new users to validate assumptions about the structure of the service. UR to date seems heavily weighted to existing users of the service
  • consider the value in splitting the front end into three services, one for each distinct user type. This would simplify the user journey. This approach should at least be tested to challenge the current assumptions
  • investigate how to show evidence for iterative changes to the service design, in particular through separate phases of research

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the service leverages existing firewall and virus scanning services from Ofgem’s “Hub” Azure tenancy, avoiding the need to re-procure or re-implement these services
  • automated security scanning of deployment artefacts (Docker containers/Kubernetes pods) is built into the release pipeline
  • automated OWASP ZAP scanning of new releases to pre-production environments is built into the release pipeline
  • the service has been penetration tested at both application and infrastructure level, with no major issues uncovered

What the team needs to explore

Before their next assessment, the team needs to:

  • review the cookie information and consent page to ensure that it is a true reflection of the cookies set by the service. Currently only “necessary” cookies are listed there; users should have to explicitly opt in to any optional analytics cookies that are introduced in future (see the sketch after this list)
  • review the security services provided by the “Hub” subscription to ensure they meet the requirements of this service, and ensure that any monitoring and alerting at the “Hub” level is visible to the service team
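
As an illustrative sketch of the opt-in pattern described above (the cookie name, script path and function names are hypothetical, not the service’s real implementation), optional analytics would only ever load after an explicit user action:

```typescript
// Hypothetical sketch only: names and paths are illustrative assumptions.
// Optional analytics load only after the user has explicitly opted in.

function getCookie(name: string): string | undefined {
  return document.cookie
    .split("; ")
    .find((part) => part.startsWith(`${name}=`))
    ?.split("=")[1];
}

function loadAnalytics(): void {
  // Inject the analytics script only once consent has been given.
  const script = document.createElement("script");
  script.src = "/assets/analytics.js"; // illustrative path
  document.head.appendChild(script);
}

// Default state is "no consent": only necessary cookies are set.
if (getCookie("analytics_consent") === "accepted") {
  loadAnalytics();
}

// Wired to the consent banner's "Accept analytics cookies" button.
export function acceptAnalyticsCookies(): void {
  document.cookie =
    "analytics_consent=accepted; Max-Age=31536000; Path=/; Secure";
  loadAnalytics();
}
```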

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had defined goals for the service that expand their success indicators beyond the standard KPIs
  • the team had made good use of the standard KPIs given the complexity of the service. In particular, they recognised that with multiple endpoints to the journey, more than one completion rate needs to be calculated (as illustrated in the sketch after this list)
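
To make the multiple-completion-rate point concrete, here is a minimal sketch (the data shape is a hypothetical assumption, not the team’s analytics schema) that computes one completion rate per journey endpoint rather than a single figure for the whole service:

```typescript
// Hypothetical sketch: one completion rate per journey endpoint,
// rather than a single rate for the whole service.

interface JourneyRecord {
  endpoint: string;    // e.g. "register-station", "submit-output-data"
  completed: boolean;  // did the user reach this endpoint's success page?
}

function completionRates(records: JourneyRecord[]): Map<string, number> {
  const totals = new Map<string, { started: number; completed: number }>();
  for (const r of records) {
    const t = totals.get(r.endpoint) ?? { started: 0, completed: 0 };
    t.started += 1;
    if (r.completed) t.completed += 1;
    totals.set(r.endpoint, t);
  }
  const rates = new Map<string, number>();
  for (const [endpoint, t] of totals) {
    rates.set(endpoint, t.completed / t.started);
  }
  return rates;
}
```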

What the team needs to explore

Before their next assessment, the team needs to:

  • choose an appropriate tool to set up user behaviour tracking, to develop their understanding of the pain points and errors experienced by users, which can help shape service improvements
  • implement a solution to gather user satisfaction and feedback. Decide on a process for how often to review user satisfaction data and feed it into service improvements
  • consider consolidating the various reports into a single view to make it easier to see how the service is performing

11. Choose the right tools and technology

Decision

The service did not meet point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have aligned technology choices with the wider Ofgem technical stack and organisational expertise
  • existing services and platforms are being re-used where possible, and new microservice components are being developed with organisational re-use in mind
  • the team are speaking to Microsoft to see if the drawbacks of the current Azure AD B2C (ADB2C) implementation can be worked around

What the team needs to explore

Before their next assessment, the team needs to:

  • make the service usable with just HTML, allowing all users to access it. The current ADB2C forms for users to sign up and log in to the service do not function without JavaScript. More than 1 in 100 users will not run JavaScript for one reason or another, so JavaScript should only be used for progressive enhancement of the service, and not for core functionality (see the sketch after this list)
  • find a different identity and access solution if Microsoft confirm that ADB2C cannot be used without JavaScript
  • minimise the manual intervention required in the software release pipelines, to allow fast iteration of the service and deliver value to users as quickly as possible
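
A minimal sketch of the HTML-first pattern being asked for, using a hypothetical Express route for brevity (the real service is .NET-based, so this is an illustration of the principle, not the team’s code): the sign-in round trip completes with a plain HTML form submission, so any JavaScript added later is purely an enhancement.

```typescript
// Hypothetical sketch: the journey works with plain HTML form submission;
// no client-side JavaScript is required to complete it.
import express from "express";

const app = express();
app.use(express.urlencoded({ extended: false })); // parse HTML form posts

app.get("/sign-in", (_req, res) => {
  // A standard HTML form: works in every browser, with or without JS.
  res.send(`
    <form method="post" action="/sign-in">
      <label for="email">Email address</label>
      <input id="email" name="email" type="email" autocomplete="email">
      <button type="submit">Sign in</button>
    </form>`);
});

app.post("/sign-in", (req, res) => {
  // Server-side validation and redirect: the whole round trip is HTML-only.
  if (!req.body.email) {
    res.status(400).send("Enter your email address");
    return;
  }
  res.redirect("/check-your-email");
});

app.listen(3000);
```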

12. Make new source code open

Decision

The service did not meet point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have opened the discussion with Ofgem’s head of assurance about their desire to make the source code open

What the team needs to explore

Before their next assessment, the team needs to:

  • understand the unequivocal requirement of both the Technology Code of Practice (point 3) and the Service Standard (point 12) to make all new source code developed within government open
  • continue to liaise with the Security, Privacy and Resilience Department to get agreement that code can be made open. Security through obscurity has never been a recommended or successful strategy
  • commit to developing all new components of the beta service in the open, except where there is a specific, demonstrable reason to keep something private, for example the implementation of fraud detection algorithms

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are using a range of open technologies to build the service (.NET Core for application development and Kubernetes for deployment)
  • the team are reusing frontend components from the GOV.UK Design System and Ministry of Justice Pattern Library where possible
  • GOV.UK Notify has been evaluated as a communication platform, but does not meet all the needs of the service
  • new microservices are being developed with internal Ofgem re-use in mind

What the team needs to explore

Before their next assessment, the team needs to:

  • consider whether any new frontend components developed for this service can usefully be fed back into the GOV.UK Design System
  • consider open-sourcing the new utility microservices, as there is the potential for them to be re-used more widely across government and beyond

14. Operate a reliable service

Decision

The service did not meet point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are rehearsing, refining and iterating the data migration process, with multiple rounds of quality assurance on both individual records and aggregate statistics on the bulk of data transferred (a sketch of this kind of aggregate check follows this list). This should result in high confidence that data has been properly migrated from the legacy system when the full migration is carried out
  • application logging is in place and a full audit trail of user activity will be implemented before the service is live. Infrastructure monitoring functionality is being built out and will give a view of cluster health and network activity in the service
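
As a sketch of the kind of aggregate-level migration check described above (the record shape and field names are hypothetical assumptions, not the team’s schema), simple totals can be compared between the legacy and migrated datasets as one layer of quality assurance:

```typescript
// Hypothetical sketch: compare simple aggregates between the legacy and
// migrated datasets as one layer of migration quality assurance.

interface StationRecord {
  accreditationNumber: string;
  capacityMw: number;
}

function aggregateCheck(
  legacy: StationRecord[],
  migrated: StationRecord[],
): string[] {
  const issues: string[] = [];
  if (legacy.length !== migrated.length) {
    issues.push(
      `Record count mismatch: ${legacy.length} legacy vs ${migrated.length} migrated`,
    );
  }
  const totalCapacity = (records: StationRecord[]): number =>
    records.reduce((sum, r) => sum + r.capacityMw, 0);
  if (Math.abs(totalCapacity(legacy) - totalCapacity(migrated)) > 1e-6) {
    issues.push("Total installed capacity differs between datasets");
  }
  return issues; // an empty list means the aggregate checks passed
}
```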

What the team needs to explore

Before their next assessment, the team needs to:

  • make the service usable with HTML only. Browsers can fail to run JavaScript for a number of reasons; for the service to be reliable for all users, JavaScript should only be required for progressive enhancement
  • ensure that the service team has suitable visibility of all the logs and metrics they need to be assured that things are working as they should be, so that they can respond to issues promptly. Much of the monitoring is currently “owned” by Ofgem’s infrastructure team rather than directly by the service team
  • develop a clear understanding of what events in the system should trigger alerts to the team (see the sketch after this list)
  • explore options for regularly backing up data outside of the Azure tenancy, for disaster recovery
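
One way to make alert triggers explicit, as a hedged sketch (the metric, window and threshold are illustrative assumptions, not agreed values), is to write them down as simple rules such as “alert when the 5xx error rate in a window exceeds a threshold”:

```typescript
// Hypothetical sketch of a simple alert rule: page the team when the
// HTTP 5xx error rate over a monitoring window exceeds a threshold.

interface WindowStats {
  totalRequests: number;
  serverErrors: number; // HTTP 5xx responses in the window
}

function shouldAlert(stats: WindowStats, threshold = 0.05): boolean {
  if (stats.totalRequests === 0) return false;
  return stats.serverErrors / stats.totalRequests > threshold;
}

// e.g. evaluated every few minutes against aggregated application logs:
// if (shouldAlert(lastWindow)) notifyOnCallTeam(); // notifyOnCallTeam is illustrative
```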
Published 3 February 2022