Biodiversity Net Gain Digital Service

The report for Defra's Biodiversity Net Gain Digital Service alpha assessment on 2 March 2022

Service Standard assessment report

From: Central Digital & Data Office (CDDO)
Assessment date: 02/03/2022
Stage: Alpha
Result: Met
Service provider: Defra

Service description

The Environment Act (2021) requires developers to deliver a ‘biodiversity net gain’ as a condition of their planning permission. To deliver biodiversity gains, habitats can be enhanced on the developer’s own building site, or at an off-site location, by paying a landowner to manage their land.

These off-site gains need to be verified by government and the local planning authorities, to ensure they are legitimate, accountable, and transparent.

The ‘Register land for off-site biodiversity gain’ service helps users record and verify off-site habitats they are creating, to ensure trust and transparency in nature’s recovery.

This service will create an up-to-date record of biodiversity gains across the country, showing where, what, and how habitats are being enhanced by development.

Service users

This service is for:

Primary users:

  • Development-based users:

    • Developers
    • Consultant ecologists

  • Land-based users:

    • Landowners and farmers
    • Land agents and farm secretaries
    • Habitat organisations

Secondary users:

  • Local Planning Authority Officers
  • Register Operator Officers
  • Natural England and Defra
  • UK citizens

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • there were very clear and consistent examples of user-centred design throughout the alpha. User stories are comprehensive and well-articulated

  • user needs were well researched and engagement was thorough. This enabled the team to identify the range of issues, behaviours and motivators and how these complemented and/or conflicted with each other

  • where engagement was restricted, such as with farmers and Local Authorities, proxy measures were used via relevant bodies, unions and Future Farming contacts

  • user journeys and emerging assumptions were externally validated with specific users to ensure that they were robust

  • as part of their prioritisation of user needs, the team did consider further user types such as NGOs and emerging players in this market and assessed their significance

  • plans were in place to continue to refresh their understanding of the user journey, needs and influences on the user experience, iterating the service where necessary

  • the team emphasised the need to use the language of the users, reflecting user behaviours and need for guidance

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that research is conducted with all their user groups - for example, the public was highlighted as a secondary user, but research has focused on primary users. The panel would expect this to kick off in beta as part of the search and view research
  • demonstrate how they are feeding research into the other programme teams, and how the learnings from their private beta are reflected in other work
  • continue to engage and test with lower digital literacy/ accessibility users, and consider how the product will be used across a variety of devices

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has clearly understood and demonstrated constraints that affect the service, including local planning authority mandates, and worked in a user-centred way with policy teams to solve the problems this and other constraints may pose to the BNG service
  • the team has taken a user-centred approach to scoping the service, and focused on the Register part of the journey, ensuring the scope is not too broad or narrow
  • the team was able to clearly explain how this prototype joins up with other things, including the Biodiversity metric 3.0 tool and Planning Portal, into a journey that enables users to register an offsite BNG. The team has taken responsibility and are collaborating with the Defra content team to agree how the above user journey will work
  • the team has considered alternatives to this service, and has scoped this prototype appropriately, coordinating the upscaling of a postal application service with another team
  • the team has been working in the open, sharing findings and learnings through published blog posts

What the team needs to explore

Before their next assessment, the team needs to:

  • continue working towards an easy way for users to provide information, including login details, to the service: the prototype currently uses Government Gateway. The team should continue working towards an alternative sign-on such as Defra ID or GOV.UK Sign In
  • continue working collaboratively with other teams and organisations, including the Defra content team, Natural England, and GDS to solve the Register problem for users

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team works regularly with Defra policy and legal teams to ensure that they are involved in user research and involved in prioritisation decisions
  • designers and researchers regularly incorporate user-generated insight into the design of the service

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure designers and researchers continue to work with front line operations staff, ensuring that the service design and UI is driven by user-generated insight
  • ensure that plans to increase takeup of the digital service do not make it more difficult for offline use of the service, including the postal channel that is being upscaled. For example, the postal application should be signposted well and throughout the digital service, to enable users to access it from any point in their digital journey
  • ensure that the design of the upscaled BNG postal application service and new telephone mechanism take into account relevant insight from the design of the digital BNG service, and vice versa
  • create a detailed service blueprint that shows the different touchpoints a BNG user could encounter, such as the online service, email and paper application touchpoints

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • there is a content designer on the team, showing Defra’s strong commitment to user-centred content design
  • the team has done work to understand how users of similar services to BNG are engaging offline and plan to work with the business in Beta to evolve the operator team and business function and use that data to test with users
  • the team has ensured that the service can be accessed by users with a range of devices by working with the Prototype Kit, GOV.UK Design System, Voiceover, and by performing manual accessibility checks

What the team needs to explore

Before their next assessment, the team needs to:

  • look into the process of 2i in content design (the guidance article was published in 2015 but the process it describes is still valid and worthwhile), and ensure that there is support within the team to carry out this process on the service as part of design iterations. Performing regular 2i provides quality assurance for all content going forward, and will help build on or reinforce content and editorial guidelines within the department
  • explore how the Start page information could be better formatted to provide all the information the user needs upfront - the Guidance content type may serve this type of information better than the existing Start page content type that is being used currently
  • explore how specialist language can be simplified to make the service more accessible and inclusive. The team has done some really good work to ensure that the service uses words that users use. Research has shown that even specialist audiences prefer simple language. Accessibility needs can be temporary and situational as well as permanent, and clear and simple language reduces the cognitive load on the user. Consider using contextual guidance rather than replacing specialist terminology
  • consider making the eligibility check clearer upfront, to make the journey clearer for users - the first questions seem to be assessing eligibility before the user can proceed to registering, but the service name is geared towards registering
  • explore whether it should be the case that landowners are specifically singled out in the questions, or whether this may be isolating the developer audience
  • continue exploring service names to hit on a name that is both concise and unambiguous
  • refer to the Check answers component, specifically when designing the navigation from Check answers back to the service, when a user has decided to make a change to an answer
  • carry out rigorous testing across the service - for example, in section 2, part 3 is made up of two questions but this is not made clear in the progress bar, and the user is told their form is incomplete even though all the sections have been marked as complete

5. Make sure everyone can use the service

Decision

The service did not meet point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • although the scope of this service is England only, the team evidenced how they have engaged with the wider family of government/public sector stakeholders, including Nature Scotland and Natural Resources Wales, which facilitated the sharing of best practice
  • the team maintains an open source repository on GitHub to support quality and interoperability
  • the team has done a lot of research on how to make maps accessible

What the team needs to explore

Before their next assessment, the team needs to:

  • analyse the operational costs versus the benefits of offering supporting services such as a helpline and guidance; this will be a key part of beta
  • carefully assess the dependency on PDFs, as these do not comply with WCAG guidelines and will not be compatible with the longer-term hosting requirements of GOV.UK
  • ensure that there is an assisted digital support mechanism in place, and that it is clearly signposted throughout the service and not just at certain points in the journey
  • ensure that all parts of the journey are fully accessible, including offline parts of the journey and email touchpoints
  • consider making it more explicit upfront that registration can only proceed if you have a Government Gateway ID, so users know what to expect, if this is the solution taken into beta
  • in section 2, consider providing more information before the Check Habitat data screen – to manage expectations that the user will see this next

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a full range of skills in place for Alpha and has identified the additional support that they will need for Beta
  • the team has recognised and mitigated against the challenge of working in a new policy area, ensuring that key points are documented to help any new team members get quickly up to speed
  • the team are working across the programme to share knowledge, including sharing of resource

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that any resource sharing doesn’t impact unnecessarily on their delivery

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have clearly established ways of working, and are able to honestly reflect on and make changes to this
  • the team have good and clear support from their Service Owner and feel supported by their governance processes, as demonstrated by the discussion on scope creep
  • the team were clearly passionate about their work and were able to talk with confidence and enthusiasm about what they’d been doing. Strong relationships were clearly demonstrated and the panel hopes that this can continue as the team develops into Beta

What the team needs to explore

Before their next assessment, the team needs to:

  • no recommendations on this point

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • there is a clear commitment to continuous improvement, sharing their depth of knowledge to feed UR into policy implementation
  • the team is cognisant that dramatically increasing construction costs, which may be curtailing development and building plans, could significantly challenge its predictions around take-up and application transaction volumes, and has referred to Defra’s own market analysis, which is in the public domain
  • the team were able to clearly demonstrate how they have learned from and iterated with the learnings that they have
  • the team has allocated content design resource from the outset, and the service has clearly benefited from this approach

What the team needs to explore

Before their next assessment, the team needs to:

  • consider that BNG may be vulnerable to potential risks of being undermined by conflicting government policies (such as levelling up and economic prioritisation) so cross-government engagement and horizon-scanning will be crucial to longer-term viability

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the service has been tested for security and accessibility, with identified risks mitigated
  • the team is aligned to a set of architectural principles, along with security and data standards and data protection requirements

What the team needs to explore

Before their next assessment, the team needs to:

  • align further with the Data Standards Authority (DSA), to ensure all areas of privacy are being protected in the service

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have very thoroughly considered what they want to measure throughout the service, through the lens of different stakeholder groups
  • the team have sought advice from external communities on what good would look like for targets, and are keen to use the private beta to drive out benchmarks
  • the team have considered how they can tie their metrics in with the monitoring for the wider process, including the helpdesk

What the team needs to explore

Before their next assessment, the team needs to:

  • engage with the cross-government performance analyst community, having struggled to access it so far. The Lead Assessor is happy to make introductions, which should help to remove some of the blockers discussed
  • ensure that they have end goal targets identified (noting that these can change) so that the team know what they want to achieve

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • in preparation for purchasing, a clear proof of concept (PoC) approach was taken, including a defined process for software testing
  • the team adhered to the rules and guidelines for software selection
  • a review against WCAG was carried out
  • Defra has cloud hosting capability on both Azure and AWS; the service uses the Azure cloud, following cloud-first recommendations

What the team needs to explore

Before their next assessment, the team needs to:

  • address the handling of PDF file uploads
  • further explore Content Disarm and Reconstruction (CDR) software for PoC testing and purchasing
  • ensure the service is assessed against the current version of WCAG (2.1)

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the Azure side has standard mechanisms for CI/CD pipelines, including the use of Azure Pipelines/Azure DevOps; this will link to the public GitHub repositories
  • the team works with open source repositories
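
As an illustration of the arrangement described above, a minimal Azure Pipelines definition might look like the following sketch (the trigger branch, agent image and npm-based toolchain are assumptions for illustration, not details confirmed by the team):

```yaml
# Minimal Azure Pipelines sketch: build and run unit tests on every push to main.
# Branch name, pool image and npm toolchain are illustrative assumptions.
trigger:
  branches:
    include:
      - main

pool:
  vmImage: ubuntu-latest

steps:
  - script: npm ci
    displayName: Install dependencies

  - script: npm test
    displayName: Run unit tests as part of the regression suite
```

Keeping a pipeline definition like this in the public GitHub repository alongside the code means the build and test process is itself open and reviewable.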

What the team needs to explore

Before their next assessment, the team needs to:

  • define and document their processes for working with open source

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the HTTPS and TLS 1.2 protocols were used as open standards
  • the team is also using Defra software development standards (common standards across the estate)
  • the team published a blog on their processes
  • the team used learning from other departments
  • the team is using GeoServer, an open source map server, for the mapping functionality

What the team needs to explore

Before their next assessment, the team needs to:

  • explore an API gateway for managing the interface between clients and backend services

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • open standards were being incorporated and utilised
  • the team explored wider government knowledge and looked across the government estate at what can be learnt from other mature HMG services
  • the team plans to automate existing unit testing; run as part of regression within the CI/CD pipeline, this will help ensure a reliable service by preventing bug fixes from breaking existing processes

What the team needs to explore

Before their next assessment, the team needs to:

  • explore third-party software, conducting an additional PoC process for macro-enabled metrics in the next stage
  • explore alternatives to PDFs
  • ensure this service is not tied to a cloud provider

Published 4 April 2022