Digital Land alpha assessment report

The report for DLUHC's Digital Land alpha assessment on 18 January 2022.

Service Standard assessment report

Digital Land

From: Central Digital & Data Office (CDDO)
Assessment date: 18/01/2022
Stage: Alpha
Result: Met
Service provider: Department for Levelling Up, Housing and Communities

Service description

Digital Land makes planning and housing data easier to find, understand, use and trust. Our platform collects planning and housing data from a large number of different data providers, including devolved local planning authorities. The data is transformed into a consistent state, nationally. The data can be visualised on a map, searched, and downloaded in a number of different formats, designed to meet the needs of data consumers.

Service users

This service is for:

Data Providers

People and organisations who are producing, maintaining, and publishing data relevant to planning and housing, for example Local Planning Authorities, Central Government Departments and the Planning Inspectorate (PINS).

Data Consumers

Digital teams inside and outside of government developing services to transform the planning and housing system, and people and organisations that need planning and housing data for analysis that informs policy decisions and increases the transparency of the planning and housing system, including Central Government, PropTech companies, policy colleagues within DLUHC, and campaign groups.

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a good understanding of its users and their needs, and identified the various roles that users of this service would undertake. The service team provided a good high-level overview of the users
  • the service team has carried out a significant amount of user research, using a variety of methods

What the team needs to explore

Before their next assessment, the team needs to:

  • carry out user research with participants with low digital skills

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team started by focussing on areas where data already existed but tested the risks around areas with limited data
  • they identified early on that rather than fixing this for each service, a platform was the best solution to solve issues with making data available and more consistent
  • the team focussed on testing its riskiest assumptions
  • the team identified that access to consistent data was a problem that blocks the development of good services

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • Digital Land is a distinct platform from Data.gov.uk, meeting a specific need. However, there is a risk that data providers are having to publish data to multiple government platforms

What the team needs to explore

Before their next assessment, the team needs to:

  • explore and clarify the distinction between Digital Land and Data.gov.uk for users in beta

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has used the GOV.UK Design System where possible and engaged with the design community when developing components further, for example the map feature. The team will need to test these components thoroughly, but the principles behind them are sound
  • data is being presented without bias and transparency is a core consideration

What the team needs to explore

Before their next assessment, the team needs to:

  • develop an onboarding process for data providers as the service progresses
  • review in beta the concern that the service creates more work for data providers. The team mentioned that providing data will become more of a priority for these users as policy changes are implemented

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has mapped the points in the process where human intervention may be required and tested the interventions they may need to make

What the team needs to explore

Before their next assessment, the team needs to:

  • test the service to make sure it works for users with a range of needs as they scale up to more providers

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a good mix of civil servants and specialist contract support, with a clear plan to recruit permanent roles to build internal capability (where necessary) and manage knowledge transfer
  • the team has put some thought into how the work will scale in the next phase and how teams might be structured to handle the increased work
  • there is a good understanding of the specialist, shorter-term skills required versus the more permanent roles required

What the team needs to explore

Before their next assessment, the team needs to:

  • demonstrate that the new structure is operating successfully, especially how the new front door team manages demand for the rest of the expanded portfolio
  • demonstrate that the governance structure empowers the teams rather than slows down delivery

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has demonstrated exemplary ways of working in the open
  • agile principles are embedded in the team’s work, avoiding agile process dogmatism
  • the team has been pragmatic in how it works with the wider organisation

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that their approach to working in the open is not diminished during the next phase
  • increase the use of their public channels to reach out to users

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has been building in feedback loops so providers can see the benefit of providing their data
  • usage data is being monitored and the team is proactively contacting local authorities who might be struggling
  • examples were shown where research helped identify what metadata needed to be added
  • the service has been simplified significantly based on research to reflect the user’s expectations

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the service generally uses publicly available data harvested from local planning authorities. However, the team was sensitive to issues of disclosing personally identifiable information (PII) and cited this as the reason for some data processing steps that obscure PII from source documents. Compulsory Purchase Orders seemed to be a particular concern for the future
  • the data workflow and continuous integration pipeline seems like a pragmatic solution for alpha, given the close connection to GitHub identities and the ability to enforce multi-factor authentication (MFA) for developers. The new workflow runner component will need to be looked at carefully to check permissions

What the team needs to explore

Before their next assessment, the team needs to:

  • threat model - We did not see much in the technical documentation about protection against spoofing of the service, which could enable fraud. We understand that the ultimate responsibility for data remains with the publishers but the reputation of the service as a nationally-recognised front-door to this information is significant. The team should conduct explicit threat-modelling and document their mitigations for any potential security and privacy risks they identify
  • look at integrity risks. While confidentiality isn’t a primary concern for a system of public record, the team could consider solutions to protect the integrity of the data: for example, shared secrets or public key infrastructure beyond standard TLS, or out-of-band hashing and comparison of expected data files
  • cache - The RESTful nature of the frontend application should make third-party edge caching relatively straightforward, so the service should ensure availability by caching aggressively wherever possible. The team may want to explicitly consider how, and for how long, to keep aggregated data online after a breaking change or unavailability of the source dataset from the local publisher
  • least privilege - In the current design, system administrators might have too much access to potentially change data. Further technological choices may need to be made in beta to support wider authentication requirements and to enforce the “least privilege” principle
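To illustrate the out-of-band hashing suggestion above, a checksum for each harvested file could be compared against one obtained separately from the publisher. This is a minimal sketch in Python, not part of the service; the function names and the idea of a separately published checksum are assumptions for the example.

```python
import hashlib


def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of a data file, reading in chunks
    so large downloads do not need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify_integrity(path: str, expected_hex: str) -> bool:
    """Compare a harvested file against a checksum obtained out of band
    (for example, published by the local planning authority separately
    from the data file itself)."""
    return sha256_of_file(path) == expected_hex
```

A mismatch would indicate the file was corrupted or tampered with in transit and should be re-fetched rather than ingested.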

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has put some serious thought into performance metrics
  • they have a good understanding of how they might measure success
  • they have already built dashboards that track performance measures

What the team needs to explore

Before their next assessment, the team needs to:

  • turn their thinking into service performance KPIs

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the asynchronous Python-based workflow is an efficient way of harvesting slowly-changing data and is likely to stay compatible with local planning authority endpoints over time
  • the service is successfully aggregating almost 400 data sources in a consistent way
  • the selection of public commodity cloud hosting in Amazon Web Services and Azure is appropriate for this application and commodity cloud components are well-used

What the team needs to explore

Before their next assessment, the team needs to:

  • consider the API - The question of whether the service is or should be a website, an API or a data download service (presumably with stable links to standard filetypes) is not completely solved, and is critical to understand for the beta phase. An early-version API is partly documented but we did not see evidence of testing this API with service teams, other than an indication that the Reducing Invalid Planning Applications (RIPA) service would like to consume service data using an API. The opinion of the panel is that a well-governed API is a good goal for the service, and that such an API should allow users to download datasets in bulk. Creating a beta API may require setting up a developer support team to engage with users of the beta API, along with new streams of work to research, iterate, improve, monitor and secure the API
  • add metadata - While the data handled by and exposed by the service is quite well documented, individual datasets available for download do not tend to have attached metadata. Attaching this accurately may require new data handling tools; the new workflow management and sequencing engine may help with this
  • optimise performance - Harvesting is asynchronous and not likely to be a bottleneck. Handling the display and processing of geographic data is another matter, and the existing tools have not yet optimised the performance of map layers. Caching geodata is a hard problem, but to make the service usable at the desired scale the team will need to make a concerted effort to set performance goals and address performance issues

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team needs to explore

Before their next assessment, the team needs to:

  • keep the architectural decision log up to date and fill in missing information
  • continue to use open repositories to represent both processing logic and data as far as possible
  • where documents are to be cached in some other way (for example, if there is a need to cache binary PDFs), the team should find a similarly open way to store and retrieve the raw files
  • confirm that the data licences for all of the documents and data being processed from local planning authorities are compatible with the Open Government Licence of the service

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team adopted the standard GOV.UK Design system appropriately, extending where necessary
  • the use of common formats and tools, such as comma-delimited text files and GitHub Actions-based workflows, creates a “lowest common denominator” feel to the service, which seems appropriate – the team has avoided unnecessary complexity
  • the use of the GOV.UK Registers model in the specification of data was particularly welcome

What the team needs to explore

Before their next assessment, the team needs to:

  • look at metadata - Consider adopting CSVW (CSV on the Web) or another metadata standard to define delimited files more accurately. Metadata can help in the long run with the versioning, provenance, linking and structured vocabularies of the many data files heading in and out of the system, and is a standard for government
  • contribute to the Design System Maps component as it develops
  • consider whether the service could interoperate with data.gov.uk, either through reciprocal links or even as a custom subdomain. Given the specialist nature of the harvesting workflow, the panel agrees with the service team that a separate application is appropriate
  • consider PaaS - The team is already considering hosting using the GOV.UK Platform as a Service, which ought to be a good fit for this service and would offer some advantages
  • consider the role of a federator as opposed to a system of record. If the data pipeline begins to apply business logic of any kind, such as has been considered with the redactions of personally identifying information or entity and link resolution, the service could become a de facto system of record rather than simply a federator of local planning authority data existing elsewhere. The team should consider their approach to this and what responsibilities they would have operating a system of record
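As an illustration of the CSVW suggestion, a minimal metadata document for a hypothetical dataset might look like the following, built here as a Python dictionary and serialised to JSON. The file name, columns and licence URL are assumptions for the example, not the service’s real schema.

```python
import json

# Minimal CSVW (CSV on the Web) metadata document describing a delimited
# file: its columns, their datatypes, the primary key and the licence.
# The dataset name and columns below are illustrative, not the real schema.
csvw = {
    "@context": "http://www.w3.org/ns/csvw",
    "url": "conservation-area.csv",
    "dc:license": {
        "@id": "http://www.nationalarchives.gov.uk/doc/open-government-licence/version/3/"
    },
    "tableSchema": {
        "columns": [
            {"name": "reference", "titles": "reference",
             "datatype": "string", "required": True},
            {"name": "name", "titles": "name", "datatype": "string"},
            {"name": "start-date", "titles": "start-date", "datatype": "date"},
        ],
        "primaryKey": "reference",
    },
}

print(json.dumps(csvw, indent=2))
```

A document like this would typically be published alongside the CSV file (for example as `conservation-area.csv-metadata.json`), so consumers can validate and interpret each column without guessing.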

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the federation service appears to work reliably for the roughly 400 datasets being tested, with regular re-harvesting of datasets from documented local planning authority sources
  • the service relies on commodity cloud-hosted services which are themselves reasonably reliable
  • the team already uses high-availability hosting for some components, with plans to expand the resilient components in beta

What the team needs to explore

Before their next assessment, the team needs to:

  • look at links - In beta, the team will need to look more carefully at issues arising from “link rot”, in which local data sources change formats or locations over time or drift out of date in other ways
  • monitor and throttle - The team already has plans to add more monitoring and alerting to the service, which is welcome. The team has already experienced some automatic traversal from bots, and will need to create a solution to throttle (not necessarily block) these requests
  • design robust APIs - If the team decides to go “all-in” on the API strategy, the team should follow the government API design guidance (including technical guidance) to ensure that the data API is robust and maintainable
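Throttling rather than blocking is often implemented as a token bucket per client, answering over-limit requests with HTTP 429 and a Retry-After header rather than refusing them outright. The sketch below is a hypothetical illustration of that pattern, not the team’s planned solution.

```python
import time


class TokenBucket:
    """Token-bucket rate limiter: each client gets a bucket that refills at
    a steady rate up to a burst capacity. Requests over the allowed rate
    are throttled (asked to retry later) rather than blocked outright."""

    def __init__(self, rate_per_sec: float, burst: int):
        self.rate = rate_per_sec          # tokens added per second
        self.capacity = float(burst)      # maximum burst size
        self.tokens = float(burst)        # start with a full bucket
        self.updated = time.monotonic()

    def allow(self) -> bool:
        """Spend one token if available; return False when over the limit,
        in which case the caller would respond 429 with Retry-After."""
        now = time.monotonic()
        elapsed = now - self.updated
        self.tokens = min(self.capacity, self.tokens + elapsed * self.rate)
        self.updated = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```

A bot crawling aggressively would drain its bucket and receive 429 responses until the bucket refills, while ordinary users with occasional requests are never affected.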
Published 3 February 2022