Local Land Charges - Alpha Assessment

The report from the alpha assessment for Land Registry's Local Land Charges service on 19 September 2016 and reassessment on 31 January 2017.

Stage: Alpha
Service provider: HM Land Registry

About the service

Service Manager: Allison Bradbury

Digital Leader: John Abbott (LR) / Emma Stace (BEIS)

The service will provide users with accurate and up-to-date information on Local Land Charges, and provide a mechanism for relevant Local Authority staff to update Charges.

Assessment Result - 19 September 2016

Result: Not met

To meet the Standard the service should:

  • Conduct additional research and development into broader use of the service with users in other roles, including citizen and professional users.
  • Ensure that the team size and make-up is appropriate, particularly by recruiting a content designer and front end developer.

Detail of the assessment - 19 September 2016

Lead Assessor: Simon Everest

User needs

The team have focussed their research on the users in local authorities who record local charges. They have done some contextual research and some usability testing of early prototypes, and they have collected what they know about users into personas and other representations of user needs.

The team has learned that there is wide variation of process between authorities, and considerable variation of skill and knowledge among users - particularly in mapping the area a charge covers.

The team’s understanding of the details of these working practices is limited. For example, the team had identified the need for users in local authorities to create an accurate record of charges. But the team could not explain how local authority staff do that currently, what problems there might be, and how the proposed design will support users in meeting that need.

The team has tested prototypes with local authority users. But the lack of an integrated prototype has limited the team’s ability to test their proposed design and get evidence that the design will work for the different types of users in different types of authorities. And the lack of a content designer has limited the team’s ability to refine and test the language used in the prototype, and to resolve issues with terminology.

All the local authority users that the team have found have above-basic digital skills. But the service is quite complex - creating large and detailed map polygons - and the team have identified users who may struggle with the task, and represented those users in a persona. The team presented their thoughts about how they might support local authority users, but it was not clear how this related back to what they’d learned about those users.

The team could not show evidence of any research with disabled people working in local authorities. The team had done some thinking about accessibility and planned to commission an accessibility audit and do some accessibility testing in beta.

The team have done little research with professionals, businesses and members of the public who will search for and use information about land charges. The team planned to incorporate those users during beta.

But these are large and complex user groups. Understanding their needs and designing a service that works for them will require a separate discovery and alpha.

Building a service for the local authority users before doing any discovery or alpha work with professionals, businesses and citizens has inherent dangers. For example, the prototype for local authority users asks them to enter additional information about a charge, including free text. How will other types of users see that information? Will they be able to understand and use it? Will it meet their needs?

Team

There is a large team in place, covering most of the expected roles, but with a couple of gaps: front end developer and content designer.

The lack of a front end developer in the team was an issue. This manifested itself in the lack of an integrated alpha solution for usability testing, and must be addressed prior to beta.

The team didn’t have a dedicated content designer, and the solutions presented suffered because of this. A content designer will be able to work with the research team to learn more about how people understand the legal or technical language used within this domain.

The danger of focussing on the internal-facing service first is that the team aren’t challenging complex language, which will become more obvious when designing for, and conducting further user research with, citizens. This feels like a missed opportunity at the moment. There’s a danger that this could make delivering a simpler, clearer citizen-facing service more difficult if it isn’t considered now (i.e. the language used is a core part of how the data is managed and organised).

Technology

The service team are building on existing platforms that other teams in the Land Registry are using, and are using the same deployment pipelines. The integration between this team and the other teams was positive, particularly on the delivery of shared addressing and mapping services.

The panel were pleased by the interactions between the service team and the Data teams at GDS. The use of the registers data specification in the schema for Local Land Charges was good to see, and the team had performance tested this schema against a relational schema showing due diligence in the selection of the data model.

The team had deviated from the open standard for location points (https://www.gov.uk/government/publications/open-standards-for-government/exchange-of-location-point). The rationale for this was that all the maps used in the service are based on OSGB36. The panel would like to see the team make the data in the service also accessible in the open standard.
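As an illustration of what making the data accessible in the open standard could involve, the sketch below converts an OSGB36 / British National Grid point into latitude and longitude. This is a minimal, hypothetical example rather than the team’s implementation: it assumes Python with the pyproj library, assumes ETRS89 (EPSG:4258) as the target coordinate reference system, and the GeoJSON-style wrapper is purely illustrative.

```python
# Hypothetical sketch: exposing an OSGB36 (EPSG:27700) easting/northing
# as ETRS89 (EPSG:4258) latitude/longitude for open-standard exchange.
# Assumes the pyproj library; the output structure is illustrative only.
from pyproj import Transformer

# always_xy=True makes the axis order explicit: (easting, northing) in,
# (longitude, latitude) out.
to_etrs89 = Transformer.from_crs("EPSG:27700", "EPSG:4258", always_xy=True)

def charge_point_as_open_standard(easting: float, northing: float) -> dict:
    """Return a charge's location point as a GeoJSON-style feature in ETRS89."""
    longitude, latitude = to_etrs89.transform(easting, northing)
    return {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [longitude, latitude]},
        "properties": {"source_crs": "EPSG:27700"},
    }

# Example: a British National Grid point in central London
print(charge_point_as_open_standard(530000, 180000))
```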

The team should ensure that they understand the technology stack used by the users and ensure that any in-browser solutions are developed using progressive enhancement techniques so that the solution is usable by all Local Authorities.

Design

The focus of the alpha was on a tool for local authorities. The user needs presented as part of the assessment all reflected this tool.

There was a lack of citizen focus in the assessment. The real user need is something like: “find out about land charges, or show me all the data about the piece of land”. This is how people find out about things like tree preservation orders, listed buildings, etc. It doesn’t include other things that might affect the land in the future (out of scope). Delivering this service would eventually make it cheaper and faster for people to get this information.

The vision for the service is to create one register that holds all this data, which is then kept up to date by local authorities.

The main purpose of the alpha (8 weeks) was to test the feasibility of the data integration, to talk to GDS about registers (good progress), and to see if the mapping tools/technology would work. The key questions the technical demo and alpha were trying to answer were:

  • Does it work with the register spec model?
  • What is a local land charge (redefining the concept/model for how this is maintained)?
  • Should everything have a polygon?
  • Will local authorities use the service to keep the information and data up to date?

Service Scope/Future plans

It wasn’t clear for much of the assessment if we were looking at “register a local land charge” or “Local Land Charges” as a full end-to-end service.

The team had identified work on ‘citizen-focussed aspects of the service’ for beta, but it’s important they now do further discovery/alpha work to support the citizen-facing work, even if the technical build around the data/register continues into beta.

In terms of what was presented for an alpha assessment, there wasn’t a clear enough proof of concept (end-to-end demo) that shows us the way forward.

Evidence of different approaches

Most testing and iteration had been focussed on the technical solutions for the map interface. Changes were indicated alongside versions of the Heroku-hosted prototype.

The team had developed personas for local authorities but it wasn’t clear how/if these had shaped the design direction for the service.

Plan for digital service being unavailable

Plans were in place for this (see technical section).

Support model for users with lowest digital skills

Testing in alpha hasn’t come across people with lower digital skills. The team need to extend their research to test with people with lower digital skills - even if this means extending the research to people who aren’t experienced local authority staff, or who don’t understand the technical language and Land Registry terminology.

The team talked about learning and development being in place. The panel challenged the team to make this work first time for people by focussing more on content.

Users succeed first time

With the focus on using the map as a drawing interface, it wasn’t clear how progressive enhancement, access via smaller screens or touch interfaces, and accessibility were being considered. The team were confident that the solution would be accessed on desktop devices, but did have concerns about legacy/older browsers within local authorities.

Consistency with GOV.UK

The team presented a more technical build and a further ‘alpha’ prototype. It wasn’t clear why the working maps demo wasn’t brought more in line with GOV.UK styling. The team should look to combine these prototypes for further testing as soon as possible.

The alpha prototypes were using the GOV.UK toolkit, but more work needs to be done to look at appropriate styling for an internal system. The team should look at the examples from other departments on Hackpad, and might also find the DWP elements and patterns useful for further examples when designing an admin interface.

The team also need to think more about how to monitor the quality of content/data maintained by local authorities, including looking at tools as part of a data quality strategy.

Digital take-up

The team hadn’t thought about start pages, and didn’t explain how people would find or access the service (even within local authorities). User journey maps were presented, but they need to show what happens before and what happens next as part of the service design.

The plan for digital take-up was unclear; the team talked about a phased or dual rollout.

During their presentation the team showed that 38% currently use some sort of digital mapping interface, but the team will need to develop further support to work with the remaining 60%+ to move to a digital solution.

There’s no Assisted Digital lead for the team at the moment (covered part-time in the Land Registry). This needs someone with operational oversight to lead on assisted digital learning and planning.

Analytics

The team have begun to plan for implementation of analytics capture, including the four agreed KPIs.

Recommendations - 19 September 2016

To pass the reassessment, the service team must:

  • Demonstrate evidence of needs for the whole service, beyond officials working on updates to the register.
  • Fill gaps in team capability, particularly through recruitment of an appropriately skilled content designer and a front end developer.

The service team should also:

  • Continue efforts to identify users with accessibility requirements to conduct one-to-one research with, alongside continuing existing plans for an accessibility audit.

Digital Service Standard points - 19 Sept 2016

Point Description Result
1 Understanding user needs Not met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Not met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Not met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Not met
13 Ensuring consistency with the design and style of GOV.UK Not met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it n/a

Reassessment Results - 31 January 2017

Result: Met

The service met the Standard because:

  • The team have refined and clarified the service scope following feedback from the original assessment. This addressed the panel’s concerns about the breadth of work carried out during Alpha, and the lack of end-to-end service focus. The service is now focused on Local Authority staff (primarily) making changes to the register.
  • The service is now able to process and output data in open formats, not just OSGB36.
  • Land Registry have enhanced the team with additional specialists recruited into key posts, and partnership with a supplier.

Description

The service will enable Local Authority users (and a limited number of non-LA authorised users) to add and amend entries in the Local Land Charges register. Land Registry is currently working to migrate existing Local Land Charge registers from individual local authorities into a single central register. This new service will allow authorised users to make changes to the data they’re responsible for in the new central register.

Following an assessment on 19 September 2016, the service has been reassessed against points 1, 3, 9, 12 and 13 of the Digital Service Standard. This report covers only these 5 points, and should be read in conjunction with the earlier assessment for comments about the remaining elements of the Service Standard.

Detail of the reassessment - 31 January 2017

Lead Assessor: Simon Everest

Point 1 – Understanding User Needs

The service has clarified its scope to focus on local land charge officers in local authorities, and their need to maintain an accurate register of local land charges. They have started with this need as the team cannot build future citizen-facing services without first solving the underlying data issues.

The team have done good research to understand the work processes in councils, and created personas to describe what they have learned.

There’s more to do, but the team now have good research questions for beta e.g. workflow, specialist knowledge, mapping tools, etc.

The team have done some user research on the service name. They should do more.

Point 3 – Sustainable Multidisciplinary team

The team has addressed concerns about capability and filled key gaps around content design and front end development. A dedicated content designer is essential for services which have traditionally been hard to understand for people outside the legal or conveyancing profession. Additional front end development capability is vital to allow rapid, low-cost prototyping to research alternative hypotheses during early development phases.

Land Registry have additionally recruited a Design Lead for the organisation, which will help improve standards across the organisation’s portfolio as well as providing additional connections to the government design community.

The team are working with Kainos, who are providing additional short-term capability whilst supporting development, upskilling and knowledge sharing with permanent Land Registry staff.

Point 9 – Using open standards and common government platforms

The team rapidly demonstrated that they were able to process alternative, open standards-based data formats alongside the public-facing use of the OSGB36 format. They have a strong user need for this approach for the public service, but are not constrained by it, and have demonstrated their ability to transform to alternative data standards in the back end.

Point 12 – Simple and Intuitive enough

The team demonstrated that they were iterating the service based on learning from user research – e.g. moving the legislation picker to the beginning of the user journey to avoid wasted effort.

The team have done some work on accessibility, researching with people with access needs, and are planning an accessibility audit. The organisation have disability champions in their developer community. In selecting private beta partners, the team should ensure they have good numbers of users with access needs, and make specific plans to learn from those users.

There are some areas that require further content design work and research, particularly around language, including legal terminology and in navigation/options (for example, the ‘miscellaneous’ legislation category). The team also need to test the map editor in the end-to-end flow - immediately by simply switching between the two in a usability test, and soon by embedding the map editor.

Point 13 – Consistent User Experience with GOV.UK

The team demonstrated that they are using GDS styles and the GOV.UK prototyping kit. Some UI patterns still need work and could be more consistent with GOV.UK.

It is important that the team are able to show the complete end-to-end user journey that users are expected to take. If the service is expected to include a GOV.UK start page, it is vital that the team understand how Local Authorities will find this page (for example, understanding their existing internal services such as an intranet or portal) and can demonstrate this end-to-end service for a range of local authorities.

Recommendations - 31 January 2017

To pass the next assessment, the service team must:

  • Continue prototyping and experimentation in parallel with ongoing technical development, ensuring that alternative hypotheses are tested throughout (particularly relating to mapping and the end-to-end journey). There are some outstanding questions from Alpha that will require a broad approach to research, and shouldn’t simply mirror the technical build.
  • Provide a clearer focus on the end-to-end journey users will undertake – we can’t assume that these users will ‘start from Google’, so a deep understanding of work patterns in Local Authorities is needed.
  • Continue to research and iterate the language used in the service, up to and including the service name, to ensure it makes sense to newcomers as well as experienced staff.

The service team should also:

  • Share design patterns back to the cross-government design community (Hackpad etc.) where they have been developed beyond existing patterns. Experience researching the mapping interface and UI would particularly benefit the wider community.

Digital Service Standard points - 31 January 2017

Point Description Result
1 Understanding user needs Met
3 Having a sustainable, multidisciplinary team in place Met
9 Using open standards and common government platforms Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
Published 12 October 2017