Sign your mortgage deed - beta assessment

The report from the beta assessment for HM Land Registry's sign your mortgage deed service on 22 August 2017.

From: Government Digital Service
Assessment date: 22 August 2017
Stage: Beta
Result: Met
Service provider: HM Land Registry

The service met the Standard because:

  • The team have developed a service which is demonstrably easy to use, tackling a complex problem in a user-centric way.
  • The team have successfully implemented GOV.UK Verify, in a service in which it is vital to accurately and dependably authenticate the user.
  • The team are working in an agile, iterative and open way, learning from doing and sharing their code.

About the service

Description

The service enables users remortgaging their property to sign their mortgage deed online. This is currently a paper-based process, requiring conveyancers to send paper forms to borrowers, which are then signed and returned. The service is using GOV.UK Verify to provide effective authentication, replacing the current requirement for witnessed ‘wet’ signatures.

Service users

The users of this service will be remortgaging a property, eg in the event that they are seeking a better mortgage deal, or releasing equity to fund lifestyle changes or significant purchases.

Detail

User needs

Initial research conducted by the team indicated that the current paper-based service could be improved and made easier by making it digital. The research has also given the team a good understanding of the users of the service (borrowers, lenders, conveyancers, and internal caseworkers), and of their needs and the pain points they currently have to overcome. The team also demonstrated a clear empathy for the users, which was encouraging to see.

The panel were impressed with the quality of research the team has conducted in the alpha and private beta stages, and with the outputs that have been produced, eg personas and journey maps. This has included surveys with contact centres, pop-up research, contextual research with conveyancers, and user testing in the lab. The panel were pleased to see that the research was undertaken by the whole team, that key stakeholders viewed lab sessions, and that show and tells were used to communicate findings further.

They have also conducted some usability testing with people with disabilities; however, this has mostly been conducted through an audit by DAAC, and moving into the public beta stage the team must incorporate this testing into their regular sprint patterns. They have also conducted research with people who have low digital skills, and should continue this going forward.

Most of the user testing to date has focused on users who can use the service and manage to sign their mortgage deed digitally. In public beta we would recommend that the team do more work looking at the unhappy path, and how they support users who can’t complete the service first time. The team mentioned that the support for the service will be covered by their contact centres, and that they had done some testing of scripts with staff. We would recommend that they continue to do this, measure the need and effectiveness of this support model, while ensuring it provides the correct support for those who need it.

Team

The panel was impressed at the extent to which the team was made up of permanent civil servants, with an approximate 9:1 ratio of permanent staff to contractors. All of the key roles were in place, including access to a dedicated data analyst, and the team was receiving additional support from subject matter experts from across the organisation, and their IT infrastructure support.

It was evident that some challenges had been faced in combining the agile, iterative approach of the digital team with the more traditional approach of the IT infrastructure group, but that pragmatic, workable solutions were in place, resulting in an effective collaboration. This should continue, and the team should be aiming to release more regularly than is currently the case, as a matter of routine.

The panel welcomed the presence of the team’s legal advisor at the assessment, and it was clear that support from Legal and other functions in HMLR had been vital in ensuring that barriers to effective service delivery were resolved.

Technology

The team demonstrated a solid and pragmatic technical solution which fulfilled the needs of the users, business and technology communities with appropriate trade-offs where necessary.

The team showed a clear ability to iterate their service, with an automated deployment pipeline and infrastructure-as-code environment builds on the public cloud. This isn’t yet the case when working with the brownfield on-premise mainframe estate, but a good working relationship with the more traditional IT release management team was displayed and a pragmatic way of working is in place. This is deemed an appropriate trade-off, allowing the service to move forward without becoming embroiled in the broader legacy deployment problem.

The team has a reasonable approach to privacy considerations and to security in general. It is recommended that they consider communicating with the wider government community about their use of attack trees.

The team is correctly applying the open source by default principle, with suitable decision making where a repository remains closed. It is recommended that the team investigate whether coding directly on GitHub has worthwhile benefits compared with the current model of sanitising and publishing from GitLab.

The team are integrating with common platforms where the opportunity arises; for instance, Verify is used for authentication. Since notifications are an aspect of the service, it is recommended the team liaise with the Notify team to assess whether it is applicable to their needs.

The performance of the service, in terms of testing, resilience and disaster recovery, appears commensurate with the volume of transactions expected during the next delivery phase. Further investigation would have been made during the assessment if the expected usage of the service were higher. It is recommended the team implement their plans for an information radiator showing insightful performance dashboards (specific to the service) before expected usage rises to a significant volume.

The panel also wanted to congratulate the team on their solid use of APIs for extending reach into third parties such as Coventry Building Society. We look forward to seeing this continue with other lenders and intermediaries.

Design

The team demonstrated how they have tried a number of different approaches to meet user needs, experimenting with a transactional flow, which asked users questions based on their mortgage deed, and a document flow, which presented the deed as a single document.

The team explained how, through lab research, they were able to observe that the document flow caused the least confusion for users and was more successful overall.

The team also explained how they had tested different approaches to ensure users were aware of the significance of signing the mortgage deed and how research showed the approach they have chosen has the best combination of security, confidence and clarity.

The team tested an unhappy path where users didn’t receive their code immediately. They have also changed the two-factor code so it doesn’t contain any ambiguous characters that could confuse users.
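The idea of a code with no ambiguous characters can be sketched as follows. This is an illustrative example only, not the team’s implementation: the alphabet and code length are assumptions.

```python
import secrets

# Alphabet with easily confused characters removed:
# no 0/O, 1/I/L, 2/Z, 5/S or 8/B.
UNAMBIGUOUS = "ACDEFGHJKMNPQRTUVWXY34679"

def generate_code(length: int = 6) -> str:
    """Generate a cryptographically random one-time code
    using only unambiguous characters."""
    return "".join(secrets.choice(UNAMBIGUOUS) for _ in range(length))

print(generate_code())
```

Using `secrets` rather than `random` matters here, because the code is a security control rather than a cosmetic token.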

As the user needs a link from the conveyancer to start the service, there is no current journey from any GOV.UK content. The content review sets out some recommendations for the start page content.

The team have used the GOV.UK design patterns; however, there are a couple of small design issues that are raised in a separate design review document. These should be addressed before going into public beta.

Analytics

The team have taken a thorough and considered approach to measuring and analysing the service. They have strong support from an experienced data analyst (on a part-time basis), and are effectively linked into the wider GDS performance community. They’re using Google Analytics effectively to monitor users’ journeys and pain points through the service, although the relatively low number of unmoderated users accessing the service at present inevitably means only a limited amount of quantitative data is currently available. This should grow as service use increases, and the team are in a good place to gather intelligence and act on it.

Recommendations

To pass the next assessment, the service team must:

  • Consider integrating GOV.UK Notify to support the text messaging service. Notify is already in place elsewhere in HMLR, and would likely be a good fit for this service.
  • Test more unhappy paths and support models to ensure users are able to resolve any issues that may arise, including those on the side of the conveyancer or lender.

The service team should also:

  • Consider the separation of concerns and responsibilities in the architectural components. This service is clearly implementing an architectural paradigm, based on microservices, for a broader programme of work delivering a platform, which is a valid thing to do. It is recommended that, rather than going straight to microservices, small monoliths are considered; these allow greater developer productivity and let the problem domain be explored quickly and easily in order to define the right boundaries for the services.
  • Continually assess the use of the eSecurity service and develop an exit plan. Although it was already operational when the service began and is fit for purpose, it does represent a potential vendor lock-in. In the future, the features it provides might be best served by a component owned and run by the Land Registry.
  • Build on the success of exposing APIs over the mainframe to start removing data and features from the mainframe, in order to strangle away its use.
  • Work through the implications of extending use of the service into initial mortgaging, once the remortgaging service is working effectively and reliably in public beta.
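The "strangle away" recommendation above is usually realised as a strangler-fig pattern: a routing layer in front of the estate sends requests for migrated features to new services, while everything else still falls through to the mainframe APIs, so the legacy system shrinks feature by feature. A minimal sketch, with all feature and function names hypothetical:

```python
# Hypothetical strangler-fig routing layer (illustrative names only):
# requests for migrated features go to new services; everything else
# falls through to the existing mainframe APIs.

MIGRATED_FEATURES = {"deed-status"}  # features already moved off the mainframe

def call_new_service(feature: str, request: dict) -> str:
    return f"new-service handled {feature}"

def call_mainframe_api(feature: str, request: dict) -> str:
    return f"mainframe handled {feature}"

def handle(feature: str, request: dict) -> str:
    """Route to the new platform if the feature has been migrated."""
    if feature in MIGRATED_FEATURES:
        return call_new_service(feature, request)
    return call_mainframe_api(feature, request)
```

As each feature is migrated it is added to the routed set, until the mainframe receives no traffic and can be retired.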

Next Steps

You should follow the recommendations made in this report before arranging your next assessment.

This service now has permission to launch on a GOV.UK service domain. These instructions explain how to set up your *.service.gov.uk domain.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 30 July 2018