Biometric Residence Permits (BRPs) - Live Assessment

The report from the live assessment of the Home Office's Biometric Residence Permits on 21 June 2016.

Stage Live
Result Pass
Service provider Home Office - UKVI

The service met the Standard because:

  • The user needs, and how the service meets them, were clearly understood and described.
  • The team are using agile methods to iterate the service, intend to continue iterating it after going live, and have the budget available to do so.
  • The team has built the service on a technology stack that not only enables them to build and iterate the service rapidly, but has also allowed them to open source the forms library and get it reused in other services.

Detail of the assessment

User research

The team has done solid research to identify the needs of both external and internal users and was able to demonstrate how the outputs resulted in better design of the service.

The user research included people with different levels of accessibility needs, identified via third parties. The panel was impressed by the clips from usability sessions done with those users. The team also spoke about the work they have done with the call centre team responsible for assisted digital support, and their plans to share learnings from this with other UKVI services and the Home Office in general.

A plan for continuing to support and iterate the service was explained and will be put into action once the service is live.

Team

The small team will remain in place after going live, with additional support from Home Office Digital’s Service Optimisation team.

The team will retain full control of the service, and are continuing to work in an agile manner, transitioning from Scrum to Kanban as it suits their needs and continuing to iterate the system and release to live on an as-needed basis.

The team, while lacking a full-time interaction designer, has shown capability in all the necessary functions, and where capability is lacking has access to shared capabilities from across the Home Office. Importantly, as the service has a simple interaction flow but complexities around content, the team has chosen to invest in having a content designer as a core part of the team.

Technology

The BRP team have a well-exercised process for promoting code through environments, with supporting automated builds and test phases, allowing them to push critical updates promptly. The team has multiple members permitted to authorise security releases.

Business as usual updates are signed off by the service manager, without requiring input from outside the team.

The team exercise code in an automated manner during development, but validation of deployments into production is still a predominantly manual process.

The team also does not appear to have a dedicated set of performance tests, although some functional tests can be pressed into service to check for the more egregious regressions.

The team appear to rely on people manually checking the volume of emails (the service’s output) arriving in a mailbox as a coarse form of monitoring.

The team have released a framework supporting their application as open source code; more noteworthy are their continued efforts to support this code, promote it to other teams within the Home Office and in other departments, and work with those teams to improve it.

Due to the air-gapped nature of the caseworking systems behind the service, there is a reduced opportunity to validate some of the data entered; instead the service relies on human intelligence and, in some cases, contacting the customer to clarify or correct information.

At this time the team appear to have taken a business decision not to introduce automated lookup of address data, but they have figures indicating that address entry errors are rare.

Design

The service uses standard GOV.UK components and patterns. The team has shared patterns with the design community and other departments.

The team has researched writing for non-English speakers and has shared this work with the wider content design community.

Extensive Assisted Digital support, backed up with several methods of research, has been created with the UKVI contact centres. The approach is being shared with other Home Office teams.

The service is accessible and has been tested both heuristically and with user research.

Analytics

The team have a good grasp of the KPIs for their service, and are measuring them using the Performance Platform.

The team are working with some hypothesis-based development, taking the time to measure, build and learn from each change they make.

Recommendations

During live operation, the service team should:

  • Conduct regular design and content design reviews within the Home Office, especially if there is not a designer in the team day to day.
  • Do further work looking at how users move between the 5 transactions and how to measure completion.
  • Ensure additional members of the team - or perhaps peers from another team - are able to step in and authorise releases to relieve some of the reliance on a single person at all times, in particular delegating authority for holiday and sickness periods.
  • Make more use of the response time data they’re already collecting and bolster monitoring and metrics to give them finer and more timely feedback on potential issues post release.
  • Periodically review the decisions around data validation and monitor the level of problematic data and the impact it is having on the number of clarification and correction steps their caseworkers have to undertake in response.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 22 December 2016