Corporate report

EU Settlement Scheme Private Beta 2 review

Published 3 September 2020

  • Date of review: 16 / 17 October 2018

  • Stage: Alpha – for release 2 (private beta 2), only products in use in 2018

  • DDaT Book of Work ref number: Listed as “TBC” (IITP-44 for status checking)

Context

What we looked at

This service will enable EU nationals (and their family members) to show they are entitled to be in the UK as the UK leaves the EU.

The panel looked at the end-to-end service for private beta 2 (PB2), divided into various sessions with different teams. The first session looked at the overall service. Follow up sessions looked in more detail at specific sections of the service:

  • Find out – which relates to the communications plan
  • Apply – which covers a number of steps, including the content on gov.uk, the identity app and the online application form
  • Decide – which focuses on the caseworking system, enabling staff to make decisions on applications
  • Fulfil – which is about the letters and emails individuals get about their application
  • Stay and leave – which covers what happens after an individual gets confirmation of their status, including how they prove this and how they update their details

This report

This report is structured around the sessions the panel attended.

Overall, it was good to see the end-to-end service, and understand what the user journey will be in private beta 2. We also heard from some excellent teams, with committed staff who have achieved significant improvements to ensure the service works for as many users as possible first time. We also saw action on some of the risks we raised, particularly on the content board enabling the team to ensure consistency across the service.

The panel identified some risks. While these do not stop the service progressing, they are recorded below. Points listed as areas for improvement are ones the team must address and provide an update on at the next assessment.

The service and cross-cutting points

Overall, it was good to hear about the end-to-end service, as it will work for users in private beta 2, and to get a feel for how each element and stage will work together.

Areas of good practice - what the team has done well

  • The team is committed to making the service as simple and straightforward as possible for users. They recognise that there is a wide range of users, many of whom don’t know they will need to use the service
  • They have sought to make it easy for the vast majority of users while maintaining the integrity of the immigration system and allowing for edge cases to be dealt with by exception
  • It was good that the team had thought about things like naming – for example, not using “indefinite leave to remain” because, although legally accurate, it would cause confusion for users
  • The team had thought of options to manage demand. They wanted to strike a balance that didn’t over-complicate the system (recognising the impact this would have on users), but gave UKVI some control over numbers while testing the private beta
  • The team has correctly identified that we cannot charge users for the assisted digital support, although a paid-for option can be offered alongside this for those who want (not need) additional support. It will be interesting to see the findings on this, and the impact on the service
  • There is a content board in place to share findings across the different parts of the service and the areas of the Home Office. There were examples of where this is looking at cross-cutting issues (like translation) and understanding the service-wide impact of changes within one product (like changing the order of information in a letter)
  • The team are working with CSOC (the Cyber Security Operations Centre)
  • The team feel this project has provided the opportunity and stimulus to develop strategic capability for the Home Office, which will streamline other services in future

Areas for improvement - What the team must do before the next assessment

  • The team couldn’t tell us who the Information Asset Owner is for the service. If someone isn’t in place, one must be appointed urgently
  • The team must show that the service works for everyone. It was good to hear that the private beta will include some vulnerable groups of users, but the panel were concerned that assisted digital research seemed to be approached as a separate project. There are two particular actions on this:
    • Prioritise the planned research with those user groups who are at greater risk of exclusion – such as families, vulnerable people, those with disabilities, and those lower down the digital inclusion scale
    • After the session, we heard about separate research which had been commissioned with individuals who currently seek help to make applications. This sounds very useful. The details, including the findings, must be shared with the user researchers across the service, so the needs of these individuals are reflected across the whole service
  • There must be a clear process for knowledge transfer, supporting changes of team members and new supplier contracts, and a plan for handing over to teams with more civil servants if relevant
  • The team provided a short session on the technical approach to the service. However, the panel did not have time to go into the technical detail of each individual product. The panel therefore want to run a couple of specific sessions with technical and security colleagues to go into more detail in these areas. We did not spot any risks that are not included in this report, so we do not suggest postponing private beta 2 for this session, but we do want to hold it before the next assessment
  • Overall, the code has not yet been open sourced, although it has been written in a way that other parts of the Home Office can take on the code. We recognise that there are areas which cannot be opened (for example, policy before it is announced) but the code should be open wherever possible
  • There are some things that will not be explored in private beta 2 – different user groups and additional products that won’t be included until later. These need to be researched and tested before the service is fully available in March 2019

Areas to consider - What the team should consider to improve the service

  • Private beta 2 is heavily reliant on employers (universities and the health and social care sector) passing on communications to employees. The team have had feedback from employers involved in private beta 1 and are engaging with trade organisations. However, if the reliance on employers continues in future iterations, we would strongly encourage some wider research. For example, understanding the digital capability of employers and whether they have the structure to support these processes. We would suggest starting by reviewing the research already undertaken for the status checking right to work service, to understand how a wider range of employers engage with the immigration system
  • We think some low-cost analytics might help the team understand what proportion of users use their employer’s email address in the system, which could help shape communication plans (see the sketch after this list)
  • We recommend the master list of contingencies includes the technical risks identified in the session specifically around the Android app
  • The panel recommend re-instating the show and tells to share findings with the wider team, caseworking colleagues and the wider Home Office
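To illustrate the low-cost analytics suggestion above, here is a minimal sketch of how the proportion of applicants using an employer’s email address could be estimated from the addresses held in the system. The domain list, input format and function names are illustrative assumptions, not details of the team’s actual analytics.

```python
# Hypothetical sketch: estimate the proportion of applicants who registered
# with a non-personal (likely work) email address, based on the email domain.
# The personal-domain list and the input format are illustrative assumptions.
from collections import Counter

PERSONAL_DOMAINS = {"gmail.com", "outlook.com", "hotmail.com", "yahoo.com", "icloud.com"}

def domain_of(email: str) -> str:
    # Take everything after the final "@" and normalise the case
    return email.rsplit("@", 1)[-1].lower()

def work_email_proportion(emails: list[str]) -> float:
    counts = Counter(
        "personal" if domain_of(e) in PERSONAL_DOMAINS else "work_or_other"
        for e in emails
    )
    total = sum(counts.values())
    return counts["work_or_other"] / total if total else 0.0

if __name__ == "__main__":
    sample = ["a.b@universityofexample.ac.uk", "c.d@gmail.com", "e.f@employer.example.uk"]
    print(f"{work_email_proportion(sample):.0%} used a non-personal (likely work) address")
```

A simple count like this could be run periodically on anonymised data, without adding any burden on employers or applicants.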

Find Out

This section is about the communications plan, which at this stage is about employers (and the Local Authorities and organisations participating in private beta 2). This isn’t a digital / technology area, but it was really helpful to see such a core part of the service.

Areas of good practice - what the team has done well

  • The team has learnt lessons from private beta 1 – there were examples both of changing wording based on feedback from users and employers about tone (wanting facts rather than reassurance), and of changing what materials were provided so employers could use them in different ways
  • There is a clear commitment to keeping on learning from feedback throughout private beta 2, feeding into both the detailed guidance and the main marketing campaign. Again, that is both the detail and tone of communications
  • There was user testing on the gov.uk mainstream guidance in September, and this is currently being updated by GDS
  • The main marketing campaign is planned for next year, specifically allowing for this to take account of feedback from the private betas
  • There is a content board in place to share findings across the different parts of the service and the areas of the Home Office. There were examples of where this is looking at cross-cutting issues (like translation) and understanding the service-wide impact of changes within one product (like changing the order of information in a letter)
  • Teams can also escalate cross-cutting issues to relevant groups. These groups meet at different frequencies, depending on the status of the work and therefore the number and urgency of questions
  • The team has responded well to the challenge that their users want certainty around political decisions which they cannot give. They have also recognised the impact media reports have on user expectations. The team has invited the media to see the application process, to help ensure that it is reported accurately

Areas to consider - What the team should consider to improve the service

The team said that they are working with employers, not individuals, on these communications. We would encourage more direct engagement and research with individual users, as it could really help with future communications and potentially understanding the devices and support users have access to. We recognise that this would need to be proportionate, as going through employers is only relevant for the private beta. However, we would recommend some low-cost actions – like reviewing whether users request a link to the application from a work email (the original invite will have been sent to a work email, so this will show whether users forward it on), and using analytics to see which device they use when they first land on the start page (this should be set up in the analytics package anyway).

There has been no end-to-end testing on the communications, although it is a common theme that users will skip the information and go straight to the application. We would strongly encourage the team to see if there are ways to do this end-to-end testing. For example, it may be that one of the employers already has analytics in place that measure how many of their staff click from a news story, to a more detailed communications piece, through to a link to request access to the new system (or to book an appointment etc). We do not suggest that the team put any additional burden on employers by asking them to set up these analytics – only that, if an employer has them set up already, it would be helpful to see them.

Apply

In this session, we heard about what a user will see from arriving on gov.uk through to submitting their details online. There are a number of products in this stage, such as the identity app and online application form.

Areas of good practice - what the team has done well

  • The team has demonstrated significant improvements to how this part of the service works for users, reducing both support costs and the number of people dropping out of the system
  • The panel heard from a great team on this, covering a range of different roles
  • The team has had to overcome a number of constraints – not only the evolving policy, but also the technical choice pre-dating any research or design, earlier barriers to recruiting a representative range of users, and having very few precedents on developing apps as they are not normally the best solution. However, the team has tackled these and overall have done a great job in developing this part of the service
  • The team has carried out many research sessions, both targeted on specific parts of the journey and looking at the wider journey in this part of the service. They have tried to get a mixture of users, and plotted them against the Digital Inclusion scale
  • They have also tested the app with users with different access needs. This raised a number of usability issues and led to improvements (like introduction pages to help users plan assistance; animation to support those with dyslexia or language needs; and having instructions for taking selfies at the top of the screen so everyone looks at the camera)
  • In addition to the testing, the app has had two accessibility reviews, and the team is just waiting for the report from the second. The supplier and team have been very receptive to recommendations so that the remaining issues are mostly maximising access rather than merely enabling it. An accessibility audit of the web application form is delayed but is in hand
  • It was really good to hear that the team is booking in sessions with the Salvation Army, to see users who may need more support to go online, and we look forward to seeing the findings from these sessions and how they feed into the iteration of the service
  • The team showed clear iterations of the application process, and have more iterations planned. It was impressive to see the extent of some of the changes, especially working with the supplier for the app, and how these had been tested with users to ensure the changes had a positive impact. They had also used terminology drawn from common search terms, to make both the mainstream guidance and the app more findable and reassuring for users
  • There is also a clear backlog of hypotheses and areas to test during private beta. This includes areas with particular challenges within the service, like how to explain some of the policy – such as different types of immigration status, continuous residence, appropriate documentation, family groups, and pre-settled status – and what this means for individual users. The team are working to be clear and minimise anxiety for the individual users
  • There is a clear content map, showing key questions about the service and in which product the answers will be found
  • The team have thought about the links they send in emails, which point to a generic high-level page
  • It was great to hear from the performance analyst. We recognise that he had only been on the team for a few days, but it was great to hear about how analytic collection is embedded in the user journey for the application process, and that this data will be used to iterate the service
  • The team will measure the 4 mandatory KPIs, as well as secondary metrics – and will review these as they gather data from the private beta, to ensure they are appropriate. These could be segmented by demographics and other attributes in future
  • The team are in touch with CSOC (the Cyber Security Operations Centre)
  • The team have successfully pushed back on initial suggestions, like asking users to submit evidence of other nationalities, using user research to make the case

Specifically on the app:

  • This is one of the few scenarios where an app is the right way of delivering a solution – here the identity verification solution would not work without one. As it is so unusual, there are very few precedents for the team to follow. Despite this, the team have delivered an app which follows gov.uk patterns where possible, and has gone through significant iterations to make it easier for users
  • The team have overcome real issues with the selfie process, completely changing the interface to help users. We previously discussed taking the photo automatically, rather than the user having to do this manually. This also came up in the accessibility audit and has been added to the backlog
  • The Home Office owns the screens in the app, so could transition these to a new supplier if needed at the end of the current contract

Areas for improvement - What the team must do before the next assessment

The team has identified that they haven’t seen a fully representative sample of users to date but have planned further research to address this. They also have a series of hypotheses to test. Being able to identify gaps and plan to fill them is a good thing, and the panel endorse the team’s view that these are really important in the private beta. This includes targeting those lower down the digital inclusion scale to ensure they can use the service in as many cases as possible, and re-balancing the sample, as 50% of participants so far have been civil servants (who will have different awareness / expectations of the service).

The panel saw the software which will queue applications in the case of very high volumes. We recognise the need for queuing functionality but the chosen solution must be tested with users, including accessibility testing. There is a risk users won’t understand the queuing process and users with access needs won’t be able to use the queuing system. In both cases this could push users to offline support. User research and accessibility testing will identify how big this risk is.

Areas to consider - What the team should consider to improve the service

Across Apply

  • We understand why the Home Office want to be able to restrict access to the application process during private beta 2. In private beta 2 this is a manual process. The panel recommend the team explore if there are existing processes to control access to private betas that can be cheaply reused
  • The programme team are working with HMRC (and DWP). They should check there is a specific feedback loop, in case members of the public do call HMRC (or DWP) with queries. This includes being able to update the guidance for users on the settlement scheme in response to those calls if relevant
  • The team should also finalise what happens if the service is unavailable. They have considered this and put some plans in place. For example, they will put a note on gov.uk – but they could also explore using Notify to tell users when the service is back up, so they can continue their application (a hedged sketch follows this list)
  • The team are considering the Performance Platform. We would strongly encourage this. They should talk to the Performance Platform team now to ensure they understand the options and timing around publishing the data from public beta 2 and beyond
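To illustrate the Notify suggestion above, here is a minimal sketch using the GOV.UK Notify Python client (notifications-python-client) to email users who asked to be told when the service is back up. The API key, template ID and recipient list are placeholders; the team’s actual templates and integration may differ.

```python
# Hypothetical sketch: tell users the service is available again, using the
# GOV.UK Notify Python client. NOTIFY_API_KEY, the template ID and the
# recipient list are placeholders, not details of the team's actual setup.
import os
from notifications_python_client.notifications import NotificationsAPIClient

NOTIFY_API_KEY = os.environ["NOTIFY_API_KEY"]  # assumed to be set in the environment
SERVICE_RESTORED_TEMPLATE_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

def tell_users_service_is_back(email_addresses: list[str]) -> None:
    client = NotificationsAPIClient(NOTIFY_API_KEY)
    for address in email_addresses:
        # Each email uses a pre-agreed Notify template containing a link back
        # to the application, so users can pick up where they left off.
        client.send_email_notification(
            email_address=address,
            template_id=SERVICE_RESTORED_TEMPLATE_ID,
            personalisation={"service_name": "EU Settlement Scheme application"},
        )
```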

The “EU Exit ID Document App” and web application

  • The screenshots of the web application include “secret questions” or security questions. It is not clear when these appear to users, how they are used, or how an individual user can update them if their answer changes. We would suggest reviewing these against the NCSC guidance, and making sure there is a clear update process
  • We would recommend talking to Registered Traveller, to understand how they’ve asked users about criminal convictions from overseas, as they had some similar issues to the ones the team outlined here
  • The team should also look at the document upload journey. This is not just about the technical limitations (and ability of users to use it), but also to ensure that users only upload documents they need and do not spend effort getting and uploading documents that UKVI do not want to see

Decide

During this session we heard about the case-working system. This is a COTS (Commercial Off the Shelf) product.

We briefly mentioned the administrative review system, but we didn’t focus on this because it is an existing system and not changing significantly from the one that is currently in use.

Areas of good practice - what the team has done well

  • It was good to see a user researcher involved with researching and testing the system
  • There is currently a plan for the system to go through an accessibility audit
  • GOV.UK patterns have been considered where possible
  • The system prevents case ownership, which means no one person is responsible for all the checks for a single case, preventing bottlenecks if people are on leave
  • The team have set up a model office to check whether the system meets the basic requirements of its users. The panel would welcome an opportunity to visit the model office and see how it is being used
  • The COTS product does not include a feature requested by users (staff) to allow support cases to be put on hold or reopened. However, the team have worked with the business to mitigate this for the MVP (minimum viable product) and have a plan in place for enabling the functionality soon
  • Caseworkers can generate letters to applicants within the system, reducing the need for them to switch to Outlook to complete a case
  • The COTS product provides clear and detailed audit functionality
  • Updates to each technical component can be rolled out multiple times a day independently of each other

Areas for improvement - What the team must do before the next assessment

The panel felt that this stage of the service contained the two biggest risks:

  • It is not yet clear that the caseworking system can be used easily and effectively by staff. The team have mitigated some of this risk by setting up a model office – this will help, but must sit alongside:
    • reviewing and prioritising the user research findings from research so far. This does not mean implementing them all immediately, but understanding the costs and impact of changes (impacts of some of the needs may be low while few staff use the system, but higher as more staff are brought into the team)
    • carrying out further research with actual users on the actual system. The limited user testing so far has been heavily restricted due to environment stability issues, resulting in reliance on static screens. All user types need to be researched, including the supervisors. The panel recognise the mitigation steps undertaken by the team, for example presentations and UAT
    • ensuring that the system is accessible, so staff with access needs will be able to use it, including on older devices if needed. The access needs team has been unable to get full access to the system so far to assess the size of this risk. However, they have identified 15 high priority issues on the homepage alone, relating to all types of disabilities (these have been raised with the team, but not corrected yet)
  • The panel recommend the service plans to adopt a strategic evidence upload capability as soon as possible

Areas to consider - What the team should consider to improve the service

  • Users (staff) can send emails to applicants to ask for more evidence to support their application. We recommend the team reviews the feature enabling staff to email applicants, and at the very least provide staff with guidance to ensure emails are consistent
  • The caseworking system works on the browsers installed on new devices, which are currently being rolled out to staff. The team were aware that this may cause issues, especially for staff with assistive software, who will not be in the initial roll-out. The team must understand the potential impact of this, especially on staff with access needs
  • The team should also consider how a visually impaired caseworker (3 have been identified) could perform tasks including reviewing (scanned) documents, with tasks requiring photograph comparison either set aside or supported with assistance
  • The team has indicated that the system prioritises cases based on oldest case first, with some caveats, such as prioritising cases that have new information added to them. There is no SLA (service level agreement) for decisions, so there is a risk that some cases may languish while others are processed more quickly. We would recommend monitoring this in private beta, to understand if this happens in practice
  • The team should arrange for a design review of the system by members of the interaction and content design team. They will give detailed feedback to help make sure that information and actions are presented as clearly as possible to users (staff), so they are clear on what they need to do without needing to refer to guidance
  • The panel are concerned there may be risks around ingesting data into the system. This should be monitored during private beta 2 to see if it happens in practice

Fulfil

During this session we discussed the letters that are being sent to applicants as part of the process.

Areas of good practice - what the team has done well

  • It was good to see the decision letters and request emails: the team have taken the content from policy, had it reviewed by a content designer and tested it with a user researcher. We could see the progression of making them more easily understandable. This included changing both the wording and where it appears on the page
  • The team have had to work around some wording that is required by the EU Commission. They have clearly worked hard to make it as clear and concise as possible for the applicant. User research has shown that despite their length they work well and do not contain unnecessary information

Areas to consider - What the team should consider to improve the service

  • The team have been unable to identify an existing owner for the letters. While this has enabled the team to design, research and iterate the letters during development, it is important that a suitable business owner is identified going forward
  • The (PDF) letters must be included in the accessibility testing going forward. The panel understands that this is currently being done
  • Applicants can opt out of using email. However, the caseworking system will always send emails. We would therefore recommend both being clear to applicants how they will get communications, and reviewing whether an email opt-out is appropriate (where there is no inclusion need)
  • The team should expand their research to include the scenarios when settlement status is not granted. The panel understands that this hasn’t yet been done for ethical reasons but, as this is a scenario that could cause applicants great concern, it needs to be appropriately researched

Stay / Leave

In this section, we discussed what happens after an applicant has been given confirmation that they have settled or pre-settled status. That includes how they view and prove their right to be in the UK, and how they update their details.

Areas of good practice - what the team has done well

  • The panel were very impressed with the team’s presentation, which covered a range of different roles
  • The team has carried out many research sessions, both targeted on specific parts of the journey and looking at the wider journey in this part of the service. They have tried to get a mixture of users, including those with access needs and assisted digital needs and plotted them against the Digital Inclusion scale
  • There was a clear cycle of research, design and testing. The team has shown evidence of iteration based on testing as demonstrated by screens like the start page and profile variations
  • 2FA (two factor authentication) has tested well over both SMS and email. It was great to hear how the team aim to make a service that balances the needs of security with accessibility. Wording such as ‘single use’ was found to be important in differentiating the code from a static password (see the sketch after this list)
  • The team have made the most of opportunities to reuse pre-existing aspects, such as GOV.UK Notify and Home Office Forms
  • There is really good use of analytics on the service, from both a user and system behaviour perspective
  • Clear thought has been given to the longer term strategic aims of the different functionality and systems we saw here. This is evident in the architecture and the front-end designs. For example, view and prove your status isn’t just for those applying under the EU settlement scheme, and the wording users see is clear for everyone
  • The team acknowledges that there is a difficulty in getting users used to the fact that they won’t be sent a physical card stating their status but will instead receive a digital permission. They are clearly still putting a lot of thought into how they can continue to iterate the designs of the service to reassure users
  • Assisted digital routes have been planned out and the team plans on testing these routes once live to ensure usability. The panel would have liked to have heard more about this, but the time constraints made it difficult
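To illustrate the single-use code point above, here is a minimal sketch of one common way such a code could be generated and verified before being sent by SMS or email. The 6-digit length, 10-minute expiry and in-memory storage are illustrative assumptions, not the team’s actual implementation.

```python
# Hypothetical sketch: generate and verify a short-lived, single-use 2FA code.
# The 6-digit length and 10-minute validity window are illustrative assumptions.
import secrets
import time

CODE_LIFETIME_SECONDS = 10 * 60  # assumed 10-minute validity window

def generate_code() -> dict:
    """Create a single-use code with an expiry time, to be sent by SMS or email."""
    return {
        "code": f"{secrets.randbelow(1_000_000):06d}",  # cryptographically random 6 digits
        "expires_at": time.time() + CODE_LIFETIME_SECONDS,
        "used": False,
    }

def verify_code(stored: dict, submitted: str) -> bool:
    """Accept the code only once, and only before it expires."""
    if stored["used"] or time.time() > stored["expires_at"]:
        return False
    if not secrets.compare_digest(stored["code"], submitted.strip()):
        return False
    stored["used"] = True  # mark the code as spent so it cannot be replayed
    return True
```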

Areas to consider - What the team should consider to improve the service

  • The assisted digital journey needs to be carefully managed. Users must be effectively passed from the EU Resolution call centre to We Are Digital, Migrant Help or a local library based on their needs. We would like to see the findings from this (including any analytics) at the beta assessment