Appeal an asylum decision alpha assessment report

Appeal an Asylum Decision

From: Central Digital and Data Office
Assessment date: 1 October 2019
Stage: Alpha
Result: Met
Service provider: Her Majesty’s Courts and Tribunals Service

Service description

The “Appeal a Decision” service is designed to replace the existing service and process for appellants appealing Home Office decisions for visa or residency applications. Our service forms part of the First-tier Tribunal Immigration & Asylum Chamber (IAC), which is being transformed as part of the wider HMCTS Reform programme.

Service users

Appellants in person

Appellants in person are people appealing against a Home Office decision, without qualified legal representation. They often receive help from ‘informal supporters’ who are charity workers, friends and peers who sit by their side to help them through their appeal.

Tribunal Case Workers

Case workers who, with the approval of the Senior President of Tribunals, carry out functions of a judicial nature on behalf of the judges in order to move a case towards an outcome.

HMCTS Service Centre Administrators

Users performing administrative tasks to ensure the smooth facilitation of the appeal service cycle.

Judges

Judiciary users who will (if necessary) oversee a case’s hearing and subsequently record their determination and decision in our new system.

Home Office

Home Office users will be sent information on the appeal, will be able to submit evidence, and will decide whether to withdraw their earlier decision (based on the strength of the evidence submitted by the appellant) or request a formal hearing of the case.

1. Understand user needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team identified and engaged a broad cross section of different types of appellant and professional users
  • user needs have been well articulated, and the team has integrated multiple user perspectives into their resulting service design
  • in particular, the team has been effective at engaging a hard-to-recruit group of appellants, with a well-thought-out strategy of engaging the charity sector to provide referrals
  • the user research has been effective at identifying pain points in the current journey, and there are clear examples of how this has been used to shape the design of the new service, which appears to have substantially reduced barriers to users accessing the service
  • the research has been professionally structured, with clear evidence of hypotheses shaping the research and a clear focus on identifying and prioritising the knottiest problems to solve in the user journey
  • the research has also been well managed (eg attention to ethical issues arising from researching a vulnerable user group)

What the team needs to explore

Before their next assessment, the team needs to:

  • broaden the range of appellant segments included in the research. While the user engagement has been sound for the discovery and alpha work to date, there is a need to test out whether the needs of other user segments are the same as those already included in the research - which they may not be. This poses a risk, and means it will be important to explore other user groups as the project moves to private beta. Specific user segments which need to be researched to identify user needs and journeys include:

  • people appealing for reasons other than asylum and human rights
  • high and medium capability appellants, and low capability appellants currently not engaged with charities
  • out of country appellants
  • people in need of digital assistance and with limited English (including people accessing the service via mobile)

  • in particular, there is a need to recruit appellants who are not engaged with support charities (ie to use different recruitment channels) as their needs and levels of support may well be different
  • carry out private beta research with users who switch between represented and unrepresented appeals as they progress through their journey
  • explore further how the structure of the process affects users. For example, do users feel comfortable about not sharing their reason for appealing in the first interaction? Do the time limits for responding to directions affect their ability to get support with the case?

2. Do ongoing user research

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • there appears to be a mature recognition of the limitations of the research conducted to date, and the need to drill down further into needs of different segments as they move into the next stage of development
  • there appears to be well established involvement of the whole team in determining the hypotheses to be tested at the next stage of research
  • there is a forward plan for research in private beta, with attention to areas where the need for further insight has been identified (such as use of the language support option and assisted digital users, including how these interface with the call centre)
  • the team has done effective work with a group of Tribunal Case Workers (a “tea party”) to identify the questions they ask appellants, which is being used to frame questions that can be posed and tested with appellants as they develop their service prototype

What the team needs to explore

Before their next assessment, the team needs to:

  • develop the research programme to address the points identified in Point 1 above

3. Have a multidisciplinary team

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the service had a well-formed multidisciplinary team with all the relevant roles working effectively together
  • the team worked effectively with others working on related products within the wider service, with lessons from the legal representative journey also informing the appellant in person service
  • the team worked in close proximity to, and with regular input from, key roles, including Tribunal Case Workers and 3 judges

What the team needs to explore

Before their next assessment, the team needs to:

  • plan for the transfer of responsibility for this service to DTS (Digital and Technology Services) around September 2020. It is essential that a fully formed team with all the relevant roles (including, but not limited to, a product manager, developers, service design, user research and content design) is in place well in advance of that transition point so that there is effective knowledge transfer between the two teams

4. Use agile methods

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated effective use of the Scrum methodology, aided with appropriate ceremonies and tools

What the team needs to explore

Before their next assessment, the team needs to:

  • consider the purpose of, and approach to, governance. Governance needs to be proportionate, enable the creation of a quality service and remove blockers, especially around dependencies on shared components

5. Iterate and improve frequently

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have tested and iterated not just the interface but the interactions between appellants and tribunal caseworkers and understand the effect the design is having on user behaviour
  • the team have iterated the journey based on their user research and worked with the judiciary to establish where pushing back on the use of legal language is possible and where it is not
  • the team have considered several different options for key parts of the user journey

What the team needs to explore

Before their next assessment, the team needs to:

  • be geared up to continue iterating the service during public beta and beyond. There must be an appropriately sized team of the right shape in place after the beta assessment to enable continuous improvement of the service for all users, which will realise further benefits for HMCTS, the Home Office and service users

6. Evaluate tools and systems

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team decided to use a modern yet mature stack (Node.js/Express, TypeScript) for building the frontend applications (a brief illustrative sketch follows this list)
  • the technology choices are open source, mature and common within the programme and across government
  • the team adopted open source tools (Terraform) for provisioning and managing the infrastructure
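
To make the shape of that stack concrete, a minimal sketch of a TypeScript/Express frontend entry point is shown below. This is illustrative only and assumes nothing about the team’s actual codebase; the route name and port handling are placeholders.

```typescript
// Minimal illustrative sketch of a TypeScript/Express frontend entry point.
// The route and port handling are placeholders, not taken from the service's code.
import express, { Request, Response } from 'express';

const app = express();

// Hypothetical start page route for the appeal journey.
app.get('/start-appeal', (req: Request, res: Response) => {
  res.send('Appeal a decision - start page');
});

const port = Number(process.env.PORT) || 3000;
app.listen(port, () => {
  console.log(`Frontend listening on port ${port}`);
});
```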

7. Understand security and privacy issues

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • all types of users – case workers, Home Office users and appellants – are provided with individual user accounts with OAuth2-based access control (a brief illustrative sketch of this pattern follows this list)
  • continual monitoring of security vulnerabilities within the code base and dependencies is done through the continuous deployment pipeline, which is run for every feature change introduced (this includes checks against Common Vulnerabilities and Exposures (CVEs), which highlight exposure of sensitive data)
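
The panel did not review the team’s authentication code; the sketch below simply illustrates the general pattern of OAuth2 bearer-token checking described above. The requireAuth middleware and the validateAccessToken helper are hypothetical placeholders, not part of the service.

```typescript
// Illustrative sketch of OAuth2 bearer-token checking in an Express middleware.
// validateAccessToken is a hypothetical placeholder: a real implementation would
// verify the token with the OAuth2 provider (for example via token introspection).
import { Request, Response, NextFunction } from 'express';

async function validateAccessToken(token: string): Promise<boolean> {
  return token.length > 0; // placeholder check only
}

export async function requireAuth(req: Request, res: Response, next: NextFunction) {
  const header = req.headers.authorization ?? '';
  const token = header.startsWith('Bearer ') ? header.slice('Bearer '.length) : '';

  if (token && (await validateAccessToken(token))) {
    return next();
  }
  res.status(401).send('Authentication required');
}
```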

What the team needs to explore

Before their next assessment, the team needs to:

  • show they’ve considered whether multi-factor authentication is necessary for this service and whether it would work for this user group

8. Make all new source code open

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

9. Use open standards and common platforms

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team adopted open source tools (Terraform) for provisioning and managing the infrastructure
  • the team uses common user interface patterns from the HMCTS Design System (which extends the GOV.UK Design System)
  • appeal progress notifications are sent using GOV.UK Notify
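
For context, sending an appeal progress notification through GOV.UK Notify from a Node.js service typically uses the notifications-node-client library. The sketch below is an assumption about how such a call might look; the template ID, reference, personalisation fields and function name are placeholders, not values from this service.

```typescript
// Illustrative sketch of sending an email via GOV.UK Notify.
// The template ID, reference and personalisation values are placeholders.
import { NotifyClient } from 'notifications-node-client';

const notifyClient = new NotifyClient(process.env.NOTIFY_API_KEY ?? '');

async function sendAppealUpdate(emailAddress: string, appealReference: string) {
  await notifyClient.sendEmail(
    'TEMPLATE-ID-PLACEHOLDER', // a template configured in GOV.UK Notify
    emailAddress,
    {
      personalisation: { appealReference },
      reference: appealReference,
    }
  );
}
```

If type declarations for the client are not available in the project, a local declaration file may be needed; the library’s sendEmail call takes a template ID, an email address and an options object with personalisation and reference fields.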

What the team needs to explore

Before their next assessment, the team needs to:

  • the team should explore using GOV.UK Verify for identity assurance
  • as the backend system is shared and built using common components, the team should explore the idea of building the new frontend interfaces on top of the existing ‘represented’ application in order to avoid a premature disjunction and ensure the journeys are kept in sync regardless of the type of user (unrepresented or represented). This could also facilitate the switch from ‘unrepresented’ to ‘represented’ journey and back within one appeal

10. Test the end-to-end service

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • end-to-end tests are run across a set of consistent, immutable environments
  • end-to-end tests run as part of the continuous deployment pipeline, which is triggered for every feature change introduced (a brief illustrative sketch follows this list)
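
The report does not include the team’s test code; as a rough illustration of the kind of automated check such a pipeline might run against a deployed environment, a minimal smoke test is sketched below. The health endpoint and environment variable are assumptions.

```typescript
// Minimal post-deployment smoke check; the base URL and /health endpoint are placeholders.
// Requires Node.js 18+ for the built-in fetch API.
const baseUrl = process.env.TEST_BASE_URL ?? 'https://environment.example';

async function smokeTest(): Promise<void> {
  const response = await fetch(`${baseUrl}/health`);
  if (!response.ok) {
    throw new Error(`Health check failed with status ${response.status}`);
  }
  console.log('Smoke test passed');
}

smokeTest().catch((error) => {
  console.error(error);
  process.exit(1);
});
```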

11. Make a plan for being offline

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • in case of unavailability the service will provide appropriate messages to help users get in touch and use the alternative, offline service

12. Make sure users succeed first time

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have worked hard to simplify the journey even further than the legal representative journey and have reduced the amount of information requested
  • the panel was pleased to see the team working effectively with Home Office and the Judiciary to improve the whole user journey and influence policy where needed
  • the use of the Tribunal Case Worker role as an enabler is impressive to see; it has enabled service transformation that goes well beyond the digital touchpoint. The team should continue to focus on this interaction and how to embed this change as the service scales
  • the panel was pleased to see the mature thinking about the assisted digital support that will be available. The team will need to demonstrate how this will work in more detail at a beta assessment, especially language support, which will be needed by a significant proportion of users

What the team needs to explore

Before their next assessment, the team needs to:

  • it is not completely convincing that the ‘represented’ professional user facing and ‘unrepresented’ appellant in person services should be separate journeys, particularly because the team expects that some users will go from unrepresented to represented and back within one appeal. The panel feel that addressing this early on will save time and complexity later, particularly when an appellant view is added to the legal representative service. It is good that the back end systems are built to support switching between journeys. The team will need to demonstrate an understanding of how this interchangeability would work within the user journey at a beta assessment
  • the team should test the journeys into the service, both from the beginning and when users return after receiving a notification. They should try to get a realistic view of how these return journeys work for their users who use public computers or have limited access to a phone or email. This will need to include account creation and signing in within the context of a shared public device
  • the team should make sure they have a feedback loop to understand whether people are getting the face to face case management appointments when they need them. It’s worth testing the assumption that this isn’t something that could be offered to all users. Perhaps users could have the option to request this and the Tribunal Case Worker the discretion to decide? The idea of a flag from the Home Office for vulnerable people is also promising and should be explored further

13. Make the user experience consistent with GOV.UK

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are using both the GOV.UK Design System and HMCTS design patterns, and are using the HMCTS communities of practice to share and learn from others

What the team needs to explore

Before their next assessment, the team needs to:

  • establish through user research that the progress indicator pattern works for users on mobile, or show how they have iterated it based on that research. There’s a risk the pattern breaks or obscures the call to action, which would impact users who are checking for updates

14. Encourage everyone to use the digital service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had completed extensive assisted digital research and considered a wide range of support options
  • a digital take-up strategy was in place via the Channel Mix and Optimisation team

15. Collect performance data

Decision

The service met point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has established baselines for key metrics, including cost of transaction, clearance time and percentage of adjournments

What the team needs to explore

Before their next assessment, the team needs to:

  • establish baselines for other key metrics once applicable (for example, digital take-up)

16. Identify performance indicators

Decision

The service met point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have explored well beyond the standard KPIs and have done exemplary work creating a logic model, scoping out links from activities and outputs to short-term outcomes and long-term impacts
  • there was a clear link between metrics and the stated service vision

17. Report performance data on the Performance Platform

Decision

The service met point 17 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is considering plans to report appropriate performance data, including those driven by targets set by the No.10 Immigration Task Force

What the team needs to explore

Before their next assessment, the team needs to:

  • arrive at a decision on how KPIs will be published on GOV.UK alongside related services
  • engage with the GDS Performance Platform team to ensure that the service is geared up to publish KPIs when going into Public Beta

18. Test with the minister

Met - Does not apply for Alpha


Published 4 February 2020