Social Security and Child Support Tribunal - Alpha Assessment

The report from the alpha assessment for HMCTS's Social Security and Child Support Tribunal service on 29 November 2016.

Assessment stage: Alpha
Assessment result: Met
Service provider: HM Courts and Tribunals Service

The service met the Standard because:

For the alpha, they have broken down a complex service and focused on the part of the customer journey that causes significant user frustration. The team demonstrated a very good understanding of the problems that users encounter with the current service and outlined their vision and roadmap to build an end-to-end digital service.

Good examples of technology were given; however, the code is not open from the outset. As there is a plan and an open source policy is referred to, this will not currently block the project. However, the service should not be submitted for beta assessment until the team is confident they can meet point 8 of the Standard.

About the service

Service Manager: Helen Smith

Digital Leader: Matthew Coats

Anyone has the right to appeal against a benefits decision they don’t agree with. HMCTS provide an independent appeal service called the Social Security and Child Support Tribunal. It is used by appellants and their representatives. The Social Security and Child Support Tribunal (SSCS) is part of the Social Entitlement Chamber within the unified Tribunals structure. It is the largest tribunal jurisdiction within HMCTS, receiving in excess of 200,000 appeals in 2015/16. Appellants have the right to choose an oral appeal hearing and do so in around 90% of appeals, requiring a network of permanent and casual hearing venues across Great Britain. The Tribunal is designed to be more informal than the courts, taking a more inquisitorial role, and most appellants represent themselves. The service is currently fully paper based.

Detail of the assessment

Lead Assessor: Jeremy Gould

User needs

The team has developed a strong understanding of the users of their service, through a range of research activities, and with a wide variety of users. They have documented that understanding in a solid set of personas and journey maps. This is impressive work given that the team was initially blocked and delayed in some of their research.

The team has used what they’ve learned to develop a prototype that addresses some of the most significant problems users currently experience, e.g. knowing where they are in the tribunal process. The team also has good evidence that their prototype works well for most users.

The variety of users included in their research has given the team a good understanding of support and access needs.

The team has a good plan for continuing user research into private beta. The team will need to actively recruit less confident users who will be reluctant to take part in any beta. Without this, it will be hard for the team to test their support model.

It was particularly impressive to see how the team are doing the hard work to change policies and practices that cause problems and barriers for users.

Team

The team has been formed well with a strong emphasis on collaborative working across the functions. The Service Owner and team members at the assessment demonstrated a good understanding of the broader strategic context, the organisational imperative for change and the user needs driving the development of a better service.

There is a concern that all the technology specialists in the team are contractors and that there are very few civil servants on the team. Technology decisions are overseen by an Architectural board, which provides some continuity and ownership of decision making in the organisation. However, this board should not mean that the team is disempowered from making choices, as they are closest to the project and to the user need.

Technology

The team has scoped out an alpha section of the application with good reasoning around where the boundaries should be. The application was based on existing hosting solutions present in the department, and technology choices were made for sensible reasons.

As mentioned in the team section, there is a concern about the dependency on contractors, and also about the hierarchical nature of technology decision making in the department.

Additionally, the web operations function during the alpha was quite separate from the product team. This has led to a model of raising support tickets for changes to infrastructure and so on. This looks to have been addressed for the beta, as the team are migrating to a new platform and have a new operations model in place. This new model allows operations staff to be dedicated to the team and should alleviate the concerns seen in the alpha.

The team demonstrated a sound understanding of data privacy and are building the application to minimise the personal data held within it. They are aiming for a single long-lasting link to access the data, with no credentials. This is leading them to carefully consider what data to hold as well as which users can access the system. They are aware that this model may not hold as the project progresses and that it will need to be assessed regularly. The team did show evidence of close relationships with Information Assurance and Data Privacy specialists within HMCTS, which will help with these decisions. The panel recommends regular reviews as the beta commences to ensure data in this system cannot be used to perform spear phishing and other attacks on the users.
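The panel did not review implementation details of this access model, but a minimal sketch of a credential-free long-lasting link, assuming a hypothetical token store and helper names (the URL and function names below are illustrative, not the team’s), might look like this:

```python
import hashlib
import secrets
from typing import Dict, Optional

# Illustrative only: a credential-free "long-lasting link" model. The token in
# the URL is the only thing tying a request to an appeal, so it must be long,
# random, and stored server-side only as a hash.

BASE_URL = "https://track-appeal.example.gov.uk"  # placeholder, not the real service


def generate_appeal_link(appeal_id: str, token_store: Dict[str, str]) -> str:
    """Create a long-lasting tracking link for one appeal."""
    token = secrets.token_urlsafe(32)  # 256 bits of entropy, infeasible to guess
    token_store[hashlib.sha256(token.encode()).hexdigest()] = appeal_id
    return f"{BASE_URL}/progress/{token}"


def resolve_token(token: str, token_store: Dict[str, str]) -> Optional[str]:
    """Return the appeal a presented token refers to, or None if unknown."""
    return token_store.get(hashlib.sha256(token.encode()).hexdigest())
```

Because the link itself is the only secret, this approach only holds if the data reachable through it is minimised, which matches the team’s stated intent.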

The panel is disappointed to see that the code is not open from the outset. HMCTS are looking for examples of value derived from reuse before sanctioning the opening of the codebase. This runs counter to point 8 of the Standard. As an open source policy is referred to and there is a plan, this is not going to block the project. However, the panel expects that the code will be open by the beta assessment. If the project is brought back for a beta assessment without opening the code, it will not meet point 8 of the service standard and will not pass overall.

A key alpha outcome is to understand dependencies. This is critical for the application, as all data is held by an existing case management system. The team have demonstrated an understanding of the difficulties, and of the processes by which they are to engage with this system. However, this will still prove to be a challenge through the beta. The panel recommends prioritising this integration early. Additionally, the team must ensure that the suppliers are fully engaged with the agile process used by the development team.

The team showed good examples of the use of platforms and standards, utilising GOV.UK Notify, the prototyping kit and the frontend toolkit. Additionally, they had engaged with the Courts Finder project and are looking at ways of integrating and sharing data between the projects.
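GOV.UK Notify sends messages against templates set up in its dashboard. The panel did not review the team’s templates, but a sketch using Notify’s supported Python client, with a hypothetical API key, template ID and personalisation fields, would look like the following:

```python
from notifications_python_client.notifications import NotificationsAPIClient

# The API key, template ID and personalisation fields are placeholders,
# not the team's real values.
notifications_client = NotificationsAPIClient("api-key-from-notify-dashboard")

notifications_client.send_sms_notification(
    phone_number="+447700900123",  # Ofcom-reserved example number
    template_id="f33517ff-2a88-4f6e-b855-c550268ce08a",  # placeholder template
    personalisation={
        "next_step": "send us your evidence",
        "due_date": "4 January 2017",
    },
)
```

Keeping message content in dashboard-managed templates also makes it easier to iterate wording without code changes, which suits the notification research recommended below.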

Design

Scope and shared responsibility

The scope of the service addresses a situation which, ideally, a user should never have to go through. There will always be appeals against DWP’s benefits decisions, but the current rate of these, and the rate of reversals after tribunals, shows that there are improvements to be made to DWP processes. This is not directly the responsibility of this service team, but it is their responsibility to make sure relevant findings from research are fed back to DWP. The team showed understanding of this, and agreed it was important, but were reluctant to talk about any shared responsibility with DWP. The team should be clear that this is one user journey, and that responsibility for it is shared across government departments.

Push vs Pull

The service acts in 2 ‘modes’: push and pull. This showed the team had considered that the user’s situation involves a lot of anxiety, and that they could pre-empt it in some cases with ‘push’ notifications, while providing a ‘pull’ service to reduce anxiety at any other time it develops throughout the user’s journey. In particular, the content and timing of notifications had been well thought through. This is particularly impressive because, if they are successful, there shouldn’t be a need for the ‘pull’ tracking at all. During beta and live, it would be good to see how iterating on the frequency and content of the notifications could reduce people checking the web service, as well as reducing phone calls to the contact centre, as this could be an indicator of reduced anxiety.

Notifications

In general, the content of the notifications was well thought through. For beta, consider researching how users feel about the different channels and the content that comes through. For example, are users comfortable receiving text messages which mention they are on benefits?

Progress bar

The team have created a ‘progress bar’ element which is not an established government design pattern. The team showed good rationale and research behind creating this new element. However, status tracking is not a unique problem in government services, so it would be good to see more about how the team has learnt from other teams who’ve tackled this problem. There is a page on the design hackpad about tracking which the team should be referencing and contributing to, and it would also be good to see engagement with the cross-government design community on this pattern, either via the cross-government Slack channel or the ‘service-designers’ email list.

In particular, more focus needs to be placed on constrained situations, such as:

- Small screens. The progress bar will need to be vertical on smaller screens. How does this affect the information users get from the page?
- Screen reader users. It was noted that hidden text has been provided as a backup for screen readers. From the prototype, it seems that this means screen readers get clearer information than other users. For a beta assessment, the team will need to show a real need for the information to differ this much between screen reader users and everyone else.
- Users who can’t see low contrast between colours. The grey of the progress bar line does not meet minimum contrast guidelines against the white background (a quick check of the ratio is sketched below).
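For reference, the minimum contrast guidelines cited here come from WCAG: a ratio of at least 4.5:1 for normal text and 3:1 for large text, computed from relative luminance. A short, self-contained check follows; the grey value used is an example, not necessarily the one in the prototype:

```python
def relative_luminance(hex_colour: str) -> float:
    """Relative luminance of an sRGB colour, per the WCAG 2.0 definition."""
    def linearise(c: int) -> float:
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (int(hex_colour.lstrip("#")[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * linearise(r) + 0.7152 * linearise(g) + 0.0722 * linearise(b)


def contrast_ratio(foreground: str, background: str) -> float:
    """WCAG contrast ratio between two colours, from 1:1 up to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(foreground), relative_luminance(background)),
        reverse=True,
    )
    return (lighter + 0.05) / (darker + 0.05)


# A light grey line on a white background: prints roughly 1.8, well short of
# even the lowest 3:1 threshold, so a darker grey would be needed.
print(round(contrast_ratio("#bfc1c3", "#ffffff"), 2))
```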

In general, the panel will need to see more evidence on how people interpret the progress bar in beta. What information have users gained from looking at the page (in their words)? What questions do they still have after looking at it?

Particularly when a user is on the first step, the green turns to grey at the same time as the circle turns into a line, which could be confusing. Make sure this step in particular is clear to users. The visual design also has a strong association with a tube map, which may be confusing to some users.

Submit evidence

One of the needs for the service is to remind users that they need to submit evidence by a certain date before the tribunal. The team showed they knew that this is important, but the way this is communicated on the page needs to be clearer. The phrase ‘What you need to do’ could be put to good use here. See the hackpad link above for more on this.

Subscribe to notifications

The way that users subscribe to notifications was barely mentioned in the assessment. The team said that users will be prompted to subscribe when they get in touch, but a more active prompt when the user starts the journey is needed too. The team mentioned that they will start looking at that part of the process during beta, which will be needed to pass that assessment, but it would have been good to see some more work done on it during this phase as well.

Analytics

The team has demonstrated that they are thinking about future success measures, and have made contact with the Performance Platform team at GDS. They are working to understand how they might know that a beta or live service wasn’t working.

Recommendations

To pass the next assessment, the service team must:

- Open source the code as soon as possible.
- Actively recruit users with support and access needs to learn whether their service and support model works well for all users. Many service users will be reluctant to take part in the private beta, so a substantial amount of work is needed here.
- Fully test patterns which are not established on GOV.UK, such as the progress bar.
- Address the more complicated issues in the full, end-to-end user journey (electronic transfer of evidential documentation, electronic sharing with adjudicators, online appeal application, etc.) before they return for a beta service assessment.
- Ensure strong relationships with the supplier of the existing case management system and align the shared deliverables to minimise risk.
- Keep engaging with the assurance team to ensure data privacy concerns are mitigated.

The service team should also:

- Engage with the cross-government design community and published design assets to make sure any new elements are consistent with GOV.UK standards.
- Look to recruit permanent civil servants to roles in the team, especially technology specialists. There are currently very few civil servants on the team.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Not Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 10 February 2017