Juror Summons Beta Assessment

The report from the beta assessment for HM Courts and Tribunals Service's Reply to Juror Summons service on 18 January 2018.

Reply to Juror Summons

From: Government Digital Service
Assessment date: 18 January 2018
Stage: Beta
Result: Met
Service provider: HM Courts and Tribunals Service

The service met the Standard because:

  • The service team have continued to develop the service according to users' needs, tested it with even more users and achieved strong digital take-up and satisfaction levels
  • The service is built by a multidisciplinary team on a sound technology stack
  • The digital service has triggered business transformation of the legacy process, achieving faster transactions and reducing support demand

About the service

Description

The service enables users to respond online to their jury summons letter through a faster and clearer user experience, and will also introduce the option of a third-party response.

Overview of service users

Citizens in England and Wales aged 18 to 75 who have been summoned for jury service, and HMCTS staff at the Jury Central Summoning Bureau who will use the internal-facing interface.

Service users:

  • The users of the public facing service are summoned citizens and their representatives
  • The users of the internal back office service are the Jury Central Summoning Bureau’s administration staff

Detail

The Reply to Juror Summons digital service scope includes some transformation to the back office processes and a new web application to speed up the processing of replies.

Each year, the back office team processes on average between 300,000 and 350,000 paper summons replies and handles over 200,000 telephone calls. The current process for summoning jurors is heavily paper-based, with communication by post, phone and email.

The new digital route has a target uptake of 20%, but the team already see an uptake of 45%. To avoid becoming a victim of their own success, the team must consider a robust throttling mechanism for onboarding users in public beta, which will test the performance and scalability of the service. The team is very aware of this and will perform a staged rollout to other courts and regions, based mainly on how the back office teams divide their work. The panel was pleased to see provisions to automate the straightforward cases, which should reduce staff effort and help the service scale.

User needs

The team has a dedicated user researcher who demonstrated a good understanding of user needs. A significant amount of user research has been conducted, including a focus on accessibility. Full accessibility testing has been carried out with a considerable number of representative end users, including testing with screen readers.

The user researcher is well integrated within the team, providing early feedback on any significant issues. Research findings are iterated on rapidly, and the team gave several examples of where research has informed the design of the service.

User types have been mapped out on the digital inclusion scale, but there are no design personas. Two user groups are referred to: citizens and Bureau Officers. A web app has been built to enable Bureau Officers to process digital responses. This was considered in the scope of this assessment and it was noted that the team has also conducted research with this user group to ensure the app meets their user needs.

The team has a good awareness of the issues around a digital service for replying to a jury summons. For example, they have given a lot of thought to the format of questions and answers, and to making them clear and easy to understand, to avoid the risk of someone answering in a way that lets them evade jury service.

During private beta the majority of usability testing was carried out with users on the lower end of the digital inclusion scale and users with access needs (including work with screen readers). The emphasis on accessibility is commendable, since making a service easy to use for people with access needs also makes it easier for everyone. However, any future usability testing should also include citizens in general, including those higher on the digital inclusion scale, and testing on mobile devices.

The panel had some concerns about the usability of some question formats, listed below. These questions should be monitored during public beta, by checking error and drop-off rates, to ensure that most users are able to interpret and answer them. We also encourage the team to continue to explore and test alternative solutions to these:

  • Residency qualification question
  • Free text box for deferral requests
  • Deferral date picker

The team identified additional benefits from users responding via the digital channel. These include higher consumption of the juror video, which has resulted in more users being better prepared for jury service. This is a value-added benefit and should continue to be tracked via analytics.

It is not known whether the digital channel has reduced the number of citizens needing to call the contact centre, but this data would be valuable for identifying the benefits of the digital channel. It is recommended that this data is collected either via a survey question or by having contact centre staff ask callers whether they used the digital or paper service.

The user researcher should produce a research plan that shows all user research activities to be conducted during public beta. This should include measuring the use of, and satisfaction with, the planned Assisted Digital service, ready for the live assessment.

Team

The multidisciplinary service team maintained consistency of roles and individuals during their alpha and private beta stages. Although split between two geographical locations (development and test staff are in Glasgow and the rest of the team is in London), the team have established a culture of close collaboration and make good use of modern collaboration tools such as Confluence, Rocket.Chat, Skype, Webex and InVision.

The team's development and test capability is still predominantly supplied through a partner consultancy; however, knowledge transfer sessions are being carried out between the contractors and the new permanent DevOps recruit. The team expects to be able to embed a shared HMCTS performance analyst soon. The service is still heavily reliant on the delivery partner CGI, and the team should continue their efforts to recruit more permanent staff while they are still contractually covered by CGI until some time in 2019.

The team feel empowered by their senior stakeholders and confident they can apply business transformation and change policy if required.

Technology

The team uses technologies that are mainstream and mostly open, such as Java with Spring Boot, Node.js with Express, the Apache web server, and RESTful APIs.
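
To illustrate the shape of such a stack (this is not the team's code), a minimal RESTful endpoint built with Node.js and Express might look like the sketch below; the route name and payload fields are assumptions made for the example.

```typescript
// Illustrative sketch only, not the team's actual code: a minimal RESTful
// endpoint in the Node.js/Express part of such a stack. The route name and
// payload fields are assumptions made for the example.
import express, { Request, Response } from "express";

const app = express();
app.use(express.json());

// Accept a juror's digital reply and acknowledge receipt.
app.post("/juror-replies", (req: Request, res: Response) => {
  const { jurorNumber, canServe } = req.body ?? {};
  if (!jurorNumber) {
    return res.status(400).json({ error: "jurorNumber is required" });
  }
  // In a real service the reply would be validated and queued for the back office.
  return res.status(201).json({ received: true, jurorNumber, canServe });
});

app.listen(3000);
```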

The team do not have direct control over the part of the infrastructure closest to the juror datastore (which is owned by the wider MOJ department), but this does not seem to pose major issues: the team confirmed they are well supported and that the SLA for emergencies is good.

The team demonstrated a strong understanding of security and performance needs. They undergo regular penetration tests and have tools in place to automatically detect security vulnerabilities. They have put thought into sizing their servers appropriately and have run performance tests to plan capacity for when traffic increases. They can also rely on the auto-scaling capability of their cloud provider (Azure).

The frontend of the user-facing application is built on lightweight components (e.g. jQuery) rather than heavy frameworks. The team stated they have employed progressive enhancement techniques, so that the application remains fully usable even if JavaScript and/or CSS resources are turned off. This is an effective approach for delivering a resilient user experience.
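
As a rough illustration of the progressive enhancement pattern described (not the service's actual code), a form can be submitted as plain HTML and only gain client-side validation when JavaScript is available; the element selectors below are assumptions.

```typescript
// Illustrative sketch of progressive enhancement, not the service's actual code.
// The form submits as a standard HTML POST; this script only adds inline
// validation when JavaScript is available, so the page still works without it.
document.addEventListener("DOMContentLoaded", () => {
  const form = document.querySelector<HTMLFormElement>("form[data-enhance]");
  if (!form) {
    return; // Nothing to enhance: the plain HTML form keeps working.
  }

  form.addEventListener("submit", (event) => {
    const jurorNumber = form.querySelector<HTMLInputElement>("#juror-number");
    if (jurorNumber && jurorNumber.value.trim() === "") {
      event.preventDefault(); // Block submission and flag the field instead.
      jurorNumber.setAttribute("aria-invalid", "true");
    }
  });
});
```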

The service is not designed to provide access to much sensitive data, and the database of jurors is protected by several layers of firewalling and is constantly monitored. That greatly limits the extent and risk of security and privacy issues. They are also diligently preparing for their upcoming GDPR assessment.

The team have procedures in place for when the service becomes unavailable: they have a dedicated support team and can react very quickly if something in the application needs to be changed as a matter of urgency.

The team should continue their efforts to adopt GOV.UK Notify for communication with citizens, even though the integration is not solely up to them and the wider MOJ department needs to be involved.
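
For context, an integration with GOV.UK Notify from a Node.js service typically uses the notifications-node-client library; the sketch below is illustrative only, and the template ID, personalisation fields and function name are placeholders rather than the team's actual values.

```typescript
// Illustrative sketch of sending an email through GOV.UK Notify with the
// notifications-node-client library. The API key, template ID and
// personalisation fields are placeholders, not the service's real values.
import { NotifyClient } from "notifications-node-client";

const notifyClient = new NotifyClient(process.env.NOTIFY_API_KEY as string);

async function sendReplyConfirmation(emailAddress: string, jurorNumber: string) {
  await notifyClient.sendEmail("example-template-id", emailAddress, {
    personalisation: { juror_number: jurorNumber },
    reference: `juror-reply-${jurorNumber}`,
  });
}
```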

The team have set up a repository to make a large portion of their code open, which is a great step forward, and they explain well how to set up the project. They should continue in this direction by clarifying what licence the code is released under, by open-sourcing new code and, ideally, by using their open source repository not just as a “synchronised mirror” but as the team's live repository.

Design

The panel is impressed with the breadth of research the team have done, especially with users who have access needs and with accessibility testing. This gives confidence that the service design is meeting user needs.

The team demonstrated a good number of iterations and, importantly, learnt from research and improved the design of the service based on it.

The amount of supplementary content to help users could be making the service more complex. There were points in the beta prototype where help was repeated, which could point to a lack of confidence in the overall design of the service. It would be good to explore whether content could be reduced, and what impact that would have on the service.

There were some consistency issues which, when addressed, would make the service easier to use. Consistent use of H1s, more space around form inputs and less use of light grey text will improve the service for users.

Analytics

The team showed that they have used analytics to feed into building the service and have thought carefully about what metrics they need to measure it. They have been working with the Performance Platform to provide their mandatory KPI data and have the dashboard ready to go for when they start receiving data.

They need to implement anonymisation of users' IP addresses. GDS is to provide instructions on how to do this.
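
For reference, and subject to the instructions GDS provides, IP anonymisation in Google Analytics is typically a single tracker setting; the sketch below shows how it can be enabled with the analytics.js tracker, using a placeholder property ID.

```typescript
// Illustrative only: enabling IP anonymisation with the analytics.js tracker.
// The property ID is a placeholder; follow the GDS instructions for the exact
// setup used on the service.
declare const ga: (...args: unknown[]) => void;

ga("create", "UA-XXXXXXX-X", "auto");
ga("set", "anonymizeIp", true); // Truncate users' IP addresses before storage.
ga("send", "pageview");
```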

We recommend that the team carries out a workshop to build a performance framework. There is more information on dataingovernment.blog.gov.uk describing how to do this.

The team does not have a performance analyst embedded with them, but once they go into public beta they will have access to a team that can provide this resource. For the next assessment it would be good to understand how this works, including:

  • How the findings from a performance analyst feed into sprint planning and building the service
  • How the performance analyst receives the tasks to be investigated

Recommendations

To pass the next assessment, the service team must:

  • Move away from using the free version of Google Analytics and anonymise users' IP addresses
  • Address all findings listed in the Accessibility Review and conduct a WCAG test of the service before going on GOV.UK
  • Use GOV.UK Notify for communication with citizens and contact the GDS Platforms Engagement team to explore the option of using GOV.UK PaaS for hosting

The service team should also:

  • Detail how a performance analyst will work with the team and feed into the backlog and priorities for iterating the service
  • Monitor user understanding of the 3 question formats that have been highlighted as being of concern
  • Track the impact of the digital service on the need for users to call the contact centre
  • Explore whether content could be reduced, and the impact of this on the service
  • Produce a research plan for public beta, including the Assisted Digital service and the Bureau app
  • Conduct research into the feasibility of dropping the paper form from the pack sent out with the jury summons, to encourage digital by default.
  • Clarify the licence for the code hosted in their open source repository
  • Improve consistency to make the service easier for users: consider the use of H1s, space around form inputs and less use of light grey text

Note: This service will be allowed to launch on a GOV.UK service domain only after the issues flagged in the GDS accessibility review are remedied and a WCAG compliance test is successfully passed.

Get advice and guidance

The team can get advice and guidance on the next stage of development by:

  • searching the Government Service Design Manual
  • asking the cross-government Slack community
  • contacting the Service Assessment team

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 30 July 2018