Reply to Juror Summons
From: Government Digital Service
Assessment date: 18 January 2018
Service provider: HM Courts and Tribunals Service
The service met the Standard because:
- The service team have continued to develop the service according to users' needs, tested it with a wider pool of users, and achieved strong digital take-up and satisfaction levels
- The service is built by a multidisciplinary team on a solid technology stack
- The digital service has triggered business transformation of the legacy process, achieving faster transactions and reducing the support burden
About the service
The service enables users to respond online to their jury summons letter through a faster and clearer user experience; it will also introduce the option of a third-party response.
Overview of service users
Citizens throughout England and Wales aged between 18 and 75 who have been summoned for jury service, and HMCTS staff at the Jury Central Summoning Bureau who use the internal-facing interface.
- The users of the public facing service are summoned citizens and their representatives
- The users of the internal back office service are the Jury Central Summoning Bureau’s administration staff
The Reply to Juror Summons digital service scope includes some transformation to the back office processes and a new web application to speed up the processing of replies.
Each year, the back office team processes on average between 300,000 and 350,000 paper summons replies and handles over 200,000 telephone calls. The current process for summoning jurors is heavily paper-led, with communication by post, phone and email.
The new digital route has a target uptake of 20%, but the team already sees an uptake of 45%. To avoid becoming a victim of their own success, the team must consider a robust throttling mechanism for onboarding users in public beta, which tests the performance and scalability of the service. The team is very aware of this and will perform a staged rollout to other courts and regions, based mainly on their back office teams' division of work. The panel was pleased to see provisions to automate the straightforward cases, which should reduce staff effort and help the service scale.
The team has a dedicated user researcher who demonstrated a good understanding of user needs. A significant amount of user research has been conducted, including a focus on accessibility. Full accessibility testing has been carried out with a considerable number of representative end users, including testing with screen readers.
The user researcher is well integrated within the team, providing early feedback of any significant issues seen. There is rapid iteration of research feedback, with several examples of where research has informed the design of the service.
User types have been mapped out on the digital inclusion scale, but there are no design personas. Two user groups are referred to: citizens and Bureau Officers. A web app has been built to enable Bureau Officers to process digital responses. This was considered in the scope of this assessment and it was noted that the team has also conducted research with this user group to ensure the app meets their user needs.
The team has a good awareness of the issues around a digital service for replying to a jury summons. They have given a lot of thought, for example, to the format of questions and answers and to the importance of making them clear and easy to understand, so that the wording does not enable someone to answer in a way that avoids jury service.
During private beta, the majority of usability testing was carried out with users at the lower end of the digital inclusion scale and users with access needs (including work with screen readers). The emphasis on accessibility is commendable, since making a service easy to use for people with access needs also makes it easier to use for everyone. However, any future usability testing should also include citizens in general, including those higher on the digital inclusion scale, and testing on mobile devices.
The panel had some concerns about the usability of some question formats, listed below. It is recommended that these questions be monitored during public beta, by checking error and drop-off rates, to ensure that most users are able to interpret and answer them. We also encourage the team to continue to explore and test alternative solutions to:
- Residency qualification question
- Free text box for deferral requests
- Deferral date picker
The team identified additional benefits from users responding via the digital channel. These include a higher level of consumption of the juror video, which has resulted in more users being better prepared for jury service. This is an added benefit and should continue to be tracked via analytics.
It's not known whether the digital channel has had a positive impact on the number of citizens needing to call the contact centre, but this data would be valuable for identifying the benefits of the digital channel. It is recommended that this data is collected either via a survey question, or by asking contact centre staff to ask callers whether they have used the digital or paper service.
The user researcher should produce a research plan that shows all user research activities to be conducted during public beta. This should include measuring the use and service satisfaction of the planned Assisted Digital service, ready for Live assessment.
The multidisciplinary service team maintained consistency of roles and individuals during their alpha and private beta stages. Although split between two geographical locations (the development and test staff are in Glasgow and the rest are in London), the team have established a culture of close collaboration and make good use of modern collaboration tools such as Confluence, Rocket Chat, Skype, Webex and InVision.
The team's development and test capability is still predominantly supplied through a partner consultancy, but knowledge transfer sessions are being carried out between the contractors and the new permanent DevOps recruit. The team expects to be able to embed a shared HMCTS performance analyst soon. The service is still heavily reliant on the delivery partner CGI and should continue its efforts to recruit more permanent staff while it is still contractually covered by CGI until some time in 2019.
The team feel empowered by their senior stakeholders and confident they can apply business transformation and change policy if required.
The team uses technologies that are mainstream and mostly open, such as Java with Spring Boot, Node.js with Express, the Apache web server, and RESTful APIs.
The team don't have direct control over the part of the infrastructure closest to the juror datastore (which is owned by the wider MOJ department), but that does not seem to pose major issues: the team confirmed they are well supported and the SLA in case of an emergency is good.
The team demonstrated a strong understanding of security and performance needs. They undergo regular penetration tests and have tools in place to automatically detect security vulnerabilities. They have put thought into sizing their servers appropriately and have run performance tests to plan capacity for when the traffic is going to increase. They can also rely on the auto-scaling capability of their Cloud provider (Azure).
The service is not designed to provide access to much sensitive data, and the database of jurors is protected by several layers of firewalling and is constantly monitored. That greatly limits the extent and risk of security and privacy issues. They are also diligently preparing for their upcoming GDPR assessment.
The team have procedures in place for when the service becomes unavailable; they have a dedicated support team and can react very quickly when something in the application needs an urgent change.
The team should continue its efforts to adopt GOV.UK Notify for communication with citizens, even though the integration is not solely within their control and the wider MOJ department needs to be involved.
The team have set up a repository to make a large portion of their code open, which is a great step forward. They also explain well how to set up the project. They should continue in this direction by clarifying what licence the code is released under, by open sourcing new code and, ideally, by using their open source repository not just as a "synchronised mirror" but as the team's live repository.
The panel is impressed with the breadth of research the team have done, especially with users who have access needs and with accessibility testing. This gives confidence that the service design is meeting user needs.
The team demonstrated a good number of iterations and, importantly, learnt from research and improved the design of the service based on it.
The amount of supplementary content intended to help users could be making the service more complex. There were points in the beta prototype where help was repeated, which could point to a lack of confidence in the overall design of the service. It would be good to explore whether this content could be reduced, and what impact that would have on the service.
There were some consistency issues in parts that, when fixed, would make the service easier to use. Consistent use of H1s, more space around form inputs and less use of light grey text would improve the service for users.
The team showed that they have used analytics to feed into building the service and have thought carefully about what metrics they need to measure it. They have been working with the Performance Platform to provide their mandatory KPI data and have the dashboard ready for when they start receiving data.
They need to implement anonymisation of users' IP addresses. GDS will provide instructions on how to do this.
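If the service uses Google Analytics (the report does not name the analytics tool), anonymisation can be switched on with `ga('set', 'anonymizeIp', true)`, which zeroes the final octet of IPv4 addresses before they are stored. The effect, sketched as a standalone function:

```javascript
// Illustrates the IPv4 anonymisation technique used by analytics tools
// with IP anonymisation enabled: the last octet is replaced with 0 so the
// stored address identifies only a /24 network, not an individual host.
function anonymiseIPv4(ip) {
  const octets = ip.split('.');
  if (octets.length !== 4) {
    throw new Error('not a dotted-quad IPv4 address: ' + ip);
  }
  octets[3] = '0';
  return octets.join('.');
}
```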
We recommend that the team carries out a workshop to build a performance framework. There is more information on dataingovernment.blog.gov.uk describing how to do this.
The team does not have a performance analyst embedded with them, but once they go into public beta they will have access to a team that can provide this resource. For the next assessment it would be good to understand how this works, including:
- how the findings from the performance analyst feed into sprint planning and building the service
- how the performance analyst receives tasks to investigate
To pass the next assessment, the service team must:
The service team should also:
Note: This service will be allowed to launch on a GOV.UK service domain only after the issues flagged in the GDS accessibility review are remedied and a WCAG compliance test is successfully passed.
Get advice and guidance
The team can get advice and guidance on the next stage of development by:
- searching the Government Service Design Manual
- asking the cross-government Slack community
- contacting the Service Assessment team
Digital Service Standard points
1. Understanding user needs
2. Improving the service based on user research and usability testing
3. Having a sustainable, multidisciplinary team in place
4. Building using agile, iterative and user-centred methods
5. Iterating and improving the service on a frequent basis
6. Evaluating tools, systems, and ways of procuring them
7. Managing data, security level, legal responsibilities, privacy issues and risks
8. Making code available as open source
9. Using open standards and common government platforms
10. Testing the end-to-end service, and browser and device testing
11. Planning for the service being taken temporarily offline
12. Creating a simple and intuitive service
13. Ensuring consistency with the design and style of GOV.UK
14. Encouraging digital take-up
15. Using analytics tools to collect and act on performance data
16. Defining KPIs and establishing performance benchmarks
17. Reporting performance data on the Performance Platform
18. Testing the service with the minister responsible for it