Make a Plea - alpha

The report from the alpha assessment for HMCTS's Make a Plea service on 21 February 2018.

From: Central Digital and Data Office
Assessment date: 21 February 2018
Stage: Alpha
Result: Met
Service provider: HM Courts and Tribunals Service

The service met the Standard because:

  • The team have been regularly iterating and prototyping the service and testing with their users during alpha
  • The service team are working together in an iterative, user-centred manner
  • The team is thinking about the end-to-end service, including changing the paper version of the Make a Plea form based on research

About the service

Description

The service enables defendants to plead guilty or not guilty in response to prosecutions under the Single Justice Procedure at magistrates' courts.

Service users

The users of this service are members of the public who have been served with a notice of prosecution under the Single Justice Procedure.

Detail

User needs

The team have carried out a good range of research activities during alpha across the different user groups, although lab testing has focussed on TfL defendants as they will be the first to trial Make a Plea. As the team move into private beta, they should ensure that they carry out the same depth of research with TV licence and DVLA defendants.

There is strong evidence that the team have been iterating the design of the prototype based on research during alpha, and this should continue. There were gaps in testing users' comprehension of some pages; closing these gaps would help inform some of the design choices – in particular, it will be important to test what defendants understand about why they are being asked about outgoings, and about the declaration page.

It is clear that the team are not just focussing on the digital part of the journey, as they have been testing the notice which is sent to defendants and recommending changes. They have also been successful in changing the paper version of the Make a Plea form based on the research, which is very impressive.

One of the key user needs is for defendants to understand that responding with a guilty plea can reduce their fine by up to a third. This is captured in the need the team described – knowing the consequences of not responding to the notice – but it may be worth calling it out more explicitly. Increasing understanding of this fact will be what drives down the high proportion of people who don't engage with their notice (80%) and who, as a consequence, pay a higher fine.

The team has a good plan for continuing user research into private beta. The team should explore the options they have for carrying out follow-up research with people who have been through the process – in particular, it will be good to learn from those who don't engage, to find out what can be done to address this.

Team

There are three teams working across ATCM and Make a Plea online, and certain roles are shared between the teams. While in theory this could lead to problems for individuals having to shift context between pieces of work, in practice it is positive as it allows individuals to have a full understanding of these heavily interrelated projects.

The team prioritise working in a truly multi-disciplinary manner. Developers are involved in discussions at the initial design stage, and all team members are encouraged to go out into the field to see the service in action and get a clear understanding of the problems they are trying to address.

The team should make sure that retrospectives focus on improving how the team works, rather than becoming discussions about technical decisions.

There is only one content designer, split between this service and related internal systems. Given that this service contains many concepts that users need to understand, a content designer should be dedicated to the citizen-facing service. They should be backed up with the appropriate amount of interaction design capability and user research, so that different ideas about how to ask for information can be designed, prototyped and tested.

Technology

The team has made sensible technology choices, using open-source technologies and common platforms in line with the service standard, and are moving to public cloud hosting.

The online service, like the linked ATCM system, uses JavaServer Faces as its framework, running on the WildFly application server. It communicates with ATCM via an API gateway supplied by a third party, Tyk. Both the user-facing service and the API gateway are covered by protective monitoring. The alpha is hosted in UKCloud, but is part of a larger project to move to new Azure hosting in the near future. The infrastructure is automated through the use of Ansible and Terraform. The service integrates with GOV.UK Notify to send confirmation emails. The code written for the alpha will be carried forward into the beta.
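For illustration only, the sketch below shows what the confirmation email integration described above might look like using the publicly documented GOV.UK Notify Java client. It is not taken from the service's codebase: the class name, template ID and personalisation fields are hypothetical, and API keys would come from secure configuration.

```java
import uk.gov.service.notify.NotificationClient;
import uk.gov.service.notify.NotificationClientException;
import uk.gov.service.notify.SendEmailResponse;

import java.util.Map;

public class PleaConfirmationEmailer {

    // The API key should be injected from secure configuration, never source control.
    private final NotificationClient notifyClient;

    public PleaConfirmationEmailer(String notifyApiKey) {
        this.notifyClient = new NotificationClient(notifyApiKey);
    }

    // Sends a confirmation email after a plea has been submitted.
    // Template ID and personalisation fields are placeholders for illustration.
    public SendEmailResponse sendConfirmation(String emailAddress, String caseReference)
            throws NotificationClientException {
        Map<String, Object> personalisation = Map.of("case_reference", caseReference);
        return notifyClient.sendEmail(
                "plea-confirmation-template-id",  // hypothetical Notify template ID
                emailAddress,
                personalisation,
                caseReference);                   // client reference for tracking
    }
}
```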

The team are aware of issues around storing personally identifiable information and have carried out a privacy impact assessment. They have ensured that the online service does not persist any data in Azure, passing it straight on to the ATCM backend on submission. They should continue to ensure that they comply with the requirements of GDPR.

Currently the team are operating on a two-week release cycle. While this makes sense for changes that need to integrate with the API and downstream service, they should ensure that it does not constrain their ability to rapidly iterate the frontend. They should try to move to a much more frequent release process for these kinds of changes.

The code for the service is not currently open but the team have committed to releasing the repository, which has already been through automated cleaning to ensure that no secrets are present in the history. They must ensure they release it as soon as possible.

The team currently has no frontend development capability, and it is unclear how they will respond to changes to GOV.UK frontend code. For the beta they will need a clear idea of how updates to frontend code and packages will be managed. A frontend developer will also be useful for accessibility and browser testing.

Design

The team has made a very mature prototype for the service, generally utilising GOV.UK design patterns, and have iterated many parts of the service.

There has been discussion with GOV.UK about the service, and this should continue, especially as more prosecutors are added to the service. The team, with GOV.UK, will have to make decisions about the number and naming of start pages for the service, given there is already a service in place on GOV.UK, and the service will not cover all prosecutions.

The team should try splitting out some of the more complicated pages in the service into multiple pages, with one thing asked per page. This would remove the need for so many titles and so much help text throughout the pages. It will also then be easier to iterate and try changes to each question (such as different ways of asking about finances).

As the service moves into different kinds of prosecutions, the team will need to decide whether one flow will work for all needs, or whether separate transactions and flows should be built per kind of prosecution (or per prosecutor). This will also have to include handling relevant support information, such as contact centre numbers and email addresses.

A lot of the information that users provide is optional but will help magistrates to make decisions on the sentence and how it will be paid. Users must understand what information to provide and what the implications are of providing or not providing the information.

Given that the team has identified that a large minority of their users have mental health needs, the team must show through research that all users have made informed choices about the information they have provided.

The team has also identified many users who do not speak English, so some research about how users who need assistance deal with the service (both formally through organisations such as CAB and informally with friends) would be very beneficial.

We also suggest further research with magistrates to find out if they are getting the information they need.

The team has done good work at making complex content understandable (such as mitigation). This should continue and will probably account for a lot of the refinements made during beta. A dedicated content designer will improve the quality of the service and allow improvements to be made more quickly. The team should continue to try to remove legal jargon from the service.

In the current design, users are presented with a number of bodies and methods for accessing support with the service. It is unclear which is the best method and who to contact. More iteration and research needs to be done to make sure users know who to contact and that contact information is presented at the right time. The placement of contact information should also be iterated and researched, on both desktop browsers and mobile.

Analytics

The service team have identified performance measures beyond the four mandatory KPIs to improve the service, some of which use data generated by the existing service; these will be used in private beta to inform research activity.

The service team is in conversation with the Performance Platform team to agree a joint reporting page that will clearly differentiate reporting for this service from the existing Make a Plea online service.

The team have agreement from, and access to, a co-located, dedicated data analysis team within the Common Platforms Programme that will create a bespoke performance dashboard and provide performance analytics capability for the service.

In addition to the four mandatory KPIs, the service team will test the value of their beta service by analysing the time taken to complete the user journey and the points at which users drop out of the service. The team should also look at instrumenting parts of forms to see which questions users may be having difficulty with.
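One possible way to instrument form questions – a sketch only, not drawn from the team's actual approach or analytics tooling – is to count validation failures per question server-side, so the figures can feed the performance dashboard alongside drop-out analysis. All names below are hypothetical.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.LongAdder;

// Hypothetical in-memory counter of validation failures per form question.
// In practice these events would be sent on to an analytics pipeline rather
// than held in memory; this sketch only shows the shape of the data collected.
public class QuestionDifficultyRecorder {

    private final Map<String, LongAdder> failuresByQuestion = new ConcurrentHashMap<>();

    // Call whenever a question fails validation, e.g. from the form's validator.
    public void recordValidationFailure(String questionId) {
        failuresByQuestion.computeIfAbsent(questionId, id -> new LongAdder()).increment();
    }

    // Snapshot for reporting, e.g. a periodic export to the performance dashboard.
    public Map<String, Long> snapshot() {
        Map<String, Long> counts = new ConcurrentHashMap<>();
        failuresByQuestion.forEach((id, adder) -> counts.put(id, adder.sum()));
        return counts;
    }
}
```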

At the beta assessment, the service team can expect to be asked for benchmarks from the end of alpha and measurements at the end of beta.

Recommendations

To pass the next assessment, the service team must:

  • Carry out further research with TV licensing and DVLA defendants in advance of further rollout
  • Have a dedicated content designer working on the service, with adequate user research and design iteration to make sure that users understand the concepts in the service and are making informed choices about what information to give
  • Have frontend developer capacity in the team to manage browser compatibility and testing, technical accessibility changes and changes to GOV.UK style
  • Show through research that all users (especially those with access needs and disabilities) can use the service, understand what they need to do, and have made informed choices about their plea and what information to provide
  • Formally register with the Performance Platform
  • Make the code available openly
  • Test with the minister responsible

The service team should also:

  • Continue discussions with GOV.UK about appropriate start pages and content for the service
  • Try splitting out questions onto more pages (one thing per page pattern)
  • Try different ways of asking for complex sets of information, such as finances
  • Research then iterate support and contact information throughout the service
  • Research assisted digital channels and less formal methods of assistance, such as friends and family
  • Create a bespoke performance dashboard
  • Move to a faster release process, especially for frontend changes

Digital Service Standard points

Point | Description | Result
----- | ----------- | ------
1 | Understanding user needs | Met
2 | Improving the service based on user research and usability testing | Met
3 | Having a sustainable, multidisciplinary team in place | Met
4 | Building using agile, iterative and user-centred methods | Met
5 | Iterating and improving the service on a frequent basis | Met
6 | Evaluating tools, systems, and ways of procuring them | Met
7 | Managing data, security level, legal responsibilities, privacy issues and risks | Met
8 | Making code available as open source | Not Met
9 | Using open standards and common government platforms | Met
10 | Testing the end-to-end service, and browser and device testing | Met
11 | Planning for the service being taken temporarily offline | Met
12 | Creating a simple and intuitive service | Met
13 | Ensuring consistency with the design and style of GOV.UK | Met
14 | Encouraging digital take-up | Met
15 | Using analytics tools to collect and act on performance data | Met
16 | Defining KPIs and establishing performance benchmarks | Met
17 | Reporting performance data on the Performance Platform | Met
18 | Testing the service with the minister responsible for it | Met
Published 6 August 2018