Claim for crown court defence - Live Assessment

The report from the live assessment of the MoJ Claim for crown court defence service on 28 July 2016 and the reassessment on 29 September 2016.

Stage: Live
Service provider: Ministry of Justice

About the service

Service Manager: Liz Citron (represented by Stuart Hollands as Product Manager)

Digital Leader: Matthew Coats

The service allows advocates and litigators (or their representatives) to submit claims for payment of defence costs in the Crown Court to the Legal Aid Agency (LAA). Currently this is a paper process, with a dedicated team processing often complex submissions. Development has been carried out by Ministry of Justice Digital (MOJD) on the LAA’s behalf.

Results - 28 July 2016

Result: Not Met

To meet the Standard the service should:

  • Ensure an appropriate team to support further development and improvement to the service is in place and has secure funding for the life of the service.

  • Resolve identified issues with the front-end design, taking into account recommendations from the accessibility audit, and ensuring that the user experience of the entire end-to-end journey is considered.

  • Resolve concerns about service resilience and security, as identified in the technology and analytics sections of this report.

Detail of the assessment - 28 July 2016

Lead Assessor: Simon Everest

User needs

The team showed that they have clearly identified the functional needs of people who use the Claim for Crown Court Defence service. There are two user groups: legal professionals and caseworkers, and the service provides separate flows for each. Throughout service development, a total of 140 caseworkers and providers have been involved in testing the service and providing feedback. The team has engaged with ‘providers’ (legal professionals) more than caseworkers. The research has combined contextual visits at people’s places of work with telephone calls.

There has been no lab testing carried out so far; the service team explained that it is hard to secure professional people’s time to participate in lab testing. The researcher has been accompanied by different team members while performing the contextual testing, and has been responsible for sharing research outcomes with the rest of the team following the test sessions. Video material of the test sessions has been captured by the user researcher.

Whilst the lack of lab testing is not a reason to not meet the service standard, we recommend including lab testing of the service during the following sprints. Lab testing enables the whole team to be involved: each person who observes a lab session takes away different learnings that can then be shared between all members of the service team, so the whole team shares the problem. GDS can help secure access to the GDS usability lab, plus support in strategies for securing participation of the right users.

There has been no face to face testing with users who have accessibility needs, although the service team has engaged with the Digital Accessibility Centre in Neath. We urge the team to continue their efforts to conduct primary research with disabled users if possible. We recommend that all high priority recommendations identified in the DAC report be incorporated into the service before passing into Live.

Assisted digital support has not been identified as a critical need for this service, as both user types are professionals who use digital technology for their work on a daily basis. Having said that, it was noted that the support telephone number was not displayed on the digital interface screens. This should be addressed before passing into live.

Team

The team attending the assessment demonstrated a very strong understanding of their subject matter, and were positive and enthusiastic about meeting their users’ needs. They are demonstrating very effective agile working, and have access to appropriate tools and techniques to support development. Although the team are part of MOJD, several members have substantial experience working in the LAA, who ‘own’ the service. This will help to avoid any client/supplier separation between LAA and MOJD, and ensure that LAA take an integrated, end-to-end view of the service, which might involve policy or procedural changes beyond the ‘digital’ parts.

The team have a good blend of skills, but do not currently have a performance analyst dedicated to the team. The function of performance analyst has been picked up by others in the team, but in the absence of a specialist they have not implemented changes to the generic analytics set-up that would collect a richer set of data to inform future iterations of the service.

Significantly, the team do not have guaranteed funding for ongoing development. Although there is a desire to continue development, in tandem with developing additional services for the Legal Aid Agency, funding has currently only been secured for two more months.

It is vital that this uncertainty be resolved, and appropriate funding put in place to ensure future improvements to the service, particularly as LAA are currently considering mandating the use of digital, rather than paper, channels. Additionally, the planned expansion of API-based usage will potentially require further development as third-party suppliers develop their approaches to integrating with the service. We would expect any live service to have a permanent (if not necessarily full-time) complement of service manager (product manager in MOJD terms), user researcher, front-end developer and performance analyst to ensure future iterations and continuous improvement of the service, and to be able to call upon other specialists when required.

Technology

The application is complete, with nothing required for the move to live left on the backlog. The system has been in place for several months and has processed a large number of claims. The panel were pleased by the support mechanisms in place for handling both user issues and bug fixes, which are integrated into MOJ Digital’s IRAT (Incident Response and Tuning) team. However, as before, there remain concerns that the ongoing continuous improvement model isn’t yet fully formed, with funding for further development only confirmed for an additional two months from Live.

The API development is good to see, and the issues raised at beta have been addressed. The team are supporting development against this API and this should be a key feature of the live platform.

It was mentioned that the application does not yet support zero-downtime deployments. This is an essential feature of live services and the panel would expect this to be addressed as a matter of urgency.

The team are using standard Ministry of Justice deployment patterns, which enable support for the live application to be handed over to the platform team at MOJ. However, the panel was unsure how mature the monitoring and metrics processes were, and how rich a data set the team were able to see. Whilst the panel appreciates that the responsibilities are split between the two teams, the panel would like to see that the team is able to both identify and diagnose issues arising from the product. As the team move to live, the panel would expect alerting and monitoring to be of a high standard and the team not to rely on Google Analytics for performance data, for example.

It was positive to see that performance testing has been done recently. The panel would like to see this occurring on a regular basis, and would like to see the team addressing any performance issues that arise. The panel would also like to see the team spend time investigating potential spikes in demand and ensuring that their service can withstand at least double the expected maximum load.

It was unclear how the service team would respond in the event of extended downtime. We would expect there to be, as a minimum, a holding page directing users to the call team, and we would like to see the team able to articulate the needs of their users with regard to any outage.

Design

The visual design and layout were much improved - simpler, and closer to GDS style.

We would still suggest investigating splitting the process up into simpler steps. We saw issues with validation, where it’s hard to find the field responsible for the error. This is one of the problems tackled by having smaller pages: with fewer fields to consider on each page, it’s easier to find the problem field.

In a video recording we saw a user unable to proceed, as the error message did not update when the field was corrected. We recommend looking into that issue.

At the time of assessment, a full accessibility review had not been completed, and this is mandatory for passing. In addition, the team needs to do user research with users who have access needs to make sure the journey works well for them.

We were concerned that the full end-to-end journey had not been through user research. By this we mean the full journey an end user would experience - apparently this starts with an invite email. While we accept that many people have now used the system, we would still strongly recommend this research, as issues can be clear in qualitative research that don’t show up elsewhere. It would also give the team a fuller understanding of the complete user journey.

The weakest aspect of the journey, as acknowledged by the team, seems to be the calculator spreadsheets. Whilst many users will have their own systems to calculate costs, if evidence suggests that a significant number of users are still dependent on the MOJ spreadsheets once the service is live, this should be tackled – it’s clearly a significantly worse experience compared to the rest of the journey.

Analytics

We were impressed that the team had a measurement framework in place and had really considered the kind of measures that they needed. It was also good to hear that the team has recognised that some measures they had originally used were not useful, and has iterated on and updated the measures accordingly. Feedback data from Zendesk has helped develop the service and inform changes.

Google Tag Manager (GTM) has been implemented but is not used, which could be a security issue. GTM is used to quickly implement changes to the analytics tagging setup without using developer resource. If GTM is not set up correctly, there is a risk of JavaScript code being injected into a page.

The IP addresses of users of the site are not anonymized before being sent to Google Analytics for processing.
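
To illustrate the kind of change being asked for here (a sketch only: the tracking ID below is a placeholder, and the service’s actual page tag may be structured differently), IP anonymization can be switched on in a plain analytics.js page tag with a single ‘set’ call before any hits are sent:

```typescript
// Sketch of a plain analytics.js page tag (no Google Tag Manager) with IP
// anonymization enabled. 'UA-XXXXXXX-Y' is a placeholder property ID.
declare const ga: (...args: unknown[]) => void; // global created by the standard analytics.js loader snippet

ga('create', 'UA-XXXXXXX-Y', 'auto');
ga('set', 'anonymizeIp', true); // the last octet of the IP address is zeroed before storage and processing
ga('send', 'pageview');
```

With this flag set, part of the IP address is discarded at an early stage of processing, which also supports the SIRO sign-off discussed below.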

The SIRO has not signed off on the use of Google Analytics. It is important that the team fully understand both the benefits and risks of their use of Google Analytics, and advise the SIRO accordingly to assist with the sign-off process.

Point 17 has been met and the team’s dashboard on the Performance Platform is updated regularly. The panel were pleased with how the team collects and uses user satisfaction data.

Recommendations - 28 July 2016

User Needs

We recommend including lab testing of the service during the following sprints. Lab testing enables the whole team to be involved. Each person who observes a lab session takes away different learnings that can then be shared between all members of the service team. The whole team shares the problem.

GDS can help secure access to the GDS usability lab, plus support in strategies for securing participation of the right users.

We recommend testing the service with users who have access needs.

Make sure that the telephone support number is displayed on all service screens.

Technology

The team should ensure that the monitoring and alerting features of the platform are such that issues in performance, security and functionality can all be discovered and diagnosed. It was unclear how real-time and feature-rich some of the reporting was, and we would expect the team to be aware of any degradation in performance or functionality and be able to diagnose these issues.

The performance testing of the service requires some more investigation. We would expect the team to understand the expected usage patterns over a year, to know where the peaks are likely to be, and to ensure the application can scale to accommodate these peaks comfortably.
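
One way to make this investigation repeatable is a small scripted load test that can be run against a staging environment whenever usage patterns change. The sketch below is illustrative only, not the team’s actual tooling: it assumes the open-source k6 load-testing tool, and the host, path and figures are placeholders rather than the service’s real routes or traffic profile.

```typescript
// Illustrative k6 load-test sketch. Assumptions: the k6 CLI is installed and a
// staging host exists; '/claims' and the figures below are placeholders.
// Run with: k6 run loadtest.js (no TypeScript-only syntax is used, so the file
// is also valid plain JavaScript).
import http from 'k6/http';
import { check, sleep } from 'k6';

export const options = {
  vus: 50,          // concurrent virtual users - set to roughly double the expected peak
  duration: '10m',  // how long to sustain the load
};

export default function () {
  const res = http.get('https://staging.example/claims'); // placeholder URL, not the real service
  check(res, { 'status is 200': (r) => r.status === 200 });
  sleep(1); // think time between requests for each virtual user
}
```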

Analytics

The team should set up a Google Analytics funnel to track drop-off on the form. The team should remove GTM and use the regular Google Analytics tag instead.

The panel recommends that the team anonymizes the IP address in the Google Analytics tag and requests sign-off from their SIRO on the use of Google Analytics.

To pass the reassessment, the service team must:

  • Secure funding for ongoing support and improvement of the live service beyond the basic 1st/2nd line and bug-fix support provided by MOJD’s IRAT team.

  • Have carried out an accessibility review showing the service meets at least level AA of WCAG 2.0

  • Continue to seek to carry out task-based user research with users who have access needs (for example, but not limited to, using a screen reader or screen magnifier)

  • Make sure that the support number is visible on all service screens and that this has been validated in user testing.

  • Remove Google Tag Manager tags and replace them with Google Analytics tags

  • Anonymize users’ IP addresses in the Google Analytics tag

  • Get sign-off from their SIRO on the use of Google Analytics

The service team should also:

  • Carry out full end-to-end task-based user research (from the initial contact of receiving the invite email)

  • Look into ways of improving the experience of using the calculator spreadsheet (including ways of avoiding its use altogether)

  • Add a performance analyst to the team

Digital Service Standard points - 28 July 2016

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Not Met
3 Having a sustainable, multidisciplinary team in place Not Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Not Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Not Met
11 Planning for the service being taken temporarily offline Not Met
12 Creating a simple and intuitive service Not Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Not Met

Reassessment Results - 29 September 2016

Result: Met

The service met the Standard because:

The team have promptly and effectively taken on board the concerns raised at the previous service assessment

Detail of the reassessment - 29 September 2016

Lead Assessor: Simon Everest / Martyn Inglis

User needs

The team have completed an accessibility audit and resolved the issues raised (and confirmed that they already met the WCAG AA standard). The team have also done some usability testing with disabled people who will use the service, and have improved the service as a result, in particular the display of information.

The team has worked hard to find disabled people to research with, and must continue to be proactive in this when the service is live.

The support routes for the service are now clearer on the service pages, and the team have good evidence that their phone support is trusted and effective.

The team has done more research around users’ route into the service, and has simplified and improved the initial setup for businesses as a result.

Although the team haven’t had an opportunity to carry out laboratory-based research, they have managed to stream on-site user research back to the development team to ensure everyone has an opportunity to observe. This is a good compromise where it is hard to get appropriate users to take part in research away from their desks.

The complicated calculator spreadsheets are still a concern. The team is continuing to look at either improving the calculator spreadsheets, or building the calculations into the service.

Team

The team now have a further 9 months of funding, addressing the previous concern about the ongoing sustainability of the team when only 2 months of funding was guaranteed. Additionally, the team are co-locating with the LAA to share knowledge, and have a well-defined relationship with the IRAT team.

Technology

The service currently has downtime during database upgrades - this needs to be rectified. Although we would not fail the service for this, we would very strongly recommend that it is resolved.

The team have an effective monitoring approach in place, related to requirements of the pre-existing MOJD IRAT team. There are dashboards for the team, alerts for response times/outages etc., and an operations manual is in place.

Performance testing suggests the service is relatively slow, and vulnerable to significant spikes in demand. Testing has established a rate of around 10 per second as the best performance; higher rates cause issues. The highest figures seen so far are around 6,500 submissions in a week, so current demand is probably within bounds.

The team are able to run these tests regularly; they are not scheduled, but the team can run them whenever they need to.

In the event of downtime, the LAA has an offline contingency. Because users have 3 months to submit claims, a day or two of downtime is unlikely to have an effect, and users can always fall back to the paper process if needed.

Design

All issues raised in the previous assessment have now been resolved, and it’s good to see that some additional issues that have emerged from subsequent user research have also been addressed.

The service now has the support contact number reproduced on all pages, which will help in the event of any problems. Research with vision-impaired users has led to a change in the presentation of tabular information which has been shown to be easier to read and understand.

The onboarding journey has also had some major changes as a result of new learnings triggered by concerns raised at the assessment. It is very positive to see the team learning from their research, and adapting and improving facets of the service that were originally considered ‘solved problems’.

Analytics

It is good to see that the team has removed the Google Tag Manager tags. It is also good to see that their SIRO has signed off on the use of Google Analytics as the tool used to collect and report on usage of the service. While they do not yet have a performance analyst on the team, they are looking to add this role.

Recommendations - 29 September 2016

The service team should also:

  • Rectify the downtime caused by database upgrades. Use experience available in other services to work towards zero-downtime upgrades throughout the service.

  • Continue proactive research with users, particularly those with disabilities.

  • Recruit a performance analyst.

  • Develop an appropriate solution to replace the cost calculator spreadsheets.

Digital Service Standard points - 29 September 2016

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 17 February 2017