The service met the Standard because:
- The recommendations from the initial assessment were all addressed
- The team have put in place a strong approach to assisted digital
- The team have made good progress in other areas since the initial assessment
About the service
Service Manager: Jane Adderley
Digital Leader: Craig Wyna
Details of the reassessment
Lead assessor: David Illsley
Researching and understanding user needs [point 2]
Following on from the recommendations made by the original beta assessment panel, the team has done a great job in addressing identified gaps and improving its approach to the design of the service.
The accessibility audit highlighted a few potential issues, which the team has reported as fixed. One of the highlighted challenges was the service time-out, which has been reported as addressed in guidance. Unfortunately, the team was not able to test this with real users and demonstrate the experience. However, as adoption grows, the service is likely to see more users who are new to it, users who are reporting financials, and users with low digital skills or access needs. It is important to test the time-out experience and its communications with users (even if not in a real situation) and to look closely at time-out analytics and calls related to the issue. Going into live, the panel would be interested to know what the team has learned and how the service has changed as a result.
Following the recommendation to develop assisted digital (AD) support, the team has built impressive co-browsing support, with trained call centre staff able to identify and support users with AD needs, and has put in place a plan for users who are not online at all. The AD support service has been tested by specialists from the Tinder Foundation, and identified AD calls have been analysed. The team has reported that the number of AD-related calls is currently very low. For the live assessment the panel would be interested to know whether the situation has changed and how the service is being developed as a result of the call analysis. AD calls could be an excellent resource for recruiting users with access needs to qualitative testing of product iterations.
The team has the challenging task of simplifying the language in the service while the wider environment operates in specialist jargon. The team has reported testing with users who have not yet completed a return, analysing help text on the pages, and gathering feedback through call centre staff, all of which are valuable sources of information. It would be useful to report at the live assessment how the service's language has evolved as a result of user research.
The team has a great plan in place to collect evidence in order to influence changes in legislation and support decision making. The panel would be interested to hear about the outcome from this.
Running the service team [point 3]
The team has expanded over the last few months, bringing in skilled, experienced contractors to help up-skill Commission staff, alongside a separate plan for digital transformation within the commission.
Designing and testing the service [points 5, 10, 12]
The team have recognised the technology lock-in and limitations raised by the panel in the previous assessment. As part of a broader digital transformation, the Commission are exploring open source alternatives. The team plan for this alternative approach to be in place in time for the live assessment.
The team are now carrying out load testing, with target user numbers based on averages derived from several years' worth of submission data.
When running concurrent sessions for load testing, the team should ensure that different test users are used, rather than the same account many times concurrently, in order to avoid overly optimistic results due to caching effects.
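One simple way to guarantee that concurrent sessions do not reuse the same account is to draw each new session from a pool of distinct test accounts. A minimal sketch in Python (the account names and pool size are illustrative assumptions, not the service's real test data):

```python
from itertools import cycle

# Hypothetical pool of distinct test accounts (names are illustrative).
TEST_ACCOUNTS = [f"load-test-user-{i:03d}" for i in range(50)]
_account_pool = cycle(TEST_ACCOUNTS)

def next_test_account():
    """Hand each new concurrent session its own account, round-robin."""
    return next(_account_pool)

# 50 concurrent sessions: every session gets a different account, so
# server-side caching of one account's data cannot flatter the results.
sessions = [next_test_account() for _ in range(50)]
assert len(set(sessions)) == len(sessions)
```

A load-testing tool's per-user setup hook could call `next_test_account()` so that no two simultaneous virtual users share credentials.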
The average the team was targeting may flatten out peak transaction rates: 1,900 submissions a day, evenly spread, may be very different from 1,900 submissions in the last 30 minutes of the business day. The team should gather metrics that give accurate peak transaction rates in future and use those to inform load testing targets.
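To make the average-versus-peak point concrete, the peak rate can be measured with a sliding window over submission timestamps. A small sketch (the 10-hour business day and the timestamp values are illustrative assumptions):

```python
from bisect import bisect_right

def peak_rate(timestamps, window_seconds=1800):
    """Maximum number of submissions in any sliding window
    (default 30 minutes), given timestamps in seconds."""
    ts = sorted(timestamps)
    best = 0
    for i, start in enumerate(ts):
        # Index just past the last submission inside the window.
        j = bisect_right(ts, start + window_seconds)
        best = max(best, j - i)
    return best

DAY = 10 * 3600  # assumed 10-hour business day, in seconds

# 1,900 submissions spread evenly across the day ...
even = [i * (DAY / 1900) for i in range(1900)]
# ... versus 1,900 submissions bunched into the final 30 minutes.
bunched = [DAY - 1800 + i * (1800 / 1900) for i in range(1900)]

print(peak_rate(even))     # a small fraction of 1,900 per 30 minutes
print(peak_rate(bunched))  # the full 1,900 in one 30-minute window
```

The same daily total produces radically different 30-minute peaks, which is why load test targets should come from measured peak rates rather than daily averages.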
Improving take-up and reporting performance [point 17]
The team have made good progress understanding their KPIs and have worked with the performance platform team to get a page in place.
To pass the live assessment, the service team must:
- Continue to iterate and improve the service, focussing on user needs
- Continue to upskill staff and focus on building a strong digital team
- Take control of their own technology, and have limited dependence on single-supplier technology
The service team should also:
- Continue to work with the commission digital transformation programme and the GDS transformation support team
- Review the performance testing approach to ensure it is realistic.
Digital Service Standard points