Adult Social Care Provider Information Return - alpha

The report from the alpha assessment for CQC’s Adult Social Care Provider Information Return service on 30 October 2017.

From: Central Digital and Data Office
Assessment date: 30 October 2017
Stage: Alpha
Result: Met
Service provider: Care Quality Commission

The service met the Standard because:

  • The team provided evidence of achieving a sufficient standard to pass alpha reassessment in each of the key areas, namely:

  • Managing data, security level, legal responsibilities, privacy issues and risks

  • The team follows good architecture and development principles and is looking at re-using both Department of Health and GDS technologies and services where applicable. They are already using GOV.UK Notify and will be looking at GOV.UK Verify and GOV.UK PaaS for their beta.

  • Understanding user needs

  • The team demonstrated that they had improved their approach to user research since their last assessment, and were able to break down their users into four personas.

  • Creating a simple and intuitive service

  • The team had clearly thought about how users will get to the CQC website, and through the service. They had created a mock-up of that part of the journey. The research that the team have done suggests this will be a reliable way to contact users.

  • Having a sustainable multidisciplinary team in place

  • It is clear that having a dedicated team on this service, with dedicated user researchers, and the addition of more designers (content and UX) has improved their service.

About the service

Description

The service enables adult social care managers to send information about the quality of the care they provide to the Care Quality Commission (CQC) ahead of a formal visit from a CQC inspector.

Service users

The users of this service are adult social care service managers.

Detail

The panel was impressed with the way the team had taken on board the recommendations made in the first alpha assessment, and put in significant work to implement changes. It is clear that having a dedicated team on this service, with dedicated user researchers, and the addition of more designers (content and UX) has improved their service. This can be seen in how user research has shaped their user journey through the service. They have also worked closely with GDS on managing data, security level, legal responsibilities, privacy issues and risks, and are looking at how to reuse GDS components.

User needs

The team had conducted research into the needs of different types of care providers, including smaller providers and larger national providers. They used personas and user needs lists to summarise their findings. They had also taken steps to understand why people completed their return on paper. Only 20 providers interacted solely on paper, and the team were able to interview each of them. They identified connectivity and hardware issues preventing these providers from interacting online.

There was a reasonable justification for a telephone-based Assisted Digital support service. This was based on similar support provided for Registered Managers completing the required accredited training for the role. The telephone number for support is clearly visible at the start of the service and there was a plan to test this during the beta phase.

The team explained the difficulties of testing the end-to-end journey: users spread completing the return over a two-week period, spending 20 hours on the form, including offline time. It would be difficult to test an end-to-end service with a user without impacting on their caring work. They described a good compromise, combining end-to-end walkthrough research with more intensive testing of individual components. The team outlined an ambitious programme of research for the private beta phase. A dedicated user researcher will need to be focussed on this project in the beta phase to achieve this.

Team

The team have made changes to make sure they have a sustainable multidisciplinary team in place. This includes bringing skills in house from consultancy firms. The panel is pleased to see that the team have now split roles for user research, design and content. The team uses collaborative tools to make sure that they work together, and learning is shared through agile ceremonies.

Technology

The team have built both a prototype service and a working system using Drupal and PHP. They’re already deploying this on Azure using Kubernetes and Docker, but are very keen to use GOV.UK PaaS to reduce their need to have a dedicated infrastructure team. They are also exploring the possibility of using AWS for hosting, but would still prefer to use GOV.UK PaaS. They have a limited amount of test and deployment automation, but acknowledge that a more reliable, repeatable build process they can have confidence in (through a range of automated testing techniques) should be a priority during beta. The system they’ve built is more mature than we might like to see at alpha, but the overall approach is consistent with what we’d like to see in early beta.

The team follows good architecture and development principles and is looking at re-using both Department of Health and GDS technologies and services where applicable. They are already using GOV.UK Notify and will be looking at GOV.UK Verify and GOV.UK PaaS for their beta. They are also going to advise parallel teams in their organisation to work with the GOV.UK Registers team to build a register of adult care providers. The team are working to open their code, and had hoped to address this in time for re-assessment, but are not yet coding in the open. They have, however, already shared some of their code as Drupal community modules, so that other Drupal users can benefit from them.

The team intend for the service to be available 24/7, to support the already busy schedules of reporting managers, and are doing initial performance tests based on the idea that their service should support every care provider submitting reports all at once. This is an admirable goal, but the team should try not to optimise too early, lest they build complexity into their system that isn’t yet appropriate. The team have some simplistic service-level healthchecks to ensure the system remains available, and are intending to investigate more robust monitoring services to support this. They have also suitably decoupled their service from the legacy backend that it integrates with. At the moment, the team has no data recovery strategy in place, but are going to be working on this during beta.
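
A service-level healthcheck of the kind the team described can be kept deliberately simple. The sketch below is an illustrative assumption, not CQC’s actual implementation: the endpoint URL and the `{"status": "ok"}` response shape are invented for the example, and the HTTP call is injected so the check can be unit-tested without a network.

```python
import json
from typing import Callable, Tuple

def check_health(fetch: Callable[[str], Tuple[int, str]],
                 url: str = "https://example.service.gov.uk/healthcheck") -> bool:
    """Treat the service as healthy only if the healthcheck endpoint
    responds with HTTP 200 and a JSON body of {"status": "ok"}."""
    try:
        status_code, body = fetch(url)
    except OSError:
        # A network failure counts as unhealthy rather than crashing the probe.
        return False
    if status_code != 200:
        return False
    try:
        return json.loads(body).get("status") == "ok"
    except (ValueError, AttributeError):
        # Non-JSON or unexpectedly shaped bodies also count as unhealthy.
        return False
```

Injecting `fetch` (rather than calling an HTTP library directly) keeps the probe testable with stubbed responses, which fits the panel’s recommendation to build automated testing into the pipeline.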

The team have good relationships with their information assurance team and SIRO, but should explore more appropriate levels of security control for the service. Notably, full IDAM and role-based access control is not necessary at this stage; requiring the reporting manager to remember a password (and, ultimately, reset it) from one year to the next offers little better security than simply sending a unique “submit your report” link each year. When CQC has a broader suite of systems, a full IDAM solution may become more appropriate.
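
The unique-link approach suggested above could be sketched as follows. This is a hypothetical illustration rather than CQC’s design: the URL shape, provider identifiers and key handling are all assumptions, and a real implementation would also need token expiry and a key held in a secrets store.

```python
import hashlib
import hmac
import secrets

# Assumption: in production this would be a long-lived key from a secrets
# store, not generated at import time.
SECRET_KEY = secrets.token_bytes(32)

def make_submission_link(provider_id: str, year: int) -> str:
    """Build a unique, hard-to-guess 'submit your report' URL for one
    provider and reporting year (URL shape is illustrative)."""
    payload = f"{provider_id}:{year}".encode()
    token = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return f"https://example.service.gov.uk/return/{provider_id}/{year}?token={token}"

def verify_token(provider_id: str, year: int, token: str) -> bool:
    """Check a presented token against the expected signature,
    using a constant-time comparison."""
    payload = f"{provider_id}:{year}".encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token)
```

Because the token is derived from the provider and year, a link issued for one year cannot be replayed for the next, which matches the annual cadence of the return.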

Design

Awareness via email

The first step in the user journey is receiving an email when it’s time to send CQC the information required. CQC will have the user’s email address from the registration process (outside the scope of this assessment), and the research that the team have done suggests this will be a reliable way to contact users.

It’s great that the team have thought about how the users will get to the CQC website and have created a mock-up of that part of the journey.

However, the content of this email needs more work. For example, the users will need to know at this point in the journey how long they have to complete the service, and how big the task is. In beta, the team should do research to find all the needs of users at this point in the journey and make sure the content in this initial email meets those needs.

Start page

The title of the service as stated on the service page is “Complete your adult social care provider information return.” The team should explore a more plain language name for their service, such as “Tell us about your care service” or “Tell us about the quality of your care service.”

The design of the start page of the service is consistent with the GOV.UK pattern for start pages, including the use of the GOV.UK style start button and information callout. This is great to see being implemented on the CQC website, and should work well with users who are now generally very familiar with this layout.

Task list

The service is using the GOV.UK task list pattern, which is great to see on the CQC website. This looks like a great use of this pattern, as research has identified a need for users to see an overview of what they need to do, and to jump around the tasks with the answers they have to hand.

This pattern is currently in development, so the team should make sure they keep up to date with how it develops over the course of their beta and make sure they are up to date at their next assessment.

Download all questions as PDF

The team found in research that users need to have a printed copy of all the questions, so that they can look at the questions whilst they are away from their computer. It’s great to see the team responding to this need in their service’s design, but PDF is not the right format for this: PDF files can require additional software to be installed on a user’s device to view them.

The team should explore meeting the need for all users with a printable HTML page.

This applies to the ‘check your answers’ PDF at the end of the journey as well.

Form content

The team have adapted the ‘One thing per page’ design pattern sensibly, by keeping the pages short but acknowledging that with the current length of the form, putting each question on a separate page would be overkill.

The current length of the form is, however, very burdensome on users. The team acknowledged that there is much more work to be done to find the justification for each question that is asked, including doing much more work with inspectors and other users of the data that is being collected.

The team mentioned that the design of the service means some questions will only be shown to users that the question applies to, which is a good start on reducing the length.

In beta, the team need to make sure they work closely with users of the data to make sure there is a real need for each question. User testing the API that is offered to third parties will also feed into this, and in beta the panel will expect to see that API fully tested with proper documentation.

End of the journey – Check your answers, declaration and confirmation page

The end of the journey is consistent with GOV.UK patterns, although some of the content needs a closer look. For example, there is no need to say “Thank you” on the confirmation page.

Recommendations

To pass the next assessment, the service team must:

  • Be able to show how they are following the agile principle of maximising the amount of work not done, as the current service is more in line with what we would expect at beta than at alpha.
  • Be able to show that the current team is able to focus on iterating and delivering the service through to Live assessment in a way that suitably addresses their users’ needs in a sustainable manner.
  • Have tested and iterated the end-to-end user journey.
  • Show they have explored the recommendations detailed in the design section of this report.
  • Implement an appropriate level of automated testing (unit, integration, and smoke tests) and deployment to give a high level of confidence in their deployment pipeline, with minimal human intervention.
  • Be able to show that they are building their services in the open.
  • Show that they have appropriate disaster recovery and backup strategies in place, and are testing these on a regular basis.

The service team should also:

  • Consider removing the need for a password from the sign-in step, replacing it with a suitably unique, hard-to-guess URL for automatic login to the service. This could simplify the login experience and allow the team to focus on the core of the service until identity assurance management is actually necessary across a suite of services.
  • Investigate using GOV.UK PaaS to host their service.
  • Ensure they have an appropriate set of developers and operations people on the team to iterate and run the service.
  • Ensure that the development and ops teams work together so that they build an easily recreatable service with appropriate support arrangements.
  • Maintain their multidisciplinary team resourcing profile, with added resource to manage their content.
  • Work with senior leadership across their organisation to make sure there are sufficient technical skills to support services and deployment at scale.

Next Steps

You should follow the recommendations made in this report before arranging your next assessment.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it n/a

Updates to this page

Published 24 July 2018