Adult social care provider information return alpha assessment

The report from the alpha assessment for the Care Quality Commission's adult social care provider information return on 1 August 2017.

From: Government Digital Service
Assessment date: 1 August 2017
Stage: Alpha
Result: Not met
Service provider: Care Quality Commission

To meet the Standard the service should:

  • Design, test and iterate the end-to-end journey, based on the new process of continuous information updates, including the assisted digital journeys
  • Have a designer and a content designer working as part of the development team
  • Implement an adequate level of user authentication using IDAM and also explore the use of GOV.UK Verify
  • Conduct more research into the needs of different types of care providers and look at what should be built to support those needs

About the service

Description

The service collects information from providers about the quality of care they provide, to enable Care Quality Commission to monitor the quality of care between inspections.

Service users

The users of this service are adult social care providers. There are around 13,000 adult social care providers (legal entities) and 25,000 individual care locations currently registered with CQC.

Detail

User needs

The team has identified a range of care provider types, from single owner-operators to large corporate providers with hundreds of care facilities. It is not currently clear how their needs differ, or what should be built to support those differing needs.

The team has identified a range of needs for the service, but these were expressed as features of the service rather than as user needs. The service would benefit from articulating the underlying needs of the user in the personas and user stories.

The service team explained that they have not been able to identify users with accessibility needs or assisted digital requirements. This is an area the service needs to address: it is not clear what the journey is for users with assisted digital needs.

The current completion rate for the service is 75% to 85%. One reason for non-completion is that an inspection sometimes takes place before the form is due, removing the need to complete it. It would be good to be clearer about how much of the shortfall this accounts for, so that the needs of genuine non-returners can be investigated more thoroughly. The inspection teams and contact centres should be used to help understand these needs.

It is good that an inspector is working as a part of the team so that their needs are represented.

The team have an objective to move from a yearly or pre-visit information return to continuous information updates. This was not explored in the alpha and will require considerable consultation, prototyping and research with care providers, inspectors and policy teams to ensure the right information is requested without placing undue burden on care providers. It could also change the basic design of the service.

Team

Design and content design are currently covered by a mixture of team members (the user researcher, business analyst and product manager). It is crucial to have separate team members who are design specialists and content design specialists. This will help the team get to a useful, usable and understandable service more quickly.

The team are upskilling current CQC civil servants to take on several roles, including user research and development. This is good to see, but to continue their progress and learning, and keep momentum on building the service, specialists in these fields will also need to be part of the team.

Technology

The service team has built not only a mock website but also a functional prototype that mirrors it, using an open source technology stack including PHP, Drupal and Mule. They are planning to deploy the service on Azure (provided internal policies are approved), a choice mainly dictated by available licences and preference; however, the team is open to exploring AWS as an option and is empowered to switch to AWS if needed. It was reassuring to see that deployment is already done with industry standard tooling such as Docker and Kubernetes, and that there is sufficient knowledge of the environments and release process for beta. Code QA at the moment only includes unit testing, but there are plans to build automated behaviour and web interface tests.

The team follows good architecture and development principles and is looking at re-using both DH and GDS technologies and services where applicable. They are already using GOV.UK Notify and will be looking at GOV.UK Verify for their beta stage. The service is well decoupled into microservices, and RESTful APIs are used to both input and share data. We would advise the team to contact the GDS Registers team and look at possibly opening a register of all adult care providers. It is commendable that the solution is built with the potential to offer re-use to other public agencies, such as local authorities and Skills for Care.
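To illustrate the kind of re-use the report encourages, the sketch below renders provider records as newline-delimited JSON, a simple open format that other agencies could consume without custom tooling. The field names and sample records are assumptions for illustration, not the service's actual schema.

```python
import json

# Hypothetical records: field names are illustrative, not the real register schema.
providers = [
    {"provider_id": "1-000000001", "name": "Example Care Home Ltd", "locations": 3},
    {"provider_id": "1-000000002", "name": "Single Operator Care", "locations": 1},
]

def register_entries(records):
    """Render records as newline-delimited JSON (one record per line).

    Sorting keys makes the output stable, which helps consumers diff
    successive published versions of the data.
    """
    return "\n".join(json.dumps(r, sort_keys=True) for r in records)
```

A consumer only needs a JSON parser and a line reader, which is what makes formats like this attractive for sharing data with local authorities.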

Addressing security, the team have already established who their SIRO is and identified their data classification and basic threat model. The prototype is currently protected by a security token within the access URL, which would not provide a sufficient level of security to protect personal data against misuse. The team are currently looking at various IDAM options, including GOV.UK Verify. We would strongly advise having at least LOA1 authentication in place before opening the private beta.
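The weakness of a token in the access URL is that it ends up in browser history, server and proxy logs, and Referer headers. A minimal sketch, assuming nothing about the service's actual IDAM choice, of detecting that pattern and of the more conventional alternative of a Bearer credential in the Authorization header:

```python
import hmac
from urllib.parse import urlparse, parse_qs

def token_in_url(url: str) -> bool:
    """Detect the risky pattern: a credential carried in the query string,
    where it will be copied into logs and browser history."""
    return "token" in parse_qs(urlparse(url).query)

def is_authorised(headers: dict, expected_token: str) -> bool:
    """Check a Bearer credential sent in the Authorization header instead."""
    auth = headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        return False
    # compare_digest performs a constant-time comparison, avoiding a
    # timing side channel on the token check
    return hmac.compare_digest(auth[len("Bearer "):], expected_token)
```

A header-borne token is only a stopgap; the LOA1 authentication the report asks for implies proper identity proofing on top of credential transport.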

The service already handles personal data securely and steps have been taken to comply with GDPR. However, more work is needed in this area, especially in identifying the correct type of GDPR data actor and in protecting the data entered in free text fields.

The team is not coding in the open yet, but there is a desire to open the code soon and publish it to a public repository. In terms of service availability, there are already plans to build appropriate automated health monitoring and support cover. The team needs to identify a business-acceptable critical outage time and make a clear plan for both how to communicate and how to recover in the event of a technology failure. Additionally, the business has to identify all risks connected with a potential critical outage and whether any mitigations should be put in place.

Design

CQC is an independent regulator and has an exemption from implementing the GOV.UK style. However we would expect the service to be usable, accessible and understandable. The team has taken steps towards this by implementing some design patterns from the Service Manual, such as the task list. There has also been work on the wording of the questions asked. There are issues with the current frontend implementation, including button highlights and error handling.

The team has plans to break down the questions in the 5 main categories to remove the need for long free text answers, which do not necessarily give the assessors and investigators the information they need.

There are areas of the service that have not yet been explored, such as how users will sign in, user management and delegation, updating information in the service, and the interface and service for inspectors.

The team will need a plan to recruit and research the service with users with a range of access needs regularly during beta development, as well as conducting a more formal, technical accessibility audit when the codebase and service is more stable.

Analytics

The team had a clear idea of the type of analytics and key performance indicators (KPIs) they need. They have clearly thought about which KPIs and metrics to measure on both the online and offline parts of the service (such as the number of calls to the helpdesk), and have looked beyond the mandatory four KPIs.

It is encouraging that the team is already using Google Analytics and has contacted the Performance Platform team to make the necessary arrangements.

Recommendations

To pass the reassessment, the service team must:

  • Design, test and iterate the end-to-end journey, based on the new process of continuous information updates, including the assisted digital journeys
  • Have a designer and a content designer working as part of the development team
  • Implement an adequate level of user authentication using IDAM and also explore the use of GOV.UK Verify
  • Conduct more research into the needs of different types of care providers and look at what should be built to support those needs

The service team should also:

  • Have a developer who specialises in frontend development to integrate the GOV.UK frontend toolkit or improve the quality of the frontend codebase
  • Investigate how care providers will give continuous information updates and change the service design accordingly
  • Start planning how to recruit and research with users with access needs during beta development
  • Ensure all technical knowledge is transferred from the team in Alpha to the Beta team
  • Work with GDS Registers to explore creating a public register of adult care providers
  • Work with GDS PaaS team to explore hosting options
  • Identify all third party integration points and establish what data is shared
  • Contact other government services and bodies already collecting similar data from the public and try to establish ways to share this data using open formats

Next Steps

In order for the service to continue to the next phase of development it must meet the Standard. The service must be re-assessed against the criteria not met at this assessment.

Please contact the Service Assessment team at least 4 weeks before the date you’d like to hold a reassessment.

Get advice and guidance

The team can get advice and guidance on the next stage of development by:

Digital Service Standard points

Point Description Result
1 Understanding user needs Not met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Not met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Not met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Not met
13 Ensuring consistency with the design and style of GOV.UK n/a
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 24 July 2018