Service Standard assessment report

Register as a childminder

From: Central Digital and Data Office
Assessment date: 10 October 2018
Stage: Beta
Result: Met
Service provider: Ofsted

The service met the Standard because:

  • the service team had a sound understanding of their users and evidenced that they were meeting user needs
  • the service team demonstrated that they are making good progress in building digital capability across the organisation
  • the service takes advantage of continuous integration practices and deployment to multiple environments, which helps the team iterate and improve frequently while ensuring the capacity, resources and technical flexibility to do so

About the service

Description

The service is an online service which helps potential childminders apply to register. It is an enhanced replacement service which aims to tackle the very high rate of applications returned to applicants and other problems with applying.

Service users

The users of this service are potential childminders and internal admin users taking applications.

Detail

User needs

The team demonstrated a solid understanding of who their users are and their needs. Some of these users (external helpers) are also providing good insights into the needs of childminders, who are the primary users.

During the private beta, the team has carried out surveys and telephone interviews with six of the ten childminders who have completed the service. Their experience was generally positive. They also undertook interviews with two internal helpers (ARC staff) who found it far easier than their current service. Some external helpers received a video demonstration of the new service, which helped the team identify support materials they would require. Unfortunately, none of the helpers were involved with the private beta childminders; however, local authority helpers are heavily involved at the pre-application stage and some do support applicants at the application stage only.

There is a concern over the low number of private beta users. The assessors understood the impact of the time taken to complete the application and where the service fits into the bigger picture of becoming a childminder, but it still wasn’t clear why the team hadn’t been able to recruit more users. The team had tried different recruitment approaches and were preparing to open up to applicants in London.

Throughout private beta, the team has continued to test the whole journey and parts of it with users. This has been a mixture of laboratory and contextual settings. The team provided many examples of how user research had impacted the design and content of the service.

The service has not been tested with the Minister as Ofsted do not have one. The team is due to test it with their Chief Inspector imminently.

The team has undertaken an accessibility audit in this period on the childminder facing service. They have outstanding actions around colour and text to resolve. They have undertaken user research with people with dyslexia, autism, hearing and visual impairments. They are confident from their research and information from ARC that this covers most of their users’ needs. The internal facing service has not been audited or tested with users with access needs, and this should be addressed as a priority.

The childminder users predominantly have low to medium digital inclusion, with a high prevalence of dyslexia and English for Speakers of Other Languages (ESOL) needs. The team has tested the service with this group and has a proposed process for an assisted digital journey involving their Facebook group, external helpers and ARC colleagues (telephone). However, none of the private beta users required assisted digital support, and this journey should be tested as a priority moving forward.

Despite the lower digital inclusion, most childminder users are heavy users of mobile devices. The journey has been tested on the users’ own devices.

The team gave a good description of how findings from the research are used to drive forward design and iteration and shape priorities. The majority of team members have attended sessions, despite them usually taking place in the evenings.

The team talked about household members not understanding why they were being asked about their health. This needs further research and analysis to prevent delays in the childminder becoming registered.

The team has a good plan for undertaking research in public beta. They need to ensure the scenarios they have yet to test, for example assisted digital and household members with ESOL needs, are effectively identified so that research can be undertaken with these users.

Overall, the team demonstrated a great understanding of their users, used a good mix of methodologies and showed great engagement with research findings across the team.

Team

The service team demonstrated that they are fully empowered, with good support from senior staff. Senior staff are trying to implement big cultural changes, which has helped to ensure that the team remains in place and has longevity that will continue beyond public beta and into live running. The assessment team was encouraged by the plans to recruit permanent staff, but recognised that there was still some way to go to ensure the in-house development of a permanent digital skills base.

The team has a full complement of roles, including service manager, product owner, content designer, user researcher and designer, with the technical function outsourced to Informed (a third party supplier), which provides technical resource until April 2019. The technical roles include:

  • technical architect
  • service / UX designers
  • account / project manager
  • business analyst as required
  • front and back end developers
  • testers
  • support for transition planning
  • web operation/system operations
  • technical support functions including hosting support

The project has an active Senior Responsible Owner (SRO) who is plugged into the project and regularly attends the show and tells, and a new deputy director who is driving big cultural changes in Ofsted. The project started with only two permanent staff and has at some points run on a very small internal team. The team size is now much healthier and there is obvious knowledge transfer between contractors/suppliers and permanent staff. Although the developer and technical roles are contracted out to a third party supplier, the team demonstrated good knowledge sharing between the supplier and internal technical teams, which will be invaluable if and when support for the digital service is passed to Ofsted to run.

The team has a helpdesk function that is run by the internal team, ARC, who provide a telephone service for potential childminders. The service relies heavily on them and other government services, including local authorities, for assisted digital help. However, caseworkers are on hand to help childminders through the process if needed.

The team is split over two sites, Manchester (including Altrincham) and London; however, the team talk regularly through stand-ups, sprint planning, retros and three amigos (agile) sessions. It was encouraging how the team shared user research findings among themselves and out into wider communities, including design patterns. All of the team, including the developers, attend user research sessions.

The team has just finished private beta with ten users. Although it is encouraging that two users completed the journey end to end, the assessors would have liked to have seen a private beta with more users. The service team explained that they were now keen to prove the system with more users and will roll the service out in the London region to prove the capability and improve on the current system, which sees a very high rate of applications returned to applicants due to errors (100% in some months).

Technology

This is the first project in Ofsted to take advantage of continuous integration practices and deployment to multiple environments, which helps the team iterate and improve frequently while ensuring the capacity, resources and technical flexibility to do so.

The technical architecture consists of a Python/Django web application running on an Nginx web server, using PostgreSQL as a relational database and Amazon Web Services (AWS) Simple Queue Service (SQS) for message queueing. A notable architectural change following the alpha assessment recommendations is the migration from RabbitMQ to AWS SQS, which eases deployment in a high-availability/clustered scenario.
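
As an illustration of what the queue integration involves (the queue URL, region and message shape below are hypothetical, not taken from the service’s codebase), a producer can enqueue work via SQS with very little code using boto3:

```python
# Minimal sketch only: queue URL, region and payload fields are assumptions.
import json

import boto3

sqs = boto3.client("sqs", region_name="eu-west-2")


def enqueue_application_event(application_id: str, event: str) -> None:
    """Publish an application event to a (hypothetical) SQS queue."""
    sqs.send_message(
        QueueUrl="https://sqs.eu-west-2.amazonaws.com/123456789012/example-queue",
        MessageBody=json.dumps({"application_id": application_id, "event": event}),
    )
```

A consumer would poll the same queue with receive_message and remove items with delete_message once they have been processed.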

The team evaluated different tools and systems. The service currently uses Worldpay for payment processing, with a planned move to GOV.UK Pay; it has two-factor authentication using GOV.UK Notify for both email and text messaging; and it integrates with a legacy system using a Simple Object Access Protocol (SOAP) web service.
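
As a hedged sketch of the two-factor step (the API key, template ID and ‘code’ personalisation field are assumptions, not details of the actual service), a one-time code can be generated and sent by text message with the notifications-python-client library:

```python
# Minimal sketch only: API key, template ID and personalisation field are assumptions.
import secrets

from notifications_python_client.notifications import NotificationsAPIClient

notify = NotificationsAPIClient("example-api-key")


def send_login_code(phone_number: str) -> str:
    """Generate a six-digit one-time code and text it to the user via GOV.UK Notify."""
    code = f"{secrets.randbelow(1_000_000):06d}"
    notify.send_sms_notification(
        phone_number=phone_number,
        template_id="00000000-0000-0000-0000-000000000000",
        personalisation={"code": code},
    )
    # The service would store the code (ideally hashed, with an expiry) to verify the user's entry.
    return code
```

An equivalent send_email_notification call covers the email channel.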

The project uses an open source stack and is coding in the open, with all the source code publicly available.

They are monitoring infrastructure and application health at multiple levels (application, load balancer, database/storage and the queue service), detecting DDoS attacks and server login attempts, and they are using open source network intrusion detection systems.

In consultation with the Ofsted SRO and Security Authority, they assessed security risks, potential threats and fraud vectors. The service is expected to be available 24 hours a day, 365 days a year, and appropriate messages are used to inform users in the event of interruptions or outages.

Design

It was great to see convincing examples of the team iterating on the design of the service based on user research. For example, not only did they take on the recommendation to use a more standard task list pattern, but, having found issues in research, they made changes to the pattern to tackle those issues.

It is also very impressive that the team shared their findings and solutions with the wider community on the Design System backlog, so others can benefit and we can iterate the pattern for other services.

The content of the service is also clear and well designed, and we saw good examples of this being based on user research too.

Elsewhere, the service makes good use of standard GOV.UK patterns, and one thing per page.

The authentication system is unusual, but it is good to see a system try to solve the issues we often see with a username and password based system. As it is unusual, it would be good to monitor this aspect of the service specifically, to make sure it’s both usable and secure in the real world. If it continues to be successful we’d recommend sharing this pattern with the wider community, for example through blogging.

It was a shame to hear that the team had designed out a tricky ‘household members’ part of the service, only to have to put it back in at short notice. This part would have a very wide range of users and confidence levels, so it would be good to get evidence that this part of the journey works for those users and is as simple as it can be. It is possible that a web-based journey may not work for all these users and other options may need to be explored.

While not the main part of this assessment, the admin system also seemed well designed and based on user research. One observation, though, would be not to rely solely on comparing it to a worse previous system, but to make it as clear as possible on its own merits. We’d recommend trying the admin system with someone without much previous experience, to get evidence that someone new to the job could pick it up without too much training or support.

Analytics

The team has very limited access to easily available data that they can use to baseline performance and feed into building the service. The current application system does not have any digital analytics tracking capability.

Having said this, they have been able to deep dive into log-file data to provide a set of metrics and have developed a good set of KPIs. They have also used data from their call centre and ARC partners to feed into building the service, though they have acknowledged that the data from the call centre is limited in what it can provide due to their call logging processes. For example, they cannot tell if a user has called more than once.
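
As an illustration of this kind of log-file analysis (the log format, the ‘/confirmation’ path and the use of the client address as a crude session proxy are all assumptions, not the team’s actual approach), completion counts can be derived from web server access logs with a short script:

```python
# Minimal sketch only: assumes a combined-format access log and that reaching
# a '/confirmation' page marks a completed application.
import re

ACCESS_LINE = re.compile(r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+)')


def count_completions(log_path: str) -> int:
    """Count distinct clients that reached the (hypothetical) confirmation page."""
    completed = set()
    with open(log_path) as log:
        for line in log:
            match = ACCESS_LINE.match(line)
            if match and match.group("path").startswith("/confirmation"):
                completed.add(match.group("ip"))
    return len(completed)
```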

They also use information from their Facebook page to help understand issues their users are having with the service.

The team has recently managed to get 50% of a performance analyst’s time. The performance analyst will sit with the team and be actively involved in all team ceremonies.

The team is using the free version of Google Analytics to track user behaviour on the service and plans to upgrade to the premium version to bring it in line with the agency’s other services. They have not anonymised the IP address.

They have got Senior Information Risk Owner (SIRO) sign-off for using Google Analytics.

The team is working hard to determine the true cost per transaction, which is especially difficult in a save and return type of service such as this.

They have not started a conversation with the Performance Platform team at GDS, but plan to within the next few weeks and are working on developing their mandatory KPIs.

The team expects a 100% digital take up as this will be the only channel available to use this service.

There is a major concern that Personally Identifiable Information (PII) is being passed to Google Analytics within the slug for each form page. The team need to make sure that no PII is passed to the analytics package.
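
As a hedged sketch of the kind of fix required (the URL patterns below are hypothetical examples, not the service’s real slugs), the page path can be scrubbed of identifying segments before it is recorded, alongside enabling IP anonymisation in the tracking configuration:

```python
# Minimal sketch only: the patterns are hypothetical examples of slugs that
# might carry personal data; the real service's URLs may differ.
import re

PII_PATTERNS = [
    (re.compile(r"/applications/\d+"), "/applications/[id]"),
    (re.compile(r"/people/[^/]+"), "/people/[name]"),
]


def scrub_path(path: str) -> str:
    """Replace identifying URL segments with placeholders before sending to analytics."""
    for pattern, replacement in PII_PATTERNS:
        path = pattern.sub(replacement, path)
    return path
```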

Overall, given the limited availability of good data, the team has made good use of data to help build the service.

Recommendations

To pass the next assessment, the service team must:

  • audit the internal facing service and test it with users with access needs, as this has not yet been done and should be addressed as a priority
  • test the assisted digital support journey with real beta users
  • carry out more research and analysis on the household members journey
  • ensure that they anonymise the IP address in the Google Analytics tagging
  • ensure they don’t pass any sensitive data through the analytics platform
  • investigate if PII is being sent to Google Analytics and take action to prevent this
  • address the accessibility issues found at the audit and any new ones found during research with users with access needs. Examples of issues found so far:
    • document titles are generic and do not give the name of the service
    • on the task list page zooming is cutting off status tags’ content
    • Axe and Siteimprove are indicating issues with structure, links and colour contrast
  • create a realistic plan to roll the system out to more users to test the end to end journey

The service team should also:

  • produce a performance framework type of document that will help to determine the best metrics and measures needed to measure success against users’ needs, and to help use the right data to inform future iterations of the service. There is some good information in the service manual for this and a good blog post on how to run a Performance Framework workshop
  • contact the Performance Platform team to start the process of agreeing their KPIs and building their public facing service dashboard
  • document the code and explain how a team in another department can reuse it

Next steps

You should follow the recommendations made in this report as you continue to develop your service.

Submit feedback

Submit feedback about your assessment.

Get advice and guidance

The team can get advice and guidance on the next stage of development by:

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 23 January 2020