Apply to register as a childminder - alpha

The report from the alpha assessment for Ofsted’s Apply to register as a childminder service on 5 October 2017.

From: Central Digital and Data Office
Assessment date: 5 October 2017
Stage: Alpha
Result: Met
Service provider: Ofsted

The service met the Standard because:

  • As part of a wider programme of work, the scope of this service is clear.
  • The service team is working successfully in an iterative, user-centred manner.
  • The service team demonstrated an understanding of the users of the service and their pain points.

About the service

Description

The service is for people who want to submit an initial application to register as a childminder. (It is not, at this stage, considering those who want to renew, change or amend their registration.)

Service users

The users of this service are people who want to register as childminders. There is also a small number of internal users who will use the system to process applications.

Detail

User needs

The service team has worked hard to understand the user needs, having conducted a thorough discovery and extensive user testing. They used a variety of user research methods, including field work at relevant events, interviews with childminders, card sorting exercises and co-creation workshops, as well as prototype usability testing.

The team gathered high level insights initially and then created more detailed user needs and user stories. They created detailed personas from the users they’d spoken to and used these to shape their user research plan and the service design.

The user researcher has carried out a number of usability testing sessions on different prototypes, usually at the end of each sprint. The findings were regularly fed back into the service design, and the team demonstrated a couple of specific examples where this had happened.

The team has started to look at assisted digital needs, but recognised there is more to do in this area. Their main discoveries so far have been around users with English as a second language and low digital confidence. The team has been recruiting via local authorities, but has found it difficult to recruit specific users due to a lack of available data. The team's research plan for beta includes going to areas with poor broadband access and finding users with lower digital skills. Some accessibility testing has taken place, and users with dyslexia, autism and visual impairments were included in the usability testing.

Team

The service team had access to a good range of skills. The majority of roles are filled by contract staff, although there are plans to change this in some cases. The panel was slightly concerned by the ongoing absence of in-house technical skills on a service that will require substantial technical work to take it into private beta.

The service team was able to show excellent evidence of working in an agile way. This included regular ceremonies, the iteration of approaches (e.g. implementing Slack to help communication, changing their approach to estimation), a clear process for managing user stories through their lifespan, and the involvement of other areas in the work of the team (including IT, Policy and the Application, Regulation and Contact Centre (ARC)).

The panel was impressed by the way in which the team managed to overcome the issues of holding a show and tell across multiple office locations. The service team was also able to demonstrate an appropriate and responsive governance structure, including senior support for their work.

Technology

The service is a web application which performs data capture and then injects the data into an incumbent back-end system. The back end is considered outside of scope for this assessment, but there is a risk that integration with it may affect the team's ability to deliver a working solution and iterate. The team believes it has the buy-in of the relevant stakeholders to mitigate this. Nevertheless, the team should consider the risks and limitations that could arise, establish strong lines of communication early, and plan accordingly.

The panel discussed the fact that the team has not yet considered fraud or attack vectors. During the demo, it was shown that end users do not require a password to access their data: authentication is performed by a message sent to the user's mobile phone. There was some concern that this may be insufficient to mitigate the risk of data leakage. Given the sensitivity of some of the private data captured by the service, the team must consider whether this offers sufficient protection for end users and, if it does, demonstrate this with evidence. Similarly, other fraud and attack vectors must be considered to protect user data.
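
To make this concern concrete, the sketch below illustrates the kind of time-limited, attempt-limited one-time code check that SMS-based authentication generally depends on. It is a minimal illustration under stated assumptions, not a description of the team's implementation: the function names, limits and in-memory store are invented for the example, and a real service would also need per-number rate limiting and a persistent datastore.

    # Illustrative sketch only: a time-limited, attempt-limited one-time
    # code check of the kind SMS-based authentication relies on. All names
    # and parameters are hypothetical, not the service's actual code.
    import hmac
    import secrets
    import time

    CODE_TTL_SECONDS = 300   # assumed: codes expire after 5 minutes
    MAX_ATTEMPTS = 3         # assumed: lock out after 3 wrong guesses

    # In a real service this state would live in a datastore, not memory.
    _pending = {}  # phone number -> (code, issued_at, attempts)

    def issue_code(phone_number: str) -> str:
        """Generate a 6-digit code and record when it was issued."""
        code = f"{secrets.randbelow(1_000_000):06d}"
        _pending[phone_number] = (code, time.time(), 0)
        return code  # handed off to an SMS provider, not shown here

    def verify_code(phone_number: str, submitted: str) -> bool:
        """Accept a code only if it is recent, correct and under the attempt cap."""
        entry = _pending.get(phone_number)
        if entry is None:
            return False
        code, issued_at, attempts = entry
        if time.time() - issued_at > CODE_TTL_SECONDS or attempts >= MAX_ATTEMPTS:
            del _pending[phone_number]
            return False
        if hmac.compare_digest(code, submitted):  # constant-time comparison
            del _pending[phone_number]
            return True
        _pending[phone_number] = (code, issued_at, attempts + 1)
        return False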

The team are currently coding in a private repository, but have undertaken to code ‘in the open’ from day one of the beta phase. The team must demonstrate this to meet the service standard for beta.

Given the transformation work happening in parallel on the incumbent back-end system, and the potential differences in data capture and limitations on injecting data from the service into the back end, the team has taken the approach of retaining and storing information in the service and providing a back end for administrative staff to access application data. The assessment established that this would leave staff with two systems to access. Depending on the tasks carried out, there is a risk of data fragmentation and inconsistency, as well as inefficiency from having to work across two systems. Careful design may help to mitigate this, so the service team should consider it early, with engagement from the relevant stakeholders.

The service team intends to deploy the application in the public cloud; although a supplier has not yet been chosen, there is a preference for Amazon Web Services. This would allow for easier integration with other government platforms, such as Platform as a Service (PaaS), in future.

The proposed technical architecture, consisting of a Python web application deployed in Docker containers with RabbitMQ messaging middleware, is a common one. However, the agency is currently at the early stages of building its own in-house capability to support the service. The proposed architecture, while logical, will require a relatively advanced level of skill and substantial investment to support, which will have an impact on the total cost of ownership.
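
For context, the publish side of such an architecture is small in code; the operational burden referred to here sits in running the broker itself, not in the application. A minimal sketch of the pattern follows, using the pika client library, with the host, queue name and message shape invented for illustration.

    # Minimal sketch of the proposed pattern: a Python web process
    # publishing application data onto a RabbitMQ queue for later
    # injection into the back end. Host, queue name and payload shape
    # are illustrative assumptions, not the team's design.
    import json
    import pika

    connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="childminder-applications", durable=True)

    channel.basic_publish(
        exchange="",
        routing_key="childminder-applications",
        body=json.dumps({"application_id": "example-123", "status": "submitted"}),
        properties=pika.BasicProperties(delivery_mode=2),  # persist the message
    )
    connection.close()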

While the team has considered the Government's Platform as a Service option and discounted it, it has not demonstrated that it has considered alternative managed services which could deliver similar functionality. In Amazon Web Services in particular (the team's preferred cloud hosting option) there are proven managed services available at low cost and requiring less complicated tooling to support. For instance, RabbitMQ is a notoriously difficult technology to deploy in high-availability, clustered scenarios, and there are already comparable alternatives for this use case, such as Simple Queue Service (SQS) or Simple Notification Service (SNS). Serverless technologies (in this case Lambda) could also reduce the total cost of ownership.
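
By way of comparison, the equivalent publish against a managed queue such as SQS is similarly brief in application code, but removes broker operations entirely. The sketch below uses the boto3 library; the region, queue name and payload are illustrative assumptions.

    # The same publish expressed against Amazon SQS via boto3: the broker
    # becomes a managed service, so there is no cluster to operate.
    # Region, queue name and payload are illustrative assumptions.
    import json
    import boto3

    sqs = boto3.client("sqs", region_name="eu-west-2")
    queue_url = sqs.get_queue_url(QueueName="childminder-applications")["QueueUrl"]

    sqs.send_message(
        QueueUrl=queue_url,
        MessageBody=json.dumps({"application_id": "example-123", "status": "submitted"}),
    )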

The team must consider these alternatives early in the next phase before committing to a particular technology choice, in conjunction with the development of a plan for how they will support the service going forward.

The team identified a varied landscape of devices which end users may use to access the service. While the team has identified that it wishes to improve compatibility compared with the incumbent service, it will need to demonstrate a comprehensive testing strategy, which may include a device lab and assisted digital testing.

Design

The team demonstrated how they had tested different approaches for multiple aspects of the service. The team showed they had iterated the design from a linear flow to a variant of the task list pattern, based on user research. They had also tested several iterations of the GOV.UK start page.

The service includes multiple screens of guidance for first-time users. The team explained this guidance was added to the service because their research revealed that users wanted guidance at the point of use. However, as this guidance is shown at the front of the service, and the application process can span several days or weeks, it is not in fact provided at the point of use. Because the guidance sits behind the start page rather than on GOV.UK, it is also unavailable to users who have not yet decided to start an application, and it will not appear in search engine results.

The team described how the current guidance on GOV.UK is not working for their users and how some points are incorrect. The panel recommends the team liaise with the GOV.UK content team to improve this content collaboratively and to explore journeys from GOV.UK more thoroughly with users, including testing improved guidance.

The service uses the GOV.UK design patterns, although in some places it deviates from them. On the guidance pages the service uses different coloured background panels to split up the information. The team explained this was to help dyslexic users with contrast issues; however, many dyslexic users will override a website's colours themselves, so they may never see this change. The panel felt this wasn't the best solution, and alternatives should continue to be explored.

The service also uses a variation of the task list pattern. The team explained they had tried alternatives to the pattern from the Service Manual because the numbering and statuses didn't fit the service. This may test well, but it isn't in keeping with the GOV.UK visual style. The panel recommends that the team try to improve the text-based task list, experimenting with different wording and proving that it doesn't work before resorting to graphic elements. This research should be shared with GDS on the Design Patterns Dropbox Paper so the pattern can be improved.

The panel was pleased to see the team had accessibility in mind while designing the alpha and had already undertaken some user research with users of screen readers.

Analytics

The team had thought through in detail the KPIs they want to use for the next phase of the service, and were able to show how they would collect them.

Recommendations

To pass the next assessment, the service team must:

  • Prioritise work on the technical integration of this service with the existing legacy infrastructure. The service team should share a copy of the detailed architecture of this approach with GDS when they finalise it.
  • Return to GDS with details of further plans if the service team wishes to exceed a figure of 200 invited users during the Private Beta phase.
  • Develop a full forward plan for development of the service. This should detail the number of users at each stage up to and including live, along with the technical and integration approach to handle the increasing usage of the new service. (NB: It would not usually be the recommended approach to turn off the existing service early in Public Beta, so thought should be given to what alternative routes into the service will exist if this plan does go ahead.)
  • Continue user research, and particularly 1-to-1 sessions based on usage of the prototype service. These should include sessions outside of London and the panel would encourage the service team in their plans to go to areas of England where there are currently low levels of childcare (e.g. Cumbria).
  • Use the Private Beta period to evaluate the value of the inclusion of health details, references and the specific duplicate DBS details that users are currently asked to supply. This evaluation should be used to determine whether there is a user need for these details to be included in this service and, if so, to what extent details are required.
  • Consider the models for managing digital services in government and, in particular, evaluate the different options (including entirely outsourced and entirely permanent teams) for maintaining ongoing services from a technical standpoint.
  • Work with content designers on GOV.UK to collaboratively improve the guidance and test journeys that start at Google and lead through GOV.UK to the service.
  • Create a plan for assisted digital research, showing how this will be supported.
  • As part of Private Beta, create a support model that considers the needs of users with low digital skills.
  • Undertake a full accessibility audit before launching a private beta.
  • Follow the recommendations in the design review which will follow this report.

The service team should also:

  • Look into the Home Office’s Passport beta to see how they managed users uploading photos from their phones to the service.
  • Consider and test the ideal intervals for proactively updating applicants on the status of their application.
  • Consider and test the inclusion of contact/help details and information on house inspections at the beginning of the flow as well as at the end of the flow.
  • Determine their technical approach to address databases, so that the multiple uses of postcodes in the user flow are handled consistently.
  • Work on possible approaches for applicants who have people in their household who could turn 16 during the application period. (This would currently block the application, as the person would become old enough to require a separate criminal record check during that period.)
  • Continue their good work in involving a wide variety of government and non-government (including subscription member organisations) stakeholders in creating detailed communication plans for the public beta phase.
  • Evaluate whether there is scope to eliminate the issues around mismatches with DBS record details by checking these automatically.
  • Ensure that their testing approach includes mobile devices, tablets and multiple browsers/operating systems.
  • Develop analytics provision so that patterns (e.g. a strong seasonal flow at the beginning of the school year) are better understood.
  • Create a plan for what happens in the event of the service being offline (for example, transfer people to the existing system if the service is offline, or predicted to be offline for a set period of time).
  • Continue conversations with the Government Platform as a Service and GOV.UK Pay teams.
  • Encourage Ofsted to continue iterating their approach to funding digital projects to ensure that future phases of this service can be appropriately supported.
  • Undertake testing with the staff handling the application process through the service as well as with external users.

Next steps

You should follow the recommendations made in this report before arranging your next assessment.

Get advice and guidance

The team can get advice and guidance on the next stage of development by:

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Not met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Not met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Not met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it N/A
Published 30 July 2018