Data collection service - alpha assessment

The report from the alpha assessment for DfE’s data collection service on 6 February 2018.

From: Central Digital and Data Office
Assessment date: 6 February 2018
Stage: Alpha
Result: Not met
Service provider: DfE (Department for Education)

To meet the Standard the service must:

  • Refocus on identifying the service’s main user needs and base their decisions on these needs. At the moment the functionality of the existing service plays too important a role in the team’s decision making process, designs, and research approach.
  • Conduct research with users with a range of digital skills and access needs. It was not clear how the helpline could be considered an appropriate support option without understanding the needs of users first.
  • Carry out research and design into how users find and access the service, and how users view and use the reports and feedback DfE provides on submitted data. How users find the service is within the stated scope for alpha.
  • Evidence that robust security, threat and fraud modelling work has been done to ensure the service is safe for users.

About the service

Description

The Data Collections Service (DCS) collects data from training providers (its users) via a secure website. The data covers all learners aged 16 and over (excluding mainstream university students) and what they are doing at each training provider (e.g. colleges, further education colleges, local authorities, large employers, private training providers). The system validates the data, computes earnings and determines the payments and allocations due from government to providers.

The Data Collection Service interfaces with other internal business areas, such as the Apprenticeship Service, and provides data to downstream systems such as the Data Warehouse. The data is converted into National Statistics, performance tables and risk assessments.

Service users

Over 2,000 organisations in England (a growing number) use the system, with about 6,000 users in total. They are generally the student administrators and data managers within the following providers: private training providers, local authorities, colleges, universities, employer providers and the National Careers Service.

The service team had done a good job of narrowing the scope to focus on users with the least experience of the Data Collection Service. This should help them meet the demand created by the Apprenticeship Levy, which is bringing new, less experienced providers into the system.

Detail

User needs

The team showed understanding of the variety of users of the service, and how their context influences how and when they need to use the service. Based on the research carried out during the discovery and alpha stages, including questionnaires, interviews, and usability testing, the team presented a set of personas to describe these users. The team decided to focus primarily on novice users, as their research has shown these users will likely face the biggest difficulties.

The panel was pleased to see that the team currently has two dedicated user researchers and that the team feels they have adopted a user research culture in which they are all involved in research activities. The team carries out regular research and iterates designs based on the findings from this work. Currently the team holds usability testing sessions twice per week.

The panel was concerned that some user needs were expressed in terms of technical solutions related to how the current and future service works or may work, rather than the underlying user needs. The functionality of the existing service plays an important role in the team’s decision making process, designs, and research approach. The team must instead focus on identifying the service’s main user needs and base their decisions on these needs. Writing the user needs in the ‘As a… I need/want/expect to… So that…’ user story format may help clarify the underlying aims and motivations (for example, ‘As a data manager at a training provider, I need to know whether my submission has been accepted, so that I can be confident funding will be paid’).

The panel was also concerned that no research has been carried out on two key parts of the service: how users find and access the service, and how users view and use the reports and feedback DfE provides on submitted data. The team has not conducted any research on the end-to-end service.

The team has not done research with users who have low digital skills or access needs. As a result, it is currently unclear how the service will meet their needs and how support will be provided. The team did present a plan to involve DfE’s Accessibility Advisor, who will test the service using different assistive technologies.

Team

The panel was pleased to see a fully formed alpha team in place. They are co-located, making it easier to collaborate, and have also made the proactive decision to sit next to the existing business-as-usual team, who work on the current service. This has led to a positive learning environment, plenty of personal interaction between the teams and knowledge sharing.

The team is made up of a mixture of civil servants and contractors. They are, however, heavily reliant on contractors and are actively recruiting civil servants to mitigate the risks associated with knowledge transfer.

The full team was involved in user research, with everyone benefitting from watching the research sessions. Agile ceremonies such as retrospectives were working well for the team; they were able to demonstrate examples of positively changing the way they raise blockers as a direct result of retrospectives.

The team is trusted and empowered to make decisions affecting their online service; however, the panel felt there were areas that should be in scope, such as the helpline, that were currently out of reach for the team and service manager. The panel was impressed to hear of the agile governance in place, which included weekly show and tells with director-level attendance every time. Working in this way keeps stakeholders involved and consulted in ongoing development. It also boosts team morale to know there is genuine interest in their work from across the organisation.

Technology

The team presented a comprehensive and deep piece of technical work that showed they had thoroughly mapped their existing service and the functionality it provides. They knew where they wanted to make improvements and had often either already tried things out or had firm ideas about where work would be needed for beta. However, care should be taken to avoid building a drop-in replacement without taking the opportunity to change the business processes around it to meet the needs of today’s users more effectively.

The front-end prototypes were built with the GOV.UK Prototype Kit, and the assessor was impressed with how it had been applied, especially the functionality that allows the app to be configured at the start and the business logic to be stepped through processing stages on demand.
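As an illustration of the pattern only (a minimal sketch, not the team’s actual prototype code – the stage names and routes here are invented), stepping business logic through processing stages on demand can be done by holding the current stage in memory and advancing it from a route in an Express-style app like the one the Prototype Kit generates:

```typescript
import express from "express";

const app = express();
app.use(express.urlencoded({ extended: true }));

// Hypothetical processing stages for a data submission; the real
// service’s stages are not named in this report.
const STAGES = [
  "uploaded",
  "validating",
  "validated",
  "funding calculated",
  "reports ready",
] as const;

// A single-participant prototype, so in-memory state is enough.
let stageIndex = 0;

// Reset the prototype before a research session.
app.post("/prototype/reset", (_req, res) => {
  stageIndex = 0;
  res.redirect("/submission/status");
});

// Advance the simulated business logic one stage on demand, so a
// researcher can walk a participant through the processing journey.
app.post("/prototype/advance", (_req, res) => {
  stageIndex = Math.min(stageIndex + 1, STAGES.length - 1);
  res.redirect("/submission/status");
});

app.get("/submission/status", (_req, res) => {
  res.send(`Current stage: ${STAGES[stageIndex]}`);
});

app.listen(3000);
```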

Great care has been taken in the selection of cloud computing technologies. The chosen platform provides DfE with value for money and exploits the department’s status as an educational body to secure advantageous pricing. The team has taken the time to understand the needs of the DevOps users of the existing IaaS platform and has specified PaaS and SaaS solutions where pain points can be alleviated. The team has identified opportunities to use the new cloud-based platform to deliver efficiencies by scaling the paid-for resources in a way that reflects the periodic changes in usage levels they have observed in the wild. They have also identified ways to reduce the large memory requirements of the existing system. The changes have been measured against alternatives, including “doing nothing”.
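The report does not name the platform or the scaling mechanism, but the underlying idea – sizing paid-for resources to a known collection cycle rather than to constant peak load – can be sketched as a scheduled job that computes a desired capacity. All dates and instance counts below are illustrative:

```typescript
// Illustrative collection calendar; the service’s real deadlines and
// instance counts are not given in this report.
interface PeakWindow {
  start: Date;
  end: Date;
  instances: number;
}

const BASELINE_INSTANCES = 2;

const peakWindows: PeakWindow[] = [
  // e.g. the days around a monthly return deadline
  { start: new Date("2018-03-01"), end: new Date("2018-03-07"), instances: 8 },
  { start: new Date("2018-04-02"), end: new Date("2018-04-08"), instances: 8 },
];

// Decide how many paid-for instances should be running right now.
export function desiredCapacity(now: Date = new Date()): number {
  const active = peakWindows.find((w) => now >= w.start && now <= w.end);
  return active ? active.instances : BASELINE_INSTANCES;
}

// A scheduler (cron, cloud scheduler, etc.) would call desiredCapacity()
// periodically and apply the result through the platform’s scaling API.
```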

No work has yet been undertaken on the report creation functionality. It is recommended that the team work to reduce the number of reports generated by understanding how the reports are used by their recipients.

Whilst the approach taken to continuous integration is practical and pragmatic, there is a large amount of complexity in the deployment and release pipeline. The release pipeline is not owned by the service team and is shared with several other teams. However, the team have negotiated well with their existing ITIL change control processes. The service team must ensure that this arrangement continues to allow them to deliver value to their users, evolving it, changing it or replacing it where appropriate.

The Individualised Learner Record (ILR) submission back-end service is a challenging system to test, but the team have put a lot of effort into building cases for each of its 600 validation rules and making them available to testers on demand. They also know a lot about the browsers their users use, as well as how many users there are.
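With a rule set of that size, a table-driven approach keeps the cases reviewable by testers. A minimal sketch of the pattern (the rule ID, fields and records below are invented for illustration, not real ILR rules):

```typescript
// One validator per rule, plus a data table of cases that testers can
// browse and run on demand. All names here are illustrative.
interface LearnerRecord {
  dateOfBirth?: string;
}

type Rule = (record: LearnerRecord) => boolean;

const rules: Record<string, Rule> = {
  DOB_01: (r) => r.dateOfBirth !== undefined,
};

interface RuleCase {
  ruleId: string;
  description: string;
  record: LearnerRecord;
  shouldPass: boolean;
}

const cases: RuleCase[] = [
  { ruleId: "DOB_01", description: "missing date of birth fails", record: {}, shouldPass: false },
  { ruleId: "DOB_01", description: "valid date of birth passes", record: { dateOfBirth: "2001-09-01" }, shouldPass: true },
];

// With ~600 rules the cases would live in data files, one per rule,
// rather than inline; the runner stays the same.
for (const c of cases) {
  const result = rules[c.ruleId](c.record);
  console.assert(result === c.shouldPass, `${c.ruleId}: ${c.description}`);
}
```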

The team appear to have made good use of code from other digital teams, but the assessor would like them to open source all of the code written for alpha, even if they will not be reusing it in the future.

Due to the magnitude of funding that the Individualised Learner Record Submission service controls and the vulnerability of some of the data subjects, the assessors would like to see a more formal presentation of the security, threat and fraud modelling that has been conducted.

The assessor also recommends working more closely with GOV.UK Notify on the service’s requirements for sending email to users.
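As a sketch of what that integration might look like using Notify’s Node client library (the template ID, personalisation fields and addresses are placeholders, not the service’s real ones):

```typescript
// Sending a submission-feedback email through GOV.UK Notify with the
// notifications-node-client library. The library ships without
// TypeScript typings, so a small declaration file may be needed.
import { NotifyClient } from "notifications-node-client";

const notifyClient = new NotifyClient(process.env.NOTIFY_API_KEY ?? "");

async function sendSubmissionReceipt(emailAddress: string, providerName: string): Promise<void> {
  await notifyClient.sendEmail(
    "TEMPLATE-ID-PLACEHOLDER", // template created in the Notify admin interface
    emailAddress,
    {
      personalisation: { provider_name: providerName },
      reference: "dcs-submission-receipt", // optional caller-supplied identifier
    }
  );
}

sendSubmissionReceipt("data.manager@example.ac.uk", "Example College").catch((err) =>
  console.error("Notify call failed", err)
);
```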

The team demonstrated that they had both technical and business contingencies in place should the service be offline. Moreover, there was evidence of a detailed and nuanced approach to the impact of a period of unavailability at different times during the collection cycle.

Design

The team has designed 5 iterations of the service, working in week-long sprints. At alpha the panel would expect bigger changes between prototypes, as service teams should be testing different flows and making substantial iterations to the service.

The team has used a mixture of design methods. Some design features are directly based on user needs or refinements from usability testing. Others have been taken from the existing service and put into prototypes to gauge the response from users. The existing system should not be the starting point for designing solutions; user needs should be. The team has a good awareness of things that aren’t working, but they should be bolder in cutting (or de-prioritising) aspects of the service that don’t have a strong need behind them. This will make the service simpler for users.

There are important parts of the service that have not yet been designed, despite being within the stated scope for alpha. It is not currently clear how a user finds and accesses the service. There is an assumption that the service needs to exist behind a secure login – this needs to be prototyped and tested with users.

The team does not yet have a user-facing name for the service. This must be worked on as a priority, as it has implications for the scope of the service and how it relates to the other data collection services that the Department for Education delivers. The service should have a verb-based name that describes the task the user is trying to complete. Deciding on the service name will help the team focus on solving one whole problem for users.

The service team is aware that there is overlap with the ‘DfE Data Exchange’ service, and it is encouraging that they have started to work together on how the two services might merge. The panel would like to see more evidence of a plan for how this would happen. There is a danger that the Data Collection Service could end up sitting within the Data Exchange service and create a portal within a portal. This would make services hard to find, and could confuse what each service is there to do. The panel recommends adding a service designer to the team (or perhaps across the two data collection projects) to help with this work.

The team were planning to use the existing helpline as the support model for the service and had no plans to change it in the first instance. As no user research had been done into the needs of users with the lowest level of digital skill, the panel could not be confident that the helpline would meet those needs. There is also an issue with the ownership of the helpline: the Data Collection Service has a responsibility to provide support for those who need it, and that support must be designed, researched and delivered in exactly the same way as the digital service – starting with user needs.

The panel did not see evidence that the service has been designed and tested to meet the needs of users with low digital skills or access needs. The service team must do this work and put in place a plan to test their support model.

Analytics

The team has registered their service with the Performance Platform.

In addition to the mandatory KPIs, they have been working on a good set of performance metrics, based on the performance framework.

The team has tried to use data to inform their work so far, in particular data from the existing helpline. As the helpline didn’t have the level of data they needed, the team spent time with advisors to get a better level of insight. The service team acknowledged that they don’t yet have influence over the data collected at the helpline, but are including helpline colleagues in their show and tells to raise awareness and start this work.

Recommendations

To pass the reassessment, the service team must:

  • Refocus on identifying the service’s main user needs and base their decisions on these needs. At the moment the functionality of the existing service plays too important a role in the team’s decision making process, designs, and research approach. The same approach must apply to the design of the service.
  • Conduct research with users with a range of digital skills and access needs. It was not clear how the helpline could be considered an appropriate support option without understanding the needs of users first.
  • Carry out research and design into how users find and access the service, and how users view and use the reports and feedback DfE provides on submitted data. How users find the service is within the stated scope for alpha.
  • Evidence that robust security, threat and fraud modelling work has been done to ensure the service is safe for users.
  • Open source all of the code written for alpha even if it will not be reused in the future.

The service team should also:

  • Avoid a “tech refresh”, preferring to re-architect around today’s user needs.
  • Work more closely with GOV.UK Notify on the requirements for sending email to users.
  • Ensure that the arrangements around the release pipeline continue to work, evolving, changing or replacing them where necessary.
  • Develop a user-facing name for the service.

Digital Service Standard points

Point Description Result
1 Understanding user needs Not met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Not met
8 Making code available as open source Not met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Not met
13 Ensuring consistency with the design and style of GOV.UK Not met
14 Encouraging digital take-up N/A
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 6 August 2018