DfE Data Exchange - alpha

The report from the alpha assessment for DfE's Data Exchange on 14 February 2017.

From: Government Digital Service
Assessment date: 14 February 2017
Stage: Alpha
Result: Not met
Service provider: Department for Education

To meet the Standard the service should:

  • conduct further user research with users in schools to gain a better understanding of their end-to-end experience, including their existing tools
  • ensure the team is both sustainable and includes appropriate representation from key disciplines; in particular, a service designer must be in place to work alongside the user researcher and service manager to ensure developments meet clearly understood user needs
  • work with the GDS Open Standards team to ensure they have appropriate evidence for choosing not to adopt an existing standard

About the service


The team are developing three services concurrently. They’re building a developer hub to allow software developers to register and request credentials to DfE APIs. They are also building 2 API-only services to:

  • allow school administrators, via third party software, to submit pupil data to DfE at least once a month without manual processing
  • allow third party software to communicate pupil information with other third party software via a DfE messaging platform

Service users

The users of these services are third party software developers and end users of third party software used in schools.


User needs

It is clear that the service team wants to create a service that meets their users’ needs. The team was able to evidence strong discovery-style research with a representative variety and number of users, and has a sound initial understanding of user needs. Their identification of a broad range of users and stakeholders points to a well-rounded understanding of the service’s context.

The team should also be commended for their decision to make a major change to a core element of the service based on feedback from MIS supplier users. This is not an easy decision to make and once again demonstrates a desire to ensure the service meets user needs.

However, the user research done to date is not representative of the level of user research required for the alpha stage of a service. In particular, the team is at risk of not meeting user needs in the design of their service by including end users too late in the process. There has been no engagement to date with end users on the ideas and prototypes that have resulted from discovery research. Going forward, the team will need to ensure that these engagements happen soon and focus on understanding how the ideas work in the end users’ real context, rather than gathering opinion-based feedback. This could be achieved by observing users undertake tasks using a prototype, covering both focused parts of the service and its end-to-end experience.

The team should also take some time to better understand the difference between user needs and feature-oriented user stories. As mentioned above, the team does appear to understand its users’ needs, but when talking about user needs they often expressed them as user stories. There is guidance on the difference between user needs and user stories, and how to translate one into the other, in the ‘Start by learning user needs’ section of the Service Manual.

At this stage of the project, we would expect to see some more explicit thinking around access needs and assisted digital users. The team were able to explain the kinds of assisted digital support available to current users and clearly have a strong understanding of what is available to them. However, explicit assisted digital user stories and a plan around how this experience will grow as the service changes would be highly beneficial.

The team was able to identify some gaps in their user research, the kind of research that they would like to do next, and the need for a dedicated user researcher. They clearly have the intention to continue doing user research and do it better. However, a defined plan for future user research was not articulated.


Team

The panel were impressed by the knowledge the team had developed about the service, under a clearly empowered SRO/Service Manager. The team has included the majority of key roles, with civil servants in product and service management roles, whilst technical and development specialists have been brought in on short-term contracts.

The team has recently shrunk substantially, and there is a significant risk that knowledge developed during the alpha will be lost. Having more core skills available from permanent DfE staff, with measures in place to upskill or transfer knowledge from temporary to permanent staff wherever possible, would offer greater resilience throughout DfE digital projects. Where this isn’t an option, the contracting arrangements should aim to reduce abrupt starts and stops in development and achieve a gradual handover between contracted staff, to avoid losing knowledge and expertise unnecessarily.

The team has been employing agile practices and made good progress in developing and improving ways of working during alpha. It’s clear that remote working has been challenging. Whilst improvements were made during alpha, DfE should strive to co-locate all team members, as this will assist with collaboration, knowledge-sharing and transfer.

Based on the complexity of this service the panel recommends involving a service designer, but you may also want to involve an interaction designer.


Technology

The team have focused more on technology decisions in this alpha than we’d normally expect, though in this case it’s appropriate.

The team started with an assumption that they’d use the SIF standard for data exchange. Work with software suppliers has shown the SIF standard introduces unnecessary complexity and doesn’t meet developer user needs. The team propose to take a simpler approach using RESTful APIs. The panel are content this decision has been well considered, but the team should validate it with the GDS Open Standards team to ensure they have appropriate evidence for choosing not to adopt an existing standard.
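As an illustration only, a simpler RESTful approach might look like the sketch below. The endpoint path, field names, and validation rules here are hypothetical, not taken from the team’s actual API design; the point is that a plain JSON-over-HTTP submission avoids the envelope and schema complexity of a full SIF implementation.

```python
import json

# Hypothetical sketch of a RESTful pupil-data submission.
# Endpoint and field names are illustrative, not the team's design:
# a school's software would POST a plain JSON body to a resource path,
# rather than wrapping records in a SIF envelope.

REQUIRED_FIELDS = {"upn", "surname", "date_of_birth"}  # illustrative minimum


def build_submission(school_urn, pupils):
    """Validate pupil records and describe the POST request to make."""
    for pupil in pupils:
        missing = REQUIRED_FIELDS - pupil.keys()
        if missing:
            raise ValueError(f"pupil record missing fields: {sorted(missing)}")
    return {
        "method": "POST",
        "path": f"/schools/{school_urn}/pupils",
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"pupils": pupils}),
    }
```

A supplier integration would then only need an HTTP client and a JSON serialiser, which is the kind of developer experience the team’s research suggested SIF could not offer.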

The team intend to develop in public cloud in beta, are aware of lock-in risks and described intended mitigations. The team explained they are very conscious of privacy concerns and so are limiting the details captured and stored by DfE and frequency of collection from the school management information systems. This helps to mitigate privacy concerns in relation to the data retained by DfE.
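The data-minimisation approach the team described could be sketched as an explicit allow-list applied to records pulled from a school management information system. The field names below are hypothetical, not the team’s actual data model:

```python
# Hypothetical sketch of limiting the pupil details captured by DfE:
# only an explicit allow-list of fields is retained from each record,
# so sensitive attributes are dropped before anything is stored.
# Field names are illustrative, not the team's actual data model.

RETAINED_FIELDS = ("upn", "surname", "forename", "date_of_birth")


def minimise(pupil_record):
    """Keep only the allow-listed fields; drop everything else."""
    return {k: v for k, v in pupil_record.items() if k in RETAINED_FIELDS}
```

An allow-list (rather than a block-list) fails safe: any new field a supplier adds is excluded by default until DfE explicitly decides to retain it.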

However, the data model the team is developing contains sensitive information about pupils and therefore carries substantial privacy risks. As the service will facilitate data exchange between institutions that includes this sensitive data, the team should ensure the design and assurance of this component considers the risks.


Design

There are a few aspects to what’s being assessed:

  1. The end-to-end service for school administrators, in which they are required to submit data to DfE
  2. The API, which once provided to the private sector developers of schools’ apps, aims to enable the above service to be far quicker for school administrators
  3. Other potential uses of the API and data standard, which include schools sharing data with each other, with local authorities, and between apps

The briefing material seemed to focus on points 2 and 3 of the aspects listed above, but during the assessment the team talked more about the service of submitting data to DfE, which appears to be the main driver for the API platform.

The panel recommends focusing more on the end-to-end service that is being improved. That service should be called something like “Send pupil data to the Department for Education”, following the naming your service pattern. This end service should have its own multidisciplinary service team, could be a service on GOV.UK, and should come in for its own service assessment.

Prototype of end service

For alpha, the panel expect to see a prototype of an end-to-end service. The prototype needs to have gone through enough usability testing and design iterations for the team to be confident it’s the right thing to build in beta. In this case, a lot of the front end for the end service will be built by suppliers and procured by schools. However, we would still expect to see a prototype showing a realistic representation of the end service, and for that prototype to be iterated on during alpha.

The team did show some evidence of work on the end service prototype but it was not a realistic representation of what the end user would use and it had not been iterated. The team also presented a video showing an idea which they explained would not be taken through to Beta but this was mainly to demonstrate a change in direction they had taken away from the SIF open data standard.

To pass an Alpha assessment, the panel expect the team to demonstrate their understanding of how user needs can be met by a simple and intuitive service so that users succeed first time. Front end prototypes that have been iterated help to represent the team’s understanding of the user journey and how to best meet user needs. Although the team will not be building the school management information systems that schools use to collect, maintain, and send the data, the team should have a good understanding of what an interface to meet the user needs could look like and be able to provide recommendations to suppliers (in a fair and open way).

The team proposed working with a couple of software suppliers to develop a prototype in beta. This approach presents a number of challenges, including cost and investment, ownership, the relevance of one supplier’s prototype to another, and treating suppliers equally. The panel recommend the team develop an agnostic prototype that, along with user needs and learning from research, can be shared with all school software suppliers.

API Library

The API documentation seems well organised, and it is clear that a lot of thought has gone into making it intuitive for developers to use. However, the documentation needs to go through usability testing with its users. We recommend observing a developer trying to use the documentation to complete a common task, like submitting some data via the API. In addition, the team mentioned that the library is intended for publishing on GOV.UK. To pass an alpha, we expect to see a prototype which uses the GOV.UK style and design. The team should also keep an eye on blogs about developer documentation.

DfE Data Exchange Portal

In general, GDS recommends against building portals. This is because they usually group together services which have different user needs, with no user need for them to be grouped. This makes accessing any one of the services harder, as the user has to avoid navigating to the wrong one. Many portals also group services with different users.

The above is true for the portal presented in this assessment. The tasks that users can do within the portal should be thought through and presented in the context of the user’s journey, and services should be built within that context. More understanding of that journey is likely needed, but it looks like these services should be delivered through GOV.UK digital services and through the software built by suppliers. Treating the three things demoed at the assessment as separate services will likely remove the need for a portal.


Analytics

At this early stage the team showed encouraging signs of a positive and considered approach to identifying appropriate measures and understanding what success would look like. Key targets include retiring and simplifying existing technology currently in use in DfE, whilst addressing the challenges users face, particularly duplicated effort and poor quality services. They are actively investigating the use of analytics tools to deliver better insight and to create feedback loops that will improve data quality.


Recommendations

To pass the reassessment, the service team must:

  • Test a few iterations of the portal prototype with end users of the service. This should be done in the context of the service as a whole, to identify where potential issues could arise. The parts of the service that are not directly designed by the service team, such as software suppliers’ interfaces, can be represented by lower fidelity prototypes. Best practices for testing need to be followed, including minimal context setting, looking out for what works rather than finding out what users like or dislike, and using proper moderation techniques.
  • Create a solid plan for user research, including a determined methodology, a dedicated resource, and a plan for addressing changes to the assisted digital journey.
  • Ensure that the team is sustainable, all relevant roles are in place, and that the team is able to collaborate closely throughout the development phase. There is a complex mix of development requirements, and it may be better to prioritise a single service out of the three currently in progress rather than stretch a small team across the breadth of development, scaling up as necessary and when capability is in place.

The service team should also:

  • Read up on the purpose of user needs and reshape the identified user needs of the service using this increased knowledge. The team should be able to more explicitly link how the user needs have led the design of the service.
  • Continue discussions and deliberations on open standards, including working with contacts in GDS. Highlight to GDS’s data and registers teams the challenges around addresses and their impact on the effectiveness of the NPD, to investigate potential for collaboration on improved address data.

Next Steps

In order for the service to continue to the next phase of development it must meet the Standard. The service must be re-assessed against the criteria not met at this assessment.

Please contact the Service Assessment team at least 4 weeks before the date you’d like to hold a reassessment.

Get advice and guidance

The team can get advice and guidance on the next stage of development from the Service Assessment team at GDS.

Digital Service Standard points

Point Description Result
1 Understanding user needs Not met
2 Improving the service based on user research and usability testing Not met
3 Having a sustainable, multidisciplinary team in place Not met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Not met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Not met
13 Ensuring consistency with the design and style of GOV.UK Not met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform N/A
18 Testing the service with the minister responsible for it N/A
Published 24 July 2018