Apply for ESA - Alpha Assessment

The report from the alpha assessment of the DWP's Apply for ESA (Employment and Support Allowance) service on 5 July 2016.

Stage: Alpha
Result: Met
Service provider: Department for Work & Pensions (DWP)

The service met the Standard because:

  • The alpha has been developed to a good standard, in accordance with the service design manual.

  • A strong agile team is in place and is demonstrating a clear focus on building a service around understanding and meeting user needs, with continuous iteration and a focus on improving the end-to-end user journey.

About the service

Service Manager: A.Tovstevin

Digital Leader: K. Cunnington

The service provides a digital route for users who have a limited capability for work to apply for Employment and Support Allowance (ESA).

The digital channel will initially mirror the telephony ESA claim capture, providing an improved service for customers and faster processing within DWP. The service will support new claims for all ESA claim types pending the ramp-up of Universal Credit, at which point the ESA service will continue to support ESA contributions cases (approx. 250k p.a.). Future iterations will see the inclusion of Health Data Gather and the removal of current paper processes to further improve the user journey.

Detail of the assessment

Lead Assessor: J. dos Remedios

User research

The panel were impressed with the approach, focus and level of user research undertaken during the alpha phase, along with the presentation of a detailed plan covering each future fortnightly sprint for the coming months.

A significant amount of data gathering and analysis was undertaken for the alpha. User research to date has included a wide breadth of users and those representing and assisting them, including people from charities and the advice sector. The team demonstrated multiple iterations of the prototype, each improving the user journey in response to user research.

Given some of the users will be diagnosed with terminal illnesses and will be making an application under difficult circumstances, the team’s recognition of these needs and ongoing work to ensure that these are researched and met was exemplary and very encouraging to see. Likewise, the team’s involvement of agents (DWP staff administering these claims) was very positive and will add significant value in improving the user journey and overall service. The team also referred to a number of improvements already made during the alpha phase, with the removal of a number of questions that will ensure that the digital route provides a more effective and faster service.

The team also demonstrated clear plans for research going forward, with research planned up until the end of September. This research will include testing with more users who have accessibility needs, and will involve conducting interviews in a lab and contextual research in participants' homes, job centres, and other intermediaries.


Team

A multidisciplinary team is in place and there are no skills gaps or issues moving into the beta. The team have also secured a separate web ops resource, in addition to the centralised team, in recognition of their specific needs and to ensure velocity isn't impeded. In-house capability development and knowledge transfer are ongoing, and both the service and product managers demonstrated a very clear vision for the service, along with detailed knowledge of the service and user needs. The team's user research was also strong, and it was clear that there was a good understanding across the key roles of how to develop and improve the service, with agreement on the important stories in the backlog and the priorities for the next sprints.

The service development is heavily dependent on policy changes, and the team are working closely with policy colleagues, who are embedded in the team; this was noted as working well and as a positive model. Likewise, there are a number of strong links and overlaps with other DWP services, and the team made numerous references to points of existing and future collaboration. This included reusing patterns and microservices, which was very encouraging and underlines the good work being done both by the service and across the wider DWP digital community.


Technology

The team have commenced the build for the private beta of the ESA system, using a sensibly chosen tech stack. They understand that the data they will capture is sensitive and are looking at minimising the lifetime of that data in their systems. Additionally, they have a managed entry to the system to minimise the risk of attempted fraud and to limit the potential impact on the back office process of a sudden influx of applications.
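The point about minimising the lifetime of captured data can be illustrated with a small retention-job sketch. The table name (`claims`), the `submitted_at` column and the 30-day window are illustrative assumptions, not details of the team's actual system:

```python
# Sketch of a scheduled retention job that deletes claim records older
# than a fixed window. Names and the window are assumed, not the team's.
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 30  # assumed policy figure

def purge_expired(conn, now=None):
    """Delete claim records past the retention window; return rows removed."""
    now = now or datetime.now(timezone.utc)
    cutoff = (now - timedelta(days=RETENTION_DAYS)).isoformat()
    # ISO 8601 UTC timestamps compare correctly as strings.
    cur = conn.execute("DELETE FROM claims WHERE submitted_at < ?", (cutoff,))
    conn.commit()
    return cur.rowcount
```

Run regularly (for example from a cron job), a purge like this keeps the window of exposure for sensitive claim data bounded even if individual records are never explicitly closed.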

They have also built an automated pipeline to test and deploy their application. This is a strong outcome from the alpha.

The panel felt that the terminology being used to describe the security requirements of the system was outdated (see Government Security Classifications, April 2014), and there is a concern that this has led to an architecture that is more complex, and perhaps more expensive, than required for a system holding OFFICIAL data.

The team had recently been expanded to include an operations engineer. This will be an essential role as the team move into private beta. The platform will need specialist support to move into a continuous deployment model, with live citizen data.

The panel were unclear about the need for PDF generation for the ESA agents. This would lead to unnecessary copies of the data on agent machines.

Finally, this code is not open source. We expect all new code to be developed in the open unless there is a very strong reason not to.


Design

The service is following the GDS design patterns and reusing proven patterns from other DWP services where possible, such as Personal Independence Payment (PIP). The prototype was well designed and provides a good basis for developing the beta. The team are engaged with both Notify and Verify and keen to explore how these may be used for the service.

The panel was concerned that the end part of the form is potentially confusing, as users still have to undertake a number of offline activities after completing the digital transaction, and the current design does not make this clear enough. Whilst the team didn't find any issues with this during user testing, the scope of the testing so far has concentrated on completion of the digital section, so it would not necessarily have exposed problems with the offline processes. The panel is concerned that once the service is handling live data in real circumstances this has the potential to be a problematic pain point, compounded by the fact that the service outcomes depend on users completing the offline activities. Given the importance of this stage in the user journey, the panel has made a recommendation to undertake more work and testing on this moving into private beta.

The panel also noted that the summary printout function was part of the edit page, and so any print would become invalid if edits were made. This should be reviewed so that the user prints the final submission (perhaps including the checklist of any offline steps that need to be taken).


Analytics

The team are already focused on the KPIs and will concentrate on reducing points of failure and drop-out rates, and on further reducing completion time. Back office case handling improvements will also be important, including quality control.



Recommendations

Although the team has undertaken a considerable amount of research so far, and has put a plan in place for research going forward, given that users need a bank account to use the ESA service it would be sensible to focus some research on users who do not have one. This will enable the team to understand the needs and barriers of this audience, and to design solutions that help them overcome these.


The service must undertake further development and user research of the end stage to ensure that the user journey is as clear as possible, so that users are clear on the next steps and what they need to do. They should look at how other services have tackled similar problems, such as the Home Office's passport renewals service. The team should also investigate digital methods of providing health evidence in order to reduce the number of offline steps. The number of private beta users must be carefully managed to show that any options and further development of the end page and next steps are testing well, and that the next steps are being followed.

The print function should be reviewed to ensure it’s available at the most appropriate point for the user.


The panel would like to see the team evolve a technical solution that manages risk under the more recent classification guidelines. The service should confirm that the data is classified at OFFICIAL and that risk mitigations around network segregation and encryption offset the need for two distinct platforms and a user-managed bridge.

Classifications are defined in the Government Security Classifications (April 2014) and the Cloud Security Principles.

CESG guidance explicitly states: “OFFICIAL and OFFICIAL-SENSITIVE are not different security classifications”; additionally, data aggregation should not lead to higher classifications. Risk should be managed equivalently.

The team must look to open source the code. Given the private GitLab instance, at a minimum a publishing solution will be needed so that the source code is publicly viewable.

As the solution uses server-side sessions in the Tomcat containers, we would also expect the team to look at how they will manage zero-downtime deploys and autoscaling.
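One common way to make in-container sessions survive rolling deploys and autoscaling is to back them with a shared session store. As a sketch only, this is the sort of Tomcat `context.xml` configuration used by the open-source memcached-session-manager; the node names and addresses are placeholders, not the service's infrastructure:

```xml
<!-- context.xml sketch: sessions are replicated to a shared memcached
     cluster, so any Tomcat instance can serve any request and individual
     containers can be replaced without logging users out. -->
<Context>
  <Manager className="de.javakaffee.web.msm.MemcachedBackupSessionManager"
           memcachedNodes="n1:cache-1.internal:11211,n2:cache-2.internal:11211"
           sticky="false"
           sessionBackupAsync="false" />
</Context>
```

Alternatives include a Redis-backed session manager or moving session state into signed client-side storage; whichever approach is taken, the key property is that a user's session must survive the loss of any single container.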

Finally, the panel would like to see further thought given to the use of PDFs and to these being stored on local machines. Minimising the egress channels for user data would be advisable given the data set, and building web interfaces to support the double-entry private beta phase is recommended.

To pass the next assessment, the service team must:

  • undertake further development and user research of the end stage to ensure that the user journey is as clear as possible, so that users are clear on the next steps and what they need to do.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 23 December 2016