Apply for new style employment and support allowance beta assessment report

The report for the Department for Work and Pensions' apply online for new style employment and support allowance beta assessment on 21 May 2020.

From: Government Digital Service (GDS)
Assessment date: 21/05/2020
Stage: Beta
Result: Not met
Service provider: Department for Work and Pensions (DWP)

Previous assessment reports

Service description

The service enables citizens to submit an application for ‘New Style’ Employment and Support Allowance (ESA), a benefit for people who are ill or have a health condition or disability that limits their ability to work. The information in the digital application will be used to create and make a decision on a claim in the legacy system JSAPS. Claim data will be entered onto JSAPS using a combination of DWP agent rekeying and an automated robotic process. As a result of COVID-19, DWP has experienced unprecedented levels of citizen contact through its telephone service, and there is a clear need to develop a digital service that will support those who are digitally capable to apply for New Style ESA.

Service users

Citizens who are unable to work due to illness (including COVID-19) or injury, and who have paid sufficient National Insurance contributions in the relevant two tax years before their claim.

DWP agents will use an agent UI with the service.

1. Understand user needs

Decision

The service did not meet point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is planning to engage with telephony agents to identify users of the service for ongoing user research to help understand user need

  • the team is intending to develop a discussion guide for use in interviews that will enable them to better understand user need

  • the team is intending to make more use of quantitative evidence from the current service in private beta

What the team needs to explore

Before their next assessment, the team needs to:

  • prioritise gathering existing quantitative evidence to decide whether it is beneficial or not to create a separate service route for COVID-19 users

  • conduct user research with users of the service to understand their user needs in greater depth, so they can iterate and improve the service in the coming weeks

  • focus on orientation, eligibility, the application process, and ongoing user support once an application has been submitted

2. Do ongoing user research

Decision

The service did not meet point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is intending to develop personas for low confidence users and users with accessibility needs

  • the team is intending to create and develop a discussion guide - this should provide an opportunity to gain clarity over what users find confusing or frustrating, and will be targeted at entry and exit points of the service - it will be structured around the key phases of orientation, eligibility, application and ongoing service support, and its outputs will help feed into the development of common personas across multiple services

  • the team is intending to reach out to users through third parties

What the team needs to explore

Before their next assessment, the team needs to:

  • produce a clearly defined user research plan moving forwards, and to focus on the following areas:

  • testing the COVID-19 and non-COVID-19 workflows with users

  • user research into the understanding of work capability assessment

  • the user research plan must include users with a range of accessibility and low confidence needs

  • the research plan needs to detail which external parties are being engaged, how they will engage with users and potential users of the service, and how this information will be shared with the service team

  • it needs to have a realistic appreciation of the numbers of users that can be included in user research under the current restrictions

  • the user research plan should reflect the limitations placed upon it by internal governance policies, and the delays these will bring to iterative testing and delivery

  • the research plan should include a variety of other qualitative and quantitative research methods, in addition to telephone interviews, for example surveys, targeted metrics and analytics

  • engage with the GDS standards assurance team to seek further help and guidance on how best to structure online surveys

  • develop the discussion guide, which has great potential to help improve this service and others

To support this, guidance on ‘Conducting user research while people must stay at home because of coronavirus’ was published recently in the Service Manual. In addition, a new blog post on ‘User Research and COVID-19: crowdsourcing tools and tips for remote research’ contains additional tips and tool recommendations.

3. Have a multidisciplinary team

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • a core team is in place with additional people to support the team during this accelerated launch
  • the team is working well together despite the challenges of working remotely
  • the same team will continue during beta with the ability to flex to bring on more people when necessary
  • the team is working collaboratively with policy
  • major technical decisions and choices have been approved by the design authority

4. Use agile methods

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is working in 2-week sprints with a remote daily standup, and there is also a Slack channel to help the team collaborate effectively
  • the team is using 2 kanban boards, with the design board feeding into the development board
  • the team is looking at the end to end journey

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that user research is prioritised
  • continue to collaborate with other service teams to ensure a joined up user journey for users

5. Iterate and improve frequently

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the service has been iterated since the peer review in response to user feedback
  • the team see New Style ESA as a stepping stone to a more strategic service for health benefits and have a roadmap to iterate and improve the service as part of the long term vision

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that they have a plan based on user research
  • ensure that they have been able to fully understand user needs so that they can iterate the service in response to those needs
  • continue to collaborate across different working groups to ensure that the different benefits services are joined up and that the end user can easily understand them
  • continue to work with the Job Seekers Allowance and Universal Credit teams to integrate the services as much as possible to make them easier to use and understand for the end user

6. Evaluate tools and systems

Decision

The service met point 6 of the Standard.

The new style ESA service was one of 10 services related to health issues, all running on the same “health” platform with a similar technical stack.

The applications hosted on the health platform were running on AWS EC2, with a possible future move to ECS. Citizen-facing traffic was fronted by Akamai (with DDoS protection), a router and a firewall.

Data was fed into backend services including DRS, JSAPS (payment gateway) and CAMLite to provide information for the case workers to work on. Since the peer review, additional integrations have been added to the IAG robotics service and JSAPS.

Scaling of the services required scaling of virtual machines (EC2).

Case workers did not have direct access to this service, only to the data provided by the backend service and platform. Traffic to the backend platform was protected by an SDX boundary.

The plan to move to ECS sounds reasonable. It should enhance the scalability and resource utilisation of the service.

Session timeout was set to 60 mins.

What the team has done well

The panel was impressed that:

  • the team has built the service using a microservice-based, modern cloud architecture, with queues to decouple the different components for reliability
  • there are facilities for monitoring the application and infrastructure with Grafana, logs and Dynatrace, and the team has good visibility and control of these tools - there is scope to use Dynatrace better in the future
  • the design was based on DWP technical blueprints and sets of patterns and products used within DWP
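
The queue-based decoupling noted above can be sketched minimally. This is an illustrative pattern only, not the team's actual components - the component names and claim fields are hypothetical:

```python
import queue
import threading

# A queue lets the citizen-facing component accept claims without
# waiting for the backend, so a slow backend does not block submission.
claim_queue = queue.Queue()
processed = []

def submit_claim(claim: dict) -> None:
    """Front-end component: enqueue the claim and return immediately."""
    claim_queue.put(claim)

def backend_worker() -> None:
    """Back-end component: drains the queue at its own pace."""
    while True:
        claim = claim_queue.get()
        if claim is None:  # sentinel value used to stop the worker
            break
        processed.append({**claim, "status": "recorded"})
        claim_queue.task_done()

worker = threading.Thread(target=backend_worker)
worker.start()
submit_claim({"reference": "claim-001"})
submit_claim({"reference": "claim-002"})
claim_queue.put(None)
worker.join()
print(len(processed))  # 2 claims processed
```

In a production setting the in-process queue would be a managed message queue, giving the reliability benefit the panel describes: either side can fail or scale independently without losing submitted claims.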

7. Understand security and privacy issues

Decision

The service did not meet point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team built the infrastructure and application with security in mind - measures have been put in place to protect the service with a WAF, firewall and role-based access control
  • the team used field-level encryption at the datastore supported by KMS, so data cannot easily be decrypted by developers and ops engineers
  • the team had a DPIA (data protection impact assessment) signed off for their service
  • the team used hardened AMI images with OpenSCAP
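
The KMS-backed field-level encryption mentioned above typically follows the envelope-encryption pattern: each field is encrypted with a fresh data key, and only the data key is wrapped by the master key held in KMS. The sketch below illustrates the pattern only - the XOR keystream is a toy cipher for demonstration, and a real service would call KMS and use a vetted AEAD cipher:

```python
import hashlib
import hmac
import secrets

# Stand-in for a KMS-held master key; in practice this never leaves KMS.
MASTER_KEY = secrets.token_bytes(32)

def _keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic keystream from a key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, counter.to_bytes(8, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def _xor(data: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))

def encrypt_field(plaintext: str) -> dict:
    """Encrypt one field with a fresh data key, then wrap that data key
    with the master key (the role KMS plays in a real deployment)."""
    data_key = secrets.token_bytes(32)
    return {
        "ciphertext": _xor(plaintext.encode(), data_key),
        "wrapped_key": _xor(data_key, MASTER_KEY),
    }

def decrypt_field(record: dict) -> str:
    """Unwrap the data key with the master key, then decrypt the field."""
    data_key = _xor(record["wrapped_key"], MASTER_KEY)
    return _xor(record["ciphertext"], data_key).decode()

record = encrypt_field("example sensitive field")
assert decrypt_field(record) == "example sensitive field"
```

Because developers and ops engineers only ever see the ciphertext and the wrapped key, decrypting a field requires access to the master key service, which is the property the panel highlights.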

What the team needs to explore

Before their next assessment, the team needs to:

  • arrange an ITHC (IT health check) as soon as possible - the last ITHC on the infrastructure and application was over a year ago, the service has undergone significant changes in the past year, and it is already live
  • establish a practice and process for regular IT health checks/pen tests based on DWP security policy for both the applications and the infrastructure, for example annually and after every major change, including changing hosting platforms and introducing new components to the service
  • provide a privacy notice that users can navigate to from the service
  • explore, with their design board, the value of cold data backups against risks and threats such as ransomware, and make a conscious decision on whether they are needed

8. Make all new source code open

Decision

The service did not meet point 8 of the Standard.

The team has not started the process to open source the main source code. Within DWP, there is an internal engineering process and audit to ensure code quality before code can be opened.

The team has not envisaged any reason not to open source the code for this service for other departments to reuse.

What the team has done well

The panel was impressed that:

  • the team reused proven patterns from other DWP services
  • the service was built using the DWP govuk-CASA framework
  • the source code was visible internally at DWP

What the team needs to explore

Before their next assessment, the team needs to:

  • complete the DWP open source process to make the source code open under the correct licence
  • review the bespoke components built specifically for this service - the team may find similarities between components, especially within “health”, and should consider reusing or standardising components more widely

9. Use open standards and common platforms

Decision

The service met point 9 of the Standard.

There was currently no obvious need for this service to integrate with GDS common platforms or external services.

What the team has done well

The panel was impressed that:

  • the team reused the health platform and hosting infrastructure from other proven DWP projects
  • the team was running on an existing DWP health platform on AWS rather than developing separate infrastructure
  • the service integrated with some of the existing DWP platforms via open APIs, including DRS (file system), CAMLite (customer data), JSAPS (case management) and IAG (robotics) - information about CAMLite is available within DWP

10. Test the end-to-end service

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the service had separate environments (Dev, QA, UAT, staging and prod) for different purposes
  • the team adopted continuous integration and deployment for application deployment
  • the team had conducted automated tests, both for components and integration end to end
  • the team had done adequate load-testing for the service
  • the team had a mature techops practice
  • the team had blue-green deployment to minimise downtime
  • the team understands how the service is affected by spikes, based on load tests
  • the team understands how to scale up the service when needed
  • the team has started to gain an understanding of their traffic patterns
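
The blue-green deployment approach noted above can be sketched as follows. This is an illustrative model of the technique, not the team's actual tooling - the environment names, versions and health check are hypothetical:

```python
# Two identical environments; traffic is flipped to the idle one only
# after the new version passes a health check, minimising downtime and
# keeping the old environment available for instant rollback.
environments = {
    "blue": {"version": "1.0", "healthy": True},
    "green": {"version": "0.0", "healthy": False},
}
live = "blue"

def smoke_test(env: str) -> bool:
    """Stand-in for real health checks run against the idle environment."""
    return environments[env]["version"] != "0.0"

def deploy(new_version: str) -> str:
    """Deploy to the idle environment and flip traffic only if healthy."""
    global live
    idle = "green" if live == "blue" else "blue"
    environments[idle]["version"] = new_version
    environments[idle]["healthy"] = smoke_test(idle)
    if environments[idle]["healthy"]:
        live = idle  # instant cutover; the old environment stays for rollback
    return live

print(deploy("1.1"))  # traffic is now served by "green"
```

The key design property is that the cutover is a single routing change, so users see no downtime and a bad release can be reverted by flipping the pointer back.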

11. Make a plan for being offline

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team achieved high availability by deploying microservices into different availability zones (at least 2), and applying autoscaling groups and load balancers to increase reliability
  • the team had disaster recovery planning around AZ failure
  • the team currently has an offline (phone) operation in parallel

What the team needs to explore

Before their next assessment, the team needs to:

  • consider running regular game days on recovering data and services to gain confidence and document process in emergency situations
  • understand what is required, and how long it would take, to bring up the production infrastructure from scratch

12. Make sure users succeed first time

Decision

The service did not meet point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is striving to work closely with other teams in DWP who provide complementary services, such as Universal Credit, to make the experience across the available benefits more consistent
  • the team is exploring changes in content to better meet user needs
  • the team is dealing with a considerable amount of uncertainty and change in policy and responding fast to meet requirements

What the team needs to explore

Before their next assessment, the team needs to:

  • explore the journey, which still feels broken into eligibility and application, with question repetition between the two parts - the team is working on the content to remove some of the confusion, but a better approach might be exploring a technical solution to integrate the information so users don’t have to repeat themselves
  • gather enough evidence to determine whether users need to save and come back to the service instead of completing it in one go - the team was advised to explore this topic after their alpha assessment, but so far the evidence is inconclusive: in user testing users respond that the form is quick to fill in, but feedback is coming through the live service about the need to save and continue
  • explore bringing both GP/doctor consent pages forward, before asking for the doctor’s information - and consider whether this information really needs to be collected if users deny their consent
  • explore in research the two different declarations and whether there’s potential for users getting confused between them

13. Make the user experience consistent with GOV.UK

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the service is making very good use of the GOV.UK Design System components and patterns, and has a consistent experience with GOV.UK

What the team needs to explore

Before their next assessment, the team needs to:

  • explore which domain the service will be hosted on - the service is still hosted on a DWP domain rather than GOV.UK, and if that is to be the case permanently, the service should use DWP branding rather than GOV.UK branding
  • read the content review and consider the changes proposed, especially around providing more clarity and certainty around eligibility
  • consider changing the formatting of some of the long question sentences - the GOV.UK Design System has changed its guidance on heading sizes, and the new guidance is to use size L rather than XL if the service has lots of question pages

14. Encourage everyone to use the digital service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • since the service was launched into beta following its peer review, it has moved from 100% of applications being telephony-based to 90% being submitted online
  • callers to the telephone line are also being encouraged to submit their claim online

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that a user research plan is put in place so that the service continues to be improved and everyone can use it

15. Collect performance data

Decision

The service did not meet point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is working across the organisation, in particular with the operations teams, to collect and share data appropriately to understand the end-to-end transaction
  • the team is making good use of telephony data to understand the “offline” service to create a baseline
  • the team has made proactive use of available data sources, in particular user feedback, operational management information, and volumetrics through their Grafana dashboard
  • the team has arranged for a performance analyst to join the team to develop the performance analytics approach

What the team needs to explore

Before their next assessment, the team needs to:

  • start collecting accurate performance data on user journeys to understand where users encounter issues and dropout - the team highlighted that they would be investigating use of Google Analytics and similar tools for this purpose - the goal is to collate rich user journey insights that inform and review the service roadmap
  • embed performance analytics into their development process so that iterative enhancements can be monitored against baselines to assess if they’re achieving the target outcomes defined by their performance framework
  • compare the online journey to the previous offline service to understand the differences and measure improvements and negatives
  • investigate opportunities for user segmentation to identify trends in user behaviour and service performance

16. Identify performance indicators

Decision

The service met point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is working collaboratively to develop a performance framework for the service - the new performance analyst will work with them to refine this and develop the measurement approach
  • the team has identified performance indicators covering the end-to-end service beyond the users’ online journey - this is articulated through their performance framework which links all performance indicators to the service purpose

What the team needs to explore

Before their next assessment, the team needs to:

  • further develop their performance framework to ensure that service and policy objectives are comprehensively measured to enable continuous improvement
  • consider performance indicators for the onward journey of ineligible users - the service redirects these users, when appropriate, to the correct service (such as State Pension)
  • partner with these service teams to ensure these redirects are effective and identify opportunities to improve further

17. Report performance data on the Performance Platform

Decision

Point 17 of the Standard is not currently applicable, as the GDS Performance Platform cannot build new dashboards at present and data for new services is not being collected. At service assessments, teams are still asked to demonstrate that they are collecting and using data to improve their services.

What the team has done well

The panel was impressed that:

  • there was a clear plan for measuring the core metrics and providing data to the Performance Platform - the team has engaged the GDS team and progressed this appropriately within current constraints

What the team needs to explore

Before their next assessment, the team needs to:

  • supply data for the 4 mandatory metrics so that the dashboard can be published and maintained when GDS is able to progress this

18. Test with the minister

Decision

The service did not meet point 18 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is making plans but has not yet been able to arrange testing

What the team needs to explore

Before their next assessment, the team needs to:

  • arrange for the minister to test the service
Published 4 November 2020