Access to Work - Beta Assessment

The report from the beta assessment of the DWP’s Access to Work service on 21 July 2016.

Stage Beta
Result Met
Service provider Department for Work & Pensions (DWP)

The service met the Standard because:

  • The service team have a strong understanding of their users and their needs, and are ensuring that their digital service meets those needs and integrates effectively with the existing offline service.

  • The team is working well together, is clearly skilled and competent, and has done a great job of collaborating with key colleagues in both policy and operational delivery roles.

  • The team have fully understood the requirements of the Standard, and built their service in line with GDS design and technology principles.

About the service

Service Manager: Danny McLaughlin

Digital Leader: Kevin Cunnington

The service enables people with disabilities or long-term health conditions to claim grants for equipment and support to aid them at work. These grants may fund furniture or equipment, skilled support for communication and interpretation, or assistance for travel such as taxi fares. It is a discretionary service, supported by skilled and experienced support staff who triage applications and provide phone support and assessment of need.

Detail of the assessment

Lead Assessor: Simon Everest

User needs

As the Access to Work user researcher was not able to attend the service assessment, other team members, in particular the interaction designer, the business analyst and the product owner, combined to provide an overview of the research carried out during the private beta phase.

User needs were identified across the following user types:

  1. Advisor

  2. Employer

  3. Applicant/Employee

The service receives approximately 60,000 applications a year. One third of the applications are from new claimants and each application can result in a grant being awarded.

Based on data collected from the existing service, the most common reasons for applications are:

  1. Hearing 16%
  2. Sight 15%
  3. Dyslexia 10%
  4. Mental Health 5%
  5. Other 54%

Private beta ran from October 2015 to July 2016 (nine months). During this period, the team have used their direct telephone number as the help/support line displayed on online service pages, for people who want to speak to someone.

They have received 28 calls and have taken 2666 applications.

They intend to hand over call management to a call centre during the first month of public beta. It is critical that the service maintains simple, easy, clear support for users who are not confident using the digital channel. This is particularly important given that the service is aimed specifically at members of the public who may find completing a digital form more challenging than others do.

All team members participate in usability testing.

A specialist advisor is part of the team and participates in all show and tells, plus lab sessions.

The specialist advisor has brought not only their experience of the service, but also their knowledge of local charities and workplaces as sources for recruiting beta testers.

Research has combined lab testing and contextual research, visiting people at their place of work. Research insight is fed back not only to the service team, but also to the operations and policy teams, who sit on the service steering committee.

Both the specialist advisor and the user researcher will be remaining full-time on the team during public beta.

Team

The team were buoyant and positive, and clearly passionate and enthusiastic about their service and making it as good as possible. The team were knowledgeable about the service and subject matter, and clearly have strong relationships with both policy leads and operational staff, ensuring a broad range of business understanding. The team are empowered, and trusted to take design and development decisions supported by evidence.

Key skills and specialisms are all represented within the team, and there are good signs of collaboration and knowledge sharing across the team and their peers in other teams, which will help build capability in the organisation as a whole. A culture of knowledge exchange with other groups in DWP was apparent, and this should be recognised as a significant ‘plus’. They are demonstrating a strong grasp of agile principles, and are using appropriate tools and techniques to support this. There are some constraints within DWP on use of technology, limiting or driving the choice of tools/technology. DWP should ensure that the team have the right tools available and permission to use them, and remove any grey areas around corporate use (eg use of alternative computers or project management tools) that might slow progress.

At present, the team is heavily dependent on contractors and existing suppliers, with roughly a third of the team formed of civil servants. This is an area of slight concern, and it is important that DWP works to reduce its dependence on third parties and recruit permanent staff as the service moves towards Live.

A concern, also highlighted under the ‘design’ section, is the consideration given to positioning this service amongst a range of other initiatives, including the Employment and Support Allowance. It is important that the team takes a broad and holistic approach to presenting their service in that wider context. Users must be able to easily understand the full range of support available to them. This holistic approach may require this team, and other relevant services in DWP and elsewhere, to make significant changes to service ‘brands’ and the way they are accessed via GOV.UK.

Technology

The team demonstrated sensible ways of working with technology, and showed how they are able to iterate regularly and quickly act on what they learn from users. Whilst we applaud this approach during alpha and early beta, we look forward to seeing them replace some of their more tactical integrations with legacy systems with more robust and maintainable code as they move towards live. Further consideration of operational support is needed, so that the small team is not overburdened as more real users come to depend on their service.

Although at the time of assessment source code wasn’t publicly available, following strong feedback from the assessment panel, the team have taken rapid action to resolve this. Code is now available via GitHub, licensed as OGLv3.

Design

The team demonstrated a good setup, with both a content designer and an interaction designer supporting the two product teams, which themselves include both frontend and backend developers. Their design process is driven by user research, with findings from research sessions regularly fed back into product development. A good example is their accessibility improvements, informed by an understanding of how their demographic of users approaches such online services. By allowing letters in the phone number input, users can add descriptions to the numbers they provide (highlighting particular needs or preferences, for instance), giving a much improved user experience.
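A lenient phone field of the kind described could work along these lines; this is a minimal sketch, not the team's actual implementation (the function name, splitting rules and example entries are assumptions):

```python
import re


def parse_phone_entry(raw: str) -> tuple[str, str]:
    """Split a lenient phone entry into (number, annotation).

    The number is every digit in the entry (plus a leading '+', if any);
    the annotation is whatever descriptive text the user typed around it,
    e.g. "01234 567890 textphone only" records a need alongside the number.
    """
    number = ("+" if raw.strip().startswith("+") else "") + "".join(re.findall(r"\d", raw))
    annotation = " ".join(re.findall(r"[A-Za-z][A-Za-z']*", raw))
    return number, annotation
```

The point of the design is that validation never rejects the descriptive text: the digits are recovered for dialling, while the annotation travels with the number so an advisor sees the user's stated preference.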

The team have been making use of all available GOV.UK design resources, such as elements and the frontend toolkit. Furthermore, they have been actively involved in the wider cross-government design community, feeding their research findings and design improvements back into ongoing discussions and the design hackpad, as well as presenting their work at a recent accessibility conference.

There are, however, a few points in the service where the panel could foresee potential issues arising further down the line. As discussed in the assessment, the large text inputs are a potential pitfall. Presenting the user with a large, open-ended question such as ‘How does your condition make it harder for you to do your job?’, with no suggestion of what information is actually pertinent, opens the possibility of the user going into far too much, or too little, detail, wasting time and effort for both them and the support workers.

The team presented two new approaches to gathering this information, rather than using the large text box, for the ‘Help getting to and from work’ section, and for users with hearing impairments. The team should explore a similar approach with the other sections in the service, including the ‘Help during work’ and ‘Report a change’ sections.

There were good examples of the team simplifying content for users in the face of policy constraints, for instance separating out the ‘email terms’, keeping the information but stopping it from cluttering the user's journey. This approach could be taken further: confirmation pages could be more dynamic, based on the user's inputs. If the user asked to be contacted by a certain method (email or phone), that method should be repeated on the confirmation page, rather than the generic message stating they ‘will get a phone call or email’, which may confuse users who selected one method and not the other.

One small area of particular concern to the panel, in one of the prototypes demonstrated, was the highlighting of parts of questions in blue to indicate options previously selected by the user. The team themselves question whether there is a strong user need for this information to be highlighted at all. Regardless, the approach demonstrated breaches the accessibility principle that colour must not be the only visual means of conveying information; the contrast between the black and blue text is not strong enough for some users; and, furthermore, it confuses the visual language used everywhere else on GOV.UK, where blue text signifies a link.

The name of the service is not a clear or accurate description of what it allows the user to do: they don't access work through it, they access funding to help them work. This does not follow the principles touched upon in this blog post and will be confusing to some users. A more self-descriptive name should be created before the service goes live.

While the team could clearly describe the scope of the service and the transactions that make it up (apply, accept, change, etc.), they should continue to test and improve the user's journey between them. This should include offline sections of the process such as follow-up phone calls, correspondence letters and site visits. The team showed an understanding of these parts of the journey, but to understand the service as a whole, all parts must be rigorously tested with users.

Analytics

The team have a good understanding of data analytics, and plan to track the 4 KPIs (cost per transaction, user satisfaction, completion rate and digital take-up) in line with the Service Manual guidance.
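The four KPIs are simple ratios over counts the service already collects; the sketch below is illustrative only (the function and field names are assumptions, and real figures would come from the service's analytics and finance data, not these arguments):

```python
def kpis(starts, completions, digital_applications, total_applications,
         total_cost, satisfied_responses, survey_responses):
    """Compute the four Service Manual KPIs from raw counts.

    completion rate      = completed transactions / started transactions
    digital take-up      = digital applications / all applications
    cost per transaction = total service cost / completed transactions
    user satisfaction    = satisfied survey responses / all responses
    """
    return {
        "completion_rate": completions / starts,
        "digital_take_up": digital_applications / total_applications,
        "cost_per_transaction": total_cost / completions,
        "user_satisfaction": satisfied_responses / survey_responses,
    }
```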

During private beta the team have been making effective use of the quantitative data gathered, and this should continue and increase during public beta. The team have a data analyst available to them to support effective measurement, and we expect this to develop once the service is in public beta.

Recommendations

To pass the next assessment, the service team must:

  • Work with content designers and the GOV.UK team to test and potentially change the service name, and consider how it should work in a GOV.UK context alongside other related services. ‘Access to Work’, although it has some recognition amongst users, is vague and doesn’t adequately describe the service provided. This is a particular cause for concern with new or unfamiliar users. Whilst recognising that there is an existing Employment and Support Allowance with a slightly different focus, DWP and the Access to Work team should consider how a more integrated and holistic approach to services supporting this specific user group can be achieved.

  • Ensure that code is made publicly available under a suitable licence by default. The team have recently started doing this; it should continue and be extended so that, unless there are strong reasons not to release, all code is maintained in a publicly available form.

The service team should also:

  • Resolve the highlighting of text in content headings. If there is a genuine user need for highlighting it is important that this is provided in a clear and accessible form. We recommend that the team should collaborate with the cross-government design community to work through this issue.

  • Work with DWP as an organisation to ensure that staff have access to the right tools and technology to work effectively, and that corporate rules allow some flexibility in the choices available. It is important that the team are supported and empowered to take decisions appropriate to building a better service.

  • Begin developing a plan to replace existing legacy technology used by operational staff. It is important to ensure that staff also have a high quality user experience. Removing inflexible and costly back-end systems will enable the team to deploy a complete end-to-end service and will yield more savings, whilst boosting staff engagement.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 22 December 2016