Apply for Blue Badge Alpha Assessment

The report from the alpha assessment for DfT Apply for Blue Badge on 16 January 2018.

Service Standard assessment report

Apply for Blue Badge

From: Central Digital and Data Office
Assessment date: 16 January 2018
Stage: Alpha
Result: Met
Service provider: Department for Transport

The service met the Standard because:

  • All members of the team have a comprehensive understanding of the user needs and complexities of the service, and work hard to address them. The team works well together despite not being in the same location most of the time, and has a good handle on managing stakeholders in both DfT and local authorities.
  • The team has explored various options for the design of the service and found a solution that makes the journey as simple as possible for users, including users for whom the ideal journey doesn’t work, and has identified opportunities to further enhance the journey by working with other parts of government.
  • The team demonstrated a comprehensive plan covering common technology platforms, open source software and data security considerations.

About the service


The Blue Badge Scheme supports people in retaining their independence. The Department for Transport is responsible for the policy and legislation governing the scheme, but the legal obligation to issue badges to eligible disabled people sits with local authorities. A new service will replace the existing one, which is currently provided by a managed service provider whose contract ends on 31 December 2018. The new service aims to reduce overall costs and workload for the local authorities while increasing benefits, and to provide a consistent, efficient service that meets the needs of citizens nationwide.

Service users

The users are disabled people and organisations who meet the eligibility criteria and wish to apply for a Blue Badge and the Local Authorities who each administer the scheme for their own residents, including confirming eligibility and issuing badges.


This is a complex service. It’s delivered by central government for 207 local authorities of different sizes, with different processes, in all parts of the UK. The local authorities are responsible both for ensuring that applications are genuine and for issuing the badges, and they have significant freedom in how they run the scheme: in England the badge costs £10, in Wales it’s free, and in Scotland it’s £20.
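The fee variation between nations can be sketched as a simple lookup. This is purely illustrative: the names and structure are assumptions for this report, not the service’s actual code.

```python
# Illustrative only: the nation-specific fees described above as a lookup table.
# Identifier names are hypothetical and not taken from the actual service.
BADGE_FEE_PENCE = {
    "england": 1000,   # £10
    "wales": 0,        # free
    "scotland": 2000,  # £20
}

def badge_fee_pence(nation: str) -> int:
    """Return the Blue Badge fee in pence for the given nation."""
    fee = BADGE_FEE_PENCE.get(nation.lower())
    if fee is None:
        raise ValueError(f"no fee configured for {nation!r}")
    return fee

print(badge_fee_pence("Wales"))  # → 0
```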

The badge itself is physical and of non-standard size, which makes manufacturing it non-trivial. Many users of the service are disabled, and a proportion of them are low on the digital inclusion scale. Some users have lifelong conditions but are still required to re-apply for the badge and provide evidence every 3 years.

Administration and enforcement present challenges of their own: many local councils opt to buy or build their own software to manage the badges, and enforcement officers are currently unable to access the information they need about badges on the street, sometimes waiting up to 6 hours for the badge holder to return to their vehicle.

The current service is a legacy one that clearly does not meet users’ needs, and has been described as ‘the most complained about DfT service’. All intellectual property for the current service belongs to the supplier.

Overall, the panel was impressed by the service team’s work so far.

User needs

The team identified a number of user groups: first-time applicants, Blue Badge holders applying to renew their badge, friends and family members, local authority issuing officers, and enforcement officers.

The team had used a variety of methods to understand their users: interviews, usability tests, observation and a variety of surveys. They had made good efforts to reach out to users with assisted digital needs and worked with charities to find these users. The team had also made good efforts to research with and understand the needs of local authority users. The research was led by a dedicated user researcher, but the panel was impressed by the extent to which all members of the team, not just the user researcher, were able to talk about users and their needs.

The team presented two citizen personas and two local authority personas, as well as high-level user needs for these personas. While the team were able to talk clearly about their users, it wasn’t always clear from the presentation how the research insights and the user needs linked together: it would be helpful for research insights to be shown next to the needs.

The team showed that they had tested various prototype designs with users and had iterated on the basis of their findings. For example, they changed the ‘walking’ questions in response to feedback to make them more relevant to someone’s specific situation and changing condition, and explored alternative ways of organising a long task.

The team gave a convincing explanation as to why they were focusing on prototyping the first-time applicant journey. However, they largely tested this with current Blue Badge holders, and could do more in the next phase to test with first-time applicants, especially given the specific needs identified in the persona.

The team had a reasonable plan for research in beta. They had identified local authorities to work with for a private beta and planned to continue a rhythm of research throughout the beta phase. They spoke about plans to test the assisted digital journey in beta, as well as understanding more about how charities are supporting users.

Team

While the panel would usually expect to see efforts from the service team to bring expertise in-house and transfer knowledge from the supplier to civil servants, in this case some separation between the delivery of the product and the department is appropriately justified by contractual matters specific to the way the Blue Badge service works between central and local government. Despite this, and the geographical dispersion between Manchester, London and Swansea, it was clear from their interactions, and the way everyone was able to speak about user needs and different aspects of the service and its iterations, that they communicate often and well and are a well-functioning agile team.

It was good to hear that prioritisation of features through discovery and alpha was based primarily on the things posing the biggest risk to delivery, informed by primary research and potential value.

The panel was impressed to hear about the team’s experience of working within a department new to agile, and how they ensured everything necessary for delivery was in place, including permissions to use new tools (such as Slack and Confluence) and adding new specialists to the team where necessary. While the plan and team for the beta stage are sound, the plans for the live stage will need to be discussed at the next assessment to ensure continuing iteration of the service will take place.

The panel was impressed by the team’s approach to show & tells and their focus on enabling remote participants to take part, which is vital considering the stakeholders of the service are all over the UK. The team appears to value challenge from the stakeholders and treats the show & tells as an opportunity to address any worries and potential blockers to adoption of the service.

It was clear that the service and product owner are empowered and have a healthy relationship with all levels of organisational governance, including at ministerial level, which they are not hesitant to challenge to achieve what is best for users.

Technology

The panel was pleased to see that the team is using the prototype kit for quick iteration to shorten the life cycle of a story, and has a workflow in place that enables them to move stories along efficiently. For private beta, the team is building the service on a technology stack they understand well, which is open source, so there is potential to reuse and share code.

The team has demonstrated a solid plan and expertise to set up continuous deployment systems, resilient and scalable cloud hosting, multiple environments, as well as performance testing and monitoring.

The panel was pleased to see the intent to use common technology platforms such as GOV.UK Verify, GOV.UK Pay and GOV.UK Notify, as well as GOV.UK PaaS, either directly implemented in the private beta phase or considered for future phases.

The team is working on establishing a threat model and potential fraud vectors, and there is an established offline support model for when their service is unavailable. They’re also working on making the service available to users and devices as per the Service Manual browser compatibility matrix.

Design

The core concept of this service – giving disabled people free access to parking – is deceptively simple. However, there are a few constraints which present some complex design challenges. They fall into the following 3 categories:

1. Eligibility and applying

The rules on exactly who is eligible for a Blue Badge are quite complex. This makes applying for a badge potentially complicated, but the team have done lots of hard work to make this as simple as possible within the current policy constraints, and utilised a variety of existing government patterns and platforms to make this possible.

The Check before you start pattern has been used well and should prevent ineligible applications. Further work should be done in beta to look at how “Check if you’re eligible for a Blue Badge” might be presented on GOV.UK as a separate service.

45% of applicants are “automatically entitled”, meaning that if they can prove they are in receipt of another specific benefit, they should get a Blue Badge. The team have explored various options for how users could prove this entitlement. The best way for these users would be if they could just ask government for a Blue Badge whilst they apply for the benefit which makes them eligible. The team have talked to the relevant people in DWP about work which hopes to make this user journey possible in the future, and in beta the team should ensure their service is able to adapt to working with DWP in this way.

The next best way for users to prove their eligibility is to give the service permission to ask DWP about their eligibility. The team have been actively engaging with the Personal Data Exchange work in GDS and understand that this might be enabled by a common platform within the next year, which would also include integrating with Verify to enable the user to give consent for the sharing of data. If this happens, the team should pay particular attention to whether users understand which part of government they are giving consent to, given the complex operating model.

The option the team are going to move forward with in private beta is uploading documents for proof of benefits. This seems like the best option currently, and the team is aware that this will cause users with low digital skills to be excluded from the online service and are prepared to do further research in beta to minimise this exclusion.

The team are also using established government design patterns for the uploading/taking a photo step.

The journey for the remaining 55% of applicants who are not ‘automatically’ eligible for a Blue Badge is much more complex, involving proving eligibility with medical records, and routing questions to ensure the users upload the right documents. Again, this seems like the best possible solution for these users and the team are aware that this will exclude some users, but have looked into the assisted digital support that local authorities will provide to make sure users still get their entitlement.
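The split between the two journeys described above can be sketched as a simple routing function. This is a hypothetical illustration only: the function, benefit names and journey labels are assumptions made for this report, not the service’s actual code.

```python
# Hypothetical sketch of the eligibility routing described above: applicants in
# receipt of a qualifying benefit are "automatically entitled" and only need to
# prove that receipt (e.g. by uploading a benefit award letter); everyone else
# is routed to the longer assessment journey with supporting medical evidence.
# The benefit list and journey names are illustrative assumptions.
from dataclasses import dataclass, field

QUALIFYING_BENEFITS = {"PIP_mobility", "DLA_higher_rate_mobility"}  # assumed examples


@dataclass
class Application:
    benefits: set = field(default_factory=set)
    evidence_uploaded: bool = False


def route(application: Application) -> str:
    """Decide which journey an applicant should follow."""
    if application.benefits & QUALIFYING_BENEFITS:
        # Automatically entitled: just needs proof of the qualifying benefit.
        if not application.evidence_uploaded:
            return "upload_proof_of_benefit"
        return "ready_for_local_authority_review"
    # Not automatically entitled: routed questions and medical evidence.
    return "further_assessment"


print(route(Application(benefits={"PIP_mobility"})))  # → upload_proof_of_benefit
print(route(Application()))                           # → further_assessment
```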

2. Enforcement and the physical badge

It was great to see that the team have explored various options for enforcement including less obvious options that are radically different from the current service such as beacons and digital badges. The team settled on sticking with physical badges because of the nature of the entitlement being for a person, who may be a passenger in someone else’s vehicle.

However, the information provided on the badges needs further exploration. Some of it, such as the user’s gender, doesn’t seem to be necessary for enforcement, and makes the application process more complex. As the Service Manual says: “You should only ask about gender or sex if you can’t deliver your service without this information.” The team should explore this further before their beta assessment.

3. Issuing (and other case management tasks)

The team have considered well the needs of the users in local authorities who need to issue and manage Blue Badges. They have built on established government design patterns to create a case management system that is simple and intuitive to use, and should save local authority case workers time. The team have thought through how the system will need to work with case workers’ current processes and other systems.

Analytics

The team has put significant thought into the success metrics, whittling them down from about 33. The current KPIs include all the mandatory ones, and the service is already on the Performance Platform. The biggest challenge is take-up by the local authorities. The team highlighted this as the biggest risk, and are working to address it through a series of face-to-face meetings and the addition of a new team member in the beta stage. One of the KPIs is local authority staff satisfaction with the service, and the team is working out how to collect it and act on it appropriately, considering the variations between different authorities.

It was disappointing to hear that analytics from the current service are not available to the team. The panel expects the department to assist the team in their discussions with the incumbent supplier so that Google Analytics can be provisioned for the currently live service. The panel expects to hear about insights from the use of the current service, as well as from the private beta of the new iteration, at the next assessment.

Recommendations

To pass the next assessment, the service team must:

  • Make sure the digital service is properly linked up to the assisted digital services provided by the council, because complex interactions within the service (uploading a document, taking a picture with a webcam) will force users who are low on digital inclusion to step out of the digital journey. Keep iterating the service to further minimise the number of people excluded from the online service because of complexity of use.

  • Test the end-to-end Assisted Digital journey as well as journeys that are part online and part offline.
  • Work with GDS Content Team on the start page and potentially splitting eligibility and application journeys (usual pattern is that those are accessed separately).
  • Obtain analytics from the current live service and talk about how they influenced decisions about the new service.
  • Test with first time Blue Badge applicants.

The service team should also:

  • Research the enforcement journey further to see which information on the badge is actually needed, and whether changes to the process are possible, so that potentially unnecessary steps like the gender question can be removed.
  • Continue working with the team within the Verify programme who are looking at making data sources like DVLA/HMPO/DWP available, so that the service can automatically check eligibility and get information rather than requiring the user to upload it.
  • Look at how they will provision user authentication for the local authority application management part of this service.

Next Steps

You should follow the recommendations made in this report before arranging your next assessment.

The panel would suggest the renewals journey is treated as a separate journey - this will depend on what is found during user research and should be discussed with the assessment manager.

Get advice and guidance

The team can get advice and guidance on the next stage of development by:

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 24 July 2018