Apply for an armed forces compensation or war pension payment beta assessment

Service Standard assessment report

Apply for an armed forces compensation or war pension payment

From: Central Digital & Data Office (CDDO)
Assessment date: 31/03/2022
Stage: Beta
Result: Not Met
Service provider: Ministry of Defence

Service description

The service supports users in applying for a payment from the Armed Forces Compensation Scheme (AFCS) or War Pension Scheme (WPS) through a single claim service. The schemes provide lump sum and ongoing payments to those who have an injury, illness or disability related to armed forces service. Payments can be made for any medical condition related to any aspect of a person’s past military service, in some cases as far back as World War II. AFCS makes payments for conditions related to service from 5 April 2005; WPS covers conditions due to service before then. These schemes are an important part of the Armed Forces Covenant.

Service users

This service is for:

  • aged veterans – pension-aged individuals, injured mentally or physically, who are no longer working. This includes those who served in World War II, during National Service or in subsequent years, and who have an old injury, illness or disablement they wish to claim a payment for
  • onward veterans – those of working age who have left the armed forces and are following another career or looking for a new one, but have an old injury, illness or disability they wish to claim a payment for
  • wounded serving personnel – those of working age still serving in the armed forces with an injury, illness or disability related to past service
  • professional users – the Veterans Welfare Service, back office processing teams and charitable organisations who assist users or administer the service. This also includes solicitors, those with Power of Attorney and those helping others, such as family members, to make a claim

1. Understand users and their needs

Decision

The service did not meet point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated a good understanding of their user groups, both internal and external
  • the team had used a good range of available research methods during the height of the pandemic
  • the team had a clear user research plan for this phase
  • it was clear how the team had removed a significant amount of the burden from users making claims, for example by having one service rather than two

What the team needs to explore

Before their next assessment, the team needs to act on the points below.

The aim of private beta is to show, through evidence, that the service works for everyone who needs to use it. It’s about testing the riskiest assumptions in the real world. There were several areas where this evidence was missing, and the team must address them to pass into public beta. The actions required are:

  • test the service with users who experience the full range of access needs. The team has completed some sessions, but the prevalence of additional needs in their user group is higher than in the general population. The panel therefore needs to see compelling evidence that people with a wide range of access needs can use the service
  • test the service with users who fall into the assisted digital category. The team recognises this is an area where they have not done enough. The pandemic has made this harder, but they need to find a way around it, especially now that most organisations have returned to face-to-face, contextual research
  • demonstrate a better understanding of users who get support to complete the application, and of who it is that helps them. No research has been completed with this user group, despite 35% of people declaring they had support
  • learn more from the people who complete the application outside of a user research session. The team did not have a grasp of the users coming through the service, for example how many declared they had help to complete the form. They need better mechanisms for recruiting these people into research sessions, for example an optional field in the post-completion feedback survey asking people to take part. This is common practice and, with an appropriate privacy statement, should not cause GDPR issues
  • track people through the process to ensure the measures that matter to them are met. For example, completing the application is unlikely to be someone’s goal; getting paid is. Understanding a user’s high-level needs, and measuring how well the service meets them, goes a long way towards demonstrating whether the service works

The team would also benefit from:

  • reviewing their user needs. The needs presented refer to solutions or functionality (for example, ‘complete a claim form’). These are created user needs and do not represent the real-world problems or tasks users are trying to deal with. A more helpful user need might be “as a veteran with a serious injury, I need to secure extra income so that I can support myself and my family”

  • diversifying their recruitment methodology. Relying on GOV.UK messaging and the helpline is not giving them access to the range of users they need

2. Solve a whole problem for users

Decision

The service did not meet point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team was able to talk about the entire user journey, from awareness (including the resistance some users have to admitting that they’ve been disabled and applying), through applying, to finally being paid. The live guidance pages for the Armed Forces Compensation Scheme and War Pension Scheme also have process maps of what happens after the user applies, and when
  • the team has reduced the number of times users have to provide the same information to the government, for example by switching from asking users to declare all benefits they receive to asking only about those that cannot be easily checked by back office staff
  • the team is thinking about tricky scenarios such as multiple claims

What the team needs to explore

Before their next assessment, the team needs to:

  • design and usability test journeys for how a user starts the service (and finds out that they are eligible), using the expected GOV.UK patterns. The team must find out who will be the content editor for the starting point on GOV.UK (the Ministry of Defence or the Government Digital Service) and work with the content editors to understand what the proposed content is and how it will affect the onward journey (for example, whether the team can have different entry points for starting and for save and return, or a single entry point with routing pages to direct users to a new application or an existing one). The team should also consider whether this means adding more eligibility questions, out-of-service guidance, and even communications campaigns
  • show how they’re designing and measuring across the full scope of a user’s journey - while the key transaction for users is applying for an armed forces compensation or war pension payment, the ‘whole problem’ for users is getting paid
  • continue to learn from teams working on overlapping services with their users - the team is already engaging with the Blue Badge service, the NHS and Pension Credit, but it may also be valuable to talk to teams delivering disability benefits at the Department for Work and Pensions (DWP). The service team may be able to build on existing work, for example with personas and design patterns, and even be able to simplify shared journeys

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has made improvements to back office processes, meaning that claims can be processed more quickly, for example by enabling voice dictation and making paper documents easier to scan for processing
  • the team can make changes to all elements of the user’s journey including the letters sent for progress updates and decisions
  • the team has removed the identity checking step, as recommended in the alpha assessment report, so that users can apply immediately rather than waiting 24 hours
  • special users are no longer excluded, as recommended in the alpha report, and are instead warned to get permission beforehand
  • the team is working with back office staff and making changes (as mentioned in Point 7)
  • the team will continue to support paper forms as well as the digital service

What the team needs to explore

Before their next assessment, the team needs to:

  • act on their plans to improve letters, and test them. The Service Manual has guidance and examples on writing effective letters
  • monitor if and how uploaded documents actually help with deciding and awarding claims. If they are useful, there may be value in talking about this in out-of-service guidance (more on this in Point 2). The planned work to access existing records (also mentioned in Point 2) may also help reduce the number of documents users need to provide
  • monitor whether there is a need for the online service to reflect later steps in the journey. Users will get emails or letters every 6 weeks (with future plans for text message updates), but they may also need to change some details after they have sent their claim
  • have a plan for improving the paper forms in line with the improvements made to the digital service

4. Make the service simple to use

Decision

The service did not meet point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has done a lot of work towards making things simple for users, such as allowing users to save and come back later for up to 6 months, and using patterns such as ‘check your answers’ at the end of each section in the task list to help users check information as they go
  • the team has added some features to give users choice based on research, such as allowing them to add bank details later
  • the service uses the GOV.UK Design System and mostly follows design patterns and the GOV.UK style guide

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to simplify content and, where needed, split pages into separate questions, starting with the ones with the most text on them (including when details components are expanded). For example, the declaration page combines communication preferences and a declaration, and hides crucial information in details components. There are other pages that could also be split into multiple questions, for example those with “was/is serving”. The panel was also worried that instructions throughout the service (for example on the task list page) might be symptoms of usability problems that should be addressed with better user journeys. The virtual ‘get feedback’ sessions for government designers may be helpful if the team wants advice
  • review and fix navigation issues. For example, there is no way in the service to return to the task list page. This feature may be more important than having a ‘cancel application’ link (which at present goes to a placeholder page) on every screen. Options that the team can consider include grouping primary and secondary buttons and using the header navigation. The team may also wish to consider whether there’s value in making the ‘setting up’ part of the journey distinct from the part where users can save and come back later, though this may not be appropriate for their audience. If the team wants advice on patterns, the GOV.UK Design System meetup happens 2pm to 3pm every second Friday (more details in #design-systems-community on cross-government Slack)
  • add verification or recovery options for ‘save and come back later’. Users do not confirm an email address or phone number, which means that they might type something incorrectly. If they lose access to their email or phone, or mistype one of the 3 personal data items in the setup, there is no visible way to either recover their details or get help. If the team will integrate with GOV.UK Sign In (as mentioned in Point 13) soon, they may wish to redesign this part of the journey with that in mind (and in collaboration with the GOV.UK Sign In team)
  • have a plan to fix other general GOV.UK style guide content issues, such as incorrect capitalisation of some buttons and tab titles - these are detailed in the separate content review
  • monitor whether sections where users can skip questions are causing quality and processing issues later. Robust analytics (mentioned in Point 10) will help with this. An alternative pattern is to ask users if they know certain things and only show the input fields if they do (see the sketch after this list)
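
One way to do this is the GOV.UK Design System’s ‘conditionally revealing a related question’ pattern. The sketch below is a minimal illustration using standard GOV.UK Frontend markup; the question wording and field names are invented for illustration, not taken from the service.

    <div class="govuk-form-group">
      <fieldset class="govuk-fieldset">
        <legend class="govuk-fieldset__legend">Do you know your service number?</legend>
        <div class="govuk-radios" data-module="govuk-radios">
          <div class="govuk-radios__item">
            <!-- data-aria-controls links this option to the hidden input below -->
            <input class="govuk-radios__input" id="know-number" name="know-number"
              type="radio" value="yes" data-aria-controls="conditional-know-number">
            <label class="govuk-label govuk-radios__label" for="know-number">Yes</label>
          </div>
          <!-- Revealed only when 'Yes' is selected -->
          <div class="govuk-radios__conditional govuk-radios__conditional--hidden"
            id="conditional-know-number">
            <div class="govuk-form-group">
              <label class="govuk-label" for="service-number">Service number</label>
              <input class="govuk-input govuk-!-width-one-third" id="service-number"
                name="service-number" type="text">
            </div>
          </div>
          <div class="govuk-radios__item">
            <input class="govuk-radios__input" id="know-number-2" name="know-number"
              type="radio" value="no">
            <label class="govuk-label govuk-radios__label" for="know-number-2">No</label>
          </div>
        </div>
      </fieldset>
    </div>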

5. Make sure everyone can use the service

Decision

The service did not meet point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has designed the service to allow for people to apply for themselves, people applying as a legal representative of others, and other helpers
  • the team is mindful that some elements of this service such as recalling a disabling incident may be traumatic to veterans, and is providing support such as signposting users to the urgent help for veterans service

What the team needs to explore

Before their next assessment, the team needs to:

  • show how their service meets the needs of users with access needs. While the panel recognised that the team has done research with users with access needs and with existing assisted digital channels such as the Veterans Welfare Service (who suspended home visits during the COVID-19 pandemic but are due to resume these soon), it was not clear what evidence the team had gathered in private beta to show that the service is suitable for public beta
  • publish an accessibility statement based on the results of their accessibility audit. The team had not had an accessibility audit at the time of the assessment and so did not have an accessibility statement, but an audit is scheduled for late April 2022
  • continue with their plan to improve the journey for people applying for or helping others. Analysis of expected and real numbers against risk of not improving the journey will help the team prioritise what changes they make

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is mostly made up of permanent civil servants
  • the service team has good coverage of disciplines

What the team needs to explore

Before their next assessment, the team needs to:

  • supplement the existing and planned team with an interaction designer and an additional developer
  • increase the capability of the performance analyst
  • aim to spend more co-located time together as it could increase the speed and quality of iterations

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that the team:

  • supplied evidence they’re employing agile methods to deliver the service
  • engaged with a breadth and variety of stakeholders, doing fortnightly show and tells with both internal stakeholders and back office staff
  • demonstrated an ability to push back against stakeholders to simplify the service, for example by reducing the number of questions asked of users and allowing approximate dates rather than precise ones

What the team needs to explore

Before their next assessment, the team needs to:

  • show a roadmap that prioritises meeting the needs of all users. While the team has a good understanding of the overall processes, the panel was concerned that some user groups could be missed out without a clear plan, for example disengaged users or those that are continuing to use paper
  • measure and report progress of iterations and KPIs (specific recommendations for KPIs are given in more detail in Point 10)

8. Iterate and improve frequently

Decision

The service did not meet point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team showed good evidence of iterating the service based on analytics and user research, for example by informing users that the application could take around 45 minutes to complete

What the team needs to explore

Before their next assessment, the team needs to:

  • show more examples of how the team has iterated in line with Points 2 and 3 of the Service Standard. While the iterations were clearly demonstrated within the core user journey, the team will need to show how they’ve iterated on aspects of the user’s journey that have yet to be developed, for example the start page. The team will need to ensure users can easily find the start page and move through the journey, and at the next assessment the panel would expect to see some examples of how the start journey has been iterated. Another area the team should focus on is how they’ve iterated on the accessibility findings. The findings should help them understand where accessibility may pose a challenge; the panel will expect to see how the team has tested the updates with users and iterated on them, or an explanation otherwise. Lastly, the service team should do the same with users with assisted digital needs, especially those who are hard to reach or disengaged
  • show iterations in areas that are yet to be developed, including the offline aspects of the service, as required by Point 3 of the Service Standard, ensuring that all changes are based on robust analytics and user research and are continually monitored

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has undertaken a penetration test and completed a data protection impact assessment (DPIA)
  • the service uses GOV.UK PaaS and GOV.UK Notify, which are accepted for the sensitivity of the information being processed
  • within the current process and application framework they operate in, the team has the most secure environment they can
  • there is an existing, well-known process for managing and securing the shared access email account

What the team needs to explore

Before their next assessment, the team needs to:

  • action all the recommendations from the penetration test; many of them are straightforward
  • commission another penetration test before the service goes live, given the sensitive nature of this service and the quantity of changes requested
  • monitor and audit access to the multi-user backend email account, ensuring that the correct number of emails are received and that granular information about who has accessed the account can be reported
  • ideally, put a plan in place to transition away from email to a more modern approach to sharing content with a backend service
  • confirm during public beta that the use of pass-through authentication, which lets users complete their application later, is acceptable. This has been signed off by the team’s cyber security subject matter expert and will be assessed by the MOD security accreditor

10. Define what success looks like and publish performance data

Decision

The service did not meet point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has created a performance framework for the service with KPIs and metrics linked to the identified user needs
  • the team has investigated options for analytics tracking and is currently working to implement a solution that provides the analytics data required to fully monitor the service and its usage
  • the team is aware that analytics are not yet robust enough to ensure the service is working for its users; however, they are taking steps to address this and to upskill team members in this area
  • the performance analyst has access to back office management information (MI) and is able to feed back trends and insights to the team

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to explore, agree and implement an analytics solution that allows them to track usage of the service and user interactions
  • expand the KPIs identified within the performance framework once the analytics solution is implemented, to ensure the team is tracking user interaction with the service. For example, the team is monitoring service availability; monitoring submission times will also help the team understand when the service is most used and should be available for users
  • consider the level of tracking appropriate for the service, for example tracking use of the save and return function and of other interactions (accordion expands, how many users select that they are assisting someone with the application, and so on), to ensure the service presents data and options to users in an accessible and appropriate manner while gaining further insight into users of the service and how they interact with the process
  • track the number of email and 2FA requests attempted versus those that succeed, so the team has full visibility of failure rates at each of these stages (see the sketch after this list)
  • while implementing an analytics solution, ensure they have a way to validate whether the amount of usage reported by the analytics solution is representative of their user base. This is because of cookie consent: not all users will interact with the cookie banner, and some will decline cookies. Using analytics to help support iterations to the service is essential, but the team must ensure decisions are made that benefit their representative users rather than just a subset of them
  • when a decision is made about where the service will be hosted and linked from, ensure they have access to the analytics data for this landing page, so they can obtain feedback and insights about journeys from these pages into the service
  • obtain customer feedback more often than the current weekly cycle. During private beta, more frequent feedback will allow the team to identify any pain points in a more timely manner
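
As a hedged illustration of the 2FA tracking point above, the sketch below (in PHP, the service’s language) shows one way to record attempted and successful verification events so failure rates can be reported. The AnalyticsClient class, event names and log destination are invented for illustration; they are not part of the team’s codebase.

    <?php
    // Hypothetical sketch: counting 2FA attempts and outcomes so that
    // failure rates can be reported. Everything here is illustrative.

    class AnalyticsClient
    {
        public function track(string $event): void
        {
            // A real service would send this to the agreed analytics
            // solution; here it is simply appended to a local log.
            $record = json_encode(['event' => $event, 'timestamp' => date(DATE_ATOM)]);
            file_put_contents('analytics.log', $record . PHP_EOL, FILE_APPEND);
        }
    }

    function verifyTwoFactorCode(AnalyticsClient $analytics, string $submitted, string $expected): bool
    {
        $analytics->track('2fa_attempted');
        $success = hash_equals($expected, $submitted);
        $analytics->track($success ? '2fa_succeeded' : '2fa_failed');
        return $success;
    }

Comparing the count of ‘2fa_attempted’ events with ‘2fa_succeeded’ events then gives the failure rate at this stage of the journey.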

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has used a simple architecture hosted on GOV.UK PaaS, which should make long-term maintenance and support easier
  • they are using widely known tools, platforms and languages: PHP, Postgres, S3 and Cloud Foundry
  • they are making use of government services GOV.UK PaaS and GOV.UK Notify

What the team needs to explore

Before their next assessment, the team needs to:

  • make test automation and test scripts available before go-live (see the sketch after this list)
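
A minimal, hedged sketch of what such automation could look like, assuming PHPUnit (which fits the service’s PHP stack); the URL and expected content are placeholders, not details of the real service.

    <?php
    // Hypothetical smoke test: checks that the service responds and
    // serves GOV.UK Frontend markup. The URL is a local placeholder.
    use PHPUnit\Framework\TestCase;

    final class SmokeTest extends TestCase
    {
        public function testStartPageResponds(): void
        {
            $context = stream_context_create(['http' => ['timeout' => 5]]);
            $html = @file_get_contents('http://localhost:8080/', false, $context);

            $this->assertNotFalse($html, 'service did not respond');
            $this->assertStringContainsString('govuk-', $html, 'expected GOV.UK Frontend markup');
        }
    }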

12. Make new source code open

Decision

The service did not meet point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team is sharing content publicly on GitHub

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure the content on GitHub includes a readme and installation and configuration guides (a sketch of a possible readme outline follows this list)
  • make database creation and configuration scripts available on GitHub
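
As a hedged suggestion only, a readme along the following lines would cover the points above; the section names are illustrative, not a GDS requirement.

    # Apply for an armed forces compensation or war pension payment

    ## What this repository contains
    ## Prerequisites (PHP version, PostgreSQL, Cloud Foundry CLI)
    ## Installation
    ## Configuration (environment variables, GOV.UK Notify keys)
    ## Database setup (creation and migration scripts)
    ## Running the tests
    ## Licence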

13. Use and contribute to open standards, common components and patterns

Decision

The service did not meet point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the service is built using GOV.UK Frontend and the GOV.UK Design System
  • the service makes use of the standard components GOV.UK PaaS and GOV.UK Notify

What the team needs to explore

Before their next assessment, the team needs to:

  • add clear indicators to the email provided to the backend to aid future machine readability; there is currently no easy way of distinguishing between questions and answers
  • follow a more consistent naming approach in the JSON used to store the information, ideally using either camelCase or underscores but not both
  • limit the number of levels of nesting used in their JSON
  • use the date formats recommended on GOV.UK
  • store date of birth as one date, not as separate fields for day, month and year (the sketch after this list illustrates the last three points)
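
As a hedged illustration only (the field names are invented, not taken from the service), the recommendations above imply moving from something like:

    {
      "Claimant": {
        "personal_details": {
          "DateOfBirth": { "day": "7", "month": "4", "year": "1950" }
        }
      }
    }

to a flatter structure with one naming convention and a single ISO 8601 date, the machine-readable format recommended in government API guidance:

    {
      "claimant_date_of_birth": "1950-04-07"
    }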

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team is using common GOV.UK platforms, used by a wide range of services
  • the existing approach will remain available in the case of a full system outage

What the team needs to explore

Before their next assessment, the team needs to:

  • test service outage scenarios before going live

Published 25 January 2024