Voter Card Service Alpha Assessment

The report for the Voter Card alpha assessment on 16 February 2022

Service Standard assessment report

Voter Card

From: Central Digital & Data Office (CDDO)
Assessment date: 16/03/2022
Stage: Alpha
Result: Not Met
Service provider: DLUHC

Service description

Electors will be required to identify themselves by showing an approved form of photographic identification before casting their vote in a polling station at:

  • UK Parliamentary general elections
  • local elections in England
  • local referendums in England
  • Police and Crime Commissioner elections in England and Wales

Service users

Main user groups:

  1. people aged 70 and over who are retired
  2. people aged 30 to 60, white, and not in work
  3. people with moderate or severe disabilities
  4. people who are homeless

The main user groups were identified through, and are supported by, research carried out during discovery.

Please note: Although the user research has identified specific groups of users who are most likely to need a Voter card, the Voter Card service must be made available to every British citizen who is eligible to vote and wants to receive one, regardless of whether they own a photo ID document or not.

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has done a lot of research, using several different methodologies, and with a range of users including some from each of their different target groups, despite challenges in reaching some of them
  • the team organised face to face research at a shelter with its homeless user group despite the challenges of face to face research in a pandemic and clearly learnt a lot from this
  • the team is making the most of previous research from other teams in the Electoral Integrity Programme

What the team needs to explore

Before their next assessment, the team needs to:

  • review its personas to make sure they are useful for driving design decisions in the team’s work
  • note that the Service Manual says to create “user profiles or personas that describe groups of users with similar behaviour and needs (new parent, caseworker, small business etc)”. It’s critical to get this right given the nature of the service, and the panel was concerned that the team was using demographics rather than behaviours
  • the team might want to focus instead on the barriers to accessing or using a voter card, for example as it has already done around users not having an address for a card to be sent to
  • the team may benefit from looking at the Universal Barriers work to see how this might help them, as well as learning from teams with similar challenges, such as the ‘Open up identity for all’ work on GDS’s Digital Identity programme
  • ensure it has a good articulation of the user needs of the Electoral Registration Officer (ERO), and of the administrators in Local Authority Electoral Registration Offices
  • continue to engage with and learn from other researchers and stakeholders across government and the NHS who may have similar groups of users that the team is finding hard to reach, in order to see what lessons can be learnt and how the team might continue to iterate its approach for recruiting these users
  • as identified by the team: do more face to face research sessions, including testing the paper processes, digging deeper with some of the more vulnerable and harder-to-reach users, and looking at the role of family and carers in supporting users

Additional Comments

The panel was concerned that there wasn’t sufficient ownership of the discovery research: many of the challenges raised were answered with reference to research that had been conducted by another team. The panel recommends that in future some of these discovery findings are validated so that the team can take more ownership of the work.

2. Solve a whole problem for users

Decision

The service did not meet point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has worked with policy stakeholders to understand the legislative constraints, and the service meets these constraints

What the team needs to explore

Before their next assessment, the team needs to:

  • explore fully how the user’s problem could be solved whilst meeting policy requirements - within the scope and constraints that the team explained, the problem was addressed, but for the user it was not. Consider whether naming and designing the service as ‘Check if you need a voter card’ may clarify the purpose of the card in voters’ minds, helping to remove the burden from EROs of voters applying for a card when they don’t need one
  • consider carefully, within these legislative constraints, whether there is an alternative solution, either an alternative service, or alternative form of voter card, that may solve the problem for users without valid ID for voting purposes
  • consider again whether the National Insurance number (NINO) is the best form of verifying identity - the team described this service as being designed in response to voter fraud, so the team needs to explain how they are ensuring the service is protected against fraud via the NINO channel
  • continue working with other organisations on minimising the number of times users have to provide the same information to government, for example through single sign-on such as GOV.UK Sign In
  • provide more clarity on the expiry and renewal of a voter card, and how the photo gets updated over a voter’s lifetime. The panel considers this a key point, especially given the service’s role in preventing voter fraud

Additional Comments

The panel felt, as above, that the whole problem wasn’t being considered. This is outside the scope of this team, and they did explain their policy constraints; however, the lack of ability to mandate to the EROs puts the success of this product at risk. This is a really challenging and politically sensitive service with a tight time frame, and there are no easy solutions to the problem as a whole.

3. Provide a joined-up experience across all channels

Decision

The service did not meet point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has worked closely with policy stakeholders to make prioritisation decisions

What the team needs to explore

Before their next assessment, the team needs to:

  • test all parts of the user journey, including the offline parts, such as the paper processes - work with the relevant operations staff to undertake this testing and to use this research and the data that comes out of it to improve the service
  • explain clearly how EROs and polling station staff are kept up to date with changes to the online service so that they can explain to users how it works, and show that there is a robust process in place for how changes to the online process affect the offline process, and vice versa
  • show in more detail how a clear and consistent tested route for users to find details of phone, paper or face to face channels will work - the prototype shows a link for users to get help applying at the bottom of each screen, but this journey was not available to test at the time of assessment
  • show a fully-scoped journey for users who need a replacement voter card - where they should go, what each step of the journey looks like, and the user guidance on this
  • show a fully-scoped journey for users without a fixed address or hostel address who want to apply for a voter card - the journey at the moment leads them to a dead end (they are directed to contact their local ERO, but this form requires a postcode too, and there is no assisted digital support signposted). Although this journey is an edge case, it should not be dismissed, to avoid disenfranchising these users

4. Make the service simple to use

Decision

The service did not meet point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has used the GOV.UK Design System as a reference for the design components being used across the service
  • the team has made a commitment to making the service accessible

What the team needs to explore

Before their next assessment, the team needs to:

  • show how they intend to gather feedback from users via a phase banner component and a feedback page
  • test and report more clearly how easily a user can apply for a voter card first time, with minimum help
  • test usability frequently with a broad range of users - including those with low and high digital literacy, across all age groups, locations, genders and ethnicities, and including users with disabilities - using appropriate research techniques and sample sizes
  • test all parts of the service that users will interact with, including the physical card and any letters and emails being sent to the user or ERO
  • perform a 2i on content across the service, ensuring adherence to the GOV.UK style guide and Design System. The 2i review will need to pick up in depth on tone, consistency, in-page content priority, user flow, adherence to tested patterns, and accessibility, in the same vein as the brief content review of the user-facing system included in this report (see Section 8 for that review, which the team should expand upon to test, iterate and improve frequently)

5. Make sure everyone can use the service

Decision

The service did not meet point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team’s use of the AAA model was a good way of framing the challenges to users’ ability to use the service
  • the team is doing accessibility testing of its flows with support from independent specialists

What the team needs to explore

Before their next assessment, the team needs to:

  • show that the service (which encompasses both the voter-facing UI and the internal staff-facing portal) meets accessibility requirements at AA level, including the offline parts of the service. The offline parts of the service, including call centre scripts, correspondence between EROs and users, and service emails, were not available for review by the panel at the time of this assessment
  • show how the ERO portal has been designed with ATAG accessibility requirements in mind, so that the service is just as accessible and inclusive for ERO and internal users as it is for public-facing users
  • show how the team is avoiding excluding any groups within the intended audience. The team presented on how they engaged with traveller communities and people without a fixed abode - before the next assessment the team needs to show how they are designing the service to be fully inclusive, encompassing all edge cases, for example Irish Travellers who are interested in voting
  • as mentioned in Section 3, show how they have designed a fully-scoped journey for users who require assisted digital support, including but not limited to when the Voter card service links into another service, like ‘Contact a local ERO’. Although this end-to-end journey is an edge case, it should not be dismissed, to avoid disenfranchising these users. Assisted digital support must be clearly and consistently signposted across the service
  • demonstrate what the paper process for the Voter card looks like, how it has been designed with user needs and accessibility in mind, and how changes to the online service that may affect the paper process will be implemented
  • ensure that plain English is a key consideration of the content review recommended in Section 4 of this report, and that the language used across the service is clearly understood by all users

6. Have a multidisciplinary team

Decision

The service did not meet point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has all the roles that would be expected for alpha, with particular note of the resource in UX and UR roles
  • the team understands the job roles needed for the beta phase

What the team needs to explore

Before their next assessment, the team needs to:

  • confirm whether funding will be in place for the team in beta, and how many of each role will be in place: the panel were unable to pass this point of the standard without this confirmation
  • understand whether having one person covering the service design, service owner and product manager roles is the most effective arrangement: this places a lot of pressure on one individual and is a risk to the successful delivery of the product. The panel does note that the team said this would change in the future
  • better articulate and mitigate the risk of such heavy reliance on suppliers in the team
  • ensure consistency of personnel wherever possible on the team, to minimise the risk of knowledge loss, especially around the users and the design iterations that have been used. The discovery seems to have been conducted by a number of different contractors with little consistency into alpha, and we would recommend that this is avoided as the team moves into beta

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has strong agile ways of working in place and were able to explain how they use ceremonies to support their work
  • the governance surrounding the team and the programme seems to be supportive, although there were a number of constraints that we would normally expect a service team to be able to challenge
  • the team were able to explain how they have iterated their ways of working based on retro feedback

What the team needs to explore

Before their next assessment, the team needs to:

  • be clearer about whether all of the team is involved in research - there seemed to be a clear distinction between the UX/UR design roles and the tech roles: the ideal would be that research is a team sport

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team was able to demonstrate how they have iterated and responded to feedback, for example on their “nudge” for those who don’t need a voter card
  • they have a clear structured way of working to ensure they are able to learn and iterate quickly
  • the team was able to show some good examples of how they were iterating their approach, such as in the co-design workshops they did with EROs. They were able to clearly articulate and show how they had identified the top tasks
  • the team was clearly able to demonstrate how they iterated their prototype according to what they had seen in the user research
  • the team has a content design resource, showing a strong commitment to user-centred design

What the team needs to explore

Before their next assessment, the team needs to:

  • perform a 2i on content across the service, ensuring adherence to the GOV.UK style guide and Design System. The 2i review will need to pick up in depth on tone, consistency, in-page content priority, user flow, adherence to tested patterns, and accessibility, in the same vein as the brief content review of the user-facing system below:

  • the Guidance page should use a breadcrumb so users can orient themselves within GOV.UK, and use a blue line in the Related links right-hand column rather than black. See the relevant pattern in the Design System
  • consider rearranging the Start guidance in-page hierarchy based on content priority. For example, ‘Who should apply’ should probably appear higher up on the page than it currently does. Furthermore, the first sentence under ‘Who should apply’ does not need to be a bullet point, as it would be more legible as a complete sentence
  • the overview introduction could be misconstrued: the second bullet point suggests that a voter card can be used when someone doesn’t have a valid photo ID outside of the voting context. This bullet point also uses a double negative, which makes it incorrect
  • the Back to top link is floating in the middle of the Guidance/Start page and is used three times on the page: it also receives focus in the wrong order. Revisit this component to understand what the use case is, whether the use case warrants a Back to top link, or whether you could solve it differently. If you have received feedback from users that the page is too long, consider breaking it up into separate numbered screens, as the Carer’s Allowance service does. If you do decide to continue using a Back to top link, revisit the guidance that exists on that component in the Government Frontend Development Guide, and the GOV.UK Design System Github discussion thread
  • the ‘Why it’s needed’ information is misleading - it might be better to word it: The Elections Bill 2021 requires anyone voting in person at some elections to show an approved form of photo ID. A voter card is an approved form of ID. It is suitable for people who do not have a valid photo ID, or for people whose photo ID has expired
  • the ‘Eligibility’ and ‘Who should apply’ sections on the Guidance page can be merged as the two pieces of information are complementary: doing this would also reduce the amount of headings on the page
  • the Temporary voter card section on the guidance page needs to provide a specific timeline: how close is “too close” to an election? The current wording is not specific enough
  • the ‘You do not need a voter card’ screen is quite abrupt: if the team hasn’t already, consider using the ‘Check a service is suitable’ pattern for presenting results on this screen. GOV.UK is written incorrectly on the current version. ‘You do not need a voter card – you can still apply if you want to’ - this message is contradictory and needs more information to clarify the terms of applying. It might be more useful to sensitively reiterate here who the service is intended for and what happens if a voter card is applied for unnecessarily
  • the H1 ‘You cannot use a voter card for any other reason’ on the interstitial screen is not clear enough. ‘You can only use a voter card to vote in person, and not for any other reason’ – may be clearer
  • review the language used in the session timeout box, referring to this guidance on the DWP Design System, discussion on the GOV.UK Github thread, and W3C guidance
  • reconsider the PDF format provided for the paper application in light of accessibility concerns, and consider shortening the hyperlink text here
  • on the temporary voter card screen, reconsider the in-page content hierarchy - would it be better to have the call to action button more visible and higher up the page, for example? Also within this screen, consider removing the H2 ‘If you want to vote tomorrow’, as it makes the journey seem like two separate journeys: one for an election that is tomorrow and one for everything else
  • ensure the guidance regarding name changes is consistent throughout the service; for transparency, it may also be useful to distinguish on the Guidance page between what a user applying for a temporary card and a user reporting a name change ought to do differently
  • carry out full testing of the task list, to:

  • ensure that screen reader users can navigate the task list after completing a section, without frustration
  • test whether users need to return to the task list after completing each task or whether they can navigate straight to the next task
  • see the relevant pattern in the Design System and the Github discussion thread for further reference

  • a user can only complete the Identity section if they’ve provided both a phone number and an email address, but the service says the email address is optional
  • on the confirmation screen, tell the user why the reference number is important and why they may need to keep a record of it

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the service has been designed to minimise the lifespan of data collected from users as much as possible
  • care has been taken in the processing of uploaded assets
  • work has been put into planning a sound security architecture, with engagement from experts outside of the team
  • the team has investigated techniques to securely partition data

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to search for a more effective, lower-risk mechanism for identity verification, to remove the use of National Insurance numbers for this purpose
  • continue to invest in researching and mitigating potential fraud risks
  • ensure that planned use of encryption at the application level follows NCSC guidance, and extra care is taken to regularly rotate keys
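
To illustrate the kind of key rotation the last point refers to, a minimal sketch follows. It is not the team’s design: it assumes Python and the open source cryptography library’s Fernet/MultiFernet primitives, and a production service would load keys from a managed key store and follow NCSC guidance rather than hard-coding anything.

```python
# Illustrative sketch only, not the team's design: application-level
# encryption with key rotation using the Python "cryptography" library.
# A real service would load keys from a managed key store, not literals.
from cryptography.fernet import Fernet, MultiFernet

old_key = Fernet(Fernet.generate_key())
new_key = Fernet(Fernet.generate_key())

# Data encrypted before the rotation, under the old key only.
token = old_key.encrypt(b"example applicant data")

# After introducing the new key, list it first: MultiFernet encrypts with
# the first key but can still decrypt tokens produced by any listed key.
crypto = MultiFernet([new_key, old_key])

# rotate() decrypts with whichever key matches and re-encrypts under the
# primary (new) key, so the old key can be retired once all data is rotated.
rotated = crypto.rotate(token)
assert crypto.decrypt(rotated) == b"example applicant data"
```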

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have considered the standard KPIs and also those which are specific to their service
  • the team have considered different routes to obtain feedback, including Google Analytics and reports from their ERO portal as well as feedback from user groups

What the team needs to explore

Before their next assessment, the team needs to:

  • understand how likely it is that they will be able to gather feedback from the EROs given constraints on mandating their processes
  • give further thought to anti-fraud measurement, as this was a key aim of the programme, and also consider how to measure the cumulative impact of this product against the others in the programme - for example will the introduction of the voter card have a direct impact on the number of people applying for a postal vote, and will this be considered a success in wider programme terms? It would be useful to see the aims of the programme and how the individual product contributes to these: what is good for the product individually may work against the programme aims
  • attach some targets to the KPIs - accepting that these can and should develop as time passes, but it would be good to understand the aims of the service where possible

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • new technology choices were being made in line with the wider programme.
  • existing configuration and tooling within the programme is to be re-used where it is available

What the team needs to explore

Before their next assessment, the team needs to:

  • plan and test how they will handle throttling errors when scaling their current technology choices quickly (a minimal sketch of one common handling pattern follows this list)
  • ensure as much thought goes into the scaling of each tier as has been given to the front-end
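
On the throttling point above, the sketch below shows a common retry pattern (exponential backoff with full jitter) that teams often pair with load testing. It is a hedged example, not the team’s implementation: ThrottledError and call_with_backoff are hypothetical names, and the real thresholds would need to come from testing against the team’s actual technology choices.

```python
# Illustrative sketch only: retrying a throttled downstream call with
# exponential backoff and full jitter. ThrottledError and call_with_backoff
# are hypothetical names, not part of the team's stack.
import random
import time


class ThrottledError(Exception):
    """Raised when a downstream service signals throttling (e.g. HTTP 429)."""


def call_with_backoff(operation, max_attempts=5, base_delay=0.2, max_delay=5.0):
    """Run `operation`, retrying on throttling with capped, jittered backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except ThrottledError:
            if attempt == max_attempts:
                raise
            # Full jitter spreads retries so bursts of clients do not retry
            # in lockstep and overwhelm the downstream service again.
            delay = random.uniform(0, min(max_delay, base_delay * 2 ** attempt))
            time.sleep(delay)
```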

12. Make new source code open

The source code for prototypes is not available, and there are no plans to make anything open source during Beta.

Decision

The service did not meet point 12 of the Standard.

What the team needs to explore

Before their next assessment, the team needs to:

  • investigate which components of the system can be made open. Whilst there are few valid reasons for keeping code private, fraud detection algorithms are one of them; this should not imply that the entire system should be closed
  • begin with the assumption that the source code will be open, and follow the CDDO guidance on when code should be open or closed

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are planning to reuse common components across the programme where they are available
  • plans include common patterns around REST APIs

What the team needs to explore

Before their next assessment, the team needs to:

  • consider sharing what they learn from their approach to data partitioning and PKI with other departments

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • plans are in place to scale up support at specific times, reflecting the learnings from the Register to Vote service
  • the team has considered system observability from the beginning and is proactively monitoring security

What the team needs to explore

Before their next assessment, the team needs to:

  • further investigate techniques to ensure that Electoral Registration Officers (EROs) are protected from floods of potentially invalid or duplicate applications (a minimal sketch of one possible duplicate check follows this list)
  • determine how the backup system will be explained to users should the main system become unavailable for some period of time
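
On the point above about protecting EROs from floods of invalid or duplicate applications, the following is a minimal sketch of one possible deduplication check, not the team’s design. The field names, the 24-hour window and the in-memory store are all illustrative assumptions; a real service would use a shared datastore and matching rules agreed with EROs.

```python
# Illustrative sketch only: a simple duplicate-application check before work
# is passed to an ERO. Field names, the 24-hour window and the in-memory
# store are assumptions rather than the team's design.
import hashlib
from datetime import datetime, timedelta
from typing import Optional

_recent_applications = {}
DEDUP_WINDOW = timedelta(hours=24)


def dedup_key(application: dict) -> str:
    """Derive a stable key from fields likely to identify a repeat submission."""
    raw = "|".join([
        application["full_name"].strip().lower(),
        application["date_of_birth"],
        application["postcode"].replace(" ", "").upper(),
    ])
    return hashlib.sha256(raw.encode()).hexdigest()


def accept_application(application: dict, now: Optional[datetime] = None) -> bool:
    """Return True to forward to the ERO, False if it looks like a recent duplicate."""
    now = now or datetime.utcnow()
    key = dedup_key(application)
    last_seen = _recent_applications.get(key)
    if last_seen is not None and now - last_seen < DEDUP_WINDOW:
        return False
    _recent_applications[key] = now
    return True
```
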
Published 29 June 2022