Challenge a Valuation - Alpha assessment

The report from the alpha assessment of the VOA’s Challenge a Valuation service on 10 May 2016.

Stage: Alpha
Result: Met
Service provider: Valuation Office Agency

The service met the Standard because:

  • The team are collaborating well with the teams working on the ‘check’ and ‘appeal’ components of the end-to-end service.

  • The team have developed a prototype in line with GOV.UK service design patterns, using simple page layouts.

  • The team have thoroughly considered metrics for measuring the success of both the section of the service they are developing and the overall service.

About the service

Service Manager: Margaret Whitby

Digital Leader: Philip Macpherson

The service allows ratepayers to challenge the VOA’s assessment of the rateable value of their property. Users will be able to give details of their grounds for challenge and submit supporting evidence.

Detail of the assessment

Lead Assessor: Thomas Moore

User needs

The service team demonstrated a good understanding of their users from research with a variety of businesses, including specialists who help smaller businesses with business rates.

The team plan to procure an accessibility audit during beta, but they have done limited research so far with disabled people, and their plan for continuing research through beta was not clear enough. This is a concern, as the service contains steps, such as uploading documents, that may be a barrier for some users.

The team are working with other VOA teams on a common assisted digital (AD) support model based on phone and post. This is appropriate as users see the check, challenge and other transactions, and the related guidance, as connected parts of one service.

The team have good evidence that the number of users needing assisted digital support is likely to be low: the great majority of users are business professionals with good skills, confidence and access, and many of the rest get help from business partners and family members. However, informal support is not sustainable, and the team need a clearer plan for testing their support model with users who seek help from colleagues or family.


Team

The team shares members with the ‘check’ and ‘appeal’ teams working on the wider end-to-end service, with a service manager in overall charge of the end-to-end service. While this arrangement appears to be working well overall, service content is clearly suffering because two content designers are shared between three teams.

The team are geographically separated from the other component teams, but have mitigated this through collaborative tools and combined ceremonies, including weekly show and tells. The team should continue to work closely with the teams working on the other service components, focusing on the handover points between them. This will be critical to ensuring the service works in a joined-up way.


Technology

The ‘document upload’ and ‘check and challenge’ functions should be areas of particular focus early in beta development.

The upload facility currently supports only PDF and JPEG files. As a minimum, relevant open formats, in particular ODF and PNG, should also be supported. Work in this area should be considered alongside changes that allow the user to indicate the purpose of each uploaded file and form a case without a separate supporting statement (see also the Design section).
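An upload allowlist extended in this way might look like the following minimal sketch. This is illustrative only, assuming a Python backend; the names and limits are hypothetical and not taken from the service’s actual code.

```python
import os

# Hypothetical allowlist: the existing PDF and JPEG support,
# extended to cover the open formats ODF and PNG as a minimum.
ALLOWED_TYPES = {
    ".pdf": "application/pdf",
    ".jpg": "image/jpeg",
    ".jpeg": "image/jpeg",
    ".png": "image/png",
    ".odt": "application/vnd.oasis.opendocument.text",
    ".ods": "application/vnd.oasis.opendocument.spreadsheet",
}

def is_allowed(filename: str) -> bool:
    """Return True if the file's extension is on the allowlist."""
    ext = os.path.splitext(filename.lower())[1]
    return ext in ALLOWED_TYPES
```

In practice the service should also verify the file’s content type, not just its extension, before accepting an upload.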

The team provided suggestions of how the information supplied at the ‘check’ stage would be represented at the ‘challenge’ stage, as well as how data and documents supplied during the ‘challenge’ stage would be replayed before and after submitting the challenge. The team need to do further work on these aspects.


Design

The team provided good evidence of substantial iteration across 5 prototypes throughout the alpha, initially using wireframing software and later the prototyping toolkit.

The team plan to use the GOV.UK frontend toolkit in beta. In doing so, they should take care to follow the principles and guidance in GOV.UK Elements, the Service Manual and the Design Notes Hackpad to avoid minor layout errors and inconsistencies. Using these tools will also help the team avoid duplicating work and research where design patterns already exist (for example, document uploads and terms and conditions). The team are free to build new patterns or components not already described in these resources, but they must ensure that these are rigorously tested, both with users and for accessibility, and shared on the Design Notes Hackpad.

Due to the separation of teams working on the end-to-end service, the link between ‘check’, ‘challenge’ and ‘appeal’ service components wasn’t satisfactorily demonstrated. Furthermore, the handover from the ‘check’ component wasn’t clearly defined. At the beta assessment, the panel will expect to see the full user journey through ‘view’, ‘check’, ‘challenge’ and ‘appeal’. User testing must reflect the end-to-end journey to get an accurate understanding of the users’ experience of the service as a whole.

Users are presented with the option of challenging the ‘check’ outcome or challenging the valuation, which is calculated once the ‘check’ stage is complete. The team should test requiring a user to fully accept the outcome of the ‘check’ stage before the ‘challenge a valuation’ stage is available. By not separating these two stages, users may submit separate ‘challenges’ that are interdependent.

The service currently uses the language of VOA internal processes and expects users to become familiar with it. Instead, the service should use words and phrases that users understand and use themselves. For example, the second radio button on the ‘Reasons for your challenge’ page refers to ‘the check stage’: this is VOA’s term, and users may not know what it refers to. Users should not be expected to learn how to use the service, including its terminology; the service must be as simple and intuitive as possible.

The service presents users with a single free-text field, limited to 2,000 characters, for adding supporting comments. The team should explore options for commenting alongside relevant fields (for example, allowing users to describe the calculation alongside the proposed rate). A ‘catch-all’ field is likely to confuse users and drive them towards third-party support.
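The per-field approach could be modelled as in the sketch below. All names and the per-field limit are hypothetical, shown only to illustrate attaching a short comment to each answer rather than relying on one large catch-all box.

```python
from dataclasses import dataclass

MAX_COMMENT = 500  # hypothetical per-field limit, far below a 2,000-character catch-all

@dataclass
class ChallengeField:
    """One answer in the challenge form, with an optional comment about that answer."""
    value: str
    comment: str = ""

    def validate(self) -> list[str]:
        """Return a list of validation errors (empty if the field is valid)."""
        errors = []
        if len(self.comment) > MAX_COMMENT:
            errors.append(f"Comment must be {MAX_COMMENT} characters or fewer")
        return errors
```

Keeping each comment next to the answer it explains gives the comment context, which a single free-text box cannot.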


Analytics

In addition to the four mandatory key performance indicators (KPIs), the team have thoroughly explored additional metrics, both for the ‘challenge’ section of the overall service and for the end-to-end service. The team have engaged with the GDS Performance Platform team.


Recommendations

To pass the next assessment, the service team must:

  • Test the end-to-end service with users, ensuring the handover between service components is seamless.

  • Develop a clearer AD support model to cover the end-to-end service.

  • Address content issues and the use of VOA terminology.

  • Improve the way users present additional application information.

  • Support ODF and PNG file uploads as a minimum and consider supporting common proprietary formats.

  • Fully explore the risk of disclosing sensitive information to parties who are not entitled to see it where data and documents are replayed within the service, especially as the scope of replay increases. Review the use of Government Gateway in this context.

The service team should also:

  • Thoroughly test a clearer separation between ‘check’ and ‘challenge’ stages, where users complete the ‘check’ stage before they move on to ‘challenge’.

  • Test the service with those with accessibility needs.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Not Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Not Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met


Specific issues were identified as part of the assessment. These were too granular to cover in the main report, but are listed below as issues the service should address in beta.

  • The service currently uses full-width line lengths.

  • Radio buttons use bold text.

  • There are places in the service where bullet points are used and it is unclear whether the user needs to select multiple options or just one.

  • The obligations of the user are sometimes unclear. The service states that users ‘should’ include a supporting statement, but then makes it optional. It should be made clear whether a user must do this and, if they don’t have to, why they might prefer to.

Published 13 April 2017