Check or challenge a business rates valuation beta assessment

The report from the beta assessment for VOA's check or challenge a business rates valuation service on 9 March 2017.

From: Central Digital and Data Office
Assessment date: 9 March 2017
Stage: Beta
Result: Not met
Service provider: Valuation Office Agency

To meet the Standard the service should:

  • Improve service content to ensure it is understandable to users unfamiliar with VOA processes, using clear language and terms where possible, and contextual help to support users where necessary.
  • Ensure the service supports open standard formats.
  • Engage with the GOV.UK Verify team.

About the service

Description

The service allows users to check the rateable value for their property or properties and suggest changes with supporting evidence.

Service users

The users of this service are businesses with a commercial property or properties and agents acting on their behalf.

Detail

User needs

User groups

The service team demonstrated a good understanding of the users of the service, segmenting users into three main groups:

  • small to medium enterprises (SMEs)
  • experienced SMEs and professionals
  • agents.

These groups differ in terms of engagement and knowledge of VOA processes. Larger businesses are more likely to employ agents to act on their behalf, although smaller businesses may also be represented.

The team have ensured that research and testing have covered a range of user types, accounting for geographic location, disability, business type, socio-economic background and digital ability. The team routinely index users against the Digital Inclusion scale.

The team have good plans to learn from the beta, including using analytics and research at contact centres. The team are making good efforts to ensure that the beta is inclusive and reaches a wide range of users.

Researching support needs

Users who may need support have been recruited through ongoing user research engagement and through third-party recruitment. These users are more likely to be associated with smaller businesses.

The team have well-developed communication routes with local authorities and can signpost users to support options through rates bills. The team are developing support scripts for contact centre staff. Users with low digital skills but online access will be supported to complete the service online, while some may be sent details to update and return.

Team

Team coverage provides for the key roles and responsibilities of a service team, with the team from the alpha phase continuing into the public beta phase of delivery. The team consists of a good mix of civil service and contractor staff. The panel was pleased to see that the team had recruited a performance analyst.

The service is dependent on other teams within HMRC and VOA to provide service functionality (see the Platform, Legacy constraints, and Open standards and common platforms sections). The team should continuously engage with these teams to ensure the functionality the service requires and relies upon is provided and maintained.

Technology

Platform

The service is hosted on HMRC’s Tax Platform, so much of the technical stack and operational process is common to similarly hosted services. The team have considered the comparative benefits and constraints of using the Tax Platform, and explained the reasoning behind the decision to the panel.

Performance testing and monitoring

The team are conducting performance testing (as mandated by HMRC’s web operations function) prior to the service moving into public beta. The team have a well-reasoned estimate of expected initial traffic to the service, including possible early peaks resulting from legislative requirements from 1 April 2017.

The service uses the operational monitoring and first-line support provided by the common HMRC web operations function, suggesting the service is operationally ready to go into public beta. VOA are involved with the third-line support desk, and have developed a plan for notifying users when the service is unavailable due to technical issues.

Legacy constraints

The service is highly dependent on a legacy back-end system operated by VOA. This system uses large, infrequent releases to deliver the APIs the service relies on to source its data. VOA are engaged in a wider transformation project that will modernise this legacy system. While this is essential to the long-term success of this and other transformative digital services, the team must ensure that they communicate and coordinate effectively with the legacy team.

Service uptime

Ahead of the service’s public beta launch, the panel recommends further practice of zero downtime deployment to avoid the risk of needing to close the service for updates. The team have some features to complete and bugs to resolve before launch, and the panel recommends the team approach this with careful consideration given the hard deadline for launching the service.

Open standards and common platforms

The team provided sound reasoning for why Government Gateway was chosen as the initial identity assurance route for the service, with plans to integrate GOV.UK Verify in the longer term. Although the team have engaged with HMRC’s Verify leads, they have yet to engage with the GOV.UK Verify team, as recommended by the panel following the alpha assessment.

The service currently only supports the JPEG and PDF formats, with plans to support the proprietary Word and Excel formats in the near future. Open standard formats such as ODF and PNG are not currently supported. While the panel recognise that the file upload functionality is provided by an HMRC team, a recommendation to support open formats was made following the alpha assessment of the Challenge a Valuation service, and the panel expected more headway to have been made, particularly as the VOA is the primary consumer of this function. The team must ensure the service supports open standards. This could be achieved by working closely with (and potentially directly supporting) the development of additional upload functionality, or by developing a service-level solution (for example, through the use of file conversion).
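
As a purely illustrative sketch (not the team’s implementation), a service-level conversion approach could accept open formats and convert them into the formats the existing upload function handles. The file types, tools (Pillow and LibreOffice) and function names below are assumptions for illustration only.

```python
# Illustrative sketch only: a service-level conversion shim that accepts open
# formats and converts them to the formats the existing upload function
# handles. Tool choices (Pillow, LibreOffice) and names are assumptions, not
# the team's implementation.
import subprocess
from pathlib import Path

from PIL import Image  # pip install Pillow


def to_supported_format(upload: Path, out_dir: Path) -> Path:
    """Convert an open-format upload (PNG or ODT) to JPEG or PDF."""
    out_dir.mkdir(parents=True, exist_ok=True)
    suffix = upload.suffix.lower()

    if suffix == ".png":
        # Flatten any transparency, then save as JPEG.
        target = out_dir / (upload.stem + ".jpg")
        Image.open(upload).convert("RGB").save(target, format="JPEG")
        return target

    if suffix == ".odt":
        # Use LibreOffice in headless mode to produce a PDF.
        subprocess.run(
            ["soffice", "--headless", "--convert-to", "pdf",
             "--outdir", str(out_dir), str(upload)],
            check=True,
        )
        return out_dir / (upload.stem + ".pdf")

    # JPEG and PDF uploads pass through unchanged.
    if suffix in {".jpg", ".jpeg", ".pdf"}:
        return upload

    raise ValueError(f"Unsupported file type: {suffix}")
```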

Open source

The team have open sourced approximately half of their source code, with further publication planned.

Design

User journey

There are points within the service where it is not clear to users what they are expected to do next; the ‘valuation information’ and ‘dashboard’ sections are examples of this. While these points may represent meaningful end points from a VOA perspective, this may not be apparent to users, who may need assurance that the journey is complete, or direction on how to continue with the service.

Whilst a dashboard model may work well for agents, linking directly to the list of properties may make more sense to non-represented users. The panel has previously cautioned against the dashboard approach, and further development of the service should look to establish whether different service models should be used for agents and non-agent user groups.

While improvements have been made to the registration process, further simplification is required. The same information (such as name and contact details) is currently collected multiple times, presenting unnecessary steps to users of the service.

Content

The service uses the language of internal VOA processes rather than terms that users would understand, or terms that describe what a user is trying to do. While agents may understand the terms used, infrequent and inexperienced users of the service will likely struggle. The service uses contextual help to support users in understanding complex terms; however, contextual help is implemented inconsistently and is used as a substitute for simple, understandable language. The team must review the service’s content approach and amend it accordingly, and must not use contextual help as a substitute for making the service clearer for users.

A content review will be shared with the team in addition to this report. The review’s recommendations should be followed in addition to outstanding issues outlined in the content review following the alpha assessment.

Testing the end-to-end service

The team have tested the service across a range of browsers and devices, and the service is tested using automated testing software. Limited user testing has been conducted with users with low digital skills on mobile and tablet devices; the service team should look to address this.

Analytics

The service team now have a performance analyst, and are using funnels to understand user progression through the service and to identify drop-off points. Performance analytics is providing insight into areas for further user research.

The team will present the four mandatory key performance indicators (KPIs) and are currently working on calculating cost per transaction.
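
For context, cost per transaction is typically calculated as the total cost of running the service over a period divided by the number of transactions completed in that period, and funnel analysis compares completions against starts at each step of the journey. The figures below are invented placeholders, shown only to illustrate the arithmetic.

```python
# Illustrative arithmetic only; the figures are invented placeholders, not
# the service's actual data.
total_operating_cost = 250_000.0   # cost of running the service over a period (£)
completed_transactions = 40_000    # checks/challenges completed in the same period

cost_per_transaction = total_operating_cost / completed_transactions
print(f"Cost per transaction: £{cost_per_transaction:.2f}")  # £6.25

# Funnel drop-off between two consecutive steps of the journey.
started_step = 10_000
completed_step = 7_400
drop_off_rate = 1 - completed_step / started_step
print(f"Drop-off at this step: {drop_off_rate:.0%}")  # 26%
```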

Recommendations

To pass the reassessment, the service team must:

  • Improve service content to ensure it is understandable to users unfamiliar with VOA processes, using clear language and terms where possible, and contextual help to support users where necessary.
  • Ensure the service supports open standard formats.
  • Engage with the GOV.UK Verify team.

The service team should also:

  • Engage with teams across VOA and HMRC to ensure the functionality the service requires and relies upon is provided and maintained.
  • Practise zero downtime deployment to avoid the risk of needing to close the service for updates.
  • Complete features and resolve outstanding bugs ahead of launch.
  • Continue to improve user navigation through the service, making next steps clear to users.
  • Explore options for separating the agent and non-agent interfaces.
  • Conduct further user testing with users with low digital skills on mobile and tablet devices.

Next steps

In order for the service to continue to the next phase of development it must meet the Standard. The service must be re-assessed against the criteria not met at this assessment.

Please contact the Service Assessment team at least 4 weeks before the date you’d like to hold a reassessment.

Get advice and guidance

The team can get advice and guidance on the next stage of development by contacting the Service Assessment team.

Digital Service Standard points

| Point | Description | Result |
| --- | --- | --- |
| 1 | Understanding user needs | Met |
| 2 | Improving the service based on user research and usability testing | Met |
| 3 | Having a sustainable, multidisciplinary team in place | Met |
| 4 | Building using agile, iterative and user-centred methods | Met |
| 5 | Iterating and improving the service on a frequent basis | Met |
| 6 | Evaluating tools, systems, and ways of procuring them | Met |
| 7 | Managing data, security level, legal responsibilities, privacy issues and risks | Met |
| 8 | Making code available as open source | Met |
| 9 | Using open standards and common government platforms | Not met |
| 10 | Testing the end-to-end service, and browser and device testing | Met |
| 11 | Planning for the service being taken temporarily offline | Met |
| 12 | Creating a simple and intuitive service | Met |
| 13 | Ensuring consistency with the design and style of GOV.UK | Not met |
| 14 | Encouraging digital take-up | Met |
| 15 | Using analytics tools to collect and act on performance data | Met |
| 16 | Defining KPIs and establishing performance benchmarks | Met |
| 17 | Reporting performance data on the Performance Platform | Met |
| 18 | Testing the service with the minister responsible for it | Met |

Published 24 July 2018