Business Properties Rental Information - Beta Assessment

The report from the beta assessment of the VOA’s Business Properties Rental Information service on 18 November 2015 and reassessment on 16 February 2016.

Stage: Beta
Service provider: HMRC / VOA

About the service

Service Manager: Caroline Wrixon

Digital Leader: Sarah Wilkinson

One of the Valuation Office Agency’s fundamental responsibilities is to produce fair and accurate valuations of non-domestic properties. Rental information for all non-domestic properties is required to achieve this, and it is currently collected via an 8-page ‘Form of Return’ (FOR), issued to customers in paper format to complete and return with the required information.

An online method (eFOR) for completing the form of return (referred to on page 1 of the paper FOR), which feeds the data directly into the database after a validation process, already exists, but currently only 10% of customers use it – the other 90% of submissions arrive on the paper form and are manually entered into the Agency’s database. The Agency ultimately intends to replace the paper FOR with a letter advising customers to visit its website and submit online, with the data then feeding directly into the database.

Results - 18 November 2015

Results: Not Met

After consideration, the assessment panel has concluded the Business Properties Rental Information service should not be given approval to launch on the domain as a Beta service.

Detail of the assessment - 18 November 2015

Lead Assessor: Jess McEvoy

The panel wants to make it clear that the development is very much going in the right direction. The team clearly works in an agile and collaborative manner, and the handover to the new service manager is being handled very well. It is good to see that the service manager is responsible for the entirety of the service and has the ability to change the paper as well as the digital parts of the service.

User needs and user research

There has been insufficient research with users with low digital skills. Although there is a desire to carry out both prototype-based research on the service and research into support needs with users with low digital skills, this has not yet fully taken place due to issues with recruiting users. This is important because some assumptions have been made about the number of users requiring support, and about how support should be designed. Research will enable the team to test their assumptions and understand those users’ needs for support and start to make decisions during the private beta phase about how, specifically, to meet those needs. It may also help the team to improve the design of the on-screen service.

More broadly, the team were also unable to make clear statements about granular user needs across the various audience segments for this service. This seemed to flow from the lack of any clear or systematic segmentation or profiling of the audience. While much useful research has been conducted during the beta so far, it has mainly been with small business users. Other audience sectors, e.g. large businesses and agents, have not featured significantly in recent research. The team mentioned various other audience dimensions - e.g. business size, agent/tenant, tenant experience, single/many returns - but there has been no systematic or broad-based research in any of these areas.

Recruitment for the research that is being conducted has generally been through internal channels e.g. people who have contacted the call centre. As a consequence, the sample consists primarily of people who already have some familiarity with VOA and with the process. There is a risk in this approach that important areas of user need will be missed.

The team

The team are working in an agile way, and most disciplines are represented. The GDS design and interaction patterns are being followed; however, the panel had concerns about the front-end developer acting as a designer rather than this being a separate role. Given that the team is not observing user research sessions and is shortly to begin sharing its user researcher with another service, this is not sustainable and does not meet the service standard.


Although the team has enhanced its existing contact centre support and plans to test and measure this in beta, it was not demonstrated why this support was chosen over other providers or channels (e.g. face to face) to meet user needs. This is a requirement for services to move to public beta and was a clear recommendation in the report following the alpha assessment.

The panel raised two issues with the letter that is sent to users asking them to use the digital service: the logo used on the letter is of poor quality and makes the letter look like it could be fake, and the content is still worded in a way that suggests a user needs to send in a paper form.

The agency logo is applied to every page of the digital service despite a lack of justification in terms of user need. The team should test removing the logo and track the effect on completion rates and take-up.

Recommendations - 18 November 2015

Point 1

Make identification/recruitment of assisted digital users a priority. There has been some work to understand who will have support needs, their barriers to using the digital service independently and what support provision will be effective but more needs to be carried out to ensure the needs are well understood before making decisions around appropriate support. The GDS assisted digital team can help with further guidance on what service teams should have covered by the beta assessment.

Develop robust and systematic profiles or segmentations of the audience for the service, based on qualitative and quantitative data, which includes attitudinal and demographic dimensions relevant to each.

Develop a set of user needs statements, based on this research, which reflect the breadth of need across the audiences. A robust user needs statement typically has the following qualities:

  • It is something a real user would say

  • It helps you to design and prioritise

  • It does not unnecessarily constrain possible solutions

  • It is based on good evidence

Develop a research plan which addresses each audience sector identified in the segmentation/profile and which includes a range of research methods - including contextual research and lab research - to clarify user needs that can be further used to iterate the prototype.

Point 2

Consider using a user research lab and an external recruiter. This will allow recruitment of people from the target audience who have no knowledge of the form or process (supporting a more robust understanding of user need) and it will allow the whole team to view research (which will support team buy-in and understanding).

Expand the range of user-types who are recruited for forthcoming research. Include respondents from other parts of the audience, including large businesses and large agents.

Use the findings to identify the key dimensions of user need and user behaviour, and from this develop a profile, model or segmentation of the audience against relevant dimensions, which can be used as a basis for describing user need at a more granular level, and for future research recruitment.

Point 3

Consider recruiting a designer (ideally, the panel would recommend, on a full-time basis). This will complement the existing skills within the team (front-end developer, user researcher) and ensure that the important interplay between user research and design is fully in place.

As mentioned above, effort should be made to involve the entire team in user research.

Point 12

Design support to meet user needs, based on user research. All options for support should be explored, including whether a face-to-face route is appropriate. If they are planning to use it, the team should fully understand the support HMRC is proposing and ensure that it meets the user needs of this specific service. All support should be set up and ready to test alongside the on-screen part of the service during public beta.


The assessment team would like to thank the service team for their well-informed answers to our questions. We look forward to hearing what the service team learn from their research and seeing how the service develops.

Digital Service Standard criteria - 18 November 2015

Criteria Passed
1 No
2 No
3 No
4 Yes
5 Yes
6 Yes
7 Yes
8 Yes
9 Yes
10 Yes
11 Yes
12 No
13 Yes
14 Yes
15 Yes
16 Yes
17 Yes
18 Yes

Results - 16 February 2016

Results: Met

The Business Properties Rental Information service is seeking permission to launch on a domain as a Beta service and has been reviewed against the remaining 4 points of the Digital Service Standard that the service did not meet in the original beta assessment.

Detail of the assessment - 16 February 2016

Lead Assessor: Jess McEvoy

After consideration, the assessment panel has concluded that the Business Properties Rental Information service has shown sufficient progress and evidence of meeting the Digital Service Standard criteria and should proceed to launch as a Beta service on a domain.

The service was reassessed against points 1, 2, 3 and 12 of the Digital by Default Service Standard.

Point 1

Understand user needs. Research to develop a deep knowledge of who the service users are and what that means for the design of the service.

The service team has made some progress on identifying users with Assisted Digital (AD) needs and in conducting research with that group. Six AD users (most of whom were tradespeople, e.g. builders renting a yard) were successfully recruited through an external market research agency. It is encouraging that the team has found a viable and robust route to access their AD audience.

The evidence from this small sample supported the team’s hypothesis that phone support is sufficient to meet AD needs for these users. However, the number of people tested with to date is too low to base any final AD-related service decision on. While a further 17 AD users have been encountered via the call centre, these confirm the existence of demand for a call centre, not the absence of demand for face-to-face contact.

The team has made a solid start on the AD needs the service must meet, and the initial externally recruited research is sufficient for a pass at this assessment, but the panel would expect this enquiry to continue and further evidence to be presented at the next assessment.

On the team’s wider research effort, the panel noted that the points made at the previous assessment had been largely addressed. A detailed set of profiles has been developed which break down the user base. They have been based on the body of existing research and on new research with the various parts of the audience.

The team has identified potential users of assisted digital support and has developed a good understanding of their needs, using the findings to develop an assisted digital persona. The team has a good plan to expand user research during beta and to test assumptions about the support likely to be required.

The service team should continue with their plans for beta: continue assisted digital user research, test their hypothesis and start to iterate their service. They should also test the digital service with people who have lower confidence and digital skills.

Point 2

Put a plan in place for ongoing user research and usability testing to continuously seek feedback from users to improve the service.

The team has a research plan for the next 6 sprints, involving 8 users per sprint: 4 SMEs and 4 agents. The team hopes to include 3 AD users among the 8, although they were concerned that they might struggle to find participants. Six of the 17 AD users acquired via the contact centre are also available for future research (although they are likely to skew the sample towards the “call centre only” solution for AD).

Overall, the plan seemed viable and comprehensive, and contained a strong element of research conducted in the user’s environment, which is a commendable strength. However, the panel was somewhat concerned that the AD element in the sample seemed to be viewed as a nice-to-have rather than essential. The lack of any lab-based research also means that its beneficial effect on team understanding will be missing from forthcoming work (streamed research is a less effective alternative).

Point 3

Put in place a sustainable multidisciplinary team that can design, build and operate the service, led by a suitably skilled and senior service manager with decision-making responsibility.

The team is working in an agile way, and most disciplines are well represented. The panel was pleased to see that the number of dedicated user researchers has increased. The front-end developer is still acting as a designer instead of this being a separate role, but the panel was pleased to see that the team have been working with a senior designer at HMRC to validate design decisions. The team has also begun to observe (remotely) user research sessions and are shortly to run some lab-based sessions that the team will be able to observe.

Point 12

Create a service that is simple and intuitive enough that users succeed first time.

The team has tested the service with AD users and found that they were able to complete it, though not without some difficulty. The interaction design of the service is good and the appropriate design patterns have been used. To ensure that the quality of the design is maintained the team should continue to consult regularly with a designer.

Recommendations - 16 February 2016

Point 1

The team must continue to test the current hypothesis (that face-to-face support is unnecessary) by regular ongoing recruitment of users with AD needs so that a larger body of solid evidence can be demonstrated on this point at the next assessment. The team may wish to consider a research sample of 30-40 as an adequate level of recruitment through this route.

Point 2

The team should ensure that AD users are an integral part of the research sample in each sprint.

Ensure that some rounds of prototype research are conducted in a lab with observation facilities, so that the whole team can observe effectively and without distraction.

Point 3

The team should continue to work with the HMRC design resource and ensure they are regularly observing user research.

Point 12

Continue the work to ensure the service includes support routes that address the needs of users who will not be able to complete the service independently.

Digital Service Standard points - 16 February 2016

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
