File Company Accounts - beta

The report from the beta assessment for Companies House's File company accounts service on 28 February 2018.

Service Standard assessment report

File Company Accounts

From: Government Digital Service
Assessment date: 28 February 2018
Stage: Beta
Result: Met
Service provider: Companies House

The service met the Standard because:

  • The team have iterated and significantly improved the service during their closed trial phase. They were able to talk in detail about a wide variety of user research, changes they had made as a result, and even prototypes discarded.
  • They’ve “done the hard work to make it simple” in allowing novice users to easily parse company accounts and enter them as structured data, despite the many optional elements accountants might choose to add.
  • They are a co-located multidisciplinary team, empowered to make changes to the service.
  • They are not significantly constrained by their underlying platform, have appropriate technical controls in place around the service, and have started to make their code open.
  • They have made extensive use of real-time performance data and other analytics to improve the service, and are making this more widely visible across the organisation.
  • They have tested the service with the minister responsible.

About the service

Description

The service allows companies to comply with the legislative requirement to file company accounts annually, as signed off by the directors for distribution to shareholders, for the public record.

Service users

The users of this service are primarily accountants and company directors. They can be categorised into DIY filers (for example, directors wishing to file company accounts themselves) and agents (those filing on behalf of someone else, usually accountants or secretarial agents). Some companies and agents will have an accounting software package that includes the functionality to file directly via Companies House APIs.

Detail

User needs

The team has evidently done a lot of work to understand its users and their needs since its last assessment, and the panel was impressed by the team’s commitment to user research, particularly with users who have disabilities.

The team recruits most of its research participants through a panel created by Companies House, and complements this with pop-up research, surveys and a research recruitment agency to ensure it reaches a representative spread of the different types of people who might use the service.

The panel was pleased by the whole team’s involvement in user research and analysis, including observing live lab-based research sessions remotely, and its participation in research debriefs and ‘download’ sessions.

In its user research, the team focuses on users who are likely to find the service most challenging. This includes company directors who lack experience or confidence in terminology relating to accounting. The team demonstrated where it had iterated content to better meet these users’ needs, though the team acknowledged that some terminology remains unfamiliar to some users. The panel encourages the team to continue to test, learn and find ways to meet these users’ needs even better.

The team is clearly committed to understanding and meeting the needs of users with disabilities, and its research participants include users with cerebral palsy, multiple sclerosis and visual impairments. The team talked the panel through findings of its DAC audit, and showed the panel how it iterated based on those findings; for example, by changing the way radio buttons are labelled and improving usability for users of screen reading software.

The panel was impressed by the team’s use of analytics and data to understand its service’s users. As the service matures the team should strengthen the tie between analytics and research to better understand how its service design influences users’ behaviour.

Team

The team are co-located and work in close proximity to their SRO. The service manager confirmed there are no immediate plans to change the team during the public beta phase. They are using a full set of agile processes and ceremonies, with user research and analytics also feeding stories into the backlog. The team works in sprints, but the cycle time of a story can be less than a day if necessary.

Since the Alpha assessment, they have added a content designer to the team; they also now have a team member dedicated to looking at performance analytics. They were able to talk about how content design skills have changed the way they approach problems. The analytics expert attended the assessment and provided a wealth of information.

The service manager was able to talk about how the team are managing a wider range of stakeholders during the private beta phase, and gave some insight into how they are evolving their communication approaches.

We were also pleased to hear about the high profile of user research and analytics work within Companies House. The user researcher will be sitting among the platform development team for a day a week, and real-time data about the service is being shared on TV screens around their offices.

We were delighted to see the enthusiasm the team brought to the assessment, how well the different disciplines collaborated on answering questions from the panel, and the shared understanding of the users and their needs.

Technology

The team are able to make and release changes quickly, with a modern micro-service architecture enabling a mix of technologies to be used, and components to be scaled out separately as needed. They are adopting continuous delivery processes, with zero-downtime deployments already in place.

Although development was started on a private repository, the team have since released a snapshot of the code as open source, and will switch to coding in the open as soon as an internal security review is complete. They are not currently using any standard government platforms, but have plans to move to Notify for sending emails and Pay for payment in future.

The service has already been shown to support their peak traffic (around 4,200 submissions in the month of December) and has a disaster-recovery environment in a separate data centre that is tested regularly and has been used in production during a migration already.

The team has a good testing workflow, including automating cross-browser testing with BrowserStack and continuous integration; however, there were some minor browser issues during the demonstration, and at first the team were not able to complete a submission due to a 404 error. The panel notes, however, that this latter issue was fixed during the assessment, providing evidence that the team can fix issues quickly, and/or that the infrastructure has a degree of internal health-checking and self-healing ability.

We recommend this service passes the technology part of the assessment. We also recommend that the team continue to improve their testing and release verification processes to prevent issues like this reaching production.

Design

Iterating

The team made the journey from GOV.UK simpler; this performed well in usability testing, and the team are setting up a start page following the beta assessment.

They looked at making it easier to find the authentication codes, suggesting they could be sent out with the Companies House reminder letters. Although it isn’t possible to move away from the current authentication method it’s good to hear that Companies House are exploring better ways to identify users as part of wider transformation work.

Service flows have been iterated and changed based on user research, for example, they’ve integrated the ‘notes’ questions into a one-thing-per-page journey which is clearer for users.

Accessibility

The team tested with assisted digital users and implemented changes; for example, they added additional content to make the options for a multiple choice question clearer for screen-reader users, some of whom assumed there were two options where there were three.

Our review highlighted that the markup for the balance sheet needs more work; there were also concerns about the inputs being so far from their labels. That said, the team could show they have conducted thorough usability research into this new layout and have made improvements where needed. We recommend this service conditionally passes the assessment, with a plan to fix the accessibility issues raised.

Support

The service highlights a support route at the start of the journey and the team have worked with the call centre to make sure they understand what problems users are having.

The team iterated content on the ‘before you start’ page to help users understand whether the service is right for them, and support requests fell significantly.

The panel is wary that the team are not questioning existing paper processes enough; for example, the content on the legal declaration reflects the paper forms but could be simplified and moved earlier in the journey.

Consistency with GOV.UK

The service is generally consistent with the GOV.UK design patterns and style guide; however, the team have developed a new form pattern to make it easier for users to fill in the balance sheet. Although they did not test existing form patterns for this feature, they tested the new design thoroughly and openly shared the designs through blog posts.

The service does not currently use the same font as GOV.UK due to licensing restrictions (the private beta was on a separate domain), but this is expected to be resolved once a service.gov.uk domain is in use.

Analytics

The team has ready access to data about most aspects of the service, and demonstrated this excellently by giving relevant data on completion rates, drop-off rates and time taken on each page throughout the demonstration.

There is a clear set of metrics used to measure the effectiveness of the service over time, and to compare with other similar Companies House services.

The team has a dedicated analytics expert who works closely with user researchers, identifying and testing hypotheses about user behaviour. They gave good examples of how analytics findings had led directly to changes in the service and to improvements in the user experience.

The team outlined a number of improvements they are planning to make to analytics, including setting up more events to check data validation problems.

Data is shared with all members of the team through freely accessible dashboards showing a range of metrics for the service. The data is backed up by contextual information and insights are given as a narrative. Dashboards are displayed throughout the office space. Good practice, findings, and data are shared more widely across the organisation through blog posts.

The service has an exemption from publishing data on the Performance Platform as it is participating in a pilot scheme using an alternative set of metrics.

The panel was impressed by the team’s approach to, and use of, analytics.

Recommendations

To pass the next assessment, the service team must:

  • demonstrate how they have improved the operational maturity of the service and proven their capability to respond to incidents, for example with game days
  • show how they are benchmarking and tracking errors (such as the 4xx and 5xx errors seen by the assessors during their testing) in order to minimise unknown degradation of service as it scales
  • be able to talk about how the knowledge and insight gained by this team will be used in services for the filing of many other types of company accounts, and how this service is expected to gain from that understanding
  • fix accessibility issues raised as part of the beta assessment
  • demonstrate user journeys starting from GOV.UK and talk about any findings

The service team should also:

  • provide an update on using other common platforms, such as Notify
  • talk about how they will support changes to the service - including to the API - over the years to come
  • provide a further update on thinking around the authentication code
  • adopt the GOV.UK font once they are on a service.gov.uk domain

Next Steps

You should follow the recommendations made in this report before arranging your next assessment.

This service now has permission to launch on a GOV.UK service domain. These instructions explain how to set up your *.service.gov.uk domain.

Please contact the Service Assessment team at least 4 weeks before the date you’d like to hold a reassessment.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met*
18 Testing the service with the minister responsible for it Met
Published 6 August 2018