File Company Accounts - Alpha Assessment

The report from the alpha assessment of the Companies House File Company Accounts service on 15 February 2017.

Stage: Alpha
Result: Met
Service provider: Companies House

The service met the Standard because:

  • This is a capable team working in a collaborative and iterative fashion, led by an empowered service manager with good relationships with his SRO as well as Policy and Legal colleagues

  • User research is being taken seriously, with sessions streamed to the office and findings shared with the whole team and stakeholders

  • The service has been tested with a broad range of users during the Alpha, with findings widely shared in the team. This has already allowed research to focus on the users who will have the greatest challenge as they proceed into Beta

  • The team took a good technical approach during Alpha - working with HTML prototypes allowed them to iterate quickly, while spikes into the production architecture have reduced some of the larger risks to the beta development.

About the service

Service Manager: Johnny Pagliaro

Digital Leader: Emma Stace

Description

The service allows companies to comply with their legislative requirement to file company accounts annually, as signed off by the directors, for the public record.

Service users

The users of this service are primarily accountants and company directors. They can be categorised into DIY filers (for example, directors wishing to file company accounts themselves) and agents (those filing on behalf of someone else - usually accountants or secretarial agents). Some companies and agents will have an accounting software package which includes the functionality to file directly via Companies House APIs.

Detail

Lead Assessor: Tom Dolan

User needs

The panel was pleased to see the user research work that has been done so far. This has involved prototype testing, call listening at the Companies House helpdesk and surveys. It was good to hear how involved the team is with the research sessions.

After seeing the demo and hearing the explanation provided, we understood that there are 4 major user groups - big accountancy firms, small accountancy firms, companies with experience and companies without experience - which were referred to many times. A set of personas presented in passing didn’t match this grouping, which could mean that there is some information still to be gathered.

We were pleased to see that even though 80% of the service’s users are accountants, the team had already started to concentrate on self-filing users, as those were having more difficulty with the service. Some of the experiment design was good - the sample accounts submitted to the assessors required users to manage missing fields and explore common synonyms for accounting terms.

We encourage the team to continue developing their understanding of lower-skilled users who might need extra support. While exploring the service in preparation for the assessment, the panel found themselves more confused than the team’s user research to date would indicate, which may point to confirmation bias in the research, although the panel acknowledge they may not be representative users. We are also concerned that the authentication code may be more confusing for novice users than it initially appears. At Beta we look forward to more developed evidence in this area.

The team plan to use events run by Companies House as an additional way of accessing users - a good approach for quick feedback gathering or usability testing.

As mentioned above, the panel was really pleased to see how the whole team was involved in the user research, and how the designer could easily explain the design approach with reference to user feedback and user needs. We look forward to hearing more about the outcomes of this close collaboration at Beta.

Team

The service is the product of a collaboration between two teams. One team is developing a new platform for Companies House, including public-facing APIs for accounting software vendors; the other is extending this work to develop the service that allows for manual filing. The platform team are implementing changes needed for this filing service alongside their other API work.
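
To illustrate the kind of integration these public-facing APIs enable, here is a minimal sketch of how a vendor’s accounting package might file accounts programmatically. The endpoint, path, payload shape and authentication below are invented for illustration - they are not the actual Companies House filing API.

```python
import requests

# Illustrative sketch only: the base URL, path, auth scheme and payload
# shape are assumptions, not the real Companies House filing API.
API_BASE = "https://api.example.companieshouse.gov.uk"  # hypothetical
API_KEY = "vendor-api-key"  # hypothetical credential

def file_accounts(company_number: str, accounts_xbrl: bytes) -> dict:
    """Submit a set of director-approved accounts for the public record."""
    response = requests.post(
        f"{API_BASE}/company/{company_number}/accounts",  # hypothetical path
        headers={"Authorization": API_KEY},
        files={"accounts": ("accounts.xbrl", accounts_xbrl)},
    )
    response.raise_for_status()  # surface HTTP errors to the vendor package
    return response.json()
```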

Collaboration between these two teams appears to be good, with back-end developers attending user research sessions throughout the Alpha and creative contributions to all areas of the service welcomed from all parties. While the panel were initially concerned about tensions between the two teams, they were pleased to hear that user research had contributed to changes to the APIs. The team were able to talk about how they had improved their processes during Alpha.

User research will continue every sprint during the Beta.

Close work with policy colleagues and helpdesk staff has led to non-obvious changes to the service - for example, the need to remove validation, as the accounts submitted must match those agreed by the directors, even if they contain errors.

A content designer has recently joined the team, and the service manager was able to talk about how this has changed their ways of working.

Technology

The team are to be commended on their approach to the alpha. An easily iterated HTML prototype that allows quick turnaround, coupled with investigations into what a beta tech stack should look like, has set the team up well for the beta.
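
The report does not name the prototyping stack (GOV.UK teams often use the GOV.UK Prototype Kit). Purely as a sketch of the quick-turnaround style described above, and assuming nothing about the team’s actual tooling, a throwaway HTML prototype route might look like this:

```python
from flask import Flask, render_template_string

app = Flask(__name__)

# Invented screen content - the team's actual prototype pages are not
# reproduced in this report.
START_PAGE = """
<h1>File company accounts</h1>
<form method="post" action="/sign-in">
  <label for="auth-code">Company authentication code</label>
  <input id="auth-code" name="auth-code" type="text">
  <button type="submit">Continue</button>
</form>
"""

@app.route("/")
def start_page():
    return render_template_string(START_PAGE)

if __name__ == "__main__":
    # Debug mode reloads on every edit, which is what makes this style of
    # prototype quick to iterate between research sessions.
    app.run(debug=True)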

The panel were pleased to note that the team had considered the rollout of the API into the vendor community, and have an existing community that they are engaged with. This includes dealing with potential impacts on existing business processes.

The team demonstrated a good understanding of the current usage patterns of the existing process and are well placed to model this through the beta to ensure that the service meets its likely use.

The continuing use of the single authentication code for companies is an acknowledged problem; the panel were pleased to see that there is an ongoing piece of work to replace it. The team should continue to be engaged in this work, ensuring its continued importance, and should roll it into this service at the earliest opportunity.

Given the hard deadline that this team face and the stated dependency on a platform migration, it would be sensible for the team to ensure that the alternative plan is tested.

Design

The team have built and iterated a variety of prototypes that are generally consistent with the GDS design patterns, to meet user needs. It would be good to see more user testing and iteration on specific areas of the service.

It was good to hear the team were challenging the account authentication code and talking with the legal team about making this simpler. Wherever authentication codes are mentioned, it could be made clearer where they can be found.

The team showed designs for making alternative ways to access the service clearer on the start page. They also tested with some users with access needs and have a plan to test with a digital accessibility centre to improve the service if needed.

There is a plan in place for when the service is down, and the team are working to ensure error screens are clear for users. There is also a plan in place for when there is maintenance on the site, so users do not receive penalties.

It wasn’t clear why the flow from the GOV.UK page goes from a ‘before you start’ page to a search page rather than straight to the sign-in page and then on into the file accounts flow. It doesn’t look like the team are doing the hard work to make this quicker and simpler for users here.

Some content is inconsistent across the service, but as the team now has a full-time content designer we are confident this will be addressed and improved in due course.

Analytics

The team have good benchmarks from the existing service, including user satisfaction. They were able to talk about how completion rate had improved during Alpha, and have registered for the Performance Platform.
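
For context on how a completion rate benchmark is derived (the figures below are invented, not the team’s actual analytics data), it is simply the share of started journeys that reach the end:

```python
# Invented figures for illustration - not the service's actual analytics.
journeys_started = 1200
journeys_completed = 1014

completion_rate = journeys_completed / journeys_started
print(f"Completion rate: {completion_rate:.1%}")  # -> Completion rate: 84.5%
```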

Recommendations

To pass the next assessment, the service team must:

  • Research further with lower-skilled and novice users to explore their needs. At Beta we look forward to more developed evidence in this area.

  • Continue to explore the authentication code. Provide more evidence around its use by lower-skilled users and update the panel on any policy/legal developments in this area

  • Show how they have used their content designer to improve the service

  • Show that on the initial filing journey they have ‘done the hard work to make it simple’, giving evidence that they have tested a range of journey approaches around logging in or compulsory notes, particularly with lower-skilled users. We recommend this is done early in Beta before it becomes too difficult to change

  • Test with the minister responsible

The service team should also:

  • Be able to better explain their user groups and how these match with any personas used in the design process

  • Conduct some contextual research so they can see how users are dealing with this work in their own environment

  • Fully test the deployment on the existing stack if there is any suspicion of delay - the team stated that they should know by the end of March whether the new platform is on track

  • Describe any improvements in the wider service design, ensuring that company directors are aware of their obligations and know they have been met

  • Consider the use of Government as a Platform components.

Digital Service Standard points

Point | Description | Result
1 | Understanding user needs | Met
2 | Improving the service based on user research and usability testing | Met
3 | Having a sustainable, multidisciplinary team in place | Met
4 | Building using agile, iterative and user-centred methods | Met
5 | Iterating and improving the service on a frequent basis | Met
6 | Evaluating tools, systems, and ways of procuring them | Met
7 | Managing data, security level, legal responsibilities, privacy issues and risks | Met
8 | Making code available as open source | Met
9 | Using open standards and common government platforms | Met
10 | Testing the end-to-end service, and browser and device testing | Met
11 | Planning for the service being taken temporarily offline | Met
12 | Creating a simple and intuitive service | Met
13 | Ensuring consistency with the design and style of GOV.UK | Met
14 | Encouraging digital take-up | Met
15 | Using analytics tools to collect and act on performance data | Met
16 | Defining KPIs and establishing performance benchmarks | Met
17 | Reporting performance data on the Performance Platform | Met
18 | Testing the service with the minister responsible for it | N/A at Alpha

Updates to this page

Published 21 July 2017