Close a company beta assessment

The report from the beta assessment for Companies House's close a company service on 5 April 2018.

From: Government Digital Service
Assessment date: 5 April 2018
Stage: Beta
Result: Not met
Service provider: Companies House

To meet the Standard the service should:

  • Have a clear plan for future user research.
  • Demonstrate that the team is able to conduct user research and apply the findings to iterate the service, so that the user journey becomes simpler and more intuitive.
  • Resolve the items identified in the accompanying accessibility report.

About the service

Description

The Close a Company Service allows company directors (and those acting on behalf of directors) to apply to voluntarily close (dissolve or strike-off) a private limited company (LTD) or Limited Liability Partnership (LLP).

Service users

The users of this service are:

  • Directors of private limited companies (LTDs) and limited liability partnerships (LLPs).
  • Presenters acting on behalf of directors, such as company secretaries, accountants, and solicitors.

Detail

User needs

The needs for the service are clearly understood by the team. The service enables users to dissolve a company online, and reduces the possibility of the errors that occurred with the offline version.

The service has limited the possibility of a user trying to dissolve a company with a similar name by mistake, which happened in a high profile case a few years ago, costing Companies House a considerable amount in damages and reputational harm. It has also reduced the number of users who do not complete the journey because they did not have the correct information to hand, or did not know that they needed to pay for the service.

The team also has a good understanding of the users of the service, with the majority being individuals needing to dissolve their own company, or agents acting on behalf of their client(s).

The service has had approximately 40,000 transactions to date, which is substantial considering it is still in private beta. This has given the team a healthy sample of users to conduct research with.

However, it was clear in the assessment that the team was restricted in what it could do with the resources at hand, as team members are required to work on other projects that Companies House considered a higher priority.

Resource issues have also limited the team’s ability to iterate the service, as have the restrictions imposed by using Salesforce. Iterating the service is something the team needs to be able to do going into public beta.

Given the resource issues, and conflicting priorities, the panel was left with the impression that limited research has been conducted since the alpha assessment. This is by no means a fault of the team, and going forward the team needs to be resourced and supported correctly so the service can be developed.

This was reflected in the circular journey flow of the service, and in the limited development and testing of other areas of the service, such as the save and return journey.

Although the service has been through an accessibility audit, the team has only involved a limited number of users with accessibility needs and/or low digital skills, and should look to deepen their user knowledge in this area going forward. Resource limitations have also restricted the team’s ability to develop an ongoing research plan, which is critical in moving from private to public beta and is expected when a team comes to a public beta assessment.

Team

“Close a company” is one of a number of different services which the product owner has responsibility for, and work was only carried out in some sprint cycles. Prioritisation was carried out across the services as a whole and because of this development of “Close a Company” was often in “bursts of activity”. The in-house team appeared to be working well together within their resource constraints, and it appeared they would make faster progress with more resource.

Technology

The team explained how the solution had been taken forward since the alpha assessment, and in particular how it was divided into three main parts. The front end was currently based on a Salesforce solution, the middle part was an API for handling the communications needed by the service, and the final part was a REST-based interface into the existing business services at Companies House (CHIPS).

The team has considered the feedback given during the alpha phase, and stressed that the solution is based on a “share nothing architecture” with “no local file store” and it “can be spun up anywhere”. This architecture is well designed and has been put together in such a way that it can be segmented in a number of different ways.

This allows individual parts of the solution to be replaced as desired. For example, the team has clearly now done the hard work of determining what is needed from a technical perspective to move away from Salesforce to an alternative based on Spring. This is not a service where there is lock-in to a monolithic component.

The service uses Java (or Java derivatives) and OAuth 2.0 throughout. New code developed is published on GitHub in line with current GDS recommendations. There are plans to move to GovPay as soon as a feature around reconciliation required by this service is available; this is on the GovPay backlog.

The approach to security and risk management was explained in the pre-assessment technical call. The service has had penetration tests, and the items identified have been resolved. There are a number of additional controls in place outside the digital service which are important for protecting it from misuse. The team worked closely with the legal team at Companies House and showed a good understanding of the constraints this put on the service, such as the requirement to use the Advanced Electronic Signature standard and how this limited their technology choices. In private beta, the team had identified a problem with the use of screen readers and worked closely with DocuSign, the signing supplier, to resolve it. They had also tested a number of different signing options and chosen one based on their user research.

The service had been successfully used in private beta by users on a wide variety of browsers and devices.

Design

The prototype made good use of the GOV.UK design patterns to present information clearly, and the consistency with GOV.UK will give users confidence that this is the official Companies House service. However, the styles used are out of date, and it is recommended they update the service to reflect the latest GOV.UK styles.

The team were able to talk through different iterations of previous approaches, for example, document signing and signing on behalf of another director. However, they weren’t able to show or explain why previous iterations didn’t work for users.

It is not clear that the current design fully meets users’ needs. For example:

  • The confirming-a-company part of the flow is circular, asking the user to confirm several times.
  • The need to sign a document is a government need to match the offline service.
  • If the user successfully provides a signature, it is not apparent what they have to do next.
  • There is no feedback from the payment screens to say whether payment authorisation is pending or declined.
  • Navigation back and forth between screens is not fully supported.

The team had engaged the Digital Accessibility Centre (DAC) to conduct an audit of their service, and they had received mixed feedback from their accessibility tests.

Analytics

This service is part of the service platform pilot, and the team explained there was close engagement with the Government Digital Service on this. Piwik (Matomo) is the analytics platform. The team showed a number of metrics, including improvements to customer satisfaction and service growth of around 6% per annum. Tableau had also been used to provide management information.

The conversion rate for the service was low and the team could not yet explain how they were using analytics to inform user research and improve the service.

Recommendations

To pass the reassessment, the service team must:

  • Have a clear research plan going forward, understanding what the priorities are and how they aim to conduct research and resolve any issues.
  • Demonstrate that they have been able to work as a team over a number of sprints, conducting research and acting on it to iterate the service and improve the user journey (addressing the circular flow and the save and return journey).
  • Resolve the items identified in the accompanying accessibility report.

The service team should also:

  • Update the service to reflect the latest GOV.UK styles.
  • Consider carefully how activity is prioritised across the various Companies House digital services to ensure that enough sprint cycles are allocated to this service to sustain progress.
  • Consider putting together a more detailed plan for how the service could transition from Salesforce to Spring, and from this determine the resource required to carry out the change.

Next Steps

In order for the service to continue to the next phase of development it must meet the Standard. The service must be re-assessed against the criteria not met at this assessment.

Please contact the Service Assessment team at least 4 weeks before the date you’d like to hold a reassessment.

Get advice and guidance

The team can get advice and guidance on the next stage of development from the Government Digital Service.

Digital Service Standard points

Point | Description | Result
1 | Understanding user needs | Met
2 | Improving the service based on user research and usability testing | Not met
3 | Having a sustainable, multidisciplinary team in place | Met
4 | Building using agile, iterative and user-centred methods | Met
5 | Iterating and improving the service on a frequent basis | Not met
6 | Evaluating tools, systems, and ways of procuring them | Met
7 | Managing data, security level, legal responsibilities, privacy issues and risks | Met
8 | Making code available as open source | Met
9 | Using open standards and common government platforms | Met
10 | Testing the end-to-end service, and browser and device testing | Met
11 | Planning for the service being taken temporarily offline | Met
12 | Creating a simple and intuitive service | Not met
13 | Ensuring consistency with the design and style of GOV.UK | Met
14 | Encouraging digital take-up | Met
15 | Using analytics tools to collect and act on performance data | Met
16 | Defining KPIs and establishing performance benchmarks | Met
17 | Reporting performance data on the Performance Platform | Met
18 | Testing the service with the minister responsible for it | Met
Published 14 August 2018