Register your company - beta

The report from the beta assessment for Companies House and HMRC's Register your company service on 13 February 2018.

Register your company

From: Central Digital and Data Office
Assessment date: 13 February 2018
Stage: Beta
Result: Met
Service provider: HMRC and Companies House

The service met the Standard because:

  • The team have successfully made an intuitive transaction that users can complete first time.

  • A multidisciplinary team is in place, working collaboratively and iteratively, despite being based in different locations and departments.

  • The team has made strong efforts to follow GOV.UK patterns and components.

About the service

Description

The service allows users to create a company by simultaneously registering it with Companies House (known as incorporation) and with HMRC for Corporation Tax (CT) and Pay As You Earn (PAYE). It will eventually include Value Added Tax (VAT) registration.

Service users

The users of this service are members of the public who want to register their limited company with Companies House, set up CT, and set up PAYE.

Detail

User needs

Overall, the panel was very impressed with the amount of research undertaken by the team and the time dedicated to understanding user needs.

The panel was impressed with the way the team is working to fully understand and fix the issues people face when describing what their company does. The team is already planning to pass their research findings to the Office for National Statistics (ONS) so that it can improve its data.

The panel liked the way that users' data was pre-populated despite the journey moving between Companies House and HMRC, especially considering the number of times the same information could otherwise be asked for. This should also help improve the quality and consistency of the data Companies House and HMRC receive.

While the research within the digital service is to be commended, it would be good for the team to expand some of their research activities to include user needs within the broader journey. This is especially true towards the end of the digital service: the panel felt that the team's knowledge of their users' onward journey dropped off almost immediately.

The panel was particularly concerned about the two letters sent to the user at the end of the journey. The user is sent a letter containing their Government Gateway PIN to access their online account, but is also sent a letter with a reference number (known as the Unique Taxpayer Reference, or UTR) that is already accessible in the online service. The service informs the user that they will receive a letter and must keep it safe, but the team could not explain why this letter is required. More research should be carried out to understand what users need this information for and what they do with the letter when they receive it in the post.

The team has a good understanding of, and support for, the onward paths to other business taxes, but had not considered the other things a person setting up a business would need to do, or whether the things a user gets from this service work well to support those onward steps.

The panel was concerned that the success page on the service explicitly tells the user to ignore part of the official letter they will receive in the post. The service team noted this and said that they are redrafting the official letter to incorporate the needs of people who have used the online service.

The panel was happy with the amount of research planned into the current paper route. The volume of people using this route is low, and the panel felt that the proposed research was proportionate to this.

The panel was impressed with the consideration given to assisted digital access. The service team has ensured that users have a smooth interaction with the Companies House contact centre even though the service moves between Companies House and HMRC platforms. However, the panel felt that the team should spend more time researching users who go through the third-party route, in case they are using third parties because of low digital skills, as similar services have found.

The team has commissioned an accessibility audit from the Digital Accessibility Centre (DAC), but has done little direct research with people with disabilities. It would be beneficial to identify the most likely contextual barriers for people with disabilities and do more research to reduce or even remove those barriers - for example, sending PINs and letters to people with visual impairments, or routing people with hearing and communication impairments to phone support.

Team

The teams working on this service are split between Cardiff and Telford. The panel was impressed with the amount of effort that the teams are putting into working together despite the distance. They have shown great use of technology to bridge the divide and ensure collaborative working.

The team is set to grow to enable delivery of the VAT part of the service. As it does, the panel feels it is important to maintain the current level of collaboration and interaction between the teams.

Technology

The team demonstrated a good development process and a sound choice of technology. The team is aware of the dependencies between the HMRC and Companies House release processes, and ensures there is good communication and discussion about upcoming releases and iterations.

The team has a good volumetric model. There is a load testing procedure in place to handle unanticipated traffic, and the service is able to scale to demand.
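
The report does not name the team's load testing tooling. As an illustrative sketch only, a minimal load test of this kind using Locust (a Python tool chosen here for illustration; the endpoint paths and host are hypothetical) might look like:

```python
from locust import HttpUser, between, task


class RegistrationUser(HttpUser):
    """Simulates a user working through the start of the registration journey."""

    wait_time = between(1, 5)  # pause 1-5 seconds between tasks, like a real user

    @task(3)
    def view_start_page(self):
        # Hypothetical path - the service's real URLs are not given in the report
        self.client.get("/register-your-company/start")

    @task(1)
    def submit_company_name(self):
        # Hypothetical form submission using anonymised test data
        self.client.post(
            "/register-your-company/company-name",
            data={"companyName": "Example Trading Ltd"},
        )
```

Running `locust -f loadtest.py --host https://qa.example.gov.uk` and ramping the simulated user count past the volumetric model's predicted peak is one way to confirm the service scales to unanticipated demand.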

The service model is well understood within the team. There is a good level of monitoring in place across the infrastructure. Tools such as Sensu and PagerDuty are in use for incident management. Alerts are configurable and the teams are notified via Slack. If an incident occurs, the on-call person is notified and the issue is then escalated to the responsible development teams.
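
The report names Sensu, PagerDuty and Slack but does not show the team's configuration. As a hedged sketch of the pattern described - an alert flowing from Sensu into a chat channel - a minimal Sensu (classic) pipe handler might look like the following; the webhook URL is a placeholder:

```python
#!/usr/bin/env python3
"""Minimal sketch of a Sensu (classic) pipe handler that posts alerts to Slack.

Sensu classic passes the triggering event to the handler as JSON on stdin.
On-call escalation would be handled separately (the report says via PagerDuty).
"""
import json
import sys

import requests

SLACK_WEBHOOK = "https://hooks.slack.com/services/T000/B000/XXXX"  # placeholder

event = json.load(sys.stdin)
status = event["check"]["status"]  # 0 = OK, 1 = warning, 2 = critical
text = "{}: check '{}' is {}".format(
    event["client"]["name"],
    event["check"]["name"],
    "OK" if status == 0 else "FAILING",
)
requests.post(SLACK_WEBHOOK, json={"text": text})  # Slack incoming webhook
```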

The team demonstrated that they have a good testing procedure in place. The QA environment is stubbed and the data used for testing is anonymised. The team is also able to carry out end-to-end testing between Companies House and HMRC.
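
The report does not describe the stubbing approach in detail. As an illustration of the general pattern - tests running against stubbed downstream dependencies with anonymised data - a sketch in Python using the `responses` library (the URL and payloads are hypothetical) could look like:

```python
import requests
import responses


@responses.activate
def test_incorporation_accepted_by_stub():
    # Stub a hypothetical downstream incorporation endpoint so the test
    # never touches a real Companies House or HMRC system
    responses.add(
        responses.POST,
        "https://qa-stub.example.gov.uk/incorporations",
        json={"status": "accepted", "companyNumber": "12345678"},
        status=200,
    )

    # Anonymised test data stands in for real applicant details
    resp = requests.post(
        "https://qa-stub.example.gov.uk/incorporations",
        json={"companyName": "Example Trading Ltd"},
    )

    assert resp.status_code == 200
    assert resp.json()["status"] == "accepted"
```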

The team code in the open and demonstrated a good understanding of applying security patches while working in the open. There is good evidence of code reuse.

The team showed a good understanding of the security threats to the service. The risk to the service and the use of software is owned by the Senior Responsible Officer. The panel encourages the team to carry out threat analysis as a whole team whenever new features are implemented, to identify new security threats.

The service currently uses Barclaycard's Smartpay service for payments, but the team is engaging with GDS to start using GOV.UK Pay by the end of the year.
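
GOV.UK Pay exposes a REST API for creating payments and redirecting the user to a hosted payment page. A minimal sketch of what the migration target looks like (the API key, reference, amount and return URL below are placeholders, not the service's real values):

```python
import requests

GOVUK_PAY_URL = "https://publicapi.payments.service.gov.uk/v1/payments"


def create_payment(api_key: str) -> str:
    """Create a GOV.UK Pay payment and return the hosted payment page URL."""
    response = requests.post(
        GOVUK_PAY_URL,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        json={
            "amount": 1200,  # in pence - illustrative amount only
            "reference": "INC-0001",  # placeholder payment reference
            "description": "Company registration fee",
            "return_url": "https://example.gov.uk/register/payment-complete",
        },
    )
    response.raise_for_status()
    # GOV.UK Pay responds with links, including the page to send the user to
    return response.json()["_links"]["next_url"]["href"]
```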

The URL changes between tax.service for the HMRC portion of the service and ewf.companieshouse for the Companies House portion during the user journey, which looks inconsistent. The panel recommends using a single, standard URL throughout the service.

Design

The team have successfully made an intuitive transaction that users can complete first time. They’ve shown that they can iterate based on a sophisticated combination of user research, performance data and analysis of support tickets. When something is causing users a problem the team know about it, and they are able to make effective and appropriate design changes which make the process clearer. The improvements made to the way shares are granted are a great example of this.

Less well-resolved is what happens once the transaction is complete. The service doesn’t make it easy for the user to know what they need to do next.

This is exemplified by the volume and consistency of emails sent by the service. The user receives 8 emails over the course of the transaction. The information and reference numbers required to sign back in, or to take the next steps in the wider journey of setting up a company, are spread between the different emails. This will cause problems for the user later, especially since their company's next interaction with government could be months away (for example, filing their Corporation Tax return).

This proliferation of emails happens because the service connects to multiple legacy systems, not because it’s been deliberately designed that way. For this reason the emails are also inconsistent in terms of:

  • their branding
  • the design of template they use
  • the email address they are sent from

The team should make the emails consistent for the user. The panel recommends investigating GOV.UK Notify, which would ensure a consistent template and sending address.
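
As a sketch of what consolidation could look like using GOV.UK Notify's Python client (the API key, template ID and personalisation fields below are placeholders):

```python
from notifications_python_client.notifications import NotificationsAPIClient

notify = NotificationsAPIClient("api-key-from-notify")  # placeholder key

# One template and one sending address for every email in the journey;
# the template ID and personalisation fields are illustrative
notify.send_email_notification(
    email_address="applicant@example.com",
    template_id="f33517ff-2a88-4f6e-b855-c550268ce08a",  # placeholder UUID
    personalisation={
        "company_name": "Example Trading Ltd",
        "reference_number": "INC-0001",
    },
)
```

Because Notify templates are managed centrally, branding, template design and sending address stay consistent however many backend systems trigger the emails.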

The second major concern the panel had was the implications of a person being wrongfully identified as a director or shareholder of a company. The service team have not given this issue enough consideration.

There are no checks to ensure that the directors and shareholders are aware of the company or represent genuine identities. The onus is on the person who has been wrongly identified to realise that this has happened and to fix the problem.

The team should take the time to fully understand:

  • the implications of being wrongly identified as a director
  • what additional controls and checks are needed to reduce the risk of this happening

This could be as simple as requiring the directors and shareholders to also log into the service using Government Gateway or GOV.UK Verify.

The panel’s third concern was with the electronic signature page. While alternatives to wet signatures are encouraged, this particular solution adds no additional security because it relies on personal information that is often in the public domain (for example, mother’s maiden name and eye colour). As such it offers less protection against identity fraud than an ink signature. The team should remove this part of the service entirely.

Analytics

The team showed that they have used data and analytics to feed into building the service and have thought about what metrics they need to measure it. The service dashboard is set up on the Performance Platform to provide the mandatory KPI data. The team followed the recommendation in the alpha report to run a Performance Framework session. The panel would encourage the team to continue to revisit this Framework and use it as a working document as the service moves into public beta. There is more information describing how to do this at https://dataingovernment.blog.gov.uk/2016/11/02/setting-up-a-performance-framework-for-the-uk-parliament-website/.

While the team doesn't have a full-time performance analyst, they have dedicated time each sprint from a Google Analytics specialist at HMRC. They also have access to a Splunk specialist and wider support from the HMRC analytics team. Interestingly, the team is experimenting with a dual deployment of Google Analytics 360 alongside Piwik. At present the two tools are producing consistent figures, but thought should be given to the longer-term strategy for this and to which source can be relied upon as the single point of truth.

The team acknowledge the difficulty of reporting across the two different domains as users move between HMRC and Companies House. They are regularly iterating to improve the quality of the data available, for example by recently updating the referral exclusion list within Google Analytics. We would encourage the team to continue working to resolve this and to request support from GDS in finding a longer-term fix.

The team are making use of a variety of data sources, including regular analysis of Deskpro tickets. We were pleased to hear that tools are being developed to partially automate the analysis of this feedback as it is important to ensure the use of data can scale as the service moves into public beta. ‘Temperature checks’ were mentioned as a technique for monitoring social media data. We would suggest that the team add this to their Performance Framework and implement a more robust way of analysing this data - perhaps by looking at pre-packaged tools or using the Twitter API.
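
As an illustrative sketch of the Twitter API option, using application-only (bearer token) authentication against the standard search endpoint (the token and query are placeholders):

```python
import requests

BEARER_TOKEN = "..."  # placeholder; obtained via Twitter's application-only OAuth flow


def search_mentions(query: str):
    """Fetch recent tweets matching a query, for more robust sentiment analysis."""
    response = requests.get(
        "https://api.twitter.com/1.1/search/tweets.json",
        headers={"Authorization": f"Bearer {BEARER_TOKEN}"},
        params={"q": query, "count": 100, "result_type": "recent"},
    )
    response.raise_for_status()
    return response.json()["statuses"]


tweets = search_mentions('"register your company"')
```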

For the next assessment it would be good to hear more about all the outcomes users experience, not just those who succeed. Working closely with user research, it would be useful for the team to demonstrate drop-out across the upstream steps. It would also be helpful to hear about the strategy behind the A/B testing the team plans to carry out over the next few months, and the findings from those tests.

Recommendations

To pass the next assessment, the service team must:

  • Remove the electronic signature from the service
  • Investigate ways to protect people from being nominated as directors or shareholders without their knowledge
  • Look into the consistency and quantity of emails sent by the service
  • Look to make the URL for the service consistent throughout the user journey
  • Do broader research to understand how this service fits into the wider journey of setting up a business
  • Identify the most significant contextual barriers for people with disabilities and work to reduce or remove them
  • Test the service with the Senior Responsible Officer

The service team should also:

  • Be able to give a clearer representation of the different user types for their service
  • Research why users choose third-party formation companies instead of registering themselves
  • Research how the UTR is used by users

Next steps

You should follow the recommendations made in this report before arranging your next assessment.

This service now has permission to launch on a GOV.UK service domain. These instructions explain how to set up your *.service.gov.uk domain.

Submit feedback

Submit feedback about your assessment.

Get advice and guidance

The team can get advice and guidance on the next stage of development.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Not Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 6 August 2018