Register your company - Alpha Assessment

The report from the alpha assessment for HMRC and Companies House's Register your company service on 9 December 2016

Stage: Alpha
Result: Met
Service provider: HMRC and Companies House (CH)

The service met the Standard because:

  • The team have worked in an agile way to design a cross-government service to meet needs without reflecting departmental structures.
  • The team learn from user research, understand where users struggle and are iterating the service to help users succeed first time.

About the service

Service Manager: Wendy Linkins

Digital Leader: Mike Potter

The Register your company service allows users to create a company by simultaneously registering it with Companies House (known as incorporation) and HMRC for paying Corporation Tax (CT), Pay as You Earn (PAYE) and Value Added Tax (VAT).

Detail of the assessment

Lead Assessor: John Strudwick

User needs

The team has shown a good level of understanding of their users and needs by using a range of research methods, from contextual interviews and shadowing at the contact centre to usability testing. The panel suggest the team document high-level and subordinate user needs so they can be shared and displayed in the team space at both offices.

The team have talked to a wide range of people, including those who are at different stages of interaction with their service, for example, those who need to set up new companies and those who are going through the existing process.

The team have made a good effort to recruit research participants with low digital skills and accessibility needs. The panel recognise these users are difficult to recruit, however, the team will need to have researched these needs with a larger sample of users before the beta assessment. The panel recommend learning from service teams with a similar user group, such as the Company Accounts and Tax Online (CATO) service team.

Research on the paper channel has also been carried out to identify areas that could be improved.

The team has developed seven personas (evolved from an initial two) based on specific goals and needs. They have been iterating their personas to reflect their increasing understanding of users. The panel are pleased to see the personas include people with accessibility needs and low digital skills.

The team members are actively involved in observing research sessions. The team stream research sessions so team members at both offices are able to participate in real time. More comprehensive analysis takes place a few days later but rarely with teams from both offices present. The team demonstrated they are iterating the service based on feedback from a good amount of lab testing. Changes are prioritised based on their impact on users.


Team

The service is being designed by two teams, split between two sites: Cardiff and Telford. The teams use chat and video calls to stay in regular contact. The teams try to meet once a month and, if that’s not possible, hold ceremonies, such as retros, via video call. While the teams have worked hard to establish their working culture and align HMRC and Companies House objectives, for the beta assessment the panel will expect the teams to have been co-located for at least one day every fortnight. The panel also recommend that the two teams are allowed more face-to-face time, perhaps via away days as well as regular co-location.

Both teams are multidisciplinary and are using agile methodology. The teams have established a set of ceremonies that reflect their geographic separation. More regular co-location may require the ceremonies to adapt but the panel is confident the service team are willing and able to make any changes needed. The whole service team have established good feedback processes and have demonstrated their ability to handle difficult technical spikes.

The majority of the service team are contractors. HMRC and Companies House are upskilling graduate recruits and using this process to help with knowledge transfer. The panel recommend more is done to reduce the reliance on contractors and improve knowledge retention within the teams.

The team have been able to influence policy changes and the panel were pleased to hear about the success the team had doing this. However, further changes in policy are likely to benefit the service and the panel encourage the team to continue this work. The panel recommend the team encourage policy colleagues to be more involved with the service team, for example by asking them to attend the team day and user research sessions on a regular basis.

The panel are aware of the typical challenges faced by cross-departmental working, in particular the challenge to align each department’s objectives. The service team have overcome these challenges and been able to design a service that meets user needs and does not reflect the structure of government to the user. The panel commend the team’s work and effort to achieve this; it’s no small feat. The panel suggest the team share what they have learned by blogging and in conversations with the GDS Services Programme.


Technology

The team demonstrated good technology choices and development processes. They use modern open source products to build and release their code (Ansible, Jenkins, Puppet) and have automated their testing and release process. The project infrastructure is cloud based (AWS and UK Cloud) and there is good performance and security monitoring in place. End-to-end tests cover the deep stack on each side, along with a shallow integration between the two departments.
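To illustrate the shallow integration testing described above, a minimal sketch might stub the partner endpoint rather than exercising its deep stack. The endpoint name, payload shape and helper below are illustrative assumptions, not the team's actual API or test suite:

```python
import json
from unittest import mock

# Hypothetical client-side handoff between the two departments.
# The endpoint name and payload shape are illustrative assumptions.
def submit_registration(http_post, company):
    """POST an incorporation record to the partner system and
    return the acknowledgement reference from the response."""
    response = http_post("/partner/register", body=json.dumps(company))
    return json.loads(response)["acknowledgement_ref"]

# Shallow integration check: stub the partner API and assert on
# the handoff contract only, leaving deep-stack tests to each side.
fake_post = mock.Mock(
    return_value=json.dumps({"acknowledgement_ref": "ACK-001"})
)
ref = submit_registration(fake_post, {"name": "Example Ltd"})
assert ref == "ACK-001"
fake_post.assert_called_once()
```

Each department can then run deep end-to-end tests against its own stack, while this kind of contract-level check keeps the integration point fast to verify.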

The volumetric model and security threat risks are well understood and the relevant provisions are in place to mitigate those risks. The architecture is scalable and well decoupled by using standard communication means, such as HTTP and JWE.

The team are aware of cross government common services and are planning to use GOV.UK Pay and GOV.UK Notify.

The team also code in the open and understand the associated benefits.

The team have made some plans should the service go fully or partially offline. The panel recommends the team use private beta to plan and test for such an event. As part of the planning, the team should understand:

  • who will be contacted first
  • what the escalation route is
  • how long the service will stay offline
  • what backup processes and routes can be triggered to allow the public to continue using the digital service
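The checklist above could be captured as a simple runbook entry that the team tests during private beta. The structure, names and timings below are invented for illustration, not the team's actual plan:

```yaml
# Illustrative outage runbook entry (all names and timings are assumptions)
outage_plan:
  first_contact: service-manager-on-call
  escalation_route:
    - duty-webops
    - service-manager
    - departmental-incident-team
  max_acceptable_downtime: 4h
  fallback_channels:
    - paper incorporation form to Companies House
    - contact-centre phone support
```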


Design

The scope of the alpha has been restricted to users registering a company with 1 director and 1 shareholder. For private beta, the panel recommend the team focus on building and testing this scope so they can deliver its value to users as soon as possible. The team can avoid adding complexity at the start of their beta by, for example, deferring the increase in scope to include multiple shareholders and directors. The team should use a private beta of the alpha prototype to learn more about their users and inform their approach to adding additional functionality and complexity.

The team have done well to iterate the service based on user research. They have tackled some hard problems, challenged policy and managed to eliminate some questions that were causing issues for users. The team are aware of the main pain points within the service and are working towards improved solutions for these sections based on continuous user research (eg person with significant control (PSC), shares, disclosure exemption, business activity list).

The team need to make sure they are continuing to do research with users with access needs. They have noted a difficulty in recruiting suitable people but are looking at addressing this.

The team’s research suggests users are succeeding first time for the most part. However, the team have identified areas of the journey where users have difficulties. The team are continuing to do research on the ‘picking business activity’ list, which is a complicated list that doesn’t necessarily match a user’s understanding of what their business does. Also, the list is not consistent with the one used by the Office for National Statistics (ONS). Companies House may need to update the list based on the team’s research findings.

The team should continue to talk to GDS about using GOV.UK Pay, GOV.UK Notify and the Country Register. The Register team are also building accessible type ahead functionality for countries, which the team should seek to use.

The team noted that users with lower digital skills prefer to use mobile devices. The panel recommend the team do more usability research with these users and mobile devices. There are some sections in the journey that could prove difficult or confusing for those using a mobile.

More detailed interaction/graphic design feedback will be shared separately with the team.


Analytics

The team intend to use Google Analytics within the service. They also have plans to monitor support tickets and exit survey responses. The business analyst on the team will likely be responsible for the analysis. The panel recommend the team seek some dedicated time from a performance analyst each sprint.

In addition to the mandatory KPIs the team plan to track the number of penalties issued and drop out rates at particular stages of the user journey. The team have a good understanding of why penalties are issued and how they can help users avoid them.
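As a sketch of the drop-out tracking mentioned above, per-stage drop-out rates can be derived from counts of users reaching each stage of the journey. The stage names and figures here are invented for illustration only:

```python
# Illustrative funnel counts: users reaching each stage of the journey.
# Stage names and figures are invented, not real service data.
funnel = [
    ("start", 1000),
    ("company details", 820),
    ("director details", 640),
    ("payment", 610),
    ("confirmation", 600),
]

def drop_out_rates(stages):
    """Return the fraction of users lost at each stage transition."""
    rates = []
    for (_, count), (next_name, next_count) in zip(stages, stages[1:]):
        rates.append((next_name, round(1 - next_count / count, 3)))
    return rates

for stage, rate in drop_out_rates(funnel):
    print(f"{stage}: {rate:.1%} dropped out")
```

Flagging the transitions with the highest rates points the team at the journey sections most in need of research and iteration.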

The panel suggest the team start planning the service’s performance framework. To help, the team should read ‘Setting up a performance framework for the UK Parliament website’ and consider asking the GDS performance analyst team to run a similar workshop for the Register your company service.


Recommendations

To pass the beta assessment, the service team must:

  • Document the high-level and subordinate user needs; display them in team spaces.
  • Research with people who have low digital skills, lack internet access, lack the skill or confidence to use online services, or have accessibility needs.
  • Continue research on the offline channels (eg paper, phone) to ensure effective changes are being made to improve service experience of those users.
  • Co-locate the teams for at least 1 day every fortnight and try to provide more face-to-face time.
  • Do more to reduce the reliance on contractors and improve knowledge retention within the teams.
  • Follow recommendations in the design and content review.
  • Seek some dedicated time from a performance analyst each sprint.

The service team should also:

  • Conduct perpetual discovery research, doing more contextual research to continuously understand and evaluate users’ needs throughout the service lifecycle.
  • Encourage policy colleagues to be more involved with the service, for example by asking them to attend the team day and user research sessions on a regular basis.
  • Plan step by step actions for when the service is offline.
  • Focus on building and testing a service based on the scope of the alpha prototype. Avoid adding complexity until you’ve built and tested the minimum viable product.
  • Develop the service’s performance framework.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 12 October 2017