Apply to register as a Companies House authorised agent alpha assessment


Service Standard assessment report

Apply to register as a Companies House authorised agent

From: Central Digital & Data Office (CDDO)
Assessment date: 17/10/2023
Stage: Alpha
Result: Met
Service provider: Companies House

Service description

The Economic Crime and Corporate Transparency Bill (ECCT) is introducing the requirement for all directors, people with significant control and presenters within a company to verify their identity with Companies House (CH). It also requires third-party agents (accountants, for example) who either present filings to CH on behalf of clients, or who will verify the identity of their clients for CH, to register as a new entity: an Authorised Corporate Service Provider (ACSP). Three services are required for the ACSP journey:

  • Register an ACSP
  • Manage staff within the ACSP
  • Tell Companies House you’ve identity verified your client

While each service meets the standard in terms of naming conventions, we have summarised them as the ACSP services for the purposes of this report.

Service users

The requirement to register as an Authorised Corporate Service Provider (ACSP) applies to the following business types:

  • Sole trader
  • Limited Company
  • Limited partnerships and limited liability partnerships
  • Partnership, not registered with Companies House
  • Corporate body

According to the Economic Crime and Corporate Transparency Bill, ACSPs will need to:

  • Have a place of business in the UK
  • Be registered with a UK Anti-Money Laundering (AML) supervisory body
  • Have a relevant person (a director or partner of the company, or a sole trader) whose identity has been verified to the GPG45 medium level standard. For day 1 of public beta, only sole traders will need to meet this requirement.

Once registered, ACSPs can verify the identity of directors, people with significant control and presenters of the following business types:

  • Limited Company
  • Limited partnerships and limited liability partnerships
  • Partnership, not registered with Companies House
  • Corporate body

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • The research identified many needs, demonstrating a good understanding of the users.
  • The team created and tested many different versions of their design based on their initial hypotheses and user research findings. This allowed them to make informed decisions about what to keep and what to change, resulting in a better user experience.
  • It was encouraging to see a diverse range of users and stakeholders recruited for testing, across a variety of channels. Although recruiting users with accessibility needs was challenging, we were impressed with the team’s efforts to recruit proxy users.
  • Despite the difficulty of recruiting users with access needs, it was also encouraging to see the team use other methods to check for accessibility, such as manual testing and WAVE checks.

What the team needs to explore

Before the next assessment, the team needs to:

  • Put emphasis on the key persona points to make them easier to read and understand. Some of the data in the personas is irrelevant or difficult to interpret. For example, what is the difference between a “7” and an “8” on the “Interaction with Companies House” scale, and why is this information important?
  • Clarify whether the service has been tested on different devices and what the findings were. The personas show that users are likely to use different devices to access the service, but it is unclear whether this has been tested.
  • Give greater priority to users with accessibility and inclusion needs by testing with users who are more likely to use the service.

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • The team has collaborated with policy and legislation colleagues, regulatory bodies and users to develop a service that meets the needs of industry and users.
  • This team has worked well to understand what the GOV.UK One Login solution offers and how the ‘Apply to register as a Companies House authorised agent’ service should differ. The team has understood that business individuals have different needs when trying to verify their identity. Some may be able to self-serve and use their identity documents through One Login, whereas others may need help from an authorised agent.
  • The team has focused on supporting the agents who apply on behalf of the business individual. The team has a good understanding of agent behaviours, the internal business needs and what types of staff permission are required. The team did multiple rounds of user testing with different types of agents.
  • The team has developed and tested a new standard for identity verification that aims to reduce the number of fraudulent cases and improve the data quality on the Companies House Register. This has also tested well with company agents who were able to understand and use the new standard within the prototypes. This new standard also incorporates users who live outside of the UK.
  • The team created partnerships with anti-money laundering providers and the Office for Professional Body Anti-Money Laundering Supervision (OPBAS) so that in the future information sharing can occur to confirm agents’ memberships.
  • The team has designed for verification throughout the service: to use the service, agents must be registered with an anti-money laundering provider. The agents provide their membership number in the application so that Companies House can check this. If they are existing customers of Companies House or HMRC, this data can be pre-populated into the form.
  • The team showed evidence of working well across government to solve a whole problem by:

  • Collaborating well with GOV.UK One Login and other government departments to understand how the two services need to join up.
  • Working with HMRC to design for users who have registered with HMRC.
  • Holding regular show and tells with Companies House and the Department for Business and Trade.

What the team needs to explore

Before the next assessment, the team needs to:

  • Prepare methods that help the team understand the wider journey or context of this service. For example:

  • How this service will affect the anti-economic crime industry and working practices.
  • The wider impact of this service on clients’ and agents’ high-level goals and needs.
  • Once an agent has helped a client to be verified with Companies House, what are the next available options to the agent and the client? How will this communication be handled?

  • Work with the GDS team to understand where your journey might sit within the existing GOV.UK guidance and any content changes that are needed.
  • Prepare for how you might deal with the constraints that have arisen in alpha, for example:

  • Making sure Companies House can verify anti-money laundering memberships successfully.
  • Legacy technology that may change in time.
  • Once approval has been given, the wider communication of this service to a variety of users.

  • Continue working in the open in Companies House and government by doing show and tells and sharing insights.
  • Keep testing the end-to-end journey to make sure business individuals can use agents to verify their identity.
  • Keep working with One Login so that this route can integrate well with the requirements for the service.
  • Continue working well within the Identity Verification workstream in Companies House to make sure the services are joined up and work well as a whole.
  • Think about and plan for the unintended consequences that could arise from this service, in particular the things that may go wrong for users.
  • Look for pain points that might make the service hard for agents to use; for example, handling high-risk customers.

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • The team has designed the service to offer a digital self-service route and an Assisted Digital route into the service. The Assisted Digital users will be supported by call centre agents who are trained to support a variety of users with accessibility needs across Companies House services.
  • The team is working with the call centre operating agents. Training will be developed for the operating areas to prepare call centre agents to support users when they need help.
  • The service channels will be digital, email, call centre and mobile phone texts.

What the team needs to explore

Before the next assessment, the team needs to:

  • Test and design the service across different devices.
  • Design and test the Assisted Digital journeys with the operating teams.
  • Plan for scenarios where the service may fail or when users require more support, based on the evidence already collected or risky areas.
  • Test that the service channels are working well across digital, email, mobile phone text messages and the call centre.
  • Think about how notifications will be managed.
  • Develop a way for the team and the call centre team to understand how the service is performing across channels and how to prioritise improvements.
  • Make progress on improving the user experience across channels.

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that the team had:

  • Developed and tested five design iterations based on nine rounds of usability sessions with 50 participants (42 sessions), including accountants, solicitors, company secretaries, and compliance and money laundering reporting officers. Highlights were:

  • Testing the name of the service so that users can find it easily and the language meets their mental model.
  • Presenting intuitive options to users by using language they understand.
  • Testing email notifications with different users including the agent and client.
  • Simplifying how the identity check standard is presented to users.
  • Allowing users to provide a mobile phone number as a contact detail as well as an email address.

  • Designed with GDS patterns and reused existing patterns from services in Companies House and DWP.
  • Iterated and tested the designs with users whilst keeping to the GDS patterns.
  • Worked with the relevant communities of practice and the business areas in Companies House to share insights across teams.
  • Built verification checks into the application form (an illustrative sketch of this kind of check follows this list):

  • Checking in real time that the identity verification document types provided by clients meet the new standard.
  • Checking early in the application that agents are registered with an AML provider.
  • Making sure the right types of user apply by asking routing questions and communicating what needs to happen if they do not meet the eligibility criteria.

  • Reduced the number of times users have to provide information across services. If Companies House can connect to information about the agent from their existing database or from HMRC, then this information will be auto-populated.
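
To illustrate the kind of built-in verification check described above, here is a minimal sketch. It is illustrative only: the document type names, the AML membership check and the function names are hypothetical and are not taken from the service’s actual codebase or the new identity verification standard.

```go
package main

import (
	"fmt"
	"strings"
)

// Hypothetical set of document types accepted under the new identity
// verification standard; the real list is defined by the service, not here.
var acceptedDocumentTypes = map[string]bool{
	"biometric-passport":            true,
	"uk-photocard-driving-licence":  true,
	"uk-biometric-residence-permit": true,
}

// documentsMeetStandard reports whether every document type a client has
// provided is on the accepted list, so the agent can be told in real time.
func documentsMeetStandard(provided []string) (bool, []string) {
	var rejected []string
	for _, d := range provided {
		if !acceptedDocumentTypes[strings.ToLower(d)] {
			rejected = append(rejected, d)
		}
	}
	return len(rejected) == 0, rejected
}

// isRegisteredWithAMLBody is a stand-in for the early eligibility check that
// an agent is registered with an AML supervisory body; in the real service
// this would involve Companies House checking the membership number.
func isRegisteredWithAMLBody(membershipNumber string) bool {
	return strings.TrimSpace(membershipNumber) != ""
}

func main() {
	ok, rejected := documentsMeetStandard([]string{"biometric-passport", "library-card"})
	fmt.Println("documents meet standard:", ok, "rejected:", rejected)
	fmt.Println("AML registration present:", isRegisteredWithAMLBody("AML-12345"))
}
```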

What the team needs to explore

Before the next assessment, the team needs to:

  • Keep testing and iterating on the designs based on the research and risky areas to make sure the service is robust enough to take to the Private/Public beta.
  • Work with developers and technical suppliers to trial and test areas of the prototype that are dependent on data.
  • Design and test the full information architecture with users as the work moves forward.
  • Think about the appropriate places to have links to and from other services.
  • Build in the ability to save and return to applications, integrating the various journeys into an end-to-end service. The team mentioned at assessment some of these improvements would already be part of their next iteration, which was underway.
  • Keep testing the journeys with each user group to understand the different needs and improvements needed, including testing unhappy paths.

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • The team conducted research with a range of different users to understand their different motivations and needs. This included accountants, solicitors, company secretaries, and compliance and money laundering reporting officers.
  • The team tested the designs with six proxy users who have accessibility needs (dyslexia, tunnel vision, inverted colours).
  • The team also tested for accessibility requirements throughout the design process:

  • The prototype has been checked for accessibility through a combination of manual and WAVE checks.
  • There are accessibility skills and reviews within Companies House.

  • The contact centre team is trained to support users who need assistance.

What the team needs to explore

Before the next assessment, the team needs to:

  • Increase the number of research sessions with users and those with accessibility needs.
  • Consider if the service has any problem areas that lead to people not being able to use the service.
  • Prepare to have an accessibility audit and fix any issues that arise from the report.
  • Keep testing the designs for accessibility requirements with a broader set of tools.
  • Utilise the expertise within Companies House and this team to support accessibility.
  • Test and design across multiple devices and browsers, including desktop, mobile and tablet.

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • The team is multidisciplinary, involving a number of full-time and part-time roles to ensure a breadth of disciplinary insight. These disciplines include: product management, delivery management, service design, interaction design, user research, technical leadership, software development, content design, performance analytics and business analysis.
  • The mix of part-time roles enables join up across the wider programme of work to support identity verification within Companies House. As such the risks to communication with an extended team are balanced by the benefits of alignment across complementary services.
  • The team demonstrated that it works well together, with most attendees speaking during the assessment, and providing aligned and complementary responses to the panel’s questions.
  • The team includes a mix of permanent and interim staff, namely a 5:4 ratio. This is to be expected given challenges with recruitment and the fixed-term nature of projects. The ratio feels appropriate for the team.

What the team needs to explore

Before the next assessment, the team needs to:

  • Reflect on the risks of a primarily outsourced development team as they progress through the beta phase, and the best ways to manage this. A separate development team does not align with the multidisciplinary expectations of the Service Manual, and an outsourced team is a common source of technical debt in live service teams. The team was, however, not concerned, and explained that this is an established way of working within Companies House, suggesting they felt the risks were mitigated.

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • The team demonstrated that they worked effectively as a squad, with the ceremonies expected to plan and learn as a team. The team did a good job of talking us through the risky assumptions they’d identified for Alpha, and how they’d reflected and iterated on these throughout the phase.
  • The team also demonstrated that they work effectively in the open with other teams in Companies House that are also working on identity verification, with the relevant professional communities, and with related policy teams. The team referenced open sessions with HMRC as well as DBT, for example.
  • The service owner demonstrated a good understanding of their funding, the governance process required, and their commercial and other dependencies.

What the team needs to explore

Before the next assessment, the team needs to:

  • Consider developing a stronger, team-wide understanding of the impact that the services need to have on users, and of value for money. This will enable the full team to optimise its decisions, including on proportionate investments. This point is also picked up under point 10.

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • The team identified its riskiest assumptions, building on the discovery phase. These were explicitly called out and presented.
  • The team explained how they worked to revisit these as they learned more. The team was able to demonstrate how they iterated from prototype version 1 to version 5, and was working on version 6 at the time of the assessment, demonstrating that this culture is built in rather than done in order to meet assessment criteria for the next phase.

What the team needs to explore

Before the next assessment, the team needs to:

  • Build in regular communications with users. This was limited at the time of the assessment, due to the new Bill not yet coming into force.

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • A data impact assessment has been completed and consideration has been given to the data requirements of the service.
  • Penetration testing is planned, and ongoing annual tests will be the norm.
  • Good security practices are employed, and standards such as OWASP and the NCSC Cloud Security Principles are followed.
  • Vulnerability scanning will be adopted, and teams include security champions.
  • Least privilege processes were described in relation to temporary and permanent datastore access.
  • Web application firewalls will be used to protect the service against DDoS and other common internet-based threat vectors.

What the team needs to explore

Before the next assessment, the team needs to:

  • Consider more threat modelling, especially around agent-introduced users who have not undertaken identity verification.
  • Continue to ensure software uses the latest standards and language versions, and adopts supported, cloud-native services.
  • Look at whether further technical practices and other pattern-based monitoring can be adopted to help identify any fraudulent use of the system early.

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • The team was working on the four mandatory KPIs for digital services and had a benchmark in place for many of these, with a good understanding of the journeys to look out for.
  • Additionally, the team was integrating customer experience measures from the start.
  • The team also started to consider outcome measures to help them understand how successful the service is in solving the problems it is looking to address.
  • When prompted, the team demonstrated a lot of knowledge around fraud and threat data patterns which they would need to keep a close eye on for this service.

What the team needs to explore

Before the next assessment, the team needs to:

  • Develop its cost per transaction modelling, and consider what would represent value for money, in line with the business case behind the identity verification programme at Companies House.
  • Develop a plan to evaluate the outcomes this service is looking to deliver against, to ensure the service is being built in a way that enables data to be collected and monitoring to start at the earliest opportunity.

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • The team has architected the system to ensure there is minimal interaction between the main legacy system and the new proposed functionality, decoupling the two using a message-based pattern that allows further transformation to take place (a minimal sketch of this pattern follows this list).
  • The team has chosen to adopt appropriate container-based technology hosted in public cloud using managed servers.
  • The team has chosen suitable, in-support languages that are widely used within the agency and will not add to technical sprawl.
  • The system will take advantage of the GOV.UK One Login authentication and identity verification system.
  • Adoption of the save-and-resume pattern, using a datastore, will lead to an improved user experience.
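
As an illustration of the decoupling pattern described in the first bullet, here is a minimal sketch. It is illustrative only: the event type, the Publisher interface and the in-memory implementation are hypothetical, and in the real architecture a managed message broker would sit between the new functionality and any adapter in front of the legacy system.

```go
package main

import "fmt"

// RegistrationSubmitted is a hypothetical event emitted by the new ACSP
// registration functionality. Downstream consumers subscribe to events
// rather than being called directly, keeping the two sides decoupled.
type RegistrationSubmitted struct {
	ApplicationID string
	AMLBody       string
}

// Publisher abstracts the message broker; the new functionality depends
// only on this interface, not on the legacy system.
type Publisher interface {
	Publish(event RegistrationSubmitted) error
}

// channelPublisher is an in-memory stand-in for a real broker, used here
// only to make the sketch runnable.
type channelPublisher struct {
	events chan RegistrationSubmitted
}

func (p *channelPublisher) Publish(e RegistrationSubmitted) error {
	p.events <- e
	return nil
}

func main() {
	broker := &channelPublisher{events: make(chan RegistrationSubmitted, 1)}

	// The new functionality publishes an event and carries on.
	_ = broker.Publish(RegistrationSubmitted{ApplicationID: "APP-001", AMLBody: "example-body"})

	// A separate consumer (for example, an adapter for the legacy register)
	// picks the event up asynchronously.
	e := <-broker.events
	fmt.Printf("legacy adapter received application %s (AML body: %s)\n", e.ApplicationID, e.AMLBody)
}
```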

What the team needs to explore

Before the next assessment, the team needs to:

  • Explore a microservices or serverless approach to the anti-money laundering functionality, and ascertain whether this will allow further decoupling and architectural improvement.
  • Consider carefully the impact of supporting multiple supervisory body ingestion formats, and aim to work with those organisations to standardise a pattern, or consider providing an API in Companies House that others can adopt.

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • The team is committed to coding in the open.
  • The team applies appropriate security controls and practices to keep the codebases free from secrets and sensitive information.

What the team needs to explore

Before the next assessment, the team needs to:

  • Continue publishing code.
  • Keep adopting a repository template pattern for ease of setup and consistency and, where not already in place, always include a licence file, a contributing file, code owners, issue and pull request templates and a security.md file.

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • The team has adopted REST APIs and is using the OpenAPI specification.
  • There is obvious re-use within the agency and common component teams exist.
  • Identity verification is completed by a centralised solution built on top of GOV.UK One Login, and GPG45 is well understood by team members.
  • Use of GOV.UK Pay and the design system is customary practice.

What the team needs to explore

Before the next assessment, the team needs to:

  • Adopt standard data exchange using the JSON open data format more widely where possible (legacy software permitting), negating the need for JSON-to-XML format translations (see the sketch below).
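
The sketch below illustrates exchanging a record as JSON end to end, avoiding any JSON-to-XML translation step. It is illustrative only: the record type, its field names and the values are hypothetical and not an agreed exchange format with any supervisory body.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// AMLMembership is a hypothetical record exchanged with a supervisory body.
// Field names are illustrative only; a real format would be agreed with the
// supervisory bodies.
type AMLMembership struct {
	AgentName        string `json:"agentName"`
	MembershipNumber string `json:"membershipNumber"`
	SupervisoryBody  string `json:"supervisoryBody"`
}

func main() {
	// Serialise the record as JSON; the same format is parsed on the other
	// side, so no JSON-to-XML translation is needed.
	out, err := json.Marshal(AMLMembership{
		AgentName:        "Example Accountancy Ltd",
		MembershipNumber: "AML-12345",
		SupervisoryBody:  "example-supervisory-body",
	})
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))

	var in AMLMembership
	if err := json.Unmarshal(out, &in); err != nil {
		panic(err)
	}
	fmt.Printf("parsed membership %s for %s\n", in.MembershipNumber, in.AgentName)
}
```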

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • There is an aim to use suitable observability practices, with visualisation built using widely adopted tools.
  • Managed public cloud services for containerisation were adopted.
  • Alerting and system health considerations have been undertaken and dedicated platform teams are in place.
  • Multiple environments with production parity, automated tests and quality assurance practices are embedded within the team workflows.
  • Blue/green deployments are in place and an uptime target is set at the agency level.
  • Recovery point and recovery time objectives have been considered, and suitable backup and recovery patterns and processes were presented to the panel.

What the team needs to explore

Before the next assessment, the team needs to:

  • Review if CI/CD practices could be further enhanced to get to continuous deployment safely.

Next Steps

This service can now move into a private beta phase, subject to implementing the recommendations outlined in the report and getting approval from the CDDO spend control team. The service must meet the standard at beta assessment before launching public beta.

To get the service ready to launch on GOV.UK the team needs to:

Published 11 December 2023