Manage Intellectual Property 

Intellectual Property Office's Manage Intellectual Property beta assessment report


Assessment date 25/02/2026
Assessment stage Beta
Assessment type Assessment
Service provider Intellectual Property Office
Result Amber

Previous assessment reports

https://www.gov.uk/service-standard-reports/manage-your-intellectual-property

Service description

Manage your intellectual property (IP) is a new digital service that allows holders of IP rights (patents, trademarks and designs) to view and control their rights in one place. It also allows IP attorneys to manage those rights on their behalf. The service comprises a customer account, a way of linking to existing rights, a way of viewing the details of those rights, and several transactions that allow users to manage their IP throughout its lifecycle. The new, transformed digital service replaces several disparate existing paper and partially online transactions. The first phase of this transformation covers the management of patents, with the transformation of trademarks and designs to follow later.

A separate service for applying for new rights (Secure your IP) is being developed in conjunction with Manage your IP. The two services have been assessed by the same panel (with one change) in successive weeks. The common tech solution has been assessed once in conjunction with the Secure your IP assessment, and the commentary is replicated in both reports.

Service users

  • IP rights owners – large and small business owners or entrepreneurs with or without legal representation.
  • Legal professionals – attorneys who specialise in IP and act on behalf of clients, in-house IP specialists employed by companies and formalities / paralegals within law firms who handle application administration.
  • Internal IPO staff – formalities staff who perform quality checks at various process stages, examiners who assess applications for uniqueness and compliance with IP law, and team leaders who manage staff and ensure smooth operations.

Things the service team has done well:

  • Research: The team had a thorough understanding of their users and had clearly put a lot of effort into researching the different archetypes, which informed the design and tech decisions and iterations that followed. The team demonstrated how they researched with niche user groups, rather than focusing only on the main archetype, which makes up 90% of users.
  • Design: The team are aligning other channels with the improvements made to the user journey when designing the digital service, by updating paper forms, contact centre training and scripts. They also demonstrated several iterations to the service based on user insight, including simplifying the navigation on the Record changes pages and removing unnecessary questions that had proved challenging for users.
  • Lead: The team were able to present a complex topic with ease, and were happy to share their constraints around legal challenge and operational change. The work to transform IP has been carefully thought through to ensure that customer service will not be negatively impacted, and the work supports the digital ambition displayed. 
  • Analytics: The team showed a comprehensive set of measures derived from the user needs of the service and worked through with the full team, underpinned by a well-developed data setup for capture and reporting that has been iterated with the team.
  • Tech: OneIPO demonstrates strong technical architecture and good adoption of government digital services. The service shows excellent progress since alpha in technology choices and standards compliance. Key strengths include comprehensive security testing, modern cloud-native architecture, and effective use of shared government components.

1. Understand users and their needs

Decision

The service was rated green for point 1 of the Standard.

Optional advice to help the service team continually improve the service:

  • It was not entirely clear that internal users were given proportionate attention, or whether they were engaged with the digital transformation. Across all the research discussed, quotes or video/audio artefacts, redacted sufficiently to comply with participants' consent, would have eased this concern and strengthened the assessment.
  • Research with users with assisted digital needs was limited because the team tried and struggled to identify them, although the team did carry out some research with users with accessibility needs. While this is a desk-based service and nearly all users will have a high degree of digital literacy and access, the team should collect data from non-digital users to try to quantify how many, if any, have assisted digital needs.

2. Solve a whole problem for users

Decision

The service was rated green for point 2 of the Standard.

3. Provide a joined-up experience across all channels

Decision

The service was rated green for point 3 of the Standard.

4. Make the service simple to use

Decision

The service was rated green for point 4 of the Standard.

5. Make sure everyone can use the service 

Decision

The service was rated amber for point 5 of the Standard.

This is amber because:

6. Have a multidisciplinary team

Decision

The service was rated green for point 6 of the Standard.

Optional advice to help the service team continually improve the service:

  • The panel was impressed by the level of permanent civil servants on the team, and by their plans to ensure longevity of experience on the wider programme. While content changes were noted as out of scope, access to a content designer might be useful. The team should also review the shape of their analytics and how this feeds into and informs hypothesis-driven work.

7. Use agile ways of working

Decision

The service was rated green for point 7 of the Standard.

8. Iterate and improve frequently

Decision

The service was rated green for point 8 of the Standard.

Optional advice to help the service team continually improve the service:

  • The team should consider how they can more clearly demonstrate that they are making best use of evidence to confirm or disprove the hypotheses attached to iterations.

9. Create a secure service which protects users’ privacy

Decision

The service was rated green for point 9 of the Standard.

  • The team were clearly working to Secure by Design principles and had planned threat modelling workshops with all team members.
  • Penetration testing had been completed, and suitable SAST tooling was in place to capture vulnerabilities introduced in the supply chain.
  • The design and overall system complied with GDPR and showed a mature and responsible approach to data collection, retention and publication to comply with legislation.

10. Define what success looks like and publish performance data

Decision

The service was rated amber for point 10 of the Standard.

This is amber because:

  • We did not see evidence of using the 40+ identified measures to create a subset of Key Performance Indicators to understand the success of the service at a high level.
  • We did not see evidence of setting specific measures around current iterations and emerging user needs. This process needs to start with Analytics, User Research and Design colleagues, before the iterations go live so the team can understand whether the change has had the desired effect.
  • The panel recommend working on the visualisations in the dashboards to make them more direct and accessible to a multidisciplinary team.

11. Choose the right tools and technology

Decision

The service was rated green for point 11 of the Standard.

  • A modern cloud-native stack, appropriately hosted on containers, with a range of technologies meeting the architectural guidelines within the department. A good governance process and technical design authority are in place.
  • The service uses a number of common platforms (GOV.UK Pay, GOV.UK Notify), extended since the revision to adopt GOV.UK One Login, and uses APIs to obtain data from other departments.
  • There is dedicated monitoring and a dedicated team for the production service; cost optimisation has been considered and scaling is possible, albeit with some manual intervention as per team policy.
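The cross-departmental API calls noted above generally need resilience against transient failures. As an illustrative sketch only (the team's actual client code was not shown at assessment, and all names and defaults here are assumptions), a common pattern is a retry wrapper with exponential backoff and jitter:

```python
import random
import time
from typing import Callable, TypeVar

T = TypeVar("T")

def call_with_retry(fn: Callable[[], T],
                    attempts: int = 4,
                    base_delay: float = 0.5) -> T:
    """Call fn, retrying transient failures with exponential backoff.

    Hypothetical helper for illustration; not taken from the OneIPO codebase.
    """
    last_error: Exception | None = None
    for attempt in range(attempts):
        try:
            return fn()
        except Exception as error:  # real code would catch narrower error types
            last_error = error
            if attempt < attempts - 1:
                # back off 0.5s, 1s, 2s, ... with jitter to avoid retry storms
                time.sleep(base_delay * (2 ** attempt) * random.uniform(1.0, 1.5))
    raise last_error
```

In practice the retried exceptions would be limited to network timeouts and 5xx responses, so that genuine client errors fail fast.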

12. Make new source code open

Decision

The service was rated amber for point 12 of the Standard.

This is amber because:

  • The service uses a single GitHub repository, and not enough effort has been placed on open sourcing or coding in the open.
  • The open sourcing policy provided, while operationally well designed, contradicts point 12 of the Service Standard and established GDS guidance on publishing open source code, and needs attention.
  • The team do good work, and sharing it with the wider community would benefit both them and others, often helping to find and prevent security incidents. Security through obscurity is not effective security and can lead to complacency through false confidence.
  • The team need to revise the policy, consider coding-in-the-open practices and practise good licensing (see examples such as https://engineering.homeoffice.gov.uk/standards/open-source-licensing/), and develop a policy more in keeping with the Standard. More recent examples, such as the policy DWP published during its procurement (https://www.gov.uk/government/publications/dwp-procurement-security-policies-and-standards/open-source-code-publishing-policy), show alignment for departments that may be processing sensitive data.

13. Use and contribute to open standards, common components and patterns

Decision

The service was rated green for point 13 of the Standard.

  • The team are using GOV.UK design patterns and described iterating the designs based on user insight. It is positive that changes are shared internally; the panel advise the team to make their designs available to other departments, whether via GitHub or a departmental design system.
  • Excellent adoption of GOV.UK Pay, One Login, Notify plus Companies House integration.
  • Common standards and patterns are employed: OAuth2, OpenID Connect and JWTs; RESTful APIs with OpenAPI/Swagger definitions.
  • The team understand which domain-specific standards apply: WIPO ST.90 compliance for IP data processing and API communication.
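The token pattern named above can be illustrated with a minimal sketch. This is not the IPO's code: it shows the general shape of JWT verification using an HS256 shared secret so the example stays self-contained, whereas OAuth2/OpenID Connect providers typically sign tokens with RS256 and publish verification keys via a JWKS endpoint.

```python
import base64
import hashlib
import hmac
import json

def b64url_decode(segment: str) -> bytes:
    # JWT segments are base64url-encoded without padding; restore it first
    return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))

def verify_hs256(token: str, secret: bytes) -> dict:
    # Illustrative only: production OIDC tokens are usually RS256-signed and
    # the verifier fetches the provider's public keys from its JWKS endpoint.
    header_b64, payload_b64, signature_b64 = token.split(".")
    signing_input = f"{header_b64}.{payload_b64}".encode()
    expected = hmac.new(secret, signing_input, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, b64url_decode(signature_b64)):
        raise ValueError("JWT signature mismatch")
    return json.loads(b64url_decode(payload_b64))
```

Before trusting the returned claims, a real verifier must also check the issuer, audience and expiry against expected values.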

14. Operate a reliable service

Decision

The service was rated green for point 14 of the Standard.

  • Highly available Azure deployment (UK South/West) with Infrastructure as Code via Terraform.
  • Comprehensive toolset (Dynatrace, Azure Insights/Monitor) for monitoring and understanding the performance of the service.
  • Active/passive failover to UK West; DR testing completed, with scheduled plans. The panel advise, however, automating the scaling policies, strengthening runbooks for incident response, and agreeing and publishing RTO (recovery time objective) and RPO (recovery point objective) targets after consultation with the business.

Updates to this page

Published 11 April 2026