Identity and Access Management - alpha

The report from the alpha assessment for IPO’s Identity and Access Management service on 6 October 2017.

From: Government Digital Service
Assessment date: 06 October 2017
Stage: Alpha
Result: Met
Service provider: Intellectual Property Office (IPO)

The service met the Standard because:

  • The service team have identified the most common needs from a range of users to show the requirement for this enterprise service component
  • A strong agile team is in place and is demonstrating a clear focus on building a secure sign-in for future IPO services
  • A sound technical approach has been taken for the service

About the service


The service is a SaaS cloud-based solution for online identity and access management.

Service users

The users of this service are the citizens, businesses and intermediaries in the UK and overseas who apply for and manage intellectual property rights (patents, trade marks and designs).


User needs

The service team as a whole recognised and acknowledged their users' needs. The user researcher demonstrated a good understanding of the service user base and clearly explained who the users are, referring to the persona set. The personas introduced during the assessment appeared to be a robust representation of the user types and included a user with assisted digital needs.

A number of over-arching user needs were cited in the briefing paper, and the researcher played video of research sessions held in a lab that evidenced needs being identified and the service being iterated to meet those needs. The research methodology was wide-reaching; as well as using a research lab, researchers visited users in their place of work, carried out remote research and held guerrilla research sessions at IPO sites across the country.

The alpha research built on knowledge gained during discovery when large numbers of users provided feedback, which was combined with the outcomes from 40 interviews during alpha. The researcher did not seem to have a comprehensive research plan in place for beta, although he does intend to increase the number of face-to-face sessions during the beta phase - and the panel recommended making greater use of remote interview technology to support that process.

No research with disabled users was undertaken during alpha. However, the service team affirmed that they are aware of the need to meet accessibility design standards; that research to understand the needs of users with accessibility issues had previously been carried out by a specialist centre, which employs users with a range of disabilities to test digital services; and that they intend to return to the centre during the beta phase for an accessibility audit of the service.

The service manager is the assisted digital lead. He has a well-thought-through plan in place to provide support to users with assisted digital needs across the country. The support model makes use of existing IPO service centres and staff for face-to-face support, as well as a telephone service which will be provided at no cost to users. However, the team have found it hard to identify and carry out research with users with assisted digital needs. The panel recommended that the team contact GDS to discuss accessing an existing AD panel.


Team

The IDAM service team work alongside a third-party Pirean team, who own, manage and host the service. The service team is multidisciplinary, with all relevant roles in place going into private beta. They are co-located in Newport, while the Pirean team is based in Canary Wharf. The service team ensure close working with Pirean by using VSTS as a communication tool and holding daily calls and joint show and tells. The service team were able to demonstrate that they are working in an agile way, to a fortnightly sprint cycle. They were clear about the intent and priorities of the service and their focus on developing an access management system for future IPO services. The team were also able to articulate the governance route for this service, and the route in relation to the other IPO services that will consume it.


Technology

The service team has taken a pragmatic and measured approach to procuring an identity management and single sign-on component, and have successfully based their vendor assessment and selection process around identified user needs. Similarly, the team have trialled their frontrunner product, Pirean, as part of a functional alpha prototype demonstrating real-world usage, including integration with sample services. This approach to testing and assessing commercial software with users in a real-world context before entering into any contractual agreements with the supplier is to be commended.

The panel was also impressed to hear that the IPO team are working in close collaboration with the Pirean team, and that potential issues with different product development processes between IPO and Pirean have been identified, acknowledged and addressed early on, which has ultimately led to a successful combined team running across both organisations.

The team’s approach to information security is also commendable: trade-offs between usability and security concerns have been well balanced, with internal IPO security accreditors and architects closely involved throughout. Similarly, the overall identity architecture has been based on the widely used OIDC and SAML open standards, allowing vendors to be changed at a later date, with the team demonstrably aware of what kind of effort would be required for such a change.
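The portability the panel describes follows from relying only on standard OIDC fields. As a minimal sketch (the endpoints and client name below are hypothetical, not IPO's real configuration), an authorization request built from a provider's discovery metadata needs no vendor-specific code, so swapping suppliers means pointing at a different discovery document:

```python
from urllib.parse import urlencode

# Hypothetical metadata, as would be returned by an OIDC discovery
# document (/.well-known/openid-configuration). Values are illustrative.
PROVIDER = {
    "issuer": "https://idam.example.ipo.gov.uk",
    "authorization_endpoint": "https://idam.example.ipo.gov.uk/oauth2/authorize",
    "token_endpoint": "https://idam.example.ipo.gov.uk/oauth2/token",
}

def build_authorization_url(metadata, client_id, redirect_uri, state):
    """Build a standard OIDC authorization-code request.

    Only fields defined by the OIDC specification are used, so
    `metadata` can come from any compliant vendor's discovery document.
    """
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": "openid profile",
        "state": state,
    }
    return metadata["authorization_endpoint"] + "?" + urlencode(params)

url = build_authorization_url(
    PROVIDER, "patents-service", "https://www.gov.uk/patents/callback", "xyz123"
)
```

Under this shape of integration, the "effort required for such a change" the team described is largely configuration rather than code.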

The panel were particularly interested to hear how the team has accounted for the differing needs of dependent IPO services, both present and future, and how they have taken a balanced approach to allowing for a range of future requirements, such as both role- and account-based access control, without taking a “kitchen sink” approach and selecting a product based on the number of features rather than their quality.
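To illustrate the distinction between the two access-control requirements mentioned above, a minimal sketch (the names and data model are hypothetical, not IPO's design) might combine both checks behind one interface:

```python
from dataclasses import dataclass, field

@dataclass
class User:
    username: str
    roles: set = field(default_factory=set)        # role-based grants (RBAC)
    account_ids: set = field(default_factory=set)  # account-based grants

def can_access(user, required_role=None, account_id=None):
    """Grant access if the user holds the required role (role-based
    control) or owns the specific account being acted on
    (account-based control)."""
    if required_role is not None and required_role in user.roles:
        return True
    if account_id is not None and account_id in user.account_ids:
        return True
    return False

# Illustrative users: one granted by role, one by account ownership.
examiner = User("alice", roles={"examiner"})
holder = User("bob", account_ids={"ACC-42"})
```

A component that models both styles from the start can serve dependent services with either requirement without feature sprawl.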


Design

The design of this component is consistent with GOV.UK style, and the team have improved and iterated the flow over several rounds of user testing, to the point where all four users they recently tested with succeeded first time. There are, however, some areas the team needs to push further, including simplifying the password criteria in line with service manual guidance.

The team said the design has been approached as “mobile first”, but they have not yet tested the component on a range of devices because the users in the lab preferred desktop. We were not able to go into more detail on this as the designer wasn’t present; going forward, the team should make sure the flow works on different kinds of devices.

We discussed in some detail the terms and conditions page, which the team have worked hard to simplify and reduce - but users still don’t read it. The panel encourage the team to keep pushing in this respect, and do the hard work to make it simple. It would be worth talking to Digital Marketplace about this as they have been working on creating simpler and clearer contracts for a long time. The panel also had some concerns that useful information about how to use the account is hidden within these T&Cs, and more thought needs to go into how this information can be surfaced to users as and when it is relevant.

The team have a plan for how they will support users when IDAM is offline - allowing users to see some information and use some services that don’t require a login, and using messaging to explain what is going on. They also suggested there will be alternative routes for users to submit applications without a login. Users could still file an application, but any pre-populated information wouldn’t be available and they’d have to enter it again. This process is time-sensitive because the date of filing is significant, so it’s important that users can still file. The team showed they are taking important user needs such as this into account.

Due to the nature of this component it is difficult to assess the full end-to-end journey, but users will create an account the first time they use one of IPO’s online services. This will then be their login for any IPO services they want to use. These services will be rolled out over time and will have separate start pages on GOV.UK. As this happens the team will need to consider how the component fits into these multiple journeys with multiple start points, and how they communicate to users that they can use the same login, or have access to more services once they have logged in. They will also need to consider how the introduction of more services affects the T&Cs, and how and when users get more information about managing their account as they have more options.

In terms of accessibility the team seems to have good in-house capability to catch and fix a lot of the basic accessibility issues. They have run their own accessibility audit and have several stories in their current backlog to fix the known issues before putting their prototype in front of users with accessibility needs.


Analytics

The service team highlighted that they are planning to collect data on elements such as password resets and on the end-to-end journey through the consuming IPO services. The team are using Piwik as a tracking tool to measure visits through the service, and responsibility for analytics currently sits with the UX team and product owner. When considering the completion rate, thought has been given to users completing both the component journey and the end-to-end journey through to gaining access to a consuming service. At the moment the first service for integration is the Online Deposit Account Data, and measures will be more fully determined when the wider IPO services become available.
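The two completion rates described above - through the sign-in component itself, and end to end into a consuming service - can be sketched as follows. This is an illustrative funnel calculation only; the event names are hypothetical, not the team's real Piwik configuration:

```python
def completion_rate(sessions, start_event, end_event):
    """Fraction of sessions containing start_event that also reach end_event."""
    started = [s for s in sessions if start_event in s]
    if not started:
        return 0.0
    completed = [s for s in started if end_event in s]
    return len(completed) / len(started)

# Illustrative session event streams (names are made up for the sketch).
sessions = [
    ["sign_in_start", "sign_in_done", "service_done"],  # full journey
    ["sign_in_start", "sign_in_done"],                  # component only
    ["sign_in_start"],                                  # abandoned
]

component_rate = completion_rate(sessions, "sign_in_start", "sign_in_done")
end_to_end_rate = completion_rate(sessions, "sign_in_start", "service_done")
```

Measuring both rates separately, as the team intend, distinguishes drop-off caused by the sign-in component from drop-off in the consuming service.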

It will also be important to collate user satisfaction and the team will need to co-ordinate this with future services to ensure the end to end service is represented.


Recommendations

To pass the next assessment the service team must:

  • Increase user research and testing with users with accessibility needs
  • Ensure all users’ needs are included in the user research plan
  • Show that they have tested the service on a range of devices
  • Understand reliant services’ expectations regarding performance levels and identify any additional KPIs to reflect these


Next Steps

You should follow the recommendations made in this report before arranging your next assessment.

Please contact the Service Assessment team at least 4 weeks before the date you’d like to hold a reassessment.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up  
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it  
Published 24 July 2018