Conclave - Alpha Assessment

The Report for the Crown Commercial Service's Conclave Service Alpha Assessment on 24th March 2021

Digital Service Standard assessment report

Conclave

From: Government Digital Service
Assessment date: 24/03/21
Stage: Alpha
Result: Not met
Service provider: Crown Commercial Service/Cabinet Office

Service description

Conclave is a set of interrelated enabling technologies that provide single sign-on (SSO), organisation identity and supplier evidence services to CCS, across government and the wider public sector. Overall, the service allows users, buyers and suppliers to securely access all CCS service offerings in order to transact. This includes providing the documentation needed to prove their eligibility to use CCS services on a supply-once, use-many-times basis.

Service users

This service is for employees of government, public sector and third sector organisations looking to procure goods and services (buyers), and for sellers of goods and services, both large businesses and small and medium-sized enterprises (SMEs):

  • CCS Category Managers
  • public sector procurement officers
  • private sector suppliers to the public sector
  • charitable organisations
  • SMEs and sole traders

1. Understand user needs

Decision

The service did not meet point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have conducted user research to pinpoint needs/requirements related to the procurement process
  • research has been conducted with users from each of the target user groups (both buyers and sellers, from multiple sizes and types of business) to inform the needs generated by the team

What the team needs to explore

Before their next assessment, the team needs to:

  • consider revisiting their user needs to refine them, based on the guidelines on writing user needs for GOV.UK. The team presented the panel with user needs that were often solution-specific and, as a result, it was difficult to understand how the proposed solution had been decided upon based on solving a problem for users
  • conduct further research with users with access needs. Although the team had sought out 2 users with access needs and plan to procure an accessibility report, it is crucial that the team involve users with access needs in each round of research to determine the usability of the proposed service. The researchers may wish to reach out to the Civil Service Disability network for assistance in recruiting Civil Servants with access needs
  • understand the digital inclusion of their user groups in more detail. The team had not conducted research with users with assisted digital needs, despite opting to create a digital-only solution, and felt that their public sector users in particular did not fall within the remit of assisted digital users

  • in line with guidance from the service manual, which states “If you think your service has no users who need assisted digital support, you need to be able to explain how your user research and testing support this”, the team should employ methods of measuring participants’ digital competency against the assisted digital scale
  • moving forwards, the team should assess the overall digital competency of their users (both within and outside the workplace) to understand digital inclusion and adaptability to new technology. For example, an internal user who is very familiar and comfortable with Microsoft packages may have trouble or concerns with online banking or online shopping, which could indicate aspects of digital working that the user struggles with. A more holistic approach to understanding and evidencing inclusivity will enable the team to consider the most feasible design for their service

2. Do ongoing user research

Decision

The service did not meet point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • since joining the team in 2020, the researchers had conducted 10 rounds of user research with 85 users in total
  • a variety of research methods have been used throughout Alpha, including surveys, in-depth interviews, focus groups/workshops and usability testing. Users from each of the key user groups have been included in research, with over 1,000 SMEs consulted as part of the survey
  • the team had struggled to recruit at times, but had been proactive in reaching out via multiple sources in order to find users for research and had overcome this barrier effectively

What the team needs to explore

Before their next assessment, the team needs to:

  • conduct more ongoing research to support the iteration of the service. The panel would like to acknowledge that conducting research with 85 users in Alpha is no easy feat. However, given that the Alpha phase had been prolonged and that the team had 4 researchers, the volume of research, and of iteration as a result, was not proportionate

  • the team indicated that they had struggled with governance issues during Alpha, which had impacted their ability to conduct research early on. As the team is driving a large part of CCS’s customer first principle, we hope that stakeholders will assist the team in ironing out future blockers early on so that the team can continue to conduct research and design with their users in mind

  • work alongside design colleagues to ensure that potential solutions can be stress tested effectively

  • the team’s use of clickable (but less interactive) prototyping software could cause issues for research moving forwards, as it is difficult to properly assess the usability of a prototype that cannot be used as it would be in practice
  • moreover, the team should endeavour to use their Alpha phase to research and design multiple solutions to the problems faced by their users - please see more on this in Point 5, Old Standard

3. Have a multidisciplinary team

Decision

The service did not meet point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have a strong architectural presence; in particular, a security architect was embedded early to ensure security is baked in
  • there is support for Conclave from Heads of Profession in CCS where a shortfall in resources has been identified

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that there are clear distinctions between the roles being recruited. Very different skills are needed to be a Business Analyst, a User Researcher or a Data Analyst - the panel were very concerned that these roles are being combined
  • consider supplementing their Business Analyst resource with specialist skill, as well as additional Service/User Experience designers given the size of the current ask
  • ensure that future panels can see the squad makeup as well as the functional makeup

4. Use agile methods

Decision

The service did not meet point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • team ways of working seem to be strong: ceremonies are in place and the team could talk about how these have been reviewed and changed through retro and reflection
  • the team are using 4 scrum teams (although there is some concern about the management “scrum”, as it is not delivering), and are using Jira and Confluence to ensure that all team members have line of sight to the work that’s ongoing
  • that the governance arrangements include stakeholders outside CCS, for example the Government Chief Commercial Officer
  • risks are raised and managed regularly

What the team needs to explore

Before their next assessment, the team needs to:

  • explore the governance structure. The team brought us three products to assess - Single Sign On, Central Identification Index and Evidence Locker. The panel recommends splitting these elements up and reviewing them separately in the future so that the team have clear and distinct deliveries
  • to pass this point, the Standard requires the team to demonstrate that they have explored and rejected design options: this was not clear in the story told at assessment. The panel’s impression was that a solution had been developed and that research had supported further development of it, rather than the team taking a clear look at what the problems were and designing to meet them
  • product managers on the team seemed to be focussed solely on the user. Whilst it’s correct that they are the advocate for the user on the team, it felt like a marginalised role, and not one with responsibility for the quality of the product as a whole. It is perhaps worth reflecting on the job description: https://www.gov.uk/guidance/product-manager#introduction-to-the-role-of-product-manager

5. Iterate and improve frequently

Decision

The service did not meet point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • there was some evidence of responding to user needs, for example the addition of help for the nominated admin user, which now appears twice in the sign-up journey

What the team needs to explore

Before their next assessment, the team needs to:

  • as above, the panel felt that the team had developed user needs and research to test a solution, rather than driving the solution out of problem identification and user needs. There was no clear evidence of anything being discarded in the development of the service, and the lifecycle of a user need from identification to production wasn’t clear
  • there wasn’t a clear explanation of what problems had been identified in Discovery and how the new service will address them better than other available solutions (for example, reducing the number of existing systems rather than building a new, separate entity)

6. Evaluate tools and systems

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has been careful to select tools and technologies, such as Ruby and PostgreSQL, that have an established reputation across government and that the organisation itself has prior experience of
  • where the team are considering commercial off-the-shelf products, these are being encapsulated behind the team’s own API definitions, so that a future change of vendor remains a practical possibility
  • credible plans are in place for the implementation of the fully featured development environment that will be needed in the beta development phase
  • the choice of Splunk to support security monitoring demonstrates the team are considering tooling needs beyond the development process

What the team needs to explore

Before their next assessment, the team needs to:

  • the team are already aware that the use of AngularJS in the prototype may lead to accessibility issues, and need to ensure the beta phase development fully follows the GDS design standards, in particular that the service works without JavaScript in the browser

7. Understand security and privacy issues

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team includes a full time security architect and presented evidence which shows that security concerns have been an integral part of the project’s conversations during alpha
  • the project has been able to use the past experiences and resources of the wider organisation to put in place a plan to meet the many security and privacy objectives that have to be delivered in the beta phase

What the team needs to explore

Before their next assessment, the team needs to:

  • consider that, while they already have delivered many successful similar systems, in this project they will be holding more data, on more people, for longer and it will be accessed more times and in many more ways than has been done before. The increased scale and centralisation promise huge business benefits, but with that comes a comparable increase in the potential cost of a security breach
  • investigate what more could be done to enhance security and privacy. How can the project be more innovative and go further than previous projects?
  • one area the team may wish to investigate is client-side field-level encryption, to enhance the protection of data stored in their PostgreSQL databases
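
The recommendation above can be illustrated with a short sketch of the client-side field-level encryption pattern: sensitive values are encrypted in the application before they are written to the database, so PostgreSQL (and its backups) only ever hold ciphertext. The `encrypt_field`/`decrypt_field` helpers and the toy keystream cipher are illustrative assumptions, not part of the team's design; a real implementation would use a vetted AEAD cipher such as AES-GCM from an audited library.

```python
# Illustrative sketch only: the stand-in keystream cipher below exists so the
# example is self-contained; production code would use a vetted AEAD cipher.
import hashlib
import hmac
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from key + nonce (demo only)."""
    out, counter = b"", 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(4, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt_field(key: bytes, plaintext: str) -> bytes:
    """Encrypt one field value; the random nonce is stored alongside it."""
    nonce = os.urandom(16)
    data = plaintext.encode("utf-8")
    cipher = bytes(a ^ b for a, b in zip(data, _keystream(key, nonce, len(data))))
    return nonce + cipher  # this blob is what the database column would hold

def decrypt_field(key: bytes, blob: bytes) -> str:
    nonce, cipher = blob[:16], blob[16:]
    data = bytes(a ^ b for a, b in zip(cipher, _keystream(key, nonce, len(cipher))))
    return data.decode("utf-8")

key = os.urandom(32)                         # held by the application, never the DB
stored = encrypt_field(key, "GB123456789")   # e.g. a hypothetical VAT number
assert stored[16:] != b"GB123456789"         # the column holds only ciphertext
assert decrypt_field(key, stored) == "GB123456789"
```

The key point of the pattern is that the decryption key lives only in the application tier, so a compromise of the database alone does not expose the protected fields.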

8. Make all new source code open

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are aware of the need to open source their code and are planning to do this in the next phase
  • the team explained the benefits that open sourcing will bring in promoting the use of their API by third parties

What the team needs to explore

Before their next assessment, the team needs to:

  • decide upon the open source license to use to enable publishing of their code
  • begin to publish open source code as soon as possible and to work in the open to enable early feedback from third parties

9. Use open standards and common platforms

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the core purpose of the project is to provide the common procurement platform for government, and it will be more open and standards-based than anything before
  • the team are keen to promote the use of open standards and have adopted an API-first approach to ensure their system will be easy to integrate with. They have adopted the OpenAPI standard for documenting their APIs and already have the tooling in place that makes this practicable
  • the implementation of the Open Contracting Data Standard at the start of the project demonstrates a further commitment to the principles of open standards
  • the planned use of GDS technologies such as the notification service and hosting platform will avoid the duplication of cost and effort through the use of these common government platforms

What the team needs to explore

Before their next assessment, the team needs to:

  • actively engage their user community, potential partners and third-party integrators in the development of their APIs and of the standards that will be developed during the project

10. Test the end-to-end service

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the adoption of an API-first approach, and the strategy of having the user interface use the published API, will greatly help ensure that end-to-end testing is embedded in the development of the project

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that end-to-end testing is a critical objective of the beta phase of the project

11. Make a plan for being offline

Decision

The service did not meet point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are taking steps to ensure a high standard of security and together with the benefits of cloud hosting this should enable them to provide a reliable service

What the team needs to explore

Before their next assessment, the team needs to:

  • consider a plan for when the service goes down - how this will impact journeys, what messaging will be used, and how data will be protected
  • set clear SLA objectives for each part of the service and a recovery time objective, so that the plan for being offline can be defined in that context

12. Make sure users succeed first time

Decision

The service did not meet point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team presented a simple registration process to encourage users to onboard, utilising existing organisation registries to ensure consistent organisation details are held across government departments
  • support will be provided for organisations who aren’t registered with the listed registries, such as Companies House
  • the team explored leveraging other online accounts, such as Facebook and Google, to simplify the login process for small enterprises and sole traders
  • insight from user research, surfacing a need to nominate a colleague as the account admin, led to a change to the original design to accommodate this need
  • the team are referencing and adhering to WCAG 2.1 guidelines early in the design process and demonstrated expert knowledge within the User Centred Design team
  • the team shared their plans for external accessibility testing

What the team needs to explore

Before their next assessment, the team needs to:

  • explore how they ensure the right person is given admin rights to the account
  • be able to demonstrate more examples of how user research has informed changes to the design making the journey easier for a user
  • consider presenting the prototype from the perspective of the user. It’s useful to understand the users’ motivations and context before presenting the demo, to help the assessment panel understand what the user is trying to achieve. There is excellent advice on the GOV.UK blog: Service demos: How to tell the story of your service
  • name the service so it makes sense to users. The team explained that ‘Conclave’ is an internal project working title; naming the service will further demonstrate the team’s commitment to making the service findable and easy for users to understand whether it will meet their needs
  • consider using the GOV.UK prototype kit to build a realistic prototype. The limitations of a Figma prototype mean usability test participants are unable to experience full interactions or explore branches of the journey based on their input and decisions
  • explore broader design alternatives. The demo consisted of one solution and the team were unable to demonstrate they’d looked at a range of solutions before focusing on the presented design, which is what the panel would expect to see in an Alpha assessment. The team needs to explore different approaches to solving the problem rather than small changes to a predefined design

13. Make the user experience consistent with GOV.UK

Decision

The service did not meet point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have based the design on GOV.UK design patterns and components, and as such will benefit from the knowledge the interactions are well tested
  • there’s distinction in the visual design between Conclave and GOV.UK

What the team needs to explore

Before their next assessment, the team needs to:

  • consider looking at Conclave as connected services that work together to solve a whole problem for the user. Conclave feels a lot like multiple rather than a single service:
  • Register / Sign in
  • Evidence locker
  • Manage users
  • Manage groups
    During the assessment the panel only saw a small part of Conclave - Registration and Sign-on - which suggests it’s too big as it currently stands
  • provide evidence that the customised patterns and components will preserve the accessibility considerations baked into the GOV.UK components
  • demonstrate all components on the page are fulfilling a genuine user need within the context of Conclave, for example the breadcrumb
  • demonstrate how the service will surface validation errors and error summaries given the colour palette used
  • contribute insights from their use and customisation of GOV.UK patterns to the wider government design community

14. Encourage everyone to use the digital service

Decision

The service did not meet point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • there is a good relationship with the Government Chief Commercial Officer as part of the governance - the team will need to work with him to move people onto their new service

What the team needs to explore

Before their next assessment, the team needs to:

  • explore how they will encourage people to take up the new service and move away from existing platforms. The panel were concerned that uptake would simply be mandated rather than encouraged, which might cause issues

15. Collect performance data

Decision

The service met point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have thought through the KPIs and wider metrics, along with how they will baseline them and targets to be achieved over the first few years
  • the team have clear, articulated benefits that they expect to realise as part of this project, and have a way of tracking them
  • the team will be using Google Analytics to monitor and collect performance data from the service

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that they have an understanding of where non-digital performance data will come from - for example call volumes - and how they will understand any wider context for these so that they can be used in decision making

16. Identify performance indicators

Decision

The service met point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has thought through the KPIs and wider metrics, along with how they will baseline them and targets to be achieved over the first few years. These targets were expressed as increasing year on year: the panel’s recommendation is to consider a saturation point - for example a reduction in contact centre interactions will reach a ceiling, so what is that and when do you aim to achieve it, rather than expecting a year on year change

What the team needs to explore

Before their next assessment, the team needs to:

  • consider and demonstrate whether any of their metrics could drive perverse behaviour - for example, reuse of information in the Evidence Locker could mean buyers ask extra follow-up questions if the information isn’t specific enough - and plan to monitor this
  • consider wider metrics on information security, including how many accounts are set up where reconciliation with the official company account is needed
  • review what “completion” means - the formula given was cost/transactions, whereas completion perhaps needs to be seen at user level: how many people complete their registration first time
  • consider how they will feed performance data into their backlog and future work
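
The distinction the panel draws between a unit-cost formula and a user-level completion measure can be shown with a few lines of arithmetic. All of the figures below are hypothetical, invented purely for illustration; none come from the assessment.

```python
# Hypothetical figures illustrating why cost/transactions is a unit-cost
# measure rather than a completion measure: completion is better expressed
# at user level, e.g. the share of users who finish registration first time.
monthly_cost = 50_000.0        # assumed monthly running cost (GBP)
transactions = 20_000          # assumed completed transactions in the month
started_registrations = 1_000  # assumed users who began registration
first_time_completions = 820   # assumed users who finished without retrying

cost_per_transaction = monthly_cost / transactions           # unit cost
first_time_completion_rate = first_time_completions / started_registrations

print(f"cost per transaction: £{cost_per_transaction:.2f}")
print(f"first-time completion rate: {first_time_completion_rate:.0%}")
```

A service could see cost per transaction fall while the first-time completion rate also falls, which is why the two measures answer different questions and should be tracked separately.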

17. Report performance data on the Performance Platform

Decision

The service met point 17 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are aware of what they need to do to publish their KPIs on GOV.UK

18. Test with the minister

For Alpha: Point 18 of the Standard does not apply at this stage

What the team has done well

The panel was impressed that:

  • the team are keeping the minister informed of their progress

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that plans are in place to test the service with the minister before it goes live

Updates to this page

Published 13 October 2025