Award a Contract for Goods and Services - Alpha Reassessment

The report for Crown Commercial Service's "Award a contract for goods and services" service alpha reassessment on 30 June 2021

Service Standard assessment report

Award a contract for goods and services

From: Central Digital & Data Office (CDDO)
Assessment date: 30/06/2021
Stage: Alpha
Result: Not Met
Service provider: Crown Commercial Service (CCS) - Cabinet Office

Previous assessment reports

  • Alpha assessment report: November 2020, Not Met

Service description

This service aims to do two things:

  • allow buyers to self-serve their procurement activities when purchasing goods and services from CCS frameworks
  • start the journey to consolidate the CCS procurement digital estate

The strategic platform that will be used for this consolidation is a SaaS platform from Jaggaer called eSourcing.

Right now, the UX within eSourcing creates friction for buyers: users cannot self-serve successfully because the setup and running of procurements is complex and brittle. We will therefore be replacing the buyer experience with a refreshed UX whilst using eSourcing as the underlying system of record.

Service users

Buyers

  • experienced procurement agents: experienced buyers from central government who understand procurement well overall
  • Subject Matter Expert Commissioners: work and have training in a specific field (for example IT, health care, marketing)
  • procurement agents in growing organisations: are responsible for all types of procurement within their service, from small to large scale
  • novice buyers: work in a small to medium-sized organisation or authority. They are used to doing low-value procurements but have now been asked to handle a procurement over £50k via a further competition

Suppliers

  • corporate bid managers: are senior managers in a large company with a dedicated Bids team. They are very experienced, but time pressured as they are dealing with multiple, often complex, projects at the same time
  • Subject Matter Expert Bid Managers: are senior directors and co-founders of an SME with a specialism. They evaluate bid opportunities at the same time as doing the work
  • administrative assistants: work for a large organisation that is signed up to a lot of procurement portals using one generic email address

In addition, as the first version of our service will serve the Digital Specialist and Programmes agreement, the service team will ensure they are targeting the following supplier user groups:

  • eSourcing suppliers (SME sector): are focused on working with their team to turn around compelling briefs and submit them before a deadline to win work on agreements their organisation is part of
  • digital specialist suppliers: use the Digital Marketplace and DOS framework to place domain experts into roles across government; they understand the framework

1. Understand users and their needs

Decision:

The service did not meet point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has followed many of the recommendations from their previous assessment and demonstrated a good understanding of the end-to-end user journey and its transition points
  • the team has used a range of appropriate user research methods including diary studies, workshops, surveys and interviews to develop personas
  • the team had included assisted digital users and established a relationship with the Able Network, which they planned to continue in private beta

What the team needs to explore

The team has captured a wealth of insights in re-discovery, but the research for alpha felt light and the panel felt there were some key gaps and risks.

Before their next assessment, the team needs to:

  • consider the value of further research with a range of suppliers and developing supplier user needs. The panel felt understanding suppliers’ needs alongside buyers will be critical for the service and were not clear on how findings from the two workshops with suppliers had influenced the team’s approach
  • do further research on procurement users’ needs in relation to different procurement frameworks and procurement users’ needs in relation to suppliers
  • consider doing further research and testing with users who are less experienced in procurement (novice buyers and people whose main role is not procurement, such as teachers) to ensure the service will meet their needs. The panel felt the focus on experienced procurement agents was risky (three rounds of research in alpha with 17 participants: six existing buyers from the DOS framework and 11 users who were procurement agents new to CCS)

2. Solve a whole problem for users

Decision:

The service did not meet point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had a very good understanding of the end-to-end journey and where the service sits within the wider CCS landscape
  • the team displayed a good understanding of the problem for certain users of the service
  • the team has worked hard to turn user needs into a service proposition
  • the impact of COVID on users’ interactions with the service was researched
  • the team has tried to understand the problems for the wider public sector and charities
  • the team has been working closely with the ‘Get help finding the right agreement’ and ‘Shopping service’ teams

What the team needs to explore

Before their next assessment, the team needs to:

  • consider the fairness impact on SMEs of the move to eSourcing. The service team has done some ‘diligence’ research with suppliers and is conducting a gap analysis to understand the difference in experience suppliers will have when moving to eSourcing. However, this move raises a significant concern over fairness of competition for SMEs. If the application journey is significantly more difficult, this will impact SMEs disproportionately, reducing the number of available suppliers for every opportunity. It is also concerning that 50 suppliers will be migrated to the new DOS, from around 4,000 suppliers in the Digital Marketplace today. We recommend more attention is paid to this: consider including suppliers as part of the main project’s scope, to ensure SMEs have at least the same ability to compete as they enjoy today
  • consider researching the buyer needs of the framework agreement itself, pain points and issues with the current DOS framework, and whether the new DSP will meet buyer needs in terms of the range of suppliers and services available through it
  • consider how other frameworks would work in this platform. The alpha consisted of prototyping and testing the first part of the journey for one framework, and there’s a risk other frameworks will have significantly different requirements

3. Provide a joined-up experience across all channels

Decision:

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team displayed a good level of understanding of the problems users face across multiple channels, and is focusing on solving the journey with a multi-channel approach

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure they continue to work closely with their customer services team to fully understand the future support model

4. Make the service simple to use

Decision:

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has done considerable work to understand the different entry and exit points to the service
  • the team has carried out research into the service name, proposing a change from ‘Contract a thing’ to ‘Create and award a contract’, which will be further tested in beta
  • the team is working closely with the CCS Design System to ensure patterns and styles are aligned once the design system is live
  • from previous assessment: the team has built a coded prototype to test with users
  • the team demonstrated having done a number of iterations on their design based on user research findings, simplifying content and navigation for users
  • the team is exploring features to adapt to procurement agents’ ways of working, such as adding team members to a procurement

What the team needs to explore

Before their next assessment, the team needs to:

  • simplify the initial pages even further. There is a lot of content to get through. The accordion pattern helps with navigating to the content, but consider whether there are ways this could be made even simpler for new users
  • note the recommendations at point 2 around researching the frameworks: the digital service will only be as simple as the frameworks themselves
  • engage closely with the legal teams, category managers and sourcing teams to ensure they can feed into the frameworks as they’re being created and iterated

5. Make sure everyone can use the service

Decision:

The service did not meet point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the service has strong assisted digital and user support routes in place via multiple channels
  • the team has conducted some user research with users with accessibility needs, and has been engaging with the Cross Gov Able network to find participants to test their prototype
  • the team has a good plan for accessibility in beta (both internal and user testing)
  • the team has conducted testing with users new to procurement in alpha

What the team needs to explore

Before their next assessment, the team needs to:

  • test with novice buyers and provide evidence of how this group is succeeding on their journeys first time and unaided. The prototype was tested with a variety of users, including 11 procurement agents who were new to CCS. However, we understand these users did not include ‘novice buyers’ (people whose main job is not in procurement - for example head teachers - but who have to engage in a procurement with CCS, likely for the first time). This is the user group most at risk of being excluded and unable to use the service.

6. Have a multidisciplinary team

Decision:

The service did not meet point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team was multidisciplinary, blending civil servants and contractors, with a focus on training and coaching the civil service team
  • a number of different roles within the team attended user research sessions
  • there were good resource plans for the team moving into beta
  • whilst key stakeholders are not part of the team, the team had found good ways to work closely with them, bringing policy and delivery in step

What the team needs to explore

Before their next assessment, the team needs to:

  • as recommended in the previous alpha assessment report, have clear lines of accountability to an empowered service owner. This person should be accountable for the quality of their service, managing end-to-end services that include multiple products and channels. They should have overall responsibility for developing, operating and continually improving the service, and should represent it at service assessments

See the guidance on the service owner and on the service owner role in the team.

7. Use agile ways of working

Decision:

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is working in an agile way and has found a good approach that works for them, enabling them to deliver
  • they are working in the open, doing regular show and tells and briefing sessions
  • the team is using a good range of agile tools enabling them to effectively work remotely

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that internal policy and governance do not impact the team’s ability to deliver effectively, for example around shared access to collaborative calendars and emails between contractors and civil servants

8. Iterate and improve frequently

Decision:

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team iterated the prototype over time based on user research and feedback

What the team needs to explore

Before their next assessment, the team needs to:

  • work with the CCS TechOps team to establish the most efficient way to get development work released

9. Create a secure service which protects users’ privacy

Decision:

The service met point 9 of the Standard, subject to GDS CTA review and approval.

What the team has done well

The panel was impressed that:

  • users will authenticate using the OpenID Connect-compliant Conclave service (see the sketch after this list)
  • security is baked into the development process with automated vulnerability scanning and security checking tools
  • penetration testing is planned for before any public release of the service
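For readers less familiar with OpenID Connect, the authorisation code flow behind this kind of integration looks roughly like the sketch below. The endpoint URLs, client identifiers, scopes and redirect URI are placeholder assumptions for illustration only, not Conclave's actual configuration, which is defined by its client registration.

```python
# Minimal sketch of the OpenID Connect authorisation code flow.
# All URLs and credentials below are assumed placeholders, not Conclave values.
import secrets
from urllib.parse import urlencode

import requests

AUTHORIZE_URL = "https://conclave.example/oauth2/authorize"  # assumed
TOKEN_URL = "https://conclave.example/oauth2/token"          # assumed
CLIENT_ID = "cat-service"                                    # assumed
CLIENT_SECRET = "change-me"                                  # assumed
REDIRECT_URI = "https://cat.example/auth/callback"           # assumed


def build_login_url(state: str) -> str:
    """Step 1: send the buyer's browser to the identity provider to sign in."""
    params = {
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": "openid email",
        "state": state,  # anti-CSRF value, checked again on the callback
    }
    return f"{AUTHORIZE_URL}?{urlencode(params)}"


def exchange_code_for_tokens(code: str) -> dict:
    """Step 2: swap the one-time authorisation code for ID and access tokens."""
    response = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "authorization_code",
            "code": code,
            "redirect_uri": REDIRECT_URI,
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
        },
        timeout=10,
    )
    response.raise_for_status()
    # In a real service the id_token signature, issuer, audience and expiry
    # must also be validated before trusting the sign-in.
    return response.json()  # contains id_token, access_token, expires_in


if __name__ == "__main__":
    print(build_login_url(secrets.token_urlsafe(16)))
```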

What the team needs to explore

Before their next assessment, the team needs to:

  • the proposed robotic process automation (RPA) component of the system presents a significant security issue. Because of the way Jaggaer implements user permissions, it is not possible for the credentials shared with the third-party RPA provider to adhere to the principle of least privilege, so even though the RPA is only intended to be used for a small number of functions, the credentials will give full control of the buyer user accounts. Furthermore, the RPA integration will make it necessary for all user passwords to be stored unhashed (in a way that allows them to be accessed in plain text). This goes against best practice for securing user accounts (illustrated in the sketch after this list), and any breach of this password data would be potentially catastrophic, giving full access to commercially sensitive procurement information and personal data of the buyers
  • have an independent GDS CTA review to confirm that the security risks posed by the RPA solution are reasonable to accept
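For reference, the best practice the panel is pointing at is to store only a salted, slow hash of each password, so that a breach of the stored data does not expose the credential itself. The sketch below is a generic illustration using the Python standard library; it describes the general pattern, not how Jaggaer, Edgeverve or CCS handle credentials.

```python
# Generic illustration of salted password hashing with the standard library.
# This shows the pattern the report calls best practice; it is not a
# description of any vendor's implementation.
import hashlib
import hmac
import secrets


def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, derived_key); only these are stored, never the password."""
    salt = secrets.token_bytes(16)
    key = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, key


def verify_password(password: str, salt: bytes, stored_key: bytes) -> bool:
    """Re-derive the key from the supplied password and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, stored_key)
```

The RPA approach described above cannot use this pattern, because the automation needs the original password to sign in to eSourcing as the buyer; that is the root of the risk the panel is flagging.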

10. Define what success looks like and publish performance data

Decision:

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had a good range of well thought through KPIs, linked to their business case and driven by user needs
  • the team had a skilled resource to progress good work on KPIs

What the team needs to explore

Before their next assessment, the team needs to:

  • consider how they will use the KPIs in beta to iterate and improve the service

11. Choose the right tools and technology

Decision:

The service did not meet point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has chosen to build using open-source technologies that are already in use in the organisation (PHP and the Symfony framework, alongside WordPress)
  • the service will be hosted in GOV.UK PaaS
  • integration with the Jaggaer API has been tested as part of the alpha
  • APIs for interactions with other SCALE components have been at least partially defined

What the team needs to explore

Before their next assessment, the team needs to:

  • many details of the data supplied by the Agreements Service are yet to be defined, and how this data will be used in the frontend to generate a specific user journey for each framework is not clear. The technical feasibility of this should be tested and proven, and more detail teased out. Modelling dependencies between questions (and the answers provided) is not straightforward (see the sketch after this list), but the technical solution should not constrain the design of framework agreements. The platform for delivery should enable the right commercial agreements to be made, not dictate what is and isn’t possible
  • the proposed RPA solution is brittle to changes in the eSourcing UI that are out of the control of the service team. While RPA is only planned to be used for a small number of interactions, these are critical: messaging between buyers and suppliers, assigning scores to applications and ultimately awarding a contract will all be managed using RPA, and any of these functions going awry could lead to loss of reputation and potential legal issues
  • another open question around the RPA solution is how the web page (HTML) responses from the Jaggaer web frontend will be translated by the RPA service into suitable API (JSON) responses to be passed back to the CAT user interface. Scraping web pages for data (for example, to extract messages forming a conversation with a supplier) is another potentially unreliable process that could jeopardise a procurement if it went wrong
  • the team should plan to mitigate these risks with frequent integration testing against a non-production Jaggaer environment to check the desired behaviour is maintained. A clear responsibility model needs to be defined between Edgeverve, Jaggaer and CCS around keeping the RPA integration functioning as expected as the eSourcing UI changes over time - find out whether this is something that the SaaS provider is prepared to guarantee
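To make the dependency-modelling concern concrete, the sketch below shows one simple way a framework-specific question journey could be driven by data, with the answer to one question deciding which question comes next. The structure, field names and questions are illustrative assumptions, not the agreed Agreements Service interface; real framework rules may depend on combinations of earlier answers, which is where a simple per-answer mapping like this stops being enough and why the feasibility needs to be proven.

```python
# Illustrative sketch of a data-driven question journey for one framework.
# Structure, questions and routing are invented for illustration only.
from dataclasses import dataclass, field


@dataclass
class Question:
    id: str
    text: str
    # Maps a given answer to the id of the next question; None ends the journey.
    next_by_answer: dict[str, str | None] = field(default_factory=dict)
    default_next: str | None = None

    def next_id(self, answer: str) -> str | None:
        return self.next_by_answer.get(answer, self.default_next)


# A tiny, invented fragment of one framework's question set.
FRAMEWORK_QUESTIONS = {
    "budget": Question(
        id="budget",
        text="Is the expected contract value over £50,000?",
        next_by_answer={"yes": "route", "no": None},
    ),
    "route": Question(
        id="route",
        text="Will you run a further competition?",
    ),
}


def walk(answers: dict[str, str], start: str = "budget") -> list[str]:
    """Return the ordered question ids a buyer would see for the given answers."""
    seen: list[str] = []
    current = start
    while current is not None:
        seen.append(current)
        current = FRAMEWORK_QUESTIONS[current].next_id(answers.get(current, ""))
    return seen


print(walk({"budget": "yes", "route": "no"}))  # ['budget', 'route']
print(walk({"budget": "no"}))                  # ['budget']
```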

12. Make new source code open

Decision:

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure any code written is open

13. Use and contribute to open standards, common components and patterns

Decision:

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using the OpenAPI specification to define interfaces for interaction with external services (Conclave and the Agreements Service)
  • the Open Contracting Data Standard will be used for data structures within the service where appropriate (see the sketch after this list)
  • the team plans to leverage Government as a Platform services, with hosting on GOV.UK PaaS and frontend components based on the GOV.UK Design System
  • a move to using GOV.UK Notify for notifications is planned for the future
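As an illustration of what adopting the Open Contracting Data Standard looks like in practice, below is a minimal, hand-written release fragment expressed as a Python dictionary. The values are invented and the field selection is heavily simplified; the service should validate its data against the published OCDS release schema rather than against this sketch.

```python
# Minimal, invented OCDS-style release describing a single tender.
# Field names follow the OCDS release schema; all values are illustrative only.
release = {
    "ocid": "ocds-example-000001",       # invented contracting process identifier
    "id": "ocds-example-000001-tender",  # invented release identifier
    "date": "2021-06-30T00:00:00Z",
    "tag": ["tender"],                   # this release describes the tender stage
    "initiationType": "tender",
    "buyer": {"id": "GB-GOV-EXAMPLE", "name": "Example Department"},
    "tender": {
        "id": "tender-1",
        "title": "Digital specialists: user research support",
        "status": "active",
        "procurementMethod": "selective",
        "value": {"amount": 75000, "currency": "GBP"},
    },
}
```

Sticking to the standard's field names should make procurement data from this service easier to combine with data from other CCS services and to publish consistently.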

What the team needs to explore

Before their next assessment, the team needs to:

  • consider feeding any new or improved design patterns created back into the GOV.UK design system

14. Operate a reliable service

Decision:

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the architectural approach is sensible and the team is hosting on GOV.UK PaaS

What the team needs to explore

Before their next assessment, the team needs to:

  • the service is dependent on multiple other CCS products and services. The team needs to be clear on how they will recover when the products and services they depend on fail
  • the proposed model is that the separate CCS TechOps team will be responsible for monitoring the production service and for deploying code to the production environment. If it’s not possible for the service team to own this responsibility, they should at least share it, have a real-time view of monitoring and alerts on their service, and be empowered to respond quickly to issues that arise; the process to get new code shipped to production should be in the order of hours rather than days

Published 10 January 2022