Submit a pipeline for spend assurance alpha assessment report

Report of the alpha assessment for the Submit a pipeline for spend assurance service on 16 Feb 2022

Service Standard assessment report

Submit a pipeline for spend assurance

From: Central Digital & Data Office (CDDO)
Assessment date: 16/02/2022
Stage: Alpha
Result: Met
Service provider: Cabinet Office (CDDO)

Service description

When projects or programmes meet a certain threshold of spend, government departments are required to submit them to the Cabinet Office for spend approval. This includes ‘pipelines’ of proposed spend, as well as ad hoc submissions of cases. Analysts in the Cabinet Office analyse these requests, offer advice and approval for the spend, and apply conditions to it. These conditions are then monitored to ensure departments are applying spend in the most appropriate manner.

Currently, each spend control ‘area’ has its own process and route through which it manages these submissions. This creates rework for departments and arm’s length bodies (ALBs), as well as siloed working within the Cabinet Office.

Service users

This service is for:

  • department pipeline owners
  • department commercial leads
  • Cabinet Office assurance teams (for example, Digital and Technology Assurance and Commercial Assurance)
  • executive-level Cabinet Office decision-makers (for cases that are escalated to Board/CDO/Ministerial level)
  • Cabinet Office policymakers
  • department project and programme leads

1. Understand users and their needs

Decision:

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team knows a lot about their users and different use cases - from people with simple cases doing one-off spend controls, to those doing them regularly and having to interact with several other people to pull together information for an internal assurance before moving on to the Cabinet Office
  • the team has a well developed understanding of the spend control process and how submissions occur within different departments. This understanding allows them to recognise the boundaries of their service: how it will change behaviours and the extent to which it will have to align with existing ways of working
  • the team has a well developed prototype
  • the team has been working with a content designer to identify and use language that resonates with users
  • the team has a considered beta plan, which involves a gradual roll-out across departments and spend-control areas. Significantly, they recognise which areas of the product require further testing and have planned to continue prototype testing for these features during beta

What the team needs to explore

Before their next assessment, the team needs to:

  • use their chosen technology platform to inform testing with users - once the platform is in use in beta, the team will need to keep testing with users to make sure it meets their needs
  • continue to refine the content used as part of the service to reduce the reliance on domain-specific and bureaucratic language

2. Solve a whole problem for users

Decision:

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team understands the various constraints - both technical and organisational
  • the team has considered options for a first journey (such as properties) before deciding on digital pipelines
  • through user research, the team identified an important journey - conditionals - and brought it into scope
  • the team is thinking about different journeys, such as submitting pipelines and managing them
  • the team has a product roadmap of planned features

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to review their roadmap and decide when to release value - the ultimate value of this product lies in reporting and improving communication, but to do so it needs good data. The private beta will give real information as to whether the content and journeys are good enough to support meaningful reporting
  • continue to develop the separate journeys - those of people submitting a pipeline and those receiving them. Some of the planned benefits are fewer questions about pipelines, faster communication and eventually better data - the team must gain confidence that these end-to-end journeys work in private beta
  • test not just the service but how people will learn about and start using the new system. The team is already embedding links to guidance in the service, but as part of private beta they should be testing and iterating the comms and training strategy
  • test how account creation and management will work with the chosen tooling. The team understands the constraints from the tooling and proposes to let anyone with a gov.uk email register. Given the wide range of users and government department restrictions (for example, verification emails potentially getting flagged as spam), the team should test these journeys as soon as they have a development version to do so - see the sketch after this list
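
To make the account-creation risk above concrete, here is a minimal sketch of a server-side email-domain check for self-registration. This is purely illustrative: the function name and behaviour are assumptions, not the team’s design, and the real check would sit inside whatever identity tooling the chosen platform provides.

```typescript
// Illustrative sketch only: restrict self-registration to government
// email domains. The matching here accepts "gov.uk" itself and any
// subdomain such as "cabinetoffice.gov.uk". Names are hypothetical.
export function isAllowedEmail(email: string): boolean {
  const atIndex = email.lastIndexOf("@");
  if (atIndex < 1) return false; // missing local part or "@"
  const domain = email.slice(atIndex + 1).toLowerCase();
  return domain === "gov.uk" || domain.endsWith(".gov.uk");
}

// isAllowedEmail("analyst@cabinetoffice.gov.uk") -> true
// isAllowedEmail("someone@example.com")          -> false
```

A domain check alone says nothing about deliverability: the spam-filtering risk noted above can only be tested by sending real verification emails into departmental inboxes.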

3. Provide a joined-up experience across all channels

Decision:

The service met (with conditions) point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the existing paper form will continue to be available when the new service is launched

What the team needs to explore

Before their next assessment, the team needs to:

  • consider iterating the offline journey as well as the online one - there may even be opportunities to improve the existing paper forms as a way of testing language before committing to it in the digital service
  • show how the assisted digital model and problem paths (for example, being locked out of a sign-in) will work - the team believes that their proposed tooling choice will manage this, but they must be confident that it works before moving into public beta

4. Make the service simple to use

Decision:

The service did not meet point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has been doing regular usability testing of the digital part of the service
  • the team has identified the entry point for the service: https://www.gov.uk/guidance/commercial-spend-controls-version-5
  • the team has considered naming for the service and has decided that “spend control pipeline” is common language in the civil service
  • the service has had content and journeys iterated based on research, including linking out to existing guidance to help users
  • the team is considering how to use previous data to tailor information, for example by showing one section on the task list based on whether the pipeline has conditionals - see the sketch after this list
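
A minimal sketch of the tailoring idea in the last bullet, assuming a hypothetical Pipeline shape and placeholder section names: the conditions section only appears when the pipeline has conditionals.

```typescript
// Hypothetical types and section names, for illustration only.
interface Pipeline {
  hasConditionals: boolean;
}

const BASE_SECTIONS = ["Pipeline details", "Commercial information", "Declaration"];

export function taskListSections(pipeline: Pipeline): string[] {
  const sections = [...BASE_SECTIONS];
  if (pipeline.hasConditionals) {
    // Show the extra section just before the final declaration step
    sections.splice(sections.length - 1, 0, "Review conditions");
  }
  return sections;
}

// taskListSections({ hasConditionals: false })
//   -> ["Pipeline details", "Commercial information", "Declaration"]
// taskListSections({ hasConditionals: true })
//   -> ["Pipeline details", "Commercial information", "Review conditions", "Declaration"]
```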

What the team needs to explore

Before their next assessment, the team needs to:

  • make sure that all journeys are resolved and tested - for example, that users can easily create an account, change account details, and sign out from any page. The prototype had some inconsistencies that are understandable given the size of the service, but shouldn’t be replicated in the real service
  • continue to do the hard work to make things simple. For example, one section has guidance that ‘save and quit will return you to the sections-to-complete list’, which suggests that the ‘save and quit’ button should be renamed. The team should also continue to remove and simplify language where possible and follow GOV.UK tone, for example by not using ‘please’. The team may benefit from bringing parts of the service to a ‘Get Feedback’ design crit
  • continue to apply a ‘one thing per page’ philosophy. While this doesn’t mean one question per page, it may mean splitting information that doesn’t need to be viewed at the same time, for example, a reporting dashboard and case list
  • continue to consider how the journey will differ between people sending a spend control pipeline and those managing the cases and reporting on them. Some services have different views and entry points for “submit” and “manage” users

5. Make sure everyone can use the service

Decision:

The service did not meet point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • existing support from departments and the CDDO will also continue
  • an initial review of the chosen tool shows that it can use GOV.UK Frontend and so can benefit from its accessible features

What the team needs to explore

Before their next assessment, the team needs to:

  • do research with existing or potential users with access needs. The team has found one user with a visual impairment and put a call out across networks without success. The panel urges the team to put more effort into finding people to test with, even if it means extending the reach to potential users who are given some background information. Third-party accessibility testing will be useful but is not a replacement for testing with real people
  • if the team cannot find users with domain knowledge of the spend control process, they must still prioritise accessibility testing. They can do this by identifying meaningful prototype tests which are not dependent on domain-specific knowledge, and then running these tests with a meaningful sample of users across a range of access needs
  • review their use of components with a particular view to accessibility. For example, the service uses a lot of dropdowns, which the GOV.UK Design System suggests only as a last resort due to accessibility issues. Accordions can also cause accessibility issues. This will be even more important if the team decides to create their own components, for example, for showing reporting - see the sketch after this list
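
As an illustration of the component point above, the sketch below prefers radio buttons for short option lists and falls back to a select only for long ones, using standard govuk-frontend class names. The function, the 5-option threshold and the fallback rule are assumptions for this sketch, not the team’s implementation.

```typescript
// Illustrative only: render GOV.UK-styled radios for short lists and a
// select for long ones. Assumes option labels are trusted, pre-escaped
// strings; a real service would escape them and test both variants.
export function renderChoices(name: string, options: string[]): string {
  if (options.length <= 5) {
    const items = options
      .map(
        (option, i) => `
  <div class="govuk-radios__item">
    <input class="govuk-radios__input" id="${name}-${i}" name="${name}" type="radio" value="${option}">
    <label class="govuk-label govuk-radios__label" for="${name}-${i}">${option}</label>
  </div>`
      )
      .join("");
    return `<div class="govuk-radios">${items}\n</div>`;
  }
  // Long lists: a select may still be justified, but test it with users first
  const opts = options
    .map((option) => `<option value="${option}">${option}</option>`)
    .join("");
  return `<select class="govuk-select" id="${name}" name="${name}">${opts}</select>`;
}
```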

6. Have a multidisciplinary team

Decision:

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has had the expected roles during alpha (including a content designer) and has been able to work with their supplier to bring new roles into the team as necessary
  • the team is working well with their CDDO colleagues and has ensured high levels of involvement in the work and the decisions made
  • plans for beta are proportionate and involve recruiting civil servants to critical roles

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to think about the required roles for beta and how to ensure the team has the right balance of suppliers and civil servants for the future

7. Use agile ways of working

Decision:

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is working well together and has excellent levels of involvement from critical CDDO colleagues, including attendance at governance meetings and show and tells
  • the team worked well as a fully remote team and mitigated the drawbacks of this approach well. They were also able to recognise its benefits, including the increased engagement possible from more senior stakeholders
  • the team is using a range of tools and approaches to help them prioritise work and deliver value incrementally, and has good knowledge of how to scale these for beta

8. Iterate and improve frequently

Decision:

The service met (with conditions) point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has iterated their prototypes and learned from testing to improve them
  • the team was able to describe these iterations and how they were informed by user research

What the team needs to explore

Before their next assessment, the team needs to:

  • continue iterating, particularly when the tooling has been chosen - the team should investigate ways to do throwaway testing with the sandbox version of the site (which is a high fidelity prototype, not a private beta)

9. Create a secure service which protects users’ privacy

Decision:

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team considered the security posture and risk for the service (though this was not recorded)
  • the team considered who the users are and the level of authentication needed
  • the team considered the security and data security of the 3 preferred suppliers and their management of security in SaaS
  • the team identified the lack of internal governance and reached out to GDS/CDDO information assurance and security governance teams for support
  • issues highlighted through an initial assessment were addressed by the documentation submitted, including the Mendix security policy, the Mendix IAM and security assessment, and the ISM review of the original use of Mendix - in particular, a thorough risk assessment by the CO information assurance team with mitigations addressed. Technology and security governance will pass from the Cabinet Office once formal security and technology authority boards are established

What the team needs to explore

Before their next assessment, the team needs to:

  • provide documentation on security risk assessments associated with data, cyber threats, and identity and access management (not just personal data)
  • complete a data protection impact assessment (DPIA) and an audit from an information assurance team
  • obtain a security review from the CISO or an equivalent governance structure
  • following the risk assessments, show how risks will be mitigated and consider how mitigations will be delivered via the contract (as user stories or otherwise)
  • design the identity and access management (IDAM) solution
  • preferably, run a spike for IDAM to check that it works and has been security checked - see the sketch after this list
  • identify whether the supplier needs to provide IDAM
  • carry out a data security assessment of the supplier’s environment
  • set up an internal process for governance, covering:
    • DPIA governance
    • security assessment
    • approval at CTO/SLT level
  • run an early beta test of the IDAM integration
  • carry out security testing, including penetration testing
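
As a hedged illustration of what an IDAM spike could probe, the sketch below shows a minimal email-based sign-in token flow: issue a single-use token, email it to the user as a link, and redeem it once within a time limit. Every name here is hypothetical - the real design depends on the chosen supplier’s platform - and a production flow would need persistent storage, rate limiting and audit logging.

```typescript
// Hypothetical IDAM spike: single-use, time-limited sign-in tokens.
// Tokens are stored hashed so a leaked store cannot be replayed.
import { randomBytes, createHash } from "crypto";

const TOKEN_TTL_MS = 15 * 60 * 1000; // tokens live for 15 minutes
const pending = new Map<string, { email: string; expires: number }>();

const hash = (token: string) =>
  createHash("sha256").update(token).digest("hex");

// Create a token for an address that has already passed the domain check;
// the caller would email it to the user as a sign-in link.
export function issueToken(email: string): string {
  const token = randomBytes(32).toString("hex");
  pending.set(hash(token), { email, expires: Date.now() + TOKEN_TTL_MS });
  return token;
}

// Redeem a token exactly once; returns the email on success, null otherwise.
export function redeemToken(token: string): string | null {
  const key = hash(token);
  const entry = pending.get(key);
  pending.delete(key); // single use, even if expired
  if (!entry || entry.expires < Date.now()) return null;
  return entry.email;
}
```

A spike of this shape would also surface the email deliverability issues flagged under point 2, since the sign-in links travel by email.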

10. Define what success looks like and publish performance data

Decision:

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had considered the four mandatory KPIs and how they might be used to help them understand and improve their service. They had a well developed sense of how these might best be used for this largely internal service and which would be useful to users from other government departments
  • the team has developed a plan for other metrics and has ideas about who the users of these might be. They plan to continue to iterate these through private beta

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to think about KPIs and how to move beyond the four mandatory ones to create effective ways to measure the success of the service

11. Choose the right tools and technology

Decision:

The service met (with conditions) point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team identified the main technology requirements through user research
  • the team identified where integration and interdependencies would be needed
  • the team assessed whether the 3 top suppliers can broadly deliver the technology and functional requirements identified through the interface design
  • the team developed an understanding of the level of personal data involved

What the team needs to explore

Before their next assessment, the team needs to:

  • carry out a build with the actual technology - until this is done, the team cannot be sure that all functions and data flows can be accommodated by the preferred supplier
  • carry out design work on IDAM: how IDAM options can integrate with the preferred supplier, how validation/authentication can be accommodated by the supplier, and whether a third-party IDAM option is required
  • show completed documentation demonstrating that the criteria (functional, non-functional and technical/security user stories) can be accommodated by the preferred supplier
  • obtain details on data security and security testing from the supplier
  • address how the security risk assessment will be mitigated/met by the preferred supplier’s technology
  • given that the records library is Google Docs, consider whether a Google-based platform would be a better option
  • document the data retention requirements, how data in the SaaS product would be made available for future investigations or audits, and how archived data would be held in the CO records library
  • document the records library requirements (which have been considered), confirm expectations with information assurance, and include them in the contract with the supplier
  • consider the total costs of the solution, including IDAM, additional cyber security, integrations and so on
  • design and test data retention and back-up - see the sketch after this list
  • carry out an accessibility check, including integration with accessibility software
  • define the records library, with processes for retention, archiving and end of life addressed
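
To illustrate the retention bullet above, here is a minimal sketch of a retention schedule expressed as code so it can be unit tested before any contract is signed. The record types and periods are placeholders, not Cabinet Office records policy; the real schedule would come from the records library requirements.

```typescript
// Placeholder record types and retention periods, for illustration only.
type RecordType = "pipeline" | "case-decision" | "audit-log";

const RETENTION_YEARS: Record<RecordType, number> = {
  pipeline: 2,        // placeholder period
  "case-decision": 7, // placeholder period
  "audit-log": 7,     // placeholder period
};

// True once a record has passed its retention period and should move to
// the archive/records library process.
export function isDueForArchive(
  type: RecordType,
  createdAt: Date,
  now: Date = new Date()
): boolean {
  const cutoff = new Date(createdAt);
  cutoff.setFullYear(cutoff.getFullYear() + RETENTION_YEARS[type]);
  return now >= cutoff;
}

// isDueForArchive("pipeline", new Date("2020-01-01"), new Date("2022-06-01")) -> true
```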

12. Make new source code open

Decision:

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is investigating how its proposed SaaS tools could allow for open-sourcing new code
  • the process and front end can be published and made available, and are technology agnostic
  • the workflows are well thought out, as is what documentation needs to be managed and stored
  • the risk of vendor lock-in is well understood, with plans for how the processes can be shifted and rebuilt elsewhere if needed (however, this is budget dependent)
  • the intellectual property rights (IPR) of the forms are owned by HMG and the forms can be shared with other departments. Patterns and standards of design are owned by HMG. The front-end code is owned and shareable

What the team needs to explore

Before their next assessment, the team needs to:

  • document the exit strategy for the low-code platform, whose IPR is all proprietary and cannot be made available publicly, and how this will be mitigated (this was explained verbally very well in the assessment)
  • identify the removal process with the supplier and provide documentation on how the service can be moved to a different department
  • note that there is a fundamental disjoint between SaaS and open source that the panel does not think is fully reconciled in the assessment guidance (yes you can do SaaS, yes you must publish code)
  • define which parts of the low-code platform’s functions are proprietary and which are inherently front end. How easy it would be to untangle these at a later stage needs to be addressed and assessed by the team and their departmental governance
  • ensure repositories of information can be easily removed and archived

13. Use and contribute to open standards, common components and patterns

The team is proposing to use a proprietary low-code platform which provides proprietary integrations into cloud functions to allow a simple build/configure solution. These products are neither COTS nor bespoke, but will tie the department into a proprietary solution. Guidance on this is unclear.

The team is committed to open standards and building common components on the platforms. Their information and data architecture will be based on transferable patterns.

A clear exit strategy and a plan for publishing standards and patterns need to be defined in beta, as well as an assessment of the risk of moving off the platform.

A retention policy and archiving plan need to be defined and implemented in beta.

Decision:

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has identified an important need to standardise language and vocabulary as part of improving spend control pipelines and has been working with a technical writer in the CDDO
  • the prototypes use the GOV.UK Design System
  • the team is talking to other teams in the Cabinet Office to understand similar caseworking patterns, and may be able to create potential patterns for case management evidence

What the team needs to explore

Before their next assessment, the team needs to:

  • work on their noted need for standardised and understandable terms - there may also be a need to publish this information as guidance or other communications. This may require content strategy/architecture capability as well as content design
  • review their use and adaptation of patterns - for example, the task list pattern, navigation and the service name. The panel was also concerned that the service has some usable but not recommended components, such as dropdowns and accordions (mentioned in more detail in point 5). The team has also created new components, such as the reporting dashboard, without showing whether they tested simpler and less colourful options. Even if something isn’t in the GOV.UK Design System, there may be some tested options in wider government. The team may benefit from attending the biweekly X-Gov Design Systems Chat (find out more from #govuk-design-system on cross-government Slack)
  • engage with other teams in government designing similar services, for example through the design system biweekly meetups and the #admin-interfaces and #ucd-for-cots-products channels on cross-government Slack, or by reviewing other departments’ design systems (such as MOJ’s) and then talking to their teams
  • it would be useful to establish a common standard around IDAM
  • it would be useful to include a pattern covering data in and data out and decision-making points, and to record it as a pattern for future similar problems
  • SaaS versus low-code platforms - is this a SaaS COTS product or is it proprietary code? This is not asked or answered in the guidance
  • case management system - explain why low code rather than SaaS COTS
  • the options should include a bespoke coded solution, even if just to rule it out

14. Operate a reliable service

Without the preferred supplier in place, the details of how, and by whom, the day-to-day running of the service is managed need to be established. Roles and responsibilities between the service owner, the in-house team and the supplier team need to be defined, including in the areas of security, security monitoring, identity management, archiving, and data retention and management. Non-functional requirements on service reliability also need to be established.

Early in beta, the team should establish parameters for ongoing service support so that internal roles and responsibilities are in place for the management of the service, security monitoring and alerting, data management and identity management. An SLA on how the service will be managed should be established during the transition to public beta.

Decision:

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team considered retention, although this is not yet documented
  • the team considered that the tool needs to act as a repository for decisions made, but has not documented whether any of the candidate systems can do this properly

What the team needs to explore

Before their next assessment, the team needs to:

  • as mentioned above, complete and document a security assessment, covering risks and mitigations
  • confirm the SLA with the supplier and what will be in the contract - what are the reliability requirements (volume, scalability, out of hours, emergencies, downtime and so on)
  • ensure SaaS end of life, archiving and records management are included in the contract and available
  • note that an additional assessment of what is in the contract will be needed before alpha is over
  • define whether this product is a records library and ensure the information management team is aware of that

Published 12 August 2022