INDEX alpha assessment

Service Standard assessment report

INDEX

From: Central Digital & Data Office (CDDO)
Assessment date: 30/03/2022
Stage: Alpha
Result: Met
Service provider: Cabinet Office

Service description

The Information & Data Exchange (INDEX) is a digital service for HMG to access and share data analysis and assessment from Publicly Available Information (PAI) alongside OFFICIAL and OFFICIAL-SENSITIVE government reporting. INDEX uses cloud technology and advanced analytics to enable users working across government to navigate the huge volume of information available and to quickly find, access and share relevant information. INDEX aims to empower users to get value from increased coverage of information, in less time, to inform evidence-based assessments and decision-making.

Service users

INDEX users are from across the HMG intelligence assessment community.

Key user personas are:

  • Analyst: An analyst searches for, explores, and collects information from a wide variety of sources to produce intelligence products focused on a range of topics and themes.
  • Analyst with Geographical Focus: An analyst with a particular geographic focus.
  • Analyst with Quantitative Focus: An analyst with a particular focus on searching for, collecting, and synthesising quantitative information.
  • Visualisation Analyst: Supports all analysts to visualise their data and information in effective and compelling ways.
  • Manager: A manager spends a large portion of their time reviewing and revising intelligence products produced by other analysts.
  • Professionalisation Officer: Upholds best practice and standards across the community. Searches for and draws on internal and external tools and capability to upskill HMG officials in a variety of areas.

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had a good understanding of their core users and their needs
  • the team had tested different design ideas with likely users and looked at journeys for consumers and producers of information
  • the wider team was engaged in user research; several members of the team had direct experience of working closely with the intelligence assessment community and a first-hand understanding of current experiences, pain points and opportunities to enable users to navigate, find, access and share information under pressure

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to refine their user needs based on user research, focussing on the user’s problem rather than solutions such as INDEX or multi-factor authentication
  • consider incorporating contextual research (e.g. contextual interviews and observations in the workplace) into their user research plan for Beta alongside workshops and surveys. This may help the team describe and understand in rich detail offline parts of users’ journeys, systems and devices used, wider goals the service is helping them achieve, and potentially unmet user needs
  • review their approach to recruiting participants for research, and adapt it if they are not reaching certain users or if there are gaps in the key user groups taking part. The team mentioned that they are using existing networks, contacts and email lists to find research participants, because users are busy and because of the sensitive nature of the intelligence assessment community

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has used alpha to understand real constraints. This has included prioritising how more sensitive information can be securely stored online, resulting in two separate systems: one for publicly available information and another for government reports
  • the team has been engaging with the intelligence community, which has helped inform the vision of the service beyond MVP

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to monitor the known tensions of running two separate systems with a single sign-on. The team’s approach of a single session working across both systems is in line with the Service Standard, but the team was aware that some users want them to be separate to avoid uploading something to the wrong system. The team may also wish to make government reports the core feature and publicly available information a value-add

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is thinking beyond repeat use to how the users will first get access to the service, recognising for example that making sure appropriate checks are in place could take up to a week. They’re also thinking about how to remove users from the service
  • the team has some understanding of how different users will have different journeys, and recognises that information providers will often also be information consumers
  • the team understands the tensions between a single sign-in giving access to two separate systems (open access and government products) - some users expect to be able to move between both whereas others expect them to be separate

What the team needs to explore

Before their next assessment, the team needs to:

  • show how they’ve designed and iterated on all touchpoints in the service. This includes automated emails to users but may also extend to service-related guidance that departments put in their own internal systems. Contextual research (as mentioned in point 1) will help test these artefacts
  • show that they’ve designed the service to meet the needs of not only the primary user but also any supporting users. For example, the manager persona may be a key person in the upload process. The team didn’t talk in detail about how managers review documents, so it wasn’t clear what role they might play in deciding text and tagging on INDEX, or whether this would happen through the service

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the proof of concept was generally in line with the principles of the GOV.UK Design System (using a content-first approach), even if it didn’t use GOV.UK Frontend
  • the team knows what devices their users are using and the implications - namely that while current users are generally on desktops or work laptops, this may change as the community invests in a wider range of devices

What the team needs to explore

Before their next assessment, the team needs to:

  • either use GOV.UK Frontend or have a strong justification as to why it is not appropriate. The team also acknowledged that the decision to deliberately look different from GOV.UK was based on early thinking which has since changed. Even if the team chooses to have a visually distinct identity, it’s possible to style GOV.UK Frontend to look visually different from GOV.UK and also work for repeat use (see for example the design systems for the Home Office and the Ministry of Justice)
  • show a robust and complete service. The team showed a video of a proof of concept (discarded for security reasons) and an interactive demo site. The demo site had placeholder text and many usability issues, which must be fixed for beta. There should also be no obvious pain points in common journeys. For example, the team mentioned that one or two analysts may work together, but the demonstrated service only allowed for one author and didn’t give guidance on how to input multiple authors’ names
  • monitor whether there’s value in treating using publicly available information and using government reports as two separate transactions (as mentioned in point 2), with names to match. This is already hinted at in the proof of concept, and would also align with naming best practice in the Service Manual. Depending on user research, there may be a basis for adding visual distinctions, as GOV.UK does to show different development environments

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is thinking about accessibility from the start, including finding users with access needs to test with and doing regular accessibility reviews of designs
  • the team is aware that the main user group are used to a document-based culture, and that they might not be comfortable with an entirely different way of sharing information such as working directly in a web system

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to do research with users with access needs, and show what they’ve learned. The team didn’t cover in the assessment discussion the research they have done with users with access needs, or the design decisions they’d made based on it. It would be good to cover this at the next assessment to make sure no users are excluded from using INDEX. This will be particularly important given that users will be uploading their own PDF documents (mentioned more in the following point)
  • show how the uploaded documents will be accessible. Even though uploads are user generated, they will still need to meet public sector accessibility regulations. For example, published documents should be HTML5, ODT, or PDF/A with an open format. The team may need to use some of their private beta to understand how accessible existing government reports are and what can be done to improve them before they’re uploaded (a sketch of an automated check follows this list)
  • continue with other accessibility tasks for beta, such as getting an accessibility audit and publishing an accessibility statement
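
To make the document checks above concrete, here is a minimal sketch of an automated upload triage, assuming the open source pikepdf library. The `pdfaid:part` XMP key and `/MarkInfo` dictionary are standard PDF identifiers, but the surrounding workflow (an `uploads` folder, the review flag) is hypothetical rather than the INDEX team’s design, and passing these checks does not guarantee full accessibility.

```python
# Illustrative triage of uploaded PDFs for basic accessibility signals.
# Assumes the open source pikepdf library (pip install pikepdf); the
# surrounding workflow is hypothetical, not the INDEX team's design.
from pathlib import Path

import pikepdf


def triage_pdf(path: Path) -> dict:
    """Report two coarse accessibility signals for an uploaded PDF:
    a PDF/A conformance claim (from XMP metadata) and a tagged
    structure tree (from the document's /MarkInfo dictionary)."""
    with pikepdf.open(path) as pdf:
        with pdf.open_metadata() as meta:
            pdfa_part = meta.get("pdfaid:part")  # e.g. "1", "2", "3"
        mark_info = pdf.Root.get("/MarkInfo")
        tagged = bool(mark_info and mark_info.get("/Marked", False))
    return {
        "file": path.name,
        "pdfa_part": pdfa_part,  # None means no PDF/A claim
        "tagged": tagged,        # untagged PDFs are hard for screen readers
        "needs_review": pdfa_part is None or not tagged,
    }


if __name__ == "__main__":
    for report in sorted(Path("uploads").glob("*.pdf")):
        print(triage_pdf(report))
```

A triage like this only flags documents for human review; it cannot replace manual accessibility testing of the uploaded reports.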

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team comprises an appropriate combination of DDaT roles, which are currently filled by a reasonable mix of civil servants and contractors
  • the team has worked extensively with non-DDaT experts, including the data protection officer and external lawyers, to ensure it complies with its legal obligations
  • the team has tried to turn a resourcing challenge into an opportunity. Although some of the team members only work on INDEX 50% of the time, a number of these people are subject matter experts doing dual roles. While this arises because the delivery team are stretched, it contributes to their improved understanding of the problem space
  • the team has secured funding for a significant number of new roles to support the next phase of delivery. Some of these have already been advertised and are being socialised in different cross-government channels

What the team needs to explore

Before their next assessment, the team needs to:

  • consider whether they need ongoing team capability in the knowledge management and information management profession. The team mentioned having some information management support, but the panel was concerned that they may end up with far messier data than they expect and have to restructure it later. Robust user-centred categories or ontologies may also help with training planned AI models (a sketch of one approach follows this list)
  • demonstrate how it is documenting and otherwise transferring knowledge about the technical architecture and engineering decisions it is making into the Cabinet Office. Although the panel was impressed at the way civil servants and suppliers were integrated into a coherent delivery team, it didn’t see evidence of consistent engagement with Cabinet Office technologists. The panel would want to see more of this at the next assessment
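
As an illustration of the information management point above, the sketch below shows how a controlled tagging vocabulary can keep categories consistent from the start, which is what makes data restructurable later. All facet and category names are hypothetical, not an INDEX taxonomy.

```python
# Minimal sketch of a controlled tagging vocabulary. Free-text tags
# drift over time; a managed vocabulary keeps data consistent and
# restructurable. All names here are hypothetical, not INDEX's.
from dataclasses import dataclass, field

CONTROLLED_VOCABULARY = {
    "theme": {"economy", "security", "health"},
    "region": {"europe", "asia", "africa"},
}


@dataclass
class TaggedReport:
    title: str
    tags: dict[str, str] = field(default_factory=dict)

    def add_tag(self, facet: str, value: str) -> None:
        # Reject any tag that is not in the agreed vocabulary.
        allowed = CONTROLLED_VOCABULARY.get(facet)
        if allowed is None:
            raise ValueError(f"Unknown facet: {facet!r}")
        if value not in allowed:
            raise ValueError(f"{value!r} is not in the {facet!r} vocabulary")
        self.tags[facet] = value


report = TaggedReport("Quarterly assessment")
report.add_tag("theme", "economy")      # accepted
# report.add_tag("theme", "economics")  # rejected: keeps tags consistent
```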

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team works in 2-week sprints, using agile ceremonies such as sprint planning, retrospectives and daily stand-ups
  • the team bases its planning on a prioritised backlog, using suitable cloud-based tooling (in this case Jira)
  • the team uses a variety of other collaboration tools to share and manage knowledge between team members (including Confluence, SharePoint and Google Drive)
  • the team has taken a creative approach to buy vs build considerations by adopting its ‘hub and spoke’ model, where the central service is developed and maintained in-house, and additional services are procured from suppliers
  • the team has retained the support of its SRO through a change of personnel. The director attends weekly progress meetings and is highly invested in the service, but has delegated day-to-day decision-making to the team, in accordance with agile governance principles
  • the team gives regular updates to the chair of the JIC, who is supportive of the service
  • the team has advocated for a rigorous agile delivery approach to this service, in spite of pressure within the organisation to push forward and deliver an end product without properly de-risking it
  • the team has championed taking an agile approach to service delivery, and submitting its work to cross-government peer-review through service assessments, within the national security community

What the team needs to explore

Before their next assessment, the team needs to:

  • consider ways it can bring an agile approach to the procurement elements of the delivery, by continuing to test the off-the-shelf software with users to ensure it meets their needs. This should include the option of discontinuing components that don’t deliver value
  • consider whether things like alerts can be deferred to a later phase of delivery, if the primary problem they are trying to solve is storage and active retrieval

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has been engaging users and getting feedback on use rather than just preferences - for example, descoping the idea of finding specialists via the service after finding that users liked the idea but were too nervous about security to add their own information
  • the team has clearly demonstrated that it has rejected (or deprioritised) features based on user feedback, for example with a feature that enabled users to connect with report authors
  • the team is being pragmatic and targeted about its development activities, through the ‘hub and spoke’ model that uses built and bought components

What the team needs to explore

Before their next assessment, the team needs to:

  • focus on doing usability testing and observations (as mentioned in point 1). The panel understands that the team has been constrained by security concerns. However, for beta the team will need strong evidence that the service works based on primary research
  • continue to validate that the off-the-shelf software is meeting the needs of users (and not becoming shelfware)

9. Create a secure service which protects users’ privacy

Decision

The service did not meet point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has considered pertinent security principles, such as role-based access control (RBAC) and identity, and has assessed security threats, fraud vectors and data privacy, factoring in the sensitivity and official classification of the data being collected through INDEX. However, the team has so far only outlined that it has undertaken data security measures through a threat assessment and is working with NCSC and CDIO on this
  • the team’s operational support structure includes a threat assessor role within the overall development team. The team will also provide evidence of mitigations for threats such as DDoS attack vectors; the threat assessor is expected to work with security experts at NCSC and CDIO to develop these mitigations
  • the team will also undertake an IT Health Check of INDEX, in due course, to test the technical mitigations it puts in place

What the team needs to explore

Before their next assessment, the team needs to:

  • demonstrate proactively in their alpha work, and more so in their private beta (through testing, validation and assurance), how they plan to build data governance models and controls that protect the sensitivity of data, keeping the INDEX service secure both while data is at rest and while it is in transit through the system
  • build and clearly document a data classification model, as strongly recommended by the tech assessor. This should set out not only how data and information are secured, but also what steps the team has taken to manage the risks of fraud, leaks of sensitive data and unauthorised access to critical data (a sketch of one possible model follows this list)
  • outline within their architecture the security measures taken to encrypt data, how structurally robust their VPC setup is, and where in their data flows and ETL pipelines security measures apply to each category of data exchange
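
To make the recommended data classification model concrete, here is a minimal sketch pairing classification levels with a role-based access check. The levels echo the Government Security Classifications already referenced in this report; the roles and clearance mapping are hypothetical, not the INDEX design.

```python
# Illustrative data classification model with a role-based access check.
# Classification levels follow the Government Security Classifications
# policy; the roles and clearance rules are hypothetical, not INDEX's.
from enum import IntEnum


class Classification(IntEnum):
    PUBLICLY_AVAILABLE = 0
    OFFICIAL = 1
    OFFICIAL_SENSITIVE = 2


# Maximum classification each role may read (hypothetical mapping).
ROLE_CLEARANCE = {
    "analyst": Classification.OFFICIAL_SENSITIVE,
    "visualisation_analyst": Classification.OFFICIAL,
    "external_reviewer": Classification.PUBLICLY_AVAILABLE,
}


def can_read(role: str, item_level: Classification) -> bool:
    """RBAC check: a role may read an item only if its clearance
    is at or above the item's classification level."""
    clearance = ROLE_CLEARANCE.get(role)
    return clearance is not None and clearance >= item_level


assert can_read("analyst", Classification.OFFICIAL_SENSITIVE)
assert not can_read("external_reviewer", Classification.OFFICIAL)
```

A model like this also gives encryption and audit controls something to key off: each level can carry its own at-rest and in-transit handling rules.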

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is thinking about how to measure and report on the 4 mandatory KPIs in the Service Manual (a sketch of these calculations follows this list)
  • the team is considering appropriate technical tools to measure user interactions
  • the team is looking beyond basic KPIs to find more targeted ways to show they are meeting user needs, such as references in reports and the time taken to start writing a report
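
For reference, the 4 mandatory Service Manual KPIs reduce to simple ratios, sketched below with invented figures. How INDEX would define a “transaction” for an internal search and sharing service is for the team to decide.

```python
# Sketch of the 4 mandatory Service Manual KPIs as simple ratios.
# All input figures below are invented for illustration only.


def cost_per_transaction(total_cost: float, transactions: int) -> float:
    return total_cost / transactions


def user_satisfaction(satisfied_responses: int, all_responses: int) -> float:
    return satisfied_responses / all_responses


def completion_rate(completed: int, started: int) -> float:
    return completed / started


def digital_take_up(digital: int, all_channels: int) -> float:
    return digital / all_channels


# Invented example figures:
print(f"Cost per transaction: £{cost_per_transaction(120_000, 8_000):.2f}")
print(f"User satisfaction: {user_satisfaction(310, 400):.0%}")
print(f"Completion rate: {completion_rate(7_200, 8_000):.0%}")
print(f"Digital take-up: {digital_take_up(8_000, 9_500):.0%}")
```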

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to build on this early thinking after alpha

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the INDEX team have selected open source and open standards as their guiding principles in building their INDEX Hub
  • the INDEX team made a proactive, cost-effective and robust choice of tools and technologies by hosting the core of the service on the Cabinet Office DSI Platform
  • the INDEX team has also taken steps to improve total cost of ownership and return on investment, and to avoid vendor lock-in, by selecting components, languages, frameworks and IP based on open standards and open frameworks

What the team needs to explore

Before their next assessment, the team needs to:

  • demonstrate that they have incorporated open standards and open data by showing that they have a GitHub account, or another open source repository, in which to store their publicly available, open-sourced product codebase
  • explore and build up metadata standards, in light of the sensitivity of the data being handled, to improve the performance of the service and stop it becoming another siloed system (a sketch of one possible record structure follows this list)
  • re-evaluate the storage costs involved as the service scales up, and provide more resiliency for their data, metadata and user base
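
As an illustration of the metadata point above, here is a sketch of a minimal metadata record, loosely modelled on Dublin Core fields. The field set is illustrative only; a real standard would need to be agreed with the intelligence assessment community.

```python
# Sketch of a minimal metadata record for uploaded items, loosely based
# on Dublin Core fields. The field set is illustrative only, not a
# proposed INDEX standard.
from dataclasses import asdict, dataclass
from datetime import date

ALLOWED_CLASSIFICATIONS = {"PUBLICLY_AVAILABLE", "OFFICIAL", "OFFICIAL-SENSITIVE"}


@dataclass(frozen=True)
class MetadataRecord:
    title: str
    creator: str               # originating team or department
    created: date
    subject: tuple[str, ...]   # controlled-vocabulary tags
    classification: str        # e.g. "OFFICIAL"
    identifier: str            # stable ID for citation in reports

    def validate(self) -> None:
        if not self.subject:
            raise ValueError("at least one subject tag is required")
        if self.classification not in ALLOWED_CLASSIFICATIONS:
            raise ValueError(f"unknown classification: {self.classification}")


record = MetadataRecord(
    title="Example assessment",
    creator="Example unit",
    created=date(2022, 3, 30),
    subject=("economy",),
    classification="OFFICIAL",
    identifier="index-0001",
)
record.validate()
print(asdict(record))
```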

12. Make new source code open

Decision

The service did not meet point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team showed at its alpha assessment that the system adheres to open standards and open data, with the core of INDEX hosted on the DSI Platform and INDEX’s IP and design vested in the interest of the Crown

What the team needs to explore

Before their next assessment, the team needs to:

  • showcase their adherence to open standards and open data by demonstrating that they have access to an open codebase repository, such as GitHub or another public source code repository
  • ensure that their technology partners can demonstrate at the next assessment, within their architectural diagrams, how they have clearly separated open standards from closed code to protect their IP and sensitive data

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • INDEX’s technical architecture at alpha adopts open standards and open data. However, given the sensitivity of the data being handled and uploaded, and the requirements of data protection and privacy, the tech assessor was unable to conclude how the proof of concept has categorised its data and systems against open standards and open data, or how INDEX would contribute to open standards by developing open components and patterns or by using an open repository for its codebase

What the team needs to explore

Before their next assessment, the team needs to:

  • investigate options to use GOV.UK technologies, such as GOV.UK Notify and GOV.UK Sign In. The service may not be appropriate for them, but the team will need to be able to explain why
  • use GOV.UK Frontend, or if not that, at least use principles from the GOV.UK Design System (as mentioned in point 4)
  • have a clear stance on how uploads will follow the open standards for government, and if needed, look for ways to provide guidance. The team didn’t talk about this in the assessment and the demo only allowed for PDFs. Aside from accessibility concerns (mentioned in point 5), clearer standards will help teams upload documents in a consistent format. There may be scope for the team to help create a specific standard (a sketch of an upload format check follows this list)
  • incorporate and demonstrate the usage of open standards in accessibility and assisted digital technologies as it progresses into private beta
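
As a sketch of the upload standards point above, the check below accepts only an allowlist of open document formats. The allowlist and messages are illustrative, not an agreed standard for government uploads.

```python
# Sketch of an upload gate that accepts only open document formats,
# in the spirit of the Open Standards for Government guidance. The
# allowlist is illustrative, not an agreed INDEX standard.
from pathlib import Path

OPEN_FORMATS = {
    ".odt": "OpenDocument Text",
    ".html": "HTML",
    ".pdf": "PDF (should be PDF/A for archiving)",
    ".csv": "CSV",
    ".ods": "OpenDocument Spreadsheet",
}


def check_upload(filename: str) -> str:
    """Return the format name if the file is on the allowlist,
    otherwise raise with a message listing accepted formats."""
    suffix = Path(filename).suffix.lower()
    if suffix not in OPEN_FORMATS:
        raise ValueError(
            f"{filename}: {suffix or 'no extension'} is not an accepted "
            f"open format. Accepted: {', '.join(sorted(OPEN_FORMATS))}"
        )
    return OPEN_FORMATS[suffix]


print(check_upload("assessment.odt"))  # accepted
# check_upload("assessment.xyz")       # rejected: not on the allowlist
```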

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the INDEX service is configured across three availability zones, all resident within the UK, so it can provide a reliable service and continuity of service if any one availability zone fails

What the team needs to explore

Before their next assessment, the team needs to:

  • set out, within their technical architecture and their resiliency and security models, how they would maintain continuity of service as the system scales up, as recommended by the tech assessor. This should include their quality assurance testing measures and standards, and the adoption of performance monitoring metrics, events and alerts related to high availability (a sketch of a simple availability check follows this list)
  • incorporate within their SLAs and SLOs with delivery partners, from a service design perspective, how service ownership is shared with those partners, while the Cabinet Office retains accountability for service continuity, availability and the protection of sensitive data and the confidential estate
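
As an illustration of the high-availability monitoring point above, here is a sketch of a basic availability probe. The endpoints (one per availability zone) and the alerting behaviour are hypothetical; a production service would use managed monitoring tooling rather than a script like this.

```python
# Sketch of a basic availability probe of the kind that could feed
# high-availability alerting. Endpoints and threshold are hypothetical.
import time
import urllib.request

# Hypothetical health endpoints, one behind each availability zone.
HEALTH_ENDPOINTS = [
    "https://index.example/az-a/health",
    "https://index.example/az-b/health",
    "https://index.example/az-c/health",
]


def probe(url: str, timeout: float = 3.0) -> bool:
    """Return True if the endpoint responds 200 within the timeout."""
    try:
        start = time.monotonic()
        with urllib.request.urlopen(url, timeout=timeout) as response:
            healthy = response.status == 200
        return healthy and (time.monotonic() - start) < timeout
    except OSError:  # covers urllib.error.URLError and socket errors
        return False


def check_availability() -> None:
    up = sum(probe(url) for url in HEALTH_ENDPOINTS)
    if up < len(HEALTH_ENDPOINTS):
        # In practice this would raise an alert via monitoring tooling.
        print(f"ALERT: only {up}/{len(HEALTH_ENDPOINTS)} zones healthy")
    else:
        print("All availability zones healthy")


check_availability()
```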
Published 25 January 2024