Integrated Data Service beta assessment
Service Standard assessment report for ONS's Integrated Data Service
From: Central Digital & Data Office (CDDO)
Assessment date: 08/03/2023
Stage: Beta assessment
Result: Not met
Service provider: Office for National Statistics
Previous assessment reports
Service description
The Integrated Data Service is being created in the cloud, in line with the One Government Cloud Strategy, to support statistical analysis by giving access to a wide range of data. The platform takes us towards the future of data sharing across departmental boundaries and increased research collaboration across government and beyond. The ultimate ambition is to bring together all the talents within virtual teams drawn from across government (and beyond), working together to conduct research that addresses complex policy questions.
As a key component of the National Data Strategy, Integrated Data Platform’s vision is to create a safe, secure, and trusted infrastructure for government data, enabling analysis to support economic growth, better public services, and improving the lives of citizens.
The technology platform will use managed and cloud-native services (rather than bespoke development) as far as possible, initially deployed with a single cloud supplier (with data accessed via the virtualisation layer). Over time it will become truly multi-cloud, and will seek to adopt equivalent services from other providers where they are available.
A major element of the platform is the use of data virtualisation technology to enable access to different data sources whether hosted on-premises or stored with any of the cloud providers as agreed with the data supplier.
In summary, the value proposition for the service is the access it provides to a wide range of data, the tools to support innovative data analysis and data science and access to a cross-government community of data professionals who collaborate to help solve complex policy questions.
Service users
This service is for:
Current focus:
- Data Analysts
- Data Scientists
Others to be added:
- Data Providers
- Project Coordinators
- Policy Makers
- Chief Digital Information Officers
- Administrators
- Technical Operations
1. Understand users and their needs
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team have embedded user-centred design (UCD) skills, including user research, across the end-to-end service
- the team have adopted a UCD approach to user research and usability testing to improve user experience
- the team used moderated usability testing to identify improvements
- the team has a comprehensive research plan for the beta stage, including a research trial planned with early adopters
- personas continue to be expanded upon and mapped across the user journey, for example, the external data provider persona has been mapped across the user journey
- the team have successfully switched to the jobs-to-be-done (JTBD) framework to transform personas into operational, role-based activities
- the team has a number of ‘deep dives’ lined up with chief data officers and chief technology & information officers across government as part of their strategic engagement strategy
What the team needs to explore
Before the next assessment, the team needs to:
- continue mapping users’ jobs-to-be-done across the service blueprint
- use the ‘deep dives’ with senior leaders and lead data analysts to collect what data they might seek to acquire, to feed into a strategic data roadmap
- focus on building an audit trail between user research and design improvements
- continue to explore more opportunities to widen the number of users, for example, by engaging with existing data user groups, forums and communities
- identify, understand and document the needs of ‘decision-makers’ who look to data and data analysts to help them set and measure the impact of policy, like ministers, policymakers and senior civil servants, and others
2. Solve a whole problem for users
Decision
The service did not meet point 2 of the Standard.
What the team has done well
The panel was impressed that the team:
- have an ambition to empower evidence-based policymaking
- are working to enable the service to manage data with different levels of sensitivity for different audiences, including data analysts, government policymakers, academics and commercial users
- have invested much time and effort in engaging and communicating with data providers to continually improve the data acquisition process
- will continue to look for ways to help users visualise the data to lead to better understanding and decision-making
- have created a data linking solution and other functionality that they have shared with the NHS and other departments, that could be scaled and used more widely by data suppliers
- have been working to change the approach to data acquisition from having an agreement in place for each data set, to instead having an agreement with a person that covers multiple data sets
What the team needs to explore
Before the next assessment the team:
- must understand and document how the design and service will meet the needs of ‘decision-makers’ who look to data and data analysts to help them set and measure the impact of policy and make decisions, like ministers, policymakers and senior civil servants, and others
- should look to understand, and ideally create a visualisation of, the high-level ‘as is’ and ‘to be’ data ecosystem, including which organisations (government and others) hold relevant data sets and which users or organisations need to access certain types of data sets through the IDS
- must provide a data roadmap of what data sets have been acquired already and who needs them and why
- in the data roadmap the team must include details of what data sets they are looking to acquire and which users have needs for these data sets, for example, decision-makers, so they can best meet the challenges they face in their key policy areas of the economy, health and energy
- must understand and document which identifiers or references they anticipate (based on users’ needs) will be their core data-matching keys (golden keys) to be used to match entities across data sets. For example, UPRN or NHS number. Their core list of matching keys may change over time
- must take the needs and key scenarios identified through user research and run experiments and/or proof-of-concept tests to confirm to what extent the service will give decision-makers the insights they need. For example, ‘As a minister I am looking to data and data analysts to help me make a decision on whether to increase or decrease certain UK tax rates or reliefs, so I can help support UK businesses but also collect tax revenue to provide essential services. I also need data to tell me how the changes I made or policy we set impacted businesses and tax revenues’
- must continue to understand the barriers that make organisations reluctant to release their data, and continue to work to remove these barriers to help ensure that they will contribute to the platform
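To illustrate the ‘golden key’ concept described above, the sketch below shows how a shared identifier such as a UPRN can link records about the same entity across two data sets. This is a minimal, hypothetical example (the data set names, columns and values are invented for illustration, and the IDS implementation will differ), using pandas only to make the idea concrete:

```python
import pandas as pd

# Two hypothetical data sets that both carry a UPRN column,
# used here as the "golden key" for matching entities.
energy = pd.DataFrame({
    "uprn": [100001, 100002, 100003],
    "epc_rating": ["C", "D", "B"],
})
business = pd.DataFrame({
    "uprn": [100002, 100003, 100004],
    "employees": [12, 40, 7],
})

# An inner join on the shared key links the records that
# describe the same property across both sources.
linked = energy.merge(business, on="uprn", how="inner")
print(linked)
```

In practice the choice of matching keys (UPRN, NHS number, or others) drives which data sets can be joined reliably, which is why the report asks the team to document their anticipated core keys based on users’ needs.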
3. Provide a joined-up experience across all channels
Decision
The service did not meet point 3 of the Standard.
What the team has done well
The panel was impressed that:
- it’s clear to users how to get support while using the service
- the team have gone from having low levels of support to having a trained and dedicated support team in place that is able to resolve queries much more quickly
- the team are exploring how best to improve the onboarding process, for example, by making the guidance easier to understand for all user types and finding ways users can self-serve
What the team needs to explore
Before the next assessment, the team needs to:
- make it clear to users how to get support when not logged in to the service, for example, in the footer of the Integrated Data Service website
- continue to iterate the start points for the service
- ensure GOV.UK Design System patterns are fully exploited to ensure the user experience is consistent across the user journey
- conduct end-to-end usability testing to highlight pain points
- create a service blueprint for the overarching user journey across all services
- prototype and test the end-to-end user experience and touch points and use the findings to iterate it and reduce pain points for all user types
4. Make the service simple to use
Decision
The service did not meet point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the research accreditation journey was made shorter and smoother
- improvements to dashboard navigation have been identified through moderated usability testing
- user support guidance has been re-designed to remove internal language, making the content clearer to users
What the team needs to explore
Before the next assessment, the team needs to:
- ensure GOV.UK patterns are fully exploited to ensure the user experience is consistent across the user journey
- test on mobile devices, particularly tablets
- document and prototype the end-to-end service across the different channels. Continue to test the end-to-end service with all users, including people whose primary roles are not data experts (product, policy and admin type roles) to ensure their user journeys can be completed with ease. These may be either data providers or data accessors
5. Make sure everyone can use the service
Decision
The service did not meet point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team has booked an accessibility audit with the Digital Accessibility Centre (DAC)
- the team are aware of the need to build the service in line with accessibility needs
What the team needs to explore
Before the next assessment, the team needs to:
- ensure that the end-to-end service is accessible and meets government standards
- continue exploring ideas for finding users with accessibility needs, for example, through cross-government accessibility communities
- continue to explore how to meet users’ accessibility needs, for example, in partnership with the ONS Disability Network and Neuro-divergent group
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- specialists have been hired for key roles, including service design, performance analysis and user support
- the team has found solutions to problems with recruitment and staff shortages
- teams across the end-to-end service are multidisciplinary
- knowledge is being shared as new team members are onboarded and offboarded
What the team needs to explore
Before the next assessment, the team needs to:
- define a governance model that empowers individual teams to contribute to the end-to-end service’s goals
- show how it uses service design and product management to ensure a consistent user experience across all stages of the end-to-end service
- create a plan for growing the user support team as demand from users increases
7. Use agile ways of working
Decision
The service did not meet point 7 of the Standard.
What the team has done well
The panel was impressed that:
- a roadmap has been established to communicate priorities across teams and to leadership
- stakeholders have been educated on agile ways of working, including the difference between a roadmap and a delivery plan
- a limited but well-sized cohort of users have been part of the public beta so far
- user personas have been iterated based on what the team learned through the agile phases
- the team responded quickly to emerging needs by spinning up a support team
What the team needs to explore
Before the next assessment, the team needs to:
- address the points raised on what the team needs to explore from the alpha assessment, for all parts of the Service Standard
- make the scope of the private beta clear, including which objectives they will not focus on
- define what success looks like for moving to public beta
- make their strategy understandable to data analysts, data providers and other users, for example, by creating an open roadmap
8. Iterate and improve frequently
Decision
The service did not meet point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the team has used user feedback to identify improvements that could be made
- the team has thought about collecting feedback at different stages of the service
- the team has a plan for implementing performance metrics
What the team needs to explore
Before the next assessment, the team needs to:
- communicate which user needs and pain points have not been addressed, and create a plan for addressing those
- start implementing improvements that contribute to the service’s success, for example, make the hub easier to use to increase user satisfaction
- consider inviting more users to the private beta, so that they can use cohort analysis to observe improvements
- explore potential switching costs that could be preventing users from engaging in the private beta
- consider where Lean practices might help the team iterate more frequently
9. Create a secure service which protects users’ privacy
Decision
The service did not meet point 9 of the Standard.
What the team has done well
The panel was impressed that:
- security controls and technical design across the service, from data ingestion through to analytical tooling and products, suggest the right level of authentication, security and privacy assurance
- the team has set up monitoring, allowing them to easily identify and respond to unauthorised access
- the team has set up dashboards allowing them to easily observe and manage technical performance
- the team has demonstrated excellence in applying the Five Safes framework to the end-to-end service
- the service maintains data sovereignty within the UK
What the team needs to explore
Before the next assessment, the team needs to:
- demonstrate the data collection and acquisition process during data ingestion from the varied data providers
- show that the indexing and matching service actively incorporates security and privacy factors within its profiling, indexing and metadata tagging
- demonstrate that their security controls are robust and proportionate for identifying risks and fraudulent activity
- show how they will provide security and privacy assurances to data providers before the data has been acquired
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team has established a clear set of critical success factors
- key performance indicators have been created along with details of data sources and frequency
- some dashboards are in place for tracking and forecasting purposes
What the team needs to explore
Before the next assessment, the team needs to:
- provide greater visibility of how performance data is being used to support user research
- show how the findings are being used to iterate on the service, through service design and user testing
- use digital analytics to measure how users are interacting with web-based touchpoints
- clarify which critical success factors or user needs the KPIs relate to
- define what good looks like or a minimum threshold for each KPI
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the platform’s conceptual solution architecture outlines a good approach and choice of tools and technologies in developing a robust analytical platform
- the team have clearly articulated their transition from one cloud provider to another to overcome integration and interoperability issues
What the team needs to explore
Before the next assessment, the team needs to:
- consider further development of tools and technologies to support data scientists and statisticians across government
12. Make new source code open
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the team is using GitHub to share code amongst ONS developers
- the repository is linked to other open, publicly accessible accounts for open data and open analytical components
What the team needs to explore
Before the next assessment, the team needs to:
- establish a more coherent approach to open data standards and practices by socialising its code base, best practices, and development of analytical and statistical standards
- consider how the code written in workbooks could be shared and re-used by the wider data community
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team has developed a common model for their metadata tagging services that meets CDDO’s metadata standards
- the team has utilised the ONS Design System, which is on a par with the GOV.UK Design System
What the team needs to explore
Before the next assessment, the team needs to:
- contribute research findings or new components and patterns to the ONS Design System, for example, from the Research Accreditation Service or Hub
- run open show & tells to socialise the service and share patterns with other data services
- show how they are using GOV.UK patterns to guide and support the users
- share components, patterns and assurance frameworks with other organisations developing trusted research environments, for example, NHS England’s Secure Data Environment service
- consider making a researcher’s accreditation status available to other data services
- continue publishing outputs in HTML not PDF, further building on the transformation of the ONS website
- continue exploring how Semantic Web technologies and schemas can improve the discoverability of data and analyses on the open web
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the service is cloud-centric and offers high availability to its end users
- the service is using an infrastructure-as-code approach to ensure resilience in the event of any failover or downtime of its services
What the team needs to explore
Before the next assessment, the team needs to:
- evidence and demonstrate the quality of data once it has been ingested into the platform, so that users can trust the data products developed
- create a set of performance indicators and objectives for user support
- develop a plan for operating and improving the end-to-end service throughout public beta, as user numbers grow and the demand on individual teams increases