IPA Benchmarking Service - Alpha Assessment

The report for the IPA Benchmarking Service alpha assessment on 22 June 2021

Service Standard assessment report

IPA Benchmarking Service (also known as The Data Hub)

From: Central Digital & Data Office (CDDO)
Assessment date: 22/06/2021
Stage: Alpha
Result: Not Met
Service provider: Cabinet Office (Infrastructure and Projects Authority)

Service description

The benchmarking hub will collect data from completed projects and provide access to this data to project leaders across government to support project investment decisions, helping to leverage the UK Government’s project portfolio data to support and shape future investment decisions. The platform will collect data on both previously completed projects and future projects. The IPA would like to have a final product available by 31 March 2022.

Service users

The following user groups and segments have been identified through user research and analysis:

  • Benchmarking Service Custodian (IPA - Head of Central Benchmarking Service)
  • Benchmarking Service Contributors & Consumers (Government Departments & ALBs)
  • Benchmarking Service Consumers (External Parties)

1. Understand users and their needs

Decision

The service did not meet point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team moved at pace, with a considerable amount of research, a rapid cycle of sprints, and findings from the research used to iterate and experiment with different features of the platform
  • the team have also identified a range of organisations in different sectors that would be using the platform and have been active in including them in the work done to date
  • the team has experimented with different design iterations and tested them with users

What the team needs to explore

Before their next assessment, the team needs to:

  • conduct a deeper discovery of the various user groups to elicit their needs more fully, particularly the contrasting needs of different groups. The emphasis to date has been on eliciting the requirements of the platform, with prototypes being brought into the discovery process very early to test various features. But without really understanding the users, it is difficult to be sure that the platform being devised really addresses their needs
  • do one-to-one research with users to understand their needs. In doing this it would be valuable to bring a user researcher into the process. The discovery has relied heavily on workshops, and the dynamic of these tends to be to look for consensus - which may have contributed to the view that all users have common needs. More one-to-one user research will help elicit the contrasting needs of different user groups, and so give a fuller understanding of the complexity that the solution needs to address
  • think of user segmentation not just in terms of job titles, but in terms of how the different user groups need to interact with the system. For instance, decision makers are likely to have quite different needs to hands-on analysts. The user personas created by the ONS illustrate how this can reveal the contrasting needs, motivations and behaviours of different levels of users of data systems. The team needs to understand these different needs to be able to assess whether the platform being developed genuinely meets this breadth of needs
  • look beyond the functionality of the platform, which the work and requirements identified to date have largely focused on. It is unclear that functionality alone will address the business goals of reducing reliance on commercially sourced data and improving the accuracy of cost estimates. The team have identified that there is some reluctance amongst bodies to share their cost data, and that different bodies count different cost drivers into their costs, which adds complexity to the process of comparing costs. It would be good for the user research to elicit user needs and blockers around these issues, so that they can be addressed in the solution being developed. In addition, if the goal is to make cost estimates more accurate, it is expected that at least some of the user groups would need to compare costs against estimates and learn from this, so that over time this feedback makes the estimates increasingly accurate. These needs do not appear to have been surfaced, so it is not clear that the solution being developed will address them
  • conduct research with people with access needs now, rather than waiting for beta. The team plan to incorporate accessibility into their beta phase, but it is an expectation at discovery and alpha to research with people with access needs to ensure their particular needs are being considered in the early stages of design

2. Solve a whole problem for users

Decision

The service did not meet point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has engaged the users within some organisations and user groups
  • engagement has been carried out by the current small team, which has managed to gather some good feedback
  • the proposed solution covers good elements of the problem identified through the limited research done so far
  • the team has considered some alternatives to creating an online service. They started with an Excel spreadsheet and evolved the service in response to research with users who are less digitally skilled

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that the solution provided can offer the same functionality to the different user groups identified from the research
  • ensure that the service can be adapted for other user journeys, for example reporting requirements from different users within the organisation or from external organisations
  • give the different user journeys on the service a clear path where they converge, so they are joined up and the service team can review progress and performance
  • as per point 1, carry out more in-depth research into the users and the wider processes surrounding this service. For example, how will users know the data is reliable? What do they do with it after they’ve downloaded it? What process do they need to go through in their own organisation to submit their data and what blockers are there? Showing this research will help to demonstrate where this service fits into a user’s wider journey and how it will meet both user needs and business goals
  • map the end-to-end journey so the team can show the whole problem they are solving, including how the offline elements fit together and support the aim of the service. This is currently in the backlog, but it’s important to understand the whole landscape of the service to spot pain points and make design decisions. The Service Manual has guidance on mapping a user’s whole problem and getting the scope of your transaction right
  • find a way of getting service design expertise on the team to help challenge risky assumptions and map, understand, and improve the end-to-end user experience
  • learn from government communities and departments. For example, the data science community may have existing insights and research, as departments such as the Department for Education and the Office for National Statistics have similar needs. The team may be able to reuse some work as they develop the service

3. Provide a joined-up experience across all channels

Decision

The service did not meet point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has considered both the digital and non-digital journeys, with an email route being looked at for the latter
  • some users will have more mature technical teams, which has led the team to consider other options to share the data, such as an API or some integration with the user’s external service
  • the team has identified the different ways they expect users would get to the service

What the team needs to explore

Before their next assessment, the team needs to:

  • as per point 2, map the full service journey, including offline elements. This will help show that the team understands and has a plan to match what is being used today (online or offline) and how the new digital service will impact the operational processes
  • carry out more investigation into what the offline aspects of the service involve, what the user needs and pain points are, and how the team can help users complete them
  • demonstrate how the offline journey will work, and show that the team is developing that journey in parallel with the online service so they’re consistent
  • based on user research, create a clear path to the digital service and a plan for its adoption and take-up

4. Make the service simple to use

Decision

The service did not meet point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the prototype is styled to be consistent with GOV.UK

What the team needs to explore

Before their next assessment, the team must:

  • carry out usability testing, ideally in users’ real work environments on their work devices, and reduce reliance on subject-matter expertise to make design decisions. This will make sure the team finds out what works, not just what people think or say they want
  • use the GOV.UK Prototype Kit to test functionality more realistically and make sure users can succeed in their tasks first time. The team is currently using a clickable wireframe, but they will learn more from making prototypes that allow users to do things like creating an account or submitting data. Having people on the team who can prototype in code will help do this (see the sketch after this list)
  • consider what essential features need to be part of a minimum viable product, and what can be added later. For example, the team could prototype and test around account creation, downloading and submitting data first, and add the Power BI dashboards later if necessary
  • get content design expertise on the team to test hypotheses for the language the service uses, ensure consistency between online and offline elements, and identify where in their journey users need certain information. For example, definitions of key terms should normally be presented in context at the point a user needs them, not separated out in a glossary
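
To illustrate the kind of higher-fidelity prototyping described above, the sketch below shows a route that could be added to the GOV.UK Prototype Kit’s app/routes.js to let research participants actually submit data and see the result. This is a minimal sketch only: the route path, field name and page names are hypothetical, not part of the team’s service, and the exact setup depends on the version of the kit in use.

    // app/routes.js - a minimal sketch; route, field and page names are hypothetical
    const express = require('express')
    const router = express.Router()

    // Handle a (hypothetical) project data submission form
    router.post('/submit-project-data', (req, res) => {
      // The Prototype Kit keeps answers from form fields in req.session.data
      const cost = req.session.data['project-cost']

      if (!cost) {
        // Send the participant back to correct a missing answer
        res.redirect('/submit-project-data?error=missing-cost')
      } else {
        res.redirect('/submission-confirmation')
      }
    })

    module.exports = router

Testing a working flow like this lets researchers observe whether users can complete tasks such as submitting data first time, rather than reacting to static wireframes.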

5. Make sure everyone can use the service

Decision

The service did not meet point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • there will be an offline way to use the service for those who cannot use the online version
  • the team are using the GOV.UK design system
  • the team are aware of the WCAG accessibility requirements

What the team needs to explore

Before their next assessment, the team needs to:

  • involve disabled users at every stage of their research and demonstrate how the service will meet their needs. At alpha, this involves understanding if current or potential users have access needs, and how online and offline parts of the service can accommodate them. The Service Manual has more about accessibility and alpha in making your service accessible, and accessibility for government users in designing services for government users
  • see WCAG 2.1 AA as a starting point not an end point when considering the team’s approach to accessibility. Meeting WCAG requirements is not enough to be sure a service is accessible - the team must carry out usability testing with disabled users and make improvements in response
  • secure specialist accessibility advice for custom components that are not in the design system, in particular, graphs

6. Have a multidisciplinary team

Decision

The service did not meet point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has made really good progress considering its size and skillset, taking on certain tasks which may have been outside their remit or area of expertise

What the team needs to explore

Before their next assessment, the team needs to:

  • have a team that can fulfil the different roles needed at alpha, as set out in the Service Manual. On this service, a full-time product manager, user researcher and service designer would seem the most beneficial. The team should also have access to a designer with prototyping capabilities, and to content and accessibility expertise
  • have a clear and defined view of roles and responsibilities
  • bring in the right skills and experience to bridge the digital and non-digital journeys, so the team makes effective decisions about the right things to prioritise and deliver for its users

7. Use agile ways of working

Decision

The service did not meet point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • two-week sprints have been set up for the team to deliver their committed tasks and check progress against
  • show and tells are used to present changes to stakeholders

What the team needs to explore

Before their next assessment, the team needs to:

  • show a clear understanding of agile ways of working, along with the other governance processes involved in delivering digital services
  • have a robust way of managing changing requirements and priorities so they can react in a timely fashion
  • be able to reduce the impact on sprints when priorities change, with a dedicated process to manage this
  • factor in the changes and learning that come from testing with users, and incorporate these into the tasks committed to the next sprint

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the landing page of the service has been tested and updated accordingly
  • the team tests the service whenever a new change is introduced to the journey

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to show changes often and capture feedback to learn from
  • ensure that the new team has the right capabilities and capacity to iterate the service with users, capture feedback and turn it into actionable stories and tasks

9. Create a secure service which protects users’ privacy

Decision

The service did not meet point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team undertook an assessment of the security risks and issues that may affect the service early on, and this has been used to influence the design

What the team needs to explore

Before their next assessment, the team needs to:

  • define the identity and access management for the service, following user research to identify what the user community prefers (for example, is single sign-on a requirement?) and how this access management will work with the chosen technology to protect the data provided by service users (one possible pattern is sketched below)
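
As one possible pattern the team could research, the sketch below shows the first step of a single sign-on flow using OpenID Connect: redirecting an unauthenticated user to a central identity provider. Everything here is an assumption for illustration - the identity provider, client registration and URLs are hypothetical, and the actual choice must follow the user research described above.

    // A minimal single sign-on sketch using the OpenID Connect
    // authorisation code flow; all endpoints and identifiers are hypothetical
    const express = require('express')
    const router = express.Router()

    const IDP = 'https://sso.example.gov.uk'   // hypothetical identity provider
    const CLIENT_ID = 'benchmarking-hub'       // hypothetical client registration

    // Redirect unauthenticated users to the identity provider to sign in
    router.get('/login', (req, res) => {
      const params = new URLSearchParams({
        response_type: 'code',                 // authorisation code flow
        client_id: CLIENT_ID,
        redirect_uri: 'https://benchmarking-hub.example.gov.uk/callback',
        scope: 'openid profile email',
      })
      res.redirect(`${IDP}/authorize?${params}`)
    })

    module.exports = router

The provider then returns a code to the callback URL, which the service exchanges for tokens identifying the user. Whether a pattern like this suits the user community is exactly what the recommended research should establish.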

10. Define what success looks like and publish performance data

Decision

The service did not meet point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has clear strategic goals to help the department be more efficient when interacting with the users
  • the service team will work to baseline the requests from users to streamline the service
  • the number of organisations using the service as its benchmarking baseline will increase over time, which will make the service more useful and increase the funding coming in once the recharge model is in place
  • the team aims to meet user satisfaction targets for service availability and performance, as well as for data quality that is of use to the teams requesting access

What the team needs to explore

Before their next assessment, the team needs to:

  • define service-specific metrics to measure against, such as request timings and whether responses give users the appropriate answer or information
  • put in place the right capability to help define appropriate metrics, and work out the details of the tools or ways of capturing feedback
  • set up the service for success by capturing the right information from the monitoring capability, and ensure the team has a clear view of what to do with the information captured, for example translating it into appropriate items for the product backlog

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have completed a detailed assessment to select the right dashboarding tool based on sensible criteria
  • service prototyping has been completed on an appropriate platform
  • the team have defined appropriate tools and technologies to be used by the supplier

What the team needs to explore

Before their next assessment, the team should:

  • confirm that any assumptions and choices are fully backed by user research, especially where they justify the use of proprietary tools over open-source technologies

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are committed to coding in the open, as per the Technology Code of Practice (TCoP)

What the team needs to explore

Before their next assessment, the team needs to:

  • confirm to any supplier that coding in the open is a requirement of the development and provide a link to the repository when it becomes available

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have narrowed potential languages and frameworks to open source and GDS approved ones
  • the team are using technology that is familiar and in common use in their user communities

What the team needs to explore

Before their next assessment, the team should:

  • confirm any assumptions on user technology through user research
  • explore the use of other common tools and technologies when designing the identity and access management and the user communication systems

14. Operate a reliable service

Decision

The service did not meet point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team for beta is being hired through the Digital Outcomes and Specialists (DOS) framework
  • there will be four FTEs in the IPA for the benchmarking team; these roles have not yet been hired, but the 4 FTEs are planned to help with the delivery
  • two more roles have been approved by the IPA to help with the benchmarking data management side of things, and this recruitment is scheduled for FY 2022/23 (subject to spend review approval)

What the team needs to explore

Before their next assessment, the team needs to:

  • define and understand a clear structure covering the team that delivers, the handover to a live service operations mode, and the support team
  • recognise that the spend review request is neither sufficient nor appropriate for supporting this service going forward; the team needs either to ensure they will have the right roles in place or to have a strategic external partner that helps with the support
  • address the reliance on the 4 new FTEs to help deliver this service, which seems very high risk considering the processes and timelines for the team to hire, train and have the new FTEs ready to join delivery; a new strategy is recommended to ensure the success of this delivery
  • confirm any assumptions made about system availability and resilience through user research

Published 4 February 2022