IG toolkit alpha assessment

The report from the alpha assessment for DH's IG toolkit service on 6 September 2017.

From: Government Digital Service
Assessment date: 06/09/2017
Stage: Alpha
Result: Not met
Service provider: NHS Digital

To meet the Standard the service should:

  • Understand user needs before continuing with further development of the service
  • Spend more time researching users in their own circumstances to understand their motivations and the barriers to completion for the most excluded users
  • Clarify the exact scope of the service and what the priorities and minimum viable product are for the first publicly accessible version
  • Focus on user journeys and outcomes through the service to avoid unnecessary design and interaction

About the service


The service brings together the legal rules and central guidance set out by Department of Health policy and presents them in a single standard, as a set of requirements against which organisations can assess themselves.

Service users

The users of this service are nominated representatives of any organisation that processes NHS data. This presently amounts to circa 25,000 organisations.


User needs

The service will be used by a person within a healthcare organisation who is responsible for reporting that organisation's compliance with NHS security and data standards. This could be someone's full-time job (eg a Security Officer) or part of another role (eg a GP Practice Manager). Any of the circa 25,000 organisations that share information across the NHS must use this service to demonstrate their compliance with the centrally set policies and standards concerning data standards and FOI requests. A new supplier to the NHS must complete this service as part of the contracting process with the NHS. Each year all care organisations must report their compliance, regardless of their capabilities; those reporting range from large Health Care Trusts to small independent Care Homes. Care Homes represent a targeted group of users that are not currently compliant with the framework and reporting. The mandate to replace the current service came from a review of data standards across health organisations.

The service team is building an understanding of users through a variety of engagements with stakeholders and users. These have revolved around presentations to special interest groups, where attendance has been higher than expected, indicating a high level of frustration with the current service from users who gave up non-work time to attend. The outcomes of these sessions have led to the development of 4 key personas that the service team validated with further user engagement. The personas are being used to develop the scope of the service.

There has been limited usability testing so far, much less than expected for a service of this type with the broad range of users identified by the personas. Some thought has been given to the capability of users, but there are significant assumptions about the skills and capabilities of the newly targeted users. It has not been shown that these users understand the service, what is required of them to use it, or whether they are capable of completing the tasks required of them. Language and terminology are a significant barrier, with representatives of different user groups requiring different terminology for the same activities or tasks. Some research has shown that professional users do not want language changed or simplified, although it was unclear to the panel what the motivations for this were, as a significant insight from users was the desire to move to ‘Plain English’.

The service team is working to understand the overlap with the Care Quality Commission (CQC) and GDPR, in particular how users can complete this service to meet those needs as well. This is useful, but experience suggests that trying to combine multiple standards under a single framework is hard and complex for users to understand.

Team

The development team is based in Exeter, with the policy and design teams based in Leeds. The teams communicate regularly but do not meet in person frequently. The development team has the usual coding and testing skills with a lead architect and support from a hosting and infrastructure team.

The development team gets feedback from research and design and makes changes based on it. They have access to video of research sessions to see the actual behaviour of users and have attended some research events. Requirement discussions happen weekly between the design and development teams, normally over video. Feedback from testing is regular, but there appears to be some ad-hoc work happening based on design possibilities rather than actual design work.

The service team is using a Scrum-style approach for development, with a weekly sprint plan.

The service team is not responsible for the delivery of the content of the service, which is a blocker to design and delivery. The policy teams are aware of this, and the service team does feed back insights from user testing to them.

There is a possibility of conflict because the research, analysis and design roles are shared. There is clearly a passion within the team to improve the quality of the service.

Technology

The service team explained in the pre-assessment call the technology stack they had chosen. A diagram showing this was projected at the assessment and briefly discussed. The team had years of experience with this particular technology stack. The choice of stack was strongly influenced by pragmatic reasons and a commendable desire to deliver quickly. This approach should allow the service team to build a sustainable system, as the stack is widely used outside government and should be able to perform well at the anticipated scale of the digital service. However it does include significant proprietary components. New source code has been developed which is not open and reusable. The service team put forward a credible outline plan as to how the service could be moved towards an open source solution in the next phase.

Hosting would be at existing data centres where there was available capacity. The architecture at this point is such that it should be reasonably straightforward to move to a cloud solution if that is eventually desired.

The team clearly explained the security and privacy issues associated with the service, and how existing solutions in use to protect other services would be reused. The ability to test the digital aspects of the service end to end was in place, and due consideration had been given to what to do if the service was offline.


Design team/skills and process

The team doesn’t have a dedicated interaction or content designer – these roles are being shared by other members of the team. This creates a risk that design isn’t given the capacity or accountability needed to meet the standards.

The service has not followed a typical design process to arrive at the current approach – the team talked through how they began by mocking-up screens in PowerPoint, which were later put into production, rather than following a user-centred design approach.

It wasn’t clear how design decisions are made, how user needs are prioritised, or how research properly informs design. The team talked about business requirements and their approach to validating these through testing, which seems like a wasteful approach. Instead, the team should start again by looking at the needs of users (start with user needs, not government needs).

User journeys, information architecture, in-page hierarchy

It was unclear how the team arrived at the current user journeys, information architecture, or navigation. This feels a little chaotic in the current service as demoed. There are many elements fighting for attention, several different types of navigation, and no clear start to the main journey.

Some of the navigation design in the current service is confusing – particularly the left-hand navigation in the ‘assertions’ section. It’s important to not rely on colour or icons to convey meaning – all types of users can find this confusing. For example, the ticks and crosses intended to convey completion can be interpreted as validation. Similarly, the numbers in coloured boxes – it’s not immediately clear what these mean.

The team explained that users will be informed about the service as part of their contractual agreements with NHS. It’s important to understand the details about routes into the service, and to design/optimise these, even if this involves an offline journey. Similarly, the team should consider and design journeys at the end of the service and return journeys to it. The team explained that some user groups are not using the existing system – what are the barriers? What can this team do or recommend to encourage take-up?

The design would benefit from a shift in focus to user journeys, providing clear and unambiguous visuals and emphasising how the user gets started and what they need to do.

The team should ensure there are fully complete user journeys. For example, the service allows users to switch organisation, but it’s not clear to the user how to link organisations in the first place. The team explained this is done through offline support mechanisms – ideally this function would be part of the digital service, but if not, the service should explain how users go about linking organisations.

Content design approach/guidance

The development of the ‘assertions’ is a separate process from the development of content generally for the service. The team is aware that this is a dependency to delivery – it is also a risk that the assertions and supporting content are not usability tested together.

Similarly, the guidance that supports the assertions and evidence gathering is being developed by third parties. Getting insights from third parties seems valuable; however, the team should look at the content for the service holistically, so they can ensure all the content meets user needs. For example, relying on generic guidance to help users understand the assertions creates a broken journey – it is more helpful to:

  • First, aim to make the assertions and supporting content as simple and ‘complete’ as possible for the majority of users – you’re aiming to give the right amount of information that allows most people to progress (without having to delve into guidance), and avoid bloating the content in service by covering every edge case. This is a balance and requires testing and iteration to get right.
  • Consider ways to support specific user pain points with additional content – not necessarily linking out to guidance (linking out breaks the journey).
  • Understand whether generic guidance the user has to interrupt their task to read is helpful in this context, or whether this can be surfaced in a different, less disruptive way. What information is needed in the context of completing a task? Is there simple background material that isn’t needed in-context?

When working with subject matter experts to develop content for the service, it’s recommended to use pair writing techniques. This ensures user needs and UX-writing considerations are included, as well as subject specialism.

Analytics

The service team is using the data they hold about the current service to inform the design and delivery of this revised version. Load testing has shown that the technical infrastructure is capable of delivering this service well to the potential number of users. The service team is using Splunk to create insights and reporting, and is trying to understand the cost per submission. There is a complicated governance structure in place around the service because of the links to other organisations and the desire to make this framework reusable for multiple purposes. It was not clear to the assessment panel who ultimately makes the decisions on quality improvement measures for the service, although it was good to hear that feedback from users is included in this.

There appears to be an underlying need for health organisations to share their self-assessments with each other to be able to assure themselves when data is shared they can trust the receiving party and to compare their work with others. It’s not clear how users can use the service to benchmark their progress.

Recommendations

To pass the reassessment, the service team must:

  • Ensure that an alpha development stage is actually completed by testing with users and rapidly prototyping. This will allow the team to understand the largest challenges in the design and the risks to delivery.
  • Spend time understanding the government design patterns, which are well tested, and apply them to the service; only then add enhancements to navigation and notifications where users demonstrate a need in testing.
  • Have a clearer set of prioritised requirements for the launch of this replacement service (that can be simply communicated) and a separate plan for future lower-priority enhancements. Because this replacement service has been announced, there is a risk that expectations will be raised among users and the business.

Team recommendations:

  • The service team should ensure that design and research are collaborative but separated roles, ensuring that the user researcher has taken part in appropriate training.

Technology recommendations:

  • The service team should re-examine the code which has been developed and if there are any sections which can be opened up at this point, they should be.
  • In the next phase the service team should give consideration to the use of .NET Core or alternative open source solutions, and more consideration to how other departments might reuse the code for their own information governance reporting.

Design recommendations:

  • Allocate dedicated and appropriately skilled content and interaction designers, and ensure they take part in the cross-government communities.
  • Implement a user-centred design approach, e.g.:

  • Design user-journeys and flows (map existing, and design ‘to be’ journeys).
  • Explore suitable existing design patterns that could be adopted/adapted.
  • Sketch/paper prototype, basing designs on user needs, exploring more than one approach – test early with users.
  • Move into HTML prototyping (e.g. Axure or cross-gov toolkit) – run moderated usability testing every sprint.
  • Iterate.

  • Set up a process whereby design is led by research, not business requirements.
  • Follow the examples across government of content being designed by a skilled content designer, who will be responsible for ensuring that users understand the tasks they are required to complete.
  • Focus on user journeys – understand which journey(s) is the most important for these users, map it and optimise it (look at this as a separate exercise from the UI initially). Remove all journeys that don’t fulfil a user need or a legislative requirement. Map and optimise any secondary or tertiary journeys that do meet a user need. (When deciding how to represent these journeys in the service, understand the relative prominence each one needs to have for the user, so they’re not competing and users can clearly see what they need to do.)
  • Understand why some user groups may not be using the existing service, and aim to remove these barriers.
  • Ensure there are no incomplete journeys, or dead ends.
  • Review the current approach to navigation – do you need so many layers and types? Do you need a global navigation at all, or can you move more towards a task list/dashboard approach? (Or incorporate this.)
  • Start research and design activities at the real start of the service – how do users first become aware they need to interact with the service? What can you do to improve this journey? Include the start of the journey in the prototype and usability testing.
  • Consider and design journeys at the end of the service and return journeys to it.
  • Don’t rely on colour or icons to convey meaning.
  • Review the approach to content development in the service – consider the content holistically.
  • If possible, use pair writing when developing content with third parties and subject matter experts.
  • Refer to GOV.UK design patterns and components – these may not all be appropriate for NHS services, however, adopting tried and tested approaches saves time overall.

Analytics recommendations:

  • As part of the work to calculate the cost per submission, consider understanding the costs for each of the identified personas.
  • Use data that is available from your contact centre and current online service to help understand the needs of users when they migrate from the old service.

The service team should also:

  • Consider a more descriptive service name. The concept behind this service is perhaps something like “Show your care organisation handles information properly”; a more descriptive name than “Information Governance Toolkit” could encourage greater understanding of what the service does and why it should be used.
  • It would be beneficial to talk to other digital delivery teams who have been through the alpha assessment stage to share and learn from experiences.
  • Hold more regular face-to-face meetings at this early stage of work. It would be good to see design work co-located with the development team, and consideration given to increasing the length of sprints to ensure adequate feedback and insights are gathered from user testing.

Next Steps

In order for the service to continue to the next phase of development it must meet the Standard. The service must be re-assessed against the criteria not met at this assessment.

Please contact the Service Assessment team at least 4 weeks before the date you’d like to hold a reassessment. Consider carefully when you are ready for assessment and take into account the advice on the Service Manual about the different stages of service development and the outcomes they produce.

Get advice and guidance

The team can get advice and guidance on the next stage of development by:

Digital Service Standard points

Point Description Result
1 Understanding user needs Not met
2 Improving the service based on user research and usability testing Not met
3 Having a sustainable, multidisciplinary team in place Not met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Not met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Not met
13 Ensuring consistency with the design and style of GOV.UK Not met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Not met
17 Reporting performance data on the Performance Platform N/A
18 Testing the service with the minister responsible for it N/A
Published 6 August 2018