Overseas healthcare alpha assessment

The report from the alpha assessment for DH's overseas healthcare service on 12 June 2017.

From: Central Digital and Data Office
Assessment date: 12 June 2017
Stage: Alpha
Result: Met
Service provider: Department of Health

The service met the Standard because:

  • User needs were explored in detail and resulted in a service design that is consistent with existing patterns and is fit for purpose
  • The team identified the need to transform both the back-end business processes and the user-facing service, in keeping with the digital transformation strategy
  • The team have a clear plan for the beta phase and have identified the areas where more work is needed, including ensuring the continuity of the team and learning more about the external users of the service

About the service

Description

The wider service allows EU citizens to access healthcare in any other member state and ensures that the cost of such care is borne by the right member state. This includes the use of the EHIC (European Health Insurance Card) for temporary visitors, pensioners who have retired overseas, posted workers, and planned treatment in a country other than their own.

The scope of this assessment is limited to the pensioner journey, and more specifically the internal agent service. A further review is recommended for the external pensioner service.

Service users

The users of the internal agent service are internal administrators who handle applications and claims for the care of pensioners receiving healthcare overseas, as well as pensioners of EU member states receiving healthcare in the United Kingdom.

Detail

User needs

The panel was pleased to hear that the team have undertaken a number of user research activities with anticipated users of the alpha, as well as citizens. This demonstrates that the team have worked to develop an understanding of their users and their needs from the service.

The methods the team used with internal users included a significant number of contextual interviews and a survey. The team have used insights from this work to create personas that have informed the development of the service. This has included mapping the digital literacy of the user group. In addition, the team have undertaken usability testing of the alpha with this user group, and iterated it based on the results of the testing. One challenge with the prototype, which the team are aware of, is that it does not currently enable users to correct mistakes when progressing through the journey. Addressing this before beta will be important to ensure that users can complete the journey first time.

On the citizen side, the team have conducted guerrilla testing with over twenty users at an event, engaged in face-to-face interviews with users in Spain, and sought out overseas citizens who have moved to the UK. They have also done some early testing of wireframes with citizens. Through this research the team have generated a persona and journey for a citizen user of the service. The team articulated the challenges around engaging with these users, including recruitment and, in some cases, the need for translation. While there have been research activities with this group, the depth of understanding of this group’s needs is not as strong as for the internal user group. The team identified that they need to do more work with this user group in beta. One concern the panel had was that the citizen-facing side of the service will require significant development and iteration.

The panel was pleased that the user researcher has opportunities to share their research with the team, for example through the user research huddle or through created artefacts like personas. The findings from the work are also documented through reports on a shared collaboration platform.

It was positive that the team had identified access needs for the internally facing service and tested them with one user. However, more work needs to be done in this area. On the citizen side, they have also considered alternative channels for users who have low digital literacy or who need a non-digital route to the service.

The team have also considered the use of analytics for beta, which will be an important part of understanding the extent to which users are able to complete their journeys.

The team have clearly undertaken and integrated user research into the development of the internally facing alpha. That said, a strong challenge the panel identified is that significant work needs to be undertaken to prototype, test and develop the citizen-facing side of the service, particularly given the diverse needs and circumstances of its citizen users. It will almost be like developing a different service, with different users, touchpoints, languages and needs. A further, though less significant, concern is the continuity of development of the product, as a number of potentially new team members could be working on it in the future, depending on the results of a tender. In such a situation, there is always a risk that knowledge is not effectively transferred to a new team.

Team

The panel was impressed with the team set-up and that all the roles necessary to run an effective alpha were in place. However, with the exception of the Service Manager and Product Owner, the team is made up of staff from an external agency. It should be noted that a tender process is currently being run, with no visibility of what the outcome will be going into beta.

There is a concern that having a full agency team acting as the service team could negatively affect team continuity. The panel recognises that this is mitigated to an extent by activities such as civil servants shadowing agency team members, but more emphasis should be placed on adding civil service team members during the beta phase.

Another concern is the future governance structure and ‘home’ of the team: it should be a priority in beta to reach consensus around this issue.

In terms of ways of working, the panel was pleased to hear that the team organises well-attended show-and-tells and has the right ceremonies in place to ensure that delivery remains on track, key stakeholders are informed of progress and there are avenues for user research feedback to be taken into account in iterating the service design.

The team also use collaboration tools in rich and diverse ways, which will serve as an excellent source of knowledge when bringing new team members on board.

Another great aspect of the service team is that it is co-located with the agents operating the service - this has already served as a great source of insight and will continue to do so in later stages of service development.

Technology

The team has utilised appropriate technologies for a service at alpha stage to build an internally facing proof of concept system that represents the perceived happy path.

A robust pipeline appears to be in place, with aspirations to utilise a platform-agnostic Infrastructure as Code deployment solution allowing for faster, iterative deployments. However, the team is subject to external forces and cannot be certain of the target infrastructure at this time; we recommend that the team resolve this before private beta.

The service will be required to integrate with many different components from the DWP and DH, as well as a software component (RINA) from the EU Commission's EESSI (Electronic Exchange of Social Security Information) programme. EESSI and RINA are not yet deployed at alpha stage and represent a risk to the service. Additionally, the real-time capabilities, availability and non-functional characteristics of external components may constrain some aspects of the service at private beta, for example if the service had to rely on overnight batch processing.

The team have a good approach to open-sourcing the service, with aspirations to integrate this into the release pipeline without publishing commit history.

The service aims to reduce the ad-hoc processes the business has built around spreadsheets, but not eliminate them altogether. Whilst the final output is described as a formal endpoint, the team are encouraged to explore alternative options for integrating with the payments system, especially those that would allow for real-time data exchange.

The panel is concerned about the personal data this service makes available to its operators; the team is encouraged to reduce the impact on privacy by making personally identifiable information available only where an automated match fails.

The service team demonstrated that there are currently many potential fraud vectors which they struggle to analyse. The service should deliver an increased capability to analyse and report fraud and error activity with both the DWP and DH.

Design

Visually, the team have done some great work making sure the service is consistent with GOV.UK. The team were open to ideas, taking on board advice about existing patterns such as the new GDS typeahead country register. The team showed good use of the prototyping kit, building just enough of a prototype to drive out insights in user research.

The team recognised that they may be able to leverage other existing services like Verify and Notify. At present, the team are unable to use Notify for printed letters due to an existing print contract.

The team presented a clear understanding of their design and research. However, there was not always a direct correlation between what had been designed and built and their defined user needs. For example, the user need for citizens to be able to understand their eligibility seems to have been overlooked from a service design point of view. Instead, the team are delaying this until they start building the digital citizen-facing front end. The team are aware of some existing information that is available to the user, but there is no attempt at present to redesign or influence this in any way to meet the user need.

The team demonstrated a good iterative design process, including a list of links to different prototypes and sprint reports which highlight research insights. These are shared on an online collaboration platform. The team also demonstrated a good ‘success matrix’: a grid with users on one axis and tasks on the other, which they used to track whether users were able to succeed first time when performing each task (most did). This was really useful, but as all of the research to date has been conducted on ‘happy path’ journeys, more work is required to look at the non-happy-path journeys and the effect these have on users.

Some of the content in the service could be referred to as ‘jargon’, but the team had good research insights showing that this made things easier for the agents, as the language is consistent with that used across the other legacy systems and paper forms. Going forward, from a service design point of view the panel would like to see the team trying to influence some of these content changes across the wider service, rather than accepting that it has to be jargon because it’s always been jargon.

There has been no accessibility audit. However, the service is only used by 130 agents, and the team have identified the assistive technologies used by this group. The service has then been researched and built to work with these specific technologies.

The team struggled to answer questions about complex cases. Almost all of their focus has been on ‘happy path’ scenarios. With this in mind, the panel believes their design and research could be missing key insights. There is little or no research around whether agents would be able to understand and progress if anything were to go wrong. This includes situations such as form validation errors or a full service outage.

Following on from this, if the service were to go offline, the team does not have any way of communicating errors to the agents. The team had good background knowledge of the technical aspects of the service and its integrations, so had calculated a low risk of downtime. However, when pressed specifically on design decisions and how these affect an end user who encounters a downed service, the team weren’t sure what the user would see. The team do have a contingency plan and could resort to relocating teams, or even a strict clerical process if need be, but the panel recommends further research and design work around complex or broken user journeys to fully understand the impact these may have on the service.

In the event of a mismatch in the data, the system attempts to guess which citizen the claim may be related to and plays back the data of several people on the front end. The agent can then choose the correct one from the list. While the design has been well thought out to make life easy for the agent, there is a potential data protection risk around unnecessarily exposing the name, date of birth and National Insurance number of several people. This risk is amplified by the fact that the data is owned by the Department of Health, but the staff using the system work for the Department for Work and Pensions.
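
The mitigation suggested in the Technology section (making personally identifiable information available only where an automated match fails) could look something like the minimal sketch below. The record structure, field names and masking rule here are illustrative assumptions, not the team's design.

```python
# Illustrative sketch only, not the team's implementation: attempt an
# automated match first, and only where that fails surface candidate
# records to the agent, with identifiers masked.

from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class CitizenRecord:
    name: str
    date_of_birth: str   # ISO format, e.g. "1950-04-01" (assumed field)
    nino: str            # National Insurance number, e.g. "QQ123456C"

def automated_match(nino: str, dob: str,
                    records: List[CitizenRecord]) -> Optional[CitizenRecord]:
    """Return a record only when exactly one matches on both identifiers."""
    hits = [r for r in records if r.nino == nino and r.date_of_birth == dob]
    return hits[0] if len(hits) == 1 else None

def mask_nino(nino: str) -> str:
    """Mask all but the last two characters: 'QQ123456C' -> '*******6C'."""
    return "*" * (len(nino) - 2) + nino[-2:]

def candidates_for_agent(nino: str, dob: str,
                         records: List[CitizenRecord]) -> List[Tuple[str, str]]:
    """Fallback for agent review when no unique automated match exists.

    Candidates are played back with masked identifiers rather than the
    full name, date of birth and National Insurance number of several
    people, reducing the data protection risk noted above.
    """
    return [(r.name, mask_nino(r.nino))
            for r in records if r.date_of_birth == dob or r.nino == nino]
```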

The design uses single sign-on to determine an agent’s job role, before forwarding them on to the section of the service they need to do their job. However, the team spoke of a vision where agents have a ‘richer job’ covering multiple areas of a case, instead of an assembly-line set-up where each agent just does one part of the journey repetitively. The vision and the design contradict one another, so going forward the team will need to focus on aligning the two.

The team put forward some initial sketches for designs of the citizen-facing system, but they have not built any prototypes or done any extensive research in this area. The team mentioned a ‘user portal’, but with little research done around this, there is a concern they are already solutionising, which could cause the design work to become blinkered.

Analytics

The panel was pleased to hear that the service team already tracks key performance indicators quite closely, especially as demonstrating the value and costs of the service is essential for the continued funding of the team. The team has a reliable method to generate the four basic performance indicators to be reported to the performance platform.

It was also encouraging to hear the product owner talk in detail about additional business metrics, especially in the context of specific service goals. A good example of this is the service team’s desire to reduce claim and registration processing times by several orders of magnitude (a typical application is currently resolved in 20 minutes), or to reduce the contestation rate (the percentage of claims that are rejected and subsequently contested by another member state).

It was mentioned in other parts of the report that more should be done to explore the non-happy paths of the service. Analytics can be a powerful tool in this respect as well, for example for highlighting ‘bottlenecks’ in the user journey.

Recommendations

To pass the next assessment, the service team must:

  • Arrange a GDS review of the citizen-facing pensioner service at the end of alpha
  • Research, develop and test the citizen-facing side of the service
  • Ensure that a sustainable and long-term service team is in place, with clear reporting lines and a ‘home’ within the department

The service team should also:

  • Continue to explore ways to use common government platforms and tools from the Service Toolkit, especially GOV.UK Verify and the components of the Government as a Platform (GaaP) programme

Next Steps

You should follow the recommendations made in this report before arranging your next assessment.

Get advice and guidance

The team can get advice and guidance on the next stage of development by:

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 30 July 2018