Manage Your Referral

The report from the beta assessment for NHS’s Manage your referral service on 9 November 2017.

From: Government Digital Service
Assessment date: 09 November 2017
Stage: Beta
Result: Met
Service provider: NHS Digital - Electronic Referral Service (e-RS)

The service met the Standard because:

  • The team have continued to make strong progress iterating and improving the service based on its users’ needs
  • They have begun to work in the open, with their first code now publicly available (albeit in a limited fashion)
  • They have taken significant steps to measure and analyse users’ experience of the service through effective performance measures and use of analytics.

About the service


The service enables users referred to hospital by their GP to choose a convenient time and location for their appointment. Users are given information on the likely wait for treatment, distance from a location of their choice, and a range of dates and times to choose from. Although this was a ‘Beta’ service assessment, the service is already in ‘Live’ use by a significant number of users (typically 10,000 per day). It replaces an existing digital service, and will operate in parallel with a professional service and telephone booking. It aims to significantly reduce use of the telephone service, improving the experience for users and reducing operational costs for the NHS.

Service users

The users of this service are members of the public referred by their GPs for a hospital appointment. GPs provide a form of ‘triage’, inviting users with suitable digital skills to complete the service online independently.


User needs

The team demonstrated a good understanding of patient behaviours and motivations when choosing between, and then using, the online and telephone services. Changes made to the online service since Alpha have increased the completion rate from 39% to 72%, delivering a £500k reduction in the costs of running the telephone service.

The team have conducted extensive remote testing of patient behaviour in the online system and have captured a significant quantity of survey data following completion of the online referral journey. Supplemented by survey data from the telephone service, this amounts to over 100k survey responses, with an average satisfaction score in excess of 4 out of 5.

The team has mapped out the unhappy (‘disaster’) journeys, a useful reference that will help them to continually improve both the online and telephone services.

The panel felt that the face-to-face tests conducted with patients during Alpha should be repeated periodically so that more qualitative data can be shared with developers.

The panel also felt that the user need for certainty of a referral booking could be better met. Although this is scheduled as a future development covering multi-channel confirmation, the experience of the online user could be improved by requesting the patient’s email address (as part of the referral completion) and then emailing the details of the referral to them.

The team demonstrated 2 basic personas representing younger and older patients. The panel felt that more user research should be conducted into the use of the online and telephone services by people whose first language is not English.

The team demonstrated a good understanding of the challenges faced by patients with accessibility needs and how they are addressing those needs.

Team

This is a mature and effective multi-disciplinary team, working in close collaboration on a challenging and vital service.

Since Alpha, the team has expanded to 3 multi-disciplinary teams, with a combination of dedicated team members and cross-team working. The teams are co-located, demonstrating good agile practices, with the tools and technology they need to work rapidly and iteratively. They are empowered and funded appropriately to support what is effectively already a ‘live’ service. The team has a viable roadmap of future development, with both near-term detail and longer-term aspirations documented openly in their office. Governance isn’t overbearing: the team is represented adequately at key boards and groups, which provides effective internal and independent external steer while enabling the team to work autonomously, empowered to manage their development priorities.

Technology

The panel were concerned to discover that the service has a dependency on the Angular.js JavaScript framework. This results in a range of issues, from a substantial payload for mobile users completing the service on the move, to potential security vulnerabilities if the framework isn’t updated (and related problems for internal users with legacy browsers if it is updated). The Service Manual emphasises the importance of ensuring that services and content work effectively without JavaScript. Progressive enhancement, adding additional richness or functionality, is encouraged, but all services should function effectively without it.

The use of Angular.js is a remnant of early design choices, subsequently overtaken by user research. The service as it currently stands has limited, if any, need of the framework. The team has agreed to re-assess the need for Angular, and will take steps to remove it as soon as possible if there is not a strong user-centric need to retain it. A future ‘Live’ assessment should review the progress made in resolving this problem. A more conventional route to a ‘Beta’ assessment would likely have identified this issue sooner, and would have given the team a stronger steer to apply technical solutions that are just complex enough for the problem at hand.

The team have open-sourced a component of their platform. This is good progress from Alpha, but it is clear that there are remaining barriers and constraints within NHS Digital to fully embracing the open source philosophy. Whilst this is a step forward, we expect services to go further, faster on open source.

The team were very open in sharing their performance and penetration test reports, which indicates a welcome level of transparency. The penetration test did not flag anything critical – the most significant point (rated as medium severity) is that the service stores and transmits ‘passwords’ in plaintext. The panel recognised in discussions that these are not passwords as such, but sole-use access codes, with limited impact on users or the service if lost.
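Even for a lower-risk sole-use access code, storing only a salted hash rather than the plaintext would close off this finding. As a hedged illustration only (this is not the team’s actual implementation, and all function names here are invented), a minimal Python sketch might look like this:

```python
import hashlib
import hmac
import secrets

def hash_access_code(code: str, salt: bytes) -> bytes:
    # Store only the salted hash server-side, never the plaintext code.
    return hashlib.pbkdf2_hmac("sha256", code.encode(), salt, 100_000)

def issue_access_code() -> tuple[str, bytes, bytes]:
    """Generate a sole-use access code (sent to the patient) and the
    salt + hash to persist server-side."""
    code = secrets.token_hex(4)        # e.g. an 8-character code for the letter
    salt = secrets.token_bytes(16)
    return code, salt, hash_access_code(code, salt)

def verify_access_code(submitted: str, salt: bytes, stored_hash: bytes) -> bool:
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(hash_access_code(submitted, salt), stored_hash)
```

The code itself still travels to the patient in the clear (it has to), but the service never needs to retain or transmit it in plaintext after issue.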

The technical team demonstrated a mature approach to the complex landscape (of systems, integrations and infrastructure) they work in. They are also very aware of the privacy issues around the sensitive data they store and manipulate.

The panel raised some concerns because the manifest of the team’s software dependencies had been removed from the open source repository in order to hide some potential vulnerabilities – a form of ‘security through obscurity’. However, after in-depth conversations, it is clear the team know the limitations of such an approach, and consider it just one extra layer of security, as the organisation takes a very conservative approach to security. They have other mechanisms in place to guarantee the integrity of their platform and are devoting considerable effort to reducing their technical debt.

They are aware of common platforms, such as GOV.UK Notify, and may use them in future for their email communications.

Their infrastructure seems very robust, with good redundancy and capacity. Moreover, they don’t rely on APIs or platforms outside their protected NHS network, which increases their resilience.

They haven’t had any major incident or outage since they launched the service, which is impressive, and a testament to the effectiveness of their security layers and their alerting and monitoring systems.

They are taking great care to ensure the service is available to users 24/7. During deploys, when the platform is unavailable for around 10 minutes on average, users can use the telephone service.

The panel has the following technical recommendations:

  • Carry on addressing technical debt, triaging and patching (when relevant) the security vulnerabilities in dependencies that may exist now and in the future
  • Take steps to mitigate the biggest technical concern, which appears to be the dependency on an external service and infrastructure team
  • Re-assess the need for Angular.js and make plans for its retirement
  • Minify and compress the CSS and JavaScript assets included in the web pages, to improve the user experience through faster download times
  • Continue the effort to open-source future code
  • In the current open-sourced repository, give instructions on how to build the project despite the dependencies file (pom.xml) not being available.
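To illustrate the minify-and-compress recommendation above, here is a hedged Python sketch. The toy asset and the naive whitespace/comment stripper are invented for illustration only – a real build pipeline would use a proper minifier (such as terser for JavaScript) and serve gzip or brotli from the web server – but it shows why both steps together shrink the payload:

```python
import gzip
import re

def minify_js_naively(source: str) -> str:
    """Very naive comment-and-whitespace stripper, for illustration only.
    (It would break real-world JS containing '//' inside strings.)"""
    source = re.sub(r"//[^\n]*", "", source)  # drop line comments
    source = re.sub(r"\s+", " ", source)      # collapse all whitespace
    return source.strip()

# An invented, repetitive asset standing in for a framework-heavy bundle.
asset = """
// handle the appointment-slot selection
function selectSlot(slotId) {
    var el = document.getElementById(slotId);
    if (el) { el.classList.add('selected'); }
}
""" * 50

minified = minify_js_naively(asset)
compressed = gzip.compress(minified.encode())
print(len(asset), len(minified), len(compressed))
```

Minification removes bytes the browser never needed; compression then exploits the repetition that remains, so each step reduces transfer size further for users on slow mobile connections.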

Design

The team have developed a very straightforward, highly usable online transaction. They have done an excellent job of removing or challenging the need for features which only get in the way, for example calendars and controls for sorting the results. This approach has been validated by a vastly increased success rate compared to the old service.

There are some constraints imposed by the existing system, especially:

  • The three pieces of information needed to authenticate
  • The information that is held about each clinic

More could be done to work around these constraints. Specifically, the team should investigate not obfuscating the ‘password’, and formatting the information held about each clinic so that it’s more legible.

The accessibility audit mostly shows that the service works for users with access needs. Where problems have been found (for example, ‘change’ links which can’t be differentiated from each other by a screen reader) it’s important to address them. Another audit should be conducted before the live assessment to confirm this.

As they try to increase uptake of the service the team need the time, resources and remit to design beyond the edges of the online transaction. The team have demonstrated the effectiveness of their approach; to move out of beta they need to apply this approach to:

  • redesigning the letters users receive
  • investigating alternative routes into the service, so that patients don’t have to wait for a letter
  • designing the interactions (e.g. confirmations, reminders) that would be appropriate after the user has booked an appointment, with the aim of reducing missed appointments

These pieces of work should happen alongside increasing awareness of the service amongst GPs, which, by itself, won’t be enough to fully realise the benefits of the improved online transaction.

Analytics

The team have demonstrated that they use analytics and have carefully considered the metrics that are important to them. They have data from multiple sources, including their backend database and their call centre. They have access to a performance analyst. Their work is managed through sprint planning and their findings feed back into the team.

The team have used analytics to inform the building of the service. Analytics is considered as part of their planning and feeds into all tasks on their kanban wall. They have thought about which metrics they need, especially measuring success across multiple journeys.

They collect metrics directly from IP information captured by their system, and can interrogate the data as and when they need to. They do not have an analytics package on the service, but are in the process of implementing Piwik. Without this they cannot track users through the pages of the journey.
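Once page-level tracking is in place, the kind of question it answers is per-step drop-off across the journey. The sketch below is a hedged illustration only – the page names and the `step_completion` helper are invented, not the team’s analytics – showing how raw pageview events become a funnel:

```python
from collections import Counter

# Hypothetical ordered journey pages; names are illustrative only.
JOURNEY = ["start", "verify", "choose-clinic", "choose-slot", "confirm"]

def step_completion(events: list[tuple[str, str]]) -> dict[str, float]:
    """events: (session_id, page) pageviews, in any order.
    Returns, for each journey step, the fraction of sessions that reached it."""
    reached = Counter()
    seen = set()
    for session, page in events:
        # Count each (session, page) pair once, ignoring non-journey pages.
        if (session, page) not in seen and page in JOURNEY:
            seen.add((session, page))
            reached[page] += 1
    total = reached[JOURNEY[0]] or 1  # guard against division by zero
    return {page: reached[page] / total for page in JOURNEY}
```

Comparing the rate at each step against the next pinpoints where users abandon the online journey and fall back to the telephone service – exactly the insight the team cannot get from IP-level data alone.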

While they have a number of KPIs, both for the Performance Platform and the team, they have not demonstrated that these directly measure whether a user need is being met.

Overall, though, the team have demonstrated a considered and thoughtful approach to tracking performance, have made good progress since Alpha, and have a good plan for future work. It is important that this continues throughout their public Beta phase.

Recommendations

To pass the next assessment, the service team must:

  • Resolve outstanding concerns relating to technology choices, particularly those that impact on the team’s ability to ‘work in the open’ and to deliver a user-centric service. There is a valuable lesson in making appropriate technology choices that do not create unexpected constraints for development, or place avoidable burdens on your users.
  • Implement Piwik on the service, to collect better, actionable data on service use.
  • Develop a performance framework that will help them to determine which metrics are needed to measure whether their users’ needs are being met. There is one that we use in GDS with services.
  • Talk to the Performance Platform team about whether they need to have a dashboard, and if they do, then work with them to produce one.

The service team should also:

  • Continue to expand their thinking about the full end-to-end service, including reviewing the need for and implementation of the written letter.
  • Further develop the notification and reminder aspects of referrals, to better inform users and reduce errors or non-attendance at appointments.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK N/A
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it N/A
Published 30 July 2018