NHS Jobs - Get a job with the NHS

The service is the largest digital marketplace for health jobs, allowing employers to publish NHS-related job vacancies and applicants to apply for those roles.

Service Standard reassessment report

From: Central Digital and Data Office
Assessment date: 28 March 2019
Stage: Alpha reassessment
Result: Met
Service provider: NHS Business Services Authority

The service met the Standard because:

  • the team clearly showed how they have prioritised key user groups and have iterated the design and service based on in-depth understanding of user needs
  • the team is appropriately resourced, works well together in an agile way and shares best practice across the wider NHS BSA organisation
  • the team has a well thought-out technical architecture and process for evaluating tools and third party systems

Description

The service is the largest digital marketplace for health jobs, allowing employers to publish NHS-related job vacancies and applicants to apply for those roles. In this alpha, the service focused on small employers (GP practices) and job candidates.

Service users

GP practice managers and job applicants.

Detail

User needs

During the reassessment, the team demonstrated in detail how they have addressed the concerns around user needs that the panel had during the initial assessment.

Specifically:

The team has reduced the scope of the work significantly by focusing on the needs of ‘small’ users (like GP practices) first, with a view to widening this scope over time. As a result, the team was able to more clearly articulate who will be using the service, with an initial focus on job applicants and employers.

The team has also done more work to understand the pain points in the current service and presented an annotated journey map of the end-to-end recruitment process. Furthermore, they explained their prioritisation process, which is informed by the identified pain points and usability testing (ranked by criticality and the number of users affected). This approach appears solid and seems to have eliminated the various non-essential features that were initially planned.
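
As an illustration only (the team's actual scoring model is not described in this report), a prioritisation of this kind could weight each pain point's criticality by the share of users it affects. All field names and figures below are hypothetical.

```typescript
// Hypothetical pain-point record: criticality on a 1 (cosmetic) to
// 4 (blocker) scale, plus how many research participants hit it.
interface PainPoint {
  description: string;
  criticality: 1 | 2 | 3 | 4;
  usersAffected: number; // participants who hit the problem
  totalUsers: number;    // participants in the research round
}

// Score = criticality weighted by the share of users affected;
// sort the backlog so the highest-scoring pain points come first.
function prioritise(points: PainPoint[]): PainPoint[] {
  const score = (p: PainPoint) =>
    p.criticality * (p.usersAffected / p.totalUsers);
  return [...points].sort((a, b) => score(b) - score(a));
}

// Invented examples only.
const backlog = prioritise([
  { description: "Cannot save a draft application", criticality: 4, usersAffected: 6, totalUsers: 8 },
  { description: "Unclear qualifications question", criticality: 2, usersAffected: 7, totalUsers: 8 },
]);
backlog.forEach((p) => console.log(p.description));
```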

It remains important for the team to stay focused on identifying and articulating the core user needs. The set of epic-level user needs the team presented is fairly system-focused and at times describes actions within the service (“I need to be able to search vacancies”, “I need to accept/reject the starting/hiring details”, “I need to shortlist potential candidates”) rather than the underlying user need (for example, “as an employer I need to find high-quality candidates”). These read more like user stories, and it would be good for the team to articulate the underlying needs separately. By formalising these needs and establishing their hierarchy, the team will also be able to rationalise the prioritisation of what they currently see as ‘conflicting user needs’ (candidate needs versus employer needs).

It is important for the team to consider which research methodology is most appropriate at each stage, and to describe those methodologies accurately. For example, ‘A/B testing’ refers to the statistical testing of two variants on a live service (or potentially a public beta service with high enough volume). The team, however, used this term to refer to testing two versions through usability sessions, which is incorrect. The team also mentioned that their prioritisation ratings had been done “in accordance with Jakob Nielsen’s heuristic evaluation methodology”, which gave the impression that they had carried out heuristic evaluations. In fact, the team had used Nielsen’s severity ratings for usability problems and applied them to the findings from usability testing.
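
To illustrate the distinction, a genuine A/B test compares two live variants statistically, for example with a two-proportion z-test on completion rates. The sketch below is illustrative only; the figures are invented and nothing here reflects the team's data.

```typescript
// Two-proportion z-test: the statistic behind a real A/B test.
// successes = e.g. completed applications; trials = visits shown
// that variant. All numbers below are invented.
function twoProportionZ(
  successesA: number, trialsA: number,
  successesB: number, trialsB: number,
): number {
  const pA = successesA / trialsA;
  const pB = successesB / trialsB;
  const pooled = (successesA + successesB) / (trialsA + trialsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / trialsA + 1 / trialsB));
  return (pA - pB) / se;
}

const z = twoProportionZ(480, 4000, 540, 4000);
// |z| > 1.96 means the difference is significant at the 5% level.
console.log(`z = ${z.toFixed(2)}`);
```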

Overall, the team has carried out a significant amount of research, and has made a lot of progress since the initial alpha assessment. The team should continue to progress this work, as they will need to start building their understanding of the needs of other types of users. Doing more accessibility testing must be a key part of this.

Team

Since the original assessment, the team has addressed key gaps in its set-up and is now much better positioned to progress the alpha and move into private beta. There is now a single product owner who is ultimately responsible for prioritisation decisions and works closely with the service manager and the new dedicated delivery lead. In line with the original report's recommendations, the team has also increased its User-Centred Design (UCD) capability: there are now three user researchers and three designers who work across the different streams of the service (for employers and for job applicants). There is also a performance analyst, whose input was reflected in the maturity of the analysis the service presented on analytics and performance.

Probably the most problematic area of the last assessment was the lack of clear, prioritised scope and focus for the service, which meant the team was trying to address everything at the same time. They have solved this issue: during this reassessment they presented a very mature approach to prioritising pain points and user needs, and to iterating on them. The service should continue to maintain this clarity of scope and priorities. In private beta, it would be helpful to see a high-level strategic roadmap that shows how they plan to build out the service to other users.

It was great to hear that the team is very active in collaborating and sharing lessons learned across the wider NHS BSA organisation and beyond, and that many people regularly attend their show and tells. Engagement and collaboration with their many, varied users will become even more critical in beta, so the service should invest even more effort in considering how best to engage with the numerous GP practices and trusts, for example through roadshows, newsletters and sharing regular show and tell content with users and stakeholders.

Technology

The panel was pleased to see that the service team has significantly reduced the complexity of the service since the first alpha assessment. For example, they have decided to remove a number of third-party integrations and the API requirement until the Minimum Viable Product (MVP) is ready. By initially focusing only on small users, they have also simplified the workflow for MVP delivery. The service team still shows commendable flexibility in their architecture: they have made it clear they see it as a ‘living document’ and are willing to update and improve it as development progresses.

The service team still has a comprehensive, well thought-out technical architecture. The focus on microservices is good in theory; however, it remains largely theoretical. None of the microservices could be shown working during the assessment, and the panel remains concerned that the service team may not have done enough development to properly validate their approach. A key microservice will be PDF generation, but the service team had not tested, and could not demonstrate, a working PDF generator. A lot of accessibility testing had been completed to ensure the service design was accessible, but none with functional software that would prove the technical architecture could deliver an accessible service. As the service team moves into beta, they should prove as quickly as possible that the technical approach is sufficient for the service.
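
As a purely illustrative sketch of what a standalone PDF-generation microservice could look like (the report does not say how the team's services are built; the Node.js stack, the express and pdfkit packages, and the endpoint and payload below are all assumptions):

```typescript
// Hypothetical standalone PDF-generation microservice using the
// express and pdfkit npm packages; nothing here is confirmed by
// the assessment report.
import express from "express";
import PDFDocument from "pdfkit";

const app = express();
app.use(express.json());

// POST { "applicantName": "...", "vacancyTitle": "..." } and
// receive a PDF streamed back in the response.
app.post("/pdf/application", (req, res) => {
  const { applicantName, vacancyTitle } = req.body as {
    applicantName: string;
    vacancyTitle: string;
  };

  res.setHeader("Content-Type", "application/pdf");
  const doc = new PDFDocument();
  doc.pipe(res); // stream pages straight into the HTTP response
  doc.fontSize(16).text(`Application for: ${vacancyTitle}`);
  doc.moveDown().fontSize(12).text(`Applicant: ${applicantName}`);
  doc.end(); // finishes the document and closes the response
});

app.listen(3000, () => console.log("PDF service listening on :3000"));
```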

The service team presented a well-documented process for evaluating tools and systems. After the first alpha assessment, the panel was concerned about the large number of tools and systems that would be required to deliver the service. Limiting the service’s scope has alleviated many of these concerns, and the panel was pleased to see the service team has also given considerable thought to how they will evaluate tools and systems as new requirements are reintroduced. Provided they continue to use this process going forward, the service has ‘Met’ this point.

The service uses JavaScript to render pages. Because accessibility testing has not been conducted with working code, it is difficult for the panel to determine whether the JavaScript will affect assistive technology products. The service team should test this as soon as reasonably possible. The service team said they plan to use techniques such as progressive enhancement to handle JavaScript failures gracefully, which the panel is pleased to hear. The service team should focus on making sure failures can be handled gracefully at any point in the user journey.
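
A minimal sketch of progressive enhancement, assuming a baseline HTML form that posts and works with no JavaScript at all; the data-enhance hook and the enhancement itself are illustrative, not the team's code:

```typescript
// The plain HTML form submits natively without any script. This
// enhancement intercepts submission and uses fetch; if anything in
// the enhanced path fails, the native submission is the fallback.
const form = document.querySelector<HTMLFormElement>("form[data-enhance]");

if (form) {
  form.addEventListener("submit", (event) => {
    if (!("fetch" in window)) return; // older browser: plain POST proceeds

    event.preventDefault();
    fetch(form.action, { method: "POST", body: new FormData(form) })
      .then((response) => {
        if (!response.ok) throw new Error(`HTTP ${response.status}`);
        // success: show confirmation in place of a full page reload
      })
      .catch(() => form.submit()); // any failure: native submit instead
  });
}
```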

The service team mentioned they will work with other teams and providers to use an identity verification service. The panel is pleased to see that they will consider other NHS and cross-government identity verification platforms first and do not plan to build this functionality themselves.

Design

The team has taken on board the feedback from the first assessment and demonstrated that they can change everything from the scope of the service and the members of the team to the way the prototypes are designed and iterated.

The team is working with service designers to scope the service appropriately, to make sure it is designed end to end, front to back and across all channels, and to prioritise pain points and ideas. The team must make sure they continue to apply service design thinking as the service grows and changes, particularly as new user groups and journeys are considered.

Once the team starts adding functionality to the service which this user group doesn’t currently use, they hope to be able to meet unstated user needs and encourage these users to use these functions because they add value and are easy to use.

Designing a service with some features that are hidden or optional for users is a very tricky challenge. The panel looks forward to seeing how the team approaches this and what learning they can draw on from inside and outside government. It is possible that the different types of employer users will need two or more different services.

The team referenced developing design hypotheses and testing alternative design ideas with users. It is clear that they are iterating their designs based on research findings, which is great to see. The team discussed how they had come to a more mature understanding of “one thing per page” through research and design iterations.

The researchers and designers seem to have developed an effective relationship and sprint rhythm. They should continue to evaluate and iterate their ways of working together as the team and the service grows. Documenting research findings and design decisions is important, but a researcher’s job isn’t over once a report is written. The team showed a nice visualisation of how their design iterations had performed in testing over time.

The team provided a good example of inclusive design in how they had addressed the ‘qualifications’ question in the application form.

The team should be informing NHS Employers about their findings and should become more influential in helping this group determine best practice. They should then make sure that best-practice processes and approaches are the foundation of their service, and that their service makes it easy to follow best practice.

The team should reach out to the legal stakeholders who are mandating the wording of the ‘convictions’ question and demonstrate to them the effect of this question. The team can also test alternative approaches and gather evidence that there are other ways of meeting the legal teams’ objectives more effectively. We recommend the judicious use of user research highlight videos.

The team demonstrated that they are thinking about unhappy paths, errors and failure paths, and looking for ways to minimise these or help users to recover.

The team mentioned that employers are unsure what kind of content to put in the job description. The team could explore providing far more example content, for instance giving employers three very different examples of what could be said in each section. This might help overcome employers’ uncertainty about what to enter and also encourage best practice by showing them the kind of content that is most effective.

Analytics

Not applicable. The service met all analytics standard points in the original assessment.

Recommendations

To pass the next assessment, the service team must:

  • ensure the scope and service growth in beta continue to be based on a well-defined prioritisation methodology and principles
  • build out and test key elements of the technical architecture as soon as possible to prove that it will work for the service
  • visualise the product roadmap in a single, coherent format, so it’s clear what’s in scope at each stage of the service development
  • continue to work with experienced service designers to ensure the service is scoped appropriately, decisions about which pain points and ideas to focus on are made in an evidence-based way, and the service is designed end to end, including all channels. These journeys start and end outside of this service, and ‘offline’ activities definitely need to be understood and considered in the design
  • continue to refine their hierarchy of user needs, ensuring that each user group has a single top level need which reflects the thing they are trying to do, and that this need is always understood as more important than anything below it in the hierarchy
  • carry out regular accessibility testing, including on JavaScript code (one way to automate part of this is sketched after this list)
  • ensure designers and researchers on the team continue to engage with the cross-government UCD community to learn from others’ experience and broaden their own skills and understanding
  • make sure there is a content designer working on the design and understanding the research in every sprint
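
As one possible way to automate part of the accessibility testing recommended above (the tooling choice, Playwright with @axe-core/playwright, and the URL are illustrative assumptions; automated checks complement rather than replace testing with assistive technology users):

```typescript
// One possible automated accessibility check: drive a page with
// Playwright and scan it with axe-core. The URL is hypothetical.
import { chromium } from "playwright";
import AxeBuilder from "@axe-core/playwright";

async function main(): Promise<void> {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto("http://localhost:3000/apply"); // hypothetical page

  const results = await new AxeBuilder({ page }).analyze();
  for (const v of results.violations) {
    console.log(`${v.id}: ${v.help} (${v.nodes.length} instances)`);
  }

  await browser.close();
  // Fail the CI job if any violations were found.
  process.exit(results.violations.length > 0 ? 1 : 0);
}

main();
```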

The service team should also:

  • consider from early on in private beta how to increase engagement with its different users and stakeholders. Even though the service will only focus on a small subset of users in private beta, close collaboration and engagement should start with the larger trusts and other GP practices who are not participating, as this will take time and could otherwise become a stumbling block to moving to public beta or live
  • leverage the power this service could wield to encourage best practice in hiring processes across the NHS. Consider shifting focus from accommodating users’ current practices to helping users follow best practice. Consider setting up an internal service community centred around getting a job with the NHS. There is also an existing cross-government service community centred around employing someone that might be worth considering
  • provide more thought leadership and set direction for their stakeholders, including legal teams and regulatory bodies like NHS Employers
  • create a persona for API users once the API functionality is re-introduced into the service
  • consider sharing the tools and systems evaluation process more widely, for example through show and tells or blog posts

Next steps

You should follow the recommendations made in this report before arranging your next assessment.

Submit feedback

Submit feedback about your assessment.

Digital Service Standard points

Point | Description | Result
1 | Understanding user needs | Met
2 | Improving the service based on user research and usability testing | Met
3 | Having a sustainable, multidisciplinary team in place | Met
4 | Building using agile, iterative and user-centred methods | Met
5 | Iterating and improving the service on a frequent basis | Met
6 | Evaluating tools, systems, and ways of procuring them | Met
10 | Testing the end-to-end service, and browser and device testing | Met
12 | Creating a simple and intuitive service | Met
13 | Ensuring consistency with the design and style of GOV.UK | Met
14 | Encouraging digital take-up | Met
Published 23 January 2020