Submit monthly contributions for NHS Pension Scheme beta assessment

The report for the beta assessment of the NHS Business Services Authority's Submit Monthly Contributions for NHS Pension Scheme service on 5 July 2018.

From: Central Digital and Data Office
Assessment date: 05 July 2018
Stage: Beta
Result: Met
Service provider: NHS Business Services Authority

The service met the Standard because:

  • user research is thorough, with knowledge and findings shared with the whole team and stakeholders
  • the team took a good technical approach during alpha and private beta, gathering details about APIs to be built, integration and testing plans, and documentation, which they have shared
  • design and content patterns are being followed
  • this is a capable and consistent team working in a collaborative and iterative fashion, led by an empowered service manager and product owner, with good relationships with the SRO as well as NHS BSA colleagues and support teams

About the service

Description

Over 9,150 NHS employers submit pension contributions each month. Around 2,150 of these employers submit contribution details using paper-based means, including emails and spreadsheets. Employers find this onerous, problematic, costly and time-consuming.

The service will ensure all employers within the NHS Pension Scheme have a digital means to submit contribution details for payment of pension contributions.

Service users

NHS employers, who can be one of the following high-level types of employer:

  • direction bodies, such as hospices, charities and local councils
  • GP practices and medical centres
  • trusts, including local hospitals

Detail

The top-level need the service meets is the following: as an NHS employer, I need to submit my monthly contributions on time, so that I don’t receive a fine.

The scope of the service currently includes the following functions:

  • employers can schedule and submit pension contribution payment details
  • employers receive notifications to remind them about upcoming payment deadlines
  • BSA admins can create employer accounts
  • employer admins can create employer accounts (standard)
  • finance can retrieve daily contributions for onward processing into the finance system

User needs

The team demonstrated good rigour and integrity in their user research. They involved a wide range of users in testing and showed a detailed plan with insights from each session and the iterations made based on findings after each sprint.

They have further refined a list of personas since the alpha assessment to better define the needs of each group. They defined the most common problems experienced by these user groups in the standard format “As a user, I need X, so that I can Y”.

The team also operated in the typical user research working pattern, where a hypothesis informs design, which leads to user research sessions that create insights, which in turn feed into the iteration stage. Performance analytics were also brought into this process to strengthen it.

The team showed a list of validated user needs, as well as needs that are still being researched and unvalidated needs, in order to illustrate how well their hypotheses have been tested.

Due to the difficulty of getting GPs’ and other healthcare professionals’ time for user research sessions, most user research took place online, which is a good alternative in situations where users cannot easily be recruited in person.

In total, the team performed 5 sprints of user testing, which equates to 20 hours of research watched.

To ensure they are involving people with accessibility needs, the team sends a questionnaire to check participants’ digital skills each time before they are recruited. The team has not identified many users with these kinds of needs and admitted that this has been a struggle, since such people are rarely employed in the relevant roles. We discussed that this does not mean that people with assisted digital needs will not be employed in these roles in the future. The team has conducted an accessibility review to ensure that the application meets accessibility needs. Although the service requires some knowledge of the pension administration process, the team should also test with users who have accessibility needs at specialist centres where such users are available to take part in user research.

The team should also test with users on mobile phones and tablets. Although the team believes that their users will be accessing this service from their work machines, we know that more and more users tend to perform complex tasks online using mobile phones and tablets. The number of such users will grow in the coming years, and the team should ensure that every kind of user is able to use the system.

It will also be important for the team to explore further whether there are any reasons, other than not being able to access POL, for users to rely on paper-based forms. Although lack of access to POL looks like the largest barrier, there may be other underlying issues.

Team

In summary, this is a capable and consistent team working in a collaborative and iterative fashion, led by an empowered service manager and product owner, with good relationships with the SRO and the wider NHS BSA team, along with engagement with policy stakeholders.

The team is using good-practice agile methodology to deliver this service via two-week sprints. The panel was pleased with their description of the flexible, collaborative agile working patterns and methodology. They are using a combination of digital planning tools, including Trello for planning, Slack and email for communication, and Confluence and Jira to track their stories and backlog and to make priorities transparent.

The team is of a good size, with the mixture of skills and disciplines required to satisfy the deliverables of public beta, including the required dedicated digital service manager and product manager. The team has access to, and is already engaged with, other key roles required for strategic guidance and senior stakeholder support from a centralised pool of resource via the NHS BSA programme, including an assisted digital and accessibility lead, quality assurance testers and performance analysts. Moving into public beta, the team will be working closely with the Continuous Improvement team to help monitor performance and KPIs. The team indicated that the amount of time they received from these disciplines was flexible and responsive to their needs. The panel is confident the team has the expertise required to support a user-centred end-to-end service.

There is a satisfactory amount of senior engagement, with clear escalation routes and reporting to the SRO, who is regularly engaged. Weekly collaborative reports are produced by working with project managers to get an end-to-end view of delivery, covering digital, policy, comms and other relevant teams. The amount of senior engagement has clearly been thought out and planned by the team and wider NHS BSA colleagues. The team is utilising the stakeholder engagement team to develop comms plans to ensure consistency and breadth of engagement.

Moving forward into public beta, the key roles are to stay the same. The team is aware of the risks to knowledge transfer and service continuity, but mitigates these with a low ratio of contractors and 75% or more permanent BSA staff. They undertake and encourage wide engagement through regular knowledge-sharing sessions, with programme-wide show and tells and lessons-learnt sessions attended by teams ranging from security and the call centre to platforms.

Technology

The team was able to show that they are working in an agile environment using open source. They are using GitLab for source control and CI/CD pipeline management across various environments, and Artifactory for build artefacts, which supports ease of iteration and deployment. The team has good plans for testing, including unit tests and Sonar static analysis (coverage, FindBugs, CheckStyle, duplication).
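For illustration only, the sketch below shows the kind of JUnit unit test such a pipeline would run. The class and method names and the deadline rule are assumptions made for this example, not the team’s actual code or the scheme’s actual rules.

```java
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

import java.time.LocalDate;

import org.junit.Test;

// Hypothetical JUnit 4 test of the kind run in the GitLab CI pipeline described
// above. The "due by the 19th of the following month" rule is an illustrative
// assumption, not a statement of the scheme's actual deadline.
public class ContributionDeadlineTest {

    // Assumed rule: a contribution for a given month is on time if it is
    // submitted on or before the 19th of the following month.
    private boolean isOnTime(LocalDate contributionMonth, LocalDate submittedOn) {
        LocalDate deadline = contributionMonth.plusMonths(1).withDayOfMonth(19);
        return !submittedOn.isAfter(deadline);
    }

    @Test
    public void submissionOnTheDeadlineIsOnTime() {
        assertTrue(isOnTime(LocalDate.of(2018, 6, 1), LocalDate.of(2018, 7, 19)));
    }

    @Test
    public void submissionAfterTheDeadlineIsLate() {
        assertFalse(isOnTime(LocalDate.of(2018, 6, 1), LocalDate.of(2018, 7, 20)));
    }
}
```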

The team is using the Spring Boot framework with Java 8 for development, along with PostgreSQL. The APIs being built are RESTful. The team needs to reach a greater level of maturity in its APIs, and plans to look into the API maturity model and adopt it.
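As a minimal sketch of what a RESTful resource in this stack could look like, the example below shows a Spring Boot controller targeting Java 8. The endpoint, payload shape and class names are assumptions for illustration, not the team’s actual API; the Richardson maturity model is the model most commonly referenced when assessing REST API maturity.

```java
import java.net.URI;
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;

// Hypothetical sketch of a RESTful Spring Boot (Java 8) resource for monthly
// contribution submissions; names and payload shape are illustrative assumptions.
@RestController
@RequestMapping("/contributions")
public class ContributionController {

    // In-memory store for illustration only; the real service uses PostgreSQL.
    private final Map<UUID, Map<String, Object>> submissions = new ConcurrentHashMap<>();

    // POST /contributions creates a new monthly submission and returns
    // 201 Created with a Location header (resources + HTTP verbs + status codes).
    @PostMapping
    public ResponseEntity<Map<String, Object>> submit(@RequestBody Map<String, Object> payload) {
        UUID id = UUID.randomUUID();
        submissions.put(id, payload);
        return ResponseEntity.created(URI.create("/contributions/" + id)).body(payload);
    }

    // GET /contributions/{id} retrieves a previously submitted contribution.
    @GetMapping("/{id}")
    public ResponseEntity<Map<String, Object>> get(@PathVariable UUID id) {
        Map<String, Object> found = submissions.get(id);
        return found == null ? ResponseEntity.notFound().build() : ResponseEntity.ok(found);
    }
}
```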

The team is using AWS cloud services (NHS BSA and Arcus). Scaling and disaster recovery are planned well, with a good incident management system. The service follows the security standards prescribed at organisational level, including:

  • anti-virus (Arcus): Trend Micro
  • firewall (Arcus): Check Point, with WAFs
  • DDoS protection (AWS): AWS Shield
  • intrusion detection: OSSEC; intrusion prevention: Trend Micro and AWS GuardDuty
  • vulnerability scans: Nessus
  • vulnerability updates: AWS Systems Manager (DPS 2 days, PPS 4 days), with Tenable.io feeding informed discussion with the IT security team

The environment is monitored with Arcus CMaaS (Customer Monitoring as a Service) for CPU, memory and disk space, Prometheus and AWS CloudWatch, ELK for logs, Consul for service availability monitoring, and OpsGenie for incident triage.

The team is using a collaborative approach to detailed design: all developers join in and share knowledge and skills on the proposed component design. This helps to distribute knowledge around the team, and designs are peer reviewed by other developers. Services are built with security rules in the code. The team is using HTTPS with TLS 1.0 and will upgrade to TLS 1.2 when the organisation-wide upgrade happens. The team is planning to analyse, develop and build a more detailed security architecture during beta. The developers are using Cucumber for testing and Google Analytics for performance measurement. The service team has good plans for determining the controls needed and for penetration tests, and works with the threat and fraud teams on CIA assessments and remediations.
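To illustrate the Cucumber-based testing mentioned above, the sketch below shows hypothetical Java step definitions. The feature wording, step names and the simulated response are assumptions, not the team’s actual tests.

```java
import static org.junit.Assert.assertEquals;

import cucumber.api.java.en.Given;
import cucumber.api.java.en.Then;
import cucumber.api.java.en.When;

// Hypothetical Cucumber step definitions for a contribution submission journey.
// The scenario wording and the simulated status code are illustrative assumptions.
public class SubmitContributionSteps {

    private int responseStatus;

    @Given("^an employer is signed in$")
    public void anEmployerIsSignedIn() {
        // Sign-in setup would go here (omitted in this sketch).
    }

    @When("^they submit their monthly contribution details$")
    public void theySubmitTheirMonthlyContributionDetails() {
        // In a real test this step would drive the deployed service over HTTP
        // or through the browser; here we simply simulate a successful submission.
        responseStatus = 201;
    }

    @Then("^the submission is accepted$")
    public void theSubmissionIsAccepted() {
        assertEquals(201, responseStatus);
    }
}
```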

The team needs a robust plan to scale from the current 20 users to 9,000 users. Plans and technology for providing accessibility for disabled people and support for assistive technology were not available. The team should be able to show the steps and development in this area in the near future.

Design

The service team have done good work to keep to the GOV.UK design style and patterns where appropriate. As this is an NHS BSA service, there are some aspects that are out of remit for this assessment point.

The team have evidenced that most users get through the service first time without major issues. The team are aware of certain pain points regarding the use of domain-specific language. They are monitoring this and adding specific hint text where appropriate to help user understanding. The team have tried different design variations, such as ‘one page per thing’, and evidenced that the current design better meets user needs.

It’s recommended that the team focus on improving the journey around adding new employer users. This can currently only be done via the initial welcome email. It’s quite rudimentary right now, and the team have noted they will improve it over the coming iterations of the service as they better understand different user needs and employer business processes. It’s highly recommended that they do this as soon as possible. The team are undertaking a phased approach to onboarding new users, and it’s really important that they improve the ‘add new employer admin users’ journey before the bulk of new users are onboarded.

There are established design patterns in how to do this within products such as GOV.UK Notify and GOV.UK Pay. The team should reference these products for design patterns around how to add new admin users.

The welcome email itself also has quite a few errors. The team mentioned that it is in the process of being tested and refined. They noted that most users understood what was being said to them in the welcome email and understood what they needed to do. It’s important the team continue to iterate this email to better meet user needs.

Analytics

The service team had a shortlist of 5 metrics they consider most useful in evaluating the success of the product. The number of late payments was the only service-specific metric, alongside the 4 standard service KPIs. Supplementary metrics were also identified and collected.

They had good examples of where data has been used to make changes to the service, and where it has prompted further research. They also understood the key metrics were subject to change throughout the development of the product.

Their analyst spoke about their use of Google Tag Manager, and how an organisation-wide implementation is being developed. This is good forward-thinking, and goes beyond the immediate and essential requirements of this service, hopefully ensuring consistency and efficiency in other NHS BSA projects.

They have slightly adapted the cost per transaction metric as defined in the service standard, but their approach has been approved and makes the calculation simpler and more relevant for their service, while still covering all available channels.

The team’s performance analyst role is effectively part time (combined with user research). In future, the analysis will be provided by a team shared across other BSA services. As the service grows, it is important that the analysis function is not downgraded as part of these arrangements.

User satisfaction measures are captured as users complete transactions. Because this is a monthly task for most users, to avoid repetition the team spoke about varying the questions over time. The panel would be interested to see how this is handled.

The performance platform dashboard has been set up, but it is under-populated and slightly out of date. This should be addressed, with arrangements made for keeping it up to date, before the service goes into the next phase.

Recommendations

To pass the next assessment the service team must:

  • focus on improving the journey around adding new employer users (see design feedback section for detail). It’s highly recommended that they do this as soon as possible
  • fix the errors in the welcome email and continue to iterate the email to better meet user needs
  • research with current and likely users with disabilities to understand existing and potential barriers. We would need to see how the service team plan to remove or mitigate such barriers for users with disabilities
  • test the service on mobile phones and tablets
  • explore if there are any other barriers to using the online service by users who use paper forms right now in addition to the key problem of not being able to use POL
  • research with mobile phone and tablet users to ensure that the system is supported by different devices
  • bring to life how, and how frequently, they plan to continually develop and improve the end-to-end service
  • demonstrate plans and technology for providing accessibility for disabled people and support for assistive technology, which were not available at this assessment, and show the steps and development in this area in the near future
  • clearly define the security architecture for the next stage
  • define the maturity of the APIs that are to be integrated
  • test the service with the minister responsible, not just the ministerial office, via the Department of Health and Social Care. Please send evidence of agreement to review to the assessment team by 1 August 2018
  • publish their dashboard on the performance platform and make arrangements to keep it up to date

The service team should also:

  • closely follow whether the decommissioning of the N3 network and the legacy migration are on track. Any suspicion of delay should prompt the team to create mitigation plans to reduce the impact on delivery
  • continue utilising the pre-existing stakeholder engagement team and comms plans to ensure consistency and breadth of engagement, and review the onboarding of new users to increase digital uptake among assisted digital users
  • develop a robust plan to scale from the current 20 users to 9,000 users

Next Steps

You should follow the recommendations made in this report before arranging your next assessment.

Submit feedback

Submit feedback about your assessment.

Get advice and guidance

The team can get advice and guidance on the next stage of development by:

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Not Met
Published 22 January 2019