The Centre for Evaluation and Monitoring Alpha Service Assessment Report

The service will enable customers to engage with SLC and address any issues they have by interacting directly through digital channels.

From: Central Digital and Data Office
Assessment date: 1 December 2020
Stage: Alpha
Result: Met
Service provider: Student Loan Company (SLC)

Service description

The Centre for Evaluation and Monitoring (CEM) is a significant change and improvement programme being delivered by SLC across its full range of customers and services. This element of that programme is the development of self-service tools. The service will enable customers to engage with SLC and address any issues they have by interacting directly through digital channels. By expanding online capabilities, we can provide more opportunities for students and their sponsors to self-serve, reducing the need for them to contact SLC to follow up on their application for student finance. This will ultimately allow customers to have a more complete, end-to-end digital journey with their student finance and enhance their student experience.

Service users

This service is for:

  • students who are applying for, or have applied for, student finance
  • sponsors providing financial information in support of a student’s application

The intention is to use these tools as part of all relevant SLC customer journeys, but initial development is focussing on one student finance journey.

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • research had been done with a range of users
  • the team appeared to have conducted research across a range of users with low digital skills and demonstrated an intimate understanding of how low digital skills present in users who are so-called digital natives: young and frequent users of the internet
  • research was wide-ranging and the team had a clear and strong understanding of the majority of users. Primary features appeared to respond to strong user needs. Needs were supplemented by analysis of calls to SLC, and it is clear that there is a wealth of call data that has informed some of the decision-making around the platform

What the team needs to explore

Before their next assessment, the team needs to:

  • bring clarity to the way in which they link features to user needs. The panel felt that the platform may be attempting to support an ambitious number of features that do not all have a clear line of sight to user needs
  • bring clarity to how the team has done accessibility testing and understood complex user needs. The panel was unsure how rigorously this had been investigated, and was concerned that personas and user needs did not reflect the needs of vulnerable users
  • prioritise testing with users who have an accessibility need, including users of assistive technologies such as screen readers. The panel felt this is especially important as the design diverges from GDS style
  • continue to be aware of the risk of focussing research on an (albeit large) subset of the total user base, when the plan is to roll the products and approach out more widely

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • internal teams were involved in research which helped ensure the language that users would come across was the same language that would be used by customer service advisors
  • how the changes to the online service will affect the offline journey has been considered; for example, there is a clear journey for users who do not have access to the internet or who are low on the digital inclusion scale
  • the team is committed to offering a service to internal and external users with equivalent access to information in a transparent fashion

What the team needs to explore

Before their next assessment, the team needs to:

  • support the work of the assessment panel by clearly contextualising this work alongside other parts of the programme, to enable a better understanding of how the different elements contribute to the overall user experience

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the service makes use of existing components from the GDS and HMCTS design systems, such as the progress bar
  • the service is designed consistently to look like GOV.UK and will be linked from the existing ‘Get undergraduate student finance: step by step’ journey
  • there is awareness that GDS Design System components will need to be recreated for use and compatibility on Salesforce in order to make sure the service is accessible, and expertise will be brought in during beta to help with this
  • the service allows users to complete actions without relying on human assistance but also ensures human assistance is easy to find within the service
  • the service is reusing existing page patterns from other government services
  • the service prioritises solving the major pain points of the existing service, for example, making everything the user needs to complete their application available in one place and making it clear when a user needs to take action at each stage of their application
  • important notifications are sent to users by email via GOV.UK Notify, but users can also choose to be sent notifications by post
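
A minimal sketch of how such a notification might be sent with the GOV.UK Notify Python client is shown below. The client and its send_email_notification and send_letter_notification methods are the published API; the API key, template IDs, personalisation fields and customer model are hypothetical and are not taken from the SLC service.

```python
# Illustrative sketch only: the API key, template IDs and customer fields
# below are hypothetical and not part of the SLC service.
from dataclasses import dataclass

from notifications_python_client.notifications import NotificationsAPIClient

notifications_client = NotificationsAPIClient("example-api-key")  # hypothetical key


@dataclass
class Customer:
    first_name: str
    full_name: str
    email: str
    address_line: str
    postcode: str
    prefers_post: bool          # True if the customer chose postal notifications
    outstanding_action: str     # the action the customer needs to complete


def notify_action_required(customer: Customer) -> None:
    """Send an 'action required' notification by email, or by post where the
    customer has chosen postal contact."""
    if customer.prefers_post:
        # Letters carry the recipient's name, address and postcode in the
        # personalisation fields; exact keys depend on the letter template.
        notifications_client.send_letter_notification(
            template_id="action-required-letter-template",  # hypothetical
            personalisation={
                "address_line_1": customer.full_name,
                "address_line_2": customer.address_line,
                "postcode": customer.postcode,
                "action": customer.outstanding_action,
            },
        )
    else:
        notifications_client.send_email_notification(
            email_address=customer.email,
            template_id="action-required-email-template",  # hypothetical
            personalisation={
                "first_name": customer.first_name,
                "action": customer.outstanding_action,
            },
        )
```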

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to implement key findings from usability testing into the beta while ensuring the content of the service is clear and easy to navigate, especially for users who may be using assistive technology
  • start working with a content designer to agree the hierarchy of content on the page and ensure it is logical
  • remove the reliance on bold text to indicate progress to users, and explore additional indicators
  • consider a different placement for the ‘actions to complete’ list and the application progress tracker. Currently the progress bar sits within the main text and it is not immediately obvious what stage the application has reached, even though this was one of the key pain points from the research with users. The alert for the actions to complete is at the top of the page but is not immediately followed by the list of actions; users have to click a link to jump down the page to interact with these. Users with sight impairments using screen readers may find this content structure confusing and difficult to follow
  • continue to iterate the ‘your account’ page to make it easy to understand for all users, for example, consider using the GDS Design System accordion component (https://design-system.service.gov.uk/components/accordion/) so the most important headings stand out
  • reconsider the placement of the navigation links and how much is immediately visible. Currently, the main content and the navigation links on the right side of the page are structured in a way that could be overwhelming for users with thinking and understanding impairments such as dyslexia or ADHD
  • ensure the order of content within sections such as ‘next important date’ makes sense for the user; currently the date appears last, under secondary information
  • to follow best practice, consider splitting content for ‘student finance’ and ‘supporting an application’ into secondary navigation at the top of the page rather than using the tabs component for this distinction
  • ensure the service functions correctly across all devices and that important navigation links are not pushed to the bottom of the page

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • research has been done with a range of users with different accessibility needs who map to both the high and low ends of the digital inclusion scale
  • there is an established route to allow users who may not have a permanent home address or internet access to access the service

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to run accessibility research sessions with users who use assistive digital tools such as screen readers
  • make sure the digital part of the service works for keyboard-only users; currently, tabbing through content jumps around the page in an illogical order
  • test the non-digital journey with a range of users, including users with disabilities
  • undergo a full WCAG accessibility audit on the beta version of this service
  • ensure that any possible implementation of a chatbot feature or attended webchat does not interfere with the use of assistive technology and accessibility of the service
  • consider the risks of building their own GDS pattern library within Salesforce and how this might affect the accessibility of the service
  • ensure that the service works for users with JavaScript disabled
  • make a plan to share the Salesforce GDS pattern library once it is created
  • continue to explore alternative ways to help users ‘in the moment’ so they do not need to switch between online and offline assistance

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a good mix of skills, and is committed to a sustainable balance of full-time staff and managed services roles
  • the team benefits from most of the disciplines required of a service delivery team and understands the importance of domain knowledge
  • the team has identified a sustainable home and leadership for the service going forward, as part of a change programme that will be embedded as business as usual (BAU)
  • the team is transitioning from a managed service to a sustainable mix of civil servants and specialist developer resource

What the team needs to explore

Before their next assessment, the team needs to:

  • put a more explicit and formal focus on knowledge exchange across the programme, and within the team, as work evolves and more permanent staff are brought on
  • continue to ensure that senior managers remain content with the risk and value considerations of committing to a commercial solution which may need ongoing customisation to meet GOV.UK patterns

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is committed to sharing its work with stakeholders continuously and demonstrating the outputs of its iterative development process

What the team needs to explore

Before their next assessment, the team needs to:

  • be more structured, as well as open and engaging, with the broader user community, and explore how show and tells and sprint reviews can increase the visibility of their work
  • consider how this work can be presented alongside other parts of the programme to ensure high levels of visibility around the work and future state to all stakeholders, including senior management

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team structures and iterates its approach to development and can demonstrate clear continuity and evolution of its work

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to work as much as possible within the COVID-19 restrictions to engage with end-users as the broader service elements and transactions are finalised

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

The service has given due consideration during the Alpha as to how the system can be kept secure.

What the team has done well

The panel was impressed that:

  • the service is making use of commercial security products and is not writing its own security code
  • the service will be re-using the security mechanism already in place for the Apply system

What the team needs to explore

Before their next assessment, the team needs to:

  • meet the challenge of integrating the core Salesforce platform-based system with their many line-of-business systems in a secure manner
  • ensure secure coding practices are put in place and maintained during the development
  • present detailed evidence of security engagement, as there will be far more emphasis on security during the next assessment, and include a member of the SLC security team to present to the next panel

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated a clear understanding of GDS performance measures and how to supplement those with service-specific data (see the sketch after this list)
  • the senior leader for the programme demonstrated understanding of the importance of identifying and applying key performance indicators in driving outcome benefits for the wider SLC operational service
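
For context, the mandatory performance measures for GOV.UK services are cost per transaction, user satisfaction, completion rate and digital take-up. The sketch below shows, with invented numbers rather than SLC data, how those headline figures can be derived from basic transaction counts for a reporting period.

```python
# Illustrative only: all figures are invented, not SLC performance data.
from dataclasses import dataclass


@dataclass
class PeriodStats:
    transactions_started: int    # digital transactions begun in the period
    transactions_completed: int  # digital transactions finished in the period
    digital_transactions: int    # transactions handled through the online service
    total_transactions: int      # digital plus offline (phone and post)
    total_cost_gbp: float        # cost of operating the service across all channels
    satisfied_responses: int     # survey responses rated satisfied or above
    survey_responses: int        # total satisfaction survey responses


def mandatory_kpis(s: PeriodStats) -> dict[str, float]:
    """Derive the four mandatory GOV.UK service KPIs for one reporting period."""
    return {
        "completion_rate": s.transactions_completed / s.transactions_started,
        "digital_take_up": s.digital_transactions / s.total_transactions,
        "cost_per_transaction": s.total_cost_gbp / s.total_transactions,
        "user_satisfaction": s.satisfied_responses / s.survey_responses,
    }


print(mandatory_kpis(PeriodStats(
    transactions_started=10_000,
    transactions_completed=8_200,
    digital_transactions=8_200,
    total_transactions=9_400,
    total_cost_gbp=28_200.0,
    satisfied_responses=640,
    survey_responses=800,
)))
```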

What the team needs to explore

Before their next assessment, the team needs to:

  • communicate how the measurements around the self-service element relate to the overall performance of BAU services being improved as part of the CEM programme

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

Choosing to build the service on the Salesforce platform is innovative for the government. It has the potential to enable advanced service capabilities that an organisation the size of SLC could not build on its own.

What the team has done well

The panel was impressed that:

  • the team carried out a detailed evaluation of CRM tools and how they could meet user needs, before making the choice of platform
  • the team has identified the need for Salesforce platform expertise and has begun recruiting specialists
  • the team have tooling for continuous integration and version control already in place

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that the use of the Salesforce platform does not compromise accessibility.

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team understands the importance of open sourcing their code
  • the team have already discussed the need for open sourcing with their suppliers

What the team needs to explore

Before their next assessment, the team needs to:

  • establish open-source repository hosting
  • make a clear separation of platform code from the project-specific code to make collaboration with other organisations easier
  • demonstrate a history of regular publishing of code to the open-source repository
  • endeavour to fully establish a working-in-the-open approach

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

Preparatory work has been done in alpha, but getting this point right in the beta will be the most challenging part of the project. It is crucial to the success of the service.

What the team has done well

The panel was impressed that:

  • there is an ambitious plan to pioneer the use of the Salesforce platform as a new technology in government
  • the service understands the need to replicate GDS design and accessibility standards on new technology
  • there is a recognition of the risks and challenges involved
  • the team has made the supplier aware of the need to adapt the platform to GDS standards
  • the team has already made contact with other government organisations that are interested in the same new technology

What the team needs to explore

Before their next assessment, the team needs to:

  • create new implementations of GDS components based on the Salesforce platform
  • publish the source code for these components
  • establish collaboration with the wider government community on the development and maintainability of the new components and patterns
  • undertake external evaluations of the design, accessibility, coding and security of the user interface throughout the beta phase, rather than leaving this for the next GDS assessment

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team already has experience of running a live service, which will directly benefit this service
  • the reliability of the Salesforce platform was considered in making the choice to use it

What the team needs to explore

Before their next assessment, the team needs to:

  • provide detailed plans for how user needs can be met if the service goes offline
  • consider call centre capacity: there is currently enough to support the service, but if this service reduces the need for that capacity when it is live, the team needs to show how the remaining capacity will cope if the new system goes offline
  • plan for the most likely point of failure, which is in the integration between the Salesforce platform and the line-of-business background processes. In that situation users would be able to log in to their accounts but would experience errors later in their journeys. The system design needs to cater for this as well as more general failures (see the sketch below)
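
One common way to cater for that failure mode, sketched below under assumed names, is to accept the user's request, queue it for a later retry against the line-of-business system and show an honest "pending" state in the account view instead of a hard error. The function, exception and queue names are hypothetical and do not describe the SLC design.

```python
# Hypothetical sketch of degrading gracefully when a line-of-business
# integration is unavailable; all names here are illustrative only.
import logging
import queue
from typing import Callable

logger = logging.getLogger(__name__)
retry_queue: "queue.Queue[dict]" = queue.Queue()


class LineOfBusinessUnavailable(Exception):
    """Raised when a downstream line-of-business system cannot be reached."""


def submit_account_update(update: dict, send_downstream: Callable[[dict], None]) -> dict:
    """Try to apply an account update downstream; if the integration is down,
    queue the update for retry and report a pending state so the user's
    journey does not end in an unexplained error."""
    try:
        send_downstream(update)
        return {"status": "applied"}
    except LineOfBusinessUnavailable:
        logger.warning("Line-of-business system unavailable, queuing update for retry")
        retry_queue.put(update)
        return {
            "status": "pending",
            "message": "We have received your update and will apply it shortly.",
        }
```
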
Published 23 January 2021