Get help to retrain alpha assessment

The report from the alpha assessment of DfE's Get help to retrain service on 21 February 2019.

From: Central Digital and Data Office
Assessment date: 21 February 2019
Stage: Alpha
Result: Met
Service provider: Department for Education

Service description

People want good jobs. Projections say automation in the labour market will change around 30% of existing roles, affecting lower-skilled workers the most, threatening their pay and conditions over time and displacing some roles entirely. Many of the people affected don’t have the skills, confidence or social capital to thrive through this change to the jobs market. We want to improve their ability to compete and to secure good jobs in the long term.

We are developing a set of linked products which together give low- and medium-skilled workers optimised information, advice and guidance about a better job for them, and the training and support to get that job. The Get Help to Retrain product (which is being assessed) helps people identify their skills and preferences and the jobs that suit them, understand which jobs are available locally, and find suitable training that could help them get one of those jobs.

Service users

People who are in employment, aged 24 or over, without a degree or equivalent level 6 qualification, and who work in jobs at high risk of automation.

1. Understand user needs

Decision

The team met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team clearly articulated who the user group for this service is: people aged 24 or over, without a degree or equivalent level 6 qualification, who work in jobs at high risk of automation
  • through analysis of the research, the team did good work in highlighting six key factors to consider for trainee users, which then informed their detailed user needs and personas. The findings of the research have also informed the development of the service to date, for example having a step-by-step guide through the process, the option to contact a careers adviser (by phone, email or chat) at each stage, and showing the distance to travel to each of the courses listed in the search
  • the team have done some great work on accessibility and inclusion. They identified a range of accessibility issues that their users face in employment and that will therefore need to be considered when developing the training course offer in beta
  • they also took the time to really understand the characteristics of their users and the barriers they face

What the team needs to explore

Before their next assessment, the team needs to:

  • once in beta, do more work to understand how this service complements or replicates other job search and training options available to employees, including the National Careers Service (NCS) itself
  • as this service builds on other existing services, draw on the learning from those services, through evaluations and assessment feedback, to inform this one. Part of this is looking at the users’ journeys to and from these add-ons
  • explore how the accessibility and barrier data will inform the development of the service

2. Do ongoing user research

Decision

The team met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team did a substantial amount of research to understand the needs of employees who are considering, or will need to consider, getting another job if their jobs become automated. They spoke to 420 people in total from around England, who were a mixture of employees, employers and service providers
  • one of the key elements of the approach was the involvement of the whole team in the research process, particularly the policy team, in site visit observations and analysis
  • the team discussed how this helped to reduce bias in the research approach
  • the research consisted of depth and group interviews, co-design sessions, workshops and remote unmoderated usability tests
  • the team identified their four riskiest assumptions for the service, and explained that they had tested these with a small sample of their users. There are also more risky assumptions for the whole end-to-end journey, which will be tested in the other services. The team explained that they planned to test these with hundreds more people in private beta to reduce the risk of the service
  • the team highlighted that it may be difficult to recruit users to test their service in private beta because this user group aren’t necessarily motivated or ‘contemplating’ looking for employment while in a job. The team mentioned that they may have to use incentives to recruit people

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to iterate the quiz and look at the added value it brings to the service, if any (how accurate is it? If it is being pulled from elsewhere, but tailored, how are user needs and research informing that process?)
  • continue to iterate the course filter, as users may need more than three options - how could the service best display those and how can they most usefully be filtered
  • iterate the ‘distance to course’ part of the service and explore how this could include distance from work or other locations (see the sketch after this list)
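To illustrate the ‘distance to course’ point above, the sketch below shows one way distance from more than one starting point (home or work) could feed into the course listings. It is written in plain Ruby to match the team’s chosen stack; the course data, coordinates and function names are invented for the example, not taken from the service.

```ruby
# Illustrative only: a plain-Ruby sketch of computing 'distance to course'
# from more than one starting point. All names and sample data are hypothetical.

# Great-circle (haversine) distance in miles between two latitude/longitude points.
def distance_in_miles(lat1, lon1, lat2, lon2)
  earth_radius_miles = 3958.8
  d_lat = (lat2 - lat1) * Math::PI / 180
  d_lon = (lon2 - lon1) * Math::PI / 180
  a = Math.sin(d_lat / 2)**2 +
      Math.cos(lat1 * Math::PI / 180) * Math.cos(lat2 * Math::PI / 180) *
      Math.sin(d_lon / 2)**2
  earth_radius_miles * 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a))
end

# Hypothetical course listings with venue coordinates.
courses = [
  { title: "Digital skills bootcamp", lat: 53.4808, lon: -2.2426 },
  { title: "Customer service level 2", lat: 53.4084, lon: -2.9916 }
]

home = { lat: 53.4839, lon: -2.2446 }
work = { lat: 53.4754, lon: -2.2553 }

# Show whichever starting point is closer, so the listing reflects real travel options.
courses.each do |course|
  nearest = [home, work]
            .map { |p| distance_in_miles(p[:lat], p[:lon], course[:lat], course[:lon]) }
            .min
  puts format("%s: %.1f miles from the nearest starting point", course[:title], nearest)
end
```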

3. Have a multidisciplinary team

Decision

The team met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team comprises a mixture of experienced contract staff and permanent civil servants in the service owner and product owner roles. The key roles in a multidisciplinary team were covered well, and all those present demonstrated a strong knowledge of the domain and the challenges they face
  • it was clear that the DfE representatives present who led much of the discussion were enthusiastic converts to agile, user-centric practices and were able to bring a wealth of subject matter expertise into the development team

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that any team managing the National Retraining Service is sustainable; this will likely require a shift away from contract staff towards internal specialists
  • ensure that the organisation’s recruitment plans are able to keep pace with the department’s ambitions to develop digital services

4. Use agile methods

Decision

The team met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • this team has made a lot of progress, and developed new skills and working practices as a result of working in an agile, user-centric way
  • the work began with a traditional policy-making team, which has since taken what they describe as a ‘hard turn into agile, user-centric design’
  • as well as a delivery team working in line with agile principles, the team also has a Programme Director who is fully engaged with the team’s ways of working and has attended user research sessions alongside them. Consequently the relevant policy development is now being influenced by the outputs of user research, and the team have used that to help Ministers understand this unfamiliar way of working

What the team needs to explore

Before their next assessment, the team needs to:

  • take care to test risky assumptions before they reach beta development. Their current approach to research in alpha is clearly delivering benefits, but lightweight prototyping should remain part of their development approach into beta, to avoid costly and avoidable missteps
  • keep checking for mismatches between policy concepts and real-world delivery - evidence from research, such as the experience of discussing ‘disposable income’ with people who didn’t recognise the concept, suggests these can occur

5. Iterate and improve frequently

Decision

The team met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has moved from a traditional procurement-based approach, seeking off-the-shelf technology solutions to stitch together, towards owning responsibility for and designing solutions within DfE
  • this has allowed or encouraged them to work in a far more user-centric fashion, designing, building and testing ideas and concepts with real potential users
  • the team is now working around agile sprints and has developed a sustainable cadence for their work

6. Evaluate tools and systems

Decision

The team met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have thoughtfully considered the best tools for developing the beta product, following DfE recommendations to use Azure PaaS, GOV.UK standard toolkits and recommended languages (Ruby on Rails)

What the team needs to explore

Before their next assessment, the team needs to:

  • resolve in the beta the significant outstanding questions around integration with external systems from NCS and DWP
  • verify early on the vision of amalgamating the two disparate sources of information into a cohesive, usable UI for skills quizzes
  • plan for the fact that curating data around skills and job listings will require a management console and an integrated identity and access management (IDAM) solution for staff. Questions around how this might be implemented will need to be addressed early in the beta. A technical spike covering the management and integration of data from external sources is recommended (a sketch of what such a spike might exercise follows this list)
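As a hedged illustration of the recommended technical spike, the sketch below shows the kind of exercise it could cover: normalising course records from two external feeds into one internal schema and surfacing data quality problems early. The feed formats, field names and sample data are hypothetical; the real NCS and DWP integrations may look quite different.

```ruby
# Illustrative only: one shape a data-integration spike could take, normalising
# course records from two hypothetical external feeds into a single internal
# schema. The feed formats and field names below are invented for the sketch.
require "json"

NCS_FEED = '[{"courseTitle":"Welding basics","provider":"College A","postcode":"M1 1AA"}]'
DWP_FEED = '[{"name":"Forklift licence","org":"Provider B","location":"LS1 4AP"}]'

# Map each feed's fields onto one internal representation.
def normalise_ncs(record)
  { title: record["courseTitle"], provider: record["provider"], postcode: record["postcode"], source: "ncs" }
end

def normalise_dwp(record)
  { title: record["name"], provider: record["org"], postcode: record["location"], source: "dwp" }
end

courses = JSON.parse(NCS_FEED).map { |r| normalise_ncs(r) } +
          JSON.parse(DWP_FEED).map { |r| normalise_dwp(r) }

# Reject records missing fields the UI depends on, so data quality problems
# surface during the spike rather than in the live service.
valid, invalid = courses.partition { |c| c.values_at(:title, :provider, :postcode).none?(&:nil?) }
puts "#{valid.size} usable records, #{invalid.size} rejected"
```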

7. Understand security and privacy issues

Decision

The team met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have identified the limited nature of the personally identifiable information held and plan to segregate it from the rest of the system in a secure fashion (a sketch of this kind of segregation follows this list)
  • the team are adhering to general good practices for security and privacy in their architectural design
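As a hedged illustration of that segregation, the sketch below shows one simple pattern: holding personal details in a separate, access-controlled store and passing only an opaque reference to the rest of the system. The class, store and field names are invented; the team’s actual design may differ.

```ruby
# Illustrative only: a plain-Ruby sketch of keeping personally identifiable
# information apart from the rest of the system by storing it separately and
# sharing only an opaque reference. Store and field names are hypothetical.
require "securerandom"

PII_STORE = {}       # stands in for a separate, access-controlled data store
MAIN_RECORDS = []    # the rest of the system only ever holds the opaque reference

def register_user(name:, email:)
  reference = SecureRandom.uuid
  PII_STORE[reference] = { name: name, email: email }
  MAIN_RECORDS << { user_ref: reference, quiz_results: [], saved_courses: [] }
  reference
end

register_user(name: "Sam Example", email: "sam@example.com")
puts "Main system record holds no PII: #{MAIN_RECORDS.last}"
```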

8. Make all new source code open

Decision

The team met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are working in the open and making all their code publicly available
  • the team are planning to make reusable components around skills-to-job-profile matching that can be used by other agencies and departments

9. Use open standards and common platforms

Decision

The team met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are adopting open standards throughout and intend to use common government platforms such as GOV.UK Notify and, at a later stage, GOV.UK Verify (a usage sketch for Notify follows this list)
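For reference, the sketch below shows roughly what sending a notification through GOV.UK Notify looks like with the notifications-ruby-client gem. The API key, template ID and personalisation fields are placeholders rather than values from this service, and whether the service sends this particular email is an assumption.

```ruby
# Illustrative only: a minimal sketch of sending a confirmation email through
# GOV.UK Notify using the notifications-ruby-client gem. The API key, template
# ID and personalisation fields are placeholders, not values from the service.
require "notifications/client"

client = Notifications::Client.new(ENV.fetch("NOTIFY_API_KEY"))

client.send_email(
  email_address: "user@example.com",
  template_id: "00000000-0000-0000-0000-000000000000", # hypothetical template ID
  personalisation: {
    "course_name" => "Digital skills bootcamp",
    "provider" => "College A"
  }
)
```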

What the team needs to explore

Before their next assessment, the team needs to:

  • the use of Azure PaaS should be balanced with the development of robustly portable and reusable components built on open standards
  • the choice of IDAM for both staff and citizens should be considered for compatibility with existing DfE and government platforms; best endeavours to avoid platform lock-in are recommended

10. Test the end-to-end service

Decision

The team met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have plans for a good development pipeline and will be using best-practice testing strategies (a sketch of the kind of end-to-end test this could include follows this list)
  • the team have good plans for managing their infrastructure and intend to hire a specialist for DevOps support throughout the development of the service
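As an illustration of the testing strategies mentioned above, the sketch below shows the kind of end-to-end journey test the pipeline could run, assuming RSpec and Capybara in a Rails application. The route, form labels and page content are hypothetical.

```ruby
# Illustrative only: the kind of end-to-end journey test the pipeline could run,
# assuming RSpec and Capybara (standard in Rails apps). The route, labels and
# page content are hypothetical.
require "rails_helper"

RSpec.describe "Finding a training course", type: :system do
  it "lets a user search for courses near a postcode" do
    visit "/find-a-course"

    fill_in "Postcode", with: "SW1A 2AA"
    click_button "Search"

    # The journey should end on a results page the user can act on.
    expect(page).to have_content("Courses near you")
  end
end
```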

What the team needs to explore

Before their next assessment, the team needs to:

  • consider whether the entire user journey can be tested if part of the journey is taken on third-party sites. Whilst there are complications to this, the service team is responsible for the whole service journey, so further consideration of how best to work with third parties who are providing part of the transaction will need to take place in the beta. Loss of availability or lack of accessibility when signing up for the retraining programmes offered is a significant concern and needs to be addressed
  • look at the service manual guidance on getting a start page: https://www.gov.uk/service-manual/service-assessments/get-your-service-on-govuk

11. Make a plan for being offline

Decision

The team met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have considered assisted digital and offline support at a reasonable level for an alpha assessment
  • the team have given due consideration to resilience and availability of the technical infrastructure they intend to use for development of the beta service

What the team needs to explore

Before their next assessment, the team needs to:

  • the availability and resilience of third-party training course providers may affect the user experience of citizens using the service; the service team should consider, at a reasonable level, contingencies for these external sites being unavailable at the time the service directs citizens to them

12. Make sure users succeed first time

Decision

The team met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have tried different ideas and iterated to meet identified user needs and behaviours - making significant changes based on research insights. The team learned quickly that the first idea to meet the policy intent wouldn’t work for users, and changed approach
  • the whole team, including policy, has been involved in research, synthesis, hypothesis for testing and design ideation
  • the team identified the importance of motivational and assisted digital aspects, and worked hard on these, bringing to the foreground assistance and support for all users
  • the team understand the difficulty of bridging to third-party course provider websites - especially for users of assistive technology. They identified good ideas for ensuring that the journey isn’t broken at this point

13. Make the user experience consistent with GOV.UK

Decision

The team met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • using the step-by-step pattern feels like the right approach to the journey for users. The team should liaise with the step-by-step team at GDS to share knowledge on this relatively new pattern

What the team needs to explore

Before their next assessment, the team needs to:

  • the team should keep testing and working to influence the content of the linked products, so that this service works as a coherent, consistent service. This applies specifically to ‘check my skills and experience’ (the quiz), which is being created by another team and assessed independently. That product contains components that are not in the design system, and its content needs to work in the context of this service

14. Encourage everyone to use the digital service

Decision

The team met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is working hard to identify users with a range of digital skills, and is using research findings to achieve meaningful improvements for users who might otherwise be left behind
  • they have actively sought to test their service with users with limited digital skills, who present the greatest challenge for a team providing effective digital solutions

What the team needs to explore

Before their next assessment, the team needs to:

  • continue this valuable work to ensure inclusion for those most in need of the service, and where possible share the learnings and evidence gathered across the wider government community

15. Collect performance data

Decision

The team met point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • at this early stage the DfE team have put considerable effort into their approach to performance data. They have proposed four separate ‘lenses’ to measure performance (‘product’, ‘support’, ‘user need’ and ‘outcomes’), combining practical measures of their technical implementation with more complex challenges such as the progression of retrained people into qualitatively ‘better’ jobs as a result of retraining

What the team needs to explore

Before their next assessment, the team needs to:

  • the work completed so far is thorough and interesting, and it would be valuable to share the thinking behind it with the performance management community within government
  • it would also potentially make a good case study to share outside government through blog posts etc.

16. Identify performance indicators

Decision

The team met point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • performance measurement will investigate not only the success of the online service, but also the conversion into use of the service, and subsequent conversion of ‘retrained’ users into better jobs
  • there will also be separate identification of users progressing through ‘assisted digital’ journeys, and evidence on the value or success of face-to-face and telephone support channels

17. Report performance data on the Performance Platform

Decision

The team met point 17 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team are well on track

18. Test with the minister

Decision

The team met point 18 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team have been engaging effectively with policy officials, and are using evidence from their work to inform the Minister and seek their support
