Civil Service Learning service - Alpha Assessment

The report from the alpha assessment for GDS's Civil Service Learning service on 19 April 2016.

Stage: Alpha
Result: Not Met
Service provider: Government Digital Service

Result of service assessment

The assessment panel has concluded the Civil Service Learning service has not shown sufficient progress and evidence of meeting the alpha criteria and should not proceed to private beta yet.

Detail of the assessment

Lead Assessor: Adam Maddison

Researching and understanding user needs [points 1, 2]

The team has undertaken extensive user research, seeing over 100 participants across the country through a combination of interviews, walkthroughs of design mock-ups and, later, testing of prototypes.

The team were able to talk about the user needs that they had identified from this research, which emphasised the role of learning within the line management/team relationship and the impact of the context in which the service is used. Whilst the team were able to explain the decision to focus on learning for operational roles, it wasn’t clear why the scope of the alpha had been limited to the individual learner journey, as this in itself did not meet the full range of user needs that had been identified.

As a result, the team did not address challenging issues, such as balancing the needs of the individual with those of their line manager, managing confidentiality, and supporting team-wide learning, or how learning content fitted in with the wider journey.

The team have employed appropriate methods for carrying out user research in alpha, and clearly understood that, because the context of use is so important, ongoing research needs to be carried out primarily in situ rather than in a lab. However, whilst the team have established a regular, sprint-based rhythm for user research, the current plan for beta is based on lab research rather than research with users in their place of work and, ideally, on their own kit.

Running the service team [point 3]

The team is well structured, capable and led by an enthusiastic, truly empowered service manager. However, there are acknowledged gaps in the team, notably in content and service design and in business analysis. The service team has plans in place to fill these roles, and the panel agrees that this will need to be done so that the team can research, prototype and test a broader service, giving the panel confidence that the team could progress to beta development.

There is currently no dedicated frontend developer on the team; this skillset is instead covered by the existing developers. The service team may wish to consider whether a dedicated frontend role would be advantageous once the team is ready to build the beta service.

Designing and testing the service [points 4, 5, 12, 13, 18]

The team demonstrated a prototype that explored a ‘new starter’ journey. Whilst the panel were confident in the usability of this journey - with users demonstrably succeeding first time - the prototype only demonstrated a narrow slice of the wider journey that users will need to go through. The panel would like to see more exploration across the entire Civil Service Learning platform, including the learning record and sharing learning with managers.

The panel had some concerns that the team hadn’t explored how course content ‘slotted’ into the wider platform, for example by testing whether clearly indicating to users that they are currently within a course would benefit them.

The panel appreciated that the team had looked at models for testing whether learning was effective. The panel agree that this is a vital area to address so that the service can provide the most valuable learning and development opportunities for civil servants.

Technology, security and resilience [points 6, 7, 8, 9, 10, 11]

The existing CSL system stores a substantial amount of data about civil servants, including names, roles, grades, disability data, and physical locations and attendance lists for training events. For certain professions, such as those in defence and security, the learning record may also expose further information about a person’s day-to-day role, for example whether they have undertaken training around handling sensitive information.

The team must consider the privacy level and security of this data as per the Service Standard and the OFFICIAL threat model. Paul Coleman, the GDS Senior Information Risk Owner (SIRO), has been engaged and will sign off the approach.

The intended beta timeline includes a migration phase where the existing CSL system will be moved from Rackspace to Amazon Web Services (AWS) using a ‘lift and shift’ approach. The panel notes this creates significant new risks around the data, for example around access controls to infrastructure in the new AWS public cloud.

The panel would expect the team to undertake a rigorous security impact analysis before this work begins to guard against such risks and would expect any findings to be presented at the next assessment.

Some personnel in the Civil Service are using devices incapable of accessing multimedia training materials, such as online videos, due to legacy technology in use in some departments. The panel suggests the CSL team apply appropriate pressure to departments and agencies, through engagement with the CTS programme, to ensure they update these devices, rather than going out of their way to support out-of-date browsers or unacceptably poor network connectivity in new courses.

Improving take-up and reporting performance [points 14, 15, 16, 17]

The service team has good insight into the metrics that could be used to monitor how the service is performing and, understanding how costly a poorly designed learning plan can be, has clearly been thinking about how to measure the effectiveness of learning.

Analytics on the prototype already allow tracking of users through the entire journey - something not currently achievable in the existing service.

The service team are aware of the need to continue validating some of the assumptions they have made by looking at data on what users are looking for, how long they spend on the site, and what they might add to a wishlist. Continuing to gather insights into these areas will be vital to delivering a service that meets users’ needs.

Recommendations

To pass the reassessment, the service team must:

  • Expand the prototype and usability testing to include exploration of more areas of the Civil Service Learning service, and test these with users. This will enable the team to have a better understanding of areas to develop in the beta phase.

  • Ensure that explored areas cover the key users of the service and the core user needs identified. Areas to explore should include a learner being able to share/prove their learning, and the line manager’s user journey so that the online interaction within the line management relationship is clearly defined.

  • Conduct a security impact analysis around the migration of the legacy CSL system from Rackspace to AWS before this work commences.

The service team should also:

  • Document user needs so they link back to the evidence base, for example by documenting the overall learning and development journey and mapping needs onto it, or by creating personas and scenarios.

  • Ensure that there is a clear plan for ongoing user research that allows for usability testing in users’ context of use and on their kit.

  • Engage more actively with departments to improve access to the service and to remove barriers that prevent users from accessing content.

  • Explore the ways in which learning content is delivered, including how learning content is styled and positioned in relation to the wider service.

  • Include research into how the service should look and behave, and how navigation should work, whilst a user is ‘inside’ a course.

  • Research ways of testing whether learning has been effective.

Digital Service Standard points

Point | Description | Result
1 | Understanding user needs | Not Met
2 | Improving the service based on user research and usability testing | Met
3 | Having a sustainable, multidisciplinary team in place | Met
4 | Building using agile, iterative and user-centred methods | Met
5 | Iterating and improving the service on a frequent basis | Met
6 | Evaluating tools, systems, and ways of procuring them | Met
7 | Managing data, security level, legal responsibilities, privacy issues and risks | Met
8 | Making code available as open source | Met
9 | Using open standards and common government platforms | Met
10 | Testing the end-to-end service, and browser and device testing | Met
11 | Planning for the service being taken temporarily offline | Met
12 | Creating a simple and intuitive service | Not Met
13 | Ensuring consistency with the design and style of GOV.UK | Met
14 | Encouraging digital take-up | Met
15 | Using analytics tools to collect and act on performance data | Met
16 | Defining KPIs and establishing performance benchmarks | Met
17 | Reporting performance data on the Performance Platform | Met
18 | Testing the service with the minister responsible for it | Met
Published 12 April 2017