Recruit an apprentice (Provider Posting) - Beta Assessment

The report from the beta assessment for SFA's Apprenticeship Applications (Provider Posting) on 20 July 2016.

Stage: Beta
Result: Met
Service provider: The Skills Funding Agency (BIS)

The service met the Standard because:

  • The team have a good understanding of their users and their needs
  • The team have built on the successful technical approach from the find an apprenticeship service

About the service

Service Manager: Gary Tucke
Digital Leader: Emma Stace

The service allows apprenticeship providers to post vacancies to the ‘Find an Apprenticeship’ service, and manage applications. It also allows agency staff and suppliers to manage and review vacancies.

Detail of the assessment

Lead Assessor: David Illsley

User needs

The team have demonstrated a good understanding of who their users are. In discovery and alpha they used contextual inquiry to create their personas, paying attention to how employers and providers work together. The personas are based on individual characteristics of the provider users, and the team made sure less confident users were represented among them. The team also spent time with the Serco QA users to understand their workflow and needs. The personas have been iterated over the course of the beta.

The team identified user needs in discovery and alpha, using surveys and interviews. The needs have been verified during the private beta and have not changed. The team have stated that they have not encountered many users with assisted digital needs, despite reviewing support phone calls and assessing all research participants on the digital inclusion scale. This could be because all users will be in an office environment, with colleagues to assist them if needed.

The team have used a mixture of lab-based, contextual on-site and remote research alongside a survey of 192 providers undertaken in private beta. The team have observed over 200 hours of research with 104 providers. The panel was also impressed to hear the team had built their own research lab in Coventry.

The team have observed users with screen readers and screen magnifiers and use an automated testing suite to check their code is accessible as it is developed. There has also been a full accessibility audit, and the results have been shared with Richard Morton on the accessibility team at GDS.

The research findings are displayed on a board, which the team then prioritises. The findings are then triaged into the backlog as either stories or epics.

The service has been tested with the previous minister; however, the incumbent minister was only appointed a few days before the assessment.

Team

The team are taking an agile approach to development, having started with scrum and moved to a scrum/kanban hybrid, which is currently working well for them. Appropriately enough, the agency is offering technical apprenticeships to build technical skills in the organisation.

The intention is to fold multiple products together to create a single ‘Digital Apprenticeship Service’. When doing so, the agency should take care to structure the agile teams so that they can continue to work effectively.

While there is a relatively complete team, there is not currently a dedicated designer; this role is being performed by a frontend developer. While this does seem to be working at the moment, the team should keep it under review, particularly as the service expands and they progress through beta.

Technology

The team have built on the technical foundations of the successful find an apprenticeship service, though they have made some different choices based on experience. The entire solution will be hosted in public cloud, which is a substantial step forward for the agency. The team were aware of the potential lock-in issues around their chosen platform, and should continue to manage those risks.

The team have taken a phased and considered approach to migration from the old system. The plan presented expects to switch over in a single month. This is reasonably ambitious, and the team should keep this under review.

The solution depends on an existing IDAM solution to manage access by providers and employers. This has yet to be updated to meet the digital service standard, though the agency plans to do this later in 2016.

Design

The MVP has been based on vacancy creation. Features that are not essential to successfully posting a vacancy have been left out of the MVP, and are being prioritised for future development based on feedback.

Initially, the design was based on a collaborative model between providers and employers. However, this did not test well and the design was iterated to be more task focussed. The research has been focussed around these tasks and included contextual research observing users inputting real data into the live system.

The team have also researched the support model and the sign in process including resetting passwords.

The single sign on system does not follow the GOV.UK style guide or patterns. This is a legacy system that is used across the apprenticeships services. The team have decided to use this system to prevent users from needing a separate set of credentials to use this service. The team stated there will be an opportunity to redesign the sign in system using the GOV.UK styles in the near future.

The rest of the service follows the GOV.UK style guide and patterns. The team stated they looked for existing patterns for the dashboard, specifically looking at the performance platform and GOV.UK Notify for inspiration.

As the service is 100% digital and there are no alternative channels, digital take up is not considered to be an issue. Users will be transitioned from the legacy system to the new service over the course of the public beta.

Analytics

The team has established baselines (for costs, user satisfaction and completion rate) to allow comparison with the existing AV service.

They are taking data from the back-end to link transaction starts with users who complete using save and return, which will enable them to measure completion and drop-outs more effectively.

They are already working with the Performance Platform to set up a dashboard with the 4 KPIs, and are able to isolate the costs of this service from other apprenticeship services.

The team plans to use metrics to monitor how effective vacancies are and to help investigate why some vacancies receive fewer applications than others.

Because the team’s performance analyst is shared with other SFA teams, the team needs to ensure sufficient time is available to focus on the ‘recruit’ part of the service. However, the panel would also like more attention given to measuring the effectiveness of the whole apprenticeships service from end to end, not just individual transaction types.

Recommendations

The service team should:

  • continue to take an iterative approach to migration, and avoid rushing if feedback indicates more work is needed
  • look in more detail at how employers use the service, and continue to recruit users with assisted digital needs, especially those from smaller companies
  • redesign the single sign on system using the GOV.UK design patterns and style guide

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 3 January 2017