Access my Levy - Alpha Assessment

The report from the alpha assessment of the Department for Business, Innovation & Skills’ Access my Levy service on 15 June 2016.

Stage: Alpha
Result: Met
Service provider: Department for Business, Innovation & Skills

The service met the Standard because:

  • The Skills Funding Agency have built on experience from their exemplar development, forming a highly effective multi-skilled team and embedding digital-by-default approaches into their work.
  • The team have developed a strong understanding of their users and those users’ needs, and have used it to build effective prototypes.
  • They have carried out a thorough, extensive analysis of the design and technology challenges the service faces, and of the approaches to be used during beta.

About the service

Service Manager: Gary Tucker

Digital Leader: Emma Stace

The service allows employers to view, spend and manage their payments towards the apprenticeship levy.

Detail of the assessment

Lead Assessor: Emma Stace

User research

User space

The team has demonstrated an understanding of the user space in relation to the apprenticeship service and their service-specific user group for managing levy payments.

The team have benefitted from access to users through established agency contacts, but the panel were particularly impressed by the effort made to reach less involved and potential users. The team worked to understand the various aspects of a business’s experience when employing apprentices, attitudes towards the levy, and businesses’ capabilities, presenting these as personas, segment profiles and an analysis of digital skills and accessibility needs.

At the beta assessment the panel would like to hear more about the team’s understanding of internal users, and what those users will need in order to provide a quality end user experience.

User needs

As this is a new service, the panel recognise it can be more difficult to understand what users will need. The team has demonstrated how they’ve approached this task: not simply asking users what they want, but digging beneath user statements to find the “actual need”, and employing prototyping techniques to research a service that does not yet exist. The panel was pleased this work helped the service develop beyond a financial management tool to include apprenticeship management and forecasting.

The move into private beta offers the team a good opportunity to evaluate the scope and user needs in light of everything learned during alpha. This exercise will also help the team prioritise and focus ideas for private beta.

The user research methodology/approach

The panel was impressed with the way the team reviewed existing research from BIS and SFA before starting their own. The team went on to conduct a large number of interviews during their alpha. The team discussed some of the challenges of bringing research findings into the team’s thinking, and the steps taken, such as making Tuesdays and Wednesdays ‘in office, team days’, to balance fieldwork with feeding findings back to the team.

The team spoke about using survey methodology, which the panel was interested to hear about. However, surveys should be designed very carefully to elicit robust findings, especially when researching future services or reaching out to potential users. The panel hope the team will be able to influence how surveys are conducted in the future to maximise the benefits of this research technique.

The team

The team is co-located and multidisciplinary, comprising permanent civil servants and interim/contractor staff as appropriate. The Skills Funding Agency clearly has an increasingly mature in-house digital capability, building on their experience from the digital transformation programme to ensure empowered agile delivery teams. This capability extends beyond technical development, with strong relationships with both policy development and operational delivery colleagues. This was a particular strength which the panel wishes to commend the team for, and it was hugely encouraging that their policy colleague also attended the assessment.

The Skills Funding Agency has conducted an internal reorganisation which has helped to embed and normalise this way of working, and helped the team to ramp up quickly. Improving internal capability has been prioritised, for example by embedding a ‘knowledge transfer’ requirement into specialist procurement.

Working practices, tools and technologies were all designed to enable the team to deliver, reducing friction and enabling effective collaboration, both within the team and across other related projects through a ‘scrum of scrums’. Additionally, establishing ‘clans’ for knowledge sharing across disciplines will be helpful for future development.

The team were clearly motivated and energetic, and the panel are confident they have the skills, environment and support to deliver a successful private beta.

Technology

During the discovery and alpha phases the team have extensively investigated the required integration points for the service, identified the potential risks, and proposed a technical architecture to mitigate downtime of external dependencies.
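The report doesn’t specify how that mitigation is implemented. As a hedged illustration only, the sketch below shows one common pattern for tolerating downtime in an external dependency: a simple circuit breaker that fails fast during an outage so the service can fall back to, for example, cached data. All names here are hypothetical and not taken from the assessed service.

    import time

    class CircuitOpenError(Exception):
        """Raised when calls to a dependency are suspended after repeated failures."""

    class CircuitBreaker:
        """Suspend calls to a flaky external dependency for a cool-off period.

        Hypothetical illustration: the service's actual architecture is not
        described in this report.
        """

        def __init__(self, max_failures=3, reset_after=30.0):
            self.max_failures = max_failures
            self.reset_after = reset_after
            self.failures = 0
            self.opened_at = None

        def call(self, fn, *args, **kwargs):
            # While the circuit is open, fail fast until the cool-off expires.
            if self.opened_at is not None:
                if time.monotonic() - self.opened_at < self.reset_after:
                    raise CircuitOpenError("dependency temporarily unavailable")
                # Cool-off over: allow calls again and count failures afresh.
                self.opened_at = None
                self.failures = 0
            try:
                result = fn(*args, **kwargs)
            except Exception:
                self.failures += 1
                if self.failures >= self.max_failures:
                    self.opened_at = time.monotonic()
                raise
            self.failures = 0
            return result

A caller could catch CircuitOpenError and serve a cached levy balance, so an upstream outage degrades the service rather than breaking it.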

The team have evaluated their proposed technical stack, cloud hosting environment and supporting tools and technologies, building on the work of other Skills Funding Agency digital services. The team described an impressively mature development toolchain, automated test processes, and continuous integration/deployment pipelines for multiple environments.
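The report doesn’t name the team’s toolchain. As an illustrative sketch only, a post-deployment smoke test of the kind that often sits at the end of such a pipeline might look like the following; the environment names and URLs are invented for illustration.

    # Post-deployment smoke test, runnable with pytest.
    import pytest
    import requests

    # Hypothetical environments; the report does not name the team's actual ones.
    ENVIRONMENTS = {
        "test": "https://test.example-levy-service.gov.uk",
        "staging": "https://staging.example-levy-service.gov.uk",
    }

    @pytest.mark.parametrize("env", sorted(ENVIRONMENTS))
    def test_healthcheck_returns_ok(env):
        """Each environment should answer its healthcheck after a deployment."""
        response = requests.get(ENVIRONMENTS[env] + "/healthcheck", timeout=10)
        assert response.status_code == 200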

The team are working closely with both their senior information risk owner and their security consultant at this early stage, demonstrating a mature and considered approach to information security. The team has begun to think about traffic and load patterns and already has an impressive understanding of what will be required.

The team have been publishing their code with an appropriate open source licence from the outset, avoiding the friction and security concerns of making code public retroactively.

Open and common standards are used where possible (authentication, AMQP, JSON REST APIs) and the team are working to ensure that commodity components and cloud hosting providers could be changed if required. The team has already engaged with cross-government platforms such as GOV.UK Notify.
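GOV.UK Notify publishes official client libraries, so a sketch of what that engagement might look like in code is possible; the one below uses the real notifications-python-client library, but the API key handling, template ID, personalisation fields and helper function are placeholders for illustration, not details from the assessed service.

    # Sending a transactional email through GOV.UK Notify with the official
    # notifications-python-client library. Template ID and personalisation
    # values below are placeholders.
    import os

    from notifications_python_client.notifications import NotificationsAPIClient

    def notify_levy_statement_ready(email_address: str) -> None:
        """Email an employer when a new levy statement is available (hypothetical)."""
        client = NotificationsAPIClient(os.environ["NOTIFY_API_KEY"])
        client.send_email_notification(
            email_address=email_address,
            template_id="00000000-0000-0000-0000-000000000000",  # placeholder
            personalisation={"service_name": "Access my Levy"},
        )

Keeping notifications behind a small helper like this also supports the portability goal above: the commodity component can be swapped without touching calling code.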

The team described how the service forms part of a wider programme, with shared components across the whole, and how the SFA work to ensure that teams are able to work independently without becoming blocked by other programme teams. The panel were interested to hear about ongoing work on integration testing for the wider programme as a whole, and we look forward to hearing more about how this has been tackled at the beta assessment.

Design

The team demonstrated a prototype that explored many areas of the proposed service, focusing in particular on forecasting and levy understanding. The team identified early on in user research that understanding the levy and predicted payments was crucial to the success of the service.

The team has the challenging task of designing for a brand new service where needs and policy are still being worked out, and for companies ranging from medium-sized to very large. Focus appeared to be on medium-sized businesses, and the team will need to ensure the service still makes sense for very large companies and for users who only do a small number of the possible tasks.

The panel agreed that prioritising levy understanding was important for a new service. However, several other areas need to be explored further and should be prioritised as the team move into beta. These include: registration, user management and delegation of responsibility; dealing with PAYE groupings; the new user and empty account state; and making the actions users need to take clear.

Unlike many other services, Access my Levy will have all users starting from a ‘fresh’ or empty state. The prototype has so far used ‘fake’ data and scenarios for research. It will be important to design for the ‘new user’ state to ensure the actions needed by users are clear. The panel would have liked to see more work in this area at alpha stage, but acknowledge the team felt understanding of the levy was a greater risk and higher priority.

The current prototype allows users to do a number of different tasks - some users may only be interested in some areas, whilst others will need to complete tasks across the service. As a result, the journey for users and what they need to do isn’t immediately obvious. The panel believes that designing and researching for the ‘empty state’ will help identify needs in this area and ensure that users are clear on what they need to do.

The panel were particularly impressed to hear that the team had actively engaged with other teams in the development of the work, and had planned more collaboration in the future. This includes working with the accessibility team at GDS and the cross-government design community.

Analytics

Work carried out so far to understand key performance indicators and to engage with the GDS Performance Platform team is appropriate for the current stage of development. The team should use their private beta to get a better understanding of the users’ expectations, and how these may be measured. There may be a particular challenge where there isn’t a clear path to completing the service, which will need further investigation, but the panel were confident that the team were up to this challenge.

Recommendations

To pass the next assessment, the service team must:

  • Establish a private beta with ‘real’ users to ensure that the targeted and guided research conducted so far is backed up by unmoderated use.
  • Ensure the following areas of user research and design are explored: user accounts and access permissions, PAYE groupings, and making user journeys and actions clearer.

The service team should also:

  • Build on their understanding of internal users and what they need to provide a quality end user experience.
  • Blog about the work done and the relationship with the policy team, to share lessons learned and invite input and contributions from others.
  • Consider the name of the service, testing user understanding of the terminology used. The team need to be wary that language in common use within government isn’t as well understood by their target audience.

Digital Service Standard points

Point | Description | Result
1 | Understanding user needs | Met
2 | Improving the service based on user research and usability testing | Met
3 | Having a sustainable, multidisciplinary team in place | Met
4 | Building using agile, iterative and user-centred methods | Met
5 | Iterating and improving the service on a frequent basis | Met
6 | Evaluating tools, systems, and ways of procuring them | Met
7 | Managing data, security level, legal responsibilities, privacy issues and risks | Met
8 | Making code available as open source | Met
9 | Using open standards and common government platforms | Met
10 | Testing the end-to-end service, and browser and device testing | Met
11 | Planning for the service being taken temporarily offline | Met
12 | Creating a simple and intuitive service | Met
13 | Ensuring consistency with the design and style of GOV.UK | Met
14 | Encouraging digital take-up | Met
15 | Using analytics tools to collect and act on performance data | Met
16 | Defining KPIs and establishing performance benchmarks | Met
17 | Reporting performance data on the Performance Platform | Met
18 | Testing the service with the minister responsible for it | Met
Published 18 January 2017