I want to go fishing - Alpha Assessment

The report from the alpha assessment of the Environment Agency's I want to go fishing service on 4 April 2016. This service is also known as I want to fish.

Stage: Alpha
Result: Met
Service provider: Environment Agency

The service met the Standard because:

  • The objectives of the alpha have been met: a prototype has been designed, built and tested around user needs, and a team has been formed that is working well in an agile way.

  • The team have good foundations to move into a beta phase.

  • The team are considering the full end-to-end service and the wider user base, including internal staff, in their work.

About the service

Service Manager: Jerry Gallop

Digital Leader: John Seglias

The service will enable anglers to get permission to fish for different types of fish at various locations. It will replace the existing Post Office service and will also support changes to the licence structure and price. The revenue generated from licence sales goes directly back into initiatives that benefit anglers. The service will have over 1 million users a year.

The vision is to add new products to the service over time and increase the number of users, including targeting new anglers. The service also includes enforcement and compliance, so there will be internal users such as bailiffs.

Detail of the assessment

Lead Assessor: Julian Dos Remedios

Researching and understanding user needs [points 1, 2, 10]

The team used available data and information to inform the initial development of their online prototype and assisted digital support. They demonstrated good examples of how they researched and iterated the online prototype in response to their ongoing research. The team used an agile approach, including working in fortnightly sprints, and were proactive in seeking out potential users, including at fishing locations and a trade event.

The team have so far focused their research on the online parts of the service. They are carrying out user research and usability testing every sprint, using a variety of research methods including broad market research and meeting users directly on river banks and at angling shows. The team have generated more than 200 user stories from their research.

The team had identified some top-level user needs, and were considering them alongside business and departmental aims such as generating revenue to improve angling facilities, gathering information to improve marketing and enforcement, and increasing participation in the sport.

They have created three personas, including one with low digital skills. They recognise the need to expand this work and have plans to do so in beta.

The team’s understanding of their 1 million plus users’ broad demographics leads them to expect an above average proportion to have low digital skills and/or resistance to using an online service. Despite this, they had struggled to recruit and research with users with low digital skills before the assessment. They plan to work with the Angling Trust to recruit such users.

The team had researched with users of the current face-to-face, non-digital Post Office-based service, and found they often prefer this route for reasons such as habit, trust and wanting to support their local Post Office. The team could not identify a user need that directly related to a face-to-face service, beyond potential assisted digital needs or a need to pay with cash.

The team had not explicitly sought to research and test with users with disabilities or impairments, but had plans to do so and understood that learning from these users cannot be left until later in the development of the service.

The team said that tackle shops could be a good place for users to get support with the online service, but did not consider these outlets a sustainable source of support that could be incorporated into the end-to-end service.

Running the service team [point 3]

The team have a clear vision and are proud of the emerging service. They are already seeing the benefits of working in an agile way. The key roles and responsibilities are well resourced and there is strong organisational support for the service. This is reflected in the success and good work demonstrated through the alpha.

Designing and testing the service [points 4, 5, 11, 12, 13, 18]

The prototype has been iterated and tested continuously. The team have also been developing and testing options to support the back end of the service, as they recognise the importance of whole end-to-end service design. Where users have stumbled in testing, the team have done well to address these problems, including appropriately adapting existing GOV.UK patterns where they didn’t quite meet the needs of their specific users. The team are looking to reduce the number of points in the service that might cause users with low digital skills or enthusiasm to give up, such as the current requirement to provide an email address for renewal reminders, with no option to be texted or written to instead.

Technology, security and resilience [points 6, 7, 8, 9]

The team has evaluated tools and systems for building the service by creating several prototypes of the frontend and backend components. They are confident that some of the technologies they prototyped will be able to meet the service’s needs for the beta and don’t plan to introduce radically different tools and frameworks during the beta.

The team has not yet made the service’s source code from the alpha available but plans to release it during the private beta.

The team have carried out an impact assessment and have engaged the corporate security and data protection teams during the alpha. They have identified denial of service attacks and copycat sites as risks associated with the service. They have carried out quantitative analysis to ensure that the service is optimised in search engine results and plan to continue with this during the beta.

The team has liaised with the GOV.UK Pay and Notify teams in order to meet the service’s needs to take payments and to send confirmations to users.
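
For illustration only, a minimal sketch of how confirmations might be sent through GOV.UK Notify’s Python client (notifications-python-client), assuming hypothetical template IDs and personalisation fields rather than the service’s actual configuration:

    from notifications_python_client.notifications import NotificationsAPIClient

    # API keys are issued through the Notify dashboard
    notify = NotificationsAPIClient(api_key="<notify-api-key>")

    # Email confirmation of a licence purchase; the template ID and
    # personalisation fields are hypothetical placeholders
    notify.send_email_notification(
        email_address="angler@example.com",
        template_id="<licence-confirmation-template-id>",
        personalisation={
            "licence_number": "B7X1234",
            "valid_from": "1 April 2017",
        },
    )

    # Notify also supports text messages and letters, which is relevant to
    # the recommendation later in this report to offer alternatives to email
    notify.send_sms_notification(
        phone_number="+447700900000",  # Ofcom-reserved example number
        template_id="<renewal-reminder-template-id>",
        personalisation={"expiry_date": "31 March 2018"},
    )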

Improving take-up and reporting performance [points 14, 15, 16, 17]

The team demonstrated a good understanding of the performance measures they will develop to ensure that data is used to continuously improve and iterate the service. This includes potentially creating a number of additional KPIs to support the wider vision and objectives, such as increasing the number of users.

The team found no legal reason to retain the paper components of the service, and were looking to mitigate the fraud and enforcement concerns that come with going fully digital. They are talking to DVLA about paperless options and their ambition is to go fully digital in future.

The team had various approaches planned to further increase digital take-up, such as awareness-raising campaigns and improving the online service in line with resistant users’ concerns.

Recommendations

Researching and understanding user needs [points 1 and 2]

  • The panel strongly recommends further consideration is given to options for the beta phase, because a private beta, with real users and data, will be extremely valuable. Without this option the team are likely to find it difficult to pass a beta assessment.

  • User research will need to be far more extensive during beta, and the panel recommends finding users who couldn’t and/or don’t want to use the online service to better understand their needs, including their support needs. Understanding why some users do not want to use the online service, as well as what support others will require to use it, is vital to informing its end-to-end development.

  • The team must broaden their research to understand users’ full journeys. This means before and after they go online, and through any non-digital sections of the service.

  • The team should seek to recruit users with disabilities and impairments for both their research into and testing of their online prototype and any assisted digital support.

  • The team should look for users and potential users with low digital skills, confidence or access in places where they are likely to go. This does not necessarily mean looking only in places connected with angling.

Technology, security and resilience [points 6, 7, 8, 9]

  • For development during beta, the panel recommends ensuring that the emerging development work on the back-end systems is owned by the service team.

  • The panel recommends that the team develops any new code in the open, as retrospectively opening up a codebase is much more difficult than doing it in the open from the start. The panel suggests that the team get in touch with the Register to Vote and GOV.UK Verify teams to share experiences of opening up a closed codebase. As the team has not identified good reasons why the source code cannot be open sourced, the panel would expect the vast majority of the code to be open sourced in time for the beta assessment, in order to pass point 8.

  • If the team decides to use closed CRM platforms to deliver the service, they will need to demonstrate that it is possible to export data out of those platforms in open data formats, in order to pass the beta assessment (see the sketch after this list).

  • The proposed technical architecture for the service contains a large number of subcomponents. The team should take care to ensure that each integration genuinely supports the user needs for the service MVP, as opposed to future user needs, in order to reduce technical delivery risks.

  • The alpha prototypes included confirmation screens that replayed the user’s personal data (such as their address and date of birth), accessible with only a rod licence number and email address. The panel recommends testing whether removing the replay of this data would negatively affect completion rates, in order to avoid introducing unnecessary identity theft risks.
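
As a minimal sketch of the kind of open-format export the panel has in mind, assuming licence records have already been fetched from a closed CRM platform (the field names and example values here are illustrative, not any real platform’s schema):

    import csv
    import json

    # Illustrative records, standing in for data fetched from a closed CRM
    records = [
        {"licence_number": "B7X1234", "licence_type": "coarse", "valid_to": "2018-03-31"},
        {"licence_number": "C2Y5678", "licence_type": "trout", "valid_to": "2018-03-31"},
    ]

    # CSV: an open tabular format readable without the CRM vendor's tools
    with open("licences.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["licence_number", "licence_type", "valid_to"])
        writer.writeheader()
        writer.writerows(records)

    # JSON: an open structured format suitable for machine interchange
    with open("licences.json", "w") as f:
        json.dump(records, f, indent=2)

The point of such a demonstration is that the exported data remains fully usable outside the closed platform.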

Designing and testing the service [points 4, 5, 11, 12, 13, 14, 18]

  • Continue working with the cross-government design community on the design pattern for times, and carry out more extensive testing to ensure user needs and accessibility standards are met, particularly for any new components or patterns the team create.

  • The team acknowledged the need to continue refining the content throughout the online service; for example, the use of the first person throughout the service may be confusing to the subset of users buying a licence for someone else. Such issues should be addressed before a beta assessment.

  • The panel questioned the need for the first of the two summary screens (halfway through the transaction); the team should test the impact of removing it, to avoid unnecessary duplication and steps within the service.

  • The team should look further into third parties such as tackle shops as a viable source of face-to-face assisted digital support for users who need it. This may include finding a way to pay those venues for the part of the service they would be providing. The team should also consider using the Digital Training & Support Framework to procure assisted digital support, if required.

  • Look into alternatives to email for reminders, to make the service more usable for those less likely to have a regularly checked email address.

  • Continue to work with the GDS teams, in particular Pay, Notify and Verify, as there are potential benefits.

  • The team must ensure that plans to boost digital take-up are measurable, can be tested in beta, and are directly linked to their research into users’ barriers. The team should be actively testing all approaches to maximising take-up, including switching off paper options.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Not Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 10 February 2017