Check local environmental data for agriculture - Alpha Assessment

The report from the alpha assessment of the Environment Agency's Check local environmental data for agriculture service on 7 July 2016.

Stage: Alpha
Result: Not Met
Service provider: Environment Agency

The service did not meet the Standard because:

  • The panel was concerned that the team’s prior experience of working with farmers meant they had missed opportunities to explore different service models and had not tested some assumptions.

  • The panel did not see any evidence of the team observing how users used the service, and was concerned that this was not part of the team’s future plans.

  • The panel did not feel confident that the current approach was sufficiently informed by the needs of those with low digital skills or accessibility issues, or those who live in places with limited connectivity.

  • Although the six-week alpha conducted by the team had necessarily been tightly scoped, the panel was not satisfied that the product had been tested sufficiently to validate the extent to which the service required users to interact with the mapping software.

  • The team did not have dedicated content and interaction designers, with the result that it was not clear whether the approach presented in the alpha would allow users to successfully meet their needs unaided, first time.

About the service

Service Manager: Rob Jones

Digital Leader: John Seglias

The service allows those involved in managing farms to easily access information about the environmental priorities for their farm, so that they can meet their legal responsibilities. This piece of transformation enables the Environment Agency (EA) to move towards retiring an existing platform.

Detail of the assessment

Lead Assessor: Ben Welby

User research

The team has a challenging task to reinvent the environmental advice service to encourage environmentally friendly farming. Taking small steps and starting with a few core areas for technical discovery, the team needs to do more discovery user research work in order to understand overarching needs, behaviours and attitudes in relation to environmental advice. This approach will help them create a scalable, robust and user-friendly service.

The team has a thorough understanding of the farming community, due to long-term working relationships. It was positive to see the empathy the team has towards the farmers and the depth of their knowledge of agricultural practices. However, in order to build good services, the team needs to shape that knowledge into usable insights that will help them make decisions on functionality, design and content.

The team has made a positive start by mapping out user groups within the “circle of confidence” and showing an understanding of who influences farmers. It was particularly good to hear the team recognise that different groups of users have different needs:

“People in EA want to tell them everything, and precisely and scientifically technically correct but at the other end farmers just want to be told what to do.”

In order to help the team make decisions, it is very important to research the attitudes, behaviours and needs of the identified groups and to map those out in the form of behavioural personas and user journeys.

The panel was impressed by the research the team had conducted within such a short timescale and with limited resources. The team has conducted a series of workshops with farmers, land agents, contractors and advisory groups. It has reached out to users through questionnaires and phone interviews, conducted one-to-one interviews during workshop breakout sessions, and run remote usability sessions with email feedback.

However, while those are valid insight-gathering practices, the methodology should look at what people actually do rather than what they say they do, in the context of their usage. For example, the team identified that users were asking for the functionality to print reports, but it would be useful to understand why users are asking for this and how they use it. It could be that users highlight parts of the report or use it as evidence; this knowledge might help the team with the design of the report and suggest what digital functionality is needed.

It cannot be overstated how important it is to observe users using the product rather than asking them to report on their experiences. The team should make every effort to observe all user groups performing tasks relevant to their needs, on devices those groups are likely to use. The absence of testing the prototype in this manner meant it was impossible for the panel to establish whether the user journey presented in the alpha was a minimum viable approach.

Recruitment is a big challenge for the team, due to the location and schedules of farmers, so it was positive that the team has identified opportunities to meet farmers at events and to accompany inspectors on visits; all of those opportunities should be used to interview and test. However, it is essential for the team to seek opportunities to observe users in their work context, for example in the offices of agronomists and other advisory groups.

Finally, the team needs to make more effort to test the service with less engaged users: those with low digital skills, people with accessibility needs, and those who may have limited internet access. The EA team should gain insight from teams in other parts of their organisation, and in other departments, regarding digital skills, behaviours, attitudes, assisted digital and accessibility needs.

Team

The panel was impressed by the expert knowledge of the team and their clear passion for farmers and the natural environment. However, we were concerned that their funding model meant the team had to start before the full complement of roles was in place, and that certain critical roles were not filled during this alpha.

The team recognises this problem and is seeking to address it by ensuring that a full team, including a full-time user researcher, service designer and content designer, is in place from the outset of the beta. This must be the case for the service to be successful at the beta assessment.

The panel was pleased to see that the team had found a good blend of co-location and remote working. We would encourage them to ensure that this approach is maintained when the new team is put in place.

Whilst the working relationship with ESRI is clearly very strong and the service benefits from their expertise in web mapping, the constraints on the team mean the alpha has so far been led by the supplier’s technology rather than by testing the team’s assumptions about the design of the service. There is a lot of value in maintaining continuity of ESRI team members from alpha into beta, but the team needs to be wary of being led by what their supplier’s product can do.

The panel was pleased to hear that the team had taken a pragmatic approach to some of the wider questions being asked within the Department for Environment, Food & Rural Affairs (Defra) about its future approach to service delivery. We would of course like to see that uncertainty removed, but were encouraged that the service owner is empowered to take decisions about how to proceed, and that those decisions allow for flexibility in delivering the service on the basis of whatever Defra concludes.

Technology

The discovery identified an extensive backlog of work for the team to complete, so the panel was surprised to hear that the team anticipated only a further 5 sprints of development work during the beta. Whilst the ease of implementing out-of-the-box functionality is attractive, the team must ensure that their approach is informed by their user research.

The service will not create another ‘version of the truth’ to be maintained, but will access data as needed from the relevant canonical source. This means the team is storing the minimum amount of data in the service and is in a position to be flexible about what they consume and display.

This approach means that the reliability of the service will be dependent on those data sources. The service team should consider what this means for availability and scalability of the service. Although they are confident that the service could be easily scaled to cope with demand, is this also true of the external data sources?
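
To make that dependency explicit, the team could wrap each call to an upstream source with a timeout and a graceful fallback, so that a slow or unavailable canonical source degrades the data rather than bringing down the whole service. The sketch below is a minimal illustration only: the endpoint, the response shape and the fetchZoneData function are assumptions, not part of the team’s actual architecture.

    // Illustrative sketch: fetch data from an upstream canonical source
    // with a timeout, and degrade gracefully if it is unavailable.
    // The URL and response shape are hypothetical.
    interface ZoneData {
      zoneId: string;
      designations: string[];
    }

    async function fetchZoneData(gridRef: string): Promise<ZoneData | null> {
      const controller = new AbortController();
      const timer = setTimeout(() => controller.abort(), 5000); // 5 second budget
      try {
        const url = `https://canonical-source.example/zones?gridRef=${encodeURIComponent(gridRef)}`;
        const res = await fetch(url, { signal: controller.signal });
        if (!res.ok) return null; // upstream error: show a 'data unavailable' message
        return (await res.json()) as ZoneData;
      } catch {
        return null; // timed out or unreachable: the service stays up, the data degrades
      } finally {
        clearTimeout(timer);
      }
    }

A pattern like this also gives the team a natural place to record upstream failures, which would feed directly into the availability and scalability questions above.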

The team has prototyped the service using a Software as a Service (SaaS) layer for geographic information and data access, with a frontend application deployed to a Platform as a Service (PaaS) provider. They intend to continue with this architecture into beta.

The team is able to develop their frontend application continuously and to deploy quickly. However, they have some reservations about having their supplier sitting between them and the PaaS provider, and are investigating a different approach to help speed up development.

The team has focused on maximising portability and flexibility, which is good. They are working with their supplier to prioritise configuration of the SaaS product over the writing of code, and have therefore used as much out-of-the-box functionality as possible rather than large amounts of bespoke development.

The team is drawing on the technical experiences of other teams within Defra.

The frontend application is based on existing open source code published on GitHub. The team intends to contribute changes back to the original software; however, it is currently in a private repository, which the team is looking to open up in beta.

The team has an expectation of 24/7 access, but needs to understand what happens when things go wrong, and what the implications and risks are if the service were to go offline for 2 days.

Although their supplier is doing good work on accessibility and will continue to improve their mapping platform, there will always be limitations in using web mapping. Although the team had identified a ‘sense of place’ as being important to their users, there was not enough evidence to suggest that this technology-led approach would meet the needs of those with accessibility needs, low digital capability or poor internet access.

It was good to see the team challenging their supplier to make the elements of the page load progressively, but there are concerns about the performance of the service. The use of the mapping service requires JavaScript to be enabled. The team should consider how users will be able to use the service if JavaScript isn’t available (if, for example, the service is accessed from an older machine, or the script fails to load over a slow mobile or rural broadband connection). Given the high level of use expected for this service in a rural setting, the team must make performance issues a priority. This issue raises questions about the appropriateness of web mapping as the main interface by which this information is surfaced.
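
One common answer is progressive enhancement: serve the environmental report as plain, accessible HTML and treat the interactive map as an optional layer added only when the script loads successfully. The sketch below is illustrative only; the element ID and the initMap function are hypothetical stand-ins for whatever the mapping library actually provides.

    // Illustrative progressive enhancement: the report is served as
    // accessible HTML; the map is an optional enhancement on top.
    // 'initMap' is a hypothetical stand-in for the mapping library's initialiser.
    declare function initMap(container: HTMLElement): void;

    document.addEventListener('DOMContentLoaded', () => {
      const mapContainer = document.getElementById('farm-map'); // hypothetical ID
      if (!mapContainer) return; // no map placeholder: the HTML report still works

      try {
        initMap(mapContainer);       // enhance the page with the interactive map
        mapContainer.hidden = false; // reveal the map only once it has initialised
      } catch {
        // If the mapping script fails (older browser, slow rural connection),
        // users still have the full server-rendered report.
      }
    });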

The importance of mobile devices to this user group was unclear. Although the team stated that the office for a farmer is their mobile device, it was not clear whether the user need for this service was best met at a desk in the office or out and about on the farm. This lack of clarity meant that the team had not tested their assumptions about connectivity and, as a result, the technical requirements of the product they have chosen. The team must do more work to understand whether the service stands up to the needs of those in a field.

Design

Lack of design resource

The team has lacked an interaction designer, but has been able to use the various GOV.UK frontend code resources to make the prototype consistent with the rest of www.gov.uk. They will be looking to recruit a designer, which will become more crucial during beta development as the effectiveness of the whole service becomes more important.

Despite this, and by focusing on the big technical questions and the general look and feel rather than the detail, the team has managed to get the prototype to a point where users can complete journeys with some success.

Interactive mapping layer

The team has placed a lot of emphasis on the importance of a mapping layer that users can interact with, but the panel was concerned it might be something of a distraction. Users need to know the information that directly affects their property, and the map is a useful way of presenting this geographically, but the panel saw no evidence that interacting with it was a necessary step.

In addition, the mapping layer had not had any accessibility testing. This would not normally be a big concern at this stage, but given its centrality to the service and the extent of bespoke user interface elements, it needs to be tested by an accessibility specialist as soon as possible to judge the extent of the risk in adopting this technical approach to meet the need.

Most importantly, the team should use the sessions they have planned observing users using the service to determine how to iterate this feature. The team knows that maps help farmers engage with the information the service produces, but needs to know what problems farmers feel the maps solve. Given this, the team should be able to explain how the interactions they are looking to support let users do what they need to. Part of this should be testing flows that don’t rely on interacting with the map, for comparison.

Digital uptake

The team was able to clearly articulate their plans for driving users to the service.

At the moment the catalyst for uptake comes from industry, mainly through voluntary initiatives and by encouraging agronomists to introduce the service to their farmers. The EA is part of the voluntary initiative, but the team should consider that, at a national level, this will quickly become a communications job which they may need to resource.

Every 4 years the zones the information is based on are redrawn. When this happens, farmers are sent a letter which the EA believes will prompt them to go and seek out the information. That letter goes to 100,000 recipients, although Defra probably sends more than one letter per farm business. The EA knows that the letter drop is connected to a spike in calls, so it has proven its use. The challenge for the team is to ensure there is consistency between the content of the letters and the rest of the service. They should also be making connections with Defra to ensure consistency with the communications those farmers receive from them. This is something we would expect to see evidence of in their beta assessment.

The existing call centre will continue to support users requesting reports over the phone (with reports delivered in paper form by post). The team should ensure they collect enough data, of the right type, on usage of this channel to determine the number of users they can move onto the online channel and, where they can’t, the types of needs the remaining users have which this channel fulfils.

Content design

The team needs to hire a content designer to work closely with the user researcher on the discovery work recommended above. This will inform the scope, structure and wording of the service based on user needs. The content designer should be engaged with the content community both within EA and at GDS.

The team needs to consider the full user journey for the service and the entry point for users, including starting points within GOV.UK. Content of letters sent to farmers when Nitrate Vulnerable Zones (NVZs) are redrawn should also be considered as part of the service.

There is also work to do to establish the scope of the service and how it relates to other environmental needs for farmers. The team has voiced concerns about including too much within the service. As mentioned above, whether or not the scope of the service is extended should be determined by the user needs discovery work.

Service name

At the moment the title echoes the uncertainty about the scope of the service. ‘Environmental Risks for Farmers’ is much greater in scope than ‘Find your farm and see what environmental pressures apply to water in your area’. ‘Find your farm’ isn’t part of the user need - farmers know where their farm is.

Drawing tool

Finding a farm is an action within the service rather than a user need, and may not be the most appropriate way of presenting the data. Based on previous experience with the Rural Payments Agency (RPA), and given that it isn’t possible to select non-contiguous areas of land, there is likely to be a high failure rate for farmers attempting to draw the area of their land accurately. Testing will help to establish success rates for drawing farm areas, and the content designer should iterate based on observed use: clearer instructions may help completion of the task.

No results

Users whose results show that no action is required must be told clearly that they have no responsibilities. At the moment this isn’t clear.

Analytics

The team had identified how they would measure their 4 KPIs, but had also started to think about the other performance indicators they would measure, in particular the impact on their other channels and whether or not farmers are compliant with their legal responsibilities.

The team is aware that there are limitations on the way in which they can measure the use of the mapping element of the service. They’re using what ESRI provides out of the box, but they need to be able to measure whether the ability to interact with the map is adding value.

A lot of the focus in this report is on the ease of use of the bounding box, because it’s not immediately obvious how to engage with it and it’s not good for accessibility. The team must investigate ways of measuring how much it is used, so that they can make the right decisions about whether to persist with that technical approach.
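
As an illustration of what that measurement could look like, each kind of map interaction could be recorded as a custom analytics event, so the team can compare how often users pan, zoom or draw the bounding box against how often they complete the journey without touching the map at all. The event names and the sendEvent function below are assumptions, not a real analytics API.

    // Illustrative instrumentation: record each kind of map interaction as a
    // custom analytics event. 'sendEvent' is a stand-in for whatever analytics
    // client the team adopts; the category and action names are assumptions.
    type MapInteraction = 'pan' | 'zoom' | 'draw-boundary' | 'select-feature';

    function sendEvent(category: string, action: string): void {
      console.log(`analytics event: ${category} / ${action}`); // forward to the real tool here
    }

    function trackMapInteraction(action: MapInteraction): void {
      sendEvent('map', action);
    }

    // Wire these into the mapping library's event hooks (names vary by product):
    // map.on('drag', () => trackMapInteraction('pan'));
    // map.on('zoom', () => trackMapInteraction('zoom'));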

The team is confident of being able to collect lots of feedback, particularly from industry, but the panel would highlight that the team needs to understand how people actually use the service, rather than what people tell them about how they use it, and implement analytics accordingly.

The panel is pleased to see the involvement of the EA’s call centre. The team should work closely with the National Customer Contact Centre (NCCC) to ensure that it is possible to measure the impact of the service on those offline channels, and to reflect feedback from callers in the design of the service.

Recommendations

To pass the re-assessment the service team must:

  • Provide evidence that they have conducted research on the prototype in the homes and places of work of their users. This will enable the team to test whether or not the current flow (with its focus on interactive mapping) easily provides farmers with information about their farm’s environmental priorities so that they can meet their legal responsibilities.

  • Demonstrate that they have a viable approach to meeting the needs of those with low digital skills, accessibility needs and poor connectivity.

To pass the next assessment the service team must:

  • Focus their user research methodology on what their users actually do, not what they say they do. Whilst we appreciate the constraints of gaining access to farmers and their management teams, the team must undertake more user research directly in users’ places of work.

  • Ensure that the team is fully staffed with the necessary multi-disciplinary skills and, in particular, with full time content and service designers.

  • Avoid the design and implementation of the service being led by their technical supplier.

  • Identify a way to measure whether or not the service is fundamentally making it easier for a farmer to understand their responsibilities in relation to the land, and how the environment is benefitting as a result.

The service team should also:

  • Continue to work with the Rural Payments Agency, the Forestry Commission, Natural England and any other relevant bodies, with a view to removing the need for farmers to navigate several different organisations. The team may find it helpful to engage with the One Government at the Border team, who are exploring how to minimise the need to switch between different organisations.

  • Work closely with colleagues in the Environment Agency’s call centres to understand what queries around local environment data drive contacts through non-digital channels.

Digital Service Standard points

Point | Description | Result
1 | Understanding user needs | Met
2 | Improving the service based on user research and usability testing | Not Met
3 | Having a sustainable, multidisciplinary team in place | Not Met
4 | Building using agile, iterative and user-centred methods | Met
5 | Iterating and improving the service on a frequent basis | Met
6 | Evaluating tools, systems, and ways of procuring them | Met
7 | Managing data, security level, legal responsibilities, privacy issues and risks | Met
8 | Making code available as open source | Met
9 | Using open standards and common government platforms | Met
10 | Testing the end-to-end service, and browser and device testing | Met
11 | Planning for the service being taken temporarily offline | Met
12 | Creating a simple and intuitive service | Not Met
13 | Ensuring consistency with the design and style of GOV.UK | Met
14 | Encouraging digital take-up | Met
15 | Using analytics tools to collect and act on performance data | Not Met
16 | Defining KPIs and establishing performance benchmarks | Met
17 | Reporting performance data on the Performance Platform | Met
18 | Testing the service with the minister responsible for it | Met

Published 12 April 2017