Apply for a large countryside productivity grant alpha reassessment

The report for the Apply for a large countryside productivity grant alpha reassessment on 2 February 2021

Service Standard assessment report

Apply for a large countryside productivity grant

From: Central Digital & Data Office (CDDO)
Assessment date: 02/02/2021
Stage: Alpha reassessment
Result: Met
Service provider: Department for Environment, Food and Rural Affairs

Previous assessment reports

Service description

Farmers and rural businesses can apply for grants for projects that improve irrigation or manage slurry. They can get up to 40% of the project costs and receive grants from £35,000 to £1 million. The application process is long and complex. This digital service allows farmers to:

  • Check if they and their project are eligible for the transformation fund
  • Check how well the project fits the funding priorities

The expression of interest (EOI) allows users to outline the benefits of their project. EOIs are assessed against the policy’s priorities.

Service users

  • Farmers/land managers
  • Agents acting on behalf of farmers/land managers
  • Internal back office staff

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has really engaged with the feedback from the previous assessment - working towards a strong set of revised research goals and covering a lot of ground in the last 3 months
  • the team has been working well with the RPA and its assessors to understand the assessment process and make improvements to the service
  • the team has done further research including more contextual interviews to build a richer understanding of different types of farmers and agents. For example, it was good to see the team carrying out telephone interviews with farmers with poor connectivity and ensuring their experience was captured

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to do regular research with farmers, agents and farm managers, in particular with harder-to-reach, smaller-scale and less engaged farmers, to better understand their needs and work out how to meet them. As the team has begun to identify a route for farmers with poor internet access and low digital skills, this feels particularly important
  • build on the groundwork the team has done to ensure they do research with users with a range of access needs
  • explore further the experience of users who are ruled ineligible during the service, and look into where the best place is to communicate this information
  • consider revisiting and simplifying some of the ‘high level user needs’ and making them less oriented towards the process requirements of the scheme

2. Solve a whole problem for users

Decision

The service did not meet point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has made it easier for farmers to find out if their project is eligible for the grant, and farmers no longer need to wait for an evaluation of their expression of interest, allowing for instant feedback
  • the team has worked closely with policy and the RPA to better understand the policy area and the potential constraints
  • the team has considered and mapped the pre-preparation and awareness stages of the service (before users reach the Eligibility section)

What the team needs to explore

Before their next assessment, the team needs to:

  • explore the possibility of an eligibility checker for all grants, so that farmers can identify which grants they’re able to apply for on a wholesale rather than individual grant basis
  • consider the application process (post eligibility checker) as part of the whole service journey, so that the entire end-to-end process is easier for farmers and easier to administer operationally and join up in the future; as part of this, consider ways the service could help applicants to make stronger applications
  • evaluate the medium-high scoring approach and how that helps users with their application and levels of confidence. Explore other options, and allow users to get support at that point of the journey (understanding and interpreting the feedback)
  • the current end-to-end journey ends when users get a score for their expression of interest; however, it doesn’t enable the user to apply for the grant. As a result, the panel found it challenging to assess against standard point 2 (‘Solve a whole problem for users’), as the current journey doesn’t meet the high-level need of submitting a successful application
  • currently the service and its management are disjointed, with responsibility split between different parts of Defra and the RPA. To progress to public beta, the service must resolve this confusion and clearly demonstrate a user-centred, rather than tech- or organisation-centred, approach

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is aware of the importance of non-digital routes for users with access needs, hard-to-reach farmers and users with poor connectivity, and has identified some options to develop and test

What the team needs to explore

Before their next assessment, the team needs to:

  • test how existing assisted digital RPA channels meet the needs of users who can’t complete the EOI journey online, including offline channels, to ensure that users can still get a consistent experience. Explore whether any iterations to those routes are needed

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has considered alternative channels and ways to support users with lower digital skills, varying internet connectivity, confidence levels and access needs, such as using the existing RPA helpline to support farmers with access needs to make their full application over the phone, or allowing users to receive printed feedback on their EOI through the post
  • the team has developed a better understanding of the different user groups and their needs

What the team needs to explore

Before their next assessment, the team needs to:

  • conduct user research and testing with people with access needs, including users of assistive technology
  • review whether it’s necessary to ask users for personal data at the end of the journey (as this is not a requirement to begin the application process)
  • make sure the service is accessible during private beta, and carry out an accessibility audit and publish an accessibility statement before opening to a public beta
  • test with participants on different devices to understand the usability of the service on mobile and tablet in particular

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • a multidisciplinary team had been deployed for the alpha extension and the beta phase
  • the team included a product owner, content designer, user researcher and service designer
  • plans were underway to ensure the longer-term stability of the team throughout the beta phase

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that there is continuity in the team structure with a clear roadmap of what the team will be delivering at beta and live
  • ensure there is a knowledge transfer mechanism between the delivery partner and permanent Defra and RPA staff
  • continue to deliver show and tells across the wider Future Farming programme to build capability within the programme and encourage joined-up delivery

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team implemented the previous report’s recommendation to hold design workshops with farmers, agents and RPA application reviewers
  • the RPA workshop enabled the team to remove two time-consuming steps from the journey
  • the team demonstrated iterations based on usability testing
  • the team identified pain points in the previous prototype and designed a new end to end solution to remove these pain points
  • the content of the service and the design patterns had been tested and iterated in various rounds
  • the team explored automating some of the EOI reviewing processes
  • the team has developed a scoring method which they are planning to test and iterate further in private beta

What the team needs to explore

Before their next assessment, the team needs to:

  • test further the eligibility checker entry point (the guidance page). The team has changed from linking to it with a green start button to a link at the bottom of the page, because people tend not to read the guidance and go straight to the green button. The team should consider how the content would work within the rest of the service rather than hiding the link at the bottom, as users might miss the link and it is the main action for them. Because eligibility checking is now automated rather than manual, there is no wasted operational effort if users whose projects are ineligible for the grant go through the eligibility checker
  • test further the scoring results page: this page uses a similar style to the confirmation page, but it’s not the end of the service. It also includes a “check your answers” section. Users may not reach the ‘Continue’ button if they believe their eligibility outcome, e.g. ‘medium fit’, is the final outcome of the service. Users might drop off after this page if they don’t reach the bottom of the page and click continue, missing out on the opportunity to receive an email with the full application form
  • explore making the “check your answers” section a separate page in the journey. Consider placing it before the scoring page, and think about how to collect any user data in a more joined-up way (if collecting user data such as an email address is required for users to receive the link to the application form, it could be embedded in the main flow)
  • consider making the reference number and next steps page the confirmation page, as it is the end of the service and should follow the GOV.UK Design System patterns
  • collect feedback on the scoring results and be transparent about government priorities
  • explore ways for users to find out the government’s priorities and criteria for selecting projects earlier in the user journey

  • iterate validation errors to be in line with the Design System. Entering a comma in the ‘what is the estimated total project cost’ field results in a validation error. The service should (see the sketch after this list):
    • use error and validation components from the GOV.UK Design System
    • work on ways for multiple formats to be validated, such as numbers with and without a comma
    • show an example of a correct entry (see ‘date input’ in the Design System)
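
To illustrate the second sub-point, below is a minimal sketch of how the cost field could accept entries with or without commas (and an optional leading £) before validating them. This is not the team’s implementation; the field name, function name and error wording are assumptions for illustration only.

```typescript
// Illustrative sketch only: normalise an 'estimated total project cost' entry
// so that values with or without commas (and an optional leading £) are all
// accepted, instead of rejecting '87,500' with a validation error.
// Field name, function name and error wording are assumed, not the service's.

interface ValidationResult {
  valid: boolean;
  value?: number;        // normalised numeric value when valid
  errorMessage?: string; // GOV.UK-style error message when invalid
}

export function validateProjectCost(input: string): ValidationResult {
  // Strip whitespace, a leading £ sign and thousands separators before parsing,
  // so '£87,500', '87,500' and '87500' are treated the same.
  const normalised = input.trim().replace(/^£/, "").replace(/,/g, "");

  if (normalised === "") {
    return { valid: false, errorMessage: "Enter the estimated total project cost" };
  }

  // Whole pounds only in this sketch; adjust if pence are allowed.
  if (!/^\d+$/.test(normalised)) {
    return {
      valid: false,
      errorMessage: "Estimated total project cost must be a number, like 87,500",
    };
  }

  return { valid: true, value: Number(normalised) };
}

// Example usage: both forms pass validation once normalised.
console.log(validateProjectCost("£87,500")); // { valid: true, value: 87500 }
console.log(validateProjectCost("87500"));   // { valid: true, value: 87500 }
```

Whatever the exact approach, any error messages should still be presented with the GOV.UK Design System error message and error summary components, as recommended above.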

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using the right kind of protocols
  • the team is adhering to GDPR guidelines and there seem to be no concerns around that
  • the online service is the first stage of a two-stage application process
  • the fraud check is done at the second stage, when the full application is submitted along with all the supporting documents. Processes are in place to refer concerns to the Fraud Risk Management team, who would decide whether to refer them on to the Defra Investigation Service (DIS). DIS has the investigators and legal powers to investigate suspected fraud and would submit a Suspicious Activity Report (SAR)

Published 23 August 2022