Apply for a large countryside productivity grant alpha assessment

The report for the Apply for a large countryside productivity grant alpha assessment on 28 October 2020.

From: Central Digital & Data Office (CDDO)
Assessment date: 28/10/2020
Stage: Alpha
Result: Not met
Service provider: Department for Environment, Food and Rural Affairs

Service description

Farmers and rural businesses can apply for grants for projects that improve irrigation or manage slurry. They can get up to 40% of the project costs and receive grants from £35,000 to £1 million. The application process is long and complex. This service allows farmers to:

  • check if they are eligible to apply
  • create and send an expression of interest

The expression of interest (EOI) allows users to outline the benefits of their project. EOIs are assessed and compared with other projects. Those invited to make a full application can then be confident that it is worth the time and effort.

Overview of service users

  • Farmers/land managers
  • Agents acting on behalf of farmers/land managers
  • Internal back office staff

1. Understand users and their needs

User research assessor

Decision

The service did not meet point 1 of the Standard.

What the team has done well

The panel was impressed by:

  • the team had a good process in place for testing their prototype, and is clearly involved in observing research sessions and taking part in analysis using Miro
  • user research is clearly influencing iteration of the design (such as the example shown around information architecture).

What the team needs to explore

The changes in people within the team have clearly had an impact on the amount of research the team has been able to do, and the team was aware that it had not yet reached the saturation point where the same findings come up repeatedly. Overall, the amount of research done for both discovery and alpha felt light.

Before their next assessment, the team needs to:

  • do more discovery research to expand the user needs, and build a deeper understanding of the different types of farmers and agents in particular
  • do further research with agents to better understand how they manage applications for farmers and get them the funding they are seeking. For example, do they apply to multiple schemes for one project (that is, applying for both small grants and large grants to fund the same project)?
  • research with Rural Payments Agency (RPA) staff to better understand how the assessment process will work, and how the EOI speeds up and helps the assessing and decision-making side of the process. Are there further changes to the EOI form or process that would really help them?
  • further to the research gaps already identified around digital inclusion, assisted digital and accessibility needs, also consider what devices (for example, mobile or tablet) people use to find out about and complete their EOIs, and make sure testing covers these devices
  • consider supplementing the existing recruitment process with other means (for example, a recruitment agency) to find farmers, particularly prospective applicants, who know less about the process of getting grants and are potentially less engaged with Defra or the RPA.

2. Solve a whole problem for users

Lead assessor, particularly with design and research input

Decision

The service did not meet point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • consideration had been given to the end-to-end journey and, while waiting for the strategic solution, there was a potential immediate opportunity to improve the outcome for users through the expression of interest and eligibility checker
  • although the team is working within a particularly narrow scope, which is far from ideal but which the panel can understand given immovable constraints, service team members have worked across boundaries to share and gain knowledge of other aspects of the wider programme, and there is now a central team in place doing more work on the end-to-end journey across grants.

What the team needs to explore

Before their next assessment, the team needs to:

  • build on the recommendations under point 1 to demonstrate that this tactical service is enabling all farmers intended to benefit from large grants to reach that goal with greater ease. It is worth better understanding users’ context and what it means for the service. The support some farmers get from family members was pointed to, which is good, but the context on the agent side could be more developed: why agents are used, and what the impact of this would be on consideration and EOIs
  • develop a good understanding of all the touchpoints across the public sector (and with other actors) that applying for a large grant could generate, and whether aspects of this service could improve those other touchpoints and vice versa. For example, the same information might be being reused elsewhere, or could be reused here
  • refocus on users most likely to struggle with the service, as this will help build a service that works for everyone. Not having researched enough with users with assisted digital and accessibility needs (which could be high) means opportunities to build the service from the ground up with these needs in mind could be missed, and this could cause issues for many actual users.

3. Provide a joined-up experience across all channels

Lead assessor, particularly with design and research input

Decision

The service did not meet point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had developed an outline of the end-to-end process that this tactical service fits within
  • guidance around the schemes has been rethought and prototyped within GOV.UK.

What the team needs to explore

Before their next assessment, the team needs to:

  • test the overall experience of moving between channels (that is, from learning about the scheme, through submitting an EOI and receiving feedback, to completing the actual application), even if the primary focus of the service is a smaller part of that journey
  • develop a deeper understanding of users likely to struggle with the digital channel to ensure they can still get a consistent experience
  • investigate ways for the whole process to feel designed as one service, such as reviewing and rewriting the final application form and correspondence templates
  • include back office staff in the design process, making sure that the information they need is captured in the EOI and application process.

4. Make the service simple to use

Design assessor

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • high quality prototypes had been made for 2 of the grant journeys, following the GOV.UK Design System
  • iterations had been made based on user research
  • content pages had been included to explore the split between guidance and service
  • a language guide has been created to keep terms consistent throughout the service.

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to iterate the 2 journeys that have been created based on thorough user research with a wide range of participants
  • design and iterate the remaining 4 journeys
  • include the policy team in the research and design of the question protocols, changing policy where the user need is unclear, where policy creates pain points (such as the need for planning permission before applying for the grant), or where the information farmers need to supply makes the service hard to use
  • continue to research with a wide range of agents to make sure the service works for them
  • get regular feedback from back office staff to make sure the information being captured is useful in making decisions about grants, and iterate the question protocols
  • explore, design and iterate solutions for save and return
  • work with GOV.UK to design the content around the service, including exploring the use of step by step navigation
  • continue to push back against PDF handbooks for the grant process
  • design and iterate the communications with farmers during the end-to-end grant process (a sketch of one option follows this list)
  • incorporate other steps of the service such as the communications and awareness of the availability of grants
  • iterate the service and question protocol based on analytics in the private beta.
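
GOV.UK Notify is already incorporated into the service (see point 13), and templated notifications are one way to keep end-to-end communications consistent and quick to iterate. The sketch below is illustrative only: the template ID, personalisation fields and reference shown are hypothetical, not taken from the service.

```typescript
// A minimal sketch of sending a templated EOI acknowledgement via GOV.UK Notify.
// Assumes the notifications-node-client package and a NOTIFY_API_KEY variable.
import { NotifyClient } from 'notifications-node-client';

const notifyClient = new NotifyClient(process.env.NOTIFY_API_KEY as string);

async function sendEoiAcknowledgement(emailAddress: string, eoiReference: string) {
  // sendEmail(templateId, emailAddress, options) renders the email template
  // held in Notify, substituting the personalisation values supplied here.
  await notifyClient.sendEmail(
    'eoi-acknowledgement-template-id', // hypothetical template ID
    emailAddress,
    {
      personalisation: { reference_number: eoiReference },
      reference: `eoi-${eoiReference}`, // hypothetical tracking reference
    }
  );
}
```

Because template wording lives in Notify rather than in code, content designers can iterate the communications without a redeployment.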

5. Make sure everyone can use the service

Design assessor

Decision

The service did not meet point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team were using the GOV.UK Design System with no custom components, which gives the service a strong, well-tested accessibility baseline (a brief sketch of what this typically involves follows this list)
  • the language has been iterated where it was not understood in user research.
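
Using the Design System with no custom components typically means rendering its published markup and initialising the JavaScript that ships with it, rather than writing component code from scratch. A minimal sketch, assuming the govuk-frontend npm package is bundled into the service’s frontend build:

```typescript
// A minimal sketch of initialising GOV.UK Design System components.
// Assumes govuk-frontend is installed and its markup is already in the page.
import { initAll } from 'govuk-frontend';

// Wires up the behaviour of every Design System component found in the page
// (error summaries, character counts, conditional radios and so on). No custom
// components are defined, consistent with the approach described above.
initAll();
```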

What the team needs to explore

Before their next assessment, the team needs to:

  • understand how many of the users have assisted digital needs
  • develop a better understanding of why a majority of farmers use agents for grants such as these
  • start to create an assisted digital strategy for the service, or tie into cross-RPA and Defra assisted digital strategies
  • make sure the service is accessible during private beta, and carry out an accessibility audit and publish an accessibility statement before opening to a public beta (automated checks, as sketched after this list, can complement but not replace the audit)
  • monitor the use of agents in the service, incorporate this as a KPI, and try to reduce reliance on agents over time.
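
An accessibility audit is a manual, expert activity, but automated checks can guard against regressions between audits. A minimal sketch using Playwright with axe-core; the route under test and the local URL are hypothetical:

```typescript
// A minimal automated accessibility check using Playwright and axe-core.
// It complements, and does not replace, a manual accessibility audit.
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('eligibility checker start page has no detectable A/AA violations', async ({ page }) => {
  await page.goto('http://localhost:3000/eligibility'); // hypothetical local route
  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa']) // limit the scan to WCAG A and AA rules
    .analyze();
  expect(results.violations).toEqual([]);
});
```

Run in the CI pipeline, a check like this makes accessibility regressions fail the build rather than waiting to surface in the audit.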

6. Have a multidisciplinary team

Lead assessor

Decision

The service did not meet point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team included most of the key roles, and the team at the assessment was capable and passionate. While a wholesale change of the delivery team, caused by issues outside the service’s control, was imminent, some steps had already been taken to mitigate this, including the Product Manager from the new supplier having already started.

What the team needs to explore

Before their next assessment, the team needs to:

  • address a real concern: the currently agreed longevity of the new team taking this on. Efforts to smooth the transition have helped, but it is also clear that past delivery changes have inevitably been disruptive. The service needs longer-term stability, and consideration should be given to how that might be provided. The impact of past changes, and uncertainty about the long term, are what have made this a ‘not met’ for the moment, until it is clear sufficient stability can be achieved to allow work to proceed at pace through the coming year.

7. Use agile ways of working

Lead assessor

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team established agile practices and found ways to adapt to technology constraints.

What the team needs to explore

Before their next assessment, the team needs to:

  • implement the proposed changes to technology to further strengthen ways of working
  • consider how, given the risk of further changes to the team, artefacts, insight and the rationale for service decisions are curated, preserved and made easy to consume, and find opportunities to test this when onboarding later joiners.

8. Iterate and improve frequently

Lead assessor

Decision

The service did not meet point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated the ability to make well-considered iterations that improved the service based on insight from user research.

What the team needs to explore

Before their next assessment, the team needs to:

  • conduct more rounds of significant iteration. What the team covered in each iteration is good work, but still being surprised in usability testing and wider user research points to the need to do more; this is the cause of not meeting this point
  • continue to influence policy stakeholders. The team pointed to good examples of influencing policy asks of the service and of bringing policy colleagues into the team and its ceremonies. This should be deepened, perhaps to the point where insight from user research increasingly shapes policy itself, rather than just how the service responds to policy.

9. Create a secure service which protects users’ privacy

Tech assessor

Decision

The service did not meet point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the integration with the existing service will be developed in conjunction with the user research (UR) function
  • data classification has concluded that most data is OFFICIAL, and that there are no GDPR concerns at this stage
  • an approach that avoids the need for the user to reveal their identity has been developed as part of the generic eligibility criteria.

What the team needs to explore

Before their next assessment, the team needs to:

  • explore, as the service evolves, a more sophisticated approach to protecting against fraud
  • continue to think about downstream GDPR impacts on data
  • think about securing data during submission as the service evolves, as opposed to only post-submission.

10. Define what success looks like and publish performance data

Analytics or lead assessor

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had given due consideration to their core KPIs and how to measure them, including looking beyond the 4 required KPIs to how measurement enables service-specific goals.

What the team needs to explore

Before their next assessment, the team needs to:

  • consider impact mapping across other aspects of the end-to-end service beyond the immediate scope, to better understand how this particular service ultimately enables the intended outcomes
  • confirm the approach to using performance data to drive iterations of the service when it is in private beta
  • confirm the approach to reporting performance data to relevant stakeholders on an ongoing basis.

11. Choose the right tools and technology

Tech assessor

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the new build will conform with existing GDS and Defra standards
  • a flexible microservice will be used as part of the initial rapid transitional state for 2021
  • microservices, CI pipelines and DevSecOps elements are choreographed within the FFC platform itself
  • expertise is shared between the Defra EU Exit and Trade Programme teams, which, combined with agility in development, should smooth the transition into production.

What the team needs to explore

Before their next assessment, the team needs to:

  • be mindful of the impacts from external sources, particularly the Defra EU Exit teams, on requirements and overall direction.

12. Make new source code open

Tech assessor

Decision

The service met point 12 of the Standard.

What the team needs to explore

Before their next assessment, the team needs to:

  • clarify the results of the architecture technical investigations
  • maintain the cadence of publishing source code as appropriate to the delivery schedule.

13. Use and contribute to open standards, common components and patterns

Tech assessor

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the Becoming a Teacher service from DfE was reused to shape the service
  • existing external services from GOV.UK (Publisher, Front End and Notify) have been incorporated into the service to date
  • the team has used the GOV.UK prototype kit during alpha.

What the team needs to explore

Before their next assessment, the team needs to:

  • lessen the reliance on existing components: the FFC Platform, Defra O365 SharePoint Online and existing backend (legacy) platforms and processes
  • continue choosing open technologies unless there’s a good reason not to
  • use the GOV.UK Design System to build the Beta service
  • keep looking out for any government-built components that could be reused.

14. Operate a reliable service

Lead, design and technology assessors

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • consideration has been given to suitable ways of using native cloud availability in combination with different zone placements to improve reliability from a technical standpoint.

What the team needs to explore

Before their next assessment, the team needs to:

  • consider how offline processes can be incorporated into the wider service resilience approach in the next stage.

Published 8 April 2021