Grants Lifecycle Utility (GLU)
Service Standard assessment report for CO's Grants Lifecycle Utility (GLU)
From: GDS
Assessment date: 03/07/2025
Stage: Alpha
Result: Amber
Service provider: Cabinet Office, Government Grants Management Function (GGMF)
Service description
This service aims to solve the problem of managing grants from inception through to assessment and post-award monitoring, enabling greater efficiency, transparency and confidence throughout the grant administration process. Currently, grant administrators use multiple systems, data sources and office productivity software, resulting in a frustrating user experience.
Service users
This service is for:
- Government Grants Managed Service (GGMS) internal grant administrators
- Government Grants Management Function (GGMF) Civil Servants
- Civil Servants from other government departments (OGDs) collaborating with GGMF to design and administer schemes
- Civil Servants from OGDs who are not yet using the Grants Lifecycle Utility (GLU)
- Citizens or organisations applying for grants through GGMS supported channels
Things the service team has done well:
- the team utilised hypothesis-driven research and design, which fed into development
- the service blueprint and benefits mapping were comprehensive
- this service is more of a component, and the team have a firm understanding of where it sits within the wider grants lifecycle and the various user journeys that interface with it
- the team have made sensible use of available PwC and Microsoft accessibility software
1. Understand users and their needs
Decision
The service was rated amber for point 1 of the Standard.
During the assessment, we didn’t see evidence of:
- user research being undertaken with people with additional needs due to health or disability. Although the team have followed the appropriate standards and good practice in their work, they still need to demonstrate through user research that the service works for these users
- the support model being defined or tested with users. The team don’t have evidence that when people need help, they know how to get it in a way that meets their needs
- user research being undertaken with people who aren’t in the pilot group but administer grants across government (persona 4). The team estimate there to be around 5,000 potential users and the intention is to encourage them to use the service, but without research the team cannot evidence that they have built a service that works for everyone eligible to use it
- a mixed methods approach to user research. The team need to expand their research methods, for example by conducting in-person research and usability testing, to ensure they are making changes based on reliable data. Having a clickable prototype will support this
- high level user needs. Those shown were functional and did not reflect users’ motivations for using the service
Optional advice to help the service team continually improve the service
- the team should consider re-working their personas to remove the names and stock images. Until they were discussed, it was unclear whether the personas represented individuals or groups of users, and people could easily be misled by them.
2. Solve a whole problem for users
Decision
The service was rated green for point 2 of the Standard.
3. Provide a joined-up experience across all channels
Decision
The service was rated green for point 3 of the Standard.
4. Make the service simple to use
Decision
The service was rated amber for point 4 of the Standard.
During the assessment, we didn’t see evidence of:
- usability testing with secondary and potential users of the service to ensure it is simple to use for all users. While it is admirable that the team co-created the service with their key users, they have only shown their secondary and potential users the screens.
5. Make sure everyone can use the service
Decision
The service was rated amber for point 5 of the Standard.
During the assessment, we didn’t see evidence of:
- testing with actual users who have accessibility needs. The team has done well to use the tools available to them for accessibility, but it is no substitute for testing with actual users.
- an understanding of the assisted digital journey, explaining how users will receive support as the number of users of the service increases.
6. Have a multidisciplinary team
Decision
The service was rated green for point 6 of the Standard.
7. Use agile ways of working
Decision
The service was rated green for point 7 of the Standard.
8. Iterate and improve frequently
Decision
The service was rated green for point 8 of the Standard.
9. Create a secure service which protects users’ privacy
Decision
The service was rated green for point 9 of the Standard.
10. Define what success looks like and publish performance data
Decision
The service was rated green for point 10 of the Standard.
Optional advice to help the service team continually improve the service
- the team have a good grasp of their measurable metrics to help them track the performance of their service. The panel would recommend the development of a performance framework to help them articulate this clearly.
11. Choose the right tools and technology
Decision
The service was rated amber for point 11 of the Standard.
During the assessment, we didn’t see evidence of:
- consideration of different technologies and the decision-making process that led to Power Apps being chosen as the solution. Whilst the team provided evidence to support the choice of a no-code/low-code technological solution, it is difficult to conclude that the technology choice was the right one without understanding what other options were eliminated and why
12. Make new source code open
Decision
The service was rated green for point 12 of the Standard.
13. Use and contribute to open standards, common components and patterns
Decision
The service was rated green for point 13 of the Standard.
14. Operate a reliable service
Decision
The service was rated green for point 14 of the Standard.
Next Steps
This service can now move into a private beta phase, subject to addressing the amber points within three months and obtaining Spend Control approval.
To get the service ready to launch on GOV.UK the team needs to:
- get a GOV.UK service domain name
- work with the GOV.UK content team on any changes required to GOV.UK content