GOV.UK forms alpha assessment report

Report for the alpha assessment of the GOV.UK forms service on 03/03/2022

Service Standard assessment report

GOV.UK Forms

From: Central Digital & Data Office (CDDO)
Assessment date: 03/03/2022
Stage: Alpha
Result: Met
Service provider: Cabinet Office

Service description

This service aims to solve the problem of how to create online forms for GOV.UK without needing technical knowledge, so that teams can use online forms to collect information from members of the public and businesses.

Service users

Primary users: form creators in policy and operations teams

Civil servants making policy or running services that use document-based forms and have fewer than 10,000 transactions per year

Secondary users: digital specialists supporting form creators

Civil servants in departments’ publishing teams that currently give feedback on and publish document-based forms and guidance

Civil servants in departments’ digital assurance teams that currently give feedback on document-based forms that support government services

Tertiary users: members of the public and businesses filling in government forms

Members of the public and businesses using forms when interacting with government

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • there has been a substantial amount of research across government, with a variety of organisations and roles, using appropriate techniques to understand the users and context of this service. This research has allowed the team to segment its primary user group by circumstance (full-time form creator vs. ad hoc creator)
  • the team has thought about the different roles within this service and how these roles impact the likelihood of the service being used, e.g. decision makers approving usage within their organisation
  • the team has focused their attention on the organisations that create the most paper forms in government which has meant they have a much better understanding of the teams that will have the most impact in getting forms online

What the team needs to explore

Before their next assessment, the team needs to:

  • consider how they can reframe the user needs to be as useful as possible for making decisions. The team has demonstrated good knowledge of their users’ goals, pain points, behaviours and context. However, the user needs as written don’t bring these to light, and it seemed hard for the team to make decisions with the current set of needs. The team should also review some of their user profiles to make them more consistent and easier to use - as an example, the ad hoc form creator’s goals and behaviours aren’t actually goals or behaviours and aren’t written in a similar way to the other profiles
  • think about which teams will be best to work with in beta to provide the most significant learning. The team has considered having a variety of organisation sizes and the organisations that have published inaccessible forms. The team should consider if there are other factors, such as limited digital capability, which might help them find partners who are more likely to identify potential problems

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team based their alpha phase around testing their riskiest assumptions
  • the team considered all facets of the service, from approvals through creation to information processing on the internal side, and user needs of the public on the other
  • accessibility is one of the three elements of ‘good enough’ for the forms created in the service
  • thinking has already started around what sort of guidance will be needed for primary users of the service

What the team needs to explore

Before their next assessment, the team needs to:

  • carry out more practical, contextual research of the form creation, maintenance, and information processing elements during prototyping and trials
  • be alert to the (probably small) risks of creating new problems through misuse or poor use of the service (unintended consequences). Will role-based access management mitigate or eliminate this?
  • interrogate the feasibility of the ‘departmental champions’ approach - in our experience people move around, or can’t always be found, so it could be risky to rely too much on this
  • design and test how caseworkers will access information from completed forms. The team has benefited from previous work from Submit on this; however, there is a risk that they create a form builder which works well for creators but results in additional work or complexity for the users who must process the data collected, whereas using an API might reduce the work for some services. The team needs to better understand how their solution will work for these users by testing it

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has investigated processes surrounding the creation of the form, including internal approvals processes, case working with completed forms and the management of existing forms
  • the team is already thinking about the journey from GOV.UK to the service - probably linking from detailed guides
  • the HTML forms can be provided alongside existing or alternative online channels
  • use of GOV.UK Design System patterns will help make the user experience consistent with that of other related services and journeys

What the team needs to explore

Before their next assessment, the team needs to:

  • investigate how people completing forms will be able to access a paper version of the form. For certain users, in certain circumstances and for certain services, a paper form may be required. The team has an opportunity to understand demand for, and usage of, paper versions of forms. It would be beneficial to find ways of learning about the experience of users who sought out the printed version of a form
  • assess the risks around version control if departments run HTML forms and PDF or paper forms in parallel

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the end-to-end journey has been thoroughly mapped out, and detailed concerns and risks at each step have been identified for further investigation
  • the team was able to gather evidence to inform the early service design by using existing solutions like Google Forms and the x-gov form builder
  • the team intends to integrate with the GOV.UK Design System to automatically refresh design patterns used within forms to ensure they follow the latest good practice
  • the service will be designed such that primary users will not need to check for accessibility, as it will be built in

What the team needs to explore

Before their next assessment, the team needs to:

  • create and test more prototypes of the form builder. It was surprising that, despite having several Interaction Designers on the team, not many prototypes have been created or tested. It will be of paramount importance to hear at the next assessment about different prototypes, how they have been iterated, and how they have been made simple to use and accessible through partnership with different departments
  • keep exploring the need for primary-user guidance. Can the service be made so intuitive that little to no guidance is required?
  • continue thinking about primary-user guidance around content design and (tertiary) user experience as well as guidance on using the tool

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team can clearly articulate the problems with inaccessible forms in the current state and what this means for users who have to tell the government something
  • the team has screened their research participants to help get a better understanding of digital skills and the adjustments people make to use technology. Research has been carried out with participants across a range of digital capabilities, and testing of different types of form builder has helped the team to spot the different challenges their users can face
  • there are clear accessibility objectives for the beta stage

What the team needs to explore

Before their next assessment, the team needs to:

  • demonstrate that their solution works for end users with accessibility needs
  • demonstrate that the form builder meets the needs of departmental users, including those with accessibility needs and with lower digital skills/confidence
  • demonstrate that users who relied upon paper forms can still access the service they need
  • ensure that some of their beta partner teams have lower levels of digital skills, so that they don’t just build for highly skilled teams

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • content design has been a consideration right from the start and the team’s content designer is properly integrated with the team and adding value well beyond writing content
  • a performance analyst is already included in the team and is adding value by considering both the metrics needed to shape the service and the future reporting needs of teams across government
  • overall, while the shape of the team was non-standard, they seemed to have access to all the skills necessary at this stage and the buy-in for ongoing growth

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to consider what team would be required to continually iterate and support the service once in public beta and live, and whether the estimated cost of £2k per form created will be realistic (and, if the cost per form is higher, whether it is still something the organisation will be content to support for long enough to justify the build cost)

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • despite the team growing quite rapidly throughout the alpha phase, they worked well together and changed processes to suit themselves and the nature of the work

What the team needs to explore

Before their next assessment, the team needs to:

  • consider how to keep communication and collaboration effective as the team grows. The team is already of a substantial size and notably contains multiple Interaction Designers and User Researchers; it will be interesting to hear how the team changes, and how its processes are shaped throughout the next stage, to ensure communication and collaboration remain effective

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • as noted above, the team tested existing form-making tools as a way to gauge user capability and preference

What the team needs to explore

Before their next assessment, the team needs to:

  • create, test, and iterate their own interfaces for form creation
  • consider how to demonstrate, at the next assessment, the iteration of the end-to-end process (including the tertiary user experience) across the different pilots, and how the lessons learned in each pilot have been brought together

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has considered the importance of security throughout alpha: they are actively engaging with security professionals, plan to undertake a full data protection impact assessment, and have taken a minimal approach to what data needs to be stored
  • the team plans to persist form sessions only in memory, ensuring the team is classed as a data processor rather than becoming a data controller as defined by the General Data Protection Regulation and the Data Protection Act 2018 (a minimal sketch of this approach follows this list)
  • the team has chosen a platform with suitable monitoring and logging infrastructure, giving them observability across the applications
  • the team has chosen to use the GOV.UK account sign-on system
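
To illustrate the in-memory-only approach described above, the sketch below holds a filling session’s answers in process memory and discards them once they have been handed off for onward delivery. Class and field names are assumptions made for illustration, not the team’s actual implementation.

```python
# Minimal sketch: form answers held only in process memory for the lifetime
# of a filling session, then handed off for delivery and discarded.
# Names are illustrative, not the team's actual design.
from dataclasses import dataclass, field
from uuid import uuid4


@dataclass
class FormSession:
    """Answers for one in-progress form, kept in memory only."""
    session_id: str = field(default_factory=lambda: str(uuid4()))
    answers: dict = field(default_factory=dict)

    def record_answer(self, question_id: str, value: str) -> None:
        self.answers[question_id] = value


# In-memory store: nothing is written to disk or a database, so personal
# data is handled only transiently while the session is open.
_sessions: dict[str, FormSession] = {}


def start_session() -> FormSession:
    session = FormSession()
    _sessions[session.session_id] = session
    return session


def complete_session(session_id: str) -> dict:
    """Return the answers for onward delivery and drop them from memory."""
    session = _sessions.pop(session_id)
    return session.answers
```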

What the team needs to explore

Before their next assessment, the team needs to:

  • continue understanding the user roles for the system and develop further suitable role-based access control; further technological choices may need to be made in beta to support requirements and to enforce a “least privilege” approach (an illustrative permission check is sketched after this list)
  • consider further the security implications around transferring form creator, approver and reviewer roles between individuals in a department
  • carry out appropriate threat modelling, vulnerability and penetration testing as they enter the private beta phase
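
As a hedged illustration of the kind of least-privilege, role-based check recommended above (the role and permission names are assumptions, not the service’s actual model):

```python
# Illustrative only: a least-privilege, role-based permission check.
# Role and permission names are assumptions, not the service's model.
from enum import Enum, auto


class Permission(Enum):
    EDIT_FORM = auto()
    APPROVE_FORM = auto()
    MAKE_FORM_LIVE = auto()
    VIEW_SUBMISSIONS = auto()


# Each role gets only the permissions it needs; nothing is granted by default.
ROLE_PERMISSIONS: dict[str, set[Permission]] = {
    "form_creator": {Permission.EDIT_FORM},
    "reviewer": {Permission.APPROVE_FORM},
    "caseworker": {Permission.VIEW_SUBMISSIONS},
}


def can(role: str, permission: Permission) -> bool:
    """Deny unless the role explicitly holds the permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())


assert can("form_creator", Permission.EDIT_FORM)
assert not can("form_creator", Permission.MAKE_FORM_LIVE)
```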

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has done a robust piece of work analysing the non-HTML forms available on GOV.UK, looking at common patterns using the Content API, and has a plan for monitoring this regularly
  • the team had a clear idea of which metrics will be useful for them and for form creators in the future, including error and abandonment rates (illustrative definitions are sketched below)
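
As an illustration only, the error and abandonment rates mentioned above might be defined from basic counts as follows; the exact definitions the team will use are not stated in this report.

```python
# Illustrative definitions only; not the team's agreed metrics.
def abandonment_rate(forms_started: int, forms_submitted: int) -> float:
    """Share of started forms that were never submitted."""
    return 1 - (forms_submitted / forms_started) if forms_started else 0.0


def error_rate(submissions_with_validation_errors: int, total_submissions: int) -> float:
    """Share of submissions where the filler hit at least one validation error."""
    return submissions_with_validation_errors / total_submissions if total_submissions else 0.0


# Example: 1,000 starts, 850 submissions, 120 of which hit a validation error.
print(abandonment_rate(1000, 850))  # 0.15
print(error_rate(120, 850))         # ~0.14
```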

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to research how performance information can best be communicated back to form creators, including plans for what will happen if no action is taken by them or if the people originally responsible move on

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has adopted GOV.UK Platform as a Service (PaaS) to provide its infrastructure needs into the beta phase. This is an appropriate choice of a mature environment which has itself been through a Government Digital Service (GDS) live assessment. The rollout to 10 partner organisations during private beta should be easily sustained by this platform
  • the team plans for all their technology and framework choices to be guided by the existing GDS Way and the Technology Code of Practice
  • the team have considered a move to AWS public cloud infrastructure should demand and scalability outgrow GOV.UK PaaS
  • the team is considering appropriate component-level separation, and is looking at queues and loose coupling when integrating with other services (a minimal sketch of this pattern follows this list)
  • the team has chosen to develop their own form designer for private beta, as the technical panel members felt the reliance on JavaScript within other proposed solutions was a usability and accessibility concern
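
As a purely illustrative sketch of the loose coupling noted above (not the team’s actual design), completed submissions could be placed on a queue by the form runner and delivered onward by a separate worker; all names below are hypothetical.

```python
# Illustrative sketch of queue-based decoupling: the form runner publishes a
# completed submission onto a queue, and a separate worker delivers it onward.
import json
import queue

submission_queue: "queue.Queue[str]" = queue.Queue()


def publish_submission(form_id: str, answers: dict) -> None:
    """Form runner side: enqueue the submission and return immediately."""
    submission_queue.put(json.dumps({"form_id": form_id, "answers": answers}))


def send_to_processing_team(message: dict) -> None:
    """Placeholder delivery step, e.g. an email via GOV.UK Notify."""
    print(f"delivering submission for form {message['form_id']}")


def deliver_pending() -> None:
    """Worker side: drain the queue and hand each submission to delivery."""
    while not submission_queue.empty():
        message = json.loads(submission_queue.get())
        send_to_processing_team(message)


publish_submission("example-form", {"question-1": "answer"})
deliver_pending()
```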

What the team needs to explore

Before their next assessment, the team needs to:

  • further their understanding of the need for event collection and analytics in private beta, and establish a methodology for roles other than form creators to access that information

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team needs to explore

Before their next assessment, the team needs to:

  • continue the exemplary approach to developing in the open

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are integrating with existing services for form submission delivery (GOV.UK Notify) and payment (GOV.UK Pay) - an illustrative delivery sketch follows this list
  • the team are using common hosting platforms
  • the team has assessed and reviewed cross-government and Ministry of Justice solutions for form building and hosting
  • the team will make extensive use of the GOV.UK Design System
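
As a minimal sketch only, sending a completed submission to a processing team by email through GOV.UK Notify might look like the following, assuming the notifications-python-client library; the API key, template ID and personalisation fields are placeholders rather than details of this service.

```python
# Minimal sketch of delivering a completed form submission by email via
# GOV.UK Notify using notifications-python-client. The API key, template ID
# and personalisation fields are placeholders.
from notifications_python_client.notifications import NotificationsAPIClient

notify_client = NotificationsAPIClient("api-key-goes-here")


def deliver_submission(processing_team_email: str, form_name: str, answers_text: str):
    """Send the answers to the processing team using a Notify email template."""
    return notify_client.send_email_notification(
        email_address=processing_team_email,
        template_id="submission-template-id-goes-here",
        personalisation={
            "form_name": form_name,
            "answers": answers_text,
        },
    )
```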

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to follow the Technology Code of Practice when creating components
  • ensure any form submitter API or webhook implementation adopts appropriate standards (one common convention is sketched below)
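
One widely used convention, shown here only as a hedged illustration rather than a decision recorded in this report, is to sign webhook payloads with a shared secret so the receiving service can verify their integrity; the secret handling and header naming would need to be agreed with consuming departments.

```python
# Illustrative only: HMAC-SHA256 signing of a webhook payload so the
# receiving service can verify it. Secret and field names are assumptions.
import hashlib
import hmac
import json


def sign_payload(payload: dict, shared_secret: bytes) -> tuple[bytes, str]:
    """Return the serialised body and the signature to send alongside it."""
    body = json.dumps(payload, separators=(",", ":")).encode()
    signature = hmac.new(shared_secret, body, hashlib.sha256).hexdigest()
    return body, signature


def verify_payload(body: bytes, received_signature: str, shared_secret: bytes) -> bool:
    """Receiver recomputes the signature and compares in constant time."""
    expected = hmac.new(shared_secret, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, received_signature)


secret = b"shared-secret-goes-here"
body, signature = sign_payload({"form_id": "example", "answers": {}}, secret)
assert verify_payload(body, signature, secret)
```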

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • a suitable hosting platform for beta has already been adopted
  • the team have already consulted with site reliability engineers within the Government Digital Service
  • the team have integrated with reliable government-backed services

What the team needs to explore

Before their next assessment, the team needs to:

  • further investigate the auditing of form submissions and the storage of appropriate metadata to allow for delivery dispute resolution (an illustrative audit record is sketched below)
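
As an illustration under stated assumptions, the kind of metadata that could support dispute resolution without retaining the answers themselves might look like the record below; the field names are hypothetical, not the team’s design.

```python
# Illustrative delivery audit record: when a submission was received and
# delivered, via which channel and with what outcome, but not the answers.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass(frozen=True)
class DeliveryAuditRecord:
    submission_reference: str   # opaque reference shared with the form filler
    form_id: str
    received_at: datetime
    delivered_at: datetime
    delivery_channel: str       # e.g. "email via GOV.UK Notify"
    delivery_status: str        # e.g. "delivered", "permanent-failure"


record = DeliveryAuditRecord(
    submission_reference="ABC-1234",
    form_id="example-form",
    received_at=datetime.now(timezone.utc),
    delivered_at=datetime.now(timezone.utc),
    delivery_channel="email via GOV.UK Notify",
    delivery_status="delivered",
)
```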

Published 12 August 2022