|Service provider|Office for National Statistics|
The service met the Standard because:
The Electronic Data Collection (EDC) tool has been reviewed against the points of the Digital Service Standard at the end of its alpha development. The tool is a component of a wider service that will allow survey authors at the Office for National Statistics (ONS) to create questionnaires and publish them online. This review was completed to fulfil conditions set by the Standards Assurance Service team at the Government Digital Service (GDS). At this early stage the assessment panel is assured the service team is on track to deliver a tool that will meet the Digital Service Standard.
The ONS is responsible for collating information from businesses, social surveys and the census. The EDC tool will bring together the authoring and publishing parts of those different services into a single tool. Prior user research conducted by the ONS has shown a desire from users to fill in mandatory surveys electronically, saving time for users and reducing error rates.
User needs and research
The outcomes from the discovery confirmed a tool would need to be generic to meet the different survey requirements from across the ONS, that it could scale easily in performance and would allow for multiple languages to be used. At the programme level there is an aim to have 75% of census responses completed online (with the possibility of 500k users accessing the census concurrently) and a desire to avoid repeatedly asking users the same questions. The Chief Technology Officer (CTO) has empowered the team to deliver the tool without the constraints of legacy systems.
The team have used a variety of user research approaches including usability lab testing with 20 participants on the tool as prototypes were developed. Observational research was completed at contact centres (both inbound and outbound) to understand the needs of respondents as well as front-line staff, and with door-to-door survey gatherers. Click testing has been completed with 80 participants. The team have also used pop-up testing in a cafe/arts centre and at a local library to understand the needs of respondents to social surveys, and this has included some users at the lower end of the digital skills spectrum.
Within the ONS, observational research at users' desks showed that survey authors copied and pasted content from other documents, which caused formatting issues; because of the agile approach, the team were able to respond quickly and strip formatting from free-text boxes.
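The fix described above — stripping formatting from content pasted into free-text boxes — can be sketched in outline. This is an illustrative example only, not the team's actual implementation: it assumes pasted rich text arrives as HTML and uses Python's standard-library parser to keep only the plain text.

```python
from html.parser import HTMLParser


class FormattingStripper(HTMLParser):
    """Collects only the text content of pasted input, discarding tags and attributes."""

    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        # Called only for text between tags, so markup never reaches the output.
        self.parts.append(data)


def strip_formatting(pasted: str) -> str:
    """Return the plain text of a pasted snippet, with any markup removed."""
    stripper = FormattingStripper()
    stripper.feed(pasted)
    return "".join(stripper.parts)
```

For example, `strip_formatting('<p><b>Turnover</b> in 2016</p>')` returns `'Turnover in 2016'`, so only the text the survey author intended is stored.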
While each survey team is responsible for the provision of assisted digital support for their survey, the tool is being built with the needs of those with the lowest digital skills in mind.
The EDC team are working to agile principles, using Kanban over two-week sprints with one day between sprints for review, retrospective and planning the next sprint. The team are co-located, hold retrospectives and planning sessions, and use show and tells and blogging to share their work with stakeholders. A variety of tools, such as Jira for backlog management, Confluence and Slack, are helping to support the development. The team have also benefitted from agile coaching support.
The team has a product and delivery manager, a tech lead, a UX designer, a business analyst, developers and a user researcher, all working in sprints. Specialist technical architecture support is provided by an external supplier.
Efforts continue to recruit a service manager whose responsibilities will cover the entire user journey (management portal, electronic questionnaire tool and survey support). In the meantime the team have continued to make decisions based on research and data, led by an empowered product manager.
The plan for beta is to operate the tool while continuing to build out features and move into a live continuous improvement phase. The transition from alpha to beta will involve the team re-working aspects of the tool and ensuring appropriate security mechanisms are in place to handle real data.
Skills shortages and role changes within the ONS are causing some issues, particularly with the different architecture functions, but the team have sought to resolve this by inviting their input in a consultative way. The team are proving that the ONS can deliver a complicated service in-house.
The publishing tool has been using a variety of common technologies and platforms, such as Heroku and AWS, and the team plan to use the Government Platform as a Service (PaaS) as soon as it is possible to do so. The team can take a story from analysis and story refinement, through development and into deployment, in under 26 days on average, and fixes can happen much more rapidly. Environments can be created quickly from scripted builds. Moving code from staging to production is simple and happens regularly, whenever it is needed.
In beta the team are planning to deploy every few days. The service is being built in a safe and secure way; data is moved regularly from the front-end servers to secure long-term storage.
Google Analytics will be used for the front-end survey runner and EQ author, giving insights into the usage of both systems. Back-end monitoring will be in place as part of the Government PaaS.
Information architects and the Senior Information Risk Officer (SIRO) understand the risks and issues. The team are working with the ONS Information Assurance team (including support from the Census CLAS consultant) and CESG CLAS consultants to ensure that protection is sufficient for the sensitive information the tool will gather. Currently there is limited impact on survey respondents if the service is taken offline; however, there may be a bigger impact on other services relying on these tools in the future.
All source code is being made available in public repos. It’s possible that the code to link the system with certain sensitive ONS systems will be kept as closed source when this work is undertaken. Some reuse of code might be possible by other governments - conversations are happening already to progress that. The ONS own the IP for this service.
The public facing part of the questionnaire tool will be accessed through a portal that verifies the identity of the respondent. The method of verifying will vary depending on the survey type (business, social, census) and will support GOV.UK Verify. After research, no current information or document standards were found to be helpful to the service. The new ONS beta website is responsible for the needs around public presentation of data.
The team have taken the best-practice principles of the GDS style guide and are developing an ONS version, as they have an exemption from the GOV.UK style. The design of the EDC survey tool has been usability tested regularly during each sprint. Intuitive design of the surveys will help with digital take-up but cannot solve complex routing around surveys.
While content in surveys is the responsibility of methodologists, the service team are responsible for micro copy and labelling as well as generic warning and error notices. Feedback from survey respondents will be a mixture of content and design so it’s important to ensure that methodologists have access to that feedback and understand how UI design can affect content and vice-versa.
The team have acknowledged that some features, such as drag and drop, may not be the most efficient method and may not work for all users and they will investigate that further during the beta development.
The aim for the 2021 census is 75% digital take-up. Work is still to be done to understand how the tools can be built to support this and, while the team is not responsible for the census, it will be important to continue to focus on developing a system that is straightforward for everyone. The team should consider further research with people with low levels of digital literacy, and complete experience mapping with respondents against the current paper process, to ensure that users will see benefits in switching to a digital channel.
Currently the product manager is responsible for the analysis of data and feeding that back into the service's development. The expectation is that this will be the service manager's responsibility going forward.
Setting KPIs for the products will happen during the beta. Each of the survey teams will be responsible for their metrics, such as cost per transaction. User satisfaction will be captured separately. The level of detail wanted by different areas of the ONS on completion is variable, and the team will need to weigh those needs carefully against the effort required to support them.
Digital take-up will be monitored through the respondent portal, which helps capture details about assisted digital support needs.
Recruiting a new front-end developer, to ensure that the UX designer is freed up to focus on interaction design, must be completed as soon as possible.
A trainee user researcher will join the ONS soon. It is important to share these skills and knowledge across the organisation; however, this should not take vital time away from researching this tool and its users. GDS will look to support this.
There is sufficient evidence to show the need for an in-house device library to test the tool on a variety of platforms; funding should be released to ensure this resource is in place quickly.
Google Analytics is not perceived as safe by some within the ONS; this view should be considered in further detail and in collaboration with the GDS analytics team.
Levels of support for the service once it is publicly available need to be confirmed early in the beta for the different user groups; this may challenge the ONS's expectation of providing working-hours support only.
While the product manager is responsible for making decisions based on the user needs, data collection and interpretation may be better suited to a business analyst; the team should consider their need for one going forward.
The rollout plan is based on financial value, with the simplicity of each survey also considered. However, ensuring that the tools work for the most complicated surveys will ensure they work for the simplest; the team should weigh up the risks of leaving the most complex surveys until the end of development.
While the team have said they have looked closely at GDS guidance on front-end design, there are patterns in use that can prove difficult for many users; the team should take responsibility for accessibility in the design and audit the front end thoroughly.
The team should take note of, and incorporate, the thoroughly researched and tested designs used by other government services (https://www.gov.uk/service-manual/user-centred-design/resources/patterns/index.html and https://designpatterns.hackpad.com/List-of-design-patterns-0eUk1OdHvql).
The team should consider having a content designer to help with the microcopy on the service and to bring together the different language approaches to components within the service from the business, social and census teams.
The team should look for a pairing approach between user research and methodologists to ensure that both disciplines are understood and remove perceived barriers. Again, GDS will look at what support can be offered to facilitate this.