Submit a General Aviation report alpha assessment

The report for the Home Office's 'Submit a General Aviation report' alpha service assessment on 29 November 2018.

From: Central Digital and Data Office
Assessment date: 29 November 2018
Stage: Alpha
Result: Met
Service provider: Home Office and Border Force

The service met the Standard because:

  • the service team demonstrated that they have taken the time to understand user needs for the different user groups, through a series of in-person and remote interviews
  • the team has a fully featured prototype and a good understanding of what to design, prototype and iterate next
  • there is a multidisciplinary, co-located team using a collaborative and iterative methodology to deliver the service

About the service

Description

General Aviation (GA) is defined by Border Force as any aircraft not operating to a specific schedule and not making a solely military flight. The service will allow the GA community to submit General Aviation Reports (GARs) online. This will increase security, reduce costs and replace manual paper-based systems with digital technologies.

Service users

The users of this service range from private pilots flying small single-person aircraft for leisure to business aviation jets flown on a commercial basis. They fall into four main groups:

  • singleton pilots
  • fixed-base operators (FBOs) and other commercial organisations in the General Aviation space, for example flying schools
  • military and defence contractors
  • Border Force officers

Detail

User needs

The service team demonstrated that they have taken the time to understand user needs for the different user groups, through a series of in-person and remote interviews. The panel was pleased to see that the team had a plan for tackling each pain point, or had worked with other teams, such as communications and engagement, to think through those that would be out of scope.

The service team has used these insights to smooth the transition from the old system to the new one for users with a particular set-up in the old system (for example, a saved copy of a previous form or Excel macros).

However, the panel questions whether the team has built what users asked for, rather than designing its own solutions to the identified user problems. The new system appears to be an improvement on the old one, but rests on the same underlying principles. The panel wonders whether other solutions could meet the same user needs, and whether the team could develop these using its own expertise rather than relying on what users say.

The team plans to do further research on bulk uploading, which the panel strongly encourages.

The panel was unclear whether the research was done before or while the solution was being built. Either way, the team outlined a solid plan for further research, which the panel agrees is needed to move on to beta. This includes testing with users who have accessibility or assisted digital needs.

The panel also agreed with the idea of pop-up testing with users who have no prior knowledge of the project, as well as testing with internal colleagues who answer phones and emails, to get a deeper understanding of the biggest challenges users face.

Team

The multidisciplinary team is co-located and uses a collaborative, iterative methodology to deliver the service, working in two-week sprints. The core team is mostly made up of contractors, with the Service Owner and Product Owner coming from the Civil Service. There is also a part-time Content Designer, whose hours the team plans to increase. Content is key for the next phase, so it is recommended that a full-time Content Designer is engaged as soon as possible. It is also recommended that a clear plan is put in place to transition from contractors to civil servants during the beta phase, and that permanent members of staff are taken on the digital journey with the team.

There seems to be some confusion between roles, with a Scrum Master in the core team and a Delivery Manager listed as one of the stakeholders. This might be down to job titles, rather than roles, being used to describe the team and its stakeholders. The team should refer to the GOV.UK Service Manual to better understand the key roles needed to support the next phase.

The team is working to tight timescales and concentrating on delivering a minimum viable product (MVP). It was not completely clear how the team prioritises its backlog, although it does this collaboratively. It is recommended that during the beta development phase the team works out a clear approach to prioritisation, and that there is also a plan in place for iterating the service during beta.

The governance process is based on Home Office governance. There is a Project Board that feeds directly into the Home Office Project Board.

Technology

The service team presented a technical solution with a complex background. This is the third restart of the service and the panel was pleased to see the service team consider whether they could re-use some of the code from the previous work. The panel was also pleased to see that the service is hosted on the Home Office Application Container Platform (ACP) which is hosted on AWS and that they use the standard deployment and release management patterns that come with the platform. This means they have been able to take advantage of various shared capabilities (including cross-government capabilities like GOV.UK Notify) which have been tried and tested by previous services. The service team has also focused heavily on deployment automation which should improve their ability to iterate quickly and effectively in the future.
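As an illustration of the kind of shared capability the team is drawing on, the sketch below shows how a service might send a confirmation email through the GOV.UK Notify Node client. The template ID, personalisation fields and environment variable name are placeholders, not the team's actual configuration.

```typescript
import { NotifyClient } from "notifications-node-client";

// The API key would come from the service's own configuration;
// NOTIFY_API_KEY is a placeholder name.
const notifyClient = new NotifyClient(process.env.NOTIFY_API_KEY as string);

// Send a hypothetical confirmation that a GAR has been received.
// sendEmail(templateId, emailAddress, options) is the Notify client's
// documented call; the template and personalisation below are illustrative.
async function sendSubmissionConfirmation(email: string, garReference: string) {
  await notifyClient.sendEmail("hypothetical-template-id", email, {
    personalisation: { gar_reference: garReference },
    reference: garReference,
  });
}
```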

The service benefits from ACP's automated recovery and resilience capabilities, but the panel was still pleased to see that the team has considered an offline plan: users would revert to submitting GARs by email and fax. However, the service team should consider whether this plan will still be valid once the new legislation is introduced that will effectively ban non-digital submissions to the service.

The team is coding in a private GitLab repository because it did not want to share the questions asked in the service before the work is made public. However, the panel did not think the service asks many questions that are not already on the existing forms, and would encourage the service team to review this position. Either way, if public beta starts in early January the service team should consider making its code open source at that point.

The service team demonstrated that it is looking at a number of techniques to make it easier for users to access the system while maintaining security, for example using a magic link. However, considering the sensitive passenger information that can be stored in the service, the panel recommends the service team consider the risks of magic links: a user expecting a login link by email presents an opportunity for a threat actor to imitate service emails and trick users into revealing account credentials.

More broadly, the service team should consider the level of access users have to stored personal information. For example, if a user adds a saved person to a manifest, they are shown all of that person's personal information (including passport details and date of birth) even though they may not need to use it. The service team should consider ways to deliver the service without showing this information unless absolutely necessary, and should also consider offering organisation administrators the ability to audit or restrict what their users can do in the service.
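To make the magic-link risk and a common mitigation concrete, the sketch below shows one way such a flow can be hardened: tokens are random, single use and short lived, and only a hash is stored server side, so an intercepted or replayed link is of limited value. This is an illustrative sketch under assumed names (the in-memory store, time-to-live and route are invented), not the team's implementation.

```typescript
import { randomBytes, createHash } from "node:crypto";

const TOKEN_TTL_MS = 15 * 60 * 1000; // assumed 15-minute lifetime

interface MagicLinkRecord {
  userId: string;
  expiresAt: number;
  used: boolean;
}

// In-memory store for illustration; a real service would persist this.
const store = new Map<string, MagicLinkRecord>();

function issueMagicLink(userId: string, baseUrl: string): string {
  const token = randomBytes(32).toString("base64url"); // unguessable
  const tokenHash = createHash("sha256").update(token).digest("hex");
  store.set(tokenHash, { userId, expiresAt: Date.now() + TOKEN_TTL_MS, used: false });
  // Only the hash is kept server side; the raw token exists in the email alone.
  return `${baseUrl}/login?token=${token}`;
}

function redeemMagicLink(token: string): string | null {
  const tokenHash = createHash("sha256").update(token).digest("hex");
  const record = store.get(tokenHash);
  if (!record || record.used || Date.now() > record.expiresAt) return null;
  record.used = true; // single use: a replayed link fails
  return record.userId;
}
```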

The service team explained how larger user groups often try to automate their data submission process to make the service easier to use. The service team should consider whether an API would meet this need better than manually uploading bulk spreadsheets, and whether building an API as early as possible in private beta could save large users from having to change their process twice, as sketched below.
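To illustrate the alternative the panel has in mind, the sketch below shows a hypothetical JSON endpoint that an operator's own systems could submit manifests to directly, instead of preparing and uploading bulk spreadsheets. All route, field and type names here are invented for illustration; they are not the service's actual data model.

```typescript
import express from "express";

// Hypothetical shape of one passenger entry on a GAR manifest.
interface Passenger {
  givenName: string;
  familyName: string;
  dateOfBirth: string; // ISO 8601
  documentNumber: string;
}

interface GarSubmission {
  aircraftRegistration: string;
  departure: { icao: string; dateTime: string };
  arrival: { icao: string; dateTime: string };
  passengers: Passenger[];
}

const app = express();
app.use(express.json());

// A single JSON endpoint replaces the manual spreadsheet upload:
// large operators can integrate once and submit programmatically.
app.post("/v1/gar-submissions", (req, res) => {
  const submission = req.body as GarSubmission;
  if (!submission.aircraftRegistration || !submission.passengers?.length) {
    return res.status(400).json({ error: "incomplete manifest" });
  }
  // ...validate, persist and acknowledge the submission...
  res.status(202).json({ submissionId: "hypothetical-id" });
});
```

Building such an endpoint early in private beta would let large operators automate against it once, rather than adapting their processes to spreadsheet uploads first and migrating to an API later.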

Design

The team has a fully featured prototype, in part because of work on previous projects, and has a good understanding of what to design, prototype and iterate next. The team is using the GOV.UK Prototype Kit and Design System.

The team is designing a service for a wide group of potential users, from solo occasional fliers to companies operating private jets internationally. This has resulted in some tension in the designs. The team must continue to make sure that developing this as one overall service, rather than splitting out commercial or organisation use, does not impact the new or occasional user.

There needs to be more work on designing privacy for different roles in organisations, considering what information can be seen by, and what should be redacted from, people in FBOs and larger organisations using this service.
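One way to approach this, sketched below under assumed role and field names, is to treat redaction as a server-side function of the viewer's role, so that sensitive fields never leave the service for users who do not need them. The roles and fields shown are hypothetical, not the team's design.

```typescript
type Role = "organisationAdmin" | "operationsStaff";

interface SavedPerson {
  fullName: string;
  passportNumber: string;
  dateOfBirth: string;
}

// Return a copy with sensitive fields masked unless the viewer's
// role genuinely needs them; the raw values are never sent.
function redactForRole(person: SavedPerson, role: Role): SavedPerson {
  if (role === "organisationAdmin") return person;
  return {
    ...person,
    passportNumber: "•••••••••",
    dateOfBirth: "••/••/••••",
  };
}
```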

There has been some thinking about reuse of the service and storing information, which the team has seen as a user need in the research. The current model of saving information from the last 30 days does not feel intuitive, and the team should explore more explicit save mechanisms. Redesigning the dashboard and overview pages will also help returning users.

There must be time for iteration, improvements to current flows and changes based on learning, as well as expanding the feature set. To do this in the limited time of the proposed private and public betas, the team will need at least a full-time interaction designer and a full-time content designer. This is especially important as research will not be lab based, but will involve travelling to meet users. The designers need to be involved in the actual research work, not just receive a summary.

The team must find people with differing access needs and include them in the ongoing research and recruit some into the private beta.

It is good that the team is making sure the service can scale to similar needs, such as General Maritime traffic. Border Force, Home Office and HMRC should make sure that services are not replicated, that information is captured once, and that work is reused where possible.

It is good that the designers and user researchers are sharing the work within Home Office Digital and participating in the cross-government user-centred design communities. The team should look at other similar services for the design of tables of applications and the display of large sets of information.

The researchers and designers on the team need a close relationship with Border Force staff on the ground and on the emergency helpline during the private beta, to make sure that their needs are included and that issues reported through these channels are incorporated into the backlog.

The team should think about having a second wave of private beta at a slightly larger scale, so that better analytics data can be incorporated into iterations.

Given the time constraints for going to public beta, the team will need to start working soon with Home Office and GOV.UK content teams to draft and iterate the start page and the final name of the service.

Analytics

The team is using data from current GAR submissions to baseline the KPIs for the new online service. It will collect data for the new service via Google Analytics, questionnaires and user feedback. The team was unsure whether it had access to a paid version of Google Analytics; it is recommended that the paid version is used.

The team is aware that there are substantial issues with missing or poor quality data in the current GAR, so plans to measure this in addition to the four mandatory KPIs.

Recommendations

To pass the next assessment, the service team must:

  • make their code open source
  • test with users who have accessibility needs and assisted digital needs
  • understand the scale of the need for assistance and support in using the service, and develop and iterate assistance channels
  • check the security risks of displaying personal information (for example, passport numbers or dates of birth) to all users, and only display it where absolutely necessary
  • carry out further research into bulk uploading before settling on a solution
  • plan for the transition from a team of mostly contractors to mostly civil servants
  • engage a full-time content designer and a full-time interaction designer for the service
  • work on a clear approach to prioritisation of the backlog
  • put a plan in place for iterating the service during public beta and beyond
  • have a clear plan for support channels during public beta
  • make sure the service is usable and useful to all potential users, from an occasional leisure flyer through to large fixed base operators with many user accounts
  • use the paid version of Google Analytics to gather data insights rather than the free version

The service team should also:

  • deepen their understanding of what makes an 'organisation'; the panel did not feel that the team was fully aware of how responsibility is delegated within one
  • show what roles the team members are fulfilling rather than what job title they hold
  • share and compare design patterns with similar services in Home Office and Government, and contribute to new draft design patterns
  • consider the scale of the private beta, make sure recruitment is from a wide user base, and keep inviting more users until little new is being learned

Digital Service Standard points

| Point | Description | Result |
| ----- | ----------- | ------ |
| 1 | Understanding user needs | Met |
| 2 | Improving the service based on user research and usability testing | Met |
| 3 | Having a sustainable, multidisciplinary team in place | Met |
| 4 | Building using agile, iterative and user-centred methods | Met |
| 5 | Iterating and improving the service on a frequent basis | Met |
| 6 | Evaluating tools, systems, and ways of procuring them | Met |
| 7 | Managing data, security level, legal responsibilities, privacy issues and risks | Met |
| 8 | Making code available as open source | Not Met |
| 9 | Using open standards and common government platforms | Met |
| 10 | Testing the end-to-end service, and browser and device testing | Met |
| 11 | Planning for the service being taken temporarily offline | Met |
| 12 | Creating a simple and intuitive service | Met |
| 13 | Ensuring consistency with the design and style of GOV.UK | Met |
| 14 | Encouraging digital take-up | Met |
| 15 | Using analytics tools to collect and act on performance data | Met |
| 16 | Defining KPIs and establishing performance benchmarks | Met |
| 17 | Reporting performance data on the Performance Platform | N/A |
| 18 | Testing the service with the minister responsible for it | N/A |
Published 16 October 2019