Drone registration and education service beta assessment report

The report from the beta assessment of the Civil Aviation Authority's drone registration and education service on 27 August 2019.

From: GDS
Assessment date: 27/08/2019
Stage: Beta
Result: Met
Service provider: Civil Aviation Authority

Beta assessment report

Service description

The Civil Aviation Authority (CAA) is creating a new service to help people use unmanned aircraft (drones and model aircraft) in a safe and legal way. The service is made up of registration, learning and testing elements, and is due to go live on 1 October 2019.

The services are being delivered following the 2018 amendment to the Air Navigation Order (ANO), which made a series of changes to how people are legally allowed to use unmanned aircraft. The new regulations include requirements that:

  • any person or organisation that is responsible for an unmanned aircraft must register with the CAA and label all their unmanned aircraft with their registration number
  • any person who wishes to fly an unmanned aircraft must first prove their competency to the CAA
  • any person who wishes to fly an unmanned aircraft must now abide by a series of new and different rules when using their unmanned aircraft

The purpose of these new requirements is to:

  • reduce air safety incidents
  • improve accountability and risk awareness among users
  • support enforcement actions against illegal use
  • reduce privacy risks associated with the filming involved in drone usage

In response to these requirements, the CAA undertook user research to identify the needs of the users who will be affected by the regulations. This demonstrated the need to build a service with the following core components:

  • an education service to help users understand how to fly safely and legally
  • a testing service that assesses a flyer’s knowledge about flying an unmanned aircraft safely and legally in the UK
  • a registration service that enables users to fulfil their legal requirement to register that they are responsible for an unmanned aircraft

Service users

We have identified eight broad groups of service users:

Drone enthusiasts

Proficient and engaged drone users who frequently go out flying. Their main interest is in the flying of drones – whether for money, racing, or fun. They are working age, digitally and technologically proficient, and engage with each other and the community both digitally and face to face. They are overwhelmingly male.

Model aircraft flyers

Proficient and engaged model aircraft flyers who regularly go out flying. They are generally well trained in how to fly safely. These users fly as a hobby with a social element where they get to engage with like-minded people. They primarily engage with other users in person at club meet-ups. They are less digitally aware, overwhelmingly male and of an older demographic.

Disengaged drone owners

Rarely fly their drones. When they do fly, it is primarily for associated interests, such as photography. Sometimes they will fly commercially; sometimes personally. They are a mix of ages and gender and are of a wealthier socio-economic bracket. Generally, they do not engage with other users.

People seriously considering getting a drone

These users take a measured approach to getting a drone, and thoroughly research the complexities and costs involved, out of respect for the complexity of the hobby (drones can be expensive, and becoming proficient takes investment). They are generally of working age. They will be considering drones out of an interest in flying, in the technology involved, or in relation to another hobby, such as photography. They differ from disengaged drone owners in that they invest time in learning about the hobby from other sources, generally found online.

Impromptu fliers

This group of users has never used a drone before but has the chance to fly one. For example, they may have:

  • borrowed a drone from a friend
  • been given an unexpected opportunity to fly a drone
  • made an unplanned purchase of a drone
  • been given a drone as a gift.

They likely form the largest group of potential users.

People responsible for an aircraft that will be used by others

People in this group are responsible for looking after drones and for making sure that anyone flying the drone flies responsibly and meets the competency requirement. This group includes people like parents, teachers and managers, across all working-age demographics. They may not fly a drone themselves.

Under 18s (sub-group)

These users are a sub-group of any of the other groups (other than ‘People responsible for a drone that will be used by others’). They will only fly drones: the regulations mean they cannot be registered as responsible for a drone. They fly drones as a social activity. They include some of the best drone flyers in the world – the current world champion is 12 years old! They are digitally competent and familiar with online learning.

Verifiers

Responsible for checking someone else’s drone registration status and/or competency to fly. They include people such as: insurers, police and law enforcers, event organisers, clients of drone companies and employers of drone users. Their single need is to verify that a third party meets any legal requirements in relation to drones – for example, in order to make sure that the verifier’s own insurance is valid or that they have the relevant permissions.

Covering advisory note

The drone registration and education service (DRES) is a well-designed, user-centred product. It is a testament to the team's attitude and ability that they have made such progress since their first alpha assessment, and that they have produced such an excellent service with an impressively high first-time completion rate.

During the assessment, the service team explained that the desired outcomes of the new legislation are to:

  • reduce drone incidents
  • improve accountability and risk awareness
  • support enforcement
  • reduce privacy risks

The team set out the much broader context and ecosystem within which DRES sits, alongside a roadmap of planned activities that will ultimately lead to the realisation of the above outcomes. This roadmap includes ensuring adequate enforcement regulation and provision, through work with law enforcement and through policy work further back in the drone production chain, such as with manufacturers. An example is ensuring that each drone is uniquely identifiable, so that it can be registered by its purchaser.

The panel wanted to highlight this context and roadmap, and to accentuate the importance of the CAA continuing to support the programme of work.

As the service team pointed out, this service is a first step along this journey; having an easy and intuitive way to register drones (and, in turn, the existence of a register) and to educate those who possess them is a prerequisite for enforcement and other activities. However, the team also rightly identified that this service will not be able to achieve the aforementioned outcomes single-handedly.

For this reason, there is a risk that, if the CAA and others de-prioritise the future activities that fulfil the roadmap, or the ongoing development of this service, the greatest value for taxpayers' money will not be realised, and much of this work (and therefore money) could be wasted. The value of the whole is greater than the sum of its parts, so to speak.

This note, therefore, simply acts as a warning to decision-makers that they must prioritise this ongoing, excellent work if they are to be successful. Keep up the good work!

1. Understand user needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have clearly done extensive research both to understand their users and to iteratively improve service prototypes in a user-centred, data-driven way
  • this is some of the most inclusive research the panel has seen come through an assessment; the team has included users with a range of digital capability, confidence and access needs throughout the research to meet users' needs
  • assisted digital needs have been well considered
  • the team will continue to research the use of images to teach practical skills in an online environment.

What the team needs to explore

Before their next assessment, the team needs to:

Primary

  • understand whether the service is meeting the current and near-future needs of the police, and develop enforcement processes together. Does this service enable both the CAA and the Home Office to reduce drone incidents and support enforcement actions? Is the service having an impact on reducing privacy risks?
  • understand if they are meeting the needs of organisational users as they have different behaviours to private individuals
  • test the privacy statement to ensure users understand how their personal information is being stored and used
  • understand how much impact the competency test has on safe flying
  • work with policy, manufacturers and retailers to investigate requiring all drones and model aircraft to have a serial number, which can be associated with a registered operator or flyer
  • work with Border Force to explore options for enforcement when drones are brought in from other countries.

Secondary

  • understand if the timescales involved in offline processes for registration and the competency test are meeting user needs.

2. Do ongoing user research

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has included and tracked different levels of digital capability, confidence and access needs and plans to continue to include a range of users throughout public beta and into the continuous improvement cycle when the service is live
  • regular user research is planned from public beta and live, with a well established cycle of turning insight into actions
  • an appropriate variety of user research methodologies has been used throughout the development cycle across the user groups, who have been well represented with an appropriate number of users
  • the team has considered the GDPR implications of the tools they are using to collect data
  • the team will have access to and use of data from the call centre, satisfaction data, analytics and so on: a wide variety of data that can be used to improve the service.

What the team needs to explore

Before their next assessment, the team needs to:

  • iteratively research with organisational users to ensure that organisations of different sizes, with different behaviours and needs, are being served, and consider whether additional personas are needed
  • research and use analytics to track what impact the service is having on reducing drone incidents, improving accountability and risk awareness, supporting enforcement actions and reducing privacy risks
  • work with the CAA communications team to assess the impact of communications: are they driving the disengaged and impromptu users in particular to register?
  • research to understand the size of the fraud risk for those not participating in the trust model, both in terms of uptake and of knowing that the person who has taken the test is the flyer or operator
  • research to ensure the full end-to-end journey has been tested, from communications to registration and enforcement
  • further research to ensure that internal users’ needs are being met
  • build on the plans to include the labelling of drones with contextual research into how real users label them, and use this to inform how the information is presented in the digital service.

3. Have a multidisciplinary team

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has a full multi-disciplinary team, who seem to be well empowered to carry out their roles effectively
  • the team has clearly acted on guidance given in earlier assessments, with particular regard to having more content design time
  • while the contracted company will no longer support the product beyond October, there are people and processes in place to continue doing good work in the right way.

4. Use agile methods

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • as explored in the user research section, the team's understanding of, and design based on, user needs is exceptional
  • there was clear and consistent demonstration that the team is willing and able to make changes to the design, both simple content changes and fundamental service design, based on new findings from user research
  • the team demonstrated strong, collaborative, hypothesis-driven research and design processes. It was good to see how co-creation methods had been used
  • during alpha, the team tested the assumptions with the greatest impact on the user journey, and has focussed on iterating this user journey during beta.

5. Iterate and improve frequently

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team showed examples where they had successfully iterated content based on research. They had done a lot of exploration into the way they ask users whether they are drone operators or drone flyers; through several iterations during beta they found that ‘responsible person’ was the language that worked best for users.
  • content and interaction design worked well together to include guidance within the service. For example, the team found users did not read relevant content within progressive disclosure elements, but discovered that when it was presented as a hint beneath the relevant radio button choices, users read the information they needed.
  • when research showed multiple choice answers did not match users’ mental models for the test element of the service, the content was rewritten to work as single choice answers. This is an example of doing the hard work to make it simple.

What the team needs to explore

Before their next assessment, the team needs to:

  • focus on what users need and not what they want; this is relevant to possible future features like PDF certificates, where there could be a much simpler solution to the underlying need.
  • improve the successful payment page, where it was noted that users often drop out too soon. The team could change the content on this page to something more active, like “Continue your registration”. As this page only exists in the journey for technical reasons, there may also be ways to remove it.
  • reconsider measurement conventions: the team explained that height is traditionally measured in feet while distance is in metres. The team should consider whether this makes sense and whether there is a way to improve how the service talks about measurements (even if it means breaking norms).

6. Evaluate tools and systems

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has put a lot of thought into the technology they have selected, particularly in their decision to use GOV.UK PaaS and how they leveraged it to build their seamless CI/CD model in conjunction with Azure DevOps. They have also made pragmatic decisions, such as moving the use of JavaScript to the backlog to make sure it is given the proper level of research and testing
  • their CI/CD pipeline is very impressive and was met with approval from the CAA, with whom they hope to share it and broaden its usage. It is a well-designed pattern that can easily be adopted by other projects and programmes with the same or a similar stack
  • the service team also has a robust feature pipeline. Of particular interest was the API service, which would benefit not only the police but also insurance providers, and is of interest to the Department for Transport as well.

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure, when the team begins building the JavaScript components, that they continue their good practices with regard to accessibility testing, so that all users’ needs are met
  • undergo an additional accessibility audit when the JavaScript components are ready
  • consider, when the team starts looking at the Experian integration, who this validation will cover: whether it will cover just one subset of users, many, or fall short of what is needed for a robust validation check. Of particular note are foreign flyers or operators, who may not be in Experian’s datasets, and the risk that could pose.

7. Understand security and privacy issues

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has put a lot of thought and effort into their security architecture, as evidenced not only by their penetration test results but also their detailed plan surrounding threat vectors, PII data, development practices and ongoing plans for monitoring the system via a managed security operations centre (SOC)
  • the service team has security at the forefront of their minds and has a security consultant working with them to ensure that threats, risks and security issues are managed and minimised.

What the team needs to explore

Before their next assessment, the team needs to:

  • continue their good work: when they add new components and features, such as the JavaScript enhancements, they should have another penetration test and security review to maintain their high standard of security practice
  • note that the team explained their plan to run a soft verification check on users to gather data on completion rates and, if successful, to start implementing a stronger verification check, which would need to include document uploads. Stronger checking processes are likely to reduce completion rates, so it is not clear how useful this planned experiment would be. If the team decides to check identities, they or any suppliers must follow GPG 45, the government standard for identity proofing and verification.

8. Make all new source code open

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team have evaluated the security concerns around exposing the internal workings of the application and have made rational decisions around putting a limited amount of code on public GitHub.

What the team needs to explore

Before their next assessment, the team needs to:

  • explore publishing the code as open source, and explore its potential interoperability with other regulatory departments.

9. Use open standards and common platforms

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has used a raft of common platforms (GOV.UK PaaS, Notify, Pay and the Design System, with its patterns and components), allowing them to get on with solving the harder problems.
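
For illustration, the kind of integration these platforms enable is small and well documented. Below is a minimal sketch of sending a confirmation email through GOV.UK Notify's real Python client (notifications-python-client); the API key, template ID and personalisation values are placeholders, not the service's actual configuration:

    # Minimal sketch: sending a registration confirmation via GOV.UK Notify.
    # The API key, template ID and personalisation values below are
    # illustrative placeholders, not DRES's actual configuration.
    from notifications_python_client.notifications import NotificationsAPIClient

    notify_client = NotificationsAPIClient("<notify-api-key>")  # placeholder key

    notify_client.send_email_notification(
        email_address="flyer@example.com",   # placeholder recipient
        template_id="<notify-template-id>",  # placeholder Notify template
        personalisation={"registration_number": "OP-XXXXXXXX"},  # placeholder
    )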

What the team needs to explore

Before their next assessment, the team needs to:

  • have another look at GOV.UK Verify, when exploring further identity checks, as an alternative route or an elevated verification check for UK residents. This comes with the same caveat as above: it must be assessed whether such a verification check will provide much value.

10. Test the end-to-end service

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has ensured that automated testing was built into the pipeline, giving them the ability to test scenarios seamlessly. This allowed them to take down integrated services while testing to assess the impact (a minimal sketch of this pattern follows this list)
  • though they’ve not been able to do research using the prepared comms material, the team is able to feed research findings into the campaign, working closely to maintain consistency of language and imagery and align with the Government Communications Service (GCS) guidelines
  • the team is exploring other entry points, including the drone technology itself directing users to check the CAA rules.
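
As referenced in the first point above, here is a minimal, self-contained sketch of the "take an integration down and assess the impact" testing pattern. The check_identity function and its statuses are hypothetical stand-ins, not the team's actual code:

    # Minimal sketch of testing behaviour when an integrated service is down.
    # check_identity and its statuses are hypothetical stand-ins.
    import unittest
    from unittest import mock


    def check_identity(client, applicant):
        # 'client' is any object with a verify(applicant) method, standing in
        # for an external integration such as an identity checker.
        try:
            return "verified" if client.verify(applicant) else "rejected"
        except ConnectionError:
            # Degrade gracefully: queue the check rather than fail the journey.
            return "pending_identity_check"


    class IntegrationOutageTest(unittest.TestCase):
        def test_outage_degrades_gracefully(self):
            broken = mock.Mock()
            broken.verify.side_effect = ConnectionError("service unreachable")
            self.assertEqual(
                check_identity(broken, {"name": "Test Flyer"}),
                "pending_identity_check",
            )


    if __name__ == "__main__":
        unittest.main()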

What the team needs to explore

Before their next assessment, the team needs to:

  • continue with the end-to-end testing practices the team has built when introducing new integrations or services like Experian
  • ensure that when the team begins to work on extending the service to other parties (the police, insurance providers, DfT and so on), those parties are included in test plans and engaged during requirements gathering to help feed into the development of the service.

11. Make a plan for being offline

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has put together a comprehensive plan for offline support. They have taken into account previous recommendations and have gone a step further to ensure a seamless offline journey
  • the service team decided to de-risk the support centre by using a third party already used by the CAA to capture volumes and metrics, so they can assess what will be necessary for in-house support
  • the journey for an offline user was a great highlight of their plans: a user would call the call centre and request the flyer and/or operator licence. A pack would then be posted out, including a test for the flyer; after completion, it would be posted back for marking. If the test did not achieve a passing mark, the support team would simply send the results back with a different version of the test. This, coupled with the features of the administrative module, gives great confidence that even an entirely offline user is supported.

12. Make sure users succeed first time

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have a 20% target for researching with users with assisted digital needs. They expect users with low digital confidence and assisted digital needs will be able to get support from friends, family or flying clubs. The team are sharing guidance with club ‘heads’ to make this possible. If users do not have support or access to clubs, they can request the test and education material by post
  • the team presented a 99% first-time success rate, having made huge improvements over the phases of development.

What the team needs to explore

Before their next assessment, the team needs to:

  • further explore how the paper service affects the user experience.

13. Make the user experience consistent with GOV.UK

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the service maintains consistency of user experience with other government services while still representing the CAA brand.

What the team needs to explore

Before their next assessment, the team needs to:

  • continue looking into implementing the country picker element from the Design System; as a first step, the team has currently used a version that does not require any JavaScript.

14. Encourage everyone to use the digital service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team will be working with the communications partners to help make sure the campaign directs users to the start of this service.

15. Collect performance data

Decision

The service met point 15 of the Standard.

What the team has done well

  • there is no ‘live’ data from service users yet, but the team outlined what data they will focus on and how they will collect it
  • the team will closely monitor which questions users get wrong, and how/whether this should affect the guidance or the wording of questions
  • the team was very keen to ensure data from offline users was captured in the same way as for online users, and compared where relevant. For example, they have processes in place to capture the responses to questions from offline users and to identify where users gave the wrong answers
  • data from call centres will be collected and used by the team so that common questions or areas of confusion can be addressed by guidance
  • the panel was pleased at the efforts the team had made in calculating a cost per transaction for the service (the standard definition is noted after this list)
  • standard online tracking, such as completion rates, where users drop out, and the devices they use, has been implemented.
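
For reference, the standard definition of this mandatory KPI is:

    cost per transaction = total cost of operating the service ÷ number of completed transactions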

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure it has sufficient ability to implement advanced Google Analytics tracking of users. The service will need to be tracked in ways which identify problems and trends, allowing the team to measure the success of any changes they make.
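
For illustration only, one way to supplement standard page tracking is to record custom events server-side through the Google Analytics Measurement Protocol (a real Google endpoint); in the sketch below, the tracking ID and event names are placeholders, not the team's actual configuration:

    # Minimal sketch: recording a custom event server-side via the Google
    # Analytics Measurement Protocol. The tracking ID and event names are
    # illustrative placeholders.
    import uuid

    import requests

    GA_COLLECT_URL = "https://www.google-analytics.com/collect"
    TRACKING_ID = "UA-XXXXXXXX-Y"  # placeholder GA property ID


    def track_event(client_id, category, action, label=None):
        payload = {
            "v": "1",            # Measurement Protocol version
            "tid": TRACKING_ID,  # GA property the hit is sent to
            "cid": client_id,    # anonymous client identifier
            "t": "event",        # hit type
            "ec": category,      # event category
            "ea": action,        # event action
        }
        if label:
            payload["el"] = label  # optional event label
        requests.post(GA_COLLECT_URL, data=payload, timeout=2)


    # Example: log that a user abandoned the journey on the payment success page.
    track_event(str(uuid.uuid4()), "registration", "payment_success_dropout")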

16. Identify performance indicators

Decision

The service met point 16 of the Standard.

What the team has done well

  • the team suggested several metrics for evaluating the success of people using the drone registration service, mostly around the 4 mandatory KPIs (cost per transaction, user satisfaction, completion rate and digital take-up).

What the team needs to explore

  • the team should aim to make a stronger connection between the purpose of the service, the needs of users, and the data it uses to evaluate success (see the covering advisory note)
  • many of the existing performance measures outlined were the standard GDS KPIs, but the panel would like to see more focus on metrics specifically designed to evaluate the success of the drone registration service
  • for example, among the stated aims were for the service to become an authoritative source of information on drone owners and users, and to improve education among drone users, but there were no clear plans for using data to demonstrate how well the service fulfils these roles
  • the team mentioned the possibility of working more closely with the police to get information on the use of non-registered drones. There was also the suggestion of monitoring CAA drone-related incidents, with the intention of correlating these with efforts to improve education about drone use. The panel would welcome more work in this area.

17. Report performance data on the Performance Platform

Decision

The service met point 17 of the Standard.

What the team has done well

  • the team has been in touch with the performance platform team, and is in the process of setting up a dashboard. It has also started considering how the data will be sent in the future, and how this can be automated
  • the team explained how the mandatory KPIs have been applied to the service, and outlined how they will be collecting and calculating these metrics.

What the team needs to explore

  • the team needs to ensure the dashboard is set up and maintained, and that contextual notes are provided to help users interpret the data.

18. Test with the minister

Decision

The service met point 18 of the Standard.

What the team has done well

The panel was impressed that:

  • the service has been tested with two ministers
  • the team has ensured continuous engagement with ministerial and policy teams.
Published 21 October 2019