Report A Cyber Incident

The service allows UK organisations to report cyber incidents and crimes to law enforcement and the National Cyber Security Centre (NCSC).

Service Standard assessment report

From: Central Digital and Data Office
Assessment date: 6 May 2020
Stage: Alpha
Result: Met
Service provider: National Cyber Security Centre

Service description

The service allows UK organisations to report cyber incidents and crimes to law enforcement and the National Cyber Security Centre (NCSC). It simplifies the reporting landscape for users, removing the need to self-select between law enforcement and NCSC reporting channels. The service will offer guidance to organisations and forward reports to the relevant triage agencies based on incident information and policy. Organisations can submit a meaningful report using simple language, while agencies will receive improved information. The service will integrate with a triage solution, which enables greater collaboration between law enforcement agencies and NCSC for report handling and onward investigation.

Service users

The service will be available to anyone who wishes to report a cyber incident on behalf of an organisation, and to members of the public looking for advice and guidance on where to report a cyber incident they have suffered.

Primary user groups:

  • small and medium-sized organisation reporters: owners and managers of small businesses with a range of technical literacy and support needs
  • large organisation reporters: compliance managers and IT/cyber security experts with high levels of technical literacy who are familiar with reporting benefits and obligations

Secondary user groups:

  • individuals reporting a cyber incident: members of the public who have suffered a cyber incident, needing advice and guidance on where and how to report this to authorities
  • triage agents from NCSC, NFIB and NCA: use the completed incident report forms to perform agency triage and dissemination to determine next steps

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has done a significant amount of research with a very wide range of different users, in particular with reporters of cyber incidents, and has also done some user research with internal users (such as triage agents)
  • the team has written user needs for primary and secondary users and been clear about how the service will work for them
  • the team has a good process and methodology in place for understanding the needs of reporters and testing the service with them
  • the team has done accessibility research on the service
  • the team has done lots of discovery research to understand the pain points of a wide range of users

What the team needs to explore

Before their next assessment, the team needs to:

  • take the user research focus they have had for reporters and apply that to internal users impacted by the changes they are making
  • make a clearer distinction between business analysis and user research. Although the needs of internal users were considered in the discovery, much of that detail has been lost as the team has developed their alpha, and agent needs have not been spelt out clearly, so it is not clear how the service will impact them as users. This is partly because the way the team is working has blurred the distinction between business analysis and user research. The team must acknowledge the agent needs which relate to the service they are building (the people who receive the data being sent, those who step in when it does not work, and those whose jobs they are trying to help). To do this they must write user needs for internal users and do research with them
  • the team must do ongoing user research with internal users. It is great that they are working with internal stakeholders, but internal teams are also users of the service (directly and indirectly) and so the team must prototype and test their assumptions with these users
  • the team should be clearer about the reporter user needs the service is meeting and how. This was very clear for primary users, but not clear at all for secondary users. For example, individuals being redirected to Action Fraud is a user journey which should be tested and a user need which should be called out to see how well it is being met
  • the team should look more at users with low cyber literacy, as well as expanding their accessibility research and thinking about other groups the service might be excluding or making things harder for
  • demonstrate how the team’s understanding of user needs is used to prioritise new work
  • consider broadening research methods to understand more about user needs and how well the service is working, like contextual research

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the service sits across a number of different organisations in the Fortis group and one of the drivers for the project is enabling agency collaboration
  • the introduction of a single reference number has allowed shared visibility of a user’s incident report between agencies
  • the team are exploring prevention by understanding the value of sharing report data with other businesses
  • the service’s automated suggested severity triage has a success rate of 87%
  • the team’s cross-agency discovery and alpha covered blockers to inter-agency collaboration, governance, service management, detailed process mapping, agreeing new processes for managing incidents and engagement to ensure all organisations were moving along on the journey together

What the team needs to explore

Before their next assessment, the team needs to:

  • understand and present agent user needs
  • demonstrate how they have been incorporating research and testing with agency users into their fortnightly sprint cycle
  • understand and present the needs of the ‘novice’ user group who may be at risk of being excluded from the service because their digital capabilities are lower
  • make a plan to work with the agencies that issue updates to users about further action taken and understand whether or not it is sufficient to meet their needs
  • continue to work with Action Fraud to make sure that users (for example, organisation users) are routed correctly from the Action Fraud service to the NCSC service
  • think more about the users who are implicated in their minimum viable product, clearly articulate their needs and how the service is planning to meet them, and plan design and research activities to find the best way to meet these needs within the constraints the team has

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have involved other cybercrime service providers and gathered analytical information about where users start their journeys
  • the team have conducted surveys with users about what names they give to things
  • if the service goes down it will still display guidance about reporting cyber incidents
  • the team understand the behaviour of users when communication channels have been compromised, defaulting back to personal email and using their mobile phones if the network has been affected

What the team needs to explore

Before their next assessment, the team needs to:

  • better understand low capability users and the channels they rely on
  • test the assumption that low literacy users will not want to report anything
  • provide the insight and evidence to back up the decision that they do not need a phone line into the service
  • understand the needs of users relying on the Action Fraud phone line (or provide evidence if this channel was investigated by the team in their discovery)

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • over the lifecycle of the project the team have tried different brands, experimenting with NCSC styles and eventually choosing the GOV.UK style
  • the team have embraced the ‘one thing per page’ principle
  • the project initially scoped out a portal where users would log in and return, and earlier prototypes had complex UI - all of this has been discarded or simplified through testing and iteration
  • the content used in the service has been analysed and determined appropriate for the reading age of its primary users
  • the beta build is going to be mobile first
  • the content upfront sets clear expectations about the structure and stages of reporting a cyber incident
  • the design process has gone from wireframes to low fidelity to high fidelity prototypes
  • the team have looked at and learned from cyber incident reporting approaches in other countries

What the team needs to explore

Before their next assessment, the team needs to:

  • demonstrate how confirmations, receipts and reference numbers work for users - the panel heard that users get a summary of answers and a message saying their report is being looked at, but these points in the user journey were not presented at the assessment
  • provide information about the success of the hand-off to other agencies, and evidence that any change in styles or branding from GOV.UK to another agency will not introduce a problem for users
  • identify the extent to which users see reporting incidents of fraud, cyber and information breach as one task or separate tasks
  • continue to work towards solving the complexity of the system, simplifying user journeys and merging these transactions for users where appropriate
  • present the team’s goal or aspiration at the next assessment, keeping records and evidence of the influencing that is happening and the progress being made
  • the team should consider getting a dedicated content designer. The subject area is complex and there are many different needs to meet with the service
  • ensure the content works for all users of the service, including those who will be linked elsewhere (and not just those who can report using the service). This includes reviewing the service name: is “Report a Cyber Incident” something low-expertise users understand? The name of the service needs to cover individuals reporting a range of different fraud-related things - do they know what a cyber incident is, and why has the team chosen ‘cyber incident’ over ‘fraud’?
  • show research questions which look to test the assumption that this longer form is the best way to deliver the service. How will it impact submission rates? What if users cannot answer questions? The team should explore and understand scenarios where users might not know the answers

5. Make sure everyone can use the service

Decision

The service did not meet point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team understand the accessibility issues with dropdown menus, in particular the company lists
  • the team have been using the empathy lab at GDS to test the service
  • the team has a plan in beta to do more research with users with low digital literacy, users of assistive technology and assisted digital support

What the team needs to explore

Before their next assessment, the team needs to:

  • present their research plan for accessibility and inclusion - this needs to be more detailed and robust ahead of the start of beta
  • ensure the design of the service is not going to be a problem in the ‘real world’, in particular considering the length of the form and the potential for users being interrupted, connections coming and going, and users needing to go away and find the information being requested by the service
  • understand users with lower digital literacy - the service is designed around the majority user group, which risks not meeting the needs of users with more complex needs outside this group
  • find ways to reach users with lower digital literacy remotely (due to COVID-19) - the panel recommends reaching out to the wider user research community for advice
  • ensure the service works for secondary users. The service has a limited scope for secondary users, but still plans to offer them limited use, and the team must ensure that these parts of the service work for those users - for example, individuals sent to Action Fraud
  • make sure the service works for users who might otherwise find it hard to use. The team should prioritise research with users who might find it particularly difficult to use the service, for example users with low cyber literacy and users with low/no prior awareness of reporting and how to do it
  • make sure the whole service works for internal users. The way the team is currently working will not reduce the risk that some internal users are excluded either because they find it difficult to use the service or because it does not meet their needs

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team make-up and governance structures were clear
  • the team is fully multidisciplinary

What the team needs to explore

Before their next assessment, the team needs to:

  • investigate civil servant (CS) departmental representation within the project. There is some concern over the ratio of contractors to CSs, and over future support and development being directly assigned to the incumbent supplier. The panel asks that a departmental CS undertakes a forward look at how contractor resources and dependencies can be minimised post-launch, as the panel would expect this work to be insourced (with appropriate knowledge transfer) rather than contractor delivered
  • have a dedicated content designer or show the thinking if they feel it’s not necessary

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team uses traditional agile collaboration and ceremonies

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • a user research -> plan -> prioritise -> develop -> review -> iterate cycle was demonstrated
  • a user research model is used to help inform iteration and review

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure the cycle above continues and ongoing user research feeds into and helps shape each iteration
  • ensure that the cycle of iterating and improving is in place for all user journeys included in the service (see other points about user journeys not focused on in Alpha - in particular internal users and individuals)
  • ensure the cycle of iterating and improving is in place for users who might find it hard to use the service (including people with access needs and users with low cyber literacy)

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • good architecture setup using AWS serverless concepts, with a function-based architecture throughout the solution whereby each function has its own module
  • data captured through the service is validated before being passed through to other services, and various security testing has been done to ensure the data is in good shape, giving the recipient agencies assurance they can rely on (a sketch of this validation pattern follows this list)
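
As a purely illustrative sketch of this function-per-module validation pattern (the field names, validation rules and handler shape below are assumptions for illustration, not the team’s actual implementation), a single-purpose AWS Lambda handler might validate an incident report before passing it on:

```python
import json
import re

# Hypothetical allow-list of fields a report may contain (assumed for this sketch).
ALLOWED_FIELDS = {"organisation_name", "contact_email", "incident_description"}
EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")


def validate_report(report):
    """Return a list of validation errors for an incident report."""
    errors = []
    unexpected = set(report) - ALLOWED_FIELDS
    if unexpected:
        errors.append("unexpected fields: %s" % sorted(unexpected))
    if not EMAIL_PATTERN.match(report.get("contact_email", "")):
        errors.append("contact_email is not a valid email address")
    if not report.get("incident_description", "").strip():
        errors.append("incident_description must not be empty")
    return errors


def handler(event, context):
    """Single-purpose function: validate one report, then hand it on."""
    report = json.loads(event.get("body") or "{}")
    errors = validate_report(report)
    if errors:
        return {"statusCode": 400, "body": json.dumps({"errors": errors})}
    # In the real service the validated report would be forwarded to the
    # relevant triage agency here (for example via a queue) - omitted in this sketch.
    return {"statusCode": 202, "body": json.dumps({"status": "accepted"})}
```

Validating at the boundary like this is what lets the receiving agencies treat forwarded reports as well-formed.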

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that constant monitoring is in place, with the right skillset in the team to react to incidents. The current team may have this capability, but focus needs to be put on the team that will support the service in a production environment, which will also need to align with strategic processes and goals (an illustrative monitoring sketch follows this list)
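
As one hedged illustration of what baseline production monitoring could look like (the alarm name, function name, thresholds and SNS topic below are placeholders assumed for this sketch, not the team’s configuration), a CloudWatch alarm on Lambda errors can notify a support channel:

```python
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="eu-west-2")

# Hypothetical alarm: notify the support team if the report-handling
# function records any errors across two consecutive 5-minute periods.
cloudwatch.put_metric_alarm(
    AlarmName="report-cyber-incident-errors",  # assumed name
    Namespace="AWS/Lambda",
    MetricName="Errors",
    Dimensions=[{"Name": "FunctionName", "Value": "submit-report"}],  # assumed function
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=2,
    Threshold=1,
    ComparisonOperator="GreaterThanOrEqualToThreshold",
    AlarmActions=["arn:aws:sns:eu-west-2:123456789012:support-alerts"],  # placeholder ARN
)
```

Alarms like this only cover detection; the point above is that someone with the right skills must own the response once the service is in production.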

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • performance indicators have been categorised into three main categories - effectiveness, efficiency and satisfaction - which has given the team a good understanding of what they need to capture and focus on as a priority
  • these categories fed into the team’s decision making about choosing the right tools and designing the right management information reports

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that performance analytics is used in conjunction with user research to continue to iterate the service
  • ensure that performance data is linked to user needs to indicate how well the service is meeting those needs, for reporters and internal users

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • a very good development pipeline is in place, covering frontend, infrastructure and backend
  • the service is highly resilient as it is hosted across multiple availability zones
  • decisions regarding tools, technology and architecture are run through the NCSC central ‘design authority’ and documented - this has fed into the team’s critical decision making, for example when some components were moved from Azure to AWS

What the team needs to explore

Before their next assessment, the team needs to:

  • manage technical debt and understand how much of it there is - the team needs to review redundant components and functionality in the development environments, which count as tech debt, especially since the development team is planning to expand
  • recognise that technical debt left in place for a long period may also impact the deployment pipeline, or even application performance, so the team needs to focus on this too

12. Make new source code open

Decision

The service did not meet point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • discussions have started with the security team to see what can be done to make the code open, given its sensitivity

What the team needs to explore

Before their next assessment, the team needs to:

  • explore whether the common NCSC components this service makes use of can be made open - further discussion needs to take place and be agreed on this
  • separate the source code used for functionality from the configuration files which enable sending data to external recipients such as NCSC (see the sketch after this list)
  • confirm that the intellectual property sits with NCSC, so the right teams are involved in the discussions about making the source code open, given that the end-to-end service is fairly simple from a technical point of view
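
As a minimal sketch of that separation (the environment variable names and settings below are assumptions for illustration, not the service’s actual configuration), deployment-specific values such as forwarding endpoints and credentials can be read from the environment at runtime, so the source code itself contains nothing sensitive and is easier to publish:

```python
import os


def build_forwarding_config():
    """Read deployment-specific settings from the environment.

    The variable names here are illustrative; in practice a closed
    configuration file or secrets store would supply the real values
    per environment, while this code could be published openly.
    """
    return {
        "triage_endpoint": os.environ["TRIAGE_ENDPOINT_URL"],  # assumed variable
        "api_key": os.environ["TRIAGE_API_KEY"],  # kept out of source control
        "timeout_seconds": int(os.environ.get("FORWARD_TIMEOUT", "30")),
    }
```

Keeping destinations and keys out of the repository means the question of opening the code can be decided on the logic alone.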

13. Use and contribute to common standards, components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is contributing back by sharing which serverless technology was used for the service
  • the team is making use of many components from NCSC and from across government

What the team needs to explore

Before their next assessment, the team needs to:

  • talk to the GDS cyber security team - the service team expressed interest in this, the panel supports it, and an action has been taken to put the team in touch with the right teams at GDS

14. Operate a reliable service

Decision

The service did not meet point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has made good progress in understanding what is needed for a successful support team when the service goes live
  • the solution for this service is designed in a decoupled way, with the right architecture to monitor, report on and release individual components

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that a support agreement is reached and in place before the service goes fully live, whether that agreement is with NCSC or with third-party partners who will be able to provide support
  • explore the options for having the right team in place and agree funding for support, whether internal or external capability is chosen

Updates to this page

Published 9 June 2020