Independent report

Interim reports from the Centre for Data Ethics and Innovation

The Centre for Data Ethics and Innovation has published interim reports on its two major reviews into online targeting and bias in algorithmic decision-making.

Details

The CDEI has published interim reports setting out our approach to exploring the issues of online targeting and bias in algorithmic decision-making. The reports also detail our progress to date and our emerging insights.

Purpose of the reviews: why targeting and bias?

In 2018, the Government consulted on the CDEI and proposed six themes where we could undertake projects to strengthen the governance of data-driven technology. These were: targeting, fairness, transparency, liability, data access, and intellectual property and ownership. Of these, two areas, targeting and fairness, were identified as requiring immediate attention.

The CDEI agreed that ‘targeting’ and ‘fairness’ were important priority areas, and worked with the government to agree the focus of the reviews as ‘online targeting’ and ‘bias in algorithmic decision-making’. These two issues were seen as priorities given the impact that data-driven technology is already having on individuals and society in relation to both targeting and bias, coupled with the need to better understand how these areas are governed.

About the interim reports

These reports are intended to provide an update on our progress to date regarding these two reviews. The reviews have been informed by:

  • research undertaken by our policy teams
  • stakeholder engagement with government, regulators, industry and the public
  • calls for evidence
  • landscape summaries, carried out by academics to assess the academic and policy landscape in each of these areas

The CDEI is in a unique position to explore the issues of bias and targeting, given our independence, our cross-sector remit, and our mandate to provide recommendations to government. We have purposely taken different approaches to the reviews: the targeting review focuses on specific themes within online targeting and the bias review focuses on specific sectors. This allows us to understand and analyse the landscape through various lenses, with a view to developing robust and thorough recommendations in our final reports.

At the highest level, the reviews are addressing the following types of questions:

  1. Where is the use of technology out of line with public values or the norms defined by our laws and regulations?
  2. Where can technology positively reinforce these values and address societal issues?
  3. Where does law and regulation need to be strengthened? Where might existing rules hinder positive innovation? Where do regulators need new skills and capacities to address issues?
  4. How can ethics be built into innovation and innovation be directed towards supporting ethics?
  5. Where are we failing to make use of the benefits of data-driven technology because we have failed to resolve ethical tensions or to provide sufficient clarity to innovators?

We are working closely with key stakeholders on these reviews, and we are grateful to all individuals and organisations who have supported our work to date, including those who have responded to our calls for evidence.

The CDEI’s work programme

As well as these two major reviews, the CDEI is working on:

  • a series of snapshot papers on deep fakes, AI and insurance, smart speakers and facial recognition technology to be published from late August 2019
  • a report on ethical frameworks for data sharing, to be published in autumn 2019
  • a further series of snapshot papers for publication from January 2020

Read more information about our 2019/2020 work programme.

The CDEI’s research and reports are informed by our developing thoughts on ethical and governance principles for data-driven technology. Read more about this here.

Published 19 July 2019
Last updated 25 July 2019
  1. We have updated the timing of future 'snapshot' paper publications to August 2019 for the first series, and January 2020 for the second series.

  2. First published.