What is the CDEI Bias review?
As part of the CDEI’s 2019/2020 Work Programme, we are undertaking two reviews. The first focuses on Bias in Algorithmic Decision-Making, exploring bias in four key sectors: policing, financial services, recruitment and local government.
What was in the Call for Evidence?
The Call for Evidence document was published on 7 May 2019 and invited submissions on four question areas: the use of algorithmic tools; bias identification and mitigation; public engagement; and regulation and governance.
How many responses did you receive, and from whom?
The CDEI received 52 responses from across academia, civil society, industry and the public sector:
- 11 responses were in relation to policing
- 14 responses were in relation to financial services
- 3 responses were in relation to recruitment
- 5 responses were in relation to local government
- 19 had no specific sector focus
What themes did you identify in the responses?
There was a general acknowledgement that, as the use of big data and machine learning in this space increases, the importance of addressing potential issues of algorithmic bias also grows.
At the same time, respondents agreed that many of these biases stem from underlying societal inequalities and, as such, will require societal as well as technological solutions.
Some were also optimistic about the potential of algorithms to challenge and improve biased human decision-making.
The need for a set of ethical principles to underpin approaches to decision-making algorithms was a repeated theme, and a number of organisations pointed either to frameworks they had developed themselves or to external approaches.
There was significant overlap between the principles being proposed, with fairness, transparency and explainability coming through as especially strong themes.
What is the CDEI doing with this evidence?
This evidence will inform our ongoing review, which is due to be published in Spring 2020.