Bias as Evidence

[Image: iceberg model of AI bias, with computational biases visible above the waterline and human and systemic biases hidden below.]

Full project title: Bias as Evidence: How, if at all, can bias in risk-assessment tools work as evidence for policy?

Overview

While risk-assessment algorithms were introduced with the promise of reducing human bias in decision-making, these tools have been found to reproduce or even exacerbate pre-existing social biases. This is reflected in unfair differences in how risk-assessment algorithms perform for different social groups along lines of gender, race, and class. This phenomenon has given rise in the literature to the term “algorithmic bias”.

This project proposes that what is often referred to as algorithmic bias can also be conceived of as a mirror of pre-existing social disparities. Starting from this intuition, the main aim of the research is to understand in what ways algorithmic bias stands, or can stand, as evidence of these disparities, in order to inform policy interventions that tackle their underlying social causes. It focusses on the cases of credit, health, and pre-trial risk assessment.

The main research question is: “How, if at all, can bias in risk-assessment tools work as evidence for policy?” Given the presence of unfair differences in the way risk-assessment algorithms perform for different social groups, the project enquires whether it is possible to qualify and quantify these differences in a way that is informative for policymaking.
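As a purely illustrative sketch of what quantifying such differences might involve, the short Python example below compares the false positive rates of a hypothetical binary risk-assessment tool across two groups. The data, group labels, and metric are invented for the example and are not drawn from the project itself.

    # Hypothetical sketch: comparing false positive rates of a binary
    # risk-assessment tool across two social groups. All data are invented
    # for illustration; the project does not prescribe this metric or code.

    from collections import defaultdict

    # Each record: (group, actual_outcome, predicted_high_risk)
    # actual_outcome = 1 means the adverse event actually occurred.
    records = [
        ("group_a", 0, 1), ("group_a", 0, 0), ("group_a", 1, 1), ("group_a", 0, 1),
        ("group_b", 0, 0), ("group_b", 0, 1), ("group_b", 1, 1), ("group_b", 0, 0),
    ]

    def false_positive_rate(rows):
        """Share of people without the adverse outcome who were still flagged as high risk."""
        flags_for_negatives = [pred for _, actual, pred in rows if actual == 0]
        return sum(flags_for_negatives) / len(flags_for_negatives) if flags_for_negatives else float("nan")

    by_group = defaultdict(list)
    for group, actual, pred in records:
        by_group[group].append((group, actual, pred))

    rates = {g: false_positive_rate(rows) for g, rows in by_group.items()}
    for g, r in sorted(rates.items()):
        print(f"{g}: false positive rate = {r:.2f}")

    # A large gap between groups is one way a disparity can be quantified;
    # whether and how such a gap can serve as evidence about underlying
    # social conditions is precisely the question the project investigates.
    print(f"gap: {abs(rates['group_a'] - rates['group_b']):.2f}")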

Key Information

  • Funder: Dieter Schwarz Stiftung gGmbH
  • Project dates: February 2022 - January 2023
  • Contact: Marta Ziosi
