Leveraging algorithmic bias as evidence to support marginalized communities
Main photo credit: Image by Emily Rand / © LOTI / Better Images of AI / AI City / CC-BY 4.0

Overview

Algorithmic bias refers to the uneven treatment of different social groups by an algorithm, often along the lines of race, gender, and class. In the context of algorithmic predictions, where models inform real-world decisions, algorithmic bias can have significant societal implications. Some of the most prominent and worrying examples come from crime risk prediction, where algorithms are often found to overestimate the likelihood that members of racial minorities will commit or recommit a crime.

This project focuses on the enforcement bias captured in the crime prediction algorithm developed by Rotaru et al. (2022) in Chicago. Through interviews with the researchers, government agencies, policymakers, community organizations and affected communities involved with the Chicago Crime Prediction Algorithm, this project examines the extent to which this bias can function as an instrument for social critique. Specifically, I focus on how the evidence that this bias carries can help affected communities expose existing disparities, denounce their mistreatment and challenge the use of these tools in the first place.

Key Information

• Funder: Dieter Schwarz Stiftung gGmbH
• Project dates: November 2023 - October 2024
• Contact: Marta Ziosi