
Explainable and accountable algorithms and automated decision-making in Europe


Overview

AI systems and algorithms are proliferating across both the public sector (e.g. health care and criminal justice) and the private sector (e.g. finance and insurance). These decision-making systems often operate as black boxes, offering little insight into how they arrived at a decision, especially when machine learning is applied. Their decisions can have severe consequences for individuals. To ensure the accuracy of automated decisions and to increase users' trust, explanations should be provided when demanded by individuals or third-party proxies. Clarifying both the envisioned legal utility and the technical feasibility of 'explanations' or 'transparency' is essential to realising this goal. This project investigates transparency mechanisms and the technical possibility of offering individuals explanations of automated decisions. The aim is to test the feasibility, advantages, and difficulties that may be encountered in the formulation and adoption of explainable algorithms. Areas of investigation include:

  1. Given the likelihood of a proliferation of applications of ML models with low interpretability, what processes, information, or evaluations may be desirable to provide the best possible explainability and accountability of the operations of these models?
  2. What may constitute best practices or legal codes of practice in sectors experiencing widespread deployment of algorithmic systems, so as to prevent biased, discriminatory, unintended, or socially undesirable outcomes?
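
To make the idea of an 'explanation' of an automated decision concrete, here is a minimal illustrative sketch, not drawn from the project itself: for a simple linear scoring model, each feature's contribution to the score can be reported directly to the affected individual. All names, weights, and the loan-approval scenario below are invented for illustration; real deployed models are rarely this transparent, which is precisely the problem the project examines.

```python
# Hypothetical loan-approval example with invented weights and baselines.
# For a linear model, each feature's contribution to the score is
# weight * (value - baseline), so a faithful per-feature explanation
# can be computed exactly (this property does NOT hold for complex
# non-linear models, where only approximate explanations are possible).

BASELINE = {"income": 30_000, "debt_ratio": 0.4, "years_employed": 2}
WEIGHTS = {"income": 0.00005, "debt_ratio": -4.0, "years_employed": 0.3}

def explain(applicant: dict) -> dict:
    """Return each feature's contribution to the decision score,
    relative to a population baseline."""
    return {f: WEIGHTS[f] * (applicant[f] - BASELINE[f]) for f in WEIGHTS}

def score(applicant: dict) -> float:
    """The model's score is just the sum of the contributions."""
    return sum(explain(applicant).values())

applicant = {"income": 50_000, "debt_ratio": 0.6, "years_employed": 5}
contribs = explain(applicant)
# The contributions sum exactly to the score, so this explanation is
# faithful by construction - e.g. a negative debt_ratio contribution
# tells the applicant which factor counted against them.
```

For opaque machine-learning models, no such exact decomposition exists, which is why the project asks what processes or evaluations could stand in for it.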

Key Information

Funder:
  • DeepMind Technologies Limited

Project dates:
  • January 2018 - January 2019
