
Explaining black-box decisions


Overview

As decision-making systems are deployed in the real world, questions of accountability become increasingly important. System builders need to ask questions such as “Is the system working as intended?”, “Do the decisions being made seem sensible?” and “Are we conforming to equality regulation and legislation?”, while a person subject to the algorithm’s decisions may be more concerned with questions such as “Am I being treated fairly?” or “What could I do differently to get a favourable outcome next time?”

These issues are not unique to computerised decision-making systems, but the growth of machine-learning-based systems has made them more pressing. What distinguishes machine learning is its use of arbitrary black-box functions to make decisions. These functions may be too complex to comprehend, and it may not be possible to fully understand the decision-making criteria.

Recently, the project investigators presented a proof of concept for counterfactual explanations: a novel type of explanation of automated decisions that describes the minimum conditions that would have led to an alternative decision.
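The idea can be illustrated with a minimal sketch. The decision rule, feature names, and search strategy below are all hypothetical, invented purely for illustration; they are not the project's actual method. The sketch treats a toy loan-approval rule as a black box and searches for the smallest income increase that would flip a rejection into an approval.

```python
def approve(income, debt):
    """Hypothetical black-box decision rule: approve if income - 2*debt >= 30."""
    return income - 2 * debt >= 30


def income_counterfactual(income, debt, step=0.5, max_raise=100):
    """Find the minimum income increase that would flip a rejection.

    Returns None if the applicant is already approved, or if no
    counterfactual exists within the search range.
    """
    if approve(income, debt):
        return None  # already approved; no counterfactual needed
    raise_amount = step
    while raise_amount <= max_raise:
        if approve(income + raise_amount, debt):
            return raise_amount
        raise_amount += step
    return None


# Example: a rejected applicant with income 40 and debt 10.
delta = income_counterfactual(40, 10)
if delta is not None:
    print(f"Had your income been {40 + delta}, you would have been approved.")
```

Note that the search only queries the decision function; it never inspects its internals. This is what makes counterfactual explanations attractive for black-box systems: the explanation ("had your income been 50, you would have been approved") is meaningful to the affected person without requiring the full decision logic to be disclosed or understood.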

With a proof of concept already in place, this project undertakes the essential further work to transform counterfactual explanations into a practically useful tool to generate explanations for users and parties affected by automated decision-making systems.

Key Information

Funder: Alan Turing Institute (EPSRC)

Project dates: October 2018 - September 2020
