
PRESS RELEASE
Public at risk of discrimination from online behavioural advertising, says Oxford legal expert

Published on 25 Nov 2019
A leading expert in ethics and law from the Oxford Internet Institute (OII), University of Oxford, and the Alan Turing Institute believes that current regulation may fail to protect the public from the inherent bias in online behavioural advertising.

Dr Sandra Wachter, Associate Professor and Senior Research Fellow at the Oxford Internet Institute, highlights how techniques such as affinity profiling – grouping people together according to their assumed interests rather than solely their personal traits – have become commonplace in the online advertising industry, with advertisers increasingly able to target or exclude certain groups from products and services based on assumptions about what they think users want to see.

Professor Wachter makes the case that (indirect) discrimination in online behavioural advertising is typically very subtle. For example, someone might have many LGBTQ+ friends on Facebook but be unaware that their own sexual orientation can be inferred from their circle of friends. Advertisers might then use these possibly inaccurate assumptions about someone’s sexuality (the person might not identify as LGBTQ+) to target or exclude them on that basis. This means both accurately and inaccurately profiled users potentially miss out on certain products and services based on their interests, habits or routines.
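
To illustrate the kind of inference at work, consider a minimal, hypothetical sketch in Python. The names, data and threshold below are illustrative assumptions, not drawn from the paper or any real advertising platform; the point is that a platform can place a user into an audience purely from the declared interests of the user’s connections, without ever directly inferring the user’s own traits.

```python
# Hypothetical sketch of affinity profiling: scoring a user's "affinity"
# with a group from their friends' declared interests alone.
# All names, data and the 0.5 threshold are illustrative assumptions,
# not taken from any real ad platform or from Professor Wachter's paper.

def affinity_score(friend_interests, target_interest):
    """Fraction of a user's friends who declare the target interest."""
    if not friend_interests:
        return 0.0
    matches = sum(target_interest in interests for interests in friend_interests)
    return matches / len(friend_interests)

# The user's friends and their declared interests (illustrative data).
friends = [
    {"lgbtq_events", "music"},
    {"lgbtq_events", "cycling"},
    {"gardening"},
]

score = affinity_score(friends, "lgbtq_events")
# An advertiser could target or exclude the user once the score passes
# some threshold -- without ever inferring the user's own orientation.
if score > 0.5:
    print(f"affinity score {score:.2f}: user grouped into this audience")
```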

In her article, ‘Affinity Profiling and Discrimination by Association in Online Behavioural Advertising’, forthcoming in the Berkeley Technology Law Journal, Professor Wachter presents an alternative legal concept to address structural bias in online advertising, including the case for applying the concept of ‘discrimination by association’.

Professor Sandra Wachter of the Oxford Internet Institute, author of the paper, said:

“Algorithms and digital technologies constantly collect data about us and evaluate us, sometimes making life-changing decisions in areas such as credit, housing and employment. Advertisements play a crucial part in this: they inform us about goods, services, opportunities and products, or nudge us into certain behaviours.

“Affinity profiling may use ‘seemingly neutral’ reasons to withhold a job advert from, for example, Cosmopolitan readers or people with an affinity for black culture, while claiming to have no designs against women or any ethnicity per se. Of course, it is not unrealistic to assume that a large portion of Cosmopolitan readers are women, and that there is a strong correlation between interest in a particular culture and one’s ethnicity.”

The concept of ‘discrimination by association’ challenges the idea of strictly differentiating between assumed interests and personality traits when profiling people.

Instead, the concept of ‘discrimination by association’ acknowledges that a person might be treated significantly worse than others based on their assumed relationship with a protected group, without details about that person being directly inferred. It grants protection regardless of whether someone is actually a member of that group and whether these inferences are accurate.

The paper sets out three ways in which adopting ‘discrimination by association’ could provide greater protection from structural bias within online behavioural advertising:

  • Greater protection against adverse actions by advertisers based on assumed interests, groupings or associations (e.g. interest in a culture), even if no personal traits (e.g. one’s ethnicity) are directly inferred about the person
  • Protection for misclassified users who experience adverse effects
  • Greater protection for “accurately” profiled people, because group membership is not necessary for protection. Someone with a legal claim related to gender identity, disability or religion would not need to “out” themselves to file a complaint.

Professor Wachter adds:

“Of course, even if the gap is closed and we find a way to protect ‘all’ those who are discriminated against by advertising, whether or not they belong to the intended group – challenges remain. In particular, the lack of transparent business models and practices poses a considerable barrier to detecting and proving cases of discrimination. Often, we might not even be aware that we are being discriminated against. For example, how do I know that I am not being shown a job ad? How do we know whether interest groups (e.g. an interest in dogs) correlate with protected groups (e.g. gender)?”
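
The correlation question Professor Wachter raises could in principle be tested, but only with the kind of data advertisers hold internally. A hypothetical sketch of such a check follows; the records and group names are illustrative assumptions, not real data.

```python
# Hypothetical sketch: checking whether a "neutral" interest group
# correlates with a protected attribute. The records below are
# illustrative; the opacity Professor Wachter describes means outside
# observers rarely have access to data like this at all.

# Each record: (declares_interest_in_dogs, gender) -- illustrative only.
records = [
    (True, "female"), (True, "female"), (True, "male"),
    (False, "male"), (False, "male"), (False, "female"),
]

def interest_rate(records, gender):
    """Rate at which members of one gender group declare the interest."""
    group = [interest for interest, g in records if g == gender]
    return sum(group) / len(group) if group else 0.0

for gender in ("female", "male"):
    print(f"{gender}: {interest_rate(records, gender):.0%} declare the interest")
# A large gap between the two rates would suggest the "neutral" interest
# group acts as a proxy for the protected attribute.
```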

Professor Wachter concludes: “In the long term, we need to ensure that no sensitive information is used unethically or illegally and that algorithms aren’t discriminating against us. Current transparency tools aren’t able to demonstrate that. Ultimately, we need to find new, reasonable ways of governance that increase accountability and protect all groups without stifling innovation. The paper closes with recommendations and ideas on how to achieve this goal.”

For more information contact press@oii.ox.ac.uk.

Note to Editors

Professor Wachter’s paper, ‘Affinity Profiling and Discrimination by Association in Online Behavioural Advertising’, is due to be published in the Berkeley Technology Law Journal.
