
Governing the likeness: How AI art communities of practice manage the use of real faces



Main photo credit: This image was generated with Stable Diffusion XL, using a custom model derived from the base SDXL 1.0 model and trained on the likeness of Bernie Hogan by Bernie Hogan himself.

Overview

This project explores the social implications of generative AI technologies, particularly image generation. It focuses on the ability of these technologies to generate images that look like people. Such images can represent either a synthetic person or a real one; when they depict a real person, they are colloquially known as ‘deepfakes’.

While many studies have examined the veracity of these images (i.e., deepfake detection), this project instead addresses the issue of likeness detection. If it is possible to recognize the specific individual depicted in an image, the image can be said to contain the likeness of that person.
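To make the notion of likeness detection concrete, the sketch below is an illustration only: it assumes the open-source face_recognition library and hypothetical file paths, and is not the project's own method. It compares the face embedding from a reference photograph of a person with the faces found in a generated image; a small embedding distance suggests the generated image contains that person's likeness.

```python
# Illustrative sketch of likeness detection (not the project's method).
# Assumes the open-source face_recognition library and hypothetical file paths.
import face_recognition

reference = face_recognition.load_image_file("reference_photo.jpg")   # real person
candidate = face_recognition.load_image_file("generated_image.png")   # synthetic image

reference_encoding = face_recognition.face_encodings(reference)[0]
candidate_encodings = face_recognition.face_encodings(candidate)

# The library's conventional threshold of 0.6 separates "same person" from "different".
for encoding in candidate_encodings:
    distance = face_recognition.face_distance([reference_encoding], encoding)[0]
    verdict = "likeness present" if distance < 0.6 else "no likeness"
    print(f"{verdict} (distance={distance:.3f})")
```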

Following from this, the project raises a further question: does the model that created the image also contain the likeness of the person? This question establishes a basis for governing spaces where generative images proliferate, and where a likeness becomes a form of property that can be traded, exchanged and licensed inside a model trained on images of that person.
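To illustrate how a likeness can reside in model weights rather than in any single stored image, the following sketch is a hedged example: it assumes the Hugging Face diffusers library and a hypothetical LoRA checkpoint fine-tuned on one person's photographs, applied to the same SDXL 1.0 base model named in the photo credit.

```python
# Illustrative sketch only; not the project's pipeline.
# Assumes the Hugging Face diffusers library and a hypothetical LoRA
# checkpoint ("path/to/likeness-lora") fine-tuned on photos of one person.
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
).to("cuda")

# The likeness is carried by these fine-tuned weights, not by any stored photo:
# once loaded, any prompt can place the person in scenes that never happened.
pipe.load_lora_weights("path/to/likeness-lora")  # hypothetical checkpoint

image = pipe("a studio portrait of the subject, natural light").images[0]
image.save("synthetic_portrait.png")
```

Under this framing, trading or licensing such a checkpoint amounts to trading or licensing the likeness itself, which is precisely the governance question these communities face.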

In the absence of appropriate legal and regulatory frameworks, the communities of practice around the generation of synthetic images tend to self-govern on these issues of image training. The governance of these communities of practice suggests evolving norms about the appropriate training on, or display of, a likeness.

This project takes a two-stage approach to exploring this issue:

  • A participant observation study of the online communities of practice surrounding the creation of synthetic images, with the primary goal of understanding and cataloguing the themes and discourses surrounding the use of a real person's likeness.
  • An online experiment using synthetic images generated from models trained on volunteers' photographs, seeking to assess how stranger faces are perceived in deepfakes versus the outputs of text-to-image models.

Insights from this work will provide clear guidance to policymakers in an area that currently lacks adequate legal and regulatory frameworks. In particular, it will shed light on the distinction between ‘mere’ misinformation and the misrepresentation of specific likenesses in synthetic images.


Key Information

  • Funder: Dieter Schwarz Stiftung gGmbH
  • Project dates: April 2023 - September 2024

