MISDOOM Oxford 2021

21-22 September 2021

The 3rd Multidisciplinary International Symposium on Disinformation in Open Online Media

Key Information

Dates

  • 21-22 September 2021

Location

  • Online

Registration link:

https://www.eventbrite.co.uk/e/misdoom-2021-tickets-167462281191 

LNCS Proceedings: Disinformation in Open Online Media

https://link.springer.com/book/10.1007%2F978-3-030-87031-7

Submissions

  • Submission Deadline: 4 June 2021
  • Notification by: 12 July 2021
  • Camera ready version: 9 August 2021 (for LNCS proceedings only)

Call for Contributions

After two successful editions, the international MISDOOM symposium on misinformation in online media returns for a 3rd edition, this time co-hosted by the Oxford Internet Institute and the Universiteit Utrecht on 21-22 September 2021. Topics include health misinformation, hate speech, misinformation diffusion, news spreading behaviour and mitigation, harm-aware news recommender systems, and related topics on misinformation in online media.

Online media have become a politically, economically, and organizationally critical infrastructure. Internet users all over the world can directly interact with each other and participate in political discussions. Through online media, journalists have access to enormous amounts of information and public sentiment, which increasingly become part of their reporting. Politicians refine their positions and actions based on the (seeming) public opinion they distill from online media, while others use these channels to distribute their own views. Companies allow product reviews by users to provide crowd-based quality assurance.

The symposium brings together researchers from multiple disciplines, including communication science, computer science, computational social science, political communication, journalism and media studies, as well as practitioners in journalism and online media. The symposium has a strong multidisciplinary character and aims to accommodate the publication conventions of different disciplines. Therefore, there are three submission tracks (see detailed information below):

  1. Social Science Track: Full papers for a special issue of Social Media + Society.
  2. Computer Science Track: Full papers with Springer LNCS proceedings.
  3. Extended Abstracts.

Submission deadline: 4 June 2021

Cost of Attendance: Free

Submissions for this event have closed.

Symposium topics

In an ideal world, participation and openness would foster free and democratic processes as well as beneficial societal interactions. However, beyond the desired space for free expression of public opinions, such openness also provides opportunities for large-scale, orchestrated manipulation. Groups of humans (“trolls”) or semi- to fully-automated systems (“social bots”) can bias or manipulate online discussions and inject false perceptions into social media. How can we detect and learn from this phenomenon, and how do we combat fake news and misinformation?

Participants can discuss and contribute to the following (non-exclusive list of) topics:

  • Cross-platform campaigns and their impact (e.g., diffusion of disinformation and manipulation, observations of campaigns and strategies, communication strategies, hate speech)
  • Approaches to studying misinformation (e.g., qualitative approaches, case studies, quantitative approaches, experiments)
  • User involvement with fake news on various platforms (e.g., engagement, viewership)
  • Counter-measures on mis- and disinformation and manipulation (e.g., censorship policies, behavioral changes, education, professional codices, legal actions)
  • Social networking platforms, censorship policies and impacts (e.g. policies to counter hate speech, health misinformation)
  • Trending topics in mis- and disinformation research (e.g., health related fake news)

Presenting at MISDOOM

In addition to keynote talks and panel discussions, we particularly invite researchers and practitioners to present their work at the symposium.

Given that we welcome both social scientists and computer scientists, and that the publication strategies of these fields differ, we solicit three types of contributions that, upon acceptance, result in the same opportunity to present at MISDOOM:

  1. Social Science Track: Full papers for a special issue of Social Media + Society, to be published in 2022. Papers should be around 8,000 words in length, describe original, unpublished research, and follow the SM+S formatting guidelines. Please note that acceptance to the conference does not guarantee inclusion in this special issue, which has its own further review process. Submissions will be judged on scientific quality and relevance to the MISDOOM symposium.
  2. Computer Science Track: Full papers to be published in the Springer LNCS proceedings. Submissions should describe original, unpublished research in 6 to 15 pages (including references), formatted according to the Springer Lecture Notes in Computer Science (LNCS) Word or LaTeX template. The work should be structured like a research paper, covering the context of the problem studied, the research question, the approach/methodology, and the results. Submissions will be judged on scientific quality and relevance to the MISDOOM symposium.
  3. Extended Abstracts: Authors who do not have a full paper ready in time for the conference are encouraged to submit an extended abstract. The abstract should be at most one A4 page (including references) and briefly describe the work to be presented. It may be based on previously published work, ongoing work in progress, or a new research idea or agenda. No template is provided, but it should include at least the title, the authors and their affiliations, the abstract text, and, especially for previously published work, reference(s). Submissions are non-archival and not formally published. Authors of extended abstracts will also have the opportunity to submit a full paper at a later stage for consideration in one of the two special issues. If you are interested in having your abstract considered for a special issue, please note this as a footnote after the citations in your abstract.

Organisers

A programme committee of internationally recognized scholars evaluates all abstracts for suitability according to international research standards. All accepted abstracts and full papers are eligible for oral presentation at the symposium.

Organisers:

  • Jonathan Bright, Oxford Internet Institute, General Chair
  • Anastasia Giachanou, Universiteit Utrecht, General Chair
  • Viktoria Spaiser, University of Leeds, Program Chair
  • Francesca Spezzano, Boise State University, Program Chair
  • Anna George, Oxford Internet Institute, Outreach Chair
  • Alexandra Pavliuc, Oxford Internet Institute, Communications Chair

Programme Committee

Elena Kochkina, Queen Mary University

Kelechi Amakoh, Aarhus University

Heidi Vepsäläinen, University of Helsinki

Eric Fernandes de Mello Araujo, Vrije Universiteit Amsterdam

Henna Paakki, Aalto University

Taha Yasseri, University College Dublin

Alon Sela, Ariel University

Adriana Amaral, UNISINOS

Marina Tulin, Erasmus University Rotterdam

Hendrik Heuer, University of Bremen

Dennis M. Riehle, Universität Koblenz-Landau

Milos Ulman, Czech University of Life Sciences

Neta Kligler Vilenchik, Hebrew University of Jerusalem

Peter van der Putten, LIACS, Leiden University & Pegasystems

Dennis Assenmacher, Uni Münster

Thorsten Quandt, WWU Munster

Ross Towns, Leiden University

Heike Trautmann, University of Münster

Stefano Cresci, IIT-CNR

Chico Camargo, University of Exeter

Giulio Barbero, Leiden University

Ebrahim Bagheri, Ryerson University

Mehwish Nasim, CSIRO Data61

Myrto Pantazi, Oxford Internet Institute

Sílvia Majó-Vázquez, University of Oxford

Florian Wintterlin, University of Muenster

Jan Schacht, HAW Hamburg

Louis Shekhtman, Bar Ilan University

Tommaso Caselli, Rijksuniversiteit Groningen

Christian Burgers, Department of Communication Studies, VU University Amsterdam

Arkaitz Zubiaga, Queen Mary University of London

Cody Buntain, New York University

Ansgard Heinrich, University of Groningen

German Neubaum, University of Duisburg-Essen

Raquel Recuero, Universidade Federal de Pelotas (UFPel)

Matteo Gagliolo, Université libre de Bruxelles (ULB)

Meysam Alizadeh, Princeton University

Elly Konijn, Vrije Universiteit Amsterdam

Marco Niemann, University of Muenster

Marcel Schliebs, Oxford Internet Institute

Anne Dirskon, University of Leiden

Maziyar Panahi, Institut des Systèmes Complexes Paris Île-de-France

Martin Wettstein, University of Zurich

Mona Elswah, Oxford Internet Institute

Nicoleta Corbu, National University of Political Studies and Public Administration SNSPA

Aleksi Knuutila, Oxford Internet Institute

Travis Coan, Exeter University

Lena Frischlich, University of Munster

Tim Schatto-Eckrodt, University of Muenster

Gerhard Weiss, Maastricht University

Aliaksandr Herasimenka, Oxford Internet Institute

André Calero Valdez, RWTH Aachen University

Tom Nicholls, University of Oxford

Łukasz Gajewski, Warsaw University of Technology

Keynote Speakers

Nina Jankowicz

Since revelations of Russian online influence campaigns first broke in 2016, the United States and the Western world have finally begun to wake up to the threat of online warfare and attacks from Russia. The question no one seems to be able to answer is: what can the West do about it? Central and Eastern European states, however, have been aware of the threat for years. Nina Jankowicz has advised these governments on the front lines of the information war. In her keynote speech at MISDOOM, she will explore the threat of state-backed online disinformation, how it has changed since it became a blockbuster news story, and government responses to the phenomenon over the past five years.

Dr. Reza Zafarani

“Fake news” is now viewed as one of the greatest threats to democracy, freedom of expression, and journalism. The massive spread of fake news has weakened public trust in governments, and its full impact on political processes such as the “Brexit” referendum and the equally divisive 2016 U.S. presidential election is yet to be realized. At MISDOOM, we will briefly review some of the modern computational techniques for fake news detection, along with some of the current challenges that these methods face. We will discuss some recent advancements to tackle these challenges, with particular focus on fake news early detection, multi-modal fake news analysis, modeling the intent of fake news spreaders, and the lack of data.

Prof. Sander van der Linden

Much like a viral contagion, false information can spread rapidly from one individual to another. Moreover, once lodged in memory, misinformation is difficult to correct. Inoculation theory therefore offers a natural basis for developing a psychological ‘vaccine’ against the spread of fake news and misinformation. Specifically, in a series of randomized lab and field studies, we show that it is possible to pre-emptively “immunize” people against disinformation about climate change, COVID-19, and elections (amongst other topics) by pre-exposing them to severely weakened doses of the techniques that underlie its production. This psychological process helps people cultivate cognitive antibodies in a simulated social media environment. During the talk, I’ll showcase an award-winning real-world intervention (“Bad News”) we developed and empirically evaluated in 20 languages—with governments and social media companies—to help citizens around the world recognize and resist unwanted attempts to influence and mislead.

MISDOOM Programme

Note: All times are in BST (British Summer Time).

Day 1 (Sep. 21)

12.00 pm: Opening by Organisers

12.30 pm: Keynote by Dr. van der Linden: “Psychological Inoculation Against Misinformation”

  • Abstract: see the Keynote Speakers section above.

1.30 pm: Short Break

1.45 pm: Session 1 (Parallel Sessions):

  • 1.1. COVID-19: Behavioural Aspects
  • 1.2. Susceptibility to Misinformation
  • 1.3. Building Resilience Against Misinformation
  • 1.4. COVID-19: Disinformation Narratives

3.00 pm: Break

3.30 pm: Keynote by Dr. Zafarani: “Computational Challenges and Recent Advancements in Automated Fake News Detection”

  • Abstract: see the Keynote Speakers section above.

4.30 pm: Short Break

4.45 pm: Session 2 (Parallel Sessions):

  • 2.1. Fake News Detection
  • 2.2. Misinformation on Science
  • 2.3. Fact Checking Approaches
  • 2.4. State-sponsored Misinformation
  • 2.5. COVID-19: Tackling Misinformation

6 pm: End

Day 2 (Sep. 22)

12.00 pm: Opening by Organisers: reflection on the first day of MISDOOM 2021 and outlook for day 2

12.15 pm: Session 3 (Parallel Sessions):

  • 3.1. Reflections
  • 3.2. Social Media Regulation
  • 3.3. Mathematical Models
  • 3.4. Media Industry

1.30 pm: Short Break

1.45 pm: Session 4 (Parallel Sessions):

  • 4.1. Online Communities
  • 4.2. COVID-19: Vaccination
  • 4.3. COVID-19: Misinformation Spread
  • 4.4. COVID-19: International Dimension

3.00 pm: Break

3.30 pm: Session 5 (Parallel Sessions):

  • 5.1. Misinformation and Disinformation in Politics
  • 5.2. Social Sciences Approaches
  • 5.3. Online Radicalisation
  • 5.4. Conspiracy Theories

4.45 pm: Short Break

5.00 pm: Keynote by Nina Jankowicz: “Russian Disinformation, Five Years Later”

  • Abstract: see the Keynote Speakers section above.

6 pm: Closing remarks & end

Presenters can find their presentation session and time here:

Title of Submission and Time of Presentation

Recordings

Welcome Address and Keynote by Dr. van der Linden: “Psychological Inoculation Against Misinformation”

Keynote by Dr. Zafarani: “Computational Challenges and Recent Advancements in Automated Fake News Detection”

Keynote by Nina Jankowicz: “Russian Disinformation, Five Years Later”

  • 1.1. COVID-19: Behavioural Aspects
  • 1.2. Susceptibility to Misinformation
  • 1.3. Building Resilience Against Misinformation
  • 1.4. COVID-19: Disinformation Narratives
  • 2.1. Fake News Detection
  • 2.2. Misinformation on Science
  • 2.3. Fact Checking Approaches
  • 2.4. State-sponsored Misinformation
  • 2.5. COVID-19: Tackling Misinformation
  • 3.1. Reflections
  • 3.2. Social Media Regulation
  • 3.3. Mathematical Models
  • 3.4. Media Industry
  • 4.1. Online Communities
  • 4.2. COVID-19: Vaccination
  • 4.3. COVID-19: Misinformation Spread
  • 4.4. COVID-19: International Dimension
  • 5.1. Misinformation and Disinformation in Politics
  • 5.2. Social Sciences Approaches
  • 5.3. Online Radicalisation
  • 5.4. Conspiracy Theories

Submissions

The submission deadline was 4 June 2021 (extended and final). Submissions were made in PDF through EasyChair.

Submissions for this event have closed.