
Oxford Martin Programme on Misinformation, Science and Media


The Challenge

In some key domains of public life, there appear to be coordinated efforts to undermine the reputation of science and innovation. Scientists now protest in the streets just to get governments to base policy on scientific evidence. Long-held scientific consensus on issues like the causes and consequences of climate change or the importance of vaccines for public health is increasingly contested, and heavily debated on social media and sometimes in the news. New technological innovations like artificial intelligence are discussed in terms that veer from the alarmist to the exuberant.
Public understanding of key issues in science and technology is often limited, and misinformation about basic scientific questions – from natural selection to global warming – abounds.

How can we better understand public discussions of science and technology, and what can be done to improve them?

Our Approach

In this three-year programme, researchers will examine the interplay between systematic misinformation campaigns, news coverage, and social media platforms that are increasingly important for public understanding of science and technological innovation. The programme will address the problems of “junk science”, “fake news”, and related public policy issues. We will focus on three questions:

  1. How does the public’s understanding of science and technology vary from country to country and how is this variation related to differences in media use?
  2. How do misinformation campaigns on social media influence public learning about science and technology?
  3. How can scientists, journalists, and policy makers communicate more effectively about science and new innovations, so as to contribute to evidence‐based policy making and respond to misinformation and junk science?

Methodologically, the project will combine the established social science methods of surveys, content analysis, and qualitative research with new computationally‐intensive methods of auditing algorithms, scraping social media posts, and social network analysis of big data.


Until now, understanding of the interplay between the public, misinformation campaigns, and social media has been limited, and most existing research has focused on elections and candidates for public office rather than the broader but equally important issues of science communication.
Our aim is to combine social science and computer science to address the damaging impact of computational propaganda and other forms of digitally‐enabled misinformation campaigns on scientific innovation, policy making, and public life. We will engage with stakeholders in journalism, the technology industry, the scientific community, and among policymakers in the search for evidence-based actionable interventions.

Key Information

  • Oxford Martin School
  • Project dates: August 2017 - December 2023

Major Areas of Research


COVID-19 Disinformation

The global pandemic has brought to the fore the pressing problems caused by disinformation, leading many scholars to study the “infodemic” that is accompanying and exacerbating the public health crisis. Disinformation about the virus has already led to serious health repercussions in countries around the world. Our research on COVID-related disinformation looks at the prominence of stories by junk news outlets and state-backed media outlets on social media. ComProp researchers are also investigating the systems that help these junk news stories to succeed: from the online advertising ecosystem to incentives on social media platforms.


Disinformation and Elections

The tools of computational propaganda are often deployed around elections, as various actors seek to sway public opinion through legitimate and illegitimate means. Our research on disinformation and elections looks at information-sharing on social media by members of the electorate, foreign influence campaigns, and the role of these campaigns in political polarization. We have conducted research on elections in Europe, North America, South America, the Middle East, and Asia.

Tech Platforms and Governance

Our team is interested not only in the content of disinformation but also in the technologies and systems that shape the information landscape. To this end, our research examines the various forces constraining and enabling computational propaganda: how tech companies incentivise and amplify problematic content, how governments seek to regulate these companies, and how tech platforms themselves are responding.

State-Sponsored Disinformation

The tools of computational propaganda are increasingly deployed by states to shape public opinion, sow distrust, and create confusion both at home and abroad. Our research on state-sponsored disinformation looks at the proliferation of “cyber troops” in countries around the world, the reach and contents of state-sponsored media outlets, and the impacts of foreign influence operations.
