In some key domains of public life, there appear to be coordinated efforts to undermine the reputation of science and innovation. Scientists now protest in the streets just to get governments to base policy on scientific evidence. Long-held scientific consensus on issues like the causes and consequences of climate change or the importance of vaccines for public health is increasingly contested, and heavily debated on social media and sometimes in the news. New technological innovations like artificial intelligence are discussed in terms that veer from the alarmist to the exuberant.
Public understanding of key issues in science and technology is often limited, and misinformation about basic scientific topics – from natural selection to global warming – abounds.
How can we better understand public discussions of science and technology, and what can be done to improve them?
Our Approach
In this three-year programme, researchers will examine how systematic misinformation campaigns, news coverage, and increasingly important social media platforms interact to shape public understanding of science and technological innovation. The programme addresses the problems that “junk science” and “fake news” pose for public policy. We will focus on three questions:
How does the public’s understanding of science and technology vary from country to country and how is this variation related to differences in media use?
How do misinformation campaigns on social media influence public learning about science and technology?
How can scientists, journalists, and policy makers communicate more effectively about science and innovation, so as to contribute to evidence-based policy making and respond to misinformation and junk science?
Methodologically, the project will combine the established social science methods of surveys, content analysis, and qualitative research with newer, computationally intensive methods: auditing algorithms, scraping social media posts, and social network analysis of big data.
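To illustrate one of these computational methods, the sketch below shows how a retweet network might be built from scraped posts and summarised with basic network statistics. The file name, column names, and the use of the networkx library are illustrative assumptions, not a description of the programme's actual pipeline.

```python
# A minimal, hypothetical sketch: building a retweet network from scraped posts
# and computing simple statistics that can flag heavily amplified accounts.
# The CSV layout (author, retweeted_from) is an assumption for illustration.
import csv

import networkx as nx


def build_retweet_network(path):
    """Return a directed graph where an edge A -> B means A retweeted B."""
    graph = nx.DiGraph()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            source = (row.get("retweeted_from") or "").strip()
            if source:  # only retweets create edges
                graph.add_edge(row["author"], source)
    return graph


if __name__ == "__main__":
    g = build_retweet_network("posts.csv")  # hypothetical export of scraped posts
    # Accounts whose content is most widely amplified (highest in-degree).
    top_amplified = sorted(g.in_degree(), key=lambda item: item[1], reverse=True)[:10]
    print("Most retweeted accounts:", top_amplified)
    # Large, densely connected clusters can hint at coordinated amplification.
    print("Weakly connected components:", nx.number_weakly_connected_components(g))
```

Statistics like these are only starting points: identifying coordinated amplification still requires combining them with the content analysis and qualitative research described above.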
Ambition
Until now, understanding of the interplay between the public, misinformation campaigns, and social media has been limited, and most research has focused on elections and candidates for public office rather than on broader but equally important issues of science communication.
Our aim is to combine social science and computer science to address the damaging impact of computational propaganda and other forms of digitally enabled misinformation campaigns on scientific innovation, policy making, and public life. We will engage with stakeholders in journalism, the technology industry, the scientific community, and among policymakers in the search for evidence-based, actionable interventions.
COVID-19
The global pandemic has brought to the fore the pressing problems caused by disinformation, leading many scholars to study the “infodemic” that is accompanying and exacerbating the public health crisis. Disinformation about the virus has already led to serious health repercussions in countries around the world. Our research on COVID-related disinformation looks at the prominence of stories by junk news outlets and state-backed media outlets on social media. ComProp researchers are also investigating the systems that help these junk news stories succeed: from the online advertising ecosystem to incentives on social media platforms.
Elections
The tools of computational propaganda are often deployed around elections, as various actors seek to sway public opinion through legitimate and illegitimate means. Our research on disinformation and elections looks at information-sharing on social media by members of the electorate, foreign influence campaigns, and the role of these campaigns in political polarization. We have conducted research on elections in Europe, North America, South America, the Middle East, and Asia.
Tech Platforms and Governance
Our team is interested not only in the content of disinformation but also in the technologies and systems that shape the information landscape. To this end, our research examines the various forces constraining and enabling computational propaganda: how tech companies incentivise and amplify problematic content, how governments seek to regulate these companies, and how tech platforms themselves are responding.
State-Sponsored Disinformation
The tools of computational propaganda are increasingly deployed by states to shape public opinion, sow distrust, and create confusion both at home and abroad. Our research on state-sponsored disinformation looks at the proliferation of “cyber troops” in countries around the world, the reach and contents of state-sponsored media outlets, and the impacts of foreign influence operations.