Programme on Democracy and Technology
DemTech investigates the use of algorithms, automation, and computational propaganda in public life.
This programme of activity is backed by a team of social and information scientists eager to protect democracy and put social data science to work for civic engagement. We conduct international fieldwork with the political consultants and computer experts who are commissioned to launch or detect information operations. We build original databases of the incidents and accounts involved in such activities, and we use this knowledge to develop better tools for detecting and ending interference with democracy. We engage in “real-time” social and information science, actively disseminating our findings to journalists, industry, and foreign policy experts. Our network of experts helps civil society, industry, government, and other independent researchers develop a better understanding of the role of technology in public life.
Major Areas of Research
COVID-19 and the Infodemic
The global pandemic has brought to the fore the pressing problems caused by disinformation, leading many scholars to study the “infodemic” that accompanies and exacerbates the public health crisis. Disinformation about the virus has already led to serious health repercussions in countries around the world. Our research on COVID-related disinformation examines the prominence of stories from junk news outlets and state-backed media outlets on social media. Our researchers are also investigating the systems that help these junk news stories succeed, from the online advertising ecosystem to the incentive structures of social media platforms.
Disinformation and Elections
The tools of computational propaganda are often deployed around elections, as various actors seek to sway public opinion through both legitimate and illegitimate means. Our research on disinformation and elections examines information-sharing on social media by members of the electorate, foreign influence campaigns, and the role of these campaigns in political polarization. We have conducted research on elections in Europe, North America, South America, the Middle East, and Asia.
Tech Platforms and Governance
Our team is interested not only in the content of disinformation but also in the technologies and systems that shape the information landscape. To this end, our research examines the various forces constraining and enabling computational propaganda: how tech companies incentivise and amplify problematic content, how governments seek to regulate these companies, and how tech platforms themselves are responding.
State-Sponsored Disinformation
The tools of computational propaganda are increasingly deployed by states to shape public opinion, sow distrust, and create confusion both at home and abroad. Our research on state-sponsored disinformation looks at the proliferation of “cyber troops” in countries around the world, the reach and contents of state-sponsored media outlets, and the impacts of foreign influence operations.
Associate Professor, Senior Research Fellow
Jonathan Bright is a political scientist specialising in computational and ‘big data’ approaches to the social sciences.
Professor of Internet Studies
Philip N. Howard is a professor of sociology, information, and international affairs, and the author of Lie Machines: How to Save Democracy from Troll Armies, Deceitful Robots, Junk News Operations, and Political Operatives.
Aliaksandr Herasimenka is a political communication scholar and a postdoctoral researcher at the Programme on Democracy & Technology.
Dr Aleksi Knuutila is an anthropologist and a data scientist. He works as a Postdoctoral Researcher in the Computational Propaganda programme, where he uses computational and qualitative methods to study political communication and communities.
Research Assistant & DPhil Student
Lisa-Maria Neudert is a DPhil student researching platform governance and regulation in response to mis- and disinformation.
Emily Taylor is the CEO of Oxford Information Labs Ltd. and a specialist in cybersecurity, internet law and governance, and surveillance laws. In collaboration with Philip Howard, she is involved in ongoing research around AI and Good Governance.
Research Associate and MSc Alumnus
Lofred Madzou’s interests lie at the intersection of philosophy, artificial intelligence, and policy. The overarching theme of his work is “responsible AI”, and he works in collaboration with Philip Howard.