Nahema Marchal is a doctoral student on the DPhil in Information, Communication & the Social Sciences programme.
Nahema is a doctoral candidate at the Oxford Internet Institute and a researcher at the Computational Propaganda Project, where her work focuses on the relationship between online political communication and affective partisan polarisation. Other research interests include the spread of misinformation online and the impact of artificial intelligence on politics and democratic processes.
Prior to joining the OII, Nahema worked as content editor at Dow Jones Media Group and as program officer for a number of not-for-profit organisations including the World Policy Institute and the Center for Public Scholarship. Nahema holds an MA in Political Theory from the New School for Social Research in New York and a B.Sc. in Political Science and International Relations from the University of Bristol.
Polarisation, echo chambers, information bubbles, social trust, affective politics, collective action, political economy of digital media, balkanisation, algorithmic sorting, disinformation
Recorded: 22 April 2020
As part of the OII’s new webinar series, Nahema Marchal from the Computational Propaganda Project discusses initial findings from the project’s analysis of YouTube search results.
YouTube proving a popular source of reliable information on COVID-19, but public health agencies could make greater use of the channel
17 April 2020
A new memo from the Oxford Internet Institute, University of Oxford, has found very limited amounts of “junk” or conspiratorial health content among the most popular searches for COVID-19 content on YouTube.
State-backed media in Russia, China, Iran and Turkey successful in sharing misleading stories on COVID-19
9 April 2020
A new memo from the Oxford Internet Institute, University of Oxford, has lifted the lid on the actions of English-language state-backed media in Russia, China, Iran and Turkey during the coronavirus pandemic.
9 December 2019
Fewer than two per cent of news sources shared on Twitter ahead of the 2019 UK General Election were defined as ‘junk news’, says new analysis from Oxford researchers.
Junk news ‘not prevalent’ on Twitter, but more likely to be shared and liked on Facebook, finds unique multilingual study
21 May 2019
Fewer than 4% of news sources shared on Twitter ahead of the 2019 European Parliamentary elections were ‘junk news’
1 November 2018
25% of content shared around the US midterms is junk news, despite efforts by the platforms to curb the problem
5 October 2018
Researchers from the Oxford Internet Institute, University of Oxford, conclude only 1.2% of Twitter content connected to the elections is junk news
13 January 2020 BBC News
Star Wars actor Mark Hamill has deleted his Facebook account, lambasting the company's political ads policy. In a tweet, the celebrity accused the firm's chief Mark Zuckerberg of having valued profit over truthfulness.
13 December 2019 Express and Star
It is perhaps the most surprising development of all during the election campaign that so-called ‘junk news’ is apparently on the decline.
9 December 2019 ITV News
So-called ‘junk news’ is on the decline on Twitter but still receives high engagement on Facebook, researchers claim.
1 December 2019 Politico
Silicon Valley's efforts to keep bad actors from manipulating next year's election face threats that have evolved since 2016.
26 November 2019 First Draft News
‘Parody’ sites are the new battleground for grabbing voters’ attention and counteracting opposition party criticism.
22 May 2019 Financial Times
Far-right parties shift focus from leaving EU to divisive social issues
21 May 2019 BBC News
Are Twitter bots controlled by Russia on the march across Europe? And is Facebook full of misinformation designed to influence voters?
20 November 2018 Council on Foreign Relations
France is taking an innovative step to curb disinformation on Facebook. It might prove to be a model for regulators elsewhere.