
PRESS RELEASE
Covid-related misinformation videos spread primarily through Facebook, as its fact-checkers fail to spot false information, finds new Oxford study

Published on 21 Sep 2020
  • YouTube videos with false coronavirus information gathered more shares on social media than the videos of five leading news broadcasters combined
  • Facebook was the most popular platform for sharing YouTube misinformation videos, ahead of Twitter and Reddit
  • Analysis shows the failure of Facebook’s content moderation policies, with third-party fact-checks catching less than 1% of Covid misinformation videos

A new study by the Oxford Internet Institute, University of Oxford, shows that Covid-related misinformation videos spread primarily through social media, with Facebook the main channel for sharing misinformation and without sufficient fact-checks in place to moderate content.

The Oxford study, ‘Covid-related Misinformation on YouTube’, examined over a million Covid-related YouTube videos that circulated on social media and identified those YouTube eventually removed because they contained false information. It finds that Covid-related misinformation videos do not find their audience through YouTube itself, but largely by being shared on Facebook. Data analysed by the Oxford researchers shows that YouTube misinformation videos were shared nearly 20 million times on Facebook between October 2019 and June 2020. This gave them a higher reach on social media than the five largest English-language news sources on YouTube (CNN, ABC News, BBC, Fox News and Al Jazeera) combined, whose videos were shared 15 million times.

Facebook also generated a higher number of reactions than other social media platforms. Misinformation videos shared on Facebook generated a total of around 11,000 reactions (likes, comments or shares) before being deleted by YouTube. In comparison, videos posted on Twitter were retweeted on average around 63 times.

The Oxford researchers also found that of the 8,105 misinformation videos shared on Facebook between October 2019 and June 2020, only 55 carried warning labels attached by third-party fact-checkers, less than 1% of all misinformation videos. This failure of fact-checking helped Covid-related misinformation videos spread on Facebook and find a large audience.

Oxford researchers observed that, despite YouTube’s investment in containing the spread of misinformation, it still took the platform on average 41 days to remove Covid-related videos with false information. Misinformation videos were viewed on average 150,000 times before they were deleted by YouTube.

Dr Aleksi Knuutila, Postdoctoral Researcher, Oxford Internet Institute, said:

“People searching for Covid-related information on YouTube will see credible sources, because the company has improved its algorithm. The problem, however, is that misinformation videos will spread by going viral on other platforms, above all Facebook.

“The study shows that misinformation videos posted on YouTube found a massive audience, and it is likely to have changed people’s attitudes and behaviour for the worse. For the sake of public health, platform companies need to ensure the information people receive is accurate and trustworthy.”

Other findings include:

  • Just 250 Facebook groups are responsible for half of the visibility that misinformation videos acquire among public social media accounts.
  • The groups spreading misinformation include anti-vaccination and 5G-focused groups, but also religious organizations and musicians.
  • The most popular misinformation videos often include individuals who claim medical expertise making unscientific claims about treatments for and protection from the coronavirus disease.

Notes for editors:

About the research

Oxford researchers analysed 1,091,875 distinct Covid-related videos posted on Facebook, Twitter and Reddit between October 2019 and June 2020. The full report can be found here.