
The Most Important Lesson From the Dust-Up Over Trump’s Fake Twitter Followers

Published on 2 Jun 2017

Project members Tim Hwang and Sam Woolley have a new article in Slate discussing bots that follow political candidates.

Let’s be clear: Coordinated campaigns of misinformation and manipulation on social media are absolutely real and are becoming an increasingly prominent component of the online media landscape. A variety of state and nonstate actors are increasingly flexing their muscles on these platforms to achieve a range of propaganda ends around the world. Swarms of bots have been used to disrupt dissident activists in places like Turkey, Mexico, and Syria, and dedicated Russian psy-ops and cyberattacks certainly played a role in the 2016 U.S. election. This is a real threat, and one that bears a much closer look by society as a whole.

But, at the same time, this week’s story reveals another key truth about these emerging threats and the social media platforms on which they find success: The opacity of platforms like Twitter, and their continued unwillingness to provide critical data to journalists and researchers, make it even more difficult to determine where campaigns of misinformation are emerging and who is behind them.

While the sudden surge of these fake followers is suspicious, their actual origins and purpose are a matter of conjecture. We know that the follower count for a given account has changed, we have a list of those new followers, and we have a rough sense of how those accounts behave, which gives some indication of whether they might be fake identities or bots.

But that is all the information we have, and all we are likely to get without a leak from whoever built the accounts or cooperation from the platform. There is no way to download data about these accounts or to quickly find information about, say, the IP addresses or other registration details associated with them. Nor can we compare them in a rigorous, quantitative way against other campaigns of misinformation that we have seen in the past. This limits our ability to connect this particular situation to earlier episodes, or to activity that may be occurring on Twitter or other social media platforms at the same time.
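To illustrate the kind of analysis that is possible with public data alone, the sketch below scores a follower account against a few coarse behavioural heuristics (account age, activity, follower ratios, default profile image). It is a minimal sketch, assuming follower metadata has already been collected from the platform's public-facing profiles; the FollowerProfile structure, field names, and thresholds are illustrative assumptions, not part of the original article or any platform API.

from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class FollowerProfile:
    # Publicly visible metadata for a single follower account.
    # Field names are illustrative; real data would come from the platform's public API.
    created_at: datetime
    followers_count: int
    following_count: int
    tweet_count: int
    default_profile_image: bool

def bot_likelihood_flags(profile: FollowerProfile,
                         now: Optional[datetime] = None) -> List[str]:
    # Return coarse heuristic flags suggesting an account may be automated or fake.
    # These are surface signals only -- the kind of evidence available without
    # platform cooperation -- not a definitive classification.
    now = now or datetime.now(timezone.utc)
    flags = []
    if (now - profile.created_at).days < 30:
        flags.append("very new account")
    if profile.tweet_count == 0:
        flags.append("has never tweeted")
    if profile.followers_count == 0 and profile.following_count > 100:
        flags.append("follows many accounts but has no followers")
    if profile.default_profile_image:
        flags.append("default profile image")
    return flags

# Example: a freshly created, silent account with no followers trips several flags.
suspect = FollowerProfile(
    created_at=datetime(2017, 5, 28, tzinfo=timezone.utc),
    followers_count=0,
    following_count=250,
    tweet_count=0,
    default_profile_image=True,
)
print(bot_likelihood_flags(suspect, now=datetime(2017, 6, 2, tzinfo=timezone.utc)))

Aggregated over a full follower list, heuristics like these can flag suspicious clusters of accounts, but as the article stresses, they cannot establish origin or intent without the registration details that only the platform holds.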

Read the full article in Slate.
