
Harder, Better, Faster, Stronger: International Law and the Future of Online PsyOps

Published on
23 Feb 2017

Recent years have seen an explosion of activity from states and non-state actors seeking to manipulate online political discourse at home and abroad. These efforts have leveraged a range of techniques, from swarms of automated bots to the systematic spreading of misleading or outright fabricated information through social media. Most dramatically, revelations at the time of writing suggest that the use of these techniques by the Russian government may have played a role in swaying the outcome of the 2016 US presidential election. Technological trends seem poised to make these types of online psychological operations (psyops) ever cheaper, more effective, and more difficult to attribute in the near future. Given the potential for this new generation of psyops to destabilize the global political environment, what can be done through international law and other forms of coordination to combat or control the impact of these persuasive campaigns? This paper examines this question in the context of state and non-state actor use of online psyops to undermine other states. It examines the current state of development of these techniques and projects future capabilities based on recent advances in artificial intelligence and quantitative social science. It then examines a set of applicable international legal frameworks, arguing that the existing body of laws and norms fails to adequately constrain the use of these techniques. Finally, it provides a set of potential interventions for exploration, considering both technical and legal approaches.

Download here.

You can see a video of the authors’ presentation, which took place at the Oxford Internet Institute, University of Oxford, on February 20th, 2017, here.

Timothy Hwang and Lea Rosen. “Harder, Better, Faster, Stronger: International Law and the Future of Online PsyOps.” Philip N. Howard and Samuel Woolley (Eds). Working Paper 2017.1. Oxford, UK: Project on Computational Propaganda. www.politicalbots.org.

