Gillian Bolsover

Gillian Bolsover researches politics, citizenship and civil society in the modern world, focusing on the changes associated with new media technologies and on producing research that takes China and other non-Western populations into account.

Email: gillian.bolsover@oii.ox.ac.uk

Tel: 01865 (2) 87239

Gillian Bolsover is a Research Associate at the Oxford Internet Institute who researches the effects of commercialisation, globalisation and new media and communication technologies on politics and civil society. She completed a DPhil at the OII in January 2017, researching how the commercialisation of online spaces affects their ability to provide a venue for political speech in different political systems, through a comparison of the US and China. Between January 2017 and January 2018, she worked as a Postdoctoral Researcher on the Computational Propaganda project, investigating bots, algorithms, misinformation and other forms of automated online political opinion manipulation.

Research interests

Political science; political and social theory; political economy; civil society and citizenship; critical methodologies; digital social research; computational social science; mixed-methods research; communications; sociology and culture; language and linguistics; identity and psychology; privacy, surveillance and censorship

Positions held at the OII

  • Research Associate, January 2018 –
  • Researcher, January 2017 – January 2018
  • DPhil Student, October 2012 – January 2017
  • Teaching Assistant, Digital Social Research Methods, October – December 2016
  • Teaching Assistant, Digital Social Research Methods, October – December 2015
  • Teaching Assistant, Information Visualisation, January – March 2015
  • Teaching Assistant, Digital Social Research Methods, October – December 2014
  • Teaching Assistant, Social Research Methods and the Internet, January – March 2014
  • Research Assistant, Global Centre for Cyber Security Capacity-Building, August 2013 – March 2014
  • Research Assistant, Global Internet Values Project, October 2012 – November 2013

Current projects

  • Computational Propaganda

    Participants: Professor Philip Howard, Sam Woolley, Gillian Bolsover

    This project will focus on how bots, algorithms and other forms of automation are used by political actors in countries around the world.

Chapters

  • Bolsover, G., Dutton, W., Law, G. and Dutta, S. (2014) "China and the US in the New Internet World: A Comparative Perspective" In: Society and the Internet: How Networks of Information and Communication are Changing Our Lives. Oxford University Press, USA.

Conference papers

  • Bolsover, G., Blank, G. and Dubois, E. (2014) A New Privacy Paradox: Young People and Privacy on Social Network Sites.
  • Bolsover, G. (2013) News in China’s new information environment: Dissemination patterns, opinion leaders and news commentary on Weibo.
  • Bolsover, G., Dutton, W.H., Law, G. and Dutta, S. (2013) Social Foundations of the Internet in China and the New Internet World: A Cross-National Comparative Perspective.

Reports

  • Bolsover, G. (2017) Computational Propaganda in China: An Alternative Model of a Widespread Practice. Oxford, UK: Computational Propaganda Research Project.
  • Dutton, W., Law, G., Bolsover, G. and Dutta, S. The Internet Trust Bubble: Global Values, Beliefs and Practices.

Other

  • Howard, P., Bradshaw, S., Kollanyi, B., Desigaud, C. and Bolsover, G. Junk News and Bots during the French Presidential Election: What Are French Voters Sharing Over Twitter?
  • Howard, P., Bolsover, G., Kollanyi, B., Bradshaw, S. and Neudert, L. Junk News and Bots during the U.S. Election: What Were Michigan Voters Sharing Over Twitter?
  • Bolsover, G. Constructing the virtual body.
  • Computational Propaganda: Bots, Targeting And The Future

    12 February 2018, NPR

    The December issue of the journal Big Data was dedicated to the problem of computational propaganda. In it, researchers Gillian Bolsover and Philip Howard, of the Oxford Internet Institute, define the dangers that need to be addressed.