OII Professor Helen Margetts discusses how the massive growth in Internet-mediated interactions creates a need for innovative methods to research online activity. Experimental laboratories — where subjects participate in games or information-seeking tasks on networked computers — have been used by experimental economists for some time, but the great expansion in online social and commercial activity means that they have growing utility in sociology and political science.
Experiments – or more technically, Randomised Controlled Trials (RCTs) – are the most exciting thing on the UK public policy horizon. In 2010, the incoming Coalition Government set up the Behavioural Insights Team (BIT) in the Cabinet Office to find innovative and cost-effective (cheap) ways to change people’s behaviour. Since then the team have run a number of exciting experiments with remarkable success, particularly in encouraging organ donation and timely payment of taxes. With Bad Science author Ben Goldacre, they have now published a guide to RCTs, and plenty more experiments are planned.
This sudden enthusiasm for experiments in UK government is very exciting. The Behavioural Insights Team is the first of its kind in the world: in the US there are few experiments at federal level, although there have been some well-publicised ones at local level, and the UK government has always been rather wary of the concept, there being a number of cultural barriers to the very word ‘experiment’ in British government. Experiments came to the fore in the previous administration’s MINDSPACE document. But what made them popular for public policy may well have been the 2008 book Nudge by Thaler and Sunstein, which shows that by knowing how people think, it is possible to design choice environments that make it “easier for people to choose what is best for themselves, their families, and their society.” Since then, the political scientist Peter John has published ‘Nudge, Nudge, Think, Think’, which has received positive coverage in The Economist (“The use of behavioural economics in public policy shows promise”) and the Financial Times (“Nudge, nudge. Think, think. Say no more …”), and has been reviewed by the LSE Review of Books (“Nudge, Nudge, Think, Think: experimenting with ways to change civic behaviour”).
But there is one thing missing here. Very few of these experiments use manipulation of online information environments as a way to change people’s behaviour. The Internet seems to hold enormous promise for ‘nudging’ by redesigning ‘choice environments’, yet Thaler and Sunstein’s book hardly mentions it, and none of the BIT’s experiments so far has used the Internet, although a new experiment looking at ways of encouraging court attendees to pay fines is based on text messages.
So, at the Oxford Internet Institute we are doing something about that. At OxLab, an experimental laboratory for the social sciences run by the OII and the Saïd Business School, we are running online experiments to test the impact of various online platforms on people’s behaviour. For example, two reports for the UK National Audit Office, Government on the Internet (2007) and Communicating with Customers (2009), carried out by a joint OII-LSE team, used experiments to see how people search for and find government-related information on the internet. Further experiments investigated the impact of various types of social influence, particularly social information about the behaviour of others and visibility (as opposed to anonymity), on people’s propensity to participate politically.
And the OII-edited journal Policy & Internet has been a good venue for experimentalists to publicise their work. Stephan Grimmelikhuijsen’s paper Transparency of Public Decision-Making: Towards Trust in Local Government? (Policy & Internet 2010; 2:1) reports an experiment to see whether transparency in local government decision-making actually leads to higher levels of trust. Interestingly, his results indicated that participants exposed to more information (in this case, full council minutes) rated the council’s competence significantly more negatively than those who did not access all the available information. Additionally, participants who received only restricted information about the minutes thought the council less honest than those who did not read them at all.
Note: This post was originally published on the Policy & Internet blog on . It might have been updated since then in its original location. The post gives the views of the author(s), and not necessarily the position of the Oxford Internet Institute.