Social Media Research: When Epistemic Cultures Collide


On June 17th the Proceedings of the National Academy of Sciences (PNAS) published a paper, Experimental evidence of massive-scale emotional contagion through social networks, describing a large-scale experiment on Facebook carried out in 2012. The first author, Adam Kramer, is a Facebook employee, while the second and third authors are academics from the University of California and Cornell University. The experiment described in the paper used sentiment analysis to filter the positive or negative emotional content in 689,000 Facebook users’ news-feeds. Since the paper was published it has generated a storm of interest online and in the mainstream media, most of it highly critical. For example, Clay Johnson, founder of the Blue State online media organisation, tweeted: “In the wake of both the Snowden stuff and the Cuba twitter stuff, the Facebook ‘transmission of anger’ experiment is terrifying.” Reaction has combined a deep-rooted suspicion of Facebook with more pointed criticisms that Facebook, and particularly its academic collaborators, were unethically experimenting on human subjects. Michelle Sayer argues that the academics’ involvement in the study was probably compliant with the Federal Research Regulations, but highlights the gap between the ethical requirements on state-funded academics to avoid harm and gain informed consent, and the much laxer requirements on commercial firms.

The PNAS paper’s authors claim that subjects gave their consent because the experiment was “consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook”. The use of A/B tests, in which the user base is split and two interface or service designs are tested to find out empirically which works better, is endemic on the web; it forms a central pillar for most companies claiming to follow Eric Ries’ Lean Startup model. What the discussion around the PNAS paper highlights are the different meanings of “research” and “consent” for web companies and academics. Following the sociology of Karin Knorr-Cetina, these are two epistemic cultures: “amalgam(s) of arrangements and mechanisms—bonded through affinity, necessity and historical coincidence—which in a given field, make up how we know what we know”. The arrangements and mechanisms of academic research include the processes and rules of ethics committees, while those of Facebook include a more liberal interpretation of “consent” and the panoply of the company’s performance indicators. The cultures overlap where both try to theorise the use of technology, but diverge on the purpose of that understanding.
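The mechanics behind an A/B test are straightforward. As a minimal sketch (the function and experiment names here are invented for illustration, and this is not Facebook’s actual implementation), a user base can be split deterministically into two arms by hashing each user’s identifier, so no per-user assignment record needs to be stored:

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically assign a user to one arm of an A/B test.

    Hashing the user id together with the experiment name gives a stable,
    roughly uniform split: the same user always sees the same variant,
    and different experiments split the user base independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same user always lands in the same arm of a given experiment:
assert assign_variant("user-42", "newsfeed-ranking") == assign_variant("user-42", "newsfeed-ranking")
```

Each arm is then shown a different design, and the company compares its performance indicators (click-through, time on site, and so on) between the two groups.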

As academics increasingly want to analyse “big data” to answer their research questions, these conflicts will become more frequent and possibly more significant. A whole academic cottage industry has developed around the analysis of social media data, but the basis for arguing that the data subjects in this research have given their consent is almost invariably their acceptance of the platform’s Terms of Service. These issues led to an interesting discussion at the University of Edinburgh’s Digital Day of Ideas following a presentation by Professor Procter of Warwick University on Big Data and the Co-production of Social Scientific Knowledge, which covered studies in the COSMOS project analysing large-scale Twitter data.

In their paper Tweeting the terror: modelling the social media reaction to the Woolwich terrorist attack, the COSMOS researchers automatically coded the sentiments in 427,330 tweets. The paper does not address whether the tweeters had consented to this analysis. That this is an analysis of personally identifiable data is elided when the paper says: “Number of followers, followees and total number of previous tweets were extracted from the streaming API metadata as social features of the tweets.” However, number of followers, followees and total number of previous tweets are primarily characteristics of the tweeter, not of the tweets. The research description plays down the classification of individual subjects, but it is implicit that the project is developing tools to classify individuals.
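The paper’s sentiment coding was done with specialist tooling, but the underlying idea can be shown with a toy lexicon-based classifier (the word lists and function name below are invented for illustration and bear no relation to the COSMOS tools):

```python
# Toy lexicons; real sentiment tools use far larger, weighted word lists.
POSITIVE = {"happy", "great", "love", "good", "support"}
NEGATIVE = {"sad", "terrible", "hate", "bad", "fear"}

def code_sentiment(tweet: str) -> str:
    """Classify a tweet as positive, negative or neutral by counting
    matches against the two lexicons."""
    words = set(tweet.lower().split())
    pos = len(words & POSITIVE)
    neg = len(words & NEGATIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

# Applied over a corpus, this yields aggregate sentiment counts:
tweets = ["love and support for london", "hate and fear", "no opinion"]
counts = {label: sum(code_sentiment(t) == label for t in tweets)
          for label in ("positive", "negative", "neutral")}
```

Even this trivial version makes the ethical point concrete: the classifier attaches a label to each individual message, and by extension to its author, whether or not the author ever agreed to be coded.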

The gap between developing applications within research-council-funded projects that follow academic ethical standards, and using those applications in ways that would fall foul of academic research ethics, is blurred in the COSMOS project’s final grant report from last year: “During the project we established research connections with key industry and public sector partners, these included: Google UK; EADS Innovation Works; Fujitsu and High Performance Computing Wales; Sage Publications; Office for National Statistics; Cabinet Office Office for Cyber Security and Information Assurance and the Identity Assurance Programme; Home Office Business Intelligence and Shared Services Programme; College of Policing; Metropolitan Police Service; Association of Chief Police Officers; UK Data Service; and the Welsh Government Equality, Diversity and Inclusion Division. EADS Innovation Works recently funded a research project in which we are applying our tension detection tool to detect hate speech in local communities in Wales. If the project is successful EADS intend to license our tension detection engine and include it in their WebLab application that is used by National Governments and large corporations worldwide. Fujitsu and High Performance Computing Wales have recently provided funding to integrate, test and host the COSMOS platform on scalable computing infrastructure, making it a sustainable resource for academics in the future. Meetings are also on-going with Sage Publications (London and Washington DC) in relation to distributing the COSMOS platform in the North American HE sector. We are also working with the Metropolitan Police Service on the development of a crime and disorder detection engine which will extend the tension detection tool. Later this year the Metropolitan Police Service is entering a procurement exercise to acquire a new social media monitoring solution. COSMOS has been identified as a potential solution and we were invited to submit an application.”
