Facebook conducts emotion experiments linked to military research on political opposition

By Don Barrett
8 July 2014

A study published on June 17 in the Proceedings of the National Academy of Sciences revealed that over 689,000 users of the social media site Facebook were tapped as subjects in a psychological study conducted without their knowledge or permission. The paper, titled “Experimental evidence of massive-scale emotional contagion through social networks,” was authored by Adam Kramer of Facebook and Jamie Guillory and Jeffrey Hancock of Cornell University.

The unwitting subjects had their feed of status updates from “friends” within the website altered as compared to a control group within the sample, so that posts containing “positive” or “negative” emotionally laden words were omitted. The posts of the subjects were themselves monitored for such words, to see whether they would exhibit imprinting of the emotional states conveyed by the censored reality presented to them. The paper concluded that “emotions spread via contagion through a network.”

Outrage rapidly spread across the Internet when it became clear that personal thoughts and descriptions were being inspected and altered as part of the study. Kramer attempted to quell the furor through a Facebook posting of his own. “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out,” he wrote. “At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.”

These professed concerns are belied by the publication record of the authors. Jeffrey Hancock of Cornell has received funding from the Pentagon-created Minerva Research Initiative, a funding source which seeks to “improve the ability of DoD to develop cutting-edge social science research” through the ability to “define and develop foundational knowledge about sources of present and future conflict.” 

Hancock’s Minerva-supported research “Modeling Discourse and Social Dynamics in Authoritarian Regimes” analyzes communications with an eye to being able to “predict socially significant states, such as leadership, status, familiarity of group members, personality, social cohesion, deception, and social disequilibrium. This research is expected not only to advance the social sciences but also to address key national security questions that require the processing of large amounts of textual communication.” His own faculty description page specifies his academic interests as work in “psychological and interpersonal dynamics of social media, deception, and language.”

Hancock’s collaborator Michael Macy at Cornell has enjoyed the same Minerva funding in what is an obvious militarizing of social science. His work seeks to trace the “critical mass (tipping point)” of social contagions through their “digital traces,” giving as examples such events as the 2011 Egyptian revolution, the 2011 Russian elections, and the 2013 protests in Turkey. Among the goals is “to identify individuals mobilized in a social contagion and when they become mobilized.”

Other Minerva-supported work directed by Stephen Kosack at the University of Washington “seeks to uncover the conditions under which political movements aimed at large-scale political and economic change originate, and what their characteristics and consequences are.” The scale of this initiative is hardly modest: it is intended to extend previous work on 23 countries to 58 countries, using “216 variables” to offer predictions. Data mining is clearly occurring on an enormous scale.

Hancock, who served on Cornell's Institutional Review Board for Human Participants from 2009 to 2011, can hardly claim ignorance of the ethical obligation to use informed and willing participants in psychological research. Indeed, Cornell's own guidelines for research state, “All research that involves human participants, regardless of the source of financial support, must be reviewed and either exempted from IRB review or approved by the Institutional Review Board (IRB) before it can be initiated.”

The Washington Post has reported that the Facebook study was not pre-approved by the Cornell IRB, though the board has since claimed that “no review by the Cornell Human Research Protection Program was required” because “Professor Hancock and Dr. Guillory did not participate in data collection and did not have access to user data.” Furthermore, although Facebook has since added blanket permissions for “research” to its fine-print user agreement, this was not present when the research was actually conducted in 2012.

The American Psychological Association guidelines for human research identify several essential elements of informed consent. They require subjects be informed of the purpose of research, duration, procedures, their right to decline and to withdraw once participation has begun, the consequences of declining or withdrawing, the limits of confidentiality of the study, any incentives for participation, any prospective research benefits, any potential risks, discomfort, or adverse effects, and whom to contact for questions about the research and the rights of participants.

Such “experiments” take place on a daily basis on the Internet. Occasionally the window is opened so that the magnitude of this data collection becomes visible. Max Schrems, a 24-year-old law student in Austria, spent six weeks and 23 e-mails in 2012 navigating one of the few successful demands for a data-dump on the information which Facebook had collected regarding his use. The dump comprised 1,222 printed pages. Facebook maintains records not only on the particulars of an individual's use, but on their “private” chat messages, the various users who share the same IP address and physical computer account, and the list of machine addresses used to access FB by each participant.

In 2008, the American Anthropological Association (AAA) complained to the US government regarding Project Minerva, noting that such funding posed “a potential conflict of interest” and that it may “undermine the practices of peer review that play such a vital role in maintaining the integrity of research in social science disciplines.” Adopting in part the AAA recommendations, the DoD partnered with the National Science Foundation (NSF) to cooperate on Minerva management and grant-making, giving the Pentagon, as the AAA noted in a subsequent critique, “decision-making power in deciding who sits on the panels.” They further wrote of “concerns that the program would discourage research in other important areas and undermine the role of the university as a place for independent discussion and critique of the military.”
