
Tens of thousands of academics and other observers of Internet life who did not know the name Adam Kramer on Sunday night certainly know it now. But on the chance you've been "off the grid" for the past 24 hours, Adam Kramer is the Facebook data scientist who served as the lead author on a research project that manipulated the positive and negative information in the Facebook News Feed to assess the emotional impact of positive and negative news on some 690,000 Facebook users.

To quote Mr. Kramer and his co-authors from the abstract of their work, recently published in the Proceedings of the National Academy of Sciences (PNAS), their work tested "whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred."

A short January 2011 profile of Kramer published by the American Psychological Association reveals that he earned his doctorate in social psychology at the University of Oregon. Given his graduate experience at a prominent research university, which presumably included some exposure to the ethics and appropriate protocols of (and informed consent for) research involving human subjects, ya gotta wonder how this particular Facebook project got past the design phase without someone (anyone?) expressing concerns about human subject issues.

As many observers have already noted, it's one thing for Facebook to apply "big data" analytics to unobtrusive, transactional data; it's quite another to "manipulate the environment" (i.e., mess with the News Feed on a Facebook account) to assess the emotional impact of the manipulation on some 690,000 Facebook users.

But while much of the uproar about Facebook's inappropriate manipulation of human subjects has been (appropriately!) directed at Kramer and his co-authors, missing from the commentary I've found on the Web thus far is any mention of the role of the (academic?) reviewers who read the manuscript and ultimately recommended it for publication by the National Academy of Sciences. (Note: Forbes reports that researchers at Cornell passed on reviewing the final paper, although Cornell researchers did help design the study.)

Perhaps like you, esteemed reader, I just don't get it. Conventional (and consensus) wisdom suggests that the Facebook study failed two (count 'em, TWO!) reviews related to human subjects research protocols. The first failure was the work of the design/research team which, given their university training and presumed experience with institutional review boards (IRBs), should have known that the methodology -- manipulating the News Feed without consent -- was inherently problematic. The second failure seems to be that of the editors and reviewers at PNAS, who apparently failed to take Mr. Kramer and his colleagues to task on human subject issues when they reviewed the submitted paper and subsequently recommended it for publication.

In his commentary on this episode of Facebook's newest assault on user privacy, NY Times Bits blogger Mike Isaac writes that "Sometimes, being wrong on the Internet means having to say you're sorry. And by now, Facebook is very, very good at saying sorry."

Yet as my 10-year-old son said to me one day, many years ago, when I tried to apologize for some parental sin of commission or omission, "sorry isn't good enough."

Sample Media Coverage of the Facebook Furor: New York Times, NY Times blog, NY Times Op-Ed by Jaron Lanier, Wall Street Journal, Forbes, Time Magazine (which includes Kramer's explanation and apology from his Facebook page), Meet a Facebook Data Scientist (March 2012), American Psychological Association profile (Jan 2011)
