Facebook tinkered with users’ feeds for a massive psychology experiment
By William Hughes
Jun 27, 2014 3:30 PM
Scientists at Facebook have published a paper showing that they manipulated the content seen by more than 600,000 users in an attempt to determine whether this would affect their emotional state. The paper, “Experimental evidence of massive-scale emotional contagion through social networks,” was published in The Proceedings Of The National Academy Of Sciences. It shows how Facebook data scientists tweaked the algorithm that determines which posts appear on users’ news feeds—specifically, researchers skewed the number of positive or negative terms seen by randomly selected users. Facebook then analyzed the future postings of those users over the course of a week to see if people responded with increased positivity or negativity of their own, thus answering the question of whether emotional states can be transmitted across a social network. Result: They can! Which is great news for Facebook data scientists hoping to prove a point about modern psychology. It’s less great for the people having their emotions secretly manipulated.
In order to sign up for Facebook, users must click a box saying they agree to the Facebook Data Use Policy, giving the company the right to access and use the information posted on the site. The policy lists a variety of potential uses for your data, most of them related to advertising, but there’s also a bit about “internal operations, including troubleshooting, data analysis, testing, research and service improvement.” In the study, the authors point out that they stayed within the data policy’s liberal constraints by using machine analysis to pick out positive and negative posts, meaning no user data containing personal information was actually viewed by human researchers. And there was no need to ask study “participants” for consent, as they’d already given it by agreeing to Facebook’s terms of service in the first place.
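The "machine analysis" the authors describe is essentially automated word counting: a post is tagged positive or negative based on whether it contains words from a sentiment lexicon (the study used the LIWC word lists). A minimal sketch of that idea, with illustrative stand-in word lists rather than the actual LIWC lexicon:

```python
# Simplified word-list sentiment tagging, loosely in the spirit of the
# study's automated analysis. The word sets below are illustrative
# assumptions, not the real LIWC lexicon.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def classify_post(text: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' by lexicon word counts."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    if pos > 0 and neg == 0:
        return "positive"
    if neg > 0 and pos == 0:
        return "negative"
    return "neutral"
```

Because the classification is purely mechanical, no human researcher ever reads the post text, which is the basis of the authors' claim that they stayed within the data policy.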
Facebook data scientist Adam Kramer is listed as the study’s lead author.
In response to concerns about the study, Adam D. I. Kramer posted the following statement to his Facebook page on June 29, 2014:

OK so. A lot of people have asked me about my and Jamie and Jeff’s recent study published in PNAS, and I wanted to give a brief public explanation. The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.
Regarding methodology, our research sought to investigate the above claim by very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012). Nobody’s posts were “hidden,” they just didn’t show up on some loads of Feed. Those posts were always visible on friends’ timelines, and could have shown up on subsequent News Feed loads. And we found the exact opposite to what was then the conventional wisdom: Seeing a certain kind of emotion (positive) encourages it rather than suppresses it.
And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it — the result was that people produced an average of one fewer emotional word, per thousand words, over the following week.
The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.
While we’ve always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices. The experiment in question was run in early 2012, and we have come a long way since then. Those review practices will also incorporate what we’ve learned from the reaction to this paper.
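The deprioritization Kramer describes is per-load, not permanent: an emotional post is merely skipped on some feed loads and remains eligible for later ones. A hypothetical sketch of that mechanic, with an assumed omission probability (the real rate was not a fixed constant like this):

```python
import random

def load_feed(posts, is_emotional, omit_prob=0.1, rng=random):
    """Return one News Feed load, skipping each emotional post
    with probability omit_prob. Skipped posts are not deleted and
    can still appear on subsequent loads or on friends' timelines."""
    feed = []
    for post in posts:
        if is_emotional(post) and rng.random() < omit_prob:
            continue  # hidden on this load only
        feed.append(post)
    return feed
```

With `omit_prob=0` every post appears; with `omit_prob=1` emotional posts are skipped on that load, which matches the claim that nothing was ever permanently hidden.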
In an interview the company released a few years ago, Kramer is quoted as saying he joined Facebook because “Facebook data constitutes the largest field study in the history of the world.” It’s a charming reminder that Facebook isn’t just the place you go to see pictures of your friends’ kids or your racist uncle’s latest rant against the government—it’s also an exciting research lab, with all of us as potential test subjects.