Facebook's emotion experiment angers users kept in the dark

  1. Some people are angry with Facebook's apparent lack of ethics after the social network teamed up with scientists and knowingly attempted to alter the emotional states of nearly 700,000 users.

    Social scientists at Cornell University and the University of California, San Francisco — along with Facebook — reduced the amount of either positive or negative content that appeared in users' news feeds.

    They were looking for something they called the "emotional contagion" effect — basically, whether seeing fewer positive stories would make users more negative, and vice versa.

    The results were published June 17 in the journal PNAS.
  2. Online commenters quickly condemned the study as unethical, worrying about the impact it may have had on people suffering from mental illnesses such as depression.

    They also expressed discontent that the subjects of the study did not appear to have given informed consent.
  3. Ethically using human subjects in social science research where conditions are manipulated requires their informed consent. This may have been legal, but it was most definitely not ethical.
  4. Facebook's data use policies — though many people complained they are hard to find — state the company is allowed to use information it receives about users for what it deems internal operations, "including troubleshooting, data analysis, testing, research and service improvement."

    As public outrage grew, one of the study's authors, Adam D. I. Kramer, released a public statement on his Facebook page, which he said was also to be considered Facebook's statement on the issue.
  5. In the statement, Kramer explained the scientists' position, saying they were intrigued by Facebook's emotional impact on people.

    The sample size was small, with only 0.04 per cent of users included (one in every 2,500 Facebook accounts), and the study lasted only one week, he wrote.

    "Our goal was never to upset anyone," he went on. "I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety."

    He added that the study took place in 2012 and that the company has improved its internal review practices since then.

    Some people quickly agreed with him, noting that Facebook users should delete their accounts if they do not wish to participate in similar research.
  6. I really don't see why people are freaking out about this. Facebook has manipulated which posts you see in your feed for years (maybe a decade). The only difference this time is that they published a paper and advanced psychological understanding in the process. If you want an unfiltered view of what your friends are doing, look at the news ticker in the upper corner of the page.
  7. You don't want to be an experiment like this? Don't use Facebook. You agreed when you signed up.
  8. I love the fact that people want to slam Facebook over such a tiny thing but they don't speak up about the blatant use of far worse tactics to drive sales or manipulate the customer experience. People who fear these things need to learn that all of their actions are optional. Don't like it? Don't do it, simple as that.
  9. Others were concerned to realize that, simply by having a Facebook account, they had agreed to participate in such internal research. They claimed Facebook's policies were too difficult to understand.
  10. One reason I was surprised about this study is that we didn't think we'd agreed to participation, but the license agreement says we did. It happens all the time, and it always sucks: There is disagreement between the legal situation described by legal agreements and a user's understanding as learned from user experience. For instance, my experience says "Kevin's Photos" are mine, but the terms of use might say (hell if I'm going to read them) that Facebook is showing *its own copies of* my photos. I think user experience should be designed to represent the legal truth.
  11. Despite appreciating Kramer's clarifications, many users were still angry over the fact that the company attempted to alter the emotions of certain users in the name of science.
  12. Amen to that. I appreciate the statement, but emotional manipulation is still emotional manipulation, no matter how small a sample it affected.
  13. I think the entire Internet/media/online presence is an experiment in human behavior. We have no idea what it's doing to people; we are just starting to get it, and it's not very pretty. However, while I don't hold corporations responsible for my part in it, I do feel that manipulating my exposure and decreasing contact with people I thought I'd purposefully included in my realm of contacts seems shady. I disagree with the power to manipulate emotions: my mood and responses to the world are my choice, and FB deciding what world I see is bogus, IMHO. I don't watch murder movies for a reason, and I block constant negativity as often as possible. I don't need someone else making that choice for me.
  14. I care less about my bank account being manipulated than my emotional state of being. I do see money and health as two separate items, but that's just me.
  15. And many others still had unanswered questions after reading Kramer's post, so they commented, hoping to get more insight from him.
  16. Adam, can you tell us more about what the internal review process looks like? Do you weed out users under the age of 18? Are users notified in any way after a study that they were part of one? Is Facebook considering getting people to opt in to being part of psychological studies that involve active manipulation of their Facebook experience to see how it affects their mood?
  17. Did UCSF's or Cornell's Institutional Review Boards approve the study?
  18. But they quickly grew dismayed when Kramer appeared unresponsive, despite his having posted one earlier reply telling people to consider his statement to reflect the company's stance on the situation.
  19. I also think the decision to post this defense, and then not to participate in the conversation — on Facebook! — that grew from it is... strange. Isn't that the point of the posting? To start the dialogue? You really want all these unanswered questions sitting here? Seems like an odd decision.
  20. One person attempted to clarify the situation, saying that Kramer was currently unavailable.
  21. Jay, Adam is currently traveling from the East Coast to the West Coast, so is likely unavailable for comment until he gets back online.