Facebook testing encroaches on privacy

In the fall of 2013, news broke that a scientific paper was being submitted based on a social experiment conducted on Facebook. In the months that followed, the media reported that many were outraged, considering the experiment completely unacceptable. No Facebook users were explicitly made aware that the study was being conducted, and The New York Times summed up the feelings of numerous people in one sentence: “To Facebook, we are all lab rats” (New York Times, “Facebook tinkers with users’ emotions,” 6.29.14).

The published study, “Experimental evidence of massive-scale emotional contagion through social networks” (2014), examines whether positive or negative posts in one’s Facebook newsfeed affect the content the user subsequently posts. The study states that the experimenters manipulated 689,003 people’s Facebook newsfeeds for one week (January 11-18, 2012), during which over 3,000,000 posts were analyzed with Linguistic Inquiry and Word Count software, which tallies how often key words deemed positive or negative appear in a post.
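For readers curious how this kind of word-count analysis works in practice, below is a minimal sketch in Python. It illustrates only the general idea of tallying positive and negative key words in a post; the word lists and classification rule are hypothetical stand-ins, not the actual LIWC dictionaries or the code used in the study.

# Illustrative sketch only: a simplified keyword-count classifier in the spirit
# of LIWC-style analysis. The mini-dictionaries and tie-breaking rule below are
# hypothetical, not the real LIWC categories or Facebook's implementation.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "hate", "terrible", "lonely"}

def classify_post(text):
    """Label a post positive, negative, or neutral by counting key words."""
    words = text.lower().split()
    pos = sum(1 for w in words if w in POSITIVE_WORDS)
    neg = sum(1 for w in words if w in NEGATIVE_WORDS)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

if __name__ == "__main__":
    sample_posts = [
        "I love this wonderful weather, so happy today!",
        "Feeling sad and lonely tonight.",
        "Heading to the library to study.",
    ]
    for post in sample_posts:
        print(classify_post(post), "-", post)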

The results of the study did suggest that the emotions expressed in one’s newsfeed influence the user’s own mood, and that emotional contagion occurs within social networks and online interactions.

However, for many, a statistically significant result does not justify the lack of notification about the study. Many complained that they never consented to be a part of this study. The authors quickly responded that, simply by using Facebook, users had in fact consented to be a part of the study and to have their data monitored. The authors noted in their paper, “[The work] was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research” (Forbes, “The Facebook experiment: what it means for you,” 8.4.14).

Many were not okay with being duped by the fine print one agrees to when creating a Facebook account, but a lot of people were also not surprised. A Vassar student from the Class of 2015 said she felt “uncomfortable with Facebook conducting these experiments, but not surprised in the least,” and added that she will continue to use Facebook.

Brian Blau, a technology analyst with Gartner, a research firm that has studied this incident thoroughly, put it nicely when he said, “Facebook didn’t do anything illegal, but they didn’t do right by their customers. Doing psychological testing on people crosses the line.”

Blau is right on the money. A major detail to note about this study is the fact that participants were not able to opt out because they were not told when the experiment was occurring or what it entailed. Yes, users agreed to Facebook’s terms of service, but I am disappointed that no one in the scientific community put a stop to this experiment for ethical reasons. Robert Klitzman, a professor of psychiatry and director of the Masters of Bioethics Program at Columbia University, shares this view and places blame on the universities affiliated with the research: Cornell and the University of California at San Francisco (CNN: Opinion, “Did Facebook’s experiment violate ethics,” 7.2.14).

Privacy activist Lauren Weinstein took to Twitter with her outrage over how the experiment was conducted, with shaky consent and information withheld from participants who could not opt out: “I wonder if Facebook KILLED anyone with their emotion manipulation stunt. At their scale and with depressed people out there, it’s possible.” The backlash led to an apology from lead researcher Adam D.I. Kramer.

So was the significant finding worth the outcry that followed? Kramer suggests it wasn’t, stating, “In hindsight, the research benefits of the paper may not have justified all of this anxiety” (New York Times, “Facebook tinkers with users’ emotions,” 6.29.14). Yet even with this acknowledgment, Facebook is continuing to run studies.

Facebook has recently “revamped” its policy regarding experiments; however, none of the revamping involves providing information to users. The new guidelines say that Facebook researchers must get sensitive projects approved by a new committee and that Facebook will better educate its employees about research policies. Users, for their part, simply agree to a “consent for research” clause that was added after the controversy over “Experimental evidence of massive-scale emotional contagion through social networks” (Yahoo, “Facebook promises to review user experiments more carefully,” 10.3.14). While I disagree with Facebook’s research methods, much of the public has been informed of what Facebook is doing and, despite disgust with the behavior, continues to log on.

Honestly, I hope someone from the scientific community steps in and makes sure that, if Facebook continues to run experiments that may severely impact someone’s well-being, participants fully consent and have the option to opt out of the research. However, Facebook is by no means the first Internet company to compile data for research purposes without users’ full knowledge. Google and Twitter, for instance, are also repeat offenders (The Wall Street Journal, “Furor erupts over Facebook experiment on users,” 6.30.14). Yet, as far as I could find, neither of those companies has attempted to manipulate users’ emotions. Facebook promises to do better with research ethics, and I hope it keeps its word.

—Delaney Fisher ’15 is a neuroscience major.
