Social network Facebook landed itself in incredibly hot water this week when it was revealed it intentionally manipulated the emotions of 689,003 of its users, without their knowledge or permission, as part of a "science experiment."
The study, published in a paper titled "Experimental Evidence of Massive-Scale Emotional Contagion Through Social Networks," examined a concept called emotional contagion. Facebook intentionally withheld positive status messages from a sizable portion of its users for a week, just to see whether doing so would make their own status messages less positive.
But while the actual effects of the study on participants’ psyches were minimal – "people produced an average of one fewer emotional word, per thousand words, over the following week," explains Facebook data scientist and experiment designer Adam D.I. Kramer – news that the study even happened is provoking far sharper negative emotions across the Internet and beyond, with independent experts crying foul.
Most scientific experiments involving psychology are performed under strict ethical guidelines for good reason, especially when the true nature of the experiment must be withheld from its subjects for the duration. American Psychological Association (APA) guidelines call for the "informed consent" of participants and for the true nature of the study to be revealed to those involved "as early as is feasible," neither of which seems to have happened here.
"The study harmed participants," says University of Maryland Law Professor James Grimmelmann. "This is bad, even for Facebook."
Yesterday, Facebook responded to the growing concern over its psychological experiment with a statement.
"This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account," the statement reads. "We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely."
Kramer, meanwhile, apologized "for the way the paper described the research and any anxiety it caused" in a public status update on Facebook that largely defended the study. "In hindsight, the research benefits of the paper may not have justified all of this anxiety."
It’s not ethical to play with people’s heads without their knowledge. But that’s almost exactly what Facebook did – in the name of science, in the name of PR, and in the name of better learning how to tweak your news feed to make it more engaging. Its tone-deaf response to the scandal suggests the company doesn’t understand what the big deal is, raising legitimate concern over whether the social network will simply repeat its mistake.
Sadly, there’s no Facebook privacy setting that will allow you to opt out of future psychological research being performed on you in secret by the social network – the company believes you’ve already given consent as part of the site’s Data Use Policy. The only way to truly avoid becoming a guinea pig is to permanently delete your Facebook account.
[Concerned woman with tablet via Shutterstock]