Tech Made Simple




Facebook Performs "Unethical" Psych Experiment on 700,000 Users

by Fox Van Allen on June 30, 2014

Social network Facebook landed itself in incredibly hot water this week when it was revealed that the company intentionally manipulated the emotions of 689,003 of its users, without their permission or knowledge, as part of a "science experiment."

The study, formally published in an article called "Experimental Evidence of Massive-Scale Emotional Contagion Through Social Networks," looks at a concept called emotional contagion. Facebook intentionally withheld positive status messages from a sizable portion of its users for a week just to see if it would make their own status messages less positive.

It did.

But while the actual effects of the study on participants’ psyches were minimal – "people produced an average of one fewer emotional word, per thousand words, over the following week," explains Facebook Data Scientist and experiment designer Adam D.I. Kramer – news that the study even happened is causing far sharper negative emotions across the Internet and beyond, with independent experts calling foul.

Most legitimate scientific experiments dealing with psychology are performed under strict ethical guidelines for good reason, especially when the true nature of the experiment needs to be withheld from its subjects for its duration. American Psychological Association (APA) guidelines call for the "informed consent" of participants and for the true nature of the study to be revealed to those involved "as early as is feasible," neither of which seems to have happened here.

"The study harmed participants," says University of Maryland Law Professor James Grimmelmann. "This is bad, even for Facebook."

Yesterday, Facebook responded to the growing concern over its psychological experiment with a statement.

"This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account," the statement reads. "We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely."

Kramer, meanwhile, apologized "for the way the paper described the research and any anxiety it caused" in a public status update on Facebook that largely defends the study. "In hindsight, the research benefits of the paper may not have justified all of this anxiety."

It’s not ethical to play with people’s heads without their knowledge. But that’s almost exactly what Facebook did in the name of science, in the name of PR and in the name of learning how to tweak your news feed to make it more engaging. Its tone-deaf response to the scandal suggests that the company doesn’t understand what the big deal is, raising legitimate concern over whether the social network will simply repeat its mistake.

Sadly, there’s no Facebook privacy setting that will allow you to opt out of future psychological research being performed on you in secret by the social network – the company believes you’ve already given consent as part of the site’s Data Use Policy. The only way to truly avoid becoming a guinea pig is to permanently delete your Facebook account.

[Concerned woman with tablet via Shutterstock]

Topics

Internet & Networking, News, Computers and Software, Health & Fitness, Blog, Facebook, Privacy, Social Networking


Discussion


From Patricia Rochester on June 30, 2014 :: 3:20 pm


Apparently Mr Zuckerberg assumes that if you are using his website you are fair game for him to do whatever he wants whether it’s right or wrong.



© Techlicious LLC.