Facebook: We're Not Responsible for the News Feed Political Echo Chamber

by on May 08, 2015
in Facebook, News, Computers and Software, Blog, Social Networking :: 0 comments

Facebook is no stranger to complaints over how its News Feed algorithm affects political discourse in this country. After all, your News Feed is designed to prioritize content based on what you’ve liked, clicked and shared in the past. That means diehard conservatives don’t see much content from liberal sources, while left-leaning Facebookers don’t see a lot of right-leaning content.

But is this really the fault of Facebook’s algorithm? On Thursday, the social network published a study in the journal Science that suggests its algorithm isn’t to blame for the one-sided partisan nature of your News Feed – you are.

"This is the first time we’ve been able to quantify these effects," explains Eytan Bakshy, the Facebook data scientist behind the study. "You would think that if there was an echo chamber, you would not be exposed to any conflicting information, but that’s not the case here."

Specifically, the study shows that the Facebook News Feed algorithm does filter out content from sources you’re not likely to agree with politically, but at a relatively low rate. Approximately 1 in 20 hard news stories from liberal sources was filtered from conservatives’ News Feeds; 1 in 13 hard news stories from conservative sources was filtered from the News Feeds of Facebook users who self-identify as liberal. But even when you are shown content from a source you’re likely to disagree with, the study reveals that you’re not likely to click through to read it anyway. Conservatives clicked on 17% of the news articles from liberal sources that they saw; liberals clicked on just 7% of news articles from conservative sources.

"Compared to algorithmic ranking, individuals’ choices about what to consume had a stronger effect limiting exposure to cross-cutting content," Bakshy concludes.

It’s not surprising to hear that we harbor part of the blame for the one-sided political nature of our News Feeds. As humans, we’re more likely to interact with those we agree with than those we don’t. And, of course, it’s not unusual for people to block friends based on their political views.

Already, though, critics are hammering away at the Facebook study. Christian Sandvig of Microsoft Research New England’s Social Media Collective blog has termed it the “It’s Not Our Fault” study, suggesting that Facebook is spinning the results. "There is no scenario in which ‘user choices’ vs. ‘the algorithm’ can be traded off, because they happen together," he writes. "Users select from what the algorithm already filtered for them. It is a sequence."

Overall, yes, Facebook is right – when it comes to limiting our exposure to differing views online, we are our own worst enemy. But at the same time, the study shows that the Facebook News Feed algorithm compounds the problem by making it even harder to see content we're likely to disagree with. Hopefully, Facebook will expand its pilot News Feed customization settings to give us the ability to make these filtering decisions on our own.

[Handshake with American Flag backdrop via Shutterstock]


© 2015 Techlicious LLC.