The Content Strategist

How Facebook’s Filter Bubble Warped My Perception of Reality

I couldn’t believe Donald Trump kept winning. With every primary he won, every headline he weaseled his way into, and every incendiary statement he made, it all seemed unbelievable. The collective reaction seemed to be: No one believed this could happen.

Liberals, moderates, and even conservatives siding against Trump watch his meteoric rise with dropped jaws. They shouldn’t. But it’s understandable why they do.

I am a young feminist woman living in New York City, and I recently graduated from New York University with a music degree. My closest circle of friends is a multicultural group spanning different sexual orientations and economic backgrounds. I am so far removed from the Trump voting base that I couldn’t name one friend voting for him. Millions of people are, but if you looked at my social media feeds, you wouldn’t think so.

Why? Because Facebook’s algorithm—and the algorithms of every other social network—has gotten frighteningly intelligent. The content personalization is so selective that if I based my worldview on the information I got from Facebook, I’d assume that nobody was voting for Trump, Bernie was going to sweep the primaries, and all elections were rigged.

Content personalization is an extremely useful tool that helps companies provide consumers with the content and products they’re likely to enjoy. Behind Facebook’s manipulation of political discourse is the same logic that makes sure great movie and TV recommendations show up on the Netflix homepage. But sometimes it can go too far.

Facebook’s newsfeed algorithm is designed to surface stories that will interest each individual user, based on the friends they interact with and the links they click on. In a way, this system mimics life outside the internet, where we get to choose who we spend time with and where we focus our attention. But that’s the problem. On the internet, which is stocked full of new information every day, users are exposed to only a tiny fraction of all the content out there.
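The filtering described above can be sketched as a toy ranking function. To be clear, everything here is hypothetical: the names, weights, and scoring formula are illustrative assumptions, not Facebook’s actual system. The point is only to show how ranking by past engagement quietly narrows what you see.

```python
# Toy sketch of engagement-based feed ranking (hypothetical, NOT Facebook's
# actual algorithm): stories from friends you interact with most, on topics
# you click most, rise to the top; everything else is silently filtered out.

from dataclasses import dataclass

@dataclass
class Story:
    author: str
    topic: str

# Hypothetical engagement history for one user.
friend_interactions = {"alice": 42, "bob": 3}   # past likes/comments per friend
topic_clicks = {"music": 30, "politics": 1}     # past clicks per topic

def score(story: Story) -> float:
    # Weight a story by how often the user engages with its author and topic.
    return friend_interactions.get(story.author, 0) + topic_clicks.get(story.topic, 0)

def rank_feed(stories: list[Story], limit: int = 2) -> list[Story]:
    # Show only the top `limit` stories; the rest never reach the user.
    return sorted(stories, key=score, reverse=True)[:limit]

feed = rank_feed([
    Story("alice", "music"),
    Story("bob", "politics"),
    Story("alice", "politics"),
    Story("bob", "music"),
])
# The least-engaged voice ("bob" posting about "politics") never appears,
# even though it is exactly the perspective the user hasn't seen before.
```

Notice that nothing in the sketch is malicious: each step optimizes for what the user already likes, and the exclusion of unfamiliar voices falls out as a side effect.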

A few weeks ago, Mark Zuckerberg met with a group of high-profile conservatives to discuss the accusation that Facebook manipulated its Trending Topics section to blacklist conservative news stories. At the end of the meeting, some conservatives left with assurances that Facebook would address these concerns. But fixing Trending Topics won’t fix the underlying problem: Algorithms are essentially censoring information to enhance user experience.

In a TED Talk, Upworthy co-founder and chief executive Eli Pariser called these censored online spaces “filter bubbles.” Online experience can get so specialized that all information that contradicts your values will be edited out. The scary part is you’ll never see what’s been erased.

“There is no standard Google anymore,” Pariser said. “You can’t see how different your search results are from anyone else’s.” Every time we think we’re getting the full picture, even in a theoretically objective space such as a search engine, we’re using a manipulated piece of technology bent more on customer satisfaction than truth.

According to FiveThirtyEight, Donald Trump has been leading the Republican polls throughout the entire election. But on the Facebook feeds of people like me, the voices of Trump supporters have been silenced. Or perhaps muted is a better way to put it. I’ve read a few articles about who supports him and why, but I had to search for them. Without putting in the extra effort to educate myself, Facebook would just continue reinforcing my biases. And if I weren’t already clicking on political articles, my News Feed would be utterly devoid of national news. In a democracy, we decide our representation based on the opinions of the population.

How can we have a conversation if we can’t even hear other voices?

My biases are reinforced enough by the friends I spend time with, my job, my city. I make choices regularly based on my values, building a life on principles I consider important. But adhering to my principles and reading only about my values are two very different things. In order to be truly informed, we need to see the whole story, or as much of it as possible. I don’t want to have my worldview blindly reinforced. I want to know what’s going on on the other side of the digital wall.