Social

Making Sense of Facebook’s Two Latest Algorithm Changes

Back in 2014, Facebook decided enough was enough: It was time to crack down on the clickbait plaguing users' News Feeds.

Their explanation was simple. People may click on articles with baiting headlines (think of stories like "You'll Never Guess What Facebook Did To Its News Feed. MINDBLOWING!"), but, according to an internal Facebook survey, they actually prefer stories with headlines that "helped them decide if they wanted to read the full article."

Now, in 2016, Facebook has decided enough is enough again: Clickbait has to go. But for real this time.

It's obvious that previous efforts to purge the News Feed of clickbait had largely failed. Previously, Facebook used the tell-tale sign of a bounce to determine if a post was clickbait: Basically, if people clicked on a story and then almost immediately switched back to the Feed, Facebook interpreted that behavior to mean the post had clickbaited the user.

This time, however, it’s using an algorithmic filter “similar to how many email spam filters work,” according to a blog post announcing the change. This filter looks for common phrasing and words used in clickbait headlines, and then punishes the posts—and the Pages that share them—by bumping them down in people’s News Feeds.
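Facebook hasn't published the details of its filter, but the spam-filter analogy suggests something like phrase matching plus a ranking penalty. Here's a minimal sketch of that idea; the phrase list and the penalty weight are purely illustrative assumptions, not Facebook's actual signals:

```python
# Hypothetical phrases commonly seen in clickbait headlines.
# Facebook's real signal set is not public.
CLICKBAIT_PHRASES = [
    "you'll never guess",
    "what happens next",
    "is hard to believe",
    "will shock you",
    "mindblowing",
]

def clickbait_score(headline: str) -> int:
    """Count how many known clickbait phrases appear in a headline."""
    text = headline.lower()
    return sum(1 for phrase in CLICKBAIT_PHRASES if phrase in text)

def rank_penalty(base_rank: float, headline: str) -> float:
    """Demote a post's rank for each clickbait phrase its headline contains.
    The 0.5-per-phrase penalty is an arbitrary illustrative weight."""
    return base_rank - 0.5 * clickbait_score(headline)
```

Run against one of Facebook's own example headlines, `clickbait_score("He Put Garlic In His Shoes Before Going To Bed And What Happens Next Is Hard To Believe")` returns 2, so a post with that headline would be bumped down relative to a plainly worded one.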

In other words, if you’re a publisher or a brand using obnoxious clickbait headlines on Facebook, it’s time to either figure out the next loophole or change up your headline format. If you want an audience that actually respects your brand, you should already be doing the latter.

If you’re worried that you might fall into the clickbait camp, Facebook gave some specific examples:

We’ve heard from people that they specifically want to see fewer stories with clickbait headlines or link titles. These are headlines that intentionally leave out crucial information, or mislead people, forcing people to click to find out the answer. For example: “When She Looked Under Her Couch Cushions And Saw THIS… I Was SHOCKED!”; “He Put Garlic In His Shoes Before Going To Bed And What Happens Next Is Hard To Believe”; or “The Dog Barked At The Deliveryman And His Reaction Was Priceless.”

In other words, if Upworthy is your spirit animal, it’s probably time to reevaluate your life.

But seriously, Facebook really, really wants you to inform people

A week after Facebook announced the clickbait crackdown, the company came out with yet another tweak to the News Feed. This time, however, it was much more vague. Facebook said that it will begin rewarding posts that users find “personally informative.”

What exactly does Facebook mean by informative, you ask? That’s a great question, and one the company doesn’t really answer. Per Facebook’s announcement post:

One of our News Feed values is that the stories in your feed should be informative. What makes someone feel informed about the world is personal. Something that one person finds informative may be different from what another person finds informative. This could be a news article on a current event, a story about your favorite celebrity, a piece of local news, a review of an upcoming movie, a recipe or anything that informs you.

Yes, “anything that informs you” is what Facebook defines as informative content.

What the change really boils down to is Facebook adding a new signal to its algorithm, drawn from its Feed Quality Program—basically a global survey that asks a small fraction of users questions about their News Feeds—to help determine whether a post is informative. Facebook then combines that signal with people's previous Likes and engagements to determine whether a given user would find a post personally informative. If so, that post will appear higher in their Feed.
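Facebook hasn't disclosed how these signals are weighted, but the combination it describes amounts to a blended score. The sketch below is a guess at the shape of such a blend; the signal names and weights are invented for illustration:

```python
def informative_score(survey_signal: float,
                      like_affinity: float,
                      past_engagement: float,
                      weights: tuple = (0.5, 0.3, 0.2)) -> float:
    """Blend a survey-based signal with a user's Like history and past
    engagement into one ranking score. All inputs are assumed to be
    normalized to [0, 1]; the weights are illustrative, not Facebook's."""
    w_survey, w_like, w_engage = weights
    return (w_survey * survey_signal
            + w_like * like_affinity
            + w_engage * past_engagement)
```

Under this toy model, a post that survey respondents rate as informative gets a head start, but a user's own history can still pull it up or down in their particular Feed.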

For brands and publishers, it’s yet another reminder that Facebook will continue to enforce its News Feed values. The tech giant wants a certain level of quality within its core product. So—much like Google does with its strict SEO rules—the company is telling brands and publishers to either fall in line or get lost.

Image by Getty Images