Bot Traffic May Be Ruining Your Content Strategy, and Google Knows It
On July 30, Google announced a new bot filter, alpha tested by Nestlé, that allows Google Analytics users to exclude known bots and spiders from traffic statistics. If you’re a publisher—particularly a brand publisher—you should pay attention.
Bots just may be ruining your content strategy.
Bot traffic—including scrapers, hackers, spammers, and impersonators—has been estimated to be as high as 61 percent of all traffic, according to a 2013 report by Incapsula. The sheer volume of web traffic coming from bots not only hurts the ability of publishers to accurately measure the success of their content, but also their ability to plan for the future. A swarm of bots could, in some circumstances, give you the false sense that a particular tactic is working and lead to poor editorial and business decisions.
For traditional publishers, bot traffic can provide an upside by inflating pageview numbers and ad revenue, but for brand publishers, it's nothing but a plague. After all, smart brand publishers are primarily concerned with building relationships with actual humans and tracking how those relationships develop, and bots mess that up. To use another metaphor, it's like having a bunch of androids show up to your pep rally; suddenly, the crowd goes from 90 percent engaged to 70 percent lifeless.
“If you use [Google] Analytics to make decisions, then you want the data as clean as possible,” explains Andy Crestodina, co-founder and strategic director at Orbit Media Studios. “That means filtering out any traffic that aren’t your visitors. Almost every site has Analytics set up to filter out traffic from their own office. Filtering out bot traffic is similar. The better the data, the better the Analytics and the better your decisions will be.”
This raises the question: If false traffic is a big issue, and Google prides itself on accurate Analytics data, why has it taken the company so long to implement a solution?
“I think that the Google Analytics team is responding to the fact that more and more bots are out there executing code, when it was not normal practice a few years ago,” says Yehoshua Coren, founder and principal of Analytics Ninja.
Google’s bot filtering solution is far from perfect. The update only filters out some non-human traffic, excluding known bots and spiders from the Interactive Advertising Bureau list, which is updated monthly. That list, however, is far from comprehensive, covering only a fraction of the bots and spiders in the wild.
Luckily, there are other ways to spot bots, says Coren.
“With a quick look, it’s pretty easy to identify abnormal traffic,” Coren says. He typically looks for high bounce rates and direct traffic in the Google Analytics dashboard, and finds that it usually comes from just one city and/or one ISP domain. He looks for traffic spikes coming through direct traffic, which is how bots typically show up. Before Google offered this new feature, he would simply create a filter or use advanced segments to exclude specific cities or domains.
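The heuristic Coren describes—direct traffic with a near-total bounce rate, concentrated in one city or ISP domain—can be sketched in code. The snippet below is a hypothetical illustration, not the Google Analytics API: the record format, field names, and thresholds are all assumptions, standing in for data you might export from your analytics dashboard before deciding what to exclude with a filter or segment.

```python
# Hedged sketch: flag likely bot segments in exported analytics data.
# The Segment record, field names, and thresholds are illustrative
# assumptions, not part of any real Google Analytics export or API.

from dataclasses import dataclass

@dataclass
class Segment:
    city: str
    source: str         # e.g. "(direct)" or "google / organic"
    sessions: int
    bounce_rate: float  # fraction between 0.0 and 1.0

def flag_bot_segments(segments, bounce_threshold=0.9, spike_ratio=0.5):
    """Return segments matching the bot signature described in the text:
    direct traffic, a near-total bounce rate, and a disproportionate
    share of total sessions coming from a single city/source pair."""
    total = sum(s.sessions for s in segments) or 1
    suspects = []
    for s in segments:
        is_direct = s.source == "(direct)"
        high_bounce = s.bounce_rate >= bounce_threshold
        spike = s.sessions / total >= spike_ratio
        if is_direct and high_bounce and spike:
            suspects.append(s)
    return suspects

# Example data with one classic bot signature (hypothetical numbers).
data = [
    Segment("Ashburn", "(direct)", 12000, 0.98),
    Segment("Chicago", "google / organic", 3000, 0.42),
    Segment("New York", "(direct)", 800, 0.55),
]
for s in flag_bot_segments(data):
    print(f"Candidate for exclusion filter: city={s.city}, source={s.source}")
```

In this toy data set only the Ashburn segment trips all three checks; in practice, the flagged cities or domains are what you would exclude by hand with a Google Analytics filter or advanced segment, as Coren did before the built-in option existed.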
That kind of next-level navigation of Google Analytics may sound daunting, but it’s likely necessary. You can’t rely on Google to get rid of all the bots for you; you need to be your own John Connor.