If you live in a busy city, more than likely you’re being pegged by ads as someone you’re not.
Wait for the bus every night next to a Mexican restaurant? Ad data probably thinks you are a burrito addict. A female recent college grad living in Harlem? Marketing data probably thinks you’re a low-income single mother.
It’s not the pretty picture that data-driven advertising was supposed to be. As discussed at Ad Age’s Data Conference and in several recent articles, high levels of personalization—the practice of delivering ads tailored to a specific user based on rich sources of tracking data—are coming under scrutiny for missing the mark.
The cold, hard truth, it turns out, is that even the creepiest-sounding forms of personalization usually aren’t that personal at all.
So explained Ad Age’s data reporter, Kate Kaye, and ProPublica’s Julia Angwin at the aforementioned Ad Age Data Conference. Both turned themselves into guinea pigs to evaluate the effectiveness of ad data tracking and to gauge just how real privacy concerns actually are.
Kaye let five data-gathering firms track her movements for an article in exchange for access to their analysis of her. She wanted to involve even more companies, but many were reluctant, and the results from the firms that did agree to work with her revealed a likely reason: They just weren’t very good. Most of the time, they had her profile all wrong, from her income to her housing situation. “My trackers found me only sporadically and couldn’t always shadow me in my home city of Jersey City, N.J., which is apparently the Bermuda Triangle of data collection,” wrote Kaye.
Kaye, who didn’t reveal her ethnicity in her article, was pegged as a medley of different races representative of her neighborhood. In some cases, trackers got her movements wrong enough to place her in neighborhoods she never set foot in; some thought she visited places she simply walked by.
To be sure, the tracking was correct sometimes. But often that was when she shared her data directly with apps, some of which she used only for the sake of the experiment. So the trackers gained little real advantage there.
ProPublica’s Angwin had a similar experience when she tried to go completely data-tracking free. “I stopped using Google search; I used all sorts of ad-blocking technology; I was maybe 50 percent successful,” she said on a panel during Ad Age’s conference.
She, too, found that while the data brokers knew a lot about her, they were largely unsuccessful at piecing it together into a meaningful whole. It’s one thing to see that someone visited a website and send targeted ads afterwards. It’s quite another to pull a range of offline, highly personal data signals into an accurate composite profile.
“I was from Harlem, a single mother, low-income, with low educational attainment,” said Angwin, a Columbia University MBA recipient with a celebrated career in media. She seemed troubled, wondering, in her words, “how a business could be built around something so verifiably wrong.”
But Angwin had some solutions to offer. She believes all the difficulty with data is actually an opportunity to put more power into consumers’ hands. “I feel slightly reassured by that bad data,” she said, acknowledging the privacy relief it offers. “But at the same time, I feel slightly disturbed by that.”
Advocates for better data practice have ideas for handling data differently than it’s handled now. There’s a slowly emerging consensus, expressed in recent books like Dataclysm, that data can be used for good or evil—and either way, it’s going to influence behavior. For that reason, it’s imperative that consumers have a more active say in how it’s used.
Kaye’s experiment showed the power companies with first-party data access—companies like Facebook or Amazon with which you share your data directly—have in shaping people’s decision-making. Unlike the data-tracking ad tech startups confusing your jaunt past a CVS with a purchase decision, companies like Amazon have a trove of actual purchase receipts and browsing history to build an accurate consumer profile. “Google, Facebook, and Amazon are formidable players in the consumer data arena,” wrote Kaye.
That being the case, there is something alarming about not being able to own, or even participate in, the gathering of your own data, especially when bad data can negatively affect you. In fact, Kaye explained, when she sought her own data from brokers, they made her sign a release form.
Angwin has been pushing for more rapport between the data collectors and the collected. “I wanted to be in the data economy,” she said. “But I wanted assurances I currently don’t have. I wanted to be able to see the data. I wanted to know when it is used to make a decision against me. I want to be able to transact.”
Data has the potential to be both a consumer’s best friend, offering up deals and anticipating needs, and a foe, steering him or her toward unwanted products. That consumers have so little say in this whole affair, and that companies aren’t enlisting them to improve the data collection process, is strange, since everyone stands to benefit. You don’t have to be a privacy spook to see the logic in that.