By Drew Stein, Co-Founder and CEO, Audigent
All of the new technologies being touted as answers to the loss of the third-party cookie have forced marketers, agencies, publishers, and tech companies into camps. Each has its preferred method for collecting data, and each swears its method is the most privacy-centric, consumer-friendly aggregation methodology. But what many either don't see (or won't admit) is that consumer data privacy is not just about data aggregation: it's as much about how data gets activated.
To build a truly consumer-safe, privacy-centric advertising ecosystem, we need to talk about and make informed decisions around data activation. There are still many ways in which data can be gathered in a consumer-friendly manner, but in several cases, privacy-focused aggregation masks activation practices that fail to take consumer privacy into account. This failure to scrutinize the activation component allows bad data privacy practices to proliferate, which creates a problem for everyone involved.
Privacy-Centric Aggregation, Failing in Activation
There are a few issues at work here. While the work on privacy-compliant data collection is worthwhile, this part of the process has been turned into a marketing and PR competition, where everyone has been led to believe there can only be one true winner. That is simply not the case.
The truth is that there are many methods that can be used to collect and activate data while championing consumer privacy: contextual, first-party data, hashed email, cohorts, clean rooms, and probabilistic fingerprinting are all effective, still evolving, and hold promise for being consumer-friendly. Of course, the third-party cookie is not gone yet, either.
There has been plenty of conversation weighing the pros and cons of these methods, alongside consumer consent. What’s unsaid is that it’s possible to aggregate data in a privacy-friendly fashion and then activate it in a way that does not actually protect consumer privacy. In fact, just the opposite.
Consider the hype surrounding contextual data. Companies with a horse in the race have been seeding the market with the idea that contextual data is the most privacy-safe way of aggregating insights, and therefore, it is the best option for a consumer-focused ad ecosystem. On the surface, contextual data is focused on page attributes and is deviceless, so it can be easily aggregated in a privacy-centric manner.
The catch is that when this contextual data is broadcast unencrypted into the bidstream, any party with access can grab that data and tie it back to a deterministic identifier that can be applied cross-channel or cross-device. All of a sudden, that contextual data, gathered in a privacy-centric manner, has been activated in a way that’s not privacy-safe at all.
What’s Good, and What’s Bad?
Meanwhile, consider an aggregation tactic like probabilistic fingerprinting, which has been disparaged by the likes of Google and Apple. Unlike deterministic identifiers, a fingerprint is, as anyone who has ever activated one can attest, firmly probabilistic, and the margin for error can be fairly wide.
That said, this method can be incredibly effective at building privacy-safe cohorts. When encrypted, compressed, encoded, and converted into a cohort before it is used to target ads, fingerprinting can actually be one of the more privacy-centric methodologies.
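As a rough sketch of that pipeline (the signals, hash choice, and cohort count below are illustrative assumptions, not any vendor's actual implementation), a probabilistic fingerprint can be collapsed into a coarse cohort ID before anything reaches the bidstream:

```python
import hashlib

def cohort_for_fingerprint(signals: dict, num_cohorts: int = 1000) -> int:
    """Collapse raw device signals into one of a fixed number of cohort
    buckets, so no individual fingerprint is ever broadcast."""
    # Join the probabilistic signals (user agent, timezone, etc.) into one
    # canonical string; sorting makes the result order-independent.
    raw = "|".join(f"{k}={v}" for k, v in sorted(signals.items()))
    # One-way hash: the original signals cannot be recovered from the digest.
    digest = hashlib.sha256(raw.encode("utf-8")).hexdigest()
    # Reduce the hash to a coarse cohort ID shared by many users.
    return int(digest, 16) % num_cohorts

signals = {"ua": "Mozilla/5.0", "tz": "America/New_York", "lang": "en-US"}
print(cohort_for_fingerprint(signals))
```

Because many users share each bucket, the cohort ID remains useful for targeting but cannot be walked back to an individual device.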
Whatever the methodology, the point here is that the good or bad rap each method gets is promoted by companies with their own agendas and alternative technologies, companies that have turned the method itself into the target. It's another case of "doing what's right for the consumer" being used as an excuse for a revenue-driving business practice.
Two Steps for Protecting the Consumer
Want to actually protect consumers? The ad industry needs to follow two rules.
The first is that all data must be aggregated in a privacy-compliant, consented fashion. This means fully permissioning at the point of collection and/or point of use, with universal opt-outs. Fortunately, the industry appears to be on the right track here.
The second, equally important rule covers activation. In order to maintain data privacy, the industry needs to ensure security around what gets applied to the bidstream and who can access and read it so that insights cannot be siphoned off.
Unfortunately, the industry is going in the opposite direction here. The third-party cookie was certainly imperfect, but it was functional, and a huge part of that function was its encryption. The ad industry cannot move from encrypted identifiers to unencrypted methodologies that broadcast data to other partners, and it needs to seriously limit who has access to (and who can purchase) bidstream data.
Encrypting, encoding, cohorting, and compressing both deterministic and probabilistic identifiers before they get into the bidstream protects consumers and publishers from data leakage. When the only platforms that have access to this information are the DSPs, SSPs, and publishers with the right decryption credentials, then we are closing an important loophole and heading down the right path as an industry.
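To make the idea concrete, here is a toy sketch of that seal-before-bidstream step: compressing, encrypting, and encoding an identifier so that only parties holding the shared key can read it. The cipher below is a homemade illustration only; a production system would use a vetted construction such as AES-GCM or Fernet from the `cryptography` package.

```python
import base64
import hashlib
import secrets
import zlib

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream from the shared key and a per-message
    # nonce (toy stream cipher, for illustration only).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def seal_identifier(identifier: str, key: bytes) -> str:
    """Compress, encrypt, and encode an identifier before it enters the bidstream."""
    nonce = secrets.token_bytes(16)
    compressed = zlib.compress(identifier.encode("utf-8"))
    stream = _keystream(key, nonce, len(compressed))
    ciphertext = bytes(a ^ b for a, b in zip(compressed, stream))
    # Only this opaque base64 token is broadcast; the plaintext never leaves home.
    return base64.urlsafe_b64encode(nonce + ciphertext).decode("ascii")

def open_identifier(token: str, key: bytes) -> str:
    """Parties with the right key (credentialed DSPs/SSPs) can reverse the seal."""
    raw = base64.urlsafe_b64decode(token.encode("ascii"))
    nonce, ciphertext = raw[:16], raw[16:]
    stream = _keystream(key, nonce, len(ciphertext))
    return zlib.decompress(bytes(a ^ b for a, b in zip(ciphertext, stream))).decode("utf-8")

key = secrets.token_bytes(32)  # shared only with credentialed partners
token = seal_identifier("hashed-email:1a2b3c4d", key)
assert open_identifier(token, key) == "hashed-email:1a2b3c4d"
```

Any party without the key sees only an opaque token, which is exactly the property that closes the siphoning loophole described above.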
As the industry waves goodbye to the cookie over the next two years, the conversations have to move beyond vilifying one data collection methodology while championing another and focus more holistically on actionability as well.
Especially as the squabbling only creates more breathing room for the unhealthy data practices of the silos and behemoths (where consumers lose the most), it does us no good to have identity solutions fighting for airtime and PR on this basis alone. There are clean ways of aggregating data, and unclean ways. And there are privacy-centric ways of activating data, and there are methods that do not uphold consumer privacy at all.
It’s unclear if the ad industry is unaware of the activation component of the privacy conversation, or simply turning a blind eye. Regardless, things need to change, otherwise the industry at large will continue to fail consumers when it comes to privacy and silos will continue to reign in their own best interests.