OMG’s Marc Rossen Kicks off Adweek’s Look at Privacy Compliance in Clean Rooms
Advertisers Get Ahead of Potential Privacy Hiccups in Data Clean Rooms
Privacy regulators are getting more muscular
For all the marketer excitement about the capabilities of data clean rooms, a misconception has grown that the data within them is automatically privacy safe. While there haven't (yet) been any high-profile data breaches connected with clean rooms, some in the ad industry are working to limit the likelihood that there will be, putting steps in place to keep building confidence among buyers and sellers around data sharing.
“Using a data clean room doesn’t automatically translate to total privacy compliance,” said Marc Rossen, svp, investment and activation analytics at OMG. “There’s a lot of considerations that go into privacy compliance and data sharing.”
Buyers and sellers should be aware of several practices that uphold data privacy within clean rooms, particularly moving away from hashed email address matching, collecting consented user data, pushing for transparency around privacy-enhancing technologies (PETs) and mitigating the risks of overlapping data.
“There’s an increasing amount of debate whether the anonymization process in cleanrooms truly does anonymize data,” said a global media executive, who requested anonymity to freely discuss industry relations. As a result, some marketers are growing wary of using data clean rooms, the executive said.
Compounding the matter is that a number of different clean rooms are available, each with its own processes, including those from independent firms like InfoSum and Habu and those built by walled gardens, like Google’s Ads Data Hub. The latter group has run into trust issues with marketers, who are voicing frustration with the limitations of, and the compulsion to use, walled-garden data clean rooms. And concern is growing that clean rooms will be targeted by regulators, since user consent for using and passing customer data is under increasing scrutiny, as shown by recent enforcement actions against Sephora and Kochava.
“The biggest risk of using clean rooms is it will come under increasing regulatory scrutiny,” the media exec said.
PETs replace hashed email matches
Currently, the encryption used to translate first-party data into a format that can be accurately read in the clean room isn’t effective, and the process leading up to that encryption is poorly managed, according to the media exec.
This includes gaining people’s consent as well as both parties agreeing on a common encryption method.
The IAB Tech Lab mentions 11 PETs, and clean rooms are required to deploy one or more of them. Among them is multi-party computation, a substitute for hashed email matching that ensures sales or ad exposure data never gets transferred to either party; hashed email address matching is not considered best-in-class for privacy compliance. InfoSum, for example, uses multi-party computation alongside homomorphic encryption and pseudonymization, among others.
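To see why hashed email matching is not considered best-in-class, consider a minimal Python sketch (the email addresses here are made up for illustration): because hashing is deterministic, any party holding a list of candidate emails can check which “anonymous” hashes belong to which people.

```python
import hashlib

def hash_email(email: str) -> str:
    """Normalize and SHA-256 hash an email, as many match pipelines do."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# A party that receives "anonymous" hashed match keys...
received_hashes = {hash_email("Jane.Doe@example.com")}

# ...can re-identify them with a dictionary of known or guessed emails.
candidates = ["john@example.com", "jane.doe@example.com"]
reidentified = [e for e in candidates if hash_email(e) in received_hashes]
print(reidentified)  # ['jane.doe@example.com']
```

PETs like multi-party computation avoid this exposure by computing the overlap without either side ever receiving the other’s match keys, hashed or otherwise.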
“Clients should ask their agencies and clean room providers what the PET standards are and what’s the approach taken to utilize those standards,” said Rossen. “Without that transparency, we’re just going back to a place of a black box.”
Data leakage risks
Along with its data clean room standards, the IAB Tech Lab launched the Open Private Join and Activation (OPJA) specification to address interoperability within clean rooms. The goal is to find overlapping audiences between buyer and seller data sets and provide a framework for activating against those audiences without transferring PII between buyers and sellers.
However, the IAB Tech Lab points to multiple scenarios, not necessarily nefarious ones, where overlapping audiences could lead to information leakage in a data clean room.
In one such case, an advertiser could perform multiple successive matches with a publisher using OPJA, carefully inserting and removing individual PII match key records and observing the resulting match rate to determine whether the added or removed record is present in the publisher’s input. Matching system designers can mitigate this in practice by introducing noise or minimum thresholds to the match rate results.
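The differencing attack described above, and the suggested mitigations, can be sketched in Python. The user IDs, noise scale and threshold values below are illustrative assumptions, not part of the OPJA spec:

```python
import random

def match_count(advertiser_keys: set, publisher_keys: set) -> int:
    """Exact overlap size, as a naive matching system might report it."""
    return len(advertiser_keys & publisher_keys)

def mitigated_match_count(advertiser_keys, publisher_keys,
                          noise_scale=5.0, min_threshold=50):
    """Report with random noise added and small overlaps suppressed."""
    n = match_count(advertiser_keys, publisher_keys)
    n += round(random.gauss(0, noise_scale))  # noise blurs one-record changes
    return n if n >= min_threshold else 0     # threshold hides tiny cohorts

publisher = {f"user{i}" for i in range(1000)}
baseline = {f"user{i}" for i in range(500, 600)}
target = "user42"  # hypothetical individual the attacker is probing for

# Differencing attack: run the match with and without the target's record.
delta = match_count(baseline | {target}, publisher) - match_count(baseline, publisher)
print(delta)  # 1 -> exact counts reveal the target is in the publisher's data
```

With noise and a minimum threshold applied, a single-record difference is no longer a reliable signal, which is why the mitigations blunt this attack in practice.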
There’s also a lack of due diligence among marketers whose data collection strays from data minimization, a core tenet of data security within clean rooms, for instance by collecting data on sensitive attributes such as age, gender, race and income.
Not only can this produce matches in ways that become discriminatory, as seen in the 2021 Facebook case, but for brands operating in categories like pharmaceuticals or healthcare, matching such data within clean rooms isn’t worth the reputational risk, the media exec said.
“What advertisers need to understand is that while the tool itself may be secure, ultimately, a clean room is just a tool to enable data collaboration,” said Arielle Garcia, chief privacy officer at UM Worldwide. “Advertisers still need to make sure that the appropriate disclosures and permissions, like offering and honoring opt-out requests, are in place.”