While data privacy is the topic du jour in many tech and marketing circles, its application can be tricky in the absence of a standardized, universal definition. A precise definition, paired with a set of principles that balance consumer demands with the business needs of advertisers, developers and publishers, will help our societies scale privacy while supporting business growth, writes Joshua Koran, executive vice-president of data and policy at adtech firm Criteo.
Data breach scandals and the many antitrust investigations of dominant digital platforms have increasingly propelled the topic of privacy into headlines. Various browsers have made global headlines with ‘privacy’-branded initiatives. Yet, despite this attention, few have stepped back to define what ‘privacy’ actually means.
While privacy remains at the forefront of industry conversations, particularly surrounding the future of addressable media, the time is now to develop a more precise understanding of what ‘privacy’ really is in order to ultimately improve the transparency, auditability and accountability of our digital society.
Of course, this is no simple task, and each person may argue that their perspective is the right one. Yet most people would generally agree that risks related to privacy increase as more sensitive personal data is collected about our lives and shared beyond our control. We must work together as an industry to redefine privacy so as to transform digital advertising, increase protections for consumers and improve both the diversity of content available and competition across the ad-funded web.
A multiplicity of definitions
To start, let’s acknowledge that the concept of privacy is multifaceted. We use the same word today to describe five very different sets of protections:
Identity and freedom from theft of identity
Seclusion and freedom from intrusion
Confidentiality and freedom from public disclosure
Private property and freedom from misappropriation
Personal data protection with remedies for violations
While the first four relate primarily to possession, the final point relates to interactions with both individual actors and organizations – in other words, to how privacy functions in practice.
In line with this thinking, privacy regulations such as the EU’s General Data Protection Regulation and the California Consumer Privacy Act are focused in part on definitional aspects of what privacy is (e.g. identifiers and information), but primarily on how privacy functions (e.g. situational interactions among people and what to expect when rights are violated).
There is also much to be learned from the privacy practices of tech companies such as Apple and Google, both of which have updated their policies over the past few months. For example, both Google and Apple provide exemptions for ‘first parties,’ meaning that tracking user activity is not prevented, but smaller publishers and app developers are simply forbidden from sharing this data with the partners on whom they rely to operate and grow their businesses – otherwise known as ‘third parties.’ Regulators are scrutinizing the ‘corporate ownership’ exemption inherent in this distinction.
Balancing consumer privacy with marketers’ needs
These various, often conflicting definitions of privacy have contributed to mistaken perspectives on the role that third-party cookies play, resulting in multiple proposals with alternative solutions. However, by studying both recent US and European privacy regulations as well as Apple’s and Google’s own definitions of their privacy practices, we can identify five common principles that can be used to ensure the appropriate balance between protecting consumer data privacy rights and ensuring friction-free access to the ad-funded open web.
1. It matters whether identifiers are unique, random identifiers or linked to identity
The dividing line between pseudonymous and properly de-identified personal data lies in whether the ‘identifiers’ involved are random and anonymized or can be linked directly to a specific individual.
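To make that dividing line concrete, here is a minimal, illustrative Python sketch (the function names are my own, not any industry standard): a random identifier carries no information about the person behind it, while a hashed email address, however opaque it looks, can be recomputed by anyone who holds the same email and so remains linked to identity.

```python
import hashlib
import uuid

def random_pseudonymous_id() -> str:
    """A random identifier: it encodes nothing about who the user is,
    so it cannot be reversed or matched back to a real-world identity."""
    return uuid.uuid4().hex

def identity_linked_id(email: str) -> str:
    """A hash of an email address looks opaque, but anyone holding the
    same email can recompute it, so it stays linked to a specific person."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# A fresh, unlinkable value on every call.
print(random_pseudonymous_id())

# A stable value, re-derivable from the identity itself.
print(identity_linked_id("user@example.com"))
```

The point of the contrast: the first function produces data that is de-identified by construction, while the second merely disguises identity-linked data.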
2. It matters whether personal information is sensitive or not
By default, sensitive categories of personal information should not be used for digital advertising: they expose people to greater risk of substantive life impact and therefore require enhanced notice and opt-in consent prior to use.
3. An organization should be able to share personal information with its partners, so long as that data is appropriately de-identified
Even the world’s largest corporations need to work with other organizations and share data to operate and grow their business. Smaller organizations must rely on supply chain partners to an even greater extent. Yet care should be taken to ensure the data shared is ‘de-identified’ and does not directly identify an individual.
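As a simple illustration of what de-identification before sharing might look like, the sketch below drops fields that directly identify a person before a record is passed to a partner. The field names and the `de_identify` helper are hypothetical, chosen only to show the idea; real de-identification regimes are more involved than dropping a few columns.

```python
# Fields that directly identify an individual (illustrative list only).
DIRECT_IDENTIFIERS = {"name", "email", "phone", "address"}

def de_identify(record: dict) -> dict:
    """Strip directly identifying fields before a record is shared
    with a supply-chain partner."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

shared = de_identify({
    "name": "Ada",
    "email": "ada@example.com",
    "browser": "Firefox",
    "interest": "cycling",
})
print(shared)  # → {'browser': 'Firefox', 'interest': 'cycling'}
```

The shared record retains the nonsensitive attributes a partner needs while no longer directly identifying the individual.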
4. Even for ad-funded services, users should have choice over receiving personalized advertising
While access to ad-funded digital content and services requires that users be exposed to advertising, they should be given the choice whether or not to receive personalized advertising. For example, Apple’s own advertising policy states: “If you do not want to receive ads targeted to your interests from Apple’s advertising platform in these apps, you can choose to disable Personalized Ads.”
5. Tracking, especially for digital advertising, should be defined by whether personal information is linked to identity
Data collection and processing required for digital advertising does not need to be linked to individual users’ identity. Re-engaging existing customers is often an expected part of customer relationship management that may use identity-linked data to communicate – whether through direct mail, email, phone or another channel. But prior to linking digital activity to identity, users should be given the choice of whether to opt in.
The value of choice in data privacy and protection
Meaningful choice depends on the availability of multiple options from which to choose. In contrast, merely swapping which B2B advertising software logo collects and processes personal data to monetize the sites that users frequent does not improve privacy – especially when such data is nonsensitive and linked to random identifiers. While digital wallets and data portability are valued concepts, these trust-based models fail to give users appropriate choice over whether or not to keep their identity separate from their digital activity.
Moreover, we should all have greater choice over the diversity of content and services online – an ideal achievable only with robust competition. The alternative would be a return to the walled gardens of the past, such as AOL, CompuServe and Prodigy, from which the current version of the world wide web freed us.
Flourishing competition, however, depends on interoperable data and standards for its transmission that afford smaller organizations the same efficiencies enjoyed by larger rivals. The Partnership for Responsible Addressable Media, the Interactive Advertising Bureau, Prebid, the World Wide Web Consortium and many other organizations are actively working on the next generation of open-source web architecture.
By proactively engaging in these discussions today, we can be sure to build solutions that respect consumer privacy, while also supporting choice for marketers, media owners and, most importantly, people.
The use of personal data online must serve the interests of a free society. For competition and choice to exist, the web must remain decentralized. The internet, like the telephone before it, has annihilated time and space in human interactions. As we progress in our new information age, we must improve not only transparency and accountability, but choice. In short, privacy must scale.
Joshua Koran is executive vice-president of data and policy at Criteo.