In its scramble to advance ad targeting and measurement methods that work without third-party cookies, Google has been accused of throwing lots of barely-cooked concepts at the wall to see what sticks. But one idea intended to protect people’s privacy — Google’s Privacy Budget — is winding its way into several proposals.
Technologists, including some representing companies whose web browsers compete with Google’s popular Chrome browser, say the browser-based Privacy Budget concept is “vague” and “not useful.” Yet, despite its inchoate state and its potential dismissal by other browser makers, the method has been added not just to specs for digital ad techniques but also to specs for a variety of proposed web standards. Meanwhile, if adopted in Chrome, the Privacy Budget could actually create new privacy harms or, if poorly implemented, disable some of the web’s standard functionality.
Like boilerplate at the end of a press release, a statement mentioning the Privacy Budget concept turned up recently in specifications for proposed tech that would let services like Zoom determine how best to format video for a conference call, or let Netflix choose the most appropriate format for movie streaming. The specs acknowledged the potential for misuse: the streaming tech could help identify people via fingerprinting by distinguishing someone’s device from others. So, as a possible mitigation for that privacy infringement, Google engineer Chris Cunningham, a co-editor of the spec, suggested that web browsers “may implement a ‘privacy budget.’”
“The Privacy Budget is kind of poisoning the space,” said Pete Snyder, senior privacy researcher and director of privacy at browser maker Brave. Snyder described a “constant” flow of statements mentioning the Privacy Budget that have been inserted into specs for proposed technologies he has reviewed as chair of the Privacy Interest Group at the World Wide Web Consortium (W3C). The group provides guidelines and advice for addressing privacy considerations in web standards specifications such as the aforementioned streaming proposal. The Privacy Budget has also appeared as a protective panacea in specs for font-related technology and for tech that helps websites choose optimal media content for people.
Designed as a defense against fingerprinting, the Privacy Budget is a browser-based technique intended to place limits on “fingerprinting surfaces,” or distinctive characteristics associated with someone’s device — such as installed fonts or configurations that help a streaming codec choose the best video format to use — that can be pieced together to detect or assign someone’s identity. In an effort to prevent fingerprinting, the Privacy Budget restricts how many of those distinctive characteristics a technology can access or detect.
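Google’s public materials stop short of an algorithm, but the basic idea can be sketched: the browser assigns each fingerprinting surface an information cost and blocks further reads once a site’s cumulative cost exceeds a cap. The sketch below is purely illustrative; the class name, surface names and bit costs are all invented for this example and do not correspond to any real browser API.

```javascript
// Hypothetical sketch of a per-site "privacy budget" gate.
// Each fingerprinting surface carries an information cost (in bits);
// once a site's cumulative cost exceeds the cap, reads are denied.
class PrivacyBudget {
  constructor(limitBits) {
    this.limitBits = limitBits; // cap on identifying bits a site may read
    this.spentBits = 0;         // identifying bits read so far
  }

  // Attempt to read a surface; returns its value, or null once over budget.
  read(surfaceName, costBits, readFn) {
    if (this.spentBits + costBits > this.limitBits) {
      return null; // budget exhausted: surface blocked (or could be noised)
    }
    this.spentBits += costBits;
    return readFn();
  }
}

// Example: a site reading surfaces until the (assumed 10-bit) budget runs out.
const budget = new PrivacyBudget(10);
const fonts = budget.read("fonts", 6, () => ["Arial", "Helvetica"]);
const codecs = budget.read("codecs", 3, () => ["vp9", "av1"]);
const screen = budget.read("screen", 4, () => "1920x1080"); // null: over budget
console.log(fonts, codecs, screen);
```

Even this toy version surfaces the open questions critics raise: what costs to assign each surface, what a blocked read should return, and how sites are supposed to handle functionality that silently disappears mid-session.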
Google engineers have pointed to the Privacy Budget as a way to reduce fingerprinting potential in its Privacy Sandbox tools since at least 2019, when it got a shout-out in early specs for FLoC, a highly scrutinized ad targeting method the firm itself acknowledges could help identify people through fingerprinting. Google included the method at the very bottom of a list of privacy-protecting tech it plans to launch as part of its Privacy Sandbox initiative, giving it a wide 2023 launch window.
“Privacy Budget is an early-stage proposal designed to protect people from fingerprinting, a problem we believe is critical to solve holistically as the web evolves,” a Google spokesperson told Digiday for this story.
Privacy solution: details TBA
The problem is, even though the Privacy Budget is mentioned in multiple proposed tech specifications as a possible safeguard against privacy abuses, it is far from ready for prime time. “[We want to] make sure [we’re] looking at mitigations we can ship today,” wrote Sam Weiler, an MIT security engineer, during a May meeting of the consortium’s Privacy Interest Group, expressing a commonly raised concern about the lack of detail on how the Privacy Budget would be implemented.
“The Privacy Budget is more aspirational than a concrete proposal,” said Eric Rescorla, CTO of Firefox at Mozilla, maker of another Chrome competitor. Although he said it is common for preliminary ideas to be included in collaborative specs while technologies are in development, he added, “I’m a little surprised it’s showing up in a lot of places because as an implementer I don’t know what to do with that text.”
Another member of the Privacy Interest Group, Konrad Dzwinel, an engineer at browser maker DuckDuckGo (another Chrome competitor), told Digiday in an email, “Google only shared a vague definition of the privacy budget idea so far, so we don’t have many thoughts about it. We do think that fingerprinting is an important problem to solve, we’re working to address it in our products, and are waiting for Google to share more details about their idea.”
“As with all Privacy Sandbox proposals we will continue to get feedback through the open and iterative process and provide resources for developers to test and integrate in advance to help ensure a smooth transition to a more private web,” said the Google spokesperson.
Disrupting standard web functions = ‘developer hell’
In general, the technologists Digiday spoke with about the Privacy Budget, including privacy tech researchers, along with others commenting in developer forums, said it is not clear how it could be implemented without serious negative side effects. For instance, it could disable technologies used to identify whether people have logged into a website, disrupting what’s called session persistence by preventing sites from recognizing when a unique device has already logged in. One tech researcher, who asked not to be named, said that if Chrome were updated without proper implementation plans made clear to site developers and ad tech providers, the change could break a variety of standard web functions, including recognizing logged-in users. “That’s basically what will happen simultaneously across the internet,” said the researcher.
Without more clarification, enabling the Privacy Budget in Chrome could amount to what Snyder called “developer hell.” He told Digiday, “It’s hard to get people to write code that says ‘check to make sure functionality is available every second.’ It seems like a non-starter from a developer perspective.”
And at this point, it does not appear that Google has addressed what should happen when the mechanism triggers errors. When asked during the May W3C meeting whether Chrome would generate error messages if a site’s Privacy Budget runs out, Google’s Cunningham wrote, “I don’t know what that would look like.”
Snyder and others also have said the technique could actually enable new privacy harms because prior browser behavior could be revealed by the amount of “budget” someone has remaining. “The way you spend your budget is in itself a unique identifier, which is ironic three times over,” he said.
But ultimately, said Rescorla, technical specs should come with workable approaches to limiting fingerprinting or other privacy infringements baked in. He said, “People should not fool themselves into thinking that, if the spec otherwise has privacy issues, that this text fixes those issues.”
The post Google’s vague privacy cure-all is showing up in new proposals, but some say it could break the internet appeared first on Digiday.