Americans have long been ignoring European data protection law, but it has not been ignoring us. Last year’s so-called “right to be forgotten” case from the EU’s highest court let people remove links about themselves from Google’s search results — and regulators insist that the links must disappear from U.S. search results, too. A ruling last week from the same high court closes off one of the main legal channels for European data to flow to the U.S. Now a new EU-wide regulation is nearly final, and it will set the rules for these and other data protection questions for years to come. It’s time to pay attention.
The new law is the General Data Protection Regulation. It does a lot of good things for Internet users — giving us the right to extract our data from one service and move it to a competing one, for example. And it does a lot of things that — like the “right to be forgotten” — seem odd to Americans but align with the general European perspective on privacy and government regulation.
But the GDPR also takes these European norms and, to an unprecedented degree, exports them to the rest of the world. This has serious implications for free expression in countries like the U.S. that take a different approach to privacy and speech. And it imposes a weighty new regulatory burden — including product engineering and design requirements — on innovative, small or mid-sized companies around the world.
Given how long the GDPR has been in the works — nearly four years, with finalization expected in December — it’s remarkable how unclear some important parts still are. This may be a blessing in disguise, since it leaves room to negotiate with regulators or litigate in hopes of better outcomes. But better outcomes are not especially likely. My summary here reflects the law as understood by many — probably a great majority — of EU privacy regulators and experts. It is, therefore, the privacy law we will likely have to live with.
Imagine that a European businessman is accused of fraud in the U.S., and reaches a six-figure settlement with the Federal Trade Commission. Potential new clients or partners search for his name online, but can’t find it on Google’s EU search services, because he invoked his right under EU law to make Google take it down. This is the state of the law right now, and this example appears to be real.
Here is how that scenario can get a lot worse, even under existing law. First, Google could be compelled to suppress those same search results for users all over the world, not just users of its EU services. That’s what the French data protection regulator, CNIL, says current law requires, and CNIL is in a public fight with Google about it right now. Second, our hypothetical businessman could take his demands to content-hosting platforms like Twitter or Facebook, asking them to remove users’ posts about his past. Because those companies host content rather than indexing the Web, it’s not yet clear whether they must do as he asks. But many thoughtful experts think the answer is yes: Platforms would have to delete those posts, or at least keep users from finding them through on-site search. (For once, Facebook’s lack of a decent search feature may be an advantage.) Those deletions, too, would have to be global, according to CNIL and many other EU privacy experts.
New rules that will go into effect with the GDPR make the global-removal-from-all-big-platforms scenario even more troubling. The regulation doesn’t really answer the question of whether global deletion is required, or whether Facebook and Twitter have to delete things, too. But it does give privacy regulators the right to impose enormous new fines — in some provisions, up to 5 percent of global turnover — for noncompliance. So, regardless of what courts think the law actually is, if EU regulators decided the news stories about our businessman’s fraud settlement needed to go away, they could be very persuasive.
I hope the companies would continue to fight for their users’ rights to find and share information in the face of removal demands like this, and I believe they could eventually win in European courts if they did. But it’s much harder to expect companies to take a stand on users’ behalf with that much money on the line. Simply removing the content is by far the easiest path.
The GDPR also brings the “right to be forgotten” deletions issue to far more companies, and far more Internet users. It used to be that only companies with some kind of establishment or equipment in Europe fell under EU privacy regulation. Those days are ending. The new law applies to anyone “offering” services to or “monitoring” users in the EU. “Monitoring” appears to include everything from targeted advertising to basic content customization — like when IMDb recommends a movie based on what you looked at before. So European regulators could force foreign companies to delete user comments, and even their own published content.
In our fraud example, that would be a real-life New York Times article about the businessman and the FTC. Regulators probably would not ask for that removal, because of special treatment for journalism under the law. But it’s not impossible.
Using the same “monitoring” hook, the regulation would apply to, say, a Japanese gaming website that tracks user scores, once any Europeans start using the site. Or Genius. Or I Can Has Cheezburger.
Extending complex regulation to small or medium-sized companies with little connection to Europe is the other big problem with the GDPR, a problem that directly affects technology design and innovation — and investment. As I said, the regulation tracks European values with respect to privacy and government regulation. That means it imposes significant design and back-end engineering requirements on regulated companies in order to protect user privacy. Companies must build means for users to access and delete stored data, avoid logging or retaining data not necessary for the product’s initial purpose, squeeze new notices into the product UI, and perhaps launch products with all privacy options turned to the most restrictive settings — even if that means the product works poorly and loses users. It also means that companies must lawyer up for a whole new regulatory relationship, with requirements that include designating in-EU privacy representatives and providing regulators eight weeks’ notice — and opportunity to object — before launching certain products or features.
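To make the engineering burden concrete: the “privacy by default” and data-access/erasure requirements described above translate into real product code. The following is a minimal illustrative sketch of what that might look like — every name here is hypothetical, invented for illustration, and not drawn from the regulation or any real codebase:

```python
from dataclasses import dataclass


@dataclass
class PrivacySettings:
    # "Privacy by default": every option starts at its most
    # restrictive value, even if that makes the product less
    # useful out of the box.
    targeted_ads: bool = False
    activity_logging: bool = False
    content_personalization: bool = False
    public_profile: bool = False


class UserDataStore:
    """Hypothetical store exposing user access and erasure operations."""

    def __init__(self):
        self._records = {}

    def save(self, user_id, record):
        # A compliant design would also avoid retaining data beyond
        # the product's original purpose; omitted here for brevity.
        self._records.setdefault(user_id, []).append(record)

    def export(self, user_id):
        # Access/portability: hand the user everything held about them.
        return list(self._records.get(user_id, []))

    def erase(self, user_id):
        # Erasure: delete all stored data for this user.
        return self._records.pop(user_id, [])


# Usage
store = UserDataStore()
store.save("u1", {"event": "login"})
print(store.export("u1"))   # the user's full stored data
store.erase("u1")
print(store.export("u1"))   # empty after erasure
```

Even this toy version hints at the cost: retrofitting export and erasure paths onto an existing product, where user data is scattered across logs, caches and backups, is a far larger engineering project than writing them in from the start.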
All this is a drag on innovation by the big players, who already have battalions of privacy lawyers and longstanding relationships with regulators. It is a much bigger obstacle for companies just getting big enough to attract users and regulatory attention in Europe. Some may choose expensive redesign work; others may simply cut off EU users. In either case, promising new technologies will face yet another barrier to adoption and commercial success, and users will face another barrier to promising new technologies.
The sad truth is that it is probably too late to change many of the GDPR’s problems. There may be room for improvement at the margins, but even that won’t happen unless affected Internet users, technologists, companies, industry associations and civil society groups raise a fuss. Soon.
Daphne Keller is the director of intermediary liability at the Stanford Center for Internet and Society. She was previously associate general counsel for intermediary liability and free speech issues at Google. In that role, she focused primarily on legal and policy issues outside the U.S., including the EU’s evolving “right to be forgotten.” Her earlier roles at Google included leading the core legal teams for Web search, copyright and open source software. Keller has taught Internet law as a lecturer at U.C. Berkeley’s School of Law, and has also taught courses at Berkeley’s School of Information and at Duke Law School. Reach her @daphnehk.
This article originally appeared on Recode.net.