As a digital privacy reporter, I try to avoid sites and services that invade my privacy, collect my data, and track my actions. Then the pandemic came, and I threw most of that out the window. You probably did, too.
I gave away tons of personal data to get the things I needed. Food came from grocery and restaurant delivery services. Everything else — clothes, kitchen tools, a vanity ring light for Zoom calls, office furniture — came from online shopping platforms. I took an Uber instead of public transportation. Zoom became my primary means of communication with most of my coworkers, friends, and family. I attended virtual birthdays and funerals. Therapy was conducted over FaceTime. I downloaded my state’s digital contact tracing tool as soon as it was offered. I put a camera inside my apartment to keep an eye on things when I fled the city for several weeks.
Millions of Americans have had a similar pandemic experience. School went remote, work was done from home, happy hours went virtual. In just a few short months, people shifted their entire lives online, accelerating a trend that would have otherwise taken years and will endure after the pandemic ends — all while exposing more and more personal information to the barely regulated internet ecosystem. At the same time, attempts to enact federal legislation to protect digital privacy were derailed, first by the pandemic and then by increasing politicization over how the internet should be regulated.
All told, 2020 was a bust for digital privacy. But 2021 doesn’t have to be.
Privacy laws had momentum at the beginning of 2020
When 2020 began, there was reason to be optimistic that we’d get federal privacy legislation in the coming years, possibly even that very year. Though concerns had been brewing for some time, the 2018 Cambridge Analytica scandal marked a turning point in how many Americans (and lawmakers) viewed Big Tech, data, and tech’s impact on society.
The scandal involved a researcher who improperly harvested millions of Facebook users’ data and shared it with Cambridge Analytica, a political consulting firm that did work for the Trump campaign. In the aftermath, Facebook CEO Mark Zuckerberg appeared before Congress, where he was grilled about user privacy. The Federal Trade Commission (FTC) ended up fining Facebook $5 billion for privacy violations. By the end of 2019, the vast majority of Americans felt they had no control over their data and were concerned about how it was being used, according to a Pew study. There was a bipartisan consensus that something had to be done.
Accordingly, Republicans and Democrats rolled out a slew of privacy bills, calling for jail time for tech CEOs, a new federal data privacy agency, and a legally enforceable internet version of the Do Not Call list. Senate Democrats released a framework for privacy laws, and Sen. Maria Cantwell (D-WA), ranking member of the Commerce Committee, came out with her own privacy bill based on that framework. Privacy advocates approved.
Elsewhere, California rang in 2020 with the implementation of the California Consumer Privacy Act (CCPA), the strongest data privacy law in the country. Data companies, realizing where things were going, rolled out more privacy options to quell user fears and show that they were capable of regulating themselves. And seemingly everyone was disturbed by revelations that a company called Clearview AI scraped the internet for billions of public images to populate its facial recognition database, then tried to sell the service to law enforcement and private companies.
Then the pandemic hit, and Congress had more pressing concerns. At the same time, people needed the services that collect and use their data to carry out most aspects of their daily lives. (Essentially, anything you do that uses the internet is likely collecting your data in some way, and many of those services are monetizing it one way or another.) Some may not have been willing to trade their privacy for those services before. Now they had to.
The pandemic put more people online than ever, and their data followed
When stores closed and people were afraid to leave their homes, consumers turned to online shopping for groceries and other necessities. Restaurant delivery apps boomed as many restaurants went bust. Streaming services had a great year (except Quibi), taking the place of movie theaters and most other forms of entertainment.
School and work also moved online. Accordingly, employers turned to worker tracking software and schools turned to online proctoring services to monitor their employees and students from afar. Remote schools put children’s privacy at the mercy of edtech companies, some of which have spotty track records. Google’s school services — from Chromebooks to its Classroom software — added millions of young users.
People turned to Zoom, only to find that the company hadn’t put much thought into its privacy controls or cybersecurity. Zoombombing was easy and frequent, subjecting users to images of pornography and racism in the middle of their math classes and town meetings. Zoom claimed to offer end-to-end encryption; it didn’t. It also sent user data to Facebook and LinkedIn. The FTC wasn’t pleased, but Zoom’s punishment amounted to little more than a wrist slap.
Telehealth expanded significantly during the pandemic, offering patients a way to see their health care providers without having to risk visiting a physical office. The Department of Health and Human Services loosened HIPAA restrictions, temporarily allowing health care providers to use non-HIPAA-compliant services such as FaceTime, Facebook Messenger, and Skype to communicate with patients.
“There’s been some different incursions into people’s lives,” Jennifer King, the director of privacy at the Center for Internet and Society at Stanford Law School, told Recode. “I think that the real highlight is the lack of options. That isn’t, ‘I’m choosing to use Google for my searches over somebody else.’ It’s, ‘My school district has told me that we have to use Google Classroom.’ There’s no negotiating with them.”
Not only did people integrate more data collection and information exposure into their daily lives, but they were also told that this tracking could have public health benefits. Location data companies promoted their services as useful tools to track the virus’s spread or measure the effectiveness of social distancing as they tracked millions of people who were likely unaware that they were being tracked at all, let alone how. But it’s hard to say that any one of them did much good, given the pandemic’s mostly unchecked spread across the country.
The good news is that some of the more extreme and invasive privacy violations feared at the beginning of the pandemic haven’t come to pass — yet.
“We don’t have widespread contact tracing apps that monitor our location or report data directly to the government,” Adam Schwartz, senior staff attorney at the Electronic Frontier Foundation (EFF), told Recode. “We don’t have immunity passports. And we don’t have GPS shackles or compelled phone malware for patients in home quarantine.”
But that’s not to say that some of those things won’t come to pass in the months ahead. As the vaccine rolls out, we may yet see those immunity passports, and companies are lining up to offer health verification programs to airports, offices, and concert venues.
America’s selective resistance to digital tracking and surveillance
Ironically, the one way that tracking has been proven to help slow or stop the spread of disease — contact tracing — hasn’t been effective during the pandemic because the majority of Americans won’t do it. (Neither will the Trump White House.) Giving up health information for the sake of helping other people is apparently the line some people won’t cross.
Manual contact tracing efforts have floundered due to a lack of resources to implement them and Americans’ reluctance to participate. Digital contact tracing tools were initially seen as a possible savior, but adoption rates have been low. Apple and Google’s unprecedented joint effort to create an exposure notification tool with seemingly decent privacy protections (to the point that public health officials criticized them for not providing enough data) hasn’t come to much. States were slow to roll their apps out, and then people didn’t want to use them.
Over the summer, the George Floyd protests put a spotlight on how police abuse their power, including using surveillance technology like facial recognition. Some companies stopped working with law enforcement at the height of anti-police sentiment, though how long those moratoriums last remains to be seen. Some cities and states put forward measures and laws prohibiting government use of facial recognition. Proposed federal legislation, on the other hand, hasn’t gone anywhere.
There’s also been some increased scrutiny on how law enforcement obtains data. Multiple reports have detailed how police simply purchase location data from data brokers instead of bothering with a warrant. Some lawmakers have pushed for regulations to stop this, and the subsequent discovery that some of this data came from a Muslim prayer app caused a great deal of outcry. That data was obtained by X-Mode, one of the many location data companies that promoted itself as a tool to fight the coronavirus earlier this year. Apple and Google have since banned apps that use X-Mode’s tracking code from their stores. But they haven’t done anything about all the other apps that use trackers planted by other companies.
Regulating Big Tech has become increasingly politicized, which could further delay legislation
While some lawmakers did continue to sound the alarm about privacy throughout the year, especially with regard to issues raised by the pandemic, the focus on how to regulate the internet seems to have shifted away from privacy laws and toward curbing the power wielded by Big Tech companies. Generally, the left and the right have differing ideas of how to do this. Democrats are looking at using antitrust laws to break the companies up, while Republicans hope to take away immunity protections that allowed those companies (and the internet as a whole) to prosper.
Many Republicans have recently taken up the cause of repealing Section 230 to fight what they see as a greater or more immediate Big Tech evil: censorship of conservative voices by too-powerful and liberal companies. Section 230 gives internet platforms immunity from liability for what their users post on them, and it’s necessary for Facebook, Twitter, YouTube, and countless other, smaller sites to exist. Both sides have their issues with Section 230, but Republicans — encouraged by President Trump — have made it their rallying cry.
“Big Tech wants to run our country,” Sen. Josh Hawley (R-MO), who has become one of the leading Republican voices against Big Tech, told Recode. “And unless Congress does something about it, they will. Section 230 gives these companies unchecked monopoly power. It’s time to end those monopolies, end Section 230, and protect Americans.”
Making this now-politicized issue the center of any upcoming Big Tech legislation could derail the progress of privacy bills and antitrust investigations. But there is reason to be optimistic that those antitrust investigations will indirectly make things better for consumer privacy. By the end of 2020, attorneys general from almost every state in the country had sued Facebook and Google over antitrust violations, with privacy playing a surprisingly large role in some of these lawsuits. Facebook is accused of using its market dominance to erode user privacy on its own platforms as well as to prevent the rise of platforms that might have offered better privacy options. Google’s ad business — and its reliance on data collected about users — is another target of the suits. (Facebook and Google have denied that they engage in anti-competitive practices and have called the suits meritless.)
“All of [the suits], if successful, should improve privacy by allowing consumers to ‘vote with their feet’ and leave established companies with bad privacy policies and go instead to new companies with better privacy policies,” said Schwartz, the EFF attorney. “However, this is an indirect path from here to privacy. The antitrust lawsuits are no substitute for comprehensive consumer data privacy legislation.”
The future of federal privacy laws
California, meanwhile, has done what the federal government could not, passing another privacy law called the California Privacy Rights Act (CPRA). Among other things, this law creates and funds a dedicated agency to investigate privacy violations. It also signals that Americans favor such laws — CPRA was a ballot measure approved by voters, not a law created by legislators. Washington state, meanwhile, is now on its third attempt to enact its own state privacy law. Several other states were considering their own privacy laws when the pandemic hit; they may well take them back up once the pandemic subsides.
“I think as California’s new law goes into effect and other states start to pass their own privacy laws, demand for federal legislation by consumers and even businesses who want some certainty is only going to build,” Sen. Ron Wyden (D-OR), who co-wrote Section 230 and is one of the Senate’s biggest privacy hawks, told Recode. “I try not to make predictions after the past couple of years! But I’m sort of an eternal optimist.”
There are also signs of renewed federal interest in privacy regulation. The FTC recently ordered nine platforms — Amazon, TikTok, Discord, Facebook, Reddit, Snapchat, Twitter, WhatsApp, and YouTube — to disclose their data collection and ad targeting practices. Democratic and Republican commissioners voted in favor of the order. And the Senate Commerce Committee recently held a hearing to “revisit” the need for federal privacy legislation.
Finally, there are indications that the incoming Biden administration will have a more productive focus on privacy issues than the outgoing administration did. Biden is on record as not being a fan of Big Tech (especially Facebook) or the lack of regulations on the industry. He’s also indicated that he’s open to repealing Section 230. Biden and Vice President-elect Kamala Harris both have a pro-privacy track record. That said, they’ve also been criticized for being too friendly with Big Tech. But there may be limits on what the Biden administration can accomplish without bipartisan cooperation.
Wyden is hopeful. “By the end of last year, senior Democrats endorsed a set of really strong privacy principles, and several members introduced promising legislation,” he said. “Then, of course, we ran into the trifecta of Republican obstruction, a global pandemic, and the 2020 election. Overall, though, I’m encouraged that we cleared the way for something to get done in the next few years. I’m definitely looking forward to working with the Biden-Harris administration after the past four years.”
In the meantime, the pandemic has made people more comfortable with giving away their privacy and has exposed their information to more services. But it may also make them more aware of how those companies take and use their information. We’ll see if familiarity breeds contempt or complacency.
I have to admit that, for me, the latter has been more true. Zoom was the only way I could see my grandma when the flights were canceled; I bought the security camera for my apartment after a break-in scare. If a shady data broker knowing everywhere I’d been for the last seven months would make this pandemic go away, I’d happily comply.
The upside of these services has never been greater, but users can only hope the companies that provide them respect their privacy, despite overwhelming evidence that they don’t. Hopefully, we’ll get a government that will acknowledge and protect our privacy rights — before they’re gone forever.
Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.