Apple and Google look like problematic heroes in the pandemic

New contact-tracing technology is supposed to go away after the pandemic. Privacy experts aren’t so sure it will.

A GovTech staff member demonstrates Singapore’s new contact-tracing smartphone app, called TraceTogether. Apple and Google’s contact-tracing tool has better privacy, but it isn’t perfect.
Catherine Lai/AFP via Getty Images
Sara Morrison is a senior Vox reporter who has covered data privacy, antitrust, and Big Tech’s power over us all for the site since 2019.

Apple and Google are a month away from launching a series of updates to their smartphone operating systems that will use Bluetooth signals to track potential coronavirus cases. This week, the companies confirmed to Recode that the contact-tracing technology will go away when the pandemic does, which should assuage some privacy concerns. But as two of the world’s tech superpowers prepare to embed new surveillance features into their devices, skepticism mounts. Will Apple and Google’s joint effort to fight the virus have unintended consequences?

When the contact-tracing tool was announced, there was no indication that the software required to make it work would be temporary. That it involved changes to mobile operating systems actually made it seem more likely to become a permanent fixture. Apple and Google have said this deep integration was the only way to enable the nonstop tracking necessary for the contact-tracing tool to function properly. But a permanent tool’s features could be repurposed once its intended use is no longer needed. So it’s a relief that Apple and Google plan to sunset the contact-tracing tool at the end of the pandemic, though details about exactly which software features will go away, and what qualifies as the “end of the pandemic,” still need to be explained.

By promising an expiration date, Apple and Google have addressed a chief concern for privacy advocates, some of whom have been uncharacteristically amenable to certain types of tracking during the pandemic. But many more questions remain. How will Apple and Google prevent their tool from being abused by governments with access to it? How will the companies ensure that contact-tracing systems remain optional for smartphone users? Will the tool be effective enough to warrant the privacy compromises it requires?

“This is an extraordinary time,” Bennett Cyphers, staff technologist at the Electronic Frontier Foundation, told Recode before the time limit was announced. “It means that we would accept some pretty extraordinary things that EFF would normally never endorse. But that has to come with some kind of limit on how it can be used and for how long it can be used.”

Although they seem to have answered the “how long” question, Apple and Google have so far provided limited information about the tool itself. The companies have released some technical documents showing how the system works, and they’ve answered a few questions from the media. But that’s about it. Beyond confirming that the tool would end, neither Apple nor Google has responded to Recode’s requests for comment.

The tool could leave a dangerous door open

The new Apple-Google tool sounds miraculous at first blush. Starting in mid-May, the companies plan to release software updates that will allow iOS and Android phones to exchange anonymized keys over Bluetooth with any other phones that come within a certain proximity. These features will enable interoperability between iOS and Android phones, and only public health authorities will be able to build apps on the application programming interface (API) that Apple and Google provide. If a user tests positive for coronavirus, they will inform the app, which will then alert people who have been close enough to the infected person for their phones to exchange Bluetooth keys. The alert will tell them they’ve been in contact with an infected person, without revealing the person’s identity.

Although this talk of tracking and tracing might sound daunting, the Apple-Google tool includes specific privacy protections: anonymized keys that change every 15 minutes, so no one can trace a specific key back to an individual; data stored on users’ devices rather than on a central server; proximity detection instead of location data; and an entirely opt-in program. Apple and Google have said privacy and user trust were at the forefront of their minds when developing the tool.
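The rotating-key idea can be sketched in a few lines of code. This is an illustrative simplification, not Apple and Google’s actual key schedule (their published technical documents specify their own derivation functions); the names `new_daily_key` and `rolling_identifier` are invented here.

```python
import hashlib
import hmac
import os

ROTATION_SECONDS = 15 * 60  # broadcast identifiers rotate every 15 minutes


def new_daily_key() -> bytes:
    """A random per-device key that stays on the phone."""
    return os.urandom(16)


def rolling_identifier(daily_key: bytes, timestamp: int) -> bytes:
    """Derive the short-lived identifier broadcast over Bluetooth.

    Each 15-minute interval yields a fresh identifier, so an observer
    cannot link broadcasts across intervals without the daily key.
    """
    interval = timestamp // ROTATION_SECONDS
    tag = hmac.new(daily_key, interval.to_bytes(8, "big"), hashlib.sha256)
    return tag.digest()[:16]


key = new_daily_key()
a = rolling_identifier(key, 1_000_000)                      # some moment
b = rolling_identifier(key, 1_000_060)                      # 60 s later: same window
c = rolling_identifier(key, 1_000_000 + ROTATION_SECONDS)   # next window: new identifier
```

In a scheme like this, a person who tests positive would upload only their recent daily keys; everyone else’s phone re-derives the matching identifiers locally and checks them against what it overheard, so raw contact logs never leave the device.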

Not everyone believes these good intentions can hold. Apple and Google’s privacy promises ring hollow to people who have seen how both companies have built themselves on the back of privacy compromises, many of which were made without the consumer’s knowledge.

“Two corporations, Apple and Google, have come to dominate the smartphone software ecosystem, and they have spent years spying on users and enabling consumer surveillance in their app stores,” Michael Kwet, a visiting fellow at Yale Law School’s Information Society Project, told Recode. “In the world we built, we now have to weigh the fate of our lives and economy against trust in Apple and Google, the ad-tech industry they support, and government intelligence agencies. … This is a nightmare.”

Just look at how well the tools hidden in many apps available through Apple and Google marketplaces can track you. And Google has trackers installed all over the internet, gathering first- and third-party data about potentially everything you do online. One way or another, everyone from location-data brokers to law enforcement can get access to a lot of your data through these companies’ devices. Apple and Google have made several efforts to combat some of these intrusions, but such intervention only shows that the companies can’t foresee all of the unintended consequences their innovations may have. They can only respond to them after the fact.

We also don’t yet know which countries, states, or cities will be participating in the Apple-Google contact-tracing effort. We do know that the API will be made available only to those governments’ public health authorities, though it’s unclear if the companies will take measures to prevent authoritarian governments from using the technology in unintended ways.

Along those lines, while users must opt in to the contact-tracing feature, we don’t know whether health authorities will be able to build apps on top of the Apple-Google technology that enable more invasive tracking. We’ve already seen systems like this in other parts of the world. The Chinese government, for example, made an app that assigns a health code to users, who must then show a “healthy” code in order to move around freely. The Apple-Google tool doesn’t do this, but it could be used to perform a similar function.

“I think there’s a very real possibility that businesses could, for example, require that their employees show proof of ‘non-infection’ before they’re permitted to return to work — voluntarily, of course,” Ashkan Soltani, a former Federal Trade Commission chief technologist who has written about privacy issues and Bluetooth tracking, told Recode.

Soltani also wondered what Apple and Google will do to prevent developers from adding identifiers in the apps they build using the contact-tracing API, such as location information and names.

On Thursday, the European Union released a list of privacy requirements for member states developing contact-tracing apps, suggesting the EU has some privacy concerns with regard to the Apple-Google tool. On the list is “urgent engagement with owners of the mobile operating systems” to ensure that the tool is “compatible with the EU common approach.” Meanwhile, the United Kingdom’s National Health Service was reportedly looking into ways to identify supposedly anonymous users of the contact-tracing app it is developing. The NHS plans to integrate the Apple-Google tool into that app, according to the BBC.

It’s not inconceivable that something like this could happen in the United States, where there are already instances of public health authorities sharing positive coronavirus tests with the police. Sen. Richard Blumenthal, an advocate of data-privacy legislation, said in a statement that he “urgently want[ed] to know how Apple and Google will assure that consumers’ privacy interests are strongly balanced with the legitimate needs of public health officials during the coronavirus pandemic,” adding that “a public health crisis cannot be a pretense to pave over our privacy laws or legitimize tech companies’ intrusive data collection about Americans’ personal lives.”

Bluetooth technology may not be up to the job

Privacy and security are chief among the concerns with the Apple-Google tool, but there are also issues with the technology itself. While more precise than GPS, Bluetooth signals may not be good enough to determine the proximity of other devices with the accuracy contact tracing demands. The precision of these signals depends on several factors, and some estimates suggest the technology could struggle with the 6-foot social distancing recommendation. If a Bluetooth signal can only place another phone within 30 feet, for example, you could get a notification that you were near an infected person when you were actually at a safe distance. (Exactly how far the coronavirus can travel in the air remains unclear.) In a densely populated location, these false positives could happen so frequently that they become meaningless.
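To see why the estimates are so loose, consider how proximity is usually inferred from Bluetooth: distance isn’t measured directly but estimated from received signal strength (RSSI) with a path-loss model. The sketch below uses a standard log-distance model; the default values for `tx_power_dbm` (the signal strength expected at 1 meter) and the path-loss exponent are assumptions for illustration, since real phones and rooms vary widely.

```python
def estimated_distance_m(rssi_dbm: float,
                         tx_power_dbm: float = -59.0,
                         path_loss_exponent: float = 2.0) -> float:
    """Log-distance path-loss estimate of distance in meters.

    tx_power_dbm is the signal strength expected at 1 m; the exponent
    models how quickly the signal decays with distance.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))


# A few decibels of noise, easily caused by a pocket, a body, or a wall,
# swings the estimate across the roughly 1.8 m (6-foot) threshold.
near = estimated_distance_m(-63.0)  # about 1.6 m: flagged as "too close"
far = estimated_distance_m(-70.0)   # about 3.5 m: "safe distance"
```

The same 7 dB swing in the other direction turns a real exposure into an apparent safe distance, which is where false negatives come from.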

Bluetooth signals can also travel through physical barriers, so the Apple-Google contact-tracing tool might get mixed up if there’s a wall between two devices. That means you could get a notification about being exposed to the coronavirus when, in reality, you and the other person are in two separate apartments. The opposite could also be true: You might not get a notification when you have been exposed in a way the Bluetooth-based system didn’t register.

“The false negatives are the ones that concern me more,” Susan Landau, a cybersecurity and policy professor at The Fletcher School at Tufts University, told Recode.

False negatives could happen in many ways. The infected person might not use the tool, or might not have their phone on them when they come near you. Or someone with the virus might sneeze or cough more than 6 feet away from you; they wouldn’t trigger the proximity alert, but you could still have been exposed. And if governments use this data to justify lifting shelter-in-place orders, any inaccuracies could be very costly.

“It’s essentially going to give us a false sense of safety while simultaneously infringing on people’s rights,” Soltani said.

Voluntary participation may not be enough

Finally, assuming the privacy issues don’t lead to any sinister outcomes and the Bluetooth technology pulls through, the Apple-Google contact-tracing tool could still fail on the numbers. A minimum percentage of the population has to participate in digital contact tracing for it to be effective; an Oxford University study puts that minimum at about 60 percent. And while integrating the tool into the vast majority of smartphone operating systems (an estimated 3 billion people worldwide own an Apple or Android smartphone) is one of the better ways to encourage adoption, it still leaves out the billions of people who don’t have smartphones.
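A back-of-the-envelope calculation shows why the adoption threshold bites: for any given contact to be traceable, both people involved have to be running the app. Assuming participation is uniform and independent (a simplification, since real adoption clusters by age and income), the traceable share of contacts is the square of the adoption rate.

```python
def traceable_share(adoption_rate: float) -> float:
    """Fraction of contacts in which both parties run the app,
    assuming uniform, independent adoption across the population."""
    return adoption_rate ** 2


# Even at the Oxford study's roughly 60 percent adoption, only about
# a third of all contacts involve two app users.
share_at_60 = traceable_share(0.60)  # 0.36
```

That gap between “60 percent of people” and “36 percent of contacts” is why researchers treat the threshold as a floor rather than a comfortable target.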

Even in America, where 81 percent of adults own smartphones, mass adoption will be tricky from both a cultural and practical perspective. Smartphones are still less accessible to lower-income people and seniors, populations that have been hit particularly hard by the pandemic and will increasingly be left behind. Those who do have phones will have to download the latest software updates (and have newer-model phones that can support the updates), download the app from the relevant public health authority, carry their phones with them everywhere they go, and want to participate in the first place. The bar to entry might simply be too high for too many people.

“We cannot solve a pandemic by coding the perfect app,” the Electronic Frontier Foundation said in its report about such digital contact-tracing tools. “Hard societal problems are not solved by magical technology, among other reasons because not everyone will have access to the necessary smartphones and infrastructure to make this work.”

Even if the Google-Apple tool achieves the necessary widespread adoption, it alone may not be enough to eliminate or reduce the spread of the virus. Michael Osterholm, director of the Center for Infectious Disease Research and Policy at the University of Minnesota, said in a recent Project on Government Oversight panel discussion that contact tracing alone was a “somewhat superficial and not very meaningful” solution.

“When your house is on fire, there’s no way to do contact tracing,” Osterholm said. “[Coronavirus] is everywhere ... The only thing we know that really will work is a primary shutdown.”

This is especially true when you consider that many people who have the virus and are contagious are asymptomatic, meaning they never get tested at all. Only a mass-testing program, experts say, will allow people to safely stop social distancing. We need to significantly increase our testing capabilities for this; estimates range from 750,000 tests a week to 35 million a day. We aren’t anywhere near even the lower end yet, and there are many factors preventing us from achieving it anytime soon, from a shortage of basic materials like testing swabs to questions about the tests’ accuracy.

Landau, the Tufts professor, noted that even if Apple and Google use the tool only for the duration of the pandemic, we don’t know how they’ll define the end of the pandemic, and no one knows when that will be.

“It’s tempting to say after a certain time, the app will die,” she said. “We don’t know how long that will be. We don’t know if we’ll have a vaccine. We don’t know what percentage of the population will respond to that vaccine. There’s all sorts of unknowns.”

Even assuming Apple and Google have the most altruistic of intentions, their tool likely won’t be the pandemic-fighting silver bullet we all want. If we don’t know that it will solve this problem, it’s harder to justify the additional problems it will create. Altering their phones’ operating systems — even if that change is temporary — is a big deal. We can only hope the ends justify the means.

Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.
