A theory has taken hold in tech: Apple’s devotion to privacy will handicap it during the next major wave of computing, where artificial intelligence features like voice interaction, personal assistants and automation take center stage.
This morning Apple gave its response: It won’t handicap us, because we can do both.
Apple’s answer? A concept called "differential privacy" — an in-vogue statistical method designed to reap useful intel from big piles of data while protecting the personally identifying information therein.
It’s a fitting approach. Apple has branded itself as antithetical to Google and Facebook, companies that rely on reams of data. But Apple also wants to provide the perks these companies offer — smarter, more personalized services — that require reams of data. Most of the operating system updates Apple introduced on Monday at its developer conference revolve around these perks.
"All of this great work in iOS 10 would be meaningless if it came at the expense of your privacy," Apple SVP Craig Federighi said onstage.
Apple touted an endorsement from Aaron Roth, a computer scientist who specializes in differential privacy.
But the method is untested at the scale Apple intends, across all of its devices. And some researchers are skeptical of its real-world applicability. One legal paper argued that the differential privacy method "will usually produce either very wrong research results or very useless privacy protections."
So it’s unclear whether Apple can deploy the method and still deliver its advanced software — or, for that matter, really provide the safeguards it promises.
Federighi went out of his way to note that Apple doesn’t assemble user profiles and it encrypts communications on iMessage and FaceTime. But its embrace of differential privacy acknowledges the value of data in delivering good software. Apple recognizes that it needs to analyze user behavior to improve the accuracy of its recommendations — the links users choose most often in response to a Spotlight search query, or the emojis that are most popular.
To obscure a person’s identity, the company said it will inject a small amount of "noise," or randomness, to what the user does — so each little interaction, on its own, is meaningless. But over time, trends will emerge that help improve features, like the QuickType recommendations or the links Spotlight suggests.
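Apple didn’t spell out its exact mechanism onstage, but the textbook version of this kind of local noise injection is "randomized response," in which each individual report is plausibly deniable while the aggregate still reveals the trend. A minimal sketch in Python, with made-up parameters purely for illustration:

```python
import random

def randomized_response(truth: bool, p_truth: float = 0.75) -> bool:
    """Report the true answer with probability p_truth; otherwise report
    a fair coin flip. Any single report is plausibly deniable."""
    if random.random() < p_truth:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports: list[bool], p_truth: float = 0.75) -> float:
    """Undo the noise in aggregate: observed = p_truth * true_rate
    + (1 - p_truth) * 0.5, so solve for the true rate."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth

# Example: 100,000 users, 30% of whom actually use a hypothetical emoji.
truths = [random.random() < 0.30 for _ in range(100_000)]
reports = [randomized_response(t) for t in truths]
print(round(estimate_true_rate(reports), 3))  # lands close to 0.30
```

No single report says anything reliable about one user, but with enough reports the noise averages out and the popularity estimate becomes accurate.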
Apple also imposes a privacy "budget" on individual users, so that it doesn't collect too much information from any one user.
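Apple hasn’t published how that budget is calculated, but in the research literature each noisy report "spends" a quantified amount of privacy loss (usually called epsilon), and those losses add up across reports; a per-user cap limits the total. A hypothetical sketch of that bookkeeping, with invented numbers:

```python
class PrivacyBudget:
    """Track cumulative privacy loss (epsilon) for one user per day.
    The cap and per-report costs here are invented, illustrative numbers."""

    def __init__(self, daily_cap: float = 4.0):
        self.daily_cap = daily_cap
        self.spent = 0.0

    def try_submit(self, epsilon_cost: float) -> bool:
        """Send a noisy report only if it fits in the remaining budget."""
        if self.spent + epsilon_cost > self.daily_cap:
            return False  # withhold the report rather than leak more
        self.spent += epsilon_cost
        return True

budget = PrivacyBudget()
sent = [budget.try_submit(epsilon_cost=1.0) for _ in range(10)]
print(sent.count(True))  # only the first 4 reports go out
```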
"It's a powerful technology that allows us to use information from users and still maintain user privacy," said Apple's Sebastien Marineau-Mes in remarks during the keynote event.
Still, it's a relatively new approach. And despite the emphasis on privacy, critics question whether any amount of obscuring the data is enough to guard private information.
There’s a lot of interest in differential privacy at Microsoft. And Google is experimenting with the approach through its Rappor project, which helps identify sites likely to infect its Chrome browser with malware.
Cynthia Dwork, a researcher with Microsoft, helped invent this approach and has published on the topic for years. She offered one example: Health researchers conduct a survey to see how many people carry the trait for sickle cell anemia within a particular group.
A straight-ahead data analysis would yield a precise result — say, seven people. But researchers could compromise someone’s privacy — say, the president’s — with a second question: How many people in the group, other than the president, have this inherited trait?
"If you have exact answers to both questions, then you can determine whether or not the president has the sickle cell trait," said Dwork. "By adding independently generated noise to both answers, we protect the privacy of the president."
This article originally appeared on Recode.net.