Apple’s New Privacy Technology May Pressure Competitors to Better Protect Our Data

Posted on August 3, 2016, 5:30 pm

A decade-old idea from Microsoft gets its first big test thanks to Apple.

Ten years ago, researchers at Microsoft introduced a breakthrough approach to protecting privacy in the age of big data. Later this year their idea, known as differential privacy, will get its highest-profile tryout yet, thanks to Apple.

Differential privacy provides a way to mathematically guarantee that statistics computed over a pool of data collected from many people can’t be used to reveal much about the contribution of any one individual. Apple has built it into the new version of iOS, its mobile operating system for iPhones and iPads, due to be released this fall.
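
The article leaves that guarantee informal, but the standard definition from the academic literature is compact. A randomized algorithm M satisfies ε-differential privacy if, for every pair of datasets D and D′ that differ in a single person’s record, and for every set S of possible outputs:

    \Pr[\mathcal{M}(D) \in S] \le e^{\varepsilon} \cdot \Pr[\mathcal{M}(D') \in S]

The smaller the privacy parameter ε, the less any one person’s record can shift the distribution of results, and the less an observer of those results can infer about whether a given individual’s data was included at all.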

On a quarterly investor call last week, Apple CEO Tim Cook boasted that the technology would let his company “deliver the kinds of services we dream of without compromising on individual privacy.” Apple will initially use the technique to track trends in what people type and tap on their phones to improve its predictive keyboard and Spotlight search tool, without learning what exactly any individual typed or clicked.

Apple will use the method to collect data on the words and emojis typed on the iPhone keyboard, links clicked within apps, and the words people look up in the Notes app.

[Photo: Tim Cook]

Privacy experts are cautiously hopeful that Apple’s move could force other tech companies to adopt an idea that is seen as a gold standard in academia but has so far gained little traction among tech companies.

“It’s exciting that things we knew how to do in principle are being embraced and widely deployed,” says Aaron Roth, an associate professor at the University of Pennsylvania who has written a textbook on differential privacy. “Apple seems to be betting that by including privacy protections, and advertising that fact, they will make their product more attractive.”

In the version of differential privacy Apple is using, known as the local model, software on a person’s device adds noise to data before it is transmitted to Apple. The company never gets hold of the raw data. Its data scientists can still examine trends in how people use their phones by accounting for the noise, but are unable to tell anything about the specific activity of any one individual.
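
Apple has not published its algorithms, so as a concrete illustration of the local model, here is a minimal Python sketch of randomized response, the oldest mechanism of this kind. The function names, the choice of ε, and the simulated data are illustrative assumptions, not Apple’s design.

    import math
    import random

    def local_report(truth: bool, epsilon: float) -> bool:
        """On the user's device: report the true bit with probability
        e^eps / (1 + e^eps), otherwise report its opposite, so the raw
        value never leaves the phone."""
        p_truth = math.exp(epsilon) / (1 + math.exp(epsilon))
        return truth if random.random() < p_truth else not truth

    def estimate_fraction(reports, epsilon: float) -> float:
        """On the server: debias the noisy reports. Because
        E[mean(reports)] = p*f + (1 - p)*(1 - f), solving for the true
        fraction f gives the formula below."""
        p = math.exp(epsilon) / (1 + math.exp(epsilon))
        observed = sum(reports) / len(reports)
        return (observed + p - 1) / (2 * p - 1)

    # Simulation: 100,000 users, 30 percent of whom really typed the word.
    random.seed(0)
    eps = 1.0
    reports = [local_report(random.random() < 0.3, eps) for _ in range(100_000)]
    print(round(estimate_fraction(reports, eps), 3))  # prints a value near 0.3

The population trend comes through clearly, but any single report is deniable: a user whose device sent “true” had a real chance of never having typed the word at all.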

Apple is not the first technology giant to implement differential privacy. In 2014 Google released code for a system called RAPPOR that it uses to collect data from the Chrome Web browser using the local model of differential privacy. But Google has not promoted its use of the technology as aggressively as Apple, which has this year made a new effort to highlight its attention to privacy (see “Apple Rolls Out Privacy-Sensitive Artificial Intelligence”).
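
RAPPOR’s full pipeline encodes each value into a Bloom filter and randomizes it twice, once permanently and once per report. The simplified Python sketch below captures only the spirit of the permanent randomized response step; the four-item vocabulary and the flip parameter F are hypothetical choices for illustration, not RAPPOR’s actual encoding.

    import random

    VOCAB = ["joy", "heart", "fire", "sob"]  # hypothetical item set
    F = 0.5  # randomization strength, analogous to RAPPOR's f parameter

    def perturb(item: str) -> list:
        """On the device: one-hot encode the item over VOCAB, then set each
        bit to 1 with probability F/2, to 0 with probability F/2, and keep
        the true bit otherwise."""
        noisy = []
        for v in VOCAB:
            bit = 1 if v == item else 0
            r = random.random()
            noisy.append(1 if r < F / 2 else 0 if r < F else bit)
        return noisy

    def estimate_counts(reports) -> dict:
        """On the server: since E[reported bit] = (1 - F) * true_bit + F/2,
        subtract the expected noise and rescale to recover item counts."""
        n = len(reports)
        return {
            v: (sum(r[i] for r in reports) - n * F / 2) / (1 - F)
            for i, v in enumerate(VOCAB)
        }

With enough reports, the recovered counts track each item’s true popularity even though any individual report is heavily randomized.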

The iPhone maker has not made its differential privacy code open source as Google did with RAPPOR, leading critics to argue that users can’t be sure it really works. But the company intends to release a paper outlining its system later this year. Apple also allowed Roth to review its differential privacy algorithms, which he says are designed in line with the principles of the technique.

Arvind Narayanan, an assistant professor at Princeton University, is hopeful that Apple’s privacy stand will pressure other companies to follow suit. The popularity of Snapchat’s disappearing messages, and the occasional outcry when a company is caught doing something that looks unseemly, show that people do care about privacy, even if the tech industry provides few opportunities to express that, he says.

“People want to exercise these choices but they can only do it at certain moments, within the limitations of their time and technical ability,” says Narayanan. Pew Internet reported last year that 65 percent of Americans think it is “very important” to be in control of what information is collected about them.

Google and other tech companies have followed Apple’s lead on privacy-enhancing technology before. When Apple introduced its iMessage mobile messaging service in 2011, the end-to-end encryption that prevented even Apple from reading users’ messages was unusual; that protection hadn’t appeared in mass-market software before.

Other companies later followed in Apple’s footsteps as the company continued to tout the technology’s benefits, irking the FBI and other government agencies in the process. Facebook’s WhatsApp activated end-to-end encryption for its more than one billion users this year, and Google will offer it as an option in its Allo messenger app, due this summer.

Differential privacy does have downsides for companies that stand to make money from user data, though. Implementing any new technology takes time, and a customized differential privacy algorithm has to be designed for each new type of data a company wants to collect.

The technique could also constrain the experimentation seen as vital to technology companies by taking away the live databases of user information that can be tinkered with in search of lucrative new ideas. “Companies are going to be making a calculation of whether customers care enough about privacy for it to be worth the effort to use this,” says Roth.
