
Although the public discussion about the security of electronic devices and user data has calmed down considerably since the end of March, when Apple's dispute with the FBI over the security of iOS ended, Apple continued to emphasize the protection of its customers' privacy during Monday's keynote at WWDC 2016.

After the presentation of iOS 10, Craig Federighi mentioned that end-to-end encryption (a system in which only the sender and recipient can read the information) is enabled by default for applications and services such as FaceTime, iMessage, and the new Home. For many features that rely on content analysis, such as the new grouping of photos into "Memories", the entire analysis runs directly on the device, so the information never passes through any intermediary.

In addition, even when a user searches the web or in Maps, Apple does not use the information provided for profiling, nor does it ever sell it.

Finally, Federighi described the concept of "differential privacy". Apple, too, collects data from its users in order to learn how they use various services and to make those services more effective (suggesting words, frequently used applications, and so on), but it wants to do so without disturbing their privacy in any way.

Differential privacy is an area of research in statistics and data analysis that uses various techniques during data collection so that information is obtained about a group but not about individuals. Importantly, differential privacy makes it impossible to attribute the data to specific sources, both for Apple and for anyone else who might gain access to its statistics.
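To illustrate the principle, here is a minimal Swift sketch of the Laplace mechanism, one of the standard techniques in the differential-privacy literature. This is not Apple's actual implementation; the function names, the example count, and the epsilon value are all illustrative. Random noise is added to an aggregate count, so the group statistic remains usable while any single person's contribution is hidden.

```swift
import Foundation

// A minimal sketch of the Laplace mechanism, a standard technique from
// the differential-privacy literature. Names and parameters here are
// illustrative; this is not Apple's disclosed implementation.
func laplaceNoise(scale: Double) -> Double {
    // Inverse-CDF sampling from a Laplace(0, scale) distribution;
    // u must come from the open interval (-0.5, 0.5).
    var u = Double.random(in: -0.5..<0.5)
    while u == -0.5 { u = Double.random(in: -0.5..<0.5) }
    return -scale * (u < 0 ? -1.0 : 1.0) * log(1 - 2 * abs(u))
}

// Releases a count with epsilon-differential privacy. Adding or removing
// one user changes the count by at most 1, so the noise scale is 1/epsilon.
func privateCount(_ trueCount: Int, epsilon: Double) -> Double {
    return Double(trueCount) + laplaceNoise(scale: 1.0 / epsilon)
}

// Example: suppose 4,213 users tapped a suggested word. The released
// value stays close to the true total, but no individual can be singled
// out from it.
print(privateCount(4213, epsilon: 0.5))
```

The smaller the epsilon, the more noise is added and the stronger the privacy guarantee, at the cost of accuracy in the released statistic.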

In his presentation, Federighi mentioned three of the techniques the company uses: hashing, a cryptographic function that, simply put, irreversibly scrambles the input data; subsampling, which keeps only part of the data, reducing its volume; and "noise injection", which inserts randomly generated information into the user data. A sketch of the noise-injection idea follows.
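One classic form of noise injection from the local differential-privacy literature is "randomized response"; the specifics of Apple's client-side mechanism were not disclosed in the keynote, so the Swift sketch below is an illustration of the general idea only. Each device randomizes its own answer before it ever leaves the phone, yet the aggregator can still recover the group-level rate.

```swift
import Foundation

// Classic randomized response: a noise-injection technique in which each
// user's answer is randomized on-device. This illustrates the general
// idea, not Apple's disclosed mechanism.
func randomizedResponse(_ truth: Bool) -> Bool {
    // With probability 1/2 report the truth, otherwise report a coin flip.
    return Bool.random() ? truth : Bool.random()
}

// Simulated population: 30% of users truly answer "yes".
let population = (0..<10_000).map { _ in Double.random(in: 0..<1) < 0.3 }
let reports = population.map(randomizedResponse)

// Observed yes-rate = 0.5 * trueRate + 0.25, so the aggregator inverts it:
let observedYesRate = Double(reports.filter { $0 }.count) / Double(reports.count)
let estimatedTrueRate = 2 * observedYesRate - 0.5
print("Estimated true rate:", estimatedTrueRate) // ≈ 0.3
```

Any single report is deniable, since it may simply be the result of the coin flip, which is exactly the property that prevents data from being attributed to a specific source.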

Aaron Roth, a professor at the University of Pennsylvania who studies differential privacy closely, described it as more than a simple anonymization process that strips identifying information from behavioral data. Differential privacy provides a mathematical proof that the collected data can be attributed only to the group, not to the individuals of which it is composed. This protects individuals' privacy against all possible future attacks, something anonymization processes cannot do.

Apple is said to have significantly advanced the practical application of this principle. Federighi quoted Aaron Roth on stage: "The broad integration of differential privacy into Apple's technologies is visionary and clearly makes Apple a privacy leader among today's technology companies."

When Wired asked how consistently Apple applies differential privacy, Aaron Roth declined to go into specifics, but said he thinks the company is "doing it right."

Source: Wired