
Apple announced it will introduce differential privacy in iOS and macOS, meaning it will securely collect data from its users in order to provide advanced AI services.
Connected technologies are increasingly valued in modern life, affecting not just consumer markets but also public transport and energy supply.
This new approach helps build insights that were not possible before. Google already collects a large quantity of information from its services. However, many users are concerned about their privacy and want more control over what kinds of information about them companies get to use.
Apple’s Approach to AI Systems
Apple wants to build a system that protects user privacy while still collecting information useful for developing future features for its services.
An example of how the AI will work: the system might recommend a certain emoji if usage patterns show that the user in question prefers that drawing over others.
Differential privacy involves collecting data and usage patterns while obscuring private information to protect the person. The new system promises to keep individual identities protected.
Apple will analyze the collected user data to discover general patterns that will later serve as the foundation of an enhanced user experience.
The company has already announced that it will use data collected with differential privacy methods to enhance app search. Its algorithms will crowdsource the popularity of deep links in order to count visit frequency, while never associating a link with one particular user.
How is Differential Privacy Better?
The algorithm will combine hashing, noise injection, and sub-sampling in order to secure users' privacy.
Hashing, for example, means scrambling the data before storage, making the owner of the data difficult to trace.
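A minimal sketch of what hashing an identifier before storage can look like. This is purely illustrative; Apple has not published the details of its scheme, and the function name, salt, and sample identifier here are invented for the example.

```python
import hashlib

def hash_identifier(user_id: str, salt: str) -> str:
    """Scramble an identifier before storage.

    The stored token is a one-way hash: it can be compared and counted,
    but the original identifier cannot be read back out of it.
    (Illustrative sketch only, not Apple's actual scheme.)
    """
    return hashlib.sha256((salt + user_id).encode("utf-8")).hexdigest()

token = hash_identifier("user-12345", salt="per-device-secret")
```

Salting with a per-device secret means the same identifier hashes differently on different devices, which makes cross-device correlation harder.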
Sub-sampling is a technique that lets Apple analyze information directly on the person's device, extracting only the pattern without any personal details.
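One common way to sub-sample on-device is to keep only a small random fraction of events before anything is reported, so any single record has a low chance of ever leaving the device. A sketch under assumed names (this is a generic illustration, not Apple's implementation):

```python
import random

def subsample_events(events, rate=0.05, rng=None):
    """Keep roughly `rate` of the events, chosen at random.

    Events that are dropped never leave the device, so a single user's
    single action is unlikely to appear in the aggregate at all.
    (Generic sketch; parameter names and rate are assumptions.)
    """
    rng = rng or random.Random()
    return [e for e in events if rng.random() < rate]

sample = subsample_events(list(range(1000)), rate=0.05,
                          rng=random.Random(0))
```

With enough devices contributing, the sampled aggregate still reflects population-level patterns even though most individual events are discarded.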
Noise injection adds random data to the collected information, protecting it against cross-referencing.
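The classic textbook form of noise injection is randomized response: each device flips a coin before answering, so any individual report is deniable, yet the true population rate can still be estimated from the noisy aggregate. A sketch of that idea (this illustrates the general technique, not Apple's specific algorithm):

```python
import random

def randomized_response(truth: bool, rng=None) -> bool:
    """Report the truth half the time; otherwise report a random coin flip.

    Any single report could plausibly be noise, which is what protects
    the individual. (Textbook technique, not Apple's exact mechanism.)
    """
    rng = rng or random.Random()
    if rng.random() < 0.5:
        return truth
    return rng.random() < 0.5

def estimate_true_rate(reports):
    """Invert the noise: observed = 0.5 * true + 0.25."""
    observed = sum(reports) / len(reports)
    return 2 * observed - 0.5

rng = random.Random(42)
truths = [i < 300 for i in range(1000)]          # true rate is 30%
reports = [randomized_response(t, rng) for t in truths]
estimate = estimate_true_rate(reports)           # close to 0.30
```

No single report reveals the person's true answer, but across a thousand devices the aggregate estimate lands near the real 30% rate.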
In comparison, Google’s anonymized-data method removes any personally identifying information from the data package before using it. That system has its flaws: in 2007, University of Texas at Austin researchers managed to link an anonymized Netflix database with public IMDb information and compromised the identities of a portion of Netflix users.
Differential privacy was built to resist deanonymization attempts and can thus better protect users’ privacy.