But then I focused on some of the details, and a certain simplicity emerged:
Personal information will only be shared by Apple to provide or improve our products, services and advertising; it will not be shared with third parties for their marketing purposes.
Opting out through oo.apple.com applies only to Apple advertising services and does not affect interest-based advertising from other advertising networks. However, if you select Limit Ad Tracking on your mobile device, third-party apps are not permitted by contract to use the Advertising Identifier.
The iTunes EULA this is not. The longer I looked at the pages of small type, the more I realized how different it was from policies I've read from most other large companies. (That's one of the career hazards of being a security analyst: You have to read a lot of these things.)
The user rules
Yet even the written policy doesn't seem to reflect how Apple really views privacy, as reinforced in the WWDC keynote:
- iOS extensions are designed so they cannot circumvent a user's privacy settings. No keyboards sniffing keystrokes and sending them off to the Internet (as has happened on Android).
- Both HealthKit and HomeKit are designed so users control their own data, and must explicitly allow it to be shared with outsiders.
- With Touch ID, not only does your fingerprint never leave the device, but apps can never see anything stored in the Secure Enclave.
- The privacy-minded DuckDuckGo search engine will be a default option, right next to Bing and Google.
And when you really dig into the details, you learn that Apple lets you NSA-proof your iCloud keychain, encrypts Messages and FaceTime calls end-to-end, protects an employee's personal information from his or her employer when using Mobile Device Management, and has designed the iPhone without law-enforcement back doors.
But in the most telling recent news of all, it appears that Apple will randomize the WiFi hardware address of iOS devices to frustrate location and advertising trackers that use this address to know who you are as you move around in public. This is a subtle feature that the vast majority of iOS users won't ever realize exists, even as it protects them.
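The idea behind hardware-address randomization can be sketched in a few lines. This is an illustration of the general technique, not Apple's implementation: a device generates a fresh, locally administered MAC address for Wi-Fi probe requests, so passive trackers can't correlate sightings to a single hardware identity.

```python
import random

def random_private_mac() -> str:
    """Generate a randomized, locally administered MAC address.

    Setting bit 1 of the first octet marks the address as locally
    administered (not a manufacturer-assigned identity); clearing
    bit 0 keeps it unicast. A device using a fresh address like this
    for network scans reveals no stable identifier to trackers.
    """
    octets = [random.randint(0, 255) for _ in range(6)]
    octets[0] = (octets[0] | 0x02) & 0xFE  # locally administered, unicast
    return ":".join(f"{o:02x}" for o in octets)
```

Because the locally-administered bit is defined by the IEEE 802 addressing scheme, network equipment treats such addresses as valid while knowing they carry no manufacturer identity.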
With every iteration of OS X, iOS, and iCloud, we see Apple increase the privacy protections it provides its users. It has consistently enabled customers to protect their personal information from advertisers, governments, third-party developers, and even Apple itself.
This is a company that destroys the keys to its encryption hardware after setting them up in the data center, just in case an employee decides to sneak in a back door or hand the keys off to a government agency. It designed systems like iMessage that a government could force it to sniff only by compelling a fundamental change to the system architecture.
The question becomes, why? These changes, in some cases, affect usability--popping up reminders and approvals for every application that wants access to location data or our photo libraries, say, or implementing sandboxes that constrain developers (causing some to leave the Mac App Store completely).
I believe the answer is profit, with a smidgen of righteous anger.
Corporations generally limit their altruism to charity, not to core product and business decisions. Apple likely sees a competitive advantage in privacy, especially when its biggest direct competition comes from advertising giant Google and the enterprise-friendly Microsoft. Apple believes consumers not only desire privacy, but will increasingly value privacy as a factor in their buying decisions.
Plus, even CEOs and product managers get creeped out when the government reads their email.
Look hard across Apple's security and privacy technologies and practices, and a set of principles emerges:
Customers own their data. Vendors (including Apple itself) must ask for permission before collecting that data, or letting anyone else collect it. Both iOS and OS X ask before sending data to Apple, and now include granular controls on what applications can see what data, all at the user's control.
Collect the smallest amount of data needed for usability, anonymize it when possible, and delete it when you no longer need it. For example, Siri data is associated with a random number, not your Apple ID, and voice data is deleted after 6 months.
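That pattern of minimization can be sketched as follows. The class and retention window below are illustrative, not Apple's actual system: collected data is keyed to a random token rather than an account identifier, and purged once it ages past the retention window.

```python
import time
import uuid

RETENTION_SECONDS = 6 * 30 * 24 * 3600  # illustrative ~6-month window

class AnonymizedStore:
    """Store samples under a random token instead of an account ID."""

    def __init__(self):
        self._records = {}  # token -> (timestamp, payload)

    def collect(self, payload):
        # A random ID: nothing here links the sample to a user account.
        token = uuid.uuid4().hex
        self._records[token] = (time.time(), payload)
        return token

    def purge_expired(self, now=None):
        """Delete records older than the retention window."""
        now = time.time() if now is None else now
        expired = [t for t, (ts, _) in self._records.items()
                   if now - ts > RETENTION_SECONDS]
        for t in expired:
            del self._records[t]
        return len(expired)
```

The design point is that even a breach of such a store yields samples with no durable link back to an identity, and old data simply stops existing.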
Encrypt as much as possible, while maintaining usability. iCloud Mail and iWork in the Cloud encrypt data, but need to see it for the cloud services to work. But Apple doesn't need to read iMessages, so those are encrypted end to end.
No back doors. All application data on iOS is encrypted with your passcode and a secret hardware key unique to your device, embedded in the hardware, that Apple doesn't track and can't recover.
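The concept of entangling a passcode with an unextractable hardware secret can be sketched with a standard key-derivation function. This is a sketch of the idea only; the function name, algorithm, and parameters are illustrative, not Apple's actual design:

```python
import hashlib
import os

def derive_data_key(passcode: str, device_secret: bytes) -> bytes:
    """Illustrative key derivation: mix the user's passcode with a
    device-unique hardware secret, so the resulting key can only be
    computed on that one device. (Not Apple's actual algorithm.)
    """
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode("utf-8"),
        device_secret,  # burned into hardware, never leaves the device
        iterations=100_000,
        dklen=32,
    )
```

Because the hardware secret never leaves the device and isn't recorded anywhere, any attempt to guess the passcode must run on the device itself, and no third party, Apple included, can derive the key offline.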
Protect customers from privacy abuse by developers, employers, and governments. Apps can't access personal or location data stored on iOS or OS X without permission, and you can remove permission whenever you want. (This is inherent to app sandboxes.) If you own your iOS device, even with Mobile Device Management your employer can't access your private data. Across the board, Apple continues to add technology, such as iOS Extensions, to enhance the platform without reducing privacy. Apple even locked developers out of access to device IDs when they were being abused for tracking and advertising.
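The permission model described above can be sketched as a grant table that sits between apps and personal data, checked on every access so that revocation takes effect immediately. The types here are hypothetical; iOS's real entitlement and sandbox system is far richer:

```python
class PermissionDenied(Exception):
    pass

class PrivacyBroker:
    """User-controlled gatekeeper between apps and personal data."""

    def __init__(self):
        self._grants = set()  # (app, resource) pairs the user approved

    def grant(self, app, resource):
        self._grants.add((app, resource))

    def revoke(self, app, resource):
        self._grants.discard((app, resource))

    def read(self, app, resource, data):
        # Checked at request time, not install time, so the user can
        # withdraw access whenever they want.
        if (app, resource) not in self._grants:
            raise PermissionDenied(f"{app} may not read {resource}")
        return data[resource]
```

The key design choice is that the check lives in the platform, outside the app's sandbox, so no app can read around it.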
A critical advantage
This is all quite different from Google or Facebook, which collect and store massive amounts of identifiable data as part of their core businesses. It also separates Apple from Microsoft, which places the needs of enterprise customers ahead of consumers, under the assumption that the enterprise is the owner of the technology (a rapidly declining trend). Lastly, it increases the trust a consumer has not only in the hardware, but also in the applications running on it.
Apple is leveraging its business model and technologies to create a difficult, if not insurmountable, gap for competitors to cross.
Google can't stop scanning user email, since targeted advertising is its core business. Facebook won't encrypt messages end-to-end for the very same reason. Microsoft can't restrict enterprise administrators from controlling phones and computers, since enterprise manageability is core to its primary customer base, especially as it loses ground in the consumer market. Android--okay, Google--can't dictate hardware design, and thus can't consistently secure customer data on the device. Essentially, Apple uses the difference in its business model to attack competitors on privacy.
Apple makes its money selling hardware to consumers. All of its software and services are predominantly there to drive hardware (and to a lesser extent, media) sales. The consumer is the customer, not advertisers or enterprises. The only other companies in a similar position--such as Sony--lack the strength, software, and ecosystems to truly compete. Apple also clearly sees nothing to gain in designing systems that support government snooping (though it will be interesting to see how that works as it extends its services into China and other nations where domestic monitoring is legally mandated).
Apple didn't always place privacy so front and center. Most iOS privacy features only appeared in iOS 6, and only after some very public (albeit overhyped) abuses by certain apps. OS X only gained location privacy in Lion, and a full privacy center in Mountain Lion. Apple provided nearly no security or privacy details on iCloud until earlier this year. Apple still owns an advertising network.
The issues of safety and security--and by extension, privacy--provoke visceral emotions in people. Apple has always tried to build an emotional connection between its devices and customers. With its increasing focus on privacy, it's clear that Apple not only sees privacy as important to maintaining this bond, but as a means of differentiating itself from the competition. For a variety of business and technical reasons, it's an advantage that will be hard for Apple's competition to duplicate.