Imagine you are at the coolest place ever. Sunshine, all kinds of delicious fruits, peaceful animals, chill music, a summer breeze, romantic sunsets, just paradise …
Are you there?
Then, one day, the gardener tells you: “Of every tree of the garden thou mayest freely eat: but of the tree of the knowledge, thou shalt not eat of it: for in the day that thou eatest thereof thou shalt surely die.”
What would you do?
A) Would you avoid the tree and the potentially yummy, but poisonous apples?
B) Would you shake the apple tree and sell the apples?
C) Or maybe your rebel inside just cannot resist? You take one curious bite, then many, and deal with the consequences (if you get caught).
That was easy. Remember, he said, “thou shalt surely die”.
Let’s leave the tree alone.
But then, an evil voice whispers: “Ye shall not surely die. It’s just that your eyes will open because you will know EVERYTHING …”.
Let’s assume for a moment that you always wanted to know everything. Limitless. So much data, pardon, so many apples. Right there, in front of you! Imagine, you would know everything! Healthy bonus: you wouldn’t die.
What would you do now?
We’re turning the tables.
We take it easy and let Apple answer the question.
In this post:
- the gardener is you and me
- our (walled) garden is the technology paradise (Apple ecosystem) we are living in.
- It is full of cool gadgets and services, which we love buying from Apple, and we’re having fun and all,
- but there is this one forbidden tree, with big juicy apples, full of our deepest secrets.
- So we tell Apple: all cool, have fun in our tech paradise, but leave our apples alone!
- And we might get pretty angry and kick Apple out of our little paradise if they betray us. Well, at least some of us would …
Estimated reading time: 24 minutes
Here’s a table of contents, for easier orientation and navigation in this longer post:
Table of contents
- How does Apple deal with our Apples?
- Privacy is a Human Right
- Why this topic, here and now?
- How do we get there?
- Apple Privacy: In Plain English
- “Apple products are designed to do amazing things. And designed to protect your privacy.”
- “This is how we protect your privacy.”
- “Here’s how to manage your privacy.”
How does Apple deal with our Apples?
Privacy is a Human Right
Here’s what Tim Cook, Apple CEO, says about privacy, from a recent interview with MSNBC (transcribed on recode): “The truth is we could make a ton of money if we monetized our customer. If our customer was our product, we could make a ton of money. We’ve elected not to do that. (applause) Because we don’t … our products are iPhones and iPads and Macs and HomePods and the Watch, etc., and if we can convince you to buy one, we’ll make a little bit of money, right? But you are not our product. You are our customer. You are a jewel. (laughter) We care about the user experience. And we’re not going to traffic in your personal life. I think it’s an evasion of privacy. I think it’s – privacy to us is a human right.”
Right, the UN, US, EU, and others define privacy as a human right. But then, why is Tim so pushy? Well, this was around the Facebook/Mark Zuckerberg hearing and Tim has a history of positioning Apple (vs Facebook, Google, and Amazon) as the one company which “takes privacy extremely seriously”.
This goes back to Steve Jobs, who already in 2010 stated: “Privacy means that people know what they’re signing up for. In plain English and repeatedly. That’s what it means. I am an optimist. I believe people are smart, and some people want to share more data than other people do. Ask them. Ask them every time. Make them tell you to stop asking them, if they get tired of your asking them. Let them know precisely what you’re gonna do with their data.” By the way, Mark has been spotted in the audience back then.
Facebook will delay the release of its own smart speaker, “due to the public outcry over the current data sharing scandal”.
Why this topic, here and now?
This site is about the smart home and our smart assistants. It’s this intimate area at home where we expect security and privacy. Consequently, we should be careful about which companies we invite in.
Since Apple obviously uses its privacy features as a marketing tool, I found their “plain English” privacy version a bit blurry. The superficial articles I found online are misleading. So, being a technical guy with two decades of speech recognition under my belt, I had to dig deeper. Surprise: despite Apple’s infamous secrecy, it’s amazing how much detailed information we can obtain directly from Apple.
Given the recent public attention the topic “privacy” received, I will let this post be the first one in the security category. Hopefully, it inspires us to check the privacy policies of the smart home device vendors at our homes. (And yes, I’ll also write about security cams and stuff.)
How do we get there?
We’ll start with an intro, basically the “light” plain English version of Apple’s privacy page, which gives a brief overview of “what” is covered. After that, we will look into the “how”, the more detailed privacy information Apple provides us. This covers:
- the safeguards Apple has built in to protect our privacy,
- how Apple personalizes our experience (without sacrificing our privacy),
- how Apple supports privacy in apps,
- recommendations from Apple, about what we can do to protect our privacy.
Finally, we will look into what’s planned for Apple privacy in the near future and wrap up our findings in the conclusion.
I know this sounds like a lot, if not too much. But bear with me, it’s an important topic and it is our human right. We pay Apple a lot for our devices and it’s good to see what they give us in return to protect our privacy.
Apple Privacy: In Plain English
Apple has extended its privacy page and I encourage you to check it out yourself. We find the following well-laid-out introductory sections, in plain English, structured as follows:
“Apple products are designed to do amazing things. And designed to protect your privacy.”
Apple gives us a couple of examples of our personal data and explains that it’s safe on its devices because they are designed from the ground up to protect our tree. Nice.
- “Only you can access your device”: The six-digit passcode, Touch ID, Face ID, and an alphanumeric passcode. Obviously, biometric access makes it more convenient to unlock your phone, and it’s much safer than leaving your phone unlocked.
- “Your personal data belongs to you, not others”: Photos, Siri, Directions. Apple doesn’t gather your personal information to sell to advertisers or other organizations. Well, good to know, but we would not have expected that anyway.
- “Your Apple Pay transactions are safe:” When using a credit card, Apple Pay can’t create a history of our purchases. With Apple Pay Cash, data is stored for fraud prevention. Ok.
- “Your features improve while your data stays private”: Here we find the term “Differential Privacy” for the first time. It’s a cool concept, and we are going to look into it a bit later. But it applies only to analytics/usage data, which we have to explicitly opt in to anyway, and usually don’t. So what?
- “Whether you store it or send it, your data is protected.” Apple Pay, iMessage and Facetime are end-to-end encrypted, which means not even Apple can watch us, or hand over our data to authorities. Our fingerprint and Face IDs are stored together with our personal data in a “secure enclave”, where neither apps nor iOS can access them. Ok, the last point sounds a bit fuzzy, but altogether, cool!
- “Your apps play by your rules.” This reminds us that we can control in our privacy settings which app can access which data/service. Remember Steve’s: “Ask them. Ask them every time.” Also, Apple defines strict rules for app developers and has a strict review process. Nevertheless, there have been cases of misbehaving apps, and Apple reacted by removing them from the App Store.
“This is how we protect your privacy.”
Time to look into the more detailed privacy statements:
“We build safeguards into our products to protect your privacy.”
Apple lists the different modules where our personal data is stored and how it’s secured. Also, we find a prominent link to a more detailed explanation of differential privacy. So, here we go with a brief version: differential privacy is a way to scramble your data by throwing random information into it, so it makes no sense to anyone anymore. But if you combine the data of many people, the random info averages out and suddenly the data makes sense again. You can learn from the many; you can’t learn about the individual. A super smart, privacy-respecting way to learn from us in order to improve our user experience. Again, this feature as of now only applies to Device Analytics (Settings/Privacy/Analytics). We’d need to enable this feature to contribute to the group learning.
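To see the averaging trick in action, here is a classic randomized-response sketch. It is a simplified stand-in, not Apple’s actual algorithm (which uses more elaborate noise mechanisms): each user randomizes their true answer before reporting it, yet the population rate can still be recovered from many reports.

```python
import random

def randomized_response(true_bit: int, p_truth: float = 0.75) -> int:
    """Report the true bit with probability p_truth, otherwise a random bit."""
    if random.random() < p_truth:
        return true_bit
    return random.randint(0, 1)

def estimate_true_rate(reports: list[int], p_truth: float = 0.75) -> float:
    """Invert the noise: E[report] = p_truth * rate + (1 - p_truth) * 0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth

# Simulate 100,000 users, 30% of whom have some sensitive attribute.
random.seed(42)
true_bits = [1 if random.random() < 0.3 else 0 for _ in range(100_000)]
reports = [randomized_response(b) for b in true_bits]
print(f"estimated rate: {estimate_true_rate(reports):.3f}")  # close to 0.30
```

No single report reveals a user’s true bit with certainty, yet the aggregate estimate lands very close to the real 30%. That’s the “learn from many, not from one” idea in a nutshell.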
- Encryption: A reminder that iMessage, FaceTime, and Apple Pay are encrypted. Apple is proud to have been one of the first companies to encrypt disks on macOS and data on iOS. Plus, Apple will never build backdoors into any of their products. Remember the FBI?
- Apple Pay: Since it’s about our money, more information on how credit card information is safely stored in a secure element. We can furthermore find a link to more detailed information around security and privacy for Apple Pay.
- iMessage and FaceTime: Again, the information that our iMessages and Facetime calls cannot be decrypted, even by Apple, or the FBI. Super private. Facetime data is never stored and iMessages are backed up to iCloud, but we can turn this off or specify for how long we want to store our data.
- Health and Fitness: Health data is a sensitive topic. We decide with whom we want to share it and it’s encrypted when our iPhone is locked. When it’s backed up it’s encrypted in transit and on iCloud (we will look into iCloud security a bit later).
- Analytics: If we want to help Apple and app developers improve their apps, iCloud usage (including Siri!), health and fitness features, health records, or wheelchair mode, we can share our analytics data. Personal information is either not logged, removed, or protected by differential privacy. This is also the place where we can check out the Data & Privacy page (by tapping on more info for any of the analytics settings), which informs us in great detail how our data is managed.
- Safari: Another proudly pioneered feature is third-party cookie blocking and private browsing. Web pages are sandboxed in single tabs, so bad sites cannot reach other data. Content blocking is implemented to prevent others from tracking your browser activity. A new Intelligent Tracking Prevention reduces cross-site tracking, minimizing the ads which follow you from website to website.
- iCloud: Remember celebgate? A week ago, the fourth hacker who leaked photos of celebrities got arrested. No, they did not hack iCloud. They phished for the login data, downloaded backups (which were not encrypted back then), and extracted the photos from there. Needless to say, Apple lists in great detail how it protects our data on iCloud. Our iCloud Keychain (including all of our saved accounts and passwords), payment information, Wi-Fi network information, Home data, and Siri information are end-to-end encrypted (so not even Apple or the FBI can have a peek), but only if we have activated two-factor authentication (2FA). Since most of us deal with HomeKit, and 2FA is a prerequisite to access our HomeKit remotely, we are super safe here. If you have not activated 2FA yet, it’s a good time to do so. Here is a link to detailed security information for iCloud. You can find a 2FA how-to in the next chapter.
- CarPlay: Not too much info here. Only essential information is shared from your car, e.g. the GPS location to improve map info.
- Education Privacy: Apple is very active in terms of supporting schools and universities. Plus they are committed to safeguarding student privacy. We can also find links to detailed information which student and teacher data is collected and how it is used.
“Get a personalized experience and maintain control of your privacy.”
This section details which information is collected to improve our user experience.
Photos: Face and place recognition happens on the device. If we have iCloud Photo Library enabled, we can share this information between multiple devices. iOS now supports finer granularity for controlling photo access: an app can request access to only a single photo, and read and write access can be specified independently.
Siri and Dictation: Finally, our favorite topic. To summarize: Siri learns from us without knowing who we are. When we enable Siri, a random identifier is created, which is associated with our device (not our Apple ID). When we disable Siri, we “restart our relationship” with Siri: our Siri profile gets deleted and she learns from scratch. Apple explains the Siri features where our data remains local. In case real-time info is required from Apple servers (e.g. our location for time-to-leave predictions, taking current traffic into account), our requests are anonymized and cannot be traced back to us. There are not too many details on how Siri privacy actually works, so we will dig deeper when looking into the iOS Security Guide below.
Health and Fitness on HealthKit: Health data is a sensitive topic, Apple reminds us here again that the Analytics for “Improve Health & Activity” and “Improve Wheelchair” do not contain personal information.
News: Which news we read is a personal topic, so Apple links the info to an anonymous News ID, not our Apple ID. Siri can suggest stories, channels, and topics we like, based only on on-device information such as the apps we use and the sites we visit with Safari. Since third parties provide the news content, Apple shares our usage data only in aggregated form. A detailed News-related privacy link is provided.
Apple Music: A key feature, for which Apple provides very detailed privacy information. Apple has to collect which songs we listen to, and for how long, to be able to compensate the partners/artists. A huge part of the policy explains how our shareable Apple Music profile works. Apple does check the contacts in your address book to suggest friends, but does not save the data. We can disable the feature “Allow Friends to Find You” in case we do not want to show up in the suggestions of friends who have our contact details.
Maps: Again, random identifiers not linked to our Apple ID are used. Here Apple states the opposite of what is written in the Siri section: time-to-leave data is created on the device. (Easy to imagine that with this plethora of Apple privacy info, things can get mixed up.) Apps which use the map only receive minimal information.
Siri and Spotlight Suggestions: When searching with Siri a random identifier associated with our location is used, which changes every 15 minutes. So neither Apple nor apps can create a long-term profile of our searches.
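One way such a rotating identifier could work is to hash a device-local secret together with the current 15-minute time bucket. This is purely a hypothetical sketch (Apple does not document the actual construction): identifiers from different windows cannot be linked by anyone without the device secret.

```python
import hashlib
import secrets

# Hypothetical sketch: a per-device secret that never leaves the device.
DEVICE_SECRET = secrets.token_bytes(32)

def search_identifier(timestamp: int, window: int = 15 * 60) -> str:
    """Derive an identifier from the secret and the 15-minute time bucket."""
    bucket = timestamp // window
    digest = hashlib.sha256(DEVICE_SECRET + bucket.to_bytes(8, "big")).hexdigest()
    return digest[:16]

t = 1_000_000_000  # fixed timestamp for a reproducible demo
print(search_identifier(t) == search_identifier(t + 60))        # same window: True
print(search_identifier(t) == search_identifier(t + 16 * 60))   # next window: False
```

Within one window all searches carry the same identifier, so short-term context (like refining a search) still works; across windows the identifiers are unlinkable, so no long-term profile can be stitched together.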
Advertising: News and the App Store display targeted ads. News ads are personalized based on what we read and whom we follow; this info does not leave the News app. App Store ads are personalized based on our search and download history. We can turn personalized ads off by enabling “Limit Ad Tracking”.
“We give developers powerful tools to protect your data.”
One of the “perceived” weak points of Apple is misbehaving apps on the App Store. Remember Zuckerberg’s hearing notes: “Lots of stories about apps misusing Apple data, never seen Apple notify people.” Well, the media does notify people, and we can quickly find lists of apps banned from the App Store. Maybe he meant Uber’s trickery, who knows, he never explained.
- Apps: App developers have to agree to specific guidelines to protect our privacy and security. Misbehaving apps will be removed from the App Store. Apps undergo a thorough review process before we can download them. When we install an app, we are prompted for permission the first time the app tries to access information. We can change the app permissions anytime. Certain information on our devices cannot be accessed by apps at all.
- DeviceCheck: Many developers try to retain device information even when we delete and reinstall their apps: for example, whether this device has already used a free trial, or whether it has been used for fraudulent activity. To discourage developers from sneaky tricks, like in the Uber case, Apple now offers to store 2 bits per device (4 possible states) together with a timestamp, which Apple keeps for the developer.
- HomeKit: Only apps for configuration and automation are allowed. Apple does not know which devices we are controlling and when. Siri only associates our devices with the random identifier, not our Apple ID. Data related to our home is stored in our keychain, always encrypted between devices and also when we control them remotely. Location-based automations are triggered via HomeKit, so 3rd party apps don’t receive location information. Apple states twice that they don’t know which devices we are controlling and when. We believe you. Home, sweet private Home!
- Machine Learning: Our Apple devices are so powerful, that machine learning runs on them, hence our personal information does not need to leave the device. Apple uses it for image and scene recognition in photos, predictive text and more. App developers can use it to analyze our sentiment, translate and predict text, and other crazy stuff without putting our privacy at risk.
- ResearchKit and CareKit: Both kits are open source. ResearchKit enables apps to gather meaningful data for medical research. CareKit is a platform to create apps which help us take an active role in our well-being. All of the apps which access our health data must ask for our consent and provide detailed information on how our data is handled. An independent ethics review board reviews them. There is a link with detailed info on ResearchKit and CareKit, and I found the apps already available around epilepsy, Parkinson’s, early autism diagnosis, and more quite amazing.
- HealthKit: All our fitness apps use HealthKit to share data with each other and with Apple’s Health app. Apps may not use or disclose our health data to third parties unless they do so to improve our health, and then only with our permission.
- CloudKit: This helps apps synchronize their settings across our devices. Developers receive a unique identifier, not our Apple IDs. Only with our permission can apps use our e-mail to connect us with other app users.
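Among the tools above, DeviceCheck’s “2 bits plus a timestamp per device” is easy to picture as a tiny server-side store. This is a conceptual toy, not the real DeviceCheck REST API (which authenticates device tokens through Apple’s servers); the bit meanings are examples a developer might choose.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class DeviceState:
    bit0: bool = False               # e.g. "free trial already used"
    bit1: bool = False               # e.g. "flagged for fraudulent activity"
    last_update: Optional[datetime] = None

class DeviceCheckStore:
    """Toy store: 2 bits + timestamp per device token (4 possible states)."""

    def __init__(self) -> None:
        self._states: dict[str, DeviceState] = {}

    def update(self, device_token: str, bit0: bool, bit1: bool) -> None:
        self._states[device_token] = DeviceState(bit0, bit1, datetime.now(timezone.utc))

    def query(self, device_token: str) -> DeviceState:
        # Unknown devices report the default all-false state.
        return self._states.get(device_token, DeviceState())

store = DeviceCheckStore()
store.update("device-abc", bit0=True, bit1=False)  # trial consumed, no fraud flag
state = store.query("device-abc")
```

Because the state survives app deletion and reinstallation on Apple’s side, the developer gets their legitimate use case (one free trial per device) without needing to fingerprint the device themselves.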
“Here’s how to manage your privacy.”
In this final chapter of “Apple Privacy in Plain English”, Apple reminds us of what we can do to improve our privacy on Apple devices.
“Secure your devices.”
iCloud is only as secure as the weakest of our Apple devices.
- Put a passcode on your device: The more complex, the better. A six-digit passcode allows for one million combinations. Here’s how to.
- Enable Touch ID or Face ID: With a touch or a glance, unlocking becomes very convenient. The biometric models are saved in the Secure Enclave, which is basically a closed system of its own on our device; they never reach iCloud.
- Auto-unlock your Mac: We can use our Apple Watch to conveniently unlock our Mac (2FA needed).
- Find your lost device: We can enable this feature to find our device if it gets lost or stolen. If we cannot get our device back, we can wipe all the data remotely. If it’s an iPhone or Apple Watch, we can block the device from being activated again.
“Secure your Apple ID.”
Our Apple ID is the key to iCloud which holds our calendar, contacts, e-mails, photos, and backups. Here’s how we can safeguard it:
- Choose a strong Apple ID password: Make it long and make it strong, here’s how.
- Turn on two-factor authentication: This adds a second layer of security by sending a verification code to all of our trusted devices. No code, no login from a new device. Here’s how to enable 2FA.
- Beware of phishing: Ever got a strange call or e-mail asking for your account data? Some have. Don’t ever think that Apple would do that. Turn on 2FA and report phishing attempts to email@example.com. Here’s more info from Apple on phishing.
- Pay attention to notifications about your Apple ID: If we access our account from a new device, Apple notifies us. If we get such notifications without accessing our account, we should immediately change our Apple ID password here, or contact Apple ID Support if this is not possible.
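The one-time codes behind two-factor authentication can be illustrated generically. Apple actually pushes codes to trusted devices rather than deriving them like this, so the sketch below only shows the classic idea from RFC 6238 (TOTP): a short-lived code computed from a shared secret and the clock.

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, now: int, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time code in the spirit of RFC 6238 (not Apple's scheme)."""
    counter = struct.pack(">Q", now // step)           # time bucket as 8-byte counter
    mac = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                            # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"shared-device-secret"   # illustrative secret, not a real credential
code = totp(secret, now=1_700_000_000)
print(code == totp(secret, now=1_700_000_005))   # same 30-second window: True
```

The code is worthless seconds after it expires, which is exactly why a phished password alone is not enough to break into a 2FA-protected account.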
“Be aware of what you’re sharing.”
- Data & Privacy Information: If Apple asks us for information, they will display a new screen with information on what data is shared for what use.
- Configure your iCloud settings: We decide what is synchronized via iCloud and what is not. In our iCloud settings, we can enable and disable services individually.
- Emergency SOS: We can use our Watch to call emergency services, inform selected SOS contacts and share our location for a specific period.
- Manage your location data: We can specify which apps have access to our location in our location settings.
- Control data shared with apps: We have to explicitly allow apps access to our location, contacts, calendars, or photos. We can always change this in the settings.
- Limit targeted interest-based ads: If we do not want to see targeted ads in the App Store and the News app, we can enable Limit Ad Tracking.
- Browse the web privately: Private browsing does not remember the sites we visit, our search history or any forms filled. Here are the Safari settings for iOS.
- Protect your children’s privacy: Parental controls let us restrict websites, types of movies, access to FaceTime and the Camera, and the download of apps. With Family Sharing, you have insight into your children’s activity and content. Here’s more info on Family Sharing.
Apple Privacy in Legal English
If Apple did not care about our privacy, they would have left us with only this boring legal version.
It starts with what personal information Apple collects and how Apple uses this information. Then it goes on with non-personal information, with which Apple can do basically whatever they like. They list some examples; nothing shocking, though.
Next, we find a lengthy chapter on cookies and other technologies. Our IP address is considered non-personal information unless our local law defines otherwise. It then details cookies, targeted ads, website tracking, and marketing e-mails, and what they track there.
Apple goes on with what data they have to share with our service provider or other parties. And in case they have to share our data with public and governmental authorities by law (you can check out Apple’s transparency reports here) they will have to do that.
Apple explains that they always encrypt our personal information, including when iCloud data is stored on third-party servers (Google, Microsoft, or Amazon).
Children & Education is the next topic; it details the process of creating Apple IDs for kids under the age of 13, and Family Sharing. The link above for accessing, correcting, and deleting data also applies to parents.
Apple reminds us that unless we provide consent our location data shared with apps is anonymous.
The data which apps and services collect from us is governed by their privacy policies (e.g. Facebook app) and Apple encourages us to learn about those privacy practices.
If we are in Europe or Switzerland, our data is controlled by Apple in Ireland. Apple abides by the APEC rules system, which ensures the protection of personal information transferred among APEC economies.
Apple communicates its privacy and security guidelines to Apple employees and enforces privacy safeguards within the company.
Last but not least, if we have any questions we can ask them here.
If you’re into legal texts, you can indulge in the countless Terms & Conditions of Apple products and services, or just have a laugh here.
Apple’s Security Guide in Technical English
I know, I know, the last part was a bit boring and I am already afraid that more of this is waiting for me when I check the Amazon and Google privacy policies.
Anyway, we are done with legal, now comes the technical part. Let’s check out what we can find about Siri and HomeKit in Apple’s iOS Security Guide. If you are a techy person, you can find a lot of interesting technical details in these 81 pages, I will focus on Siri and HomeKit here.
As already mentioned above, HomeKit is a home automation infrastructure which uses iCloud and iOS to synchronize data in a protected way, so that not even Apple sees it.
Behind the scenes, our iOS device creates encryption keys and stores them in our keychain. This is our HomeKit identity. HomeKit accessories create their own keys. When we add a new accessory the iOS device and the HomeKit accessory exchange their keys in a secure way. During usage, the HomeKit accessory and iOS use those keys to authenticate each other.
HomeKit data (homes, accessories, scenes, users) are encrypted on the iOS devices and only saved encrypted to iCloud. This data is treated as an opaque blob, which means it looks like binary garbage from the outside. Since the keys for encryption are only on the iOS devices, the content is inaccessible to anyone else during transmission and iCloud storage.
When we invite another user into our HomeKit home, the same security mechanisms as when adding a HomeKit accessory are used. The original home user authenticates the new user with the devices so that the accessories can accept the new user.
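The pairing pattern described above — exchange keys once, then use them to authenticate every later interaction — can be sketched conceptually. Real HomeKit pairing uses public-key cryptography; this toy version substitutes a shared symmetric secret and an HMAC challenge-response, just to show the shape of challenge, proof, and verification.

```python
import hashlib
import hmac
import secrets

class Party:
    """A device (iOS device or accessory) holding keys for its paired peers."""

    def __init__(self, name: str) -> None:
        self.name = name
        self.paired_keys: dict[str, bytes] = {}

    def pair_with(self, other: "Party") -> None:
        # Stand-in for the secure key exchange performed when adding an accessory.
        shared = secrets.token_bytes(32)
        self.paired_keys[other.name] = shared
        other.paired_keys[self.name] = shared

    def prove(self, verifier_name: str, challenge: bytes) -> bytes:
        # Answer a challenge using the key established at pairing time.
        return hmac.new(self.paired_keys[verifier_name], challenge, hashlib.sha256).digest()

    def verify(self, other: "Party") -> bool:
        # Send a fresh random challenge and check the response against our copy of the key.
        challenge = secrets.token_bytes(16)
        response = other.prove(self.name, challenge)
        expected = hmac.new(self.paired_keys[other.name], challenge, hashlib.sha256).digest()
        return hmac.compare_digest(response, expected)

phone, accessory = Party("iPhone"), Party("Lamp")
phone.pair_with(accessory)
ok = phone.verify(accessory) and accessory.verify(phone)  # mutual authentication
```

A device that never took part in the pairing holds no key and therefore cannot produce a valid response — which is why a new user must be introduced by the original home user before the accessories will talk to them.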
Siri anonymously receives the minimum information needed to understand our HomeKit voice commands.
HomeKit IP cameras encrypt their streams with random keys. When apps display the camera view, a separate process decrypts the streams so the apps cannot access or store the content of the stream. Apps are not permitted to capture screenshots from the video stream.
Apple TV (or any home hub) allows us to remotely access our HomeKit. Two-factor authentication is needed on the iCloud account and AppleTV is added to HomeKit using the same security as HomeKit accessories. When we access our homes remotely through iCloud, Apple does not see which devices we control or which notifications are sent.
We can talk naturally and send messages, schedule meetings, place phone calls, listen to music and much more. Siri has been designed so that only the minimum personal information possible is sent to Apple and even this data is fully protected.
When we enable Siri for the first time, a random identifier is created. This identifier is not tied to our Apple ID, but rather to our device. Once we disable Siri, this identifier is recreated and any old session data on Siri servers deleted.
Information about our home, music library, contacts and relations, reminders, etc. is sent to Siri so she can make sense of our commands. Siri fetches the information from our devices on demand: if more information is needed to perform a task, she will fetch it rather than sending everything upfront. The basic principle here is: only the minimum information is sent to Siri, fully protected, and this information is deleted after 10 minutes of inactivity.
Our voice recordings are sent to Siri servers. If it’s only a dictation, we receive the text back. If it’s a command, Siri analyzes the additional information and sends the command information back to be executed on the device. Most commands can be executed on the device without sending additional information to the server (“read messages”, “what’s on my calendar”, etc.).
Siri keeps a copy of our voice commands in anonymous profiles (remember the random ID) for half a year. This voice profile is trained with the Siri commands we utter. After half a year, another copy is saved without the random identifier and is used to improve Siri for everybody, for two years. Siri R&D will also pull some samples without identifiers for ongoing improvement and quality assurance.
If you’d like to dig deeper than the security white paper, I’d recommend the Apple Developer documentation or the Apple Machine Learning Journal. Last summer, Apple launched the latter with articles from Apple engineers about Siri, encryption, Face ID, and more.
What’s coming (initially only to some of us)
The EU is going to enforce the General Data Protection Regulation (GDPR), a set of privacy protection rules, by the end of May 2018. These rules will make sure that organizations dealing with EU citizens (like Apple, Facebook, Amazon, and Google) give users insight into the data which is saved about them. Companies are obliged to handle the data more responsibly, and the fines for not doing so go up to 4% of annual global turnover or €20 million (whichever is greater).
As we have seen in this post, Apple devices are private by design. Apple will additionally extend its Apple ID management site so we can get a copy of our data, temporarily deactivate our accounts, or delete our entire Apple ID. Furthermore, Apple has deployed tools to developers so that the information saved in apps can be controlled by us the same way.
These features will be rolled out initially only in Europe, but Apple plans on making them available globally for all of us.
Every day, each of us average internet users generates about half a gigabyte of data. That equals roughly this super long post times 18,000.
Some of this data is quite personal, still, we trust the big companies to not take a byte of it or sell it to others.
Let’s get back to our initial question:
Apple, what would you do?
Remember, you could know everything about us!
A) Would you avoid the forbidden tree and our yummy apples?
B) Would you shake the apple tree and sell our apples?
C) Or maybe your rebel inside just cannot resist? You take one curious bite, then many, and deal with the consequences (if you get caught).
Apple has clearly answered with A).
Our Apple devices are private by design, not by afterthought. Wherever possible, Apple will avoid the tree and leave our private data on the Apple device. Our HomeKit and Siri uses are completely safe and anonymous, to the point that even Apple cannot tell that it’s ours.
There were reports of ex-Apple employees who saw this privacy focus as a reason for slow progress in Siri development. I doubt that. Apple won’t forget Steve’s advice: whenever they need something from us, they just have to ask.
Personally, I am now thinking of turning Analytics on and sending Apple my randomly scrambled usage data, to contribute to improving our user experience. Something I usually avoid.
If you are wondering whether Apple’s logo has anything to do with our story: Would be cool, but no. According to Walter Isaacson’s Steve Jobs biography, the art director Rob Janoff got the assignment with the instruction from Steve: “Don’t make it cute!”. He came back with two apples, one whole, one with a bite taken out of it. Steve picked the latter because the whole apple looked too much like a cherry to him.
I hope you’ve enjoyed this post!
Stay safe & private!
P.S. You can find more Siri posts here!