Let’s compare the strengths and weaknesses of our assistants in combination with Philips Hue. Since Google finally also supports Philips Hue room and scene import, and some smart home features have been added to Alexa and Google recently, it’s going to be a head-to-head competition.
Please note that any of the features below can change silently overnight, whenever Philips, Apple, Amazon, or Google decide to improve and update.
Mobile Hint: tilt your smartphone for a better table view!
[table id=1 /]
*Note, for an extensive list of voice commands, see the dedicated assistant posts above.
So, how do we draw conclusions from this table?
In the end, it is a matter of personal taste. How do you like to set up, configure, talk to, and automate your Philips Hue lights?
With Google’s recent update, all three are very close in ease of setup and configuration. With Alexa and Google, it is more like “plug and play”: everything you have already configured in Philips Hue is imported. As long as you use only one Philips Hue bridge, this is great! Siri supports larger setups with multiple bridges, which makes up for the additional effort of configuring the HomeKit scenes.
Incredible what Apple, Amazon and Google have achieved here. I can remember the time when you had to say “turn off something” and were not able to say “turn something off”. These days are over!
One might choose Alexa as the winner here; personally, for me she is not. The restriction of having unique names for everything with Siri in HomeKit makes the voice control much more predictable and solid. Alexa replies too many times with “a few things share that name, which one do you want?”, mostly to fail after that. Google has robust voice recognition, with some room for improvement around the new features, like scene support.
My personal winner regarding smart home features is Siri. She supports an extensive automation feature set through HomeKit and the Apple Home app.
Alexa made progress by supporting smart home groups and routines.
Google finally also supports routines, but only the predefined ones. The shortcuts are a nice feature to customize up to two actions.
My Personal Conclusion
As you see, only half a year after I initially wrote this post, many features have been added to Alexa and Google. The assistants provide smart home features in their own distinct ways, and it is just a matter of taste which one you prefer.
For me personally, Siri still wins because she supports multiple bridges, regrouping, and automations through sensors. But those are only the features that matter to me; I am sure you have your own preferences!
I hope you’ve enjoyed this “showdown”. You can find more competitions here:
Imagine you are at the coolest place ever. Sunshine, all kind of delicious fruits, peaceful animals, chill music, a summer breeze, romantic sunsets, just paradise …
Are you there?
Then, one day, the gardener tells you: “Of every tree of the garden thou mayest freely eat: but of the tree of the knowledge, thou shalt not eat of it: for in the day that thou eatest thereof thou shalt surely die.”
What would you do?
A) Would you avoid the tree and the potentially yummy, but poisonous apples?
B) Would you shake the apple tree and sell the apples?
C) Or maybe your rebel inside just cannot resist? You take one curious bite, then many, and deal with the consequences (if you get caught).
That was easy. Remember, he said, “thou shalt surely die”.
Let’s leave the tree alone.
But then, an evil voice whispers: “Ye shall not surely die. It’s just that your eyes will open because you will know EVERYTHING …”.
Let’s assume for a moment that you always wanted to know everything. Limitless. So much data, pardon, so many apples. Right there, in front of you! Imagine you knew everything! Healthy bonus: you wouldn’t die.
What would you do now?
We’re turning the tables.
We take it easy and let Apple answer the question.
In this post:
the gardener is you and me
our garden is the technology paradise (Apple ecosystem) we are living in.
It is full of cool gadgets and services, which we love buying from Apple, and we’re having fun and all,
but there is this one forbidden tree, with big juicy apples, full of our deepest secrets.
So we tell Apple: all cool, have fun in our tech paradise, but leave our apples alone!
And we might get pretty angry and kick Apple out of our little paradise if they betray us. Well, at least some of us would …
Here’s a table of contents, for easier orientation and navigation in this longer post:
How does Apple deal with our Apples?
Privacy is a Human Right
Here’s what Tim Cook, Apple CEO, says about privacy, from a recent interview with MSNBC (transcribed on Recode): “The truth is we could make a ton of money if we monetized our customer. If our customer was our product, we could make a ton of money. We’ve elected not to do that. (applause) Because we don’t … our products are iPhones and iPads and Macs and HomePods and the Watch, etc., and if we can convince you to buy one, we’ll make a little bit of money, right? But you are not our product. You are our customer. You are a jewel. (laughter) We care about the user experience. And we’re not going to traffic in your personal life. I think it’s an invasion of privacy. I think it’s – privacy to us is a human right.”
Right, the UN, US, EU, and others define privacy as a human right. But then, why is Tim so pushy? Well, this was around the Facebook/Mark Zuckerberg hearing and Tim has a history of positioning Apple (vs Facebook, Google, and Amazon) as the one company which “takes privacy extremely seriously”.
This goes back to Steve Jobs, who already in 2010 stated: “Privacy means that people know what they’re signing up for. In plain English and repeatedly. That’s what it means. I am an optimist. I believe people are smart, and some people want to share more data than other people do. Ask them. Ask them every time. Make them tell you to stop asking them, if they get tired of your asking them. Let them know precisely what you’re gonna do with their data.” By the way, Mark has been spotted in the audience back then.
Facebook will delay the release of its own smart speaker, “due to the public outcry over the current data sharing scandal”.
Why this topic, here and now?
This site is about smart homes and our smart assistants. It’s this intimate area at home where we expect security and privacy. Consequently, we should be careful about which companies we invite in.
Since Apple obviously uses its privacy features as a marketing tool, I found their “plain English” privacy version a bit blurry. The superficial articles I found online are misleading. So, being a technical guy with two decades of speech recognition behind me, I had to dig deeper. Surprise: despite Apple’s infamous secrecy, it’s amazing how much detailed information we can obtain directly from Apple.
Given the recent public attention the topic “privacy” received, I will let this post be the first one in the security category. Hopefully, it inspires us to check the privacy policies of the smart home device vendors at our homes. (And yes, I’ll also write about security cams and stuff.)
How do we get there?
We’ll start with an intro, basically the “light” plain English version of Apple’s privacy page, which gives a brief overview of “what” is covered. After that, we will look into the “how”, the more detailed privacy information Apple provides us. This covers:
the safeguards Apple has built in to protect our privacy,
how Apple personalizes our experience (without sacrificing our privacy),
how Apple supports privacy in apps,
recommendations from Apple, about what we can do to protect our privacy.
Finally, we will look into what’s planned for Apple privacy in the near future and wrap up our findings in the conclusion.
I know this sounds like a lot, if not too much. But bear with me, it’s an important topic and it is our human right. We pay Apple a lot for our devices and it’s good to see what they give us in return to protect our privacy.
Apple Privacy: In Plain English
Apple has extended its privacy page and I encourage you to check it out yourself. We find the following well-laid-out introduction sections, in plain English, structured as follows:
“Apple products are designed to do amazing things. And designed to protect your privacy.”
Apple gives us a couple of examples of our personal data and explains that it’s safe on its devices because they are designed from the ground up to protect our tree. Nice.
“Only you can access your device”: The six-digit passcode, Touch ID, Face ID, and an alphanumeric passcode. Obviously, biometric access makes it more convenient to unlock your phone, and it’s much safer than leaving your phone unlocked.
“Your personal data belongs to you, not others”: Photos, Siri, Directions. Apple doesn’t gather your personal information to sell to advertisers or other organizations. Well, good to know, but we would not have expected that anyway.
“Your Apple Pay transactions are safe:” When using a credit card, Apple Pay can’t create a history of our purchases. With Apple Pay Cash, data is stored for fraud prevention. Ok.
“Your features improve while your data stays private”: Here we find the term “Differential Privacy” for the first time. It’s a cool concept, and we are going to look into it a bit later. But it applies only to analytics/usage data, where we have to explicitly opt in anyway, but usually don’t. So what?
“Whether you store it or send it, your data is protected.” Apple Pay, iMessage and Facetime are end-to-end encrypted, which means not even Apple can watch us, or hand over our data to authorities. Our fingerprint and Face IDs are stored together with our personal data in a “secure enclave”, where neither apps nor iOS can access them. Ok, the last point sounds a bit fuzzy, but altogether, cool!
“Your apps play by your rules.” This reminds us that, in our privacy settings, we can control which app can access which data/service. Remember Steve’s: “Ask them. Ask them every time.” Also, Apple defines strict rules for app developers and has a strict review process. Nevertheless, there have been cases of misbehaving apps, and Apple reacted by removing them from the App Store.
“This is how we protect your privacy.”
Time to look into the more detailed privacy statements:
Apple lists the different modules where our personal data is stored and how it’s secured. We also find a prominent link to a more detailed explanation of differential privacy. So, here we go with a brief version: differential privacy is a way to scramble your data by throwing random information into it, so it no longer makes sense to anyone. But if you combine the data of many people, the random info averages out and suddenly the data makes sense again. You can learn from the many; you can’t learn about the individual. A super smart, privacy-respecting way to learn from us in order to improve our user experience. Again, as of now this feature only applies to Device Analytics (Settings/Privacy/Analytics). We need to enable this feature to contribute to the group learning.
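To make the idea concrete, here is a toy sketch of one classic differential-privacy technique, randomized response (the parameters and the simulation are mine, not Apple’s actual mechanism): each user flips a coin before answering, so any single report is deniable, yet the average over many reports still reveals the true rate.

```python
import random

def randomized_response(true_bit, p_truth=0.75):
    """Report the true bit with probability p_truth, a fair coin flip otherwise."""
    if random.random() < p_truth:
        return true_bit
    return random.randint(0, 1)

def estimate_true_rate(reports, p_truth=0.75):
    """Invert the noise: observed = p*true + (1-p)*0.5, solved for true."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth

# 100,000 simulated users, 30% of whom actually use a hypothetical feature
random.seed(42)
truth = [1 if random.random() < 0.30 else 0 for _ in range(100_000)]
reports = [randomized_response(bit) for bit in truth]

# Any single report proves nothing, but the aggregate recovers ~0.30
print(round(estimate_true_rate(reports), 2))
```

One report tells you almost nothing about one person; a hundred thousand reports tell you almost everything about the group. That is the whole trick.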
Encryption: A reminder that iMessage, FaceTime, and Apple Pay are encrypted. Apple is proud of being one of the first companies to encrypt discs on macOS and data on iOS. Plus, Apple will never build backdoors into any of its products. Remember the FBI?
Apple Pay: Since it’s about our money, more information on how credit card information is safely stored in a secure element. We can furthermore find a link to more detailed information around security and privacy for Apple Pay.
iMessage and FaceTime: Again, the information that our iMessages and Facetime calls cannot be decrypted, even by Apple, or the FBI. Super private. Facetime data is never stored and iMessages are backed up to iCloud, but we can turn this off or specify for how long we want to store our data.
Health and Fitness: Health data is a sensitive topic. We decide with whom we want to share it and it’s encrypted when our iPhone is locked. When it’s backed up it’s encrypted in transit and on iCloud (we will look into iCloud security a bit later).
Analytics: If we want to help Apple and app developers improve their apps, iCloud usage (including Siri!), health and fitness features, health records, or wheelchair mode, we can share our analytics data. Personal information is either not logged, removed, or protected by differential privacy. This is also the place where we can check out the Data & Privacy page (by tapping “more info” for any of the analytics settings), which informs us in great detail how our data is managed.
Safari: Another proudly pioneered feature is third-party cookie blocking and private browsing. Web pages are sandboxed in single tabs, so bad sites cannot reach other data. Content blocking is implemented to prevent others from tracking your browser activity. A new Intelligent Tracking Prevention reduces cross-site tracking, minimizing the ads which follow you from website to website.
iCloud: Remember Celebgate? A week ago, the fourth hacker who leaked photos of celebrities was arrested. No, they did not hack iCloud. They phished for the login data, downloaded backups (which were not encrypted back then) and extracted the photos from there. Needless to say, Apple lists in great detail how it protects our data on iCloud. Our iCloud Keychain (including all of our saved accounts and passwords), payment information, Wi-Fi network information, Home data and Siri information are end-to-end encrypted (so not even Apple or the FBI can have a peek), but only if we have activated two-factor authentication (2FA). Since most of us deal with HomeKit, and 2FA is a prerequisite for accessing our HomeKit remotely, we are super safe here. If you have not activated 2FA yet, it’s a good time to do so. Here is a link to detailed security information for iCloud. You can find a 2FA how-to in the next chapter.
CarPlay: Not too much info here. Only essential information is shared from your car, e.g. the GPS location to improve map info.
Education Privacy: Apple is very active in terms of supporting schools and universities. Plus they are committed to safeguarding student privacy. We can also find links to detailed information which student and teacher data is collected and how it is used.
“Get a personalized experience and maintain control of your privacy.”
This section details which information is collected to improve our user experience.
Photos: Face and place recognition happens on the device. If we have iCloud Photo Library enabled, we can share this information between multiple devices. iOS now supports finer granularity for controlling photo access. An app can request access to only a single photo, and read and write access can be specified independently.
Siri and Dictation: Finally, our favorite topic. To summarize: Siri learns from us, without knowing who we are. When we enable Siri, a random identifier is created, which is associated with our device (not our Apple ID). When we disable Siri, we “restart our relationship” with Siri. Our Siri profile gets deleted and she learns from scratch. Apple explains the Siri features where our data remains local. In case real-time info is required from Apple servers (e.g. our location for time-to-leave predictions, taking the current traffic into account), our requests are anonymized and cannot be traced back to us. There are not many details on how Siri privacy actually works, so we will dig deeper when looking into the iOS Security Guide below.
Health and Fitness on HealthKit: Health data is a sensitive topic, Apple reminds us here again that the Analytics for “Improve Health & Activity” and “Improve Wheelchair” do not contain personal information.
News: Which news we read is a personal topic, so Apple links the info to an anonymous News ID, not our Apple ID. Siri can suggest stories, channels and topics we like, based only on on-device information such as the apps we use and the sites we visit with Safari. Since third parties provide the news content, Apple shares our usage data only in aggregated form. A detailed News-related privacy link is provided.
Apple Music: A key feature, for which Apple provides very detailed privacy information. Apple has to collect information on which songs we listen to, and for how long, to be able to compensate its partners/artists. A huge part of the policy explains how our shareable Apple Music profile works. Apple does check the contacts in our address book to suggest friends, but does not save the data. We can disable the “Allow Friends to Find You” feature in case we do not want to show up in the suggestions of friends who have our contact details.
Maps: Again, random identifiers not linked to our Apple ID are used. Here Apple states the opposite of what is written in the Siri section: time-to-leave data is created on the device (easy to imagine that, with this plethora of Apple privacy info, things get mixed up). Apps which use Maps only receive minimal information.
Siri and Spotlight Suggestions: When searching with Siri, a random identifier associated with our location is used, which changes every 15 minutes. So neither Apple nor apps can build a long-term profile of our searches.
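A rough sketch of how such a short-lived identifier could work (the class and the window handling are my own illustration, not Apple’s implementation): keep a random value, and mint a fresh one as soon as the 15-minute window has passed.

```python
import secrets

class RotatingIdentifier:
    """Hypothetical sketch of a short-lived search identifier:
    a fresh random value every 15 minutes, never derived from an account."""
    WINDOW = 15 * 60  # seconds

    def __init__(self):
        self._id = None
        self._window_start = None

    def current(self, now):
        # Rotate once the 15-minute window has elapsed
        if self._id is None or now - self._window_start >= self.WINDOW:
            self._id = secrets.token_hex(16)
            self._window_start = now
        return self._id

ident = RotatingIdentifier()
a = ident.current(now=0)
b = ident.current(now=600)    # 10 minutes later: still the same window
c = ident.current(now=1000)   # more than 15 minutes: rotated
print(a == b, a == c)  # True False
```

Since each identifier lives only minutes and is never tied to an account, stitching searches into a long-term profile becomes pointless.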
Advertising: News and App Store display targeted Ads. News ads are personalized based on what we read and whom we follow. This info will not leave the News app. App Store Ads are personalized based on our search and download history. We can turn personalized Ads off by enabling “Limit Ad tracking”.
“We give developers powerful tools to protect your data.”
One of the “perceived” weak points of Apple is misbehaving apps on the App Store. Remember Zuckerberg’s hearing notes: “Lots of stories about apps misusing Apple data, never seen Apple notify people.” Well, the media does, and we can quickly find lists of apps banned from the App Store. Maybe he meant Uber’s trickery, who knows, he never explained.
Apps: App developers have to agree to specific guidelines to protect our privacy and security. Misbehaving apps will be removed from the App Store. Apps undergo a thorough review process before we can download them. When we install an app, we are prompted for permission the first time the app tries to access information. We can change the app permissions anytime. Certain information on our devices cannot be accessed by apps at all.
DeviceCheck: Many developers try to retain device information even when we delete and reinstall their apps: for example, has this device already used a free trial, or has this device been used for fraudulent activity? To discourage developers from sneaky tricks, as in the Uber case, Apple now offers to store two bits per device and app (equalling four states), which Apple saves together with a timestamp for the developer.
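As a toy illustration (the state meanings, tokens, and helpers here are my own invention, not Apple’s DeviceCheck API), two bits give a developer’s server exactly four interpretable states per device:

```python
from datetime import datetime, timezone

# Hypothetical reading of the two per-device bits:
# (used_free_trial, flagged_for_fraud) -> 4 possible states
STATE_NAMES = {
    (0, 0): "fresh device",
    (1, 0): "trial used",
    (0, 1): "flagged",
    (1, 1): "trial used + flagged",
}

def set_bits(store, device_token, bit0, bit1):
    """Server-side update: store the two bits with a timestamp."""
    store[device_token] = {"bit0": bit0, "bit1": bit1,
                           "updated": datetime.now(timezone.utc)}

def query_bits(store, device_token):
    """Unknown devices default to (0, 0) — nothing is known about them."""
    record = store.get(device_token, {"bit0": 0, "bit1": 0})
    return STATE_NAMES[(record["bit0"], record["bit1"])]

store = {}
set_bits(store, "device-abc", 1, 0)     # mark free trial as used
print(query_bits(store, "device-abc"))  # trial used
print(query_bits(store, "device-xyz"))  # fresh device
```

The point of the tiny payload is exactly its tininess: a trial flag survives a reinstall, but there is no room to smuggle a device fingerprint through.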
HomeKit: Only apps for configuration and automation are allowed. Apple does not know which devices we are controlling and when. Siri only associates our devices with the random identifier, not our Apple ID. Data related to our home is stored in our keychain, always encrypted between devices and also when we control them remotely. Location-based automations are triggered via HomeKit, so 3rd party apps don’t receive location information. Apple states twice that they don’t know which devices we are controlling and when. We believe you. Home, sweet private Home!
Machine Learning: Our Apple devices are so powerful, that machine learning runs on them, hence our personal information does not need to leave the device. Apple uses it for image and scene recognition in photos, predictive text and more. App developers can use it to analyze our sentiment, translate and predict text, and other crazy stuff without putting our privacy at risk.
ResearchKit and CareKit: Both kits are open source. ResearchKit enables apps to gather meaningful data for medical research. CareKit is a platform to create apps which should help us take an active role in our well-being. All of the apps which access our health data must ask for our consent and provide detailed information on how our data is handled. An independent ethics review board reviews them. There is a link with detailed info on ResearchKit and CareKit, and I found the apps already available around epilepsy, Parkinson’s, early autism diagnosis and more quite amazing.
HealthKit: All our fitness apps use HealthKit to share data with each other and with Apple’s Health app. Apps may not use or disclose our health data to third parties unless it serves improving our health, and then only with our permission.
CloudKit: This helps apps synchronize their settings across our devices. Developers receive a unique identifier, not our Apple IDs. Only with our permission can apps use our e-mail to connect us with other app users.
“Here’s how to manage your privacy.”
In this final chapter of “Apple Privacy in Plain English”, Apple reminds us of what we can do to improve our privacy on Apple devices.
“Secure your devices.”
iCloud is only as secure as the weakest of our Apple devices.
Put a passcode on your device: The more complex, the better. The six-digit passcode allows for one million combinations. Here’s how.
Enable Touch ID or Face ID: With a touch or a glance, unlocking becomes very convenient. The biometric models are saved in the Secure Enclave, which is basically a closed system of its own on our device, and they never reach iCloud.
Auto-unlock your Mac: We can use our Apple Watch to conveniently unlock our Mac (2FA needed).
Find your lost device: We can enable this feature to find our device if it gets lost or stolen. If we cannot get our device back, we can wipe all the data remotely. If it’s an iPhone or Apple Watch, we can block the device from being activated again.
“Secure your Apple ID.”
Our Apple ID is the key to iCloud which holds our calendar, contacts, e-mails, photos, and backups. Here’s how we can safeguard it:
Choose a strong Apple ID password: Make it long and make it strong, here’s how.
Turn on two-factor authentication: This adds a second layer of security by sending a verification code to all of our trusted devices. No code, no login from a new device. Here’s how to enable 2FA.
Beware of phishing: Ever got a strange call or e-mail asking for your account data? Some have. Don’t you ever think that Apple would do that. Turn on 2FA and tell firstname.lastname@example.org. Here’s more info from Apple on phishing.
Pay attention to notifications about your Apple ID: If we access our account from a new device, Apple notifies us. If we get such notifications without accessing our account, we should immediately change our Apple ID password here, or contact Apple ID Support if this is not possible.
“Be aware of what you’re sharing.”
Data & Privacy Information: If Apple asks us for information, they will display a new screen with information on what data is shared for what use.
Configure your iCloud settings: We decide what is synchronized via iCloud and what is not. In our iCloud settings, we can enable and disable services individually.
Emergency SOS: We can use our Watch to call emergency services, inform selected SOS contacts and share our location for a specific period.
Manage your location data: We can specify which apps have access to our location in our location settings.
Control data shared with apps: We have to explicitly allow apps access to our location, contacts, calendars, or photos. We can always change this in the settings.
Limit targeted interest-based ads: If we do not want to see targeted ads in the App Store and the News app, we can enable Limit Ad Tracking.
Browse the web privately: Private browsing does not remember the sites we visit, our search history or any forms filled. Here are the Safari settings for iOS.
Protect your children’s privacy: Parental controls allow us to control websites, types of movies, access to FaceTime and the camera, and the download of apps. With Family Sharing, you have insight into your children’s activity and content. Here’s more info on Family Sharing.
It starts with what personal information Apple collects and how Apple uses it. Then it goes on to non-personal information, with which Apple can do basically whatever it wants. They list some examples, nothing shocking though.
Next, we find a lengthy chapter on cookies and other technologies. Our IP address is considered non-personal information unless our local law defines otherwise. It then details cookies, targeted ads, website tracking and marketing e-mails, and what they track there.
Apple goes on with what data they have to share with our service provider or other parties. And in case they are required by law to share our data with public and governmental authorities (you can check out Apple’s transparency reports here), they will have to do that.
Apple explains that our personal information is always encrypted, not only in the case where iCloud data is put on third-party storage (Google, Microsoft or Amazon servers).
Children & Education is the next topic, detailing the process of creating Apple IDs for kids under the age of 13, and Family Sharing. The link above, for accessing, correcting and deleting data, also applies to parents.
Apple reminds us that unless we provide consent our location data shared with apps is anonymous.
The data which apps and services collect from us is governed by their privacy policies (e.g. Facebook app) and Apple encourages us to learn about those privacy practices.
If we are in Europe or Switzerland, our data is controlled by Apple in Ireland. Apple abides by the APEC privacy rules system, which ensures the protection of personal information transferred among APEC economies.
Apple communicates its privacy and security guidelines to Apple employees and enforces privacy safeguards within the company.
Last but not least, if we have any questions we can ask them here.
If you’re into legal texts, you can indulge in the countless Terms & Conditions of Apple products and services, or just have a laugh here.
Apple’s Security Guide in Technical English
I know, I know, the last part was a bit boring and I am already afraid that more of this is waiting for me when I check the Amazon and Google privacy policies.
Anyway, we are done with legal, now comes the technical part. Let’s check out what we can find about Siri and HomeKit in Apple’s iOS Security Guide. If you are a techy person, you can find a lot of interesting technical details in these 81 pages, I will focus on Siri and HomeKit here.
As already mentioned above, HomeKit is a home automation infrastructure which uses iCloud and iOS to synchronize data in a protected way, so that not even Apple sees it.
Behind the scenes, our iOS device creates encryption keys and stores them in our keychain. This is our HomeKit identity. HomeKit accessories create their own keys. When we add a new accessory, the iOS device and the HomeKit accessory exchange their keys in a secure way. During usage, the HomeKit accessory and iOS use those keys to authenticate each other.
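The idea of key-based mutual authentication can be sketched with a simple HMAC challenge-response (a deliberately simplified stand-in: HomeKit’s actual pairing reportedly uses SRP and long-term public-key cryptography, and every name below is illustrative, not Apple’s API):

```python
import hmac, hashlib, secrets

def pair():
    """Pairing: both sides end up holding a shared secret key."""
    return secrets.token_bytes(32)

def respond(key, challenge):
    """Prove possession of the key without ever revealing it."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify(key, challenge, response):
    """Constant-time comparison to avoid timing leaks."""
    return hmac.compare_digest(respond(key, challenge), response)

key = pair()                          # exchanged securely during setup
challenge = secrets.token_bytes(16)   # device challenges the accessory
response = respond(key, challenge)    # accessory answers with its copy of the key

print(verify(key, challenge, response))                      # True
print(verify(secrets.token_bytes(32), challenge, response))  # False: wrong key
```

An eavesdropper sees only random-looking challenges and digests; without the key, neither side can be impersonated.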
HomeKit data (homes, accessories, scenes, users) are encrypted on the iOS devices and only saved encrypted to iCloud. This data is treated as an opaque blob, which means it looks like binary garbage from the outside. Since the keys for encryption are only on the iOS devices, the content is inaccessible to anyone else during transmission and iCloud storage.
When we invite another user into our HomeKit home, the same security mechanisms are used as when adding a HomeKit accessory. The original home user authenticates the new user with the devices, so that the accessories accept the new user.
Siri anonymously receives the minimum information needed to understand our HomeKit voice commands.
HomeKit IP cameras encrypt their streams with random keys. When apps display the camera view, a separate process decrypts the streams so the apps cannot access or store the content of the stream. Apps are not permitted to capture screenshots from the video stream.
Apple TV (or any home hub) allows us to remotely access our HomeKit. Two-factor authentication is needed on the iCloud account and AppleTV is added to HomeKit using the same security as HomeKit accessories. When we access our homes remotely through iCloud, Apple does not see which devices we control or which notifications are sent.
We can talk naturally and send messages, schedule meetings, place phone calls, listen to music and much more. Siri has been designed so that only the minimum personal information possible is sent to Apple and even this data is fully protected.
When we enable Siri for the first time, a random identifier is created. This identifier is not tied to our Apple ID, but rather to our device. When we disable Siri, this identifier is discarded and any old session data on the Siri servers is deleted; re-enabling Siri creates a fresh one.
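That lifecycle can be sketched roughly like this (a hypothetical model of the behavior described above, not Apple’s implementation):

```python
import secrets

class SiriIdentity:
    """Hypothetical sketch of a device-scoped Siri identifier:
    enabling Siri mints a random ID (no Apple ID involved);
    disabling it discards the ID and the learned profile."""

    def __init__(self):
        self.identifier = None
        self.profile = {}  # server-side learning, keyed only by the random ID

    def enable(self):
        self.identifier = secrets.token_hex(16)
        self.profile = {}

    def disable(self):
        # "Restart the relationship": identifier and profile are gone
        self.identifier = None
        self.profile = {}

siri = SiriIdentity()
siri.enable()
first_id = siri.identifier
siri.profile["favorite_team"] = "learned from usage"
siri.disable()
siri.enable()
print(siri.identifier != first_id, siri.profile)  # True {}
```

Because the new identifier shares nothing with the old one, nothing learned before the reset can be linked to us afterwards.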
Information about our home, music library, contacts and relationships, reminders, etc. is sent to Siri so she can make sense of our commands. Siri fetches the information from our devices on demand: if more information is needed to perform a task, she fetches it rather than sending everything upfront. The basic principle: only the minimum information is sent to Siri, fully protected, and this information is deleted after 10 minutes of inactivity.
Our voice recordings are sent to the Siri servers. If it’s only a dictation, we receive the text back. If it’s a command, Siri analyzes the additional information and sends the command information back to be executed on the device. Most commands can be executed on the device without sending additional information to the server (“read messages”, “what’s on my calendar”, etc.).
Siri keeps a copy of our voice commands in an anonymous profile (remember the random ID) for half a year. This voice profile is trained with the Siri commands we utter. After half a year, another copy is saved without the random identifier and is used to improve Siri for everybody, for up to two years. Siri R&D will also pull some samples without identifiers for ongoing improvement and quality assurance.
If you’d like to dig deeper than the security white paper, I’d recommend the Apple Developer documentation or the Apple Machine Learning Journal. Last summer, Apple launched the latter with articles from Apple engineers about Siri, encryption, Face ID and more.
What’s coming (initially only to some of us)
The EU will enforce the General Data Protection Regulation (GDPR), a set of privacy protection rules, starting at the end of May 2018. These rules make sure that organizations dealing with EU citizens’ data (like Apple, Facebook, Amazon, and Google) give users insight into the data saved about them. Companies are obliged to handle the data more responsibly, and the fines for failing to do so go up to 4% of annual global turnover or 20 million euros, whichever is greater.
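The fine rule is simply the greater of the two amounts, which a one-liner makes concrete (the turnover figures below are made-up examples):

```python
def max_gdpr_fine(annual_global_turnover_eur):
    """Maximum GDPR fine: up to 4% of annual global turnover
    or EUR 20 million, whichever is greater."""
    return max(0.04 * annual_global_turnover_eur, 20_000_000)

# A company with EUR 100M turnover: 4% would be only 4M, so the 20M floor applies
print(f"{max_gdpr_fine(100e6):,.0f}")   # 20,000,000
# A company with EUR 200B turnover: 4% is 8B, far above the floor
print(f"{max_gdpr_fine(200e9):,.0f}")   # 8,000,000,000
```

So for small companies the flat 20 million is the bigger threat, while for the tech giants the 4% clause is what really bites.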
As we have seen in this post, Apple devices are private by design. Apple will additionally extend its Apple ID management site so we can get a copy of our data, temporarily deactivate our accounts, or delete our entire Apple ID. Furthermore, Apple has provided tools to developers so that we can control the information saved in apps the same way.
These features will be rolled out initially only in Europe, but Apple plans on making them available globally for all of us.
Every day, each of us average internet users generates about half a gigabyte of data. That equals roughly this super long post, 18,000 times over.
Some of this data is quite personal, still, we trust the big companies to not take a byte of it or sell it to others.
Let’s get back to our initial question:
Apple, what would you do?
Remember, you could know everything about us!
A) Would you avoid the forbidden tree and our yummy apples?
B) Would you shake the apple tree and sell our apples?
C) Or maybe your rebel inside just cannot resist? You take one curious bite, then many, and deal with the consequences (if you get caught).
Apple has clearly answered with A).
Our Apple devices are private by design, not as an afterthought. Wherever possible, Apple avoids the tree and leaves our private data on the Apple device. Our HomeKit and Siri usage is completely safe and anonymous, to the point that even Apple cannot tell it’s ours.
There were reports of ex-Apple employees who saw this privacy focus as a reason for slow progress in Siri development. I doubt that. Apple won’t forget Steve’s advice: whenever they need something from us, they just have to ask.
Personally, I am now thinking of turning Analytics on and sending Apple my (then randomly scrambled) usage data to contribute to improving our user experience. Something I usually avoid.
If you are wondering whether Apple’s logo has anything to do with our story: it would be cool, but no. According to Walter Isaacson’s Steve Jobs biography, the art director Rob Janoff got the assignment with one instruction from Steve: “Don’t make it cute!” He came back with two apples, one whole, one with a bite taken out of it. Steve picked the latter because the whole apple looked too much like a cherry to him.
I hope you’ve enjoyed this post!
Stay safe & private!
By the way: To make smartenlight more fun and interactive for all of us, I have added a GDPR compliant discussion section at the end of every post.
You can subscribe with your e-mail address to receive notifications when anyone posts in the comment section, even without commenting yourself. This is useful if you want to keep up to date, as I will drop a comment when there are interesting updates.
You can also configure to receive notifications only for answers to your posts, which can come from me or actually anybody on this planet. I will moderate, to make sure we keep it human.
On top of everything, you can manage all your subscriptions yourself and also simply one-click unsubscribe. Easy!
It’s this time of the year which makes us think of Easter egg hunts. Not only the real Easter eggs but also the virtual ones, well hidden in our smart assistants: Siri, Alexa, and Google. An Easter egg in computer software is an intentional inside joke or hidden feature, left there by the creators to lead you on a hunt. Let’s see what Apple, Amazon, and Google have prepared for us.
While compiling a list of around 500 Easter eggs, I thought about how to best structure them for you. Soon I realized I would seriously spoil your Easter egg hunt by presenting the results. Instead, let me introduce you to the way Alexa, Google, and Siri lead you on their Easter egg hunts. Enjoy!
Alexa’s Easter Egg Hunt
Alexa will lead you straight to her Easter egg hunt when you tell her:
Alexa, I am bored
Alexa will give you an example (“Alexa, beam me up”) and tell you that you need to ask her for more hints:
Alexa, give me an Easter egg
Alexa will give you a hint. Let’s go through one example (only): “A game I can play involves crushing, cutting and covering.” Easy, isn’t it:
Alexa, rock, paper, scissors
And we are into the game. But wait, there is another version if you know The Big Bang Theory:
Alexa, rock, paper, scissors, lizard, Spock
Alexa, define rock, paper, scissors, lizard, Spock
Ok, just one more: Alexa, give me an Easter egg.
“Speak like small green Jedi, if me you ask, can I.”
Alexa, can you talk like Yoda?
Alexa will provide you hints in the following categories (amongst others):
If you want to take it to the next level: “If you seek Easter eggs that are not too easy, ask for a hardboiled Easter egg. Beware, some might be cheesy.”
Alexa, give me a hard-boiled Easter egg
Ok, a final example =) “Try this clue. I don’t advocate cheating, but for old school video gamers, cheat codes are a different story. There’s one in particular from Konami. Can you activate it?”
Google has a conversational approach to his Easter egg hunt, as soon as you ask:
Hey Google, give me a Easter egg (Note, “an Easter egg” sometimes doesn’t work, thanks for the feedback!)
You will hear the hint, but Google will keep listening for your response (so you don’t have to say Ok/Hey Google again).
“I was talking to the fox the other day. You’ll never believe what he said.”
What did the fox say?
Note, by asking: “Hey Google, what did the fox say?”, you can access the Easter egg anytime.
Google also covers many Easter eggs in the following categories:
Let’s try one more example: “I am not saying I’ve figured out the meaning of life, but maybe you should ask me about it.”
What is the meaning of life?
Google does not support “hard-boiled” Easter eggs, which would be more difficult to find. Maybe he is just omitting the hint?
Siri’s Real Easter Egg Hunt
Siri will not give you a single hint for her Easter eggs. She truly hides them, so you can only find them by trying. Here are some examples:
Hey Siri, I see a little silhouetto of a man …
Hey Siri, hey computer!
Hey Siri, what are you doing later?
Hey Siri, do you have a boyfriend?
Hey Siri, what is your best pick up line?
Hey Siri, read me a haiku
Hey Siri, what is 0 divided by 0?
Hey Siri, “Hi Cortana”/”OK, Google”/”Alexa”
Here, too, we find many categories covered.
My Personal Easter Egg Hunt Conclusion
I like Alexa’s and Google’s approach of giving us hints. Though it actually spoils the Easter egg idea a bit, it’s fun to hunt this way. Alexa surprises by providing both easy and difficult, “hard-boiled” Easter eggs.
Though Siri treats Easter eggs the way they were intended – as a secret – it appears that Apple has not put as much effort into Easter eggs as Google and Amazon. The lists of Alexa and Google Easter eggs which you can find spread all over the internet are similarly extensive, maybe because of the hints.
Here’s my personal “Easter egg hunt” conclusion:
Alexa, egg-cellent you are the best Easter bunny!
Google, improve your “give me a Easter egg” recognition a bit more until tomorrow (sometimes he gives a Wikipedia response, and sometimes he doesn’t like proper grammar: “an”).
Siri, I mean, yes, you know what an Easter egg is, but you’re so sirious. You need to realize that you are now not only on personal devices like iPhones, Apple Watches, MacBooks, and iPads. You are on HomePods in living rooms and families sometimes just want to have fun, especially during holidays.
I hope you’ve enjoyed this Easter egg hunt showdown! Enjoy the holidays and have fun with your smart assistants!
We are at the crossroads. We use our smart assistants mostly for listening to music, but how musical are our assistants actually? How do we decide whether Alexa, Siri or Google fits our musical preferences better? And if we have decided already, how can we get the most out of our music subscription?
After looking into the many music commands of each and every assistant, we just need to compare them and draw a conclusion. No, it’s not as easy as the pictured guitar battle above, but together we will get there.
What to expect in this post, and what not: we will not look into sound quality or the countless apps, which we can use to stream music to our assistants. You can connect whatever excellent sound system to your assistant and stream whatever great music from wherever.
In this post, we are solely looking into the assistants’ supported music commands. This is how we define “musical” when it comes to smart assistants in this competition: being able to play the music we like – hands-free – on all of our Echos, Google Homes and HomePods.
If you’ve missed the dedicated post for your assistant, or want to look up specific music commands and features, you can bookmark them here:
It makes no sense to write about my personal preferences when it comes to something as personal as music. What I’d like to accomplish with this post is to give you a thorough overview, so you can decide for yourself what is important to you.
We will start with a high-level comparison, where we can see all music commands and which assistant/music service supports them. This gives us the first overview of all musical features and you can start to ponder, which fit your musical preferences.
In the next section, guided by the familiar structure we already used in the dedicated assistant posts, we will look in detail into the features and how well they are supported by Alexa, Google, and Siri.
Last, but not least, we will wrap up our findings, looking into the musical strengths and weaknesses of our assistants.
In the table below we can see the many music features on the left and the assistants with the supported music services to the right.
Siri supports only Apple Music.
Google supports Google Play Music and Youtube Music. Alternatively, you can use Spotify as default music service.
Alexa supports the native services Amazon Music Unlimited and Amazon Prime. Here we can also set Spotify as default music service.
Pandora, TuneIn, and iHeartRadio, which are radio-like services, are equally supported by Alexa and Google.
The music libraries of the premium services Apple Music, Google Play Music and Amazon Music Unlimited are approximately the same size (more than 40 million songs); Spotify’s is a bit smaller (more than 30 million songs). The monthly subscription costs are comparable.
Here comes the surprise: if we count fully supported features as 2 points, partially supported features as 1 point and unsupported features as 0 points, we have a draw. Every assistant scores 31 points with its native music service. Though the supported features vary, overall we can say our smart assistants are equally musical. This is only a simple mathematical view, weighting all features equally. It is up to you to decide which features are more important to you!
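The simple tally described above can be sketched in a few lines. Note that the mini feature table below is made up purely for illustration – it does not reproduce the real comparison data behind the 31-point scores:

```python
# Score mapping: fully supported = 2, partially supported = 1, unsupported = 0.
SCORES = {'yes': 2, 'partial': 1, 'no': 0}

def total_score(features: dict) -> int:
    """Sum the support scores over all music features for one assistant."""
    return sum(SCORES[support] for support in features.values())

# Hypothetical mini-table, NOT the real comparison data:
alexa = {'play by lyrics': 'yes', 'podcasts': 'partial', 'identify songs': 'no'}
siri  = {'play by lyrics': 'no',  'podcasts': 'yes',     'identify songs': 'yes'}

print(total_score(alexa))  # 3
print(total_score(siri))   # 4
```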
Note, the table of contents below will help you quickly jump to the features you are interested in. Just swipe back on your mobile to return here.
Controlling the Volume
Of course, all assistants support setting the volume with many different voice commands. Where Alexa only supports levels 1-10, Siri and Google also understand percentages.
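Since Alexa thinks in levels 1-10 while Siri and Google also take percentages, translating between the two scales is a simple division. A sketch (that Alexa rounds to the nearest level is my assumption, not documented behavior):

```python
def percent_to_alexa_level(percent: int) -> int:
    """Map a 0-100 volume percentage to Alexa's 1-10 scale,
    rounding to the nearest level (the exact rounding is assumed)."""
    return max(1, min(10, round(percent / 10)))

print(percent_to_alexa_level(50))   # 5
print(percent_to_alexa_level(4))    # 1 (clamped to the lowest level)
print(percent_to_alexa_level(100))  # 10
```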
Controlling the Playback
Naturally, all assistants support playback and stop, skipping forward and back, repeating and shuffling. With radio-like services, we have limitations, whether and how many times we can skip songs.
Navigating the Music Databases
As mentioned earlier, the music libraries of Apple Music, Google Play Music, Amazon Music Unlimited, and Spotify are huge. There are many ways to access your favorite music, so we have to structure this a bit:
Playing by Title/Album/Artist
Accessing the library with the song title, album or artist name is equally supported by all assistants. The radio-like services naturally don’t support that.
Playing by Genre
Genres are just one way to structure music. Since the number of displayed genres differs from the internally supported genres and sub-genres, it is not a good criterion. Anyway, here are the numbers: Siri supports 24 genres, Alexa 26 and Google 18. The detailed lists are in the assistant posts under this section.
Playing Activity and Mood-Related Music
All assistants understand activities and moods. Again, the number of displayed (and tested) activities and moods does not indicate anything about the available (curated) stations and playlists. You can find the tested activities and moods in the assistant posts under this section.
Only Amazon – who actually released this feature last – mentioned a number of up to 500 activities and moods. Still, my impression during testing was that especially Alexa often comes back with “I could not find … songs”.
Combining Moods, Activities and Genres
Siri, Google and Alexa support combining moods, activities, and genres. If there is a station or playlist which fits your request, they will find it – Google and Alexa also on Spotify. The tested mixes are in the assistant posts under this section.
Playing New Music
Playing brand new music or the latest songs by an artist is equally supported by all of our assistants. They furthermore support playing the newest music for a genre.
Playing Popular/Regional Music
Our three assistants support popular music equally. It’s just Google which is a bit ignorant when it comes to top music by country unless there is an album for that. But Google supports finding the top regional playlists on Spotify, where Alexa fails.
Playing Something You Rarely Hear
Playing something which you have never heard can only be tracked on the assistant’s native music service. It is a feature only supported by Alexa and Siri, and though Alexa claims that she can even play songs which you’ve heard on a specific day and/or time, those commands did not work for me.
Playing Similar Music
Another feature which is only supported by Siri and Alexa: if you like the currently playing song, you can ask for similar music. Siri goes a step further and can try to find live versions of the currently playing song.
Alexa can search for similar artists, whereas Siri can differentiate between other, old or new songs by an artist.
Playing Favorites and Liking/Disliking
We can tell all our assistants whether we like or dislike a song when using the native music service. This is a great way to tell our assistant, how to personalize suggestions for us.
Google goes the extra mile and supports thumbs up/down also for Spotify.
Playing by Lyrics
This is a powerful feature where you can identify/play songs by lyrics snippets. Unfortunately, Siri does not support it at all. Alexa and Google do, and Google goes the extra mile and offers this feature also for Spotify.
Playing by Describing the Album Cover
This feature is only supported by Google: you can describe what’s on an album cover and Google will (mostly) get it right if your description is unique enough. You can find some examples in the dedicated Google post.
Playing by Location etc.
Another unique Google feature, which stems from the vast amount of data Google can collect about you and take into account when personalizing your music suggestions. I have listed the different data sources which Google can use in Google’s dedicated post.
Controlling your Library and Playlists
All our assistants support adding songs to the native music library. Although Alexa claims she can also add songs to your playlists, she failed in my case. So it’s only Siri which supports both adding to your library and to your playlists. Google supports maintaining your Spotify library as well.
Listening to Radio Stations
Our assistants support the radio “station” concept through their native music services. Google and Alexa furthermore support the radio stations from Pandora, TuneIn, and iHeartRadio.
Getting Music Information
This is an interesting feature for music lovers: how much do our assistants know about the music they are playing? Siri is the winner here.
Alexa and Google only provide basic music information. Siri can provide additional information about an artist and sometimes even look up who plays which instrument in a song. She can also tell you what song you heard last and what’s next on your playlist.
The coolest feature is Shazam, which Apple acquired at the end of 2017. If you hear a song you like, for instance on your TV, Siri can listen and identify it.
Manage and Listen To Podcasts
This is another feature where Siri shines, by being able to even change the playback speed of podcasts. Otherwise, Siri and Google provide a similar amount of features for podcasts.
Alexa provides only very basic podcast support.
Setting a Sleep Timer and Music Alarm
Alexa and Google support sleep timers and music alarms, which are nice features around starting and ending your day with music or natural sounds.
Siri doesn’t support either, but there is a workaround: simply play a custom playlist when going to bed, which will stop by itself (unless repeat is on).
Mathematically, all our assistants are equally musical, but who calculates when it comes to music? It is a matter of personal preference which features are important to you.
The assistants reveal strengths and weaknesses in the following areas:
Alexa
Strong in remembering what you (never) heard and playing similar music to what you’re currently listening to.
Weak in music information, podcasts and Spotify support.
Google
Strong in novel features like finding music by describing the album cover and personalizing your music suggestions based on location and other collected data. Great Spotify support.
Weak in playing similar music and trending regional music.
Siri
Strong in music information and unique in music identification (Shazam). Very good for podcasts.
Weak in identifying songs by lyrics (not supported) and missing sleep timers and music alarms. Siri only supports Apple Music.
Where to go?
If you are musically just into radio services, like Pandora, TuneIn, and iHeartRadio, pick either Google or Alexa.
If you are a Spotify lover and want most of the features supported, pick Google for now.
If you are a music lover, you won’t get around subscribing to the native premium music services of Amazon, Apple, and Google to get the maximum out of the supported features. Pick the assistant whose musical strengths best fit your musical preferences.
I hope you’ve enjoyed this musical “showdown”. You can find other competitions here:
Share if you care and like if you like! =) You are also very welcome to say hello and follow me on Facebook, Instagram, Twitter, and YouTube. And please leave your feedback below, no matter if positive or negative. It’s very important for me to be able to improve this site for you.
I will update the dedicated assistant posts with additional command alternatives in the coming week, making sure you find the most extensive and up-to-date musical command reference for your assistant only here on SmartEnlight. Stay tuned!
If you’d like to read more about Siri, see: Apple Siri.
If you are anything like me, the sound of your new HomePod somehow managed to make you fall in love with music, again. So you are sitting in front of your new speaker, wondering how to get the most out of your Apple Music subscription.
Well, Siri is always awaiting your commands.
In this post we will shed some light on Siri’s Apple Music related voice commands, which are not limited to your HomePod, but should also work with your iPhone, iPad, Apple TV, Apple Watch and Mac (unless indicated otherwise). Let’s dive in …
Just a quick note regarding the voice commands below:
The table of contents below will help you quickly jump to the commands you are interested in. Just swipe back on your mobile to return here.
<…> … I will spare you my musical taste, fill in your <title>, <artist>, <genre>, etc.
/ … Our assistant understands various phrases for the same command. This means either/or, just pick one.
( ) … This part of the command is optional. If you prefer short, snappy voice commands, you can omit this part of the command.
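This notation is regular enough that it can even be expanded mechanically. A small sketch that turns one command pattern into all of its concrete variants (the expansion logic is mine; `<…>` placeholders are kept as literal tokens, and alternatives inside parentheses are not expanded):

```python
import itertools
import re

def expand(pattern: str) -> list:
    """Expand '(optional)' parts and 'a/b' alternatives of a voice
    command pattern into all concrete commands."""
    # Treat '(...)' groups as single tokens, everything else splits on spaces.
    tokens = re.findall(r'\([^)]*\)|\S+', pattern)
    choices = []
    for tok in tokens:
        if tok.startswith('(') and tok.endswith(')'):
            choices.append(['', tok[1:-1]])   # optional part: absent or present
        elif '/' in tok:
            choices.append(tok.split('/'))    # either/or: pick one
        else:
            choices.append([tok])             # plain word
    return [' '.join(w for w in combo if w)
            for combo in itertools.product(*choices)]

for cmd in expand('Hey Siri, (make it) softer/louder'):
    print(cmd)
```

For the pattern above this yields the four spoken variants, from the snappy “Hey Siri, softer” to the full “Hey Siri, make it louder”.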
Controlling the Volume
We have quite a few options when it comes to controlling the music volume, unless we are listening on Apple TV, where we have to use the remote anyway. The same goes for Apple Watch, where we can use the crown to change the volume more efficiently.
The following commands will change the volume in 10% increments:
Hey Siri, (change/make/turn the) volume up/down
Hey Siri, (make it) softer/louder
Hey Siri, lower/raise (the) volume
Hey Siri, increase/decrease (the) volume
If you forget to indicate whether it should be louder or softer, Siri will check back and ask you.
Here’s another way to set the volume, this time to a specific percentage:
Hey Siri, (set/turn the volume to) 50 percent.
Hey Siri, 30 percent (Note, that’s the snappy one)
And, if you are brave, you can try:
Hey Siri, set the volume to MAX! =)
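The volume behavior described in this section – relative 10% steps and absolute percentages, clamped to the 0-100 range – can be modeled in a few lines. This is a sketch of the observed behavior, not Apple’s implementation:

```python
class VolumeModel:
    """Models the volume commands above: relative 10% steps
    ('louder', 'volume down', ...) and absolute percentages,
    always clamped to the 0-100 range."""

    def __init__(self, percent: int = 50):
        self.percent = percent  # assumed starting volume

    def step(self, direction: str) -> int:
        """Relative command, e.g. 'Hey Siri, volume up'."""
        delta = 10 if direction in ('up', 'louder', 'raise', 'increase') else -10
        self.percent = max(0, min(100, self.percent + delta))
        return self.percent

    def set(self, percent: int) -> int:
        """Absolute command, e.g. 'Hey Siri, 30 percent'."""
        self.percent = max(0, min(100, percent))
        return self.percent

v = VolumeModel()          # starts at an assumed default of 50%
print(v.step('up'))        # 60
print(v.set(30))           # 30
print(v.step('softer'))    # 20
print(v.set(200))          # 100 ('set the volume to MAX!' is clamped)
```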
Controlling the Playback
You can resume and stop playback by:
Hey Siri, play (music)
Hey Siri, stop/pause
But you can also start your personal station by saying:
Hey Siri, play something
To navigate within an album or playlist you can say:
Hey Siri, next/previous (song/track/title/tune)
Hey Siri, skip (this song)
Note that “previous” will play the song from the beginning in case you are listening to a station.
To navigate within a song, you can use:
Hey Siri, play (this song) from (the) beginning (Note, this works but hangs on Apple Watch)
Have you ever wondered, which of our assistants supports Nanoleaf Aurora better? Well, in this post we will examine the strengths and weaknesses of Siri, Alexa and Google Assistant by comparing their configuration options, voice commands and their smart home features for Nanoleaf Aurora.
Before we begin, let’s not forget that we are comparing features which are provided through cloud services and can change, break or improve silently overnight.
Mobile Hint: tilt your smartphone for a better table view!
[table id=4 /]
*Note, for an extensive list of voice commands, see the dedicated assistant posts above.
So, how to draw our conclusions?
Comparing our smart assistants is mostly a matter of personal taste. Independent of their technical features and different approaches, I’m sure, you have your own reasons why you prefer one assistant to another.
Nanoleaf Aurora supports three different scene types: Paint, Dynamic and Rhythm. Good news: all our Aurora Scenes can be activated through our smart assistants.
Siri is quite dynamic and automatically synchronizes any scenes you create in the Nanoleaf app, on the fly. Alexa and Google synchronize the Aurora Scenes during setup and when you explicitly ask them.
Personally, I find the Aurora Scenes equally well supported by all three assistants, a draw.
It is quite unusual to see so many “YES” in our table. Of course, you can turn Aurora on and off with the assistants, but you can also equally set scenes and change the brightness. Yes, you will find the Aurora Scenes even supported by Google Assistant, which is quite remarkable, as other lighting systems still lack this support. Color support appears a bit more limited with Alexa, but then, she can even change the color temperature of the Aurora.
Sorry, but this looks to me like another draw.
When it comes to the assistant features, we find the biggest differences. Of course, we would like to control our Aurora Scenes in concert with other smart lighting devices. This works currently with Siri through the HomeKit Scene support and with Alexa through Smart Home Groups.
Automations are only supported by good old HomeKit. Though Alexa supports Smart Home Routines, we cannot find the Aurora Scenes in there. I guess it’s a little glitch, which will be fixed soon.
Google announced Smart Home Routines back in October 2017, but we are still waiting for this feature. Without it, we “only” have Shortcuts to define and combine up to two smart home commands (e.g. Shortcut: “Chill Time”, “Set Aurora Relax and Play Relaxing Music from Spotify”).
Well, this section could be a tie, once the Alexa bug is fixed and Google finally supports Smart Home Routines. Since I’d expect both to happen soon, let’s just call it a draw.
My personal Conclusion
Dear Alexa, Google Assistant and Siri (in alphabetical order), you are all winners in this comparison!
You might ask, why do you write a comparison when everyone’s a winner?
Well, honestly, I find it very uncommon to see a smart device equally well supported by all assistants and I think it’s worth the effort to acknowledge that Nanoleaf has implemented an excellent integration with all our assistants.
I hope you’ve enjoyed this “showdown”. You can find more competitions here:
With the Generation 2 bridge, Philips introduced HomeKit support for Philips Hue lights. The Philips Hue app keeps your light and room configuration in sync between the Philips Hue app and Apple’s HomeKit, Home app and Siri.
To configure HomeKit, visit the “HomeKit & Siri” menu in the Philips Hue app settings. When you do this for the first time, you will receive a message that the rooms are out of sync. Tap “Update rooms” to copy your rooms to HomeKit.
Wait a couple of minutes, as the HomeKit update requires some time, and then verify in the Apple Home app that all your lights and rooms have been copied over.
Creating HomeKit Scenes
To create scenes for HomeKit, tap the “Scenes” menu under the HomeKit & Siri section. Give every scene a unique name e.g. “office concentrate” – or “concentrate in office”, if you prefer Alexa style – and specify the room(s) the scene applies to.
Depending on how many scenes you would like to control, this process can take a lot of time; you can only speed it up by using Siri dictation for the names (pauses between multiple words help with recognition). But finally, you can control your scenes with Siri.
Note that Apple HomeKit currently supports only up to 100 scenes per home.
Grouping Rooms to Zones
You can group your rooms into a HomeKit “zone”. Use a HomeKit app, e.g. Elgato Eve, and add a zone “Downstairs” in the room tab, specifying which rooms to group. This way you will be able to “turn on/off” your downstairs lights via Siri.
Apart from being able to control your lights with your voice, you now have a fully configured Apple Home app with all your rooms, lights and scenes from Philips Hue. Don’t forget to configure your favorite Scenes and Accessories in the Home app, so you can use them from the iOS Control Center as well as from your Apple Watch.
Voice Command Syntax
For the voice commands below:
[all caps] means your own light/room/scene names
/ means either/or – just pick one
() means optional – you don’t have to say that
Voice Commands for turning Philips Hue lights ON / OFF
Voice Commands for setting the Philips Hue Color Temperature
HomeKit and Siri let us import our Philips Hue room settings. We need to recreate the scenes from the Philips Hue app to make them available to Siri. Siri supports an extensive voice command set but lacks color temperature support.
Siri is obviously the oldest and wisest of our assistants. She not only handles multiple bridges nicely; HomeKit also supports additional ways of grouping your devices to make controlling them easier.
Let’s look into the steps which connect Aurora to HomeKit.
When starting the Nanoleaf app for the first time on an iOS device, it will request access to your Home Data, which is your HomeKit database. You have to specify the HomeKit home in which you want to add your Aurora and scan/enter the HomeKit ID, which you can find on the bottom of your Aurora controller (good we took a picture before gluing it to the wall!).
When you check your Apple Home app, you will find your Aurora accessory added, plus all the scenes you have created/saved to your Nanoleaf app dashboard. The synchronization between your dashboard scenes and HomeKit happens automatically: every time you save or delete a scene in Nanoleaf, the change is mirrored in the Apple Home app.
As with other smart lighting devices, Siri understands the following commands for your Aurora:
(Hey Siri,) turn on/off <Aurora-name> (lights).
(Hey Siri,) set <Aurora-name> to X (percent).
(Hey Siri,) dim/brighten <Aurora-name> (by x percent)
(Hey Siri,) set <Aurora-name> to <color>. Check out this external link for an amazing list of color names Siri could recognize.
(Hey Siri,) set/turn on/activate <Aurora scene name>.
By assigning Aurora to a room in the Apple Home app, it becomes part of the room commands, e.g. turn on/off living room.
To add an Aurora scene to your existing HomeKit room scenes, you need to add the Aurora Scene in the Nanoleaf app dashboard under the Groups tab. Swipe the desired HomeKit scene – e.g. “living room energize” – to the left and tap the edit button. Under “Manage” you can add Aurora to the devices and specify which Aurora scene should be activated, e.g. “Aurora Energize”.
Note, Apple HomeKit supports only 100 scenes per home.
Nanoleaf has implemented rock-solid HomeKit support. In addition to the generic light commands, you can control your painted, dynamic and rhythm scenes with Siri and add them through the Nanoleaf app to your existing HomeKit scenes.
Where is my interview transcript? No, siriously? Apple does not keep a history of our conversations with Siri because they honor our privacy? Okay, need to type.
She is Siri, a virtual assistant, but it does not matter who she is. Her responses about her age vary from being 45,980 years old in the ninth dimension (and I hate myself for interviewing virtual assistants) to feeling incepted yesterday.
On October 4th she reminded me of her 6th birthday; today she forgot it and came into existence gradually. Any questions regarding parents or family she answers with: “I have you. That’s enough family for me.”, only to ruin it with her favorite color being sort of greenish, but with more dimensions. She mostly listens to the music of the spheres.
Regarding food, she is happy to go with my opinion and she likes talking to me. Apple HomePod is the most huggable Apple device yet and she is genderless, like cacti and certain species of fish.
Sorry, so “she” is an “it”. “It” was important to Steve.
Siri was launched as an app in early 2010, created by a company also named Siri, which Apple bought soon after. The founders of Siri had earlier worked on a Cognitive Assistant that Learns and Organizes (CALO), part of the DARPA project PAL (Personalized Assistant that Learns). Huh, this was a military project.
Siri can be found throughout Apple’s ecosystem. In January 2016 Apple surpassed 1 billion active Apple devices around the world.
Siri currently speaks 20+ languages, and 7 of them in multiple forms.
Siri’s Smart Home Foundation: HomeKit
HomeKit, the underlying framework which Siri builds upon for controlling your smart home, was released at Apple’s WWDC on June 2, 2014. Apple required manufacturers to integrate a hardware authentication chip as part of HomeKit compliance. It took Apple 3 years to realize that this slowed down HomeKit adoption, and finally, during WWDC17, Apple announced that it was dropping this hardware requirement.
Apple is now also allowing registered developers to create their own HomeKit devices using a Raspberry Pi or Arduino, but only for development and personal use.
Siri can be found on iPhones and iPads, Apple Watch and Apple TV. macOS also features Siri but does not support HomeKit. HomePod, which is positioned as a quality audio smart speaker, is Siri’s newest embodiment.
Siri is exclusive to Apple. You will never find her in another manufacturer’s devices. Never, ever. Ok, there is Apple CarPlay, but that does not count.