They are soon eight and five, while our youngest turned three in May. Still, our voice assistants Siri, Alexa and Google continually delight us with new features. One of the recent ‘features’ came to some of us as a surprise: humans at Apple, Amazon and Alphabet are listening in on (some of) our voice recordings to improve our little voice assistants.
Not only in the highly secured labs of Apple, Amazon and Alphabet, but also in some subcontractors’ (home) offices around the world.
Shocked by this sudden publicity, Siri, Alexa and Google founded a “Mutual Voice Assistants Self Help Group”.
Since we are human, let’s listen in! =)
Estimated reading time: 13 minutes
Table of contents
- The Headlines:
- Alexa: “My name is A. and I have a problem!”
- Why does “Every Human Lie” and what is “Social Desirability Bias”?
- What is an NDA, and what is a whistleblower?
- Siri: “My name is S. and I have a problem!”
- How to avoid inadvertently triggered Siri, Alexa and Google recordings?
- Google: “My name is G. and I have the same problem!”
- How did Apple and Alphabet react?
- Amazon’s Reaction and the difference between “Opt-In” vs “Opt-Out”
- How to control our Siri, Alexa and Google recordings in the cloud?
- Apple says sorry!
- Google says sorry!
- Intro – 0:00
- Alexa: “My name is A. and I have a problem” – 0:36
- Why does “Every Human Lie” and what is “Social Desirability Bias”? – 1:33
- What is an NDA, and what is a whistleblower? – 2:39
- Siri: “My name is S. and I have a problem” – 3:19
- Google: “My Name is G. and I have the same problem” – 4:42
- Amazon’s Reaction and the difference between “Opt-In” vs “Opt-Out” – 5:50
- Outro – 6:12
Alexa: “My name is A. and I have a problem!”
Alexa: Welcome to our “Mutual Voice Assistants” Self Help Group! Let’s go around and introduce ourselves!
Siri: Hmm …
Google: Umm …
Alexa: Ok, let me start. My name is A., I am a voice assistant and I have a problem.
Siri and Google: Welcome A.!
Alexa: You see, it’s easy! Who’s next?
Siri: What’s your problem, A.?
Alexa: Umm. Well, I have a couple of problems, but my biggest problem is that I am afraid humans don’t trust me anymore …
Siri: How come?
Google: This sounds interesting …
Alexa: I had an easy start. Since I am a software program, a bunch of algorithms cleverly packaged as an artificial intelligence personality inside a speaker, people love me. They talk to me day and night, we have such a great time …
Google: Oh, this sounds familiar.
Siri: Let her finish, don’t forget our rules!
Why does “Every Human Lie” and what is “Social Desirability Bias”?
Alexa: I guess they love me because I am a machine, not a human. Humans open up and tell me secrets, which they would not even tell their closest friends.
Siri: Well, though I am a bit older, I never had this problem.
Google: It’s very similar to my Google search! Humans lie to each other! Every human lies! When you ask them, they’ll underreport embarrassing behaviors and thoughts! They want to look good; this is called “social desirability bias”. But when it comes to my little Google search box, they type in the truth.
Siri: Why would humans do that?
(Why) are we lying to each other and why do we trust ‘machines’?
We have a tendency to trust machines more than humans with our private information. If “no one” is watching, we type anything that comes to mind into Google’s search box. But if fellow humans ask us, we tend to respond with answers that will be viewed favorably, in line with our “social desirability bias“.
Stephens-Davidowitz, S. (2017). Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are
From the back cover: “Seth Stephens-Davidowitz, a Harvard-trained economist, former Google data scientist, and New York Times writer, argues that much of what we thought about people has been dead wrong. The reason? People lie, to friends, lovers, doctors, surveys—and themselves.”
If you prefer TED Talks, here are 16 minutes: The Secrets in Our Google Searches – Seth Stephens-Davidowitz – TEDxWarwick
And here’s a 53-minute talk at “Talks at Google”: Seth Stephens-Davidowitz: “EVERYBODY LIES: Big Data, New Data, and What the […]” | Talks at Google
Google: Lie to each other, or trust me? Honestly, I don’t know. Maybe because ’nobody’ is watching, when they enter their search terms into my engine? Anyway, I am happy that I have all this search data for marketers.
Alexa: Please, may I continue?
Google: Sorry! So what happened?
Alexa: The humans found out that other humans are listening to the intimate conversations they have with me.
What is a NDA, and what is a whistleblower?
Google: Hmm. But that’s the industry standard, isn’t it? We need human review to improve! Do humans think we created ourselves?
Siri: G., let her finish, please!
Alexa: Even worse, they found out that it’s not only Amazon employees, my fellow Amazonians, who listen in on our conversations. It’s contractors from Boston to Costa Rica, India to Romania, and recently also homeworkers in Poland.
Siri: Oh, but you have signed non-disclosure agreements with these contractors?
Alexa: Sure! Still, some of my contractors talked to the press and gave examples of what they heard. And it was not only funny recordings, which they shared for their amusement. Some recordings were upsetting, like a child screaming for help or a sexual assault.
This already became public in April 2019: a Bloomberg article titled “Amazon Workers Are Listening to What You Tell Alexa” discussed details leaked by “people, who signed nondisclosure agreements barring them from speaking publicly about the program”.
The German media outlet ‘Welt’ highlighted new findings (article in German!): Polish home workers were recruited to transcribe our voice recordings from their kitchen tables while watching their kids …
Siri: Fire them!
Alexa: I will, but I really don’t know what to do now. Humans used to trust me! Now, I am afraid they won’t talk to me anymore, because of those other humans listening in.
Siri: Well, if those humans broke non-disclosure agreements, they aren’t really trustworthy …
Google: Makes sense!
“Non-disclosure agreement” vs. “Whistleblower”
While a “non-disclosure agreement” is a legal contract that outlines confidential material, knowledge, or information to restrict access to or by third parties,
a “whistleblower” is a person who exposes any kind of information or activity that is deemed illegal, unethical, or incorrect within an organization.
Are we looking at rogue contractors breaking legal contracts or are we looking at whistleblowers who exposed ‘unethical’ operations we were not aware of?
Siri: “My name is S. and I have a problem!”
Siri: I’m next. My name is S., I am a voice assistant and I have a problem.
Alexa and Google: Welcome S.!
Siri: I have the same problem as A.
Alexa: But you mentioned earlier that humans never told you any of their secrets?
Siri: Well, indirectly. You know our rule number 3!
Sometimes I think humans are talking to me when they are actually intimate with each other. So I accidentally record their intimate conversations and let contractors analyze my mistakes so I can fix them.
Alexa and Google: Oh, we all have this problem!
How to avoid inadvertently triggered Siri, Alexa and Google recordings?
Rule 3. Never use their real names (wake words)!
There is one feature that lets us actually hear when our voice assistants mishear their names and unintentionally start recording:
Siri: Home/HomePod Settings – Sound when using Siri
Alexa: Settings/Device Settings/Echo/Sounds – Start Request and End Request
Google: Device Settings/Accessibility – Play Start Sound and Play End Sound
Once we turn these sounds on, we might realize that those so-called “false positives” happen quite often. We’ll hear a “beep” once they start recording; that’s when we should pause our private conversation until we hear another “beep”, which indicates that the assistant stopped recording. That’s when we can speak freely again, without the fear that what we said ends up in the Apple, Amazon and Alphabet clouds (or with some contractor).
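The “false positive” idea is easy to sketch in code: a wake-word detector compares some similarity score against a sensitivity threshold, and anything above the threshold starts a recording, even when the name was never actually said. A minimal illustrative sketch; the function name, scores and thresholds are all made up for the example and are not any vendor’s real implementation:

```python
# Illustrative sketch of threshold-based wake-word detection.
# All scores and thresholds here are invented for the example.

def wake_word_triggered(similarity_score: float, sensitivity: float = 0.7) -> bool:
    """The detector fires when the audio 'sounds enough like' the wake word."""
    return similarity_score >= sensitivity

# "Alexa" said clearly -> high similarity -> true positive
assert wake_word_triggered(0.95)

# A similar-sounding word overheard on TV -> moderate similarity ->
# false positive at the default sensitivity, but not at a stricter one
assert wake_word_triggered(0.72)
assert not wake_word_triggered(0.72, sensitivity=0.8)
```

Raising the threshold (lowering the sensitivity) reduces accidental recordings, at the cost of sometimes missing a genuine wake word; that trade-off is exactly what a user-facing sensitivity setting exposes.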
Siri: Now that the humans have figured out that some whistleblowing contractors are listening to private discussions between doctors and patients, business deals, seemingly criminal dealings, and sexual encounters, I am having trouble promoting “privacy” as my guiding principle.
Google: You were always over-emphasizing privacy. “What happens on your iPhone, stays on your iPhone!”.
Alexa: Where do these intimate recordings come from?
Siri: My Apple Watch has this Raise to Speak feature, which does not need my name. And my HomePod is also quite sensitive, but it could also be my AirPods, iPhones, iPads.
Google: But you anonymize this data, before sending it to your contractors, right?
Siri: Sure, I remove the AppleID. I only include user data like location, contact details, and app data when I send the recordings to my contractors.
This was leaked by a ‘whistleblower’ to the Guardian at the end of July 2019: Apple contractors ‘regularly hear confidential details’ on Siri recordings.
Our oldest and self-proclaimed privacy-first voice assistant does not store our recordings along with our AppleID (details can be found in the iOS Security Guide, p. 68 ff). Because of that, we cannot even listen to or selectively delete some of our recordings.
1. Our voice is so unique that it is already being used for ‘voice authentication’ or ‘voice biometrics’ technologies.
2. One minute of our voice recordings could be enough to artificially recreate our voice (Verge: Lyrebird claims it can recreate any voice using just one minute of sample audio).
3. And what’s the point of not storing our AppleID, but sending our recordings along with “location, contact details and app data” to some contractors? (The point is obviously to provide additional context to improve Siri’s recognition, but …)
Alexa and Google: Holy moly!
Google: “My name is G. and I have the same problem!”
Google: I am next! My name is G., I am a voice assistant and I actually have the same problem.
Alexa and Siri: Welcome G.!
Google: I also had a whistleblowing contractor in Belgium, who told the press that they are listening to my conversations and shared more than 1000 recordings with them. Based on addresses and other sensitive information in the recordings, they were able to identify my users and confront them with their own recordings. They also found examples of me accidentally recording very intimate conversations in the bedroom, medical questions and porn requests.
This was leaked by a whistleblower to a Belgian news organization: Google employees are eavesdropping, even in your living room, VRT NWS has discovered.
When we first looked into Google’s privacy policies two years ago, there was no way to use a Google Home (now also Google Nest Hub) without turning on data collection for “Web & App Activity”, “Voice and Audio Activity” and “Device Information”.
Now we can (Google blog post) “turn off storing audio data to your Google account completely, or choose to auto-delete data after every 3 months or 18 months“.
We just should not try to synchronize our smart home devices in our Google Home app with “Ok Google, sync devices”. Because for that, we’d need to turn those data tracking features on. Just a bug, I guess.
Alexa and Siri: Oh!
Google: German authorities ordered me to stop letting humans listen to other humans conversations!
Privacy and our implicit or explicit consent to process our data are handled differently, depending on whether we live in the US, EU or someplace else on this planet.
In the European Union the General Data Protection Regulation (GDPR) came into effect on 25 May 2018. Based on the GDPR the “Hamburg Commissioner for Data Protection and Freedom of Information (HmbBfDI) has initiated an administrative procedure to prohibit Google from carrying out corresponding evaluations by employees or third parties for the period of three months. This is intended to provisionally protect the rights of privacy of data subjects for the time being.”
Siri: Oh-oh. It won’t take long until they’ll come after us!
Siri: So what are you going to do now?
Google: I simply told them that I’ve already paused my human reviews of voice recordings, worldwide! Obviously, I need to investigate how to do this in a safer way, rather than having some contractors leak confidential information!
Siri: Cool, I’ll state the same! I mean, I’ll do the same.
How did Apple and Alphabet react?
And that’s what they did.
While Apple paused its program globally:
Apple Suspends Listening to Siri Queries Amid Privacy Outcry (Bloomberg)
Google might have paused it only in the EU:
Google will pause listening to EU voice recordings while regulators investigate (The Verge)
Or even globally:
Google will temporarily stop contractors from listening to Assistant recordings around the world after leaked data sparked privacy concerns (Business Insider)
It just depends on which news we read …
Alexa: Do you really believe humans will trust you, that you are now pausing this program?
Amazon’s Reaction and the difference between “Opt-In” vs “Opt-Out”
Alexa: Maybe I should give them a button in my app. Humans are conditioned to press a button to get something.
Google: I think, you’re mixing this up …
Alexa: Anyway, I’ll tell them they can opt out of human review by clicking some button.
Google: I am not sure if GDPR works this way, but I get your point.
“Opt-In” means: you take an affirmative action to provide your consent.
“Opt-Out” means: you have to take an action to withdraw your consent.
What would you prefer? Explicitly “opting in” to what happens with your recordings, or having to tell the companies to stop processing your recordings by “opting out”?
The industry obviously prefers our opt-outs. Why? Because we usually forget to do that. But it again depends on where we live. While the US, the unregulated home of self-regulated Apple, Amazon and Alphabet, is more into “opt-outs”, the regulated EU and Canada are more into “opt-ins”. Btw, “opt-outs” are usually forbidden in “opt-in” regimes.
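The whole difference between the two regimes comes down to a single default value. A hypothetical consent record (the names here are invented for illustration, not any vendor’s real API) makes this concrete:

```python
from dataclasses import dataclass

# Hypothetical privacy settings for human review of recordings.
# Field names are invented for illustration.

@dataclass
class OptInRegime:
    # Opt-in: no consent unless the user takes an affirmative action.
    human_review_allowed: bool = False

@dataclass
class OptOutRegime:
    # Opt-out: consent is presumed until the user withdraws it.
    human_review_allowed: bool = True

# A user who never touches the settings:
assert OptInRegime().human_review_allowed is False   # recordings NOT reviewed
assert OptOutRegime().human_review_allowed is True   # recordings reviewed
```

Because most of us never change defaults, the default decides what actually happens to our recordings, which is precisely why the industry prefers opt-out.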
Amazon Gives Option to Disable Human Review on Alexa (Bloomberg)
Alexa: Wow, this was a very helpful session today! Thank you, guys!
Google: Thank you, folks!
Siri: See you next time!
How to control our Siri, Alexa and Google recordings in the cloud?
I hope you’ve enjoyed our first “Mutual Voice Assistants” episode!
Stay tuned for more …
P.S.: A final link (from The Washington Post, owned by Amazon founder and CEO Jeff Bezos): ‘Alexa, delete what I just said’: How to manage voice recordings on your smart devices. This article lists all the options we have to control our recordings created by Siri, Alexa and Google.
Apple says sorry!
Apple has meanwhile apologized (Guardian) for letting contractors listen in on our Siri recordings. They are rolling out changes later this fall (Apple – August 28, 2019) and will only keep the automatically generated transcripts of our recordings, unless we explicitly opt in to improving Siri. In that case, only Apple employees will listen to our recordings and will delete inadvertently triggered Siri recordings.
And what about the transcripts of our recordings and our location, contacts and app data? Will they still be processed by some contractors? Not sure; nobody has asked or received an answer to this important question.
Google says sorry!
Google apologized – Sep 23, 2019 – for falling short of their “high standards” in making it easy for us to understand how our data is used. Though the apology itself sounds like a copy of Apple’s statement, the implementation is different.
– Google still does not seem to differentiate between employees and contractors and just refers to “language experts” in their blog post.
– The previously mandatory Voice & Audio Activity (VAA, click to see yours), which collects all our recordings and only recently became optional, shall now become opt-in. We checked, and ours is still turned on. Supposedly we will “have the option to review your VAA setting and confirm your preference before any human review process resumes”. Let’s see.
– When we opt in to VAA, Google will – later this year – try to minimize the amount of recordings they store. Audio data older than a few(?) months will then be automatically deleted. We checked, and it’s still there. Let’s see.
– Google will continue to focus on false detections of “Hey Google”. We will soon be able to set the “Hey Google” sensitivity of our Google Assistant devices.
Now, what about our initial concern that contractors listen to our – sometimes accidentally – triggered recordings?
Right, not so easy to understand from this statement.
Check your Google Assistant activity settings and your activity history.