Apple Contractors Might Have Listened to You Having Sex Via Siri

Be careful about what you say around Siri—you don’t know who is listening.

Apple contractors have regularly listened to recordings of couples having sex, drug deals and confidential medical information—just to name a few. Mandy Cheng/AFP/GettyImages

OK, let’s say you’re a disgruntled contractor working for Apple, and you have the chance to listen in on complete strangers’ private conversations via Siri. Wouldn’t you? Don’t pretend you’d claim some moral high ground; you’d be curious to eavesdrop on the lives of strangers. But from a consumer’s point of view, you’d find this random surveillance very creepy and unsettling, right?


Well, it’s happening—whether you like it or not.

SEE ALSO: Amazon Admits Alexa Saves Your ‘Deleted’ Conversations

Last Friday, The Guardian reported that Apple (AAPL), get this, is paying contractors to listen to recordings of private Siri conversations as part of their job. Yes, really.

This civilian snooping is framed as quality control, or, as Apple calls it, “grading”: work meant to improve Siri’s capabilities as a voice assistant.

Just to restate: this is Apple, not the NSA. (Insert Edward Snowden joke here.) And just as we learned with FaceApp, you most likely agreed to let Apple do this without ever reading Siri’s terms of use.

And even if you did happen to read the terms of service, The Guardian notes that Apple doesn’t explicitly state that this eavesdropping work (i.e., listening to your Siri recordings) is performed by actual human beings.

So how does it work?

Apple’s “official story” is that this snooping helps Siri with dictation: the reviews are meant to build a better Siri of tomorrow, one that can more accurately understand and recognize what we say.

So, a portion of Siri’s interactions are sent to contractors around the world, who are asked to grade the responses against a set of criteria: whether the request was intentional and the response appropriate, or whether Siri was set off accidentally by, say, a dog barking… or by you sharing the most intimate details of your life with a loved one in the private confines of your home.

Apple contractors have regularly listened to all manner of private moments, including recordings of couples having sex, drug deals and confidential medical information, just to name a few.

Apple has stated that less than 1% of daily Siri activations are used for review (a very small percentage, unless that 1% is… you!), that this type of “grading” is done in secure facilities, and that contractors are required to abide by Apple’s strict confidentiality requirements.
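For the technically curious, here is a minimal, purely illustrative sketch of what a sampling-and-grading pipeline like the one described above might look like. To be clear, this is an assumption-laden toy based only on The Guardian’s description, not Apple’s actual tooling; every class, function and field name here is hypothetical.

```python
import random
from dataclasses import dataclass

@dataclass
class SiriInteraction:
    """One recorded Siri exchange (hypothetical structure)."""
    audio_id: str
    transcript: str  # what Siri thought it heard
    response: str    # what Siri answered

def sample_for_review(interactions: list, rate: float = 0.01) -> list:
    """Randomly pick a small fraction of the day's activations for human
    grading; Apple says less than 1% are reviewed."""
    return [i for i in interactions if random.random() < rate]

def grading_sheet(interaction: SiriInteraction) -> dict:
    """A blank checklist matching the reported grading criteria:
    was the activation intentional, and was the response appropriate?
    Both judgments are left for a human listener to fill in."""
    return {
        "audio_id": interaction.audio_id,
        "intentional_activation": None,  # e.g. False if a barking dog woke Siri
        "response_appropriate": None,
    }

# Example: sample a (fake) day of interactions and print blank grading sheets.
if __name__ == "__main__":
    day = [SiriInteraction(f"clip-{n}", "hey siri…", "…") for n in range(1_000)]
    for clip in sample_for_review(day):
        print(grading_sheet(clip))
```

The point the reporting makes is simply that the step this sketch leaves blank, actually listening and grading, is done by people, not software.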

Oh boy, oh boy, oh boy. Let’s hope none of these Apple contractors goes rogue after becoming disgruntled and uses these private, personal recordings for evil: blackmail, extortion or public shaming. After all, a whistleblower has already disclosed this information to The Guardian.

This type of secret-human-surveillance-on-our-everyday-lives isn’t limited to Apple. Amazon (Alexa) and Google (Assistant) have also disclosed that human workers listen to recorded conversations—to help improve functionality.

The big difference is that both Amazon (AMZN) and Google (GOOGL) allow customers to opt out of having their recorded conversations listened to. Meanwhile, our friends at Apple don’t offer that privacy protection.

Yes, more dystopian tech to worry about as we essentially let Big Brother keep tabs on us through all of our fun devices and the sites we use on a daily basis.

Just be very, very careful about what you say around Siri—you don’t know who is listening.
