The Vette Barn


Mike Mercury 07-29-2019 8:19am

Workers hear drug deals, medical details and people having sex
 
Apple contractors 'regularly hear confidential details' on Siri recordings

Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or “grading”, the company’s Siri voice assistant, the Guardian has learned.

Although Apple does not explicitly disclose it in its consumer-facing privacy documentation, a small proportion of Siri recordings are passed on to contractors working for the company around the world. They are tasked with grading the responses on a variety of factors, including whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with and whether Siri’s response was appropriate.

Apple says the data “is used to help Siri and dictation … understand you better and recognise what you say”.

But the company does not explicitly state that that work is undertaken by humans who listen to the pseudonymised recordings.

Apple told the Guardian: “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.” The company added that a very small random subset, less than 1% of daily Siri activations, are used for grading, and those used are typically only a few seconds long.

A whistleblower working for the firm, who asked to remain anonymous due to fears over their job, expressed concerns about this lack of disclosure, particularly given the frequency with which accidental activations pick up extremely sensitive personal information.

Siri can be accidentally activated when it mistakenly hears its “wake word”, the phrase “hey Siri”. Those mistakes can be understandable – a BBC interview about Syria was interrupted by the assistant last year – or less so. “The sound of a zip, Siri often hears as a trigger,” the contractor said. The service can also be activated in other ways. For instance, if an Apple Watch detects it has been raised and then hears speech, Siri is automatically activated.

The whistleblower said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”

That accompanying information may be used to verify whether a request was successfully dealt with. In its privacy documents, Apple says the Siri data “is not linked to other data that Apple may have from your use of other Apple services”. There is no specific name or identifier attached to a record and no individual recording can be easily linked to other recordings.

Accidental activations led to the receipt of the most sensitive data that was sent to Apple. Although Siri is included on most Apple devices, the contractor highlighted the Apple Watch and the company’s HomePod smart speaker as the most frequent sources of mistaken recordings. “The regularity of accidental triggers on the watch is incredibly high,” they said. “The watch can record some snippets that will be 30 seconds – not that long but you can gather a good idea of what’s going on.”

Sometimes, “you can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch.”

The contractor said staff were encouraged to report accidental activations “but only as a technical problem”, with no specific procedures to deal with sensitive recordings. “We’re encouraged to hit targets, and get through work as fast as possible. The only function for reporting what you’re listening to seems to be for technical problems. There’s nothing about reporting the content.”

As well as the discomfort they felt listening to such private information, the contractor said they were motivated to go public about their job because of their fears that such information could be misused. “There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad. It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on.

“Apple is subcontracting out, there’s a high turnover. It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].”

The contractor argued Apple should reveal to users this human oversight exists – and, specifically, stop publishing some of its jokier responses to Siri queries. Ask the personal assistant “are you always listening”, for instance, and it will respond with: “I only listen when you’re talking to me.”

That is patently false, the contractor said. They argued that accidental triggers are too regular for such a lighthearted response.

https://i.imgflip.com/2424qs.jpg

dvarapala 07-29-2019 10:21am

I don't know about Siri, but when you wake up an Echo device it responds with a short tone and the blue ring on top lights up. When that happens it's really easy to STFU. :D

04 commemorative 07-29-2019 10:43am

So I called my friends' home phone after they left here on a 4-hour drive to go home. I called an hour before they got home and left this message:
"Hey Dick, it's me, just wanted to see if this worked... let me know... Alexa, play 70's music!" They walked into their house and Billy Joel was playing! lol It works if the answering machine is close enough to Alexa!

z06psi 07-29-2019 11:40am

Nope, no thanks.

mrvette 07-29-2019 11:57am

I got a note from my son this AM. He made reservations from Ca. with his son, so the two of them are flipping into Bawltimore to see my daughter's latest offering, another cousin, about a month old now. So we're meeting at BWI on the 29th of Aug., spending some time there, and then they're going off to the eastern shore to see their mom. I'm going to drive south to see some old antique friends, probably for the last time, and may stop to see a couple of other friends too, then flip back south on another airline on the 5th of Sept.

SO that's the plot, and it took TWO HOURS and a final phone call to make the stupid travel/air connections for me to get into BWI at the same time they are, so I can pick up a car; my son made the B&B reservations already. Orbitz travel is really screwed up, I should have gone to the airlines directly and booked the flights. What a mess, two hours for a 20 minit effort. Kiss my ever loving ASS. :issues::issues:

Mike Mercury 07-29-2019 12:22pm

who would want to listen in on me :confused:

worker at the Siri monitoring office:

"apparently... Mike Mercury is having another difficult bowel movement."

Aerovette 07-29-2019 12:54pm

I struggled with the decision about buying an Echo Dot. Then I realized how absolutely uninteresting listening in on me would be. Now I have 8 of them scattered through the house.

That being said, I am also testing them. I have an object that I name at least 5 times while at my house. Something I have never googled or asked Alexa about. I say it over and over from time to time. The MOMENT I see any ads or pop-ups for this particular thing, the Echos will all be for sale on eBay.

I have automated a lot of things and it's all pretty cool... for now.

C5SilverBullet 07-29-2019 2:58pm

Quote:

Originally Posted by aerovette (Post 1679223)
I struggled with the decision about buying an Echo Dot. Then I realized how absolutely uninteresting listening in on me would be. Now I have 8 of them scattered through the house.

That being said, I am also testing them. I have an object that I name at least 5 times while at my house. Something I have never googled or asked Alexa about. I say it over and over from time to time. The MOMENT I see any ads or pop-ups for this particular thing, the Echos will all be for sale on eBay.

I have automated a lot of things and it's all pretty cool... for now.

If they want to listen to me yell at my kids, go ahead. :lol:

carlton_fritz 07-29-2019 7:08pm

Quote:

Originally Posted by 04 commemorative (Post 1679204)
So I called my friends' home phone after they left here on a 4-hour drive to go home. I called an hour before they got home and left this message:
"Hey Dick, it's me, just wanted to see if this worked... let me know... Siri, play 70's music!" They walked into their house and Billy Joel was playing! lol It works if the answering machine is close enough to the answering device!

Siri, play Afternoon Delight on repeat.

04 commemorative 07-29-2019 9:27pm

I meant to say "Alexa", not Siri... I have corrected it in my post.

