Within the last couple of weeks, Apple has found itself in the middle of another privacy and security misstep. A report surfaced stating that contractors were given access to Siri recordings and would often hear private conversations while ‘grading’ Siri commands to improve the service. This raised concerns, as it meant Siri could effectively be used to spy on users without their knowledge.
Apple was employing third-party contractors to listen to ‘less than 1%’ of Siri commands in an effort to improve the assistant. The problem is that Siri sometimes activates accidentally, meaning these contractors ended up listening in on couples having sex, confidential conversations between doctors and patients, and even criminal activity, such as drug deals.
The initial whistleblowing report came from The Guardian and has led to Apple releasing its own statement via Bloomberg. Apple is “suspending Siri grading globally” while the company conducts a “thorough review” of its grading practices. In a future software update, users will also gain the ability to opt out of Siri grading.
This isn’t exactly a great look for Apple. The company has championed privacy and security for several years now, yet flaws clearly remain in its practices. Siri grading will likely return, but at least now iOS and Mac users will know that such a program exists and can choose to opt out.
KitGuru Says: This is quite the privacy blunder, although I do understand why Apple chose to do things this way. Siri can mistake normal conversation for a wake command from time to time, and Apple needs user data to figure out how to put a stop to that. Still, a program like this should never have been a secret, and an opt-out should have been offered from the very beginning. Hopefully we don’t encounter similar screw-ups further down the line, especially with Apple moving into the financial sector with the Apple Card.