Amazon’s AI-based home security system is sending footage of users’ private moments to dozens of algorithm trainers halfway around the world, according to former employees – not unlike its Alexa “smart” speakers.
Amazon’s Cloud Cam home security device regularly sends video clips to employees in Romania and India, who help “train” its AI algorithms, according to five current and former employees who spoke to Bloomberg. The workers review the clips in order to help the system distinguish pet from threat – benign movement from malignant intruder. The only problem? Cloud Cam users have no idea they’re being watched by human eyes.
The Cloud Cam’s terms and conditions say nothing about humans watching footage from their security cameras, and an Amazon representative insisted that the only clips reviewed by employees are submitted voluntarily, for “troubleshooting” purposes. Other than that, “only customers can view their clips,” the spokeswoman told Bloomberg, insisting Amazon “take[s] privacy seriously.”
But there were no “obvious technical glitches” that would indicate the clips were selected for troubleshooting, and some included obviously private content – specifically, couples having sex – that customers were unlikely to want shared, two of the whistleblowers claimed. The auditors typically watch and annotate about 150 clips of 20 to 30 seconds each per day.
Amazon acknowledged that clips containing “inappropriate content” are flagged and discarded so they are not used in training the AI, but avoided explaining how the inappropriate content would have ended up in front of the trainers in the first place if – as the spokeswoman claimed – all clip submissions were voluntary.
While the company has tried to keep a lid on the program, barring employees from using their phones on the floor where Cloud Cam clips are audited, some of the juicier footage has leaked out, one source claimed.
This isn’t the first time Amazon has been caught secretly using humans to “help” its AI devices along. Earlier this year, it emerged that thousands of Amazon employees and contractors were listening to audio snippets from the company’s “smart” speaker, Alexa, unbeknownst to customers. The company was forced to update its privacy policies with a disclaimer and streamline the process for deleting recordings, which pacified some customers, until it came out a few months later that even deleting a recording didn’t necessarily remove it from Amazon’s servers. The company now offers Alexa users the option of excluding their recordings from human review.