

Private photos captured by a robot vacuum cleaner end up leaked on social media

How much do you trust your AI-powered home devices?

In this day and age, having smart devices powered by AI (artificial intelligence) inside our homes and offices is normal. It wouldn’t shock us to see ‘Alexa’, an AI assistant, play music for us or turn our lights on or off. That’s how much present-day technology shapes our routines.

These devices make our lives easier, that’s for sure. However, they may also come with dangers we are not aware of.

MIT Technology Review reported that it had obtained 15 screenshots from video recordings captured by robot vacuum cleaners in 2020. These sensitive photos were leaked to private groups on Facebook and Discord. Among them are images of a woman sitting on her toilet and of a child lying on the floor. According to the technology site, “the photos were sent to Scale AI, a startup that contracts workers around the world to label audio, photo, and video data used to train artificial intelligence.” It also specified that gig workers from Venezuela were the ones who posted the photos.

iRobot, the world’s largest robotic vacuum manufacturer, which Amazon recently agreed to acquire for $1.7 billion, confirmed that the sensitive photographs in question were taken in 2020 by special development versions of its Roomba J7 series robot vacuums.

In a LinkedIn post, the company’s CEO, Colin Angle, explained that the collection of data is a necessary part of the development process to train machine learning algorithms.

He assured the public, however, that the “development robots are modified with software and hardware that is not present on the production robots that consumers purchase.” A neon green label that reads “video recording in process” is also attached to them. The participants who were provided with these devices, paid data collectors and employees, signed agreements acknowledging that the company would collect data streams.

The statement also confirmed that the company is “terminating its relationship with the service provider that leaked the images, actively investigating the matter, and taking measures to help prevent a similar leak by any service provider in the future.”

The incident shows that the sensitive and private recordings collected by smart devices are not completely safe, no matter how big or reputable the company is. Lapses can still occur, and our data can still land in the hands of irresponsible workers.

This is not the first time AI-powered devices have put the private data they collect at risk. Back in 2020, an AI company leaked more than 2.5 million medical records, including names, insurance records, medical diagnosis notes, and even payment records. And just recently, it was reported that a popular AI app automatically generated nude photos of some users even though they had never uploaded any not-safe-for-work photos.

AI-powered smart devices offer convenience, and they are here to stay. However, let this serve as one more reminder that errors can still occur and that our sensitive information is not always safe with them. Using them can be harmless and even beneficial, but we should be extra cautious about the data streams we feed into them.

