
Amazon’s Alexa recorded a family’s private conversation—and sent it to a random contact

Two months after many Amazon Echo users reported hearing unprompted laughter from Alexa, another glitch in the software has caused the machine to record and then send a family’s private conversation to a random person on their contacts list.

A woman named Danielle told KIRO-TV that she received a call from one of her husband's employees, who urged the couple to turn off their Amazon Echo speakers. The employee said he had received an audio file of a conversation that took place in their home in Portland. The employee lives in Seattle, 282 kilometers away.

“I felt invaded, like total privacy invasion,” Danielle said. She called Amazon immediately after the incident. An Alexa engineer she spoke to confirmed that the software had recorded a conversation in their home and sent it to the employee.

“He apologized like 15 times in a matter of 30 minutes and said, ‘We really appreciate you bringing this to our attention, this is something we need to fix’,” she said.

In a statement, Amazon explained what they think happened:

“Echo woke up due to a word in background conversation sounding like ‘Alexa’. Then, the subsequent conversation was heard as a ‘send message’ request. At which point, Alexa said out loud ‘To whom?’ At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, ‘[contact name], right?’ Alexa then interpreted background conversation as ‘right’.”
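The chain Amazon describes is a series of speech-recognition false accepts, each one passing control to the next step of the dialogue. The sketch below is a hypothetical simulation of that four-step flow, not Amazon's actual code; the function names and the naive substring matching are purely illustrative:

```python
# Hypothetical sketch of the misfire chain Amazon describes:
# a false wake-word match, a misheard "send message" intent,
# a contact name picked out of background speech, and a
# misheard confirmation. Not Amazon's real pipeline.

def transcribe_step(audio_snippet, expected_phrases):
    """Stand-in for speech recognition: returns an expected phrase
    if the background audio merely resembles it (simulated here
    with a naive substring check)."""
    for phrase in expected_phrases:
        if phrase in audio_snippet:
            return phrase
    return None

def alexa_pipeline(background_audio, contacts):
    """Walk the four-step dialogue from Amazon's statement.
    Each step can false-accept on ordinary conversation."""
    # Step 1: wake word falsely detected in background speech
    if transcribe_step(background_audio[0], ["alexa"]) is None:
        return None
    # Step 2: subsequent speech heard as a "send message" request
    if transcribe_step(background_audio[1], ["send message"]) is None:
        return None
    # Step 3: "To whom?" -- a word resembling a contact name
    recipient = transcribe_step(background_audio[2], contacts)
    if recipient is None:
        return None
    # Step 4: "[contact name], right?" -- background heard as "right"
    if transcribe_step(background_audio[3], ["right"]) is None:
        return None
    return recipient  # the message would be sent to this contact

# Four snippets of ordinary talk that happen to pass every check
overheard = ["...alexa...", "...send message...", "...john...", "...right..."]
print(alexa_pipeline(overheard, contacts=["john", "mary"]))  # -> john
```

The point of the sketch is that no single step needs to be wildly wrong: four individually plausible misrecognitions, chained together, are enough to send a recording to a contact.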

Amazon explained that the incident was an extremely rare occurrence, and that they are “evaluating options to make this case even less likely.”

Danielle’s family, however, will no longer be using their Echo speakers despite Amazon’s explanation.

“I’m never plugging that device in again — I can’t trust it,” Danielle said.







