Alexa wants to gossip

POSTED: May 25, 2018

Alexa is Amazon’s AI-ish household assistant. It lives inside a device called an Echo and acts as a hub: you connect it to other devices, from televisions and phones to heating systems. It sits listening to everything you say, waiting to hear the word “Alexa”, at which point it interprets the next thing you say as a command and does what it thinks you have told it to do.

The Register reported today on an issue a woman from Portland, Oregon had with her Alexa-stuffed house.

She had spoken to her local television station, KIRO7, about the problem.

“My husband and I would joke and say I’d bet these devices are listening to what we’re saying,” said Danielle, who did not want us to use her last name. Every room in her family home was wired with the Amazon devices to control her home’s heat, lights and security system.

But Danielle said two weeks ago their love for Alexa changed with an alarming phone call. “The person on the other line said, ‘unplug your Alexa devices right now,'” she said. “‘You’re being hacked.'”

That person was one of her husband’s employees, calling from Seattle.

“We unplugged all of them and he proceeded to tell us that he had received audio files of recordings from inside our house,” she said. “At first, my husband was, like, ‘no you didn’t!’ And the (recipient of the message) said ‘You sat there talking about hardwood floors.’ And we said, ‘oh gosh, you really did hear us.'”

The Register asked Amazon what exactly had happened, and Amazon replied:

The Echo woke up due to a word in background conversation sounding like “Alexa.” Then, the subsequent conversation was heard as a “send message” request. At which point, Alexa said out loud “To whom?” At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, “[contact name], right?” Alexa then interpreted background conversation as “right.” As unlikely as this string of events is, we are evaluating options to make this case even less likely.

If I understand this correctly, the issue did not spring from a software error per se, but from the underlying logic that powers Alexa. It did what its programmers wanted it to do, but it misinterpreted the sounds it heard at several consecutive steps. It also presumably spoke to the couple at a volume too quiet for them to hear as they concentrated on their conversation.
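The chain Amazon describes can be sketched as a pipeline of guesses, each one plausible on its own. This is purely an illustration: the function, contact list, and step names below are hypothetical, not Amazon’s actual implementation.

```python
# Hypothetical sketch of the four-step misinterpretation chain Amazon
# described. Names and structure are illustrative only.

CONTACTS = ["Bob", "Carol"]  # hypothetical contact list

def interpret(heard_wake, heard_intent, heard_contact, heard_confirm):
    """Walk the four steps from Amazon's account of the incident."""
    if not heard_wake:                  # step 1: wake-word detection
        return "idle"
    if heard_intent != "send message":  # step 2: intent recognition
        return "idle"
    if heard_contact not in CONTACTS:   # step 3: contact matching
        return "idle"
    # step 4: spoken confirmation ("[contact name], right?")
    if heard_confirm == "right":
        return f"message sent to {heard_contact}"
    return "cancelled"

# A single false positive is harmless; four in a row send the message.
print(interpret(True, "send message", "Bob", "right"))
```

Seen this way, the problem is that the confirmation step, which is supposed to be the safety net, is itself just another speech-recognition guess that background chatter can trip.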

This raises several questions.

It raises questions about how Amazon, Google and others can program around problems such as this. It also raises a more fundamental issue: what does Alexa do that offers such benefits that sensible people would risk this kind of malfunction?

What does Alexa actually do that people can’t more usefully do themselves?