Alexa recorded couple's conversation, then sent it to contacts
An American couple found their Amazon Alexa voice assistant system had recorded their conversations and sent them to a contact.
The couple, from Portland, Oregon, had been joking about Alexa listening in on them. Then the wife took a call from one of her husband's employees, who warned that they were being hacked and should unplug their devices.
Danielle, who did not want to give her last name, said: "He proceeded to tell us that he had received audio files of recordings from inside our house. At first, my husband was, like, 'No you didn't!' He said, 'You sat there talking about hardwood floors.' And we said, 'Oh gosh, you really did hear us.'"
Although the conversation was not highly personal, Danielle said she still felt "invaded". She unplugged all of her devices and contacted Amazon, which sent engineers to her house. Danielle said they apologised to her profusely. However, she says the device did not audibly advise her that it was preparing to send the recording, as it is programmed to do.
Amazon has blamed an "unlikely" series of events for the behaviour of the Alexa, which is installed in every room of the couple's house to control heating, lighting and security. The company insists the devices are not always listening and only "wake up" when the word "Alexa" is spoken. A spokesman said that, in this case, a word in background conversation must have sounded like "Alexa". Further conversation and background noise were then interpreted as a request to send a message and as confirmation of it.
Amazon has aroused suspicion in the past by encouraging customers to place listening devices in every room. It has also filed patent applications for more invasive listening devices that record continuously.
One included an algorithm that would analyse when people say they "love" or "bought" something. This patent included a diagram in which two people talking on the telephone were given separate targeted advertisements after they hung up. There were also worries in 2016 when scientists found that voice assistants such as Alexa could be woken up by sounds unintelligible to humans.
The researchers found that commands could be hidden in white noise, causing the device to switch on and visit websites without being asked to do so.
In May, these findings went further, with researchers claiming they could embed commands directly into recordings of music or spoken text. This could mean that as you listen to music, the voice assistant may hear an instruction to send a message. (© Daily Telegraph, London)