Uncovered: 1,000 phrases that incorrectly trigger Alexa, Siri, and Google Assistant


(Image credit: Schönherr et al.)

As Alexa, Google Home, Siri, and other voice assistants have become fixtures in millions of homes, privacy advocates have grown concerned that their near-constant listening to nearby conversations could pose more risk than benefit to users. New research suggests the privacy threat may be greater than previously thought.

The findings demonstrate how common it is for dialog in TV shows and other sources to produce false triggers that cause the devices to turn on, sometimes sending nearby sounds to Amazon, Apple, Google, or other manufacturers. In all, researchers uncovered more than 1,000 word sequences—including those from Game of Thrones, Modern Family, House of Cards, and news broadcasts—that incorrectly trigger the devices.

“The devices are intentionally programmed in a somewhat forgiving manner, because they are supposed to be able to understand their humans,” one of the researchers, Dorothea Kolossa, said. “Therefore, they are more likely to start up once too often rather than not at all.”
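The trade-off Kolossa describes can be pictured as a threshold decision: the wake-word detector assigns a confidence score to each stretch of audio, and the device activates whenever that score clears a tuned cutoff. The sketch below is purely illustrative and not based on any vendor's actual implementation; the function, phrases, and scores are hypothetical.

```python
# Illustrative sketch only: a toy wake-word gate, not any vendor's real detector.
# The detector emits a confidence score in [0, 1] for each audio window;
# the device "wakes" whenever the score exceeds a tuned threshold.

def should_wake(confidence: float, threshold: float) -> bool:
    """Return True if the device should start listening and uploading audio."""
    return confidence >= threshold

# Hypothetical scores a detector might assign to different phrases.
observed = {
    "Alexa":     0.97,  # intended wake word
    "election":  0.62,  # acoustically similar phrase (false-trigger candidate)
    "a letter":  0.41,
    "unrelated": 0.05,
}

for threshold in (0.9, 0.6):
    triggered = [phrase for phrase, score in observed.items()
                 if should_wake(score, threshold)]
    print(f"threshold={threshold}: wakes on {triggered}")

# A more permissive (lower) threshold also wakes on near-misses like "election",
# trading accidental recordings for fewer missed activations -- the "forgiving"
# behavior the researchers describe.
```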


“Election” can trigger Alexa; “Montana” can trigger Cortana.
