Uncovered: 1,000 phrases that incorrectly trigger Alexa, Siri, and Google Assistant

“Election” can trigger Alexa; “Montana” can trigger Cortana.

That which must not be said

Examples of words and word sequences that produce false triggers include:

  • Alexa: “unacceptable,” “election,” and “a letter”
  • Google Home: “OK, cool,” and “Okay, who is reading”
  • Siri: “a city” and “hey Jerry”
  • Microsoft Cortana: “Montana”

The two videos below show a Game of Thrones character saying “a letter” and a Modern Family character uttering “hey Jerry,” falsely activating Alexa and Siri, respectively.

Unacceptable privacy intrusion

When devices wake, the researchers said, they record a portion of what’s said and transmit it to the manufacturer. The audio may then be transcribed and checked by employees in an attempt to improve word recognition. The result: fragments of potentially private conversations can end up in company logs.
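To see why phonetically similar phrases slip through, it helps to think of wake-word detection as a similarity score compared against a threshold. The toy Python sketch below is not any vendor’s actual pipeline; it only illustrates the trade-off the researchers describe: a looser threshold means fewer missed wake words but more accidental activations, and whatever follows a false accept is captured and sent upstream. The wake word, threshold value, and string-similarity scoring are all made up for illustration.

```python
# Toy illustration only -- not how Alexa, Siri, or Google Assistant actually work.
# A detector that accepts anything "close enough" to its wake word will also
# accept phonetically similar phrases, and everything after a false accept
# gets recorded and transmitted off-device.

from difflib import SequenceMatcher

WAKE_WORD = "alexa"
THRESHOLD = 0.55  # lower = more sensitive = more accidental activations


def wake_score(phrase: str) -> float:
    """Crude stand-in for an acoustic model: best character-level similarity
    between the wake word and any same-length window of the phrase."""
    text = phrase.lower().replace(" ", "")
    n = len(WAKE_WORD)
    windows = [text[i:i + n] for i in range(max(1, len(text) - n + 1))]
    return max(SequenceMatcher(None, w, WAKE_WORD).ratio() for w in windows)


def handle_utterance(phrase: str, following_audio: str) -> None:
    score = wake_score(phrase)
    if score >= THRESHOLD:
        # True or false accept, the device behaves the same way:
        # it captures the next snippet and sends it for processing.
        print(f"WAKE  {score:.2f}  uploading snippet: {following_audio!r}")
    else:
        print(f"sleep {score:.2f}  nothing leaves the device")


if __name__ == "__main__":
    handle_utterance("I sent you a letter", "rest of a private chat")
    handle_utterance("that is unacceptable", "doctor-patient discussion")
    handle_utterance("pass the salt", "dinner-table small talk")
```

Run as-is, the made-up threshold lets “a letter” and “unacceptable” score high enough to wake the toy detector, while “pass the salt” does not, mirroring how near-miss phrases can trigger real devices without the exact wake word ever being spoken.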

The risk to privacy isn’t solely theoretical. In 2016, law enforcement authorities investigating a murder subpoenaed Amazon for Alexa data transmitted in the moments leading up to the crime. Last year, The Guardian reported that Apple employees sometimes transcribe sensitive conversations overheard by Siri. They include private discussions between doctors and patients, business deals, seemingly criminal dealings, and sexual encounters.
