Artificial Intelligence Devices: All the Better to Hear You With, My Dear

How many of you reading this have a small sticker covering your front camera on your laptop because you’re paranoid that someone may be somehow watching you?

Me too… I want as much privacy as possible in this technology-dominated world. I’ve watched far too many conspiracy theory videos to be living a camera-sticker-free existence.



Privacy is a BIG concern at the moment, especially after Facebook’s recent blunder. In case you aren’t familiar, personal data from tens of millions of users was accessed in order to sway US election results.

The latest privacy concern regards our smart home assistant devices listening in on our conversations a little too closely. These include the Apple HomePod, Amazon’s Echo and Echo Dot (powered by Alexa), and Google Home. Each of these contraptions comes wired with up to seven microphones and noise-cancelling capabilities… All the better to hear you with, my dear.

For the purpose of this investigation (yes, please whip out your spy goggles), we are exploring the two devices most often accused of actually listening in on our conversations: Amazon’s Echo and Google Home, which also happen to be the two best sellers.



So. Are they really listening ALL of the time?

Well yeah… Kind of! These devices are listening the whole time they are turned on; whether they activate depends on hearing their specific ‘wake word’ or phrase. It has been reported that Amazon intended to use Alexa to build profiles on its consumers, monitoring phrases like ‘I love skiing’ to target advertising. This means that we are voluntarily inviting Big Brother to come watch over us! Orwell called it.



These home assistants have been extremely sensitive to their ‘wake words’ as of late. My personal favourite example of this is the phrase “cocaine noodles”, which Google Home supposedly interprets as “OK Google”.

There have been a few more serious cases where people’s devices activated and performed commands without their knowledge. One recent example is when Amazon’s Echo sent a recording of a couple’s conversation to someone in their contact list. Luckily it was only a conversation about wooden flooring, but wowee, that’s creepy.



Here’s what Amazon had to say:

“Amazon takes privacy very seriously. We investigated what happened and determined this was an extremely rare occurrence. We are taking steps to avoid this from happening in the future.”

“Echo woke up due to a word in background conversation sounding like ‘Alexa.’ Then, the subsequent conversation was heard as a ‘send message’ request…  Alexa then interpreted background conversation as ‘right.’ As unlikely as this string of events is, we are evaluating options to make this case even less likely.”

How scary is that? It literally confirms the moral panic most of us have about our devices listening in on our private conversations.



This leads us to the question: are these devices secure? Can they be hacked?

We already know that companies can access certain information that has been recorded on our smart home devices… So does this mean that other people can too? Yes, yes it does.

Researchers at the University of California, Berkeley, and Georgetown University have shown they can send Amazon’s Alexa and Google Home secret audio instructions that are inaudible to the human ear. This research suggests that someone other than us (the supposed sole owners of these devices) can control and use our digital home assistants without us even knowing!



This is all (semi) well and good if it’s just companies looking for some market research… but what if it’s more than that? What about hackers?

Researchers in China and the United States have demonstrated that artificial intelligence systems can be secretly activated. If misused, this technology could be used to “unlock doors, wire money or buy stuff online – simply with music playing over the radio” by embedding hidden commands into recordings.
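The basic trick behind these inaudible commands is surprisingly simple: shift an ordinary voice command up onto an ultrasonic carrier that humans can’t hear, and let the microphone’s own imperfections demodulate it back down. Here is a minimal NumPy sketch of that modulation idea — the sample rate, carrier frequency, and the toy 1 kHz tone standing in for a “voice command” are all assumptions chosen for illustration, not the researchers’ actual parameters, and this is nowhere near a working attack:

```python
import numpy as np

# Assumed values for the sketch, not from the actual research.
SAMPLE_RATE = 192_000   # Hz: high enough to represent ultrasound
CARRIER_FREQ = 30_000   # Hz: above the ~20 kHz limit of human hearing

def modulate_ultrasonic(command: np.ndarray) -> np.ndarray:
    """Amplitude-modulate an audible baseband signal onto an ultrasonic carrier."""
    t = np.arange(len(command)) / SAMPLE_RATE
    carrier = np.sin(2 * np.pi * CARRIER_FREQ * t)
    # Classic AM: the result contains only frequencies near the carrier,
    # so humans hear nothing, but a nonlinear microphone can recover
    # the baseband command.
    return (1 + command) * carrier

# Toy "command": one second of a 1 kHz tone standing in for speech.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
command = 0.5 * np.sin(2 * np.pi * 1_000 * t)
signal = modulate_ultrasonic(command)

# Check where the modulated signal's energy actually sits.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / SAMPLE_RATE)
print(round(freqs[np.argmax(spectrum)]))  # 30000 — well above audible range
```

The point of the sketch is only that the strongest frequency component lands at 30 kHz, far beyond human hearing, even though the signal still “contains” the 1 kHz command as sidebands.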



It’s not all bad news… If you really MUST have one of these newfangled devices, then there are ways we can use them while maintaining some privacy.

Here’s what we CAN do about this

If you happen to own one of these devices already or are looking at owning one, you should review privacy settings and activity authorisation. Looking at these agreement contracts probably feels a little foreign since we are all used to instantaneously clicking the ‘I agree’ button because ain’t nobody got time to read that novel. That might just have to change.

Pam Dixon, executive director of the World Privacy Forum, said that many people are not aware that voice recordings on these smart home devices are stored until you delete them. “People should think about what they are asking their voice assistants and know that they can delete that information,” she said.

It might also be useful to simply turn the microphone or the entire device off when it’s not in use, for complete privacy. Otherwise, here are some additional quick steps for a more secure home assistant device. Take note, my friend.

  • Change your ‘wake word’ to something that is not likely to be used in everyday conversation or misinterpreted
  • Ensure that your home network and linked accounts store no sensitive information such as saved bank details, passwords or addresses
  • Strengthen your passwords so they can’t be hacked easily
  • Go through and delete any old recordings that may be stored on the device
  • Read your privacy agreements



The cases of deception we touched on really do demonstrate that leaps and bounds in artificial intelligence (AI) bring leaps and bounds in technological manipulation.

If this information gives you the heebie-jeebies like it does me, now might be the prime time for stickering, unplugging and throwing all AI devices into a river and going to live off the grid.

