Voice-controlled devices: how safe are they, and are you ready?
on 24 May, 2017
It's only a matter of time before voice-controlled appliances and programs become an indispensable part of our everyday lives. The clunky, difficult-to-use systems of the past have been replaced with sleeker, more intuitive solutions that make digital life almost effortless.
Today's systems can capture a user’s spoken words with an astonishing level of accuracy and then analyse, interpret and act on their intentions. It’s possible to open websites, conduct online searches, and even interact with automobiles and home automation devices using voice recognition technologies like Apple’s Siri or Google Now. What’s more, many international banks are switching from PINs to voice ID for account access.
But before we get carried away with the benefits of hands-free voice tech, it's prudent to take a step back and recognise some of its vulnerabilities and how these flaws impact its overall security for users.
Today’s cybercriminals are much better at impersonating people than many realise, and most voice-control hacks are carried out via voice impersonation. Scammers can clone audio captured from a phone call or an online video to create a fraudulent voice sample. Compared with other types of biometric authentication, like fingerprints or retina scans, the risk of a clone attack via voice impersonation is much higher, raising some critical security red flags. So how can you take advantage of voice recognition without giving up your personal privacy to the trolls of cyberspace?
First, if you're one of the many people who have linked their credit card and e-wallet info to their phones, your privacy is at immediate risk of being exploited. Instead of relying solely on voice for your interactions with your device, use a PIN, password or other security feature that's keyed to some type of input other than your voice. Even if someone figures out how to emulate or duplicate your vocal footprint, you'll still have a measure of safety.
Second, be especially watchful if you regularly communicate with connected home devices or a home automation hub via voice command. Since your equipment is always awaiting your spoken instructions and processing all the sounds in a given environment, a hacker could obtain valuable information regarding your personal schedule by intercepting the signals that these products send and receive. This could lead to a burglary or home invasion. Voice recognition is also vulnerable to its own mistakes, and has facilitated several instances of rogue Amazon purchases in the United States. One little girl asked an Alexa gadget to “play dollhouse” with her, and inadvertently ordered a KidKraft Sparkle mansion dollhouse as well as “four pounds of sugar cookies” from the online retailer.
"Privacy and security of IoT is big right now following recent attacks like the Mirai botnet and malware targeting specific brands of smart TVs," says Cris Thomas, a security expert and spokesperson for Tenable Network Security. "While I can't speak with authority on Alexa specifically, one of the privacy risks of IoT devices is that they are always listening." If you’re one of the many consumers throughout the UK who has purchased an Amazon Echo or Echo Dot since their release last September, you can take a few basic precautions by keeping your firmware and software up to date and by changing your user credentials from their factory-set defaults. Amazon also maintains an information page with Alexa and Alexa Device FAQs for users in search of security support.
Keeping abreast of new security risks is one of the best ways to make sure your information stays safe in the hands of cutting-edge technology. New types of threats, exclusive to voice-activated systems, are emerging, and it’s up to users themselves to limit the risk of unintended and potentially dangerous consequences.
Because of the differences between the ways that humans and computers interpret sounds, it's possible to create audio snippets that sound like meaningless noise to people but that speech-recognition algorithms recognise as instructions. By playing a recording of this audio within range of a target device, hackers could have it open a website that contains malware and then gain access to the information stored on the device. The best way to counter this threat is to keep your phone or tablet patched to the latest OS version, which will eliminate many of the vulnerabilities that could otherwise expose you to harm.
Several companies are taking notable action to defend clients against the misuse of voice features. Barclays Bank, for example, claims that its voice authentication process evaluates approximately 100 vocal characteristics to securely verify the identity of customers who use its telephone banking services. This provides a measure of peace of mind to those who are concerned about their account details being compromised. Most organisations don't possess the resources of Barclays, though, so it may be prudent to avoid revealing personal info to automated phone systems unless you trust the competence and thoroughness of the firm you're dealing with.
Keeping your personal information safe requires a few new strategies with the introduction of automated voice interfaces. Fortunately, you can reduce the chances of falling prey to a security threat by taking a few commonsense measures. Early adopters of voice recognition tech should understand that privacy risks go hand in hand with the excitement of participating in the first wave of a world-changing technological movement. Luckily, many of the recommended steps are merely extensions of good tech privacy behaviour, so implementing them shouldn't be far from the scope of precautions you’ve already put into practice.
Beth Kotz is a freelance writer and contributor for numerous home, technology, and personal finance blogs. She graduated with a BA in Communications and Media from DePaul University in Chicago, IL, where she continues to live and work.