by Graham Pierrepoint
Virtual assistants are becoming increasingly commonplace, largely thanks to the success of Apple's landmark Siri program in recent years, with Amazon aiming to lead the market with its physical voice-activated assistant, Echo (powered by Alexa technology), and Google jumping into the game with its Home hub. It's all part of what has been termed the 'internet of things', or IoT, in which more and more of our household objects connect via Wi-Fi and are streamlined for ease of use through the internet. It's all very impressive stuff, but the technology has a worrying side: if reports this week are anything to go by, your home assistant could be hoodwinked by a dolphin, or at least by a hacker emitting high-pitched noises.
Research suggests that ultrasonic, high-pitched commands can set off many of the devices leading the virtual assistant market, meaning that Amazon's Echo, Apple's Siri and Google's Home could all be at risk of being compromised if the right frequencies are played at the wrong time. Researchers have shown that such commands could allow a remote attacker to make calls and open websites through the devices, all without the owner's permission.
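The underlying research describes shifting an ordinary spoken command into the ultrasonic band, where the device's microphone still picks it up even though humans cannot hear it. As a rough illustration only, the sketch below amplitude-modulates a stand-in "voice" signal onto a 25 kHz carrier using NumPy; the sample rate, carrier frequency and the 1 kHz tone standing in for speech are all illustrative assumptions, not figures from the reporting.

```python
import numpy as np

FS = 192_000         # sample rate high enough to represent ultrasound (assumption)
CARRIER_HZ = 25_000  # carrier above human hearing (~20 kHz cutoff)

def ultrasonic_modulate(voice, fs=FS, carrier_hz=CARRIER_HZ):
    """Amplitude-modulate a baseband 'voice' signal onto an ultrasonic carrier.

    The idea reported in the research is that nonlinearity in a device's
    microphone can demodulate such a signal back into the audible band,
    so the assistant "hears" a command that people cannot.
    """
    t = np.arange(len(voice)) / fs
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    # Classic AM: carrier plus voice-shaped sidebands around it
    return (1.0 + voice) * carrier

# Stand-in for a spoken command: a 1 kHz tone (a real attack would use speech)
t = np.arange(int(0.1 * FS)) / FS
voice = 0.5 * np.sin(2 * np.pi * 1_000 * t)
signal = ultrasonic_modulate(voice)

# All transmitted energy sits near the carrier, outside the audible band
spectrum = np.abs(np.fft.rfft(signal))
peak_hz = np.fft.rfftfreq(len(signal), 1 / FS)[np.argmax(spectrum)]
```

After modulation, the strongest frequency component lands at the carrier, well above 20 kHz, which is why the transmitted sound is inaudible to bystanders while remaining detectable to a microphone.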
Researchers have been able to activate and use various assistants on a number of devices from a considerable distance away, provided the commands were pitched at frequencies high enough to reach them inaudibly. Other teams have discovered that certain high-pitched noises are treated as human speech by devices such as the Amazon Echo range, strengthening the case for security concerns over just how sensitive virtual assistants are. The findings could mean that much of this hardware heads back to the factory for tweaking, with Google, at the very least, stating that it would look into the concerns in due course.
Certain attacks employed in the research, however, were ineffective against devices that required unlocking or responded only to one user's voice, meaning that protecting your virtual assistant against this kind of hacking could be fairly simple. For the meantime, users of such hardware should simply exercise caution and, at the very least, keep their Echo, Home or Siri device away from dolphins at all costs!