Hackers could use artificial intelligence tools to steal user passwords with near-perfect accuracy by “listening” to an unsuspecting person’s keystrokes, a recent study from the UK showed.
A group of UK-based computer scientists trained an artificial intelligence model to identify the sounds generated by keystrokes on the 2021 version of a MacBook Pro — described as a “popular off-the-shelf laptop.”
When the AI program was enabled on a nearby smartphone, it reproduced the typed password with a whopping 95% accuracy, according to the study results published on the arXiv preprint server, which is operated by Cornell University.
The hacker-friendly AI tool was also extremely accurate while “listening” to typing through the laptop’s microphone during a Zoom video conference.
Researchers said it reproduced those keystrokes with 93% accuracy – a record for the medium.
The researchers warned that many users are unaware of the risk that bad actors could monitor their typing to breach accounts – a type of cyberattack they called an “acoustic side channel attack.”
The program was able to “listen” to identifying elements of each key press, such as sound wavelengths. The smartphone, an iPhone 13 mini, was placed 17 centimeters away from the keyboard.
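To illustrate the general idea only, the sketch below shows how short keystroke audio clips could be converted into frequency features and matched to keys. The synthetic clips, spectrogram features and k-nearest-neighbors classifier here are assumptions chosen for a self-contained demonstration; they are not the researchers’ actual model or data.

```python
# Toy illustration of an acoustic keystroke classifier.
# Everything below (synthetic audio, spectrogram features, k-NN) is an
# assumption for demonstration, not the study's actual pipeline.
import numpy as np
from scipy.signal import spectrogram
from sklearn.neighbors import KNeighborsClassifier

SAMPLE_RATE = 44_100      # audio sample rate in Hz
CLIP_SECONDS = 0.2        # rough duration of one keystroke sound

def synth_keystroke(key_id: int, rng: np.random.Generator) -> np.ndarray:
    """Generate a fake keystroke clip: each key gets a slightly different
    dominant frequency plus noise, standing in for real recordings."""
    t = np.linspace(0, CLIP_SECONDS, int(SAMPLE_RATE * CLIP_SECONDS), endpoint=False)
    freq = 800 + 35 * key_id                    # fake per-key resonance
    clip = np.sin(2 * np.pi * freq * t) * np.exp(-t * 30)
    return clip + 0.05 * rng.standard_normal(t.shape)

def features(clip: np.ndarray) -> np.ndarray:
    """Summarize a clip as its average power in each frequency band."""
    _, _, sxx = spectrogram(clip, fs=SAMPLE_RATE, nperseg=512)
    return sxx.mean(axis=1)

rng = np.random.default_rng(0)
keys = list(range(36))                          # pretend 36 distinct keys
X_train = [features(synth_keystroke(k, rng)) for k in keys for _ in range(20)]
y_train = [k for k in keys for _ in range(20)]

clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

# "Listen" to a new keystroke and guess which key was pressed.
test_key = 17
pred = clf.predict([features(synth_keystroke(test_key, rng))])[0]
print(f"true key: {test_key}, predicted: {pred}")
```

In a real attack the clips would come from a phone microphone or a Zoom recording rather than synthetic tones, and a far more capable model would be trained on them; the sketch only shows the basic feature-then-classify structure of an acoustic side channel attack.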
The research was conducted by Joshua Harrison of Durham University, Ehsan Toreini of the University of Surrey and Maryam Mehrnezhad of Royal Holloway, University of London.
The possibility of AI tools aiding hackers is just another risk factor for the burgeoning technology.
AI tools can make online scams harder to detect because they make it easier to personalize scams for each target, Insider reported last Tuesday.
Two specialists raised the alarm in 2019 over how the advancement of AI and 5G technology would heighten vulnerabilities in internet-connected devices, amplifying cybersecurity threats.