There are a lot of ways to steal information from a computer once you can connect to it. Whether over Wi-Fi, Bluetooth, Ethernet, or even a USB stick, a machine that touches the outside world is no longer safe from prying eyes.
One solution is to "air-gap" important systems, that is, to separate them from other computers and from the network at large by removing any remote access. It's a solid plan, but researchers from the Cyber Security Labs at Ben Gurion University have now skirted that protection: the team managed to transfer data using an infected PC's cooling fans.
To pull this off, the target computer first has to be infected with purpose-built malware. Once installed, the malware toggles the fan speed between 1,000 RPM and 1,600 RPM, an audible difference that a microphone, like the one in a smartphone, can easily pick up. In the demo, the computer rattles off a long chain of binary digits, one fan speed standing for 0 and the other for 1, while a nearby phone listens and decodes.
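To make that encoding concrete, here is a toy Python sketch of how two fan speeds can carry arbitrary bytes. The 1,000 and 1,600 RPM values come from the demo described above; the bit framing, helper names, and everything else are illustrative assumptions, not the researchers' actual implementation.

```python
# Toy simulation of a Fansmitter-style two-speed encoding.
# Only the two RPM values are taken from the article; the framing
# (8 bits per character, MSB first) is a hypothetical illustration.

LOW_RPM = 1_000    # fan speed used to signal a 0 bit
HIGH_RPM = 1_600   # fan speed used to signal a 1 bit

def encode(message: str) -> list[int]:
    """Turn a text message into the sequence of fan speeds to set."""
    rpm_sequence = []
    for byte in message.encode("ascii"):
        for bit_index in range(7, -1, -1):  # most significant bit first
            bit = (byte >> bit_index) & 1
            rpm_sequence.append(HIGH_RPM if bit else LOW_RPM)
    return rpm_sequence

def decode(rpm_sequence: list[int]) -> str:
    """Recover the message from observed fan speeds (8 speeds per character)."""
    bits = [1 if rpm > (LOW_RPM + HIGH_RPM) // 2 else 0 for rpm in rpm_sequence]
    chars = []
    for i in range(0, len(bits) - 7, 8):
        byte = 0
        for bit in bits[i:i + 8]:
            byte = (byte << 1) | bit
        chars.append(chr(byte))
    return "".join(chars)

if __name__ == "__main__":
    speeds = encode("pw:hunter2")
    print(decode(speeds))  # -> pw:hunter2
```

In the real attack, of course, the "speeds" are not read directly: the receiving phone has to infer them from the pitch and loudness of the fan noise it records, which is what limits how fast the bits can be flipped.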
In doing so, the malware effectively defeats the air gap. A computer with absolutely nothing connected to it — not even a monitor — could still have data stolen with this attack. The catch, of course, is that a device with a microphone needs to be planted near the target device. That means this malware is never going to target massive numbers of users, but it could still be used to pull off heists worthy of a Bond film.
The malware, which the team calls "Fansmitter," can transmit up to 1,200 bits an hour, in ones and zeroes, over the air to a phone. At eight bits per character, that works out to 150 alphanumeric characters per hour, more than enough to steal a couple of passwords or an encryption key.
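The arithmetic behind that figure is easy to check. The snippet below spells it out, assuming plain 8-bit characters (the assumption the article's own math implies) and using a 256-bit key purely as a hypothetical example of a worthwhile target.

```python
# Back-of-the-envelope throughput for a 1,200 bits/hour channel.
bits_per_hour = 1_200
bits_per_char = 8                     # plain 8-bit characters (assumption)
chars_per_hour = bits_per_hour // bits_per_char
print(chars_per_hour)                 # 150 characters per hour

# Hypothetical example: leaking a 256-bit encryption key at this rate.
key_bits = 256
print(f"{key_bits / bits_per_hour * 60:.0f} minutes")  # ~13 minutes
```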
Malware that attacks air-gapped systems has become an increasingly popular research topic over the last few years, as attackers keep finding ways around the usual means of limiting access to a networked machine. Fansmitter is not the most practical attack, but it proves that even keeping a system disconnected from the Internet, and from any peripherals, does not provide absolute security.