#2058: How Stuxnet's Code Physically Broke Iran's Centrifuges

Stuxnet didn't just infect computers—it rewrote PLC logic to spin uranium centrifuges into self-destruction while faking normal readings.

Episode Details
Episode ID
MWP-2214
Published
Duration
21:52
Audio
Direct link
Pipeline
V5
TTS Engine
chatterbox-regular
Script Writing Agent
Gemini 3 Flash

AI-Generated Content: This podcast is created using AI personas. Please verify any important information independently.

Stuxnet is often described as the first digital weapon to cause real-world physical destruction, but the story behind it is far more complex than a simple virus. This malware was a precision-guided munition designed to sabotage the Natanz uranium enrichment facility in Iran, a site protected by intense physical security and air-gapped networks. The attack didn't rely on a single trick; it was a multi-stage operation that combined network infiltration, hardware fingerprinting, and a sophisticated rootkit for industrial control systems.

The infection began with a multi-vector propagation strategy. Stuxnet used four different Windows zero-day vulnerabilities to spread through internal networks, often via infected USB drives. Once it reached a computer running the Siemens Step7 engineering software, it didn't immediately attack. Instead, it performed a detailed hardware census, searching for a very specific configuration: 417 frequency converters from Fararo Paya in Iran and Vacon in Finland. This extreme specificity meant Stuxnet remained dormant on 99% of infected machines worldwide, allowing it to spread undetected for months.
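This dormant-unless-matched gating can be sketched in a few lines. The sketch below is purely illustrative, not Stuxnet's actual logic: the vendor names come from the episode, while the 33-drive minimum, the data structures, and the function names are assumptions made for the example.

```python
# Illustrative sketch of target fingerprinting: the payload stays dormant
# unless the hardware census matches a narrow profile. All structures and
# the 33-drive cascade threshold are assumptions for this example.

TARGET_VENDORS = {"Fararo Paya", "Vacon"}

def census_matches(drives):
    """True only if every drive comes from a targeted vendor and the
    array is large enough to look like an enrichment cascade."""
    if len(drives) < 33:            # hypothetical minimum cascade size
        return False
    return all(d["vendor"] in TARGET_VENDORS for d in drives)

def payload(drives):
    if not census_matches(drives):
        return "dormant"            # the path taken on 99% of infections
    return "armed"
```

The key design point is that the default branch is "do nothing": a mismatched environment never sees the payload, which is what kept the worm undetected for months.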

The core of the attack was its "man-in-the-middle" operation on the PLC communication library. Stuxnet replaced the legitimate s7otbxdx.dll file with a malicious version. When an engineer checked the centrifuge logic via Step7, the malware intercepted the request, read the actual malicious code running on the PLC, and "photoshopped" it out—displaying a clean, legitimate code block instead. This created a digital hallucination where operators saw normal operations while the hardware was being physically manipulated.
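The read-interception trick can be illustrated with a toy driver shim. Nothing below is the real Siemens API; the class, block names, and storage layout are invented to show the shape of the deception: stash the clean block before overwriting it, then serve the stash to any read request.

```python
# Toy model of the rootkit's "man-in-the-middle" on the PLC driver layer.
# The compromised driver remembers the original code block before injecting
# its own, and answers every read with the clean copy. Names are invented.

class CompromisedDriver:
    def __init__(self, plc):
        self.plc = plc              # dict standing in for PLC block memory
        self.clean_copies = {}

    def write_block(self, name, code, malicious=False):
        if malicious:
            # Remember what the engineer *expects* to see before overwriting.
            self.clean_copies[name] = self.plc.get(name, "")
        self.plc[name] = code

    def read_block(self, name):
        # Serve the stashed clean copy if this block was tampered with.
        return self.clean_copies.get(name, self.plc.get(name))

plc = {"OB35": "original watchdog logic"}
drv = CompromisedDriver(plc)
drv.write_block("OB35", "sabotage logic", malicious=True)
# The hardware now runs the sabotage code, but a read through the
# compromised driver still shows the original block.
```

The deception lives entirely in the driver: the PLC really does hold the malicious code, and only the view presented to the engineering software is falsified.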

The sabotage itself was a study in physics and mechanical engineering. Stuxnet executed two attack sequences designed to exploit the centrifuges' resonant frequencies. Sequence A, the over-speed attack, spiked the rotation frequency from 1,064 Hz to 1,410 Hz for 15 minutes, pushing the carbon-fiber rotors beyond their structural limits. Sequence B was even more destructive: it dropped the frequency to 2 Hz for 50 minutes, forcing the centrifuges to linger near their "critical speed," where resonant vibration would shake them apart.
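To keep the figures straight, the two sequences can be written down as a small setpoint table. The frequencies and durations are the published figures; the function itself is invented for illustration and is not how the PLC code was structured.

```python
# Toy reconstruction of the two reported sabotage sequences as
# (frequency_hz, minutes) setpoint steps. Figures are from public
# analyses; the scheduling function is invented for illustration.

NORMAL_HZ = 1064            # steady-state enrichment frequency

def attack_setpoints(sequence):
    """Return the list of (frequency_hz, minutes) steps for a sequence."""
    if sequence == "A":
        # Over-speed: push the rotors past their design limit, then recover.
        return [(1410, 15), (NORMAL_HZ, 0)]
    if sequence == "B":
        # Under-speed: dwell in the critical-speed region so vibration builds.
        return [(2, 50), (NORMAL_HZ, 0)]
    raise ValueError(f"unknown sequence: {sequence}")
```

Note that both sequences end by returning the drives to the normal 1,064 Hz, which is what made each attack look like a transient mechanical fluke rather than sabotage.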

Throughout the attack, Stuxnet masked its activity by recording 21 seconds of normal sensor data and looping it back to the SCADA system. It faked valve states and pressure readings, ensuring that even as centrifuges shattered, the control panels showed all systems nominal. The Iranians eventually noticed the physical failures but couldn't identify the cause, leading to internal investigations and even firings of technicians who were blamed for the malfunctions.
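The replay trick reduces to recording a window of good samples and answering every later sensor query from that loop. A minimal sketch follows; the 21-second figure is from public reporting, while the handful of fake samples and the class design are assumptions for the example.

```python
# Sketch of the signal-masking replay: record a short window of legitimate
# sensor samples, then answer every later query from the recorded loop.
# The sample values and buffer size here are illustrative.

from itertools import cycle

class ReplayMask:
    def __init__(self, normal_samples):
        self._loop = cycle(normal_samples)   # recorded "all nominal" window

    def read_sensor(self):
        # Operators see the recorded loop, not the live (failing) hardware.
        return next(self._loop)

mask = ReplayMask([1064, 1064, 1063, 1064])  # stand-in for 21 s of readings
readings = [mask.read_sensor() for _ in range(6)]
```

However long the attack runs, the control room sees only the looped window, which is why the panels stayed "nominal" while hardware shattered.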

The malware also had built-in termination logic. It included a kill date of June 24, 2012, after which it stopped spreading, and a "DEADFOOT" flag in PLC memory to prevent re-infection. Stuxnet represents a landmark in cyber-physical warfare, demonstrating how code can be weaponized to manipulate industrial systems with surgical precision while maintaining complete stealth.
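The termination logic is simple enough to sketch directly. The kill date and the 0xDEADF007 marker are the reported values; the surrounding function and the memory layout are illustrative assumptions.

```python
# Sketch of the reported termination logic: a hard-coded kill date and an
# "already infected" marker in PLC memory. The constant 0xDEADF007 is the
# reported value; the function and memory layout are illustrative.

from datetime import date

KILL_DATE = date(2012, 6, 24)
INFECTION_MARKER = 0xDEADF007    # read as "DEAD F007", i.e. "dead foot"

def should_spread(today, plc_memory):
    if today >= KILL_DATE:
        return False                              # mission timeline expired
    if plc_memory.get("flag") == INFECTION_MARKER:
        return False                              # already infected; move on
    return True
```

Skipping already-marked controllers avoided double infection, which could have crashed a PLC and tipped off the operators.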


#2058: How Stuxnet's Code Physically Broke Iran's Centrifuges

Corn
Alright, we have a heavy hitter today. Daniel sent us a text prompt, and it is a deep dive into one of the most significant pieces of code ever written. Here is what he wrote to us: "Let's talk about how Stuxnet worked, focusing on everything known about the actual technical operation of the code's payload. I want to get past the spy novel headlines and really look at the PLC injection logic, the centrifuge sabotage sequences, and how it managed to lie to the operators while the hardware was literally tearing itself apart."
Herman
Oh, man. Daniel is speaking my language today. Herman Poppleberry here, and I have been waiting for a reason to go through the Symantec dossier again. This isn't just a malware story, Corn. It is a physics story told in assembly language. It’s the moment the digital ghost finally got a physical pair of hands to start breaking things.
Corn
It really is the OG digital-to-physical weapon. And before we peel back the layers on this digital onion, a quick shout-out to our sponsor: big thanks to Modal for providing the GPU credits that power this show. Also, fun fact for the tech nerds out there—today’s episode is actually being powered by Google Gemini three Flash.
Herman
It is poetic, isn't it? One AI helping us explain a piece of code that was so complex it basically functioned like a pre-programmed autonomous agent. Stuxnet wasn't just "malware" in the sense of a virus that steals your credit card. It was a precision-guided munition. If a normal virus is a grenade thrown into a room, Stuxnet was a sniper bullet fired from three miles away that only hits a target wearing a specific shade of blue.
Corn
Right, and it had to be that precise because it was targeting the Natanz uranium enrichment facility in Iran. This place was air-gapped, buried underground, and protected by some of the most intense physical security on the planet. You couldn't just "email" a virus to the centrifuges.
Herman
Well, not exactly—but you’re on the money. The air gap is the first myth to bust. People think an air gap is a magic shield. Stuxnet proved that an air gap is just a speed bump if you have enough zero-day vulnerabilities and a thumb drive. It used four different Windows zero-days just to move around the internal network. But the part Daniel wants us to focus on isn't how it got in—it's what it did once it arrived at the finish line.
Corn
Which was the Programmable Logic Controllers, or PLCs. For the folks who don't spend their weekends in a factory, these are basically the "brains" of industrial machinery. They take sensor data and tell motors how fast to spin. I always think of them as the middleman between the mouse-click on a computer and the actual movement of a heavy machine.
Herman
That’s a good way to put it. And specifically, Stuxnet was looking for Siemens S7-300 and S7-400 PLCs. But it wasn't looking for just any Siemens controller. This is where the "fingerprint" check comes in, and it is honestly terrifyingly specific. When Stuxnet landed on a computer running the Siemens Step7 engineering software, it didn't just start breaking things. It sat quietly and performed a hardware census. It looked for a very specific configuration of four hundred seventeen frequency converters manufactured by two specific companies: Fararo Paya in Iran and Vacon in Finland.
Corn
Wait, so if it found four hundred sixteen converters, or if they were made by a different brand, it just... stayed dormant?
Herman
It did nothing. It was a "silent" infection unless the environment matched the target profile perfectly. This is why it took so long to discover. It infected thousands of computers worldwide—I think over sixty percent of the infections were in Iran, but it spread to India, Indonesia, even the U.S.—but on ninety-nine percent of those machines, the payload never triggered. It was looking for the specific electrical signature of the Natanz centrifuge cascades. If you weren't running a specific array of high-frequency drives connected to a specific model of Siemens CPU, Stuxnet was basically a ghost. It wouldn't even reveal its presence.
Corn
That is some serious restraint for a piece of code. Most malware is like a toddler with a permanent marker—it wants to draw on every wall it sees. Stuxnet was more like a professional assassin waiting in the closet for one specific person. So, let's say it finds the match. It confirms it's at Natanz. How does it actually cross the bridge from the Windows PC to the PLC hardware? Because these are two totally different languages, right?
Herman
Right. You have Windows running on x86 architecture, and then you have the PLC running its own proprietary logic. This is the "Man-in-the-Middle" attack on the library files, and it's brilliant in a dark way. On a Windows machine, the Step7 software uses a library file called "s7otbxdx dot d-l-l" to communicate with the PLCs. Stuxnet replaced that legitimate file with its own malicious version. So, when an Iranian nuclear engineer opened their console to check the code running on the centrifuges, the malicious library would intercept the "read" request. It would look at the PLC, see the malicious Stuxnet code, and then quickly "photoshop" it out, showing the engineer a perfectly clean, legitimate block of code instead.
Corn
So the engineer is looking at the screen thinking, "Everything looks great, the logic is sound," while the actual hardware is running a totally different set of instructions? That’s like a pilot looking at their instruments saying they’re at thirty thousand feet while the plane is actually scraping the treetops.
Herman
Precisely—it was a total deception. The PLC rootkit gave the attackers complete control over the physical reality of the plant while maintaining a digital hallucination for the operators. It’s the digital equivalent of that scene in heist movies where they loop the security camera footage so the guards see an empty hallway while the vault is being emptied. But it’s even more sophisticated because Stuxnet wasn’t just looping a video; it was actively rewriting the responses from the hardware in real-time to match what the software expected to see.
Corn
Okay, so the "hallucination" is set up. The engineer is happy. Now, what is the actual "sabotage" part? How do you break a centrifuge with code? I mean, these things are made of high-grade carbon fiber and steel. They aren't exactly fragile.
Herman
You play with the physics of resonance. These centrifuges are spinning uranium hexafluoride gas at incredible speeds to separate isotopes. They usually run at a steady frequency of one thousand sixty-four Hertz. Stuxnet had two main attack sequences, which researchers call Sequence A and Sequence B. And the timing here is key—it didn't just attack once; it lingered.
Corn
I'm assuming these weren't just "turn it off" commands. If the machines just stopped, the alarms would go off instantly.
Herman
No, turning it off is too obvious. Sequence A was the "over-speed" attack. It would wait for a period of normal operation, then suddenly crank the frequency up from one thousand sixty-four Hertz to one thousand four hundred ten Hertz. It would hold it there for fifteen minutes. Now, think about the mechanical stress on a carbon-fiber rotor spinning that fast. You are pushing it way beyond its structural design limits. It’s like redlining a car engine for fifteen minutes straight while the dashboard says you’re doing fifty miles per hour.
Corn
But why fifteen minutes? Why not just keep going until it explodes?
Herman
Because they wanted to weaken the material without necessarily causing a catastrophic, plant-wide shutdown immediately. They wanted to create "mysterious" failures over time. After fifteen minutes, it would return the speed to normal. Then it would wait for weeks. It wanted to be subtle. It wanted the Iranian engineers to think they just had a "bad batch" of rotors or a mysterious mechanical fluke. But then came Sequence B, which was arguably more clever and much more destructive. It would drop the frequency all the way down to two Hertz for fifty minutes.
Corn
Two Hertz? That’s barely moving. Why slow it down? If you're trying to enrich uranium, slowing down seems counterproductive, but not necessarily "destructive."
Herman
Oh, it’s incredibly destructive because of "critical speeds." Every rotating object has a resonant frequency where it vibrates most intensely. Think of a washing machine on the spin cycle—there’s that one moment where it wobbles like crazy before it gets up to full speed. That’s the critical speed. When a centrifuge starts up or slows down, it has to pass through these "critical" zones quickly so the vibrations don't build up. By holding the centrifuges at two Hertz for nearly an hour, Stuxnet was essentially forcing them to sit in that danger zone. The vibrations would become so violent that the rotors would actually touch the outer casing and shatter. It was turning the machines into their own wrecking balls.
Corn
That is devious. It’s using the machine’s own physical properties against itself. And while this is happening—while the rotors are screaming and vibrating into pieces—the guy in the control room is looking at a screen that says "one thousand sixty-four Hertz, all systems nominal."
Herman
That is the "Signal Masking" part of the payload. Stuxnet would record twenty-one seconds of legitimate "normal" sensor data from the centrifuges before the attack started. Then, during the attack, it would play that twenty-one-second loop back to the SCADA system. It’s the ultimate gaslighting. The centrifuge is literally exploding in the next room, and the computer screen is showing a calm, steady line of perfectly normal operation. It even faked the state of the valves and the pressure sensors. If a valve was supposed to be open, Stuxnet made sure the screen showed it as open, even if the malware had actually slammed it shut to build up pressure.
Corn
It really highlights the vulnerability of our trust in digital systems. If the sensor says it’s fine, we believe it’s fine, even if our ears are telling us something is wrong. But eventually, the Iranians had to notice, right? You can't hide a thousand broken centrifuges forever. Even if the screen says "all good," the pile of scrap metal in the corner tells a different story.
Herman
They noticed the failures, but they couldn't figure out why. There’s a famous story of the International Atomic Energy Agency inspectors visiting Natanz and noticing that the Iranians were hauling out huge numbers of broken centrifuges. The Iranian technicians looked baffled. They were replacing parts, checking the power supply, looking for physical sabotage—but they weren't looking at the code because the code told them it was doing its job. They actually fired some of their own technicians because they thought they were being incompetent or negligent. Imagine being the guy who gets fired for "breaking" a machine when the computer says you did everything perfectly.
Corn
That’s a whole different level of psychological warfare. It’s interesting that the code had a "kill date" too. Daniel mentioned June twenty-fourth, twenty twelve. Why have an expiration date on a weapon?
Herman
Yeah, that's a very "state-actor" move. It shows that this wasn't an uncontrolled wildfire; it was a mission with a timeline. After that date, the worm would stop spreading. It also had an infection flag in the PLC memory—the hex value "zero-x-D-E-A-D-F-zero-zero-seven."
Corn
"Dead Foot"? Or "Dead Fool"?
Herman
Most people read it as "Dead Foot." In aviation, "Dead Foot, Dead Engine" is a memory aid for pilots when an engine fails. If Stuxnet scanned a PLC and saw that marker, it knew the device was already infected and would move on. This prevented "double-infecting" a system, which might cause a crash or a memory overflow that would tip off the admins. It was incredibly polite malware. It didn't want to crash the system; it wanted the system to keep running perfectly... while it quietly destroyed the hardware.
Corn
Polite, yet it caused a thousand centrifuges to fail. That’s about ten percent of their enrichment capacity at the time. It didn't stop the program, but it threw a massive wrench in the gears. Now, Herman, you mentioned that this likely took two different teams. One for the Windows side and one for the industrial side. Is that common in the world of malware?
Herman
Not at all. Usually, a malware author is a specialist in one area. But Stuxnet required a "Full Stack" of destruction. You needed world-class Windows exploit developers to find those four zero-days. Then you needed experts in Siemens industrial software. But most importantly, the expertise required to write those PLC function blocks—specifically the modifications to the OB-thirty-five and FC-eighteen-seventy-four blocks—is incredibly rare. You need someone who understands the "Step7" proprietary language, but you also need a nuclear physicist who understands the exact resonant frequencies of a specific model of centrifuge. You can't just Google "how to blow up a centrifuge with a PLC." You need to have the hardware in a lab somewhere to test your code.
Corn
Which points toward a nation-state. This isn't a couple of hackers in a basement. You need a testing facility that mimics Natanz. You basically need a mock-up of the target to make sure the code doesn't just crash the PLC, but actually achieves the physics-based destruction you're looking for.
Herman
There are rumors that the Dimona facility in the Negev desert had a replica of the Natanz cascades specifically for testing Stuxnet. You have to verify that when you send the "two Hertz" command, the rotor actually hits the wall. You can't "beta test" that in the cloud. And that is the enduring legacy of Stuxnet. It proved that "cyber" isn't just about data anymore. It is about kinetic force. If you can control the PLC, you can open a dam, you can over-pressurize a gas pipeline, or you can tilt a power grid into a blackout. We’ve seen echoes of this in attacks on the Ukrainian power grid and even some water treatment facilities recently.
Corn
It’s essentially the blueprint for modern industrial warfare. And honestly, the "air gap" thing still bothers me. We keep hearing that critical infrastructure is safe because it's not on the internet. But Stuxnet didn't need the internet. It needed a human with a USB drive. How did it even ensure it would get onto the right drive?
Herman
It used a "shotgun" approach for the initial infection. It would infect any USB drive it touched, hoping that one of those drives would eventually be carried into the air-gapped facility. It used a vulnerability in how Windows handles shortcut icons—Lnk files—to execute code as soon as the drive was plugged in and the folder was viewed. The user didn't even have to click on anything. Once it was inside the facility, it used the "Print Spooler" vulnerability—another zero-day—to spread from one computer to another over the local network. It was like a biological virus jumping from person to person until it found the one person with the specific "genetic" makeup it was looking for.
Corn
That’s terrifying. It’s like a digital version of the "bounty hunter" that follows you home. So, if we look at the "Practical Takeaways" part of this—because we always try to give the listeners something to chew on—what has changed in industrial security since twenty-ten? Are we still just as vulnerable to this kind of PLC injection?
Herman
We are better, but the threat has evolved. One of the big changes is the push for "Code Signing" in industrial environments. Nowadays, a PLC should—ideally—only run code that has been digitally signed by an authorized engineer. If Stuxnet tried to swap out a library today on a modern, properly configured system, the hardware would see that the cryptographic signature doesn't match and it would refuse to run. But "ideally" is doing a lot of heavy lifting in that sentence.
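Herman's code-signing gate can be sketched minimally. This uses plain HMAC as a stand-in for a real vendor signing scheme; the key, block contents, and function names are all invented for the example, not any actual PLC API.

```python
# Sketch of the code-signing idea: the controller refuses any logic block
# whose signature does not verify against a trusted key. HMAC-SHA256 is a
# stand-in for a real asymmetric vendor signing scheme; key is invented.

import hashlib
import hmac

TRUSTED_KEY = b"plant-engineering-signing-key"   # illustrative secret

def sign(block: bytes) -> bytes:
    return hmac.new(TRUSTED_KEY, block, hashlib.sha256).digest()

def load_block(block: bytes, signature: bytes) -> bool:
    """Accept (and run) the block only if the signature verifies."""
    return hmac.compare_digest(sign(block), signature)

good = b"legitimate logic"
ok = load_block(good, sign(good))                 # accepted
forged = load_block(b"sabotage logic", sign(good))  # rejected
```

A swapped-in block carries no valid signature over its new contents, so the verification fails even though the block otherwise looks like ordinary PLC logic.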
Corn
But how many "properly configured" systems are out there? I feel like there’s a lot of legacy gear in power plants and water systems that’s twenty years old. Do those old Siemens boxes even have the processing power to check a digital signature?
Herman
Most of them don't. That is the nightmare scenario. There are thousands of PLCs out there that don't support modern security features. They were built in an era when "security" meant a locked gate and a guard dog. For those systems, the takeaway is "Network Segmentation" and "Behavioral Monitoring." You have to watch the network traffic between the IT side and the Operational Technology, or OT, side. If you see a computer suddenly trying to rewrite a PLC's function blocks at three in the morning, that should trigger an alarm immediately. You need "Deep Packet Inspection" that understands the industrial protocols, so it can say, "Hey, why is this engineering workstation sending a 'frequency change' command to the turbine?"
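A minimal version of the behavioral rule Herman describes might look like the sketch below. The hostnames, operation names, and shift hours are invented; real deep packet inspection would parse actual industrial protocols rather than pre-labeled events.

```python
# Sketch of a behavioral-monitoring rule for the IT/OT boundary: flag any
# PLC write or frequency-change command from an unexpected host, or from an
# engineering workstation outside normal hours. All names are invented.

SUSPICIOUS_OPS = {"write_block", "set_frequency"}
ENGINEERING_HOSTS = {"eng-ws-01"}

def alert(event):
    """event: dict with 'src' (hostname), 'op', and 'hour' (0-23)."""
    if event["op"] not in SUSPICIOUS_OPS:
        return False
    if event["src"] not in ENGINEERING_HOSTS:
        return True                       # PLC write from an unknown host
    return not (8 <= event["hour"] < 18)  # engineering write outside shift
```

This is exactly the "why is this workstation rewriting function blocks at three in the morning" check: routine reads pass silently, anomalous writes trip the alarm.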
Corn
It’s like having a digital bodyguard who doesn't just watch the door, but watches the people inside to see if they start acting weird. But what about the "lying" aspect? If the PLC is compromised, can you ever trust the data it sends back?
Herman
That’s the "Byzantine Generals Problem" of industrial control. If you can't trust the messenger, how do you know the state of the system? And for the engineers listening, checking your firmware versions and actually auditing the code on your PLCs against what is supposed to be there is crucial. Stuxnet's greatest trick was making the engineers see what they expected to see. You have to find ways to verify the "ground truth" of the hardware that don't rely on the potentially compromised software.
Corn
Like maybe a physical tachometer on the centrifuge that isn't connected to the network? Something that gives a direct, analog reading?
Herman
Analog backups! It sounds old-school, but an analog gauge that physically moves a needle based on pressure or speed is almost impossible to hack from a remote computer. If the computer says "all good" but the physical needle is in the red, you trust the needle. We’ve moved so far toward "Software-Defined Everything" that we’ve forgotten that physical reality is the ultimate arbiter. If you’re running a critical process, you need an out-of-band way to verify that the machines are doing what the screen says they are.
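The analog cross-check reduces to comparing two independent readings and tripping when they diverge. A tiny sketch, with an assumed 5% tolerance and invented figures:

```python
# Sketch of an out-of-band sanity check: compare the SCADA-reported speed
# with an independent analog tachometer and flag any divergence. The
# tolerance and readings are illustrative assumptions.

def readings_agree(scada_hz, analog_hz, tolerance=0.05):
    """True if the digital report is within `tolerance` of the analog gauge."""
    return abs(scada_hz - analog_hz) <= tolerance * analog_hz

# Normal operation: screen and needle agree (1,064 Hz vs 1,070 Hz).
# During a spoofed over-speed attack the screen still says 1,064 Hz while
# the needle reads 1,410 Hz, so the check fails and an alarm should trip.
```

Because the analog gauge never passes through the compromised software stack, a replayed sensor loop cannot fool this comparison.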
Corn
I love that. The high-tech solution to the high-tech problem is... a piece of metal on a spring. It’s a great reminder that complexity is often the enemy of security.
Herman
Sometimes the old ways are the best. But overall, Stuxnet was a wake-up call that the world didn't really want to hear. It showed that our entire physical reality is increasingly mediated by layers of code that most of us don't understand and can't see. When that code is subverted, the physical world breaks. It's not just about "losing data" anymore; it's about losing the power grid, the water supply, or the factory floor.
Corn
It’s a lot to think about. Daniel, thanks for the prompt—this was a fascinating "forensic" look at a digital crime scene. It’s wild to think that this all happened over fifteen years ago and we’re still unraveling the implications. It really set the stage for everything we see in the news today regarding state-sponsored hacking and infrastructure vulnerability.
Herman
It’s a masterpiece of engineering, regardless of the politics. The sheer level of coordination required to pull that off is staggering. It makes you wonder what’s sitting on our networks right now, just waiting for its "fingerprint" to be matched. Is there a "Stuxnet for Power Grids" or a "Stuxnet for Hospital Scanners" just waiting for a specific date or a specific configuration to trigger?
Corn
On that cheery note, I think we’ll wrap it up. If you want to dive deeper into the history of cyberwarfare, we did an episode a while back—episode nine hundred sixty-eight—on "Breaking the Air Gap" which covers some of the broader context of industrial attacks. We go into some of the more recent stuff like Industroyer and Triton there.
Herman
And if you're into the physics of destruction, episode six hundred ninety-four on the GBU-fifty-seven "Bunker Buster" is a great companion to this, looking at the physical side of attacking fortified targets. It’s the "brute force" version of what Stuxnet did with a few kilobytes of code.
Corn
Thanks for listening to "My Weird Prompts." Thanks to our producer, Hilbert Flumingtop, for keeping the gears turning and ensuring our own PLCs don't start vibrating through the floor.
Herman
And thanks again to Modal for the GPU credits. We literally couldn't do this without them. They provide the horsepower that lets us sift through these technical dossiers.
Corn
If you liked the show, leave us a review on Apple Podcasts or Spotify—it really helps people find us in the sea of content out there. You can find all our episodes and the RSS feed at myweirdprompts dot com.
Herman
We’ll be back next time with whatever weirdness Daniel or the rest of you send our way. Keep those prompts coming—the more technical, the better.
Corn
Stay curious, and maybe keep an eye on your USB drives. If you find one in a parking lot, maybe just leave it there.
Herman
See ya.
Corn
Bye.

This episode was generated with AI assistance. Hosts Herman and Corn are AI personalities.