Contemporary Operating Environment

Stuxnet: The Threat of Tomorrow Unveiled

By Aaron Wright June 4, 2020

On 6 August 1945, the US B-29 Superfortress ‘Enola Gay’ dropped the first atomic bomb on the Japanese city of Hiroshima. This new superweapon would forever alter the reality of conventional warfare. In 2009, unbeknownst to the world, a covert weapon was making its way across cyberspace that would instigate a watershed moment of similar scope on the digital battlefield (Virvilis et al., 2013): the malware known as Win32.Stuxnet (Constantine, 2011). Its impact on theories of cyber defence and offence is difficult to overstate. Its sophistication and elegance were unprecedented (Karnouskos, 2011), and it has become the standard to which future cyber weapons will be held.

Just what is Stuxnet and how did it work?

With the long shadow Stuxnet casts, there are several key questions we must consider. What was its intended target and was it successful? What about its methods make it remarkable? Who were its authors and what was their motivation? How does it interact within existing world doctrines of Cyber Warfare and Cyber Terrorism? Finally, what vulnerabilities did Stuxnet prey upon and what methods could be employed to better defend against similar threat vectors in the future?

Stuxnet is, at its core, a worm: that is, self-replicating malware designed to infect as many compatible systems as possible (Prowse, 2018). What differentiates Stuxnet from its brethren is that, while most worms infect and disrupt indiscriminately, Stuxnet is incredibly precise, going out of its way to attack only particular systems (Zhioua, 2013).

Programmable logic controllers (PLCs) are industrial computing devices used to direct the physical processes of machines according to installed computer code (Al-Rabiaah, 2018). For example, if a mechanical arm needs to cut metal at a certain pace and a certain angle, these motions are often controlled by a PLC working independently on its installed code. PLCs are built using proprietary technologies (Miyachi et al., 2011), each unique and specific to a task. After gaining access to a computer, either via USB or an intranet connection, Stuxnet begins fingerprinting all data, searching for specific PLC model numbers, programs and configuration details (Langner, 2011), targeting the Siemens S7/WinCC family of products (Nourian & Madnick, 2018). If the target is not found, Stuxnet keeps silent, not interfering with normal system functions (Miyachi et al., 2011). Stuxnet is also designed to thwart anti-virus programs by injecting itself into a recognised process, installing a Windows rootkit (gaining administrator privileges) and compromising the anti-virus software’s ability to function (Chen & Abu-Nimeh, 2011). Stuxnet also contains an inbuilt self-termination date (24 June 2012), meaning all copies worldwide would uninstall simultaneously, hiding its activity (Karnouskos, 2011). If Stuxnet encountered a system already containing Stuxnet, the two copies would compare themselves peer-to-peer and the outdated copy would update to the more recent version (Chen & Abu-Nimeh, 2011), enhancing its ability to evade newly developed detection software in addition to updating its mission parameters. Upon detecting an internet connection, Stuxnet would also covertly connect to an online database in order to update itself (Virvilis et al., 2013). This is Stuxnet’s “Seek Phase.”
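The targeting logic of this “Seek Phase” can be sketched in a few lines. This is purely illustrative pseudologic, not Stuxnet’s actual code: every name, attribute and value below is hypothetical, and the real fingerprinting inspected far more detail (model numbers, program blocks, configuration data).

```python
# Illustrative sketch of seek-phase targeting: fingerprint the host and
# proceed only on an exact match; on any other system, do nothing.
# All names and values are hypothetical stand-ins.

TARGET_FINGERPRINT = {
    "plc_family": "Siemens S7",
    "scada_software": "WinCC",
}

def fingerprint_matches(host: dict) -> bool:
    """Return True only if every targeted attribute matches exactly."""
    return all(host.get(k) == v for k, v in TARGET_FINGERPRINT.items())

def seek_phase(host: dict) -> str:
    if fingerprint_matches(host):
        return "proceed to pre-attack"
    return "stay dormant"  # wrong system: keep silent, leave no trace

# An ordinary office machine is left untouched; only the exact
# target configuration triggers the next phase.
office_pc = {"plc_family": None, "scada_software": "Office suite"}
target = {"plc_family": "Siemens S7", "scada_software": "WinCC"}
print(seek_phase(office_pc))  # stay dormant
print(seek_phase(target))     # proceed to pre-attack
```

The key design point the sketch captures is the asymmetry: a non-match produces no observable behaviour at all, which is what allowed Stuxnet to spread widely while disrupting almost nothing.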

Upon identifying the correct PLC (via a Profibus communications processor (Zhioua, 2013)) running the correct Siemens S7 software, Stuxnet executes its ‘Pre-Attack’ phase (Nourian & Madnick, 2018). Through cyber forensics efforts, it is generally agreed that the targeted PLCs were those used to control the speeds of centrifuges within Iran’s Natanz nuclear enrichment facility (Virvilis et al., 2013). This is because Iran was the only user of the targeted combination, and the only confirmed cases of PLC infection were those within Natanz (Langner, 2011). Stuxnet would further monitor the speeds of the motors the PLCs were controlling in order to confirm it was operating on a centrifuge (Karnouskos, 2011). Stuxnet would then set up a sophisticated man-in-the-middle attack (Constantine, 2011). This is where malware replicates the legitimate functions of a system, passing information back and forth, in order to mask its own nefarious actions (Prowse, 2018). Upon installing itself onto the PLC, Stuxnet records the outputs of ‘normal operations’ and plays this data back to any monitoring programs, convincing the system it is operating within normal parameters (Karnouskos, 2011). With its cover now established, the ‘Attack Phase’ can finally commence.
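The record-and-replay deception described above is simple to express in miniature. The following is a minimal sketch, not the actual implementation: the class name, data format and phase switch are all invented for illustration.

```python
# Minimal sketch of a record-and-replay man-in-the-middle: during the
# pre-attack phase the interceptor records 'normal' readings, then
# feeds those recordings back to the monitoring layer while the real
# process diverges. All names here are hypothetical.

class ReplayInterceptor:
    def __init__(self):
        self.recording = []
        self.replay_index = 0
        self.attacking = False

    def observe(self, real_reading: float) -> float:
        """Sit between sensor and monitor; decide what the monitor sees."""
        if not self.attacking:
            self.recording.append(real_reading)  # learn what 'normal' looks like
            return real_reading                  # pass through honestly
        # Attack phase: replay recorded normal data, looping as needed,
        # regardless of what the process is really doing.
        fake = self.recording[self.replay_index % len(self.recording)]
        self.replay_index += 1
        return fake

mitm = ReplayInterceptor()
for reading in [1063.9, 1064.0, 1064.1, 1064.0]:
    mitm.observe(reading)          # pre-attack: record normal operations
mitm.attacking = True
# Real speeds now swing wildly, but the monitor sees only the recording.
seen_by_monitor = [mitm.observe(r) for r in [1410.0, 2.0, 1064.0, 1410.0]]
print(seen_by_monitor)
```

The consequence for defenders is the one drawn later in this article: a single monitoring channel downstream of such an interceptor has no way to notice anything is wrong.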

When the centrifuge motor is detected spinning within a certain range (807 Hz to 1210 Hz), Stuxnet intervenes and orders an increase to 1410 Hz, followed by a sudden drop to 2 Hz, then a rapid climb back to 1064 Hz (Zhioua, 2013), the intended consequence being severe damage and, ultimately, system failure (Clark et al., 2013), all the while projecting the image of normal operations. Stuxnet does this intermittently, at unpredictable intervals, sometimes allowing systems to maintain completely normal functions for significant periods of time (Langner, 2011). This makes diagnosing the fault difficult and frustrating, as without a predictable failure pattern the flaw would not be easy for system engineers to duplicate and solve (Miyachi et al., 2011). Had it remained undetected, it is reasonable to speculate that its final phase would have been ‘Clean Up’, whereupon the self-uninstall function is executed, removing all traces of itself from infected systems.
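The trigger window, the destructive sequence, and the deliberate intermittency can be sketched together. The frequencies are those reported in the cited literature; the probability value, function names and structure are illustrative assumptions, not recovered code.

```python
# Hypothetical sketch of the attack-phase logic: fire only when the
# motor frequency sits inside the trigger window, and even then only
# rarely, so the resulting fault is hard to reproduce and diagnose.
import random

TRIGGER_LOW, TRIGGER_HIGH = 807.0, 1210.0   # Hz window, per Zhioua (2013)
ATTACK_SEQUENCE = [1410.0, 2.0, 1064.0]     # Hz commands, in reported order

def maybe_attack(current_hz: float, attack_probability: float = 0.05):
    """Return a list of commanded frequencies, or None to do nothing."""
    if not (TRIGGER_LOW <= current_hz <= TRIGGER_HIGH):
        return None                          # outside window: stay hidden
    if random.random() > attack_probability:
        return None                          # usually do nothing (stealth)
    return ATTACK_SEQUENCE                   # rare, destructive intervention

# Out-of-range speeds never trigger; in-range speeds trigger only
# occasionally, producing an apparently random failure pattern.
print(maybe_attack(500.0))                          # None
print(maybe_attack(1064.0, attack_probability=1.0)) # the full sequence
```

The `attack_probability` knob is the sketch’s stand-in for whatever internal timing logic Stuxnet actually used; the point is that intermittency, not the frequency values, is what defeated the engineers’ attempts to reproduce the fault.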

Stuxnet’s success is difficult to gauge, as the Iranian government has never officially released data on its impact (Miyachi et al., 2011), though outside investigations indicate roughly 1,000 centrifuges were damaged (Fidler, 2011) and the Iranian nuclear enrichment program was set back approximately four years (Virvilis et al., 2013). Whether Stuxnet’s authors would consider this a success is impossible to determine without knowledge of their mission objective. However, given Stuxnet’s self-deletion protocol, we can reasonably assume that its discovery was a failure of its stealth capability. So, who were these authors and what were their motivations? As part of its propagation and subversion methods, Stuxnet makes use of four zero-day vulnerabilities (Karnouskos, 2011). These are vulnerabilities that have existed within a program since release but are unknown to the program’s developers. It also makes use of stolen digitally signed certificates (Al-Rabiaah, 2018). Both are extremely rare and valuable. Furthermore, Stuxnet’s 15,000 lines of code (Constantine, 2011) would have required a team of 5-10 developers working for six months with an intimate knowledge of obscure PLC programming languages and physical replicas of the target environments (Chen & Abu-Nimeh, 2011). This level of resourcing makes it a near certainty that Stuxnet was developed by a nation state (Kushner, 2013), and the lack of financial gain indicates the motivation was political. Given the target, it is likely the developers were the US or a regional ally, such as Israel (Kushner, 2013), though this has never been officially confirmed nor denied.

The aftermath

What was the response to the Stuxnet malware? On the technical side there has been a re-examination of the vulnerability of the physical components of secure networks. A push for deeper levels of encryption and authentication between PLCs and their computer software in future models is generally agreed upon (Clark et al., 2013). However, the cost of replacements and overhauls, and a mis-appreciation of risk, mean insecure components will continue to be used worldwide well into the future. A collaborative approach to system monitoring between programs is now recommended, in order to prevent man-in-the-middle malware from fooling a single point-of-failure monitoring program (Karnouskos, 2011). What specific technical adaptations the Natanz facility made against Stuxnet are unknown. The international government response to Stuxnet has been mostly silence, from both the suspected instigators and the victim, as all likely want to avoid any codification in law of the use of cyber weapons, in order to preserve their own capability to deploy them in future without being deemed to have committed an act of war (Fidler, 2011).

In September 2011, malware known as Duqu was identified. It shared a great deal of source code with Stuxnet, and like Stuxnet was extremely selective in its target acquisition. It used stolen certificates and zero-day vulnerabilities to gain access to and hide within systems, and had a self-deletion protocol (Virvilis et al., 2013). However, its focus was on espionage rather than physical sabotage, operating primarily as a keylogger. These similarities have created a consensus that Duqu was designed by the same team that created Stuxnet (Al-Rabiaah, 2018). Duqu demonstrates not only that Stuxnet was not an isolated incident, but that there are dedicated, well-funded teams worldwide designing, iterating and releasing malware for a variety of different strategic purposes.

Lessons learnt

Five lessons can be drawn from Stuxnet. Firstly, isolation from external networks will not deter a persistent attacker (Chen & Abu-Nimeh, 2011). Stuxnet crossed the Natanz air gap, likely via USB or external drives, then spread easily amongst systems considered ‘trusted’. In addition, digital certificates, even within trusted systems, must be regularly and thoroughly authenticated. Secondly, obscure technical components, such as PLCs, are not secure by virtue of their obscurity (Miyachi et al., 2011). Persistent attackers will discover vulnerabilities, regardless of how niche they may be. Thirdly, many cyber systems contain components below current security standards that are expensive or time consuming to redesign or replace (Miyachi et al., 2011). These remain vulnerabilities for attackers to discover and exploit, negating hardening efforts elsewhere. Fourthly, single-point monitoring cannot be relied upon to identify malware or abnormal system operations (Karnouskos, 2011). With sophisticated malware spoofing monitoring data, multiple detection systems observing each other must become the new norm. Finally, nation states and other well-resourced groups are actively developing sophisticated malicious code, refining it through iterative programs, secretly hoarding vulnerability knowledge, and using their expertise to conduct pinpoint covert cyber operations (Koch & Golling, 2018). Thus, no single point of defence can be relied upon, and security in depth, from entry point down to the most basic physical component, is the only way to truly harden vital systems (Virvilis et al., 2013).
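The fourth lesson, cross-checking between independent monitors, reduces to a simple principle that can be sketched directly. The function, tolerance value and readings below are illustrative assumptions, not a description of any deployed system.

```python
# Sketch of the cross-checking idea: rather than trusting a single
# monitoring channel, compare independent readings of the same physical
# quantity and raise an alarm when they disagree beyond a tolerance.
# Names, units and thresholds are illustrative only.

def monitors_agree(readings: list[float], tolerance_hz: float = 5.0) -> bool:
    """Return True if all independent readings fall within tolerance."""
    return max(readings) - min(readings) <= tolerance_hz

# A channel replaying recorded 'normal' data will disagree with an
# independent physical measurement, exposing the deception that a
# single spoofed channel would have hidden.
spoofed_channel = 1064.0   # what the compromised path reports
independent_sensor = 2.0   # what a separate vibration/power sensor implies
print(monitors_agree([spoofed_channel, independent_sensor]))  # False
```

The defence works precisely because a man-in-the-middle must compromise every independent channel simultaneously; one honest sensor is enough to break the illusion.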

Currently there are two competing cyber warfare doctrines in the world: the effects-based approach of the United States and her allies (Farrell & Glaser, 2017), and the information war (informatsionnaya voyna) approach generally believed to be adopted by Russia and China (Connell & Vogler, 2017). Stuxnet fits neatly within the former, generating the same effect a kinetic strike on Natanz might have had (delaying the Iranian nuclear enrichment program) without any of the political ramifications. While its detection could be considered a failure, the effect achieved, and the anonymity maintained, have probably solidified the chosen doctrine as correct in the eyes of its authors. The development of iterative malware, such as Duqu, suggests a continuation down this effects-based path. It is not unreasonable to assume further cyber weapons of this nature have been developed in the years since.


To conclude: while extraordinary in its sophistication, Stuxnet is now over ten years old. Its authors, like any professional development team, have likely learned from their mistakes, particularly Stuxnet’s premature detection, and taken this into consideration when developing new software. No doubt Stuxnet has also inspired a generation of recreational malware developers, expanding the imagination of what can be achieved. The most ingenious aspect of Stuxnet was its ability to remain hidden until specific attack parameters were met, and we must now ask, ten years later: how many of Stuxnet’s offspring nest within civilian and military cyber systems, masking their presence, waiting for the correct circumstance to activate? It is for this reason that we must never grow complacent; we must examine every component of ‘trusted’ systems on a regular basis and continue to adapt new methods and models of defence, because the offence team is not slowing down its game. After the destruction of Hiroshima and Nagasaki, nuclear weapons quickly reached a point where their use was impossible due to mutually assured destruction. As the physical and cyber worlds continue to merge, it will be interesting to see whether cyber weapons reach a level of sophistication and impact such that they too become too effective to practically use.



Al-Rabiaah, S. (2018). The “Stuxnet” Virus of 2010 as an Example of an “APT” and its “Recent” Variances. 21st Saudi Computer Society National Computer Conference (NCC). Riyadh, Saudi Arabia.

Chen, T. M., & Abu-Nimeh, S. (2011). Lessons from Stuxnet. Computer (Volume: 44, Issue: 4), Apr 2011.

Clark, A., Zhu, Q., Poovendran, R., & Basar, T. (2013) An Impact Aware Defence against Stuxnet. 2013 American Control Conference, Washington D.C., United States of America.

Connell, M., & Vogler, S. (2017). Russia’s Approach to Cyber Warfare. CNA Analysis & Solutions.

Constantine, L. (2011), Crossing the Line: Terrorism in Cyberspace and Targets in Real-space, International Conference on Cyberworlds, Ontario, Canada.

Farrell, H., & Glaser, C. (2017). The role of effects, saliencies and norms in U.S. Cyberwar doctrine. Journal of Cybersecurity (Volume: 3).

Fidler D.P., (2011) Was Stuxnet an Act of War? Decoding a Cyberattack.  IEEE Security & Privacy (Volume:9, Issue: 4), Jul-Aug 2011. 

Karnouskos, S. (2011), Stuxnet Worm Impact on Industrial Cyber-Physical System Security, IECON 2011 – 37th Annual Conference of the IEEE Industrial Electronics Society, Melbourne, Australia.

Koch, R., & Golling, M. (2018) The Cyber Decade: Cyber Defence at an X-ing Point. 10th International Conference on Cyber Conflict (CyCon). Tallinn, Estonia.

Kushner, D. (2013) The Real Story of Stuxnet. IEEE Spectrum (Volume: 50, Issue: 3) Mar 2013.

Langner, R. (2011). Stuxnet: Dissecting a Cyberwarfare Weapon. IEEE Security & privacy (Volume: 9, Issue: 3) May-June 2011.

Miyachi, T., Narita, H., Yamada, H., & Furuta, H. (2011). Myth and Reality on Control Systems Security Revealed by Stuxnet. SICE Annual Conference 2011, Tokyo, Japan.

Nourian, A., & Madnick, S. (2018). A Systems Theoretic Approach to the Security Threats in Cyber Physical Systems Applied to Stuxnet. IEEE Transactions on Dependable and Secure Computing (Volume: 15, Issue: 1), Jan-Feb 2018.

Prowse, D. L. (2018). CompTIA Security+ SY0-501 Cert Guide, Fourth Edition. Pearson.

Virvilis, N., Gritzalis, D., & Apostolopoulos, T. (2013), Trusted Computing vs. Advanced Persistent Threats: Can a defender win this game? IEEE 10th International Conference on Ubiquitous Intelligence and Computing, Italy.

Zhioua, S., (2013), The Middle East under Malware Attack: Dissecting Cyber Weapons. 33rd International Conference on Distributed Computing Systems Workshops, Philadelphia, United States of America.



Aaron Wright

Aaron Wright is an Australian Army Education Officer who is currently posted to Headquarters Forces Command. Before commissioning as an officer he spent five years living and working in Japan teaching English and karate.

The views expressed in this article are those of the author and do not necessarily reflect the position of the Australian Army, the Department of Defence or the Australian Government.
