A world where cyberwarfare is the weapon

The Stuxnet worm, used by the US in Iran, showed the power of a targeted network attack


The Stuxnet computer worm temporarily succeeded in its goal of crippling Iran’s controversial uranium enrichment programme but, as the world’s first digital weapon, it opened the door to a future in which forms of cyberwarfare will likely become the norm.

That's the stark picture painted by Countdown to Zero Day, a book by Wired magazine writer Kim Zetter that explores the development and impact of this complex malware developed secretly by the US and Israel.

The worm surreptitiously caused serious malfunctions in the centrifuges Iran used to enrich uranium. Stuxnet spread not through the internet but in a relatively old-fashioned way – by being introduced into a computer system via a portable storage device such as a USB stick. Once a computer read the stick, the worm launched, says Zetter.

The worm then uses a so-called zero day exploit – a security flaw not yet publicly known – to release its payload. In this case, Stuxnet exploited a flaw in the Windows operating system that left millions of computers vulnerable to infection.

Since Stuxnet's exposure in 2010, following months of reverse engineering by security company Symantec – an effort led by California-based Irish researcher and reverse engineering expert Liam O'Murchu – 20 countries have announced digital warfare programmes, Zetter told an audience in San Francisco recently.

According to Zetter, President Bush is known to have been approached about the operation sometime in 2006. Stuxnet itself existed by 2007, and may have already been released “into the wild” by then, a decade after the US National Security Agency (NSA) was given permission to begin developing CNAs – computer network attacks.

Such attacks interested the military because there was a low cost of entry, the attacker didn’t need lots of armaments, and didn’t need to be physically close to the target, Zetter says. In addition, many industrial control systems run critical national infrastructure, so a CNA could inflict severe damage.

Inspection process

The first signs that something was not quite right with Iran’s centrifuges emerged in early 2009, when they were undergoing a twice-yearly inspection by the UN’s International Atomic Energy Agency (IAEA).

“Iran now had 8000 centrifuges installed,” Zetter says. But the IAEA “started noticing Iran was removing centrifuges at an unprecedented rate.” Because the centrifuges used were prone to faults, it would be normal to decommission about 10 per cent annually, “but IAEA inspectors realised they were decommissioning [up to] 2000 over the course of two months.”
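To put that in context, a rough back-of-the-envelope comparison using only the figures quoted above (an illustration, not data from the IAEA itself) shows just how anomalous the removals were:

```python
# Rough comparison of normal vs observed centrifuge decommissioning,
# using only the figures cited in the article (illustrative arithmetic).

installed = 8000            # centrifuges Iran had installed
normal_annual_rate = 0.10   # ~10% would normally be decommissioned per year

normal_per_year = installed * normal_annual_rate   # ~800 units a year
observed_in_two_months = 2000                      # what inspectors saw
observed_annualised = observed_in_two_months * 6   # ~12,000 units a year at that pace

print(f"Expected replacements per year: ~{normal_per_year:.0f}")
print(f"Observed pace, annualised:      ~{observed_annualised}")
print(f"Roughly {observed_annualised / normal_per_year:.0f}x the normal rate")
```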

Six months later, a seemingly unrelated incident would eventually lead to Stuxnet’s discovery.

“Some computers started crashing uncontrollably. They were crashing and rebooting, crashing and rebooting, and the systems administrators in Iran tried to wipe the operating system from the machines and reinstall, and that didn’t solve the problem.”

So, they contacted the Belarus company that supplied their antivirus software, and asked them to remotely access the systems and see if they could figure out what was going on.

Researchers there “found a handful of files that they thought were suspicious. And as they started taking apart these files, they found one was a zero day exploit. But not just any zero day exploit, but a really ingenious zero day exploit that attacked seven versions of the Windows operating system,” says Zetter.

Researchers around the world

The Belarus company notified Microsoft (which gave the worm the name Stuxnet), and also made the files available to other researchers around the world. “At that point, Symantec picked them up,” says Zetter, via its Dublin threat centre.

O’Murchu and two other Symantec researchers in California began to reverse engineer the worm, a process that took four months and produced a 70-page dossier explaining how it worked.

Like a conventional weapon, Stuxnet had two parts: a delivery mechanism and a payload.

“Stuxnet used [a] zero day for delivery, but they found it used five zero day exploits over the course of its life, which was completely unprecedented – so they knew at this point that it was a nation state attack,” says Zetter. “In addition, Stuxnet used at least three other methods of spreading.”

One was to infect two files that programmers use to program Siemens software on a PLC – a programmable logic controller, the small industrial computer used to control the centrifuges in Iran.

Stuxnet initially would record the data produced by the PLC for 30 days. After that, the sabotage began.

“Stuxnet would start closing some of the valves in the centrifuge so that gas could go in through the entry valves, but couldn’t get out. Then it would wait about two hours, or until the pressure had increased [to] about five times the normal pressure. During that two-hour period Stuxnet would feed back the data it had recorded during those 30 days. So the operators looking at the machines would think that everything was running fine,” says Zetter.

Pressure

Stuxnet also disabled the safety mechanisms on the centrifuges. As pressure built, the gas would solidify and damage the centrifuge’s rotors. At the end of 30 days, the process would start over.
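For readers who think in code, the cycle Zetter describes can be sketched in a few lines of Python. This is a conceptual illustration only – the interface, method names and timings below are assumptions standing in for the behaviour she describes, not anything taken from Stuxnet itself.

```python
import time

RECORD_SAMPLES = 30 * 24      # illustrative: one reading per hour for ~30 days
SABOTAGE_SECONDS = 2 * 3600   # close the exit valves for about two hours...
PRESSURE_LIMIT = 5.0          # ...or until pressure reaches ~5x normal

def attack_cycle(plc):
    """Conceptual sketch of the record-then-replay sabotage loop described above.

    `plc` is a hypothetical interface to the centrifuge controller; none of
    these method names come from Stuxnet or the Siemens software.
    """
    # Phase 1: quietly record about 30 days of normal operating data.
    recording = [plc.read_sensors() for _ in range(RECORD_SAMPLES)]

    # Phase 2: sabotage, while replaying the recording to hide it.
    plc.disable_safety_systems()
    plc.close_exit_valves()                     # gas can enter but not leave
    start = time.time()
    i = 0
    while (time.time() - start < SABOTAGE_SECONDS
           and plc.read_pressure() < PRESSURE_LIMIT * plc.normal_pressure):
        # Operators see the old, normal-looking data instead of live readings.
        plc.report_to_operators(recording[i % len(recording)])
        i += 1

    # Phase 3: return to normal operation and wait before the next cycle.
    plc.open_exit_valves()
    plc.restore_safety_systems()
```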

“What this tells us is, they were not looking for one-time catastrophic destruction, but incremental damage over a period of time,” says Zetter. “It wouldn’t be initially noticed and would prevent the Iranian technicians from locating the source of the attacks and anticipating when an attack might be coming.”

A second version of Stuxnet, authorised by President Obama, used similar tactics to affect the frequency converters feeding electricity to the centrifuges and telling them how fast they should spin. This caused the motors to deteriorate quickly.

None of this was understood until November of 2010, after Stuxnet was reverse engineered.

Stuxnet was eventually exposed due to several weaknesses. One was that the increased capabilities built into the later version caused it to spread so quickly that contractors supplying services to Iran carried the worm to their offices worldwide and to clients. A few months after its launch, Stuxnet had already infected more than 100,000 computers, says Zetter.

The file was also huge for a piece of malware – 500KB, when most malware averages about 25KB – due to a bug in the zero day code that caused Stuxnet to append new versions of files to existing files, rather than simply replace them. This made the worm more visible to technicians.
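The effect of that bug can be shown with a toy example (purely illustrative – the filename and sizes below are made up to mirror the general idea, not Stuxnet’s actual files):

```python
import os

# Toy illustration of the growth bug described above: appending an updated
# payload instead of replacing it makes a file swell with every update.

payload_v1 = b"A" * 25_000            # roughly the size of typical malware (~25KB)
with open("dropper.bin", "wb") as f:  # intended behaviour: overwrite the old copy
    f.write(payload_v1)

payload_v2 = b"B" * 25_000
with open("dropper.bin", "ab") as f:  # buggy behaviour: append the new version
    f.write(payload_v2)

print(os.path.getsize("dropper.bin"))  # 50000 bytes, and growing with each update
```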

Incompatibility

Another bug caused an unforeseen incompatibility on some computers, making them crash and display the Windows “blue screen of death” – which is what drove the technicians to contact the Belarus antivirus firm.

Zetter says the attackers also never killed the code or stopped Stuxnet from sending log files back to its command and control server, even after it was apparent the worm had spread beyond Iran. She argues this was done to keep open the backdoors into millions of computers globally that were obtained in this way, which would form the basis for subsequent mass surveillance programmes.

Ultimately, Stuxnet set a worrying precedent because it “ignores the fact that our systems are just as vulnerable, because [US] systems are the most connected systems in the world.”

Its development also meant the US lost the moral high ground in demanding that other countries not use cyberwarfare techniques, Zetter says. And, inevitably, it launched many more digital warfare programmes across the world.