The dangers of over-reliance on ICT


BY PETER WANYONYI

Ukraine does not normally make much news in information and communications technology. But in December 2015, that changed dramatically. Two days before Christmas, engineers at one of Ukraine’s largest power companies suddenly realised that they had lost control of their computer systems to an unknown attacker. The engineers watched in horror as a remote agent took control of their main systems, monitors included, and proceeded to shut off power to hundreds of thousands of Ukrainian homes. As this progressed, another attack was under way at a different Ukrainian power company – this one used a compromised computer that no one could locate to shut down dozens of substations. December is the freezing depth of winter in Ukraine, and as engineers rushed to restore power across the country using manual overrides, the citizens who relied on those stations were left freezing in their homes.

Russia was widely reckoned to be to blame, for the Russians and the Ukrainians have been engaged in a high-stakes geopolitical game revolving around Ukraine’s desire to move closer to the European Union, and Russia’s determination to ensure that never happens. Many figured the attacks were a warning to Ukraine – but the warning went unheeded, and a year later the attackers struck Ukrainian power utilities again, once more in mid-winter. This time, the power shut-down was far more extensive. And, just to intensify the pain, the affected companies’ call centres were flooded with thousands of automated calls from abroad, ensuring that genuine customers couldn’t get through to find out what was going on. In a final coup de grâce, the attackers launched massive malware attacks that paralysed the power companies’ entire computer systems, wiping the machines and rendering the data held on them irrecoverable.

This sort of attack on utilities is not entirely unprecedented. Back in 2007, researchers in the United States demonstrated a frightening attack methodology: they caused a 2.25 MW diesel generator to destroy itself by sending it 21 lines of maliciously altered instructions, which commanded the generator to rapidly open and close its circuit breakers out of phase with the rest of the grid. The generator was wrecked in just three minutes – and would have been destroyed much faster had the researchers not kept pausing the test to inspect the damage.

ICT is a fantastic tool, but it carries risks and is, like most good tools, a double-edged sword. Iran discovered this – to general anger and frustration, not to mention millions of dollars in costs – in 2010, when its Natanz nuclear facility was the target of a huge cyber-attack claimed to have been orchestrated by Israel. At the centre of any conventional nuclear programme is the need for enriched nuclear fuel. Enriched uranium is necessary for power generation and, more importantly, for the manufacture of nuclear weapons. To achieve enrichment, uranium is typically fed through cascades of centrifuges that progressively raise the concentration of the fissile uranium-235 isotope. Those centrifuges are linked to each other through computerised control systems, and it was these systems that were the target of the 2010 attack, which used a highly sophisticated computer worm called “Stuxnet”. The worm instructed the centrifuges to spin at speeds beyond their design limits, all the while feeding back fake control data that showed everything proceeding without a hitch. Hundreds of expensive, difficult-to-replace centrifuges were destroyed in the process, setting the Iranian nuclear programme back by years.

Today, just about every public utility relies on computerised control systems. But, as we saw during the recent elections, computers aren’t everything, and they have a tendency to fail when you need them most. Even when they do not fail, they are susceptible to devastating attacks by third parties; if an organisation is totally reliant on them, with no manual backups, such an attack can put the organisation out of business at best, and cause extensive destruction and even loss of life at worst.

Much as ICT makes our work and lives easier, the most critical systems – including public utility control systems – should be designed with manual overrides in place, so that they can continue to operate even when their computerised controls fail. When a system’s responsibility is to deliver a public service or product that is critical to people’s lives, or without which the economy cannot function, then that system must have backup mechanisms that allow it to survive the failure of its controls. Because, as every IT professional knows, every IT system fails sooner or later.

The author is an information systems professional.