There has been some discussion about how effective Stuxnet could be against a facility without insider knowledge of how the control system for a process is set up. It is easy to see how an attacker with detailed knowledge of which programmable logic controller (PLC) operates which valve, pump, or other device could cause serious disruption or damage at that facility with a tool like Stuxnet. But if an attacker doesn't have that insider knowledge, how much damage could they actually do?
First, we need to look at what kinds of motivations could be behind a Stuxnet attack. In an earlier blog post I looked at a criminal extortion-type attack. In this post I would like to limit the discussion to terrorist attacks. Even here there are two types of attacks to consider: an attack intended to shut a plant down, and an attack intended to have a larger, off-site terror effect.
Plant Shutdown
We can imagine a number of players that could have a reason to want to shut down a chemical facility. The obvious candidates are competitors (an unlikely suspect; the chemical industry has not gotten that cutthroat), political adversaries (perhaps in retaliation for a perceived attack on one of their own process industries), or eco-terrorists (with a particular concern about the products of that plant).
Given access to a Stuxnet worm and the ability to program it to collect and report PLC tag data, an attacker with minimal knowledge of the process could make the plant effectively inoperable for extended periods of time. All it would take is the knowledge that most programmers use readily identifiable tags for devices: valves are labeled as valves, pumps as pumps.
With this knowledge, all it would take to force a plant shutdown would be to reprogram some of the PLCs to ignore change-of-state commands. For example, once a valve opened, the modified program could ignore the 'close' command; once a pump started, it could ignore 'stop' commands. While some of these failures could have serious consequences, the vast majority would be more of a serious nuisance than a catastrophic hazard. The probability favors management identifying the problem and shutting the plant down before a catastrophic failure resulted.
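One reason this kind of tampering tends to get caught is that operators can compare what was commanded against what the field devices actually report back. The following is a minimal, illustrative sketch of that idea in Python; the tag names, data structures, and the 30-second response window are assumptions for illustration only, and a real historian or HMI alarm package would do this with its own tools.

```python
# Illustrative sketch only: flag devices whose reported state has not
# followed the commanded state within an expected response time.
# Tag names, data layout, and the 30-second window are assumptions,
# not taken from any real control system.
import time

EXPECTED_RESPONSE_SECONDS = 30  # assumed actuator response window

def find_stuck_devices(commands, feedback, command_times, now=None):
    """Return tags whose feedback still disagrees with the last command.

    commands      -- dict of tag -> last commanded state ('OPEN', 'CLOSE', ...)
    feedback      -- dict of tag -> state reported by the field device
    command_times -- dict of tag -> time the command was issued (epoch seconds)
    """
    now = time.time() if now is None else now
    stuck = []
    for tag, commanded in commands.items():
        reported = feedback.get(tag)
        issued = command_times.get(tag, now)
        if reported != commanded and (now - issued) > EXPECTED_RESPONSE_SECONDS:
            stuck.append((tag, commanded, reported))
    return stuck

# Hypothetical example: VALVE_101 was commanded closed but still reports open.
if __name__ == "__main__":
    cmds = {"VALVE_101": "CLOSE", "PUMP_205": "STOP"}
    fbk = {"VALVE_101": "OPEN", "PUMP_205": "STOP"}
    times = {"VALVE_101": time.time() - 120, "PUMP_205": time.time() - 120}
    for tag, want, got in find_stuck_devices(cmds, fbk, times):
        print(f"ALARM: {tag} commanded {want} but reporting {got}")
```

A check like this does not prevent the attack, but it does support the point above: a valve that refuses to close announces itself fairly quickly.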
Again, in most cases the shutdown would last a relatively limited amount of time. The mess would have to be cleaned up and potentially some equipment replaced, but the actual length of the shutdown would be determined by how long it took to clear the Stuxnet-caused programming problem and make the system proof against a repeat attack. Even so, this could be sufficient for most of the probable attackers of this sort.
Off-Site Terror Attack
Deliberately conducting an attack with a high probability of an off-site effect would take a bit more process knowledge: not necessarily insider knowledge, but at least a knowledge of the chemistry involved. Again, the ability to collect tag information from the system would be key to executing this kind of attack.
In a system designed with due respect for process safety, there will be a number of operating restrictions programmed into the control system. For example, there would be interlocks preventing the valves for two incompatible chemicals from being opened at the same time, to prevent uncontrolled reactions. Valve tags frequently contain chemical names, again to make it easier for the programmer to keep track of which valves control which chemicals. Thus it could be a relatively simple matter to determine which safety programming could be reversed to cause a potentially catastrophic process reaction.
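To make the idea of these programmed restrictions concrete, here is a minimal, illustrative interlock sketch in Python. The tag names, the chemical pairing, and the permissive logic are hypothetical; real interlocks live in the PLC or safety system itself, typically written in an IEC 61131-3 language rather than Python.

```python
# Illustrative sketch of a software interlock: a request to open one
# valve is granted only if the incompatible partner valve is confirmed
# closed. Tag names and the chemical pairing are hypothetical examples.

INCOMPATIBLE_PAIRS = {
    "ACID_FEED_VALVE": "CAUSTIC_FEED_VALVE",
    "CAUSTIC_FEED_VALVE": "ACID_FEED_VALVE",
}

def open_permitted(requested_valve, valve_states):
    """Grant an open request only if the incompatible partner valve is closed.

    valve_states -- dict of tag -> 'OPEN' or 'CLOSED' (reported positions)
    """
    partner = INCOMPATIBLE_PAIRS.get(requested_valve)
    if partner is None:
        return True  # no interlock defined for this valve
    return valve_states.get(partner) == "CLOSED"

# Hypothetical example: the caustic feed is open, so the acid feed stays shut.
if __name__ == "__main__":
    states = {"ACID_FEED_VALVE": "CLOSED", "CAUSTIC_FEED_VALVE": "OPEN"}
    print(open_permitted("ACID_FEED_VALVE", states))  # False -> request blocked
```

The point of the sketch is simply how little logic stands between a descriptive tag name and the restriction it enforces, which is why reversing that logic is the concern raised above.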
An alternative tactic would be to change the temperature controls on a critical reaction. This could be done by manipulating the parameters of a temperature controller. Where the process controls provide for automated cooling once a set temperature is reached, changing that response to the application of heating could easily cause a runaway reaction with catastrophic consequences.
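For readers less familiar with how such a controller behaves, the sketch below shows a simple threshold response in Python: cooling is applied once the reactor temperature rises above its setpoint. The setpoint, deadband, and names are hypothetical, and an actual controller would use a PID loop running in the control system itself.

```python
# Illustrative sketch of a threshold temperature response: apply cooling
# when the measured temperature rises above the setpoint. The setpoint,
# deadband, and names are hypothetical.

COOLING_SETPOINT_C = 85.0   # assumed reaction temperature limit
DEADBAND_C = 2.0            # hysteresis so the output does not chatter

def cooling_demand(measured_temp_c, cooling_on):
    """Return True if the cooling output should be energized."""
    if measured_temp_c > COOLING_SETPOINT_C:
        return True
    if cooling_on and measured_temp_c > COOLING_SETPOINT_C - DEADBAND_C:
        return True  # stay on until temperature falls below the deadband
    return False

if __name__ == "__main__":
    print(cooling_demand(88.0, cooling_on=False))  # True  -> cooling applied
    print(cooling_demand(80.0, cooling_on=True))   # False -> cooling released
```

Because the whole response hinges on a setpoint and the sense of one output, parameter changes of this kind deserve the same change-management scrutiny as logic changes.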
This is not as high-probability an attack as placing an explosive device next to a flammable liquid storage tank, but it is less likely to result in the attacker being captured or killed in the course of the attack.
Insider Assisted Attack
Conducting a Stuxnet attack with a high probability of success, especially if success is measured by the level of off-site consequences, will require some level of insider knowledge of the process. This does not mean that the person programming the control system (a process engineer, or perhaps even a Siemens employee or contractor) has to be working with or for the attacker. Programming notes or process safety files would provide enough insider information to allow for more effective attacks using Stuxnet.
This type of information may be available to a number of employees, contractors, or even visitors, depending on the level of physical and document security at the facility. Process safety information, for example, is required by regulation to be accessible to all employees.
Stuxnet is a Threat
Thus we can see that Stuxnet or Stuxnet clones are a definite potential threat to chemical facility control systems. As far as I can tell, there is not currently any ironclad defense against Stuxnet-type attacks. The use of a variety of cyber security techniques may, however, increase the chance of detecting this type of attack before it can cause serious process damage.
Limiting the use of USB memory devices on the control system is an obvious first step, as is limiting connections between the ICS computers and the Internet. Furthermore, keeping all other computers on the corporate network current with anti-virus signatures and security patches will help protect the ICS from this type of attack.
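Because Stuxnet's effect ultimately shows up as a change to the control system's programming, periodically comparing exported project files against a known-good baseline is one detection technique worth considering. The sketch below is a minimal, illustrative example in Python; the file names, the baseline location, and the idea of hashing exported project files are assumptions, and commercial configuration-management tools do this in a more robust, vendor-aware way.

```python
# Illustrative sketch: compare hashes of exported control-system project
# files against a stored known-good baseline and report any differences.
# File names and paths are hypothetical examples.
import hashlib
import json
from pathlib import Path

BASELINE_FILE = Path("plc_baseline.json")  # assumed location of saved hashes

def sha256_of(path):
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def save_baseline(project_files):
    """Record hashes of the listed project files as the known-good baseline."""
    baseline = {str(p): sha256_of(p) for p in project_files}
    BASELINE_FILE.write_text(json.dumps(baseline, indent=2))

def check_against_baseline():
    """Return the paths whose current hash no longer matches the baseline."""
    baseline = json.loads(BASELINE_FILE.read_text())
    changed = []
    for path, known_hash in baseline.items():
        current = sha256_of(path) if Path(path).exists() else None
        if current != known_hash:
            changed.append(path)
    return changed

if __name__ == "__main__":
    # Hypothetical usage: run save_baseline() once after a verified download,
    # then check_against_baseline() on a schedule and alarm on any change.
    if BASELINE_FILE.exists():
        for path in check_against_baseline():
            print(f"WARNING: {path} differs from the known-good baseline")
    else:
        print("No baseline saved yet; run save_baseline() after a verified download.")
```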
1 comment:
Hey PJ - I agree with the points you make. In addition, I believe the Stuxnet worm should make us rethink plant safety designs. In my book, digital safety systems should be air-gapped from operations systems, and mechanical safeties should be designed to make catastrophic malfunction impossible. Safety systems are currently evaluated on a probabilistic basis - what is the likelihood that enough components will fail simultaneously in a way that could cause a catastrophe? Safety systems need to start being evaluated from an adversary's perspective: is there a set of components which, if destroyed or caused to malfunction simultaneously, could cause a catastrophe?
Take your example of a reaction tank with feeds of two different and incompatible chemicals - is there a mechanical safety system that can ensure that both valves are not open simultaneously, and that the tank is empty of all inputs before the source of inputs is switched?
I fear, though, that mechanical design changes like this are prohibitively expensive as a retrofit, and will be considered only for new plants. But to me, this kind of design is part of the future of security at chemical facilities. The worst consequence that a worm, or even an insider with access to the control system, should be able to produce is denial-of-service: triggering a safety shutdown. The air-gapped digital safeties and the mechanical safeties should be designed to prevent any more serious consequences.