Tuesday, January 31, 2012

More Waxahachie Emergency Response Notes

Last October I looked at the fire at the Magnablend chemical facility in Waxahachie, TX as a learning tool for emergency response planners. Recently the facility was once again in the news for emergency response activities related to the aftermath of that fire. According to a news article on WFAA.com, recent rains in the area caused the containment ponds that collected fire-fighting water (and the subsequent rainfall that helped ‘clean’ the facility) to overflow; ponds that “were presumed to still be polluted with chemical residue” according to the article’s author Brett Shipp.

Typically these run-off collection ponds are initially put into place by emergency responders and later improved somewhat by whatever clean-up company comes in to remediate the site. The initial runoff from the firefighting effort would probably have the highest concentration of dangerous chemicals. That is presuming, of course, that teams are able to quickly get into the facility and stop whatever leaks remain.

The initial fill of these ponds is usually emptied quickly in an effort to limit any additional environmental exposure to the chemical mixture involved. Most professional site restoration companies are well experienced in the physical and legal requirements of this process. These operations should be coordinated with local emergency response personnel so that they can respond appropriately to any incidents that occur in the process.

The containment structures are typically left in place until final site clearance is received to collect any subsequent run off from facility clean-up operations or rainfall runoff. The water collected is usually less contaminated than the initial collection in these ponds, but, depending on the chemicals involved at the site, may still harbor dangerous levels of hazardous chemicals. Remember that what constitutes ‘dangerous levels’ depends on the chemicals involved; some chemicals remain dangerous in the environment down to part-per-million or even part-per-billion levels.

Local emergency response planners need to ensure that these collection ponds are monitored for both contaminant levels and liquid levels. When heavy rains are forecast for the area, draining the current contents before the rain event may prove beneficial. Areas of the country that experience frequent short-notice periods of heavy rainfall may want to consider requiring secondary containment facilities to catch any pond overflows.
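
For planners who want to put a rough number on that pre-drain decision, here is a minimal back-of-envelope sketch. All of the capacity, catchment, and rainfall figures are hypothetical placeholders, not data from the Waxahachie site, and a real decision would also account for pumping rates and permit limits on discharges.

```python
# Back-of-envelope check: will a forecast rain event overflow a containment pond?
# All numbers below are hypothetical planning inputs.

ACRE_INCH_GALLONS = 27_154  # approximate gallons in one acre-inch of water

def expected_inflow_gal(rain_in, catchment_acres, runoff_coeff=0.9):
    """Estimate the runoff volume (gallons) reaching the pond from a rain event."""
    return rain_in * catchment_acres * runoff_coeff * ACRE_INCH_GALLONS

def remaining_capacity_gal(pond_capacity_gal, current_volume_gal):
    """Freeboard volume still available in the pond."""
    return pond_capacity_gal - current_volume_gal

if __name__ == "__main__":
    inflow = expected_inflow_gal(rain_in=3.0, catchment_acres=5.0)
    freeboard = remaining_capacity_gal(pond_capacity_gal=750_000, current_volume_gal=500_000)
    print(f"Expected inflow: {inflow:,.0f} gal; available capacity: {freeboard:,.0f} gal")
    if inflow > freeboard:
        print("Forecast rain exceeds remaining capacity; schedule a pre-storm drawdown.")
    else:
        print("Pond should hold the forecast event, but keep monitoring levels.")
```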

Provisions need to be put into place to keep these ponds isolated from the community, including restricting access to the ponds. They certainly meet the definition of ‘attractive nuisance’ and may actually be potential targets for fringe elements of the radical environmental movement, particularly if the company involved is already on the hit list for whatever real or imagined environmental slights. Less radical elements may also attempt to include such sites in ‘environmental actions’ designed to call attention to the hazards.

As with all emergency response plans, a formal process needs to be put into place to review these situations on an ongoing basis. Initial emergency plans for all facilities housing dangerous chemicals need to include run-off management plans. Those plans need to be reviewed and modified as necessary before the incident commander turns the scene back over to the owner or the environmental remediation company designated for site clean-up.

Monday, January 30, 2012

Siemens – The Big ICS-CERT Advisory

Today the DHS ICS-CERT folks published an unusual advisory. They took reports of vulnerabilities from four separate researchers (Billy Rios, Terry McCorkle, Shawn Merdinger, and Luigi Auriemma) and combined them into one big (eleven separate vulnerabilities) advisory on the Siemens WinCC application. Not only is the advisory big in the number of vulnerabilities, but the potential consequences of exploiting these vulnerabilities are also really big. ICS-CERT notes that:

“Successful exploitation of these vulnerabilities could allow an attacker to log on to a vulnerable system as a user or administrator with the ability to execute arbitrary code or obtain full access to files on the system.”

Given the wide range of facilities in which this Siemens application is used, an attacker would have a wide range of potential targets that could essentially be exploited at will, shutting down electrical transmission facilities, water treatment facilities, chemical plants, even automotive manufacturing facilities. Simultaneous attacks on a number of targets across a number of manufacturing and utility sectors could have a catastrophic impact on local, state, national, or even world economies.

The catalogue of vulnerabilities includes:

• Insecure authentication;
• Weak default passwords;
• Cross-site scripting;
• Header injection;
• Client-side attack;
• Lack of telnet daemon authentication;
• String stack overflow;
• Directory traversal (two separate vulnerabilities);
• Denial of service; and
• Arbitrary memory read access.

The good news (and I’m really having to stretch here to call this ‘good news’) is that ONE of the vulnerabilities requires user interaction to exploit. Fortunately for Siemens’ customers there have been so few successful social engineering attacks over the last year or so (pardon the gross sarcasm). The bad news (and it doesn’t come much worse than this) is that there are publicly available exploits for 7 of the 11 (Oh Craps, I know, pardon the pun) vulnerabilities.

The good news (another stretch) is that Siemens has dealt with each of these vulnerabilities. They have

• Patched 5;

• Changed product documentation to explain how to correct one during set up;

• Recommended deactivation of transport mode for four others; and

• Explained that users have the option of disabling the final vulnerability.

The bad news is that no one outside of Siemens has verified whether any of the above actions actually prevent the exploitation of the eleven vulnerabilities included in this report.

The final good thing is that ICS-CERT put all of these vulnerabilities into a single advisory, making it easier to keep track of what has been fixed or not. It might be a good idea to do the same sort of thing for Siemens’ PLCs.

New Version of HR 3674, ‘the’ House Cybersecurity Bill

As I noted in my blog post Saturday, there will be a subcommittee markup hearing for HR 3674, the Promoting and Enhancing Cybersecurity and Information Sharing Effectiveness (PRECISE) Act of 2011. As is usual with markups of bills like this, the hearing will start off with the Chairman, Rep. Lungren (R,CA), introducing his revised language for the bill; the subsequent proposed amendments will be made to that new language. So let’s take a look at the new version of his bill.

Overview


First off, nothing has been removed from the bill at this point (that could change later this week), so everything I wrote about this bill (then a draft of this bill) still pertains to this revised language.

Most of the changes have been technical wording changes that will mainly be of interest to lawyers and judges if this bill ends up being signed by the President. There were, however, a couple of new sections added at the end of the bill. They include:

§ 4. Report on Support for Regional Cybersecurity Cooperatives;
§ 5. Pilot Program on Cybersecurity Training for Fusion Centers; and
§ 6. Assessment of Sector by Sector Cybersecurity Preparedness.

Please note that §5 provides for training fusion center personnel in IT security practices to protect their information systems, not in cyber security threat assessment. It would have been nice to see a training requirement here, for instance, that would direct fusion center analysts to ICS-CERT for assistance in evaluating potential control system threats or attacks.

The bulk of the remaining changes can be found in Subtitle E, National Information Sharing Organization (NISO). Most of these changes have apparently been made to ensure that the NISO is not a ‘threat’ to civil liberties or legitimate information sharing activities.

ICS Coverage?


This bill remains at heart an information system protection bill, not an ICS protection bill. The new version does include an additional mention of ‘industrial control systems’. In §226(a)(7) the bill would require the Secretary of DHS to:

“establish, in coordination with the Director of the National Institute of Standards and Technology, the heads of other appropriate agencies, and appropriate elements of the private sector, guidelines for making critical infrastructure information systems and industrial control systems [emphasis added]  more secure at a fundamental level, including through automation, interoperability, and privacy-enhancing authentication”.

There continue to be a number of sections of the bill that do not contain the explicit language “critical infrastructure information systems” and these may imply coverage of control systems. These are generally reporting requirements or information sharing requirements and they do not provide any regulatory authority.

For example the new §4 of the bill requires the Secretary to report on:

“the Secretary’s plan to provide support to regional, State, and local grassroots cyber cooperatives designed to decrease cyber disruptions to critical infrastructure, increase cyber workforce training efforts, increase community awareness of cybersecurity, organize community cyber-emergency preparedness efforts, build resiliency of regional, State, and local critical services, and coordinate academic technical and policy research effort”.

There is mention of a potential grant program supporting these ‘cyber cooperatives’ (a term that is never defined), but there is no spending authority for such grants. This means that any grant money would have to come out of some existing grant program.

National Information Sharing Organization


The most controversial area of this bill continues to be the establishment of the National Information Sharing Organization which is also the section of the bill that sets up the conflict between this bill and HR 3523 (the bill sponsored by the House Intelligence Committee). Most changes to the NISO sections of this bill address privacy concerns.

For example, §244(9) sets forth the requirements for the protection of ‘privacy and civil liberties’. The new version of this bill adds subparagraphs (B) and (C) that specify that only ‘cyber threat information’ may be shared within NISO and that all “personally identifiable information not necessary to describe a cyber threat” be removed from information shared by and through NISO.

I noted in my earlier blog on this bill that the private sector board members of NISO did not include anyone from the water, chemical or transportation critical infrastructure key resources (CIKR) sectors. The revised version changes that somewhat in that it adds the water sector to those represented on the Board. The continued lack of chemical or transportation sector representation effectively shuts those sectors out of NISO participation.

The new version of this bill also financially guts NISO after FY 2015. Federal funding up until then consists of $20 million each fiscal year (and that comes out of the existing DHS S&T budget, no new money). After FY 2015 the only federal money going to NISO will be the Federal membership fee for NISO. Even that will be limited by §253(b) to no more than “the fee collected from the largest private sector member of the National Information Sharing Organization”.

Since §253(a) prohibits Federal appropriations supporting NISO, that fee will have to come out of the budget of DHS or three other “Federal agencies with significant responsibility for cybersecurity” {§243(b)}. Since none of the four is required to pay the Federal government’s ‘fair share’ fee, I bet this gets lost in the annual budget shuffle.

There are two new terms specifically defined in the NISO sections of this bill that might increase the applicability of NISO to control system security information sharing (but don’t hold your breath); ‘cyber attack’ and ‘cyber security criminal act’. The inclusive language for ‘cyber attack’ includes the phrase “causes or attempts to cause damage and loss” {§248(f)(1)(B)}. For ‘cyber security criminal act’ the phrase is “efforts to degrade, disrupt or destroy a cybersecurity system or network” {§248(f)(2)(A)}. Neither constitutes a resounding commitment to ICS security information sharing.

Further Amendments


The subcommittee markup hearing that starts on Wednesday (and may become a multi-day hearing) will undoubtedly include many changes to the wording of this bill. Watching the hearing itself will be little help in identifying those changes as the exact wording of the changes is rarely included in the live proceedings. Usually we just get the interpretations of what the various congress critters think the language means.

We will have to wait until the actual amendment language is posted to the House Homeland Security Committee web site. The staff of that Committee usually does a pretty good job of getting that information up quickly. After that we will have the full committee markup (maybe as early as next week). Then we will have to wait for four other committees to act (or more likely fail to act) on the bill.

Saturday, January 28, 2012

Congressional Hearings – Week of 01-30-12

Congress has a full week (for Congress, 4 days is a full week) of work ahead of them, including two hearings that will certainly be of interest to readers of this blog: ISCD Problems and Cybersecurity Legislation.

ISCD Problems


The Environment and Economy Subcommittee of the House Energy and Commerce Committee will be holding hearings on the current problems at ISCD on Friday. Actually the title of the hearing is “Evaluating Internal Operation and Implementation of the Chemical Facility Anti-Terrorism Standards program (CFATS) by the Department of Homeland Security”; and I thought that I had a tendency to get wordy.

No witness list is currently available, but I would bet that the first panel includes Under Secretary Beers and Director Anderson. If that is the only panel of witnesses, the hearing will be a typical Congressional waste of time. If the second panel is industry reps, it will be almost as much of a waste of time. The only way that this hearing will be meaningful is if it includes sworn testimony from people within ISCD, including the facility inspection force; I’m not holding my breath.

What is disappointing is that the first hearing on this topic is by a subcommittee of the Energy and Commerce Committee. First, we are certainly past the point where we should be wasting time with subcommittee hearings, since they will certainly have to be duplicated by the full committee before anything can be accomplished. Second, it is a sign of the utterly stupid organization of Congressional oversight of DHS components that this hearing is not being held by the Homeland Security Committee. Of course, Rep. King (R,NY) and Rep. Thompson (D,MS) have been absolutely silent on the ISCD issue, so maybe it is better that someone else does the hearings.

One last rant point here; if the hearing record does not include a public copy (redacted if absolutely necessary) of the internal NPPD report on the problems, the Subcommittee needs to be swept from office in November and the Committee Staff fired on the spot. I know, it won’t happen, but I just had to vent.

Cybersecurity


The Subcommittee on Cybersecurity, Infrastructure Protection, and Security Technologies of the House Homeland Security Committee will be holding a potentially multiple day mark-up hearing on HR 3674 starting on Wednesday. I did a blog post on this bill before it was actually introduced and most of that discussion remains applicable to the bill going into this hearing.

Chairman Lungren (R, CA) will be submitting substitute language for this bill at this hearing. There are some interesting changes being proposed (including some minor but specific control system language), but that is a subject for a separate blog post.

This bill has the hallmarks of being the potential cyber-security bill for this session. The only drawback is that it was also referred to the following committees for consideration:

• House Oversight and Government Reform
• House Science, Space, and Technology
• House Judiciary
• House Intelligence (Permanent Select)

I know the Intelligence Committee has their own bill (HR 3523) that has some conflicting provisions with the current and proposed versions of HR 3674, so we can bet that they won’t hold any hearings on this bill. Similar issues may arise with the other committees as well. The House and Senate leadership are committed to passing cybersecurity legislation this session, but that doesn’t necessarily trump committee politics.

Friday, January 27, 2012

ICS-CERT Publishes OAS OPC Advisory Update

Today was the day that the DHS ICS-CERT published their updated Advisory on the Open Automation Software OPC Systems.NET vulnerability. As I mentioned in an earlier blog post this update adds a second vulnerability to the one initially discovered by Luigi; the second being discovered by Digital Security Research Group (DSecRG).

The latest vulnerability in this system is a reported buffer overflow in the ActiveX control for the system. It would allow a moderately skilled attacker to ….. Hmm, ICS-CERT doesn’t say what the vulnerability would allow an attacker to do and neither does the DSecRG report on the vulnerability. Oh well, I guess it doesn’t matter because the updated version of OPC Systems.NET released to deal with the Luigi vulnerability also fixes this one. And everyone always updates their systems when a security update becomes available – don’t they?

The long history of this Advisory (dating back to the original Luigi based alert) shows how complicated ICS vulnerabilities can get. This update makes things even more interesting by noting that the new buffer overflow vulnerability in the OAS OPC Systems.NET isn’t really an OAS vulnerability. The vulnerability actually resides in the ActiveX component FlexGrid 7.1, a third-party component of OPC Systems.NET.

As I have mentioned a number of times, finding a vulnerability in a third-party component automatically brings a question to my mind: what other ICS systems are using the same component and thus potentially have the same problem? Unfortunately, there is no way for anyone to know, since system vendors don’t report if/when/where they use third-party component software. Until, of course, a security researcher finds the same vulnerability in another system.
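
Owners can at least answer that question for their own installations by inventorying where a suspect component actually lives on their Windows HMI and engineering workstations. The sketch below is a hedged illustration; the 'flexgrid' filename patterns and search paths are assumptions for the sake of the example, not indicators taken from the advisory.

```python
# Minimal sketch: inventory a Windows host for files matching a third-party
# component name. The "flexgrid" patterns below are assumed for illustration;
# substitute whatever component you are actually tracking.
import os
import fnmatch

SEARCH_ROOTS = [r"C:\Program Files", r"C:\Program Files (x86)", r"C:\Windows\System32"]
PATTERNS = ["*flexgrid*.ocx", "*flexgrid*.dll"]  # hypothetical filename patterns

def find_component(roots, patterns):
    """Walk the given directory trees and return paths matching any pattern."""
    hits = []
    for root in roots:
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                if any(fnmatch.fnmatch(name.lower(), p) for p in patterns):
                    hits.append(os.path.join(dirpath, name))
    return hits

if __name__ == "__main__":
    for path in find_component(SEARCH_ROOTS, PATTERNS):
        print(path)
```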

PHMSA Radioactive Tissue Holder Notice

NOTE: This is not about chemical security, ICS security, pipeline safety, or even chemical safety. Sometimes I just have to vent about government stupidity and I own this space.

Today the Pipeline and Hazardous Materials Safety Administration published a Safety Advisory Notice in the Federal Register (77 FR 4398) dealing with radioactive tissue holders; you know, the holders for facial tissues, Kleenex®. It seems that Bed Bath and Beyond® sold some 220+ tissue holders in the United States that were contaminated with Cobalt-60 during their manufacture in India and emit low levels of radiation.

Now having read that information in the Summary section of the notice, I expected to read in the body of the notice that PHMSA was providing shipping instructions for sending these radioactive sources back to somewhere. Since most consumers would not have access to training on shipping hazardous materials or preparation of the paperwork required to accompany such shipments I really expected that PHMSA would provide notice that they were publishing an unusually special Special Permit to allow consumers to get this dangerous material into appropriate hands.

Didn’t happen. It simply tells people to contact Bed Bath and Beyond for “information about proper return procedures”.  WHAT? Okay, cool down, read some more, there must be an explanation.

How dangerous is this tissue holder? Here is what the notice says:  

“The highest identified radioactivity level on the surface of the tissue holders was approximately 20 mrem/hr, however most of the tissue holders showed much lower levels. A person who spends eight hours in close contact with one of these tissue holders (such as having the tissue on a bedside table next to the bed) could possibly get a maximum yearly dose of about 500-700 mrem. While no unnecessary radiation exposure is desirable, the dose from the tissue holders is not expected to cause any appreciable health effects. To put this into perspective, a person living in the United States receives a radioactive exposure of about 360 mrem/year from naturally-occurring background radiation.”
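
A quick back-of-envelope check of those quoted numbers (the arithmetic is mine; the inputs are from the notice) helps put the hazard in perspective:

```python
# Back-of-envelope check of the figures quoted in the PHMSA notice.
peak_surface_rate_mrem_hr = 20   # highest reading on a tissue holder surface
annual_dose_mrem = 700           # upper end of the quoted maximum yearly dose
hours_per_day = 8                # exposure scenario described in the notice
background_mrem_yr = 360         # quoted U.S. background exposure

# Average exposure rate implied by the quoted yearly dose
implied_rate = annual_dose_mrem / (hours_per_day * 365)
print(f"Implied average rate at the exposure point: {implied_rate:.2f} mrem/hr")
# Roughly 0.24 mrem/hr, about 1/80th of the peak surface reading, which is
# consistent with most holders reading much lower and with some distance
# between the holder and the person.

print(f"Quoted yearly dose vs. background: {annual_dose_mrem} vs {background_mrem_yr} mrem")
# About twice normal background; unwanted, but hardly a radiological emergency.
```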

Okay, it’s really not that big a thing. People should be able to pack these up in a sturdy cardboard box and ship them to some B3 location for appropriate consolidation and disposal. B3 will have some issues to deal with and will be screaming at their Indian supplier. Consumer question: is someone actually making tissue dispensers out of steel? Whatever happened to plastics, for gosh sakes?

Now for the big question: What the Hell is PHMSA doing publishing this notice? Wouldn’t it be more appropriate coming from the Consumer Product Safety Commission? Aren’t they the ones that are responsible for protecting us against unsafe consumer goods????? PHMSA is in the Department of Transportation. They are responsible for transportation issues related to hazardous materials, not radioactive sources sitting on the night stand. How many consumers read the damned Federal Register?

PHMSA is behind enough in their normal work. If it isn’t transportation related, let the appropriate federal agency handle public notices of this sort. Do your work not theirs.

BTW: Did anyone tell the TSA airport security screeners about these dangerous tissue holders that could be used as potential radiological devices aboard aircraft? Do they have pictures to help them identify these devices? Do they have radiological detection devices? I am being sarcastic here, let’s not get carried away.

Thursday, January 26, 2012

Rail Borne Chemical Threat to Super Bowl?

An Indianapolis TV station’s web site is reporting that CSX will halt rail traffic past the Lucas Oil Stadium on Super Bowl Sunday this year. The tracks run within a block of the stadium, and train traffic will not be allowed on those tracks starting about 3 hours before game time until after the Stadium is emptied after the game. The fear is, of course, that a hazardous material leak (accidental or deliberate) that close to the game site could put thousands of people at risk.

Okay, a pause to allow our friends from Greenpeace and other environmental activist organizations to ask “What about the thousands of people who live and work along that same line every day?”

Accidental Releases


They are, of course, correct in that anyone living within a certain distance of a railroad track is placed at increased risk of exposure to any hazardous chemical being carried in any of the rail cars. The amount of increased risk is infinitesimal; railroads have a very enviable safety record, whether measured by the absolute number of fatal chemical incidents or by the number of releases per ton-mile of hazardous material transported. Only pipelines have a better overall safety record.

If the risk is sooo small, why are they stopping the train traffic on game day? A small part of the reason is that even an infinitesimal risk is way too high when you are dealing with high-profile events like the Super Bowl. Even a relatively small and hardly dangerous leak of a moderately toxic chemical along the nearest point of approach to the Stadium (named after an oil company, how ironic) would result in the game being stopped and the stadium being evacuated. That would kill Indianapolis’ chance of ever getting another high profile event in their fair city.

Deliberate Attacks


The real reason has nothing to do with accidental releases. If that were the case, those trains would be stopped every time the Colts play at home. It hasn’t happened. It won’t happen. No one is concerned with accidental releases. It is a terrorist attack converting one or more of those railcars into chemical weapons, improvised explosive devices or flame weapons that keeps the CSX security people awake at night as Super Bowl Sunday approaches.

Now the risk of a successful terrorist attack (defined as resulting in a catastrophic failure of the railcar tank with impressive off-site consequences; death and destruction) on a rail car is relatively low. The cars are made of very thick welded metal designed to resist damage in normal handling and low-speed derailments. Portable explosive devices designed to take out such hardened targets are not available via Terror-U-Online; building one requires the services of an explosives engineer and lots of hands-on time with a stationary vehicle. Oh yes, and it would have to be large enough to be readily detectable.

Unless, of course, one were to place the device inside of the sealed and filled railcar. But that’s a topic of a completely different post.

Partially Successful Attack


Of course, a successful attack doesn’t really have to be successful to be successful, if you get my meaning (of course you don’t, I’m being entirely too cute, but I will explain). If a targeted release of a chemical (and it wouldn’t even have to be really hazardous) were visible to the news teams covering the game, the security advisors for the event would have to immediately begin evacuation procedures even before they knew the actual nature of the release. There would be widespread panic resulting in potentially hundreds of deaths or serious injuries; all in front of the eyes of the world.

And that is the reason that CSX is stopping the flow of all rail cars by Lucas Oil Stadium on Super Bowl Sunday, but letting them flow the other 365 days of 2012.

Reader Comment: TSA Video Surveillance Report

I got a real nice response to yesterday’s TSA Video Surveillance blog post from the President of SightLogix. The comment is posted to the original blog and is well worth reading. The interesting point that he makes (from my point of view) is that the un-redacted TSA video surveillance report (and others like it) is posted on the “TSA’s Secure Webboard”. This is apparently a restricted information (SSI I presume) sharing site that is accessible to registered Airport Security Coordinators; appropriate as that’s who needs this type of information about these security measures.

The interesting comment from him is that he (personally) does not have access to the un-redacted TSA report about the testing of the installation of his company’s equipment. I understand that there is a whole ‘need to know’ issue here, but business decisions and equipment recommendations need to be made based upon reports like this one. Oh well, I would hope that someone in his organization has access to this web site or was at least allowed to review the report before it was posted.

Now the other point: the ASC web site is great for airport people. But this is not an issue restricted just to airports. Any number of other critical infrastructure facilities have boundaries that need to be surveilled. The information from this testing would be a great piece of information for security managers at these sites as well. I would think that TSA would be able to find a way to share the information with other TSA-monitored security programs (a small list to be sure) like railroad facilities.

DHS is going to have to be involved in making this information available to the rest of the non-transportation facilities that have federally mandated perimeter security requirements like CFATS and MTSA. The information about boundary security is applicable to almost any type of facility.

Reader Email – Expected Alerts are not Coming

In my last two ICS-CERT related blogs I noted that the Digital Security Research Group (DSecRG) web site had two additional ICS vulnerabilities reported that had not yet shown up as ICS-CERT alerts. I heard from two different sources today why those alerts are probably not forthcoming. The first came from a semi-anonymous email (it came from a gaming site, but it was signed with a PGP signature) and the second came from a caller claiming to be from ICS-CERT, but I didn’t catch the name as I was running between three meetings at the time.

Default Passwords


The DSecRG web site describes vulnerabilities in Tecomat PLCs and the Open Automation Software (OAS) OPC system. According to both sources (in almost identical wording, same person perhaps?) the Tecomat PLC vulnerability is really nothing more than a list of default passwords that should be changed upon system installation; anyone want to venture a semi-educated guess as to how often they are actually changed on PLCs? I don’t know, but I would suspect much less often than security folks would like to see. After all, PLCs are not connected to the internet, so why bother?

Both sources said:

“That is not a vulnerability. If they are not changed than that is a configuration issue. (We can not prevent integrators from being stupid).”

The pejorative aside, I can certainly understand why ICS-CERT and many security professionals would take that attitude. They have enough serious ICS security issues without having to worry about people not changing default passwords.

Having said that, many of these systems were installed before most organizations had even heard the term ‘cybersecurity manager’. Now most critical infrastructure facilities (at least) have a person wearing that hat (okay and maybe a couple others as well) who needs to determine if there are any unresolved vulnerabilities in their legacy systems (all new systems, as we all know, come with sophisticated cybersecurity suites; SARCASM Warning). I would expect that a real common problem in many (if not most) of those older systems is that they were installed without changing any of the default passwords.

If an energetic cybersecurity manager knew which systems came with default passwords and knew what they were, it would be a relatively easy task (okay, so that is a slight exaggeration, and our receptionist is just slightly pregnant) to go back and check all of those devices to ensure that the default password is not still active. Without lists like this from people like DSecRG or ICS-CERT, it would be nearly impossible to determine what the default passwords on legacy systems might be in order to verify that they had, in fact, been changed.
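
As a rough illustration of how simple such a check could be if the lists existed, here is a minimal sketch. It assumes the devices expose an HTTP interface protected by basic authentication (many embedded controllers do, but verify against your own equipment), and the device/credential CSV is a hypothetical file you would have to build yourself, which is exactly the point about needing published default-password lists.

```python
# Minimal sketch of a default-credential audit against devices that expose an
# HTTP interface with basic authentication (an assumption; adapt to the actual
# management interface of your equipment).
import csv
import requests

def accepts_default(ip, username, password, timeout=5):
    """Return True if the device appears to accept the vendor default credentials."""
    try:
        resp = requests.get(f"http://{ip}/", auth=(username, password), timeout=timeout)
        # A 200 here is only a hint; some devices return 200 without any
        # authentication at all, so confirm findings manually.
        return resp.status_code == 200
    except requests.RequestException:
        return False  # unreachable or not speaking HTTP; flag for manual follow-up

if __name__ == "__main__":
    # devices.csv columns: ip,vendor,default_user,default_password (hypothetical file)
    with open("devices.csv", newline="") as fh:
        for row in csv.DictReader(fh):
            if accepts_default(row["ip"], row["default_user"], row["default_password"]):
                print(f"DEFAULT CREDENTIALS STILL ACTIVE: {row['vendor']} at {row['ip']}")
```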

Well, if ICS-CERT isn’t going to worry about the problem, maybe SCADAHacker can just add that to the lists he is maintaining on various ICS security issues.

OAS OPC Advisory


Both sources told me today that ICS-CERT was going to be issuing an update on the recent OAS OPC advisory. That update (already planned apparently) will also address the vulnerabilities identified on the DSecRG web site as they are already being dealt with by OAS. If that update provides appropriate mitigation measures for the DSecRG identified vulnerabilities, that certainly sounds like an efficient way of dealing with the problem. No word on when that will be published; hopefully in the next day or two.

Wednesday, January 25, 2012

TSA Analysis of Video Surveillance System

I typically don’t try to promote specific security systems, as I am not a ‘qualified expert’ in much of anything that would allow me to make an authoritative evaluation of any particular product. Every once in a while, though, I run across (thanks in this case to a SCADAHacker tweet) an evaluation of a system by someone or some organization that should be qualified to do such an analysis, and I think it’s worthwhile to look at such evaluations. I recently ran across a TSA report on a video analytics system used to secure an airport perimeter that falls into this category.

The Report


The report was prepared as part of TSA’s Airport Perimeter Security project, which provides technical evaluations of perimeter security systems currently being employed at facilities around the country. This project should provide security managers with an important independent evaluation of integrated security products to supplement claims made by manufacturers and system integrators. This is apparently the first of 15 (perhaps 21, the wording of the report is sort of vague) such reports that TSA is currently preparing.

The actual evaluation was done by the National Safe Skies Alliance, a non-profit organization formed to “support testing of aviation security technologies and processes”.

Redacted Information


One would expect that an in-depth review of a security system would involve the disclosure of some sensitive information that might be useful to someone trying to compromise that system. This report is no exception. TSA has dealt with that by redacting (blacking out) certain information in the report. While this protects the security of the installation being evaluated, it does somewhat compromise the usefulness of the evaluation.

For example, the report redacted a site diagram (page 3, 15 Adobe) showing the areas covered by the video system; an understandable exclusion. Partially understandable, but certainly less helpful to security managers, was the redaction of the intrusion detection rates reported for the four individual intrusion techniques tested (with any details of the intrusion techniques also redacted). What makes this somewhat confusing is that in the summary discussion of system accuracy the report notes that over 900 intrusion scenarios were performed (four intrusion techniques performed at a variety of locations within the detection range of seven devices) and that “every alarm instance was accurately reported through the primary management software” (page 13, 25 Adobe).

So what is redacted is the rate of failure to detect; darn, that could be valuable information for security managers. What is less clear is how this would compromise system security unless the detection rate is extremely poor. If the system had a high rate (say 80% for the sake of discussion) that would warn attackers to stay away, since their attack would have an 80% chance of being detected at the perimeter. On the other hand, if the detection rate were low (say less than 20%) that might make the attacker more willing to risk the attack.
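
The arithmetic behind that attacker calculus is simple enough to sketch. The sketch below assumes each detection opportunity (a sensor zone or crossing attempt) is independent, which is itself a simplification, but it shows how quickly even a mediocre per-zone detection rate compounds.

```python
# Chance of slipping past k independent detection opportunities when each one
# detects an intruder with probability p.
def prob_undetected(p, k):
    return (1 - p) ** k

for p in (0.80, 0.20):      # the two detection rates used for discussion above
    for k in (1, 3):
        print(f"detection rate {p:.0%}, {k} zone(s): "
              f"{prob_undetected(p, k):.1%} chance of getting through undetected")
```

At an 80% per-zone rate an attacker crossing three zones has less than a 1% chance of getting through unseen; at 20% the odds are better than even.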

Missing Information


While one can understand why much of the redacted information is not available, the information that is simply missing from the report is much more bothersome. One of the general complaints about automated surveillance systems is their relatively high rates of nuisance alarms (natural environmental movements that set off the detectors) and false alarms (detections with no known cause). Those rates are missing from this report.

In the ‘Scope’ section of the report the author notes that the evaluation period was insufficiently long to establish nuisance or false alarm rates or to determine their cause. I find this hard to believe when there was time enough to evaluate 900 intrusion attempts by two field testers. At the very least the report should have included information about the number of nuisance or false alarms observed during the test period. This may not be statistically sufficient to establish a true rate, but it would provide valuable data in any case.
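
Even a raw count would support a rough rate estimate. The sketch below uses hypothetical counts and the standard exact Poisson confidence interval to show how wide, but still useful, such an estimate would be after a short test window.

```python
# Exact Poisson confidence interval for an alarm rate from a short test window.
# The observed counts below are hypothetical.
from scipy.stats import chi2

def poisson_rate_ci(events, exposure_days, conf=0.95):
    """Return (lower, upper) bounds on the event rate in events per day."""
    alpha = 1 - conf
    lower = 0.0 if events == 0 else 0.5 * chi2.ppf(alpha / 2, 2 * events)
    upper = 0.5 * chi2.ppf(1 - alpha / 2, 2 * (events + 1))
    return lower / exposure_days, upper / exposure_days

if __name__ == "__main__":
    nuisance_alarms, test_days = 12, 30   # hypothetical observation
    lo, hi = poisson_rate_ci(nuisance_alarms, test_days)
    print(f"Observed {nuisance_alarms} alarms in {test_days} days: "
          f"{nuisance_alarms/test_days:.2f}/day (95% CI {lo:.2f} to {hi:.2f} per day)")
```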

What concerns me more is the report’s statement that the reason it could not distinguish between nuisance alarms and false alarms (an important distinction) was that the causes of alarms “had not been recorded by BUF (airport security personnel) personnel”, so there was no way to verify alarm type. This would seem to indicate that security personnel were not really paying attention to the alarms on their system, or at the very least were not investigating alarms sufficiently to determine if an intrusion was actually taking place. This is not a fault of the report, but rather of the security management at the facility.

Interestingly, in the discussion of the results portion of the report there is a large redacted box in the section dealing with “Nuisance and False Alarm Reporting” (page 12, 24 Adobe). It would be really nice to know what was discussed there.

Overall Report Evaluation


I’m glad to see that TSA is having this type of system evaluation done. Unfortunately the usefulness of the information presented is compromised by the redaction of evaluated data. In most cases I can understand and even agree with the reasoning for the redaction in the public presentation of this data. For this to be worthwhile, however, TSA is going to have to find a way to make the un-redacted information available to airport security managers and security managers at other critical infrastructure sites. Otherwise this report will just sit on a shelf collecting dust.

Tuesday, January 24, 2012

ICS-CERT – Two New Advisories but Two Alerts from Last Week still Missing

This afternoon the DHS ICS-CERT published two new advisories, both with multiple vulnerabilities. The advisories are for Ocean Data Systems’ Dream Reports and MICROSYS’ Promotic systems. Strangely missing are the two alerts that I predicted this weekend for vulnerabilities publicly disclosed by the Digital Security Research Group (DSecRG).

Ocean Data Systems Advisory


Rios and McCorkle reported the two vulnerabilities addressed in this advisory. The first is a cross-site scripting vulnerability that is remotely exploitable and does not require much in the way of skills to execute. The second is a write access violation vulnerability that is a tad bit more complicated to exploit, requiring a successful social engineering attack and the creation of a specially crafted data file.

Ocean Data Systems has published a new version of the Dream Report product that has been confirmed to be free of these two vulnerabilities. Separate CVE numbers have been assigned, but are not yet active.

MICROSYS Advisory


While it is not mentioned in this advisory, it is an update of an alert issued last October for three vulnerabilities found in the Promotic HMI. Those vulnerabilities were reported by our friend Luigi. The vulnerabilities identified were:

• Directory Traversal, CVE-2011-4518;

• ActiveX Stack Overflow, CVE-2011-4519; and

• ActiveX Heap Overflow, CVE-2011-4520

All three are remotely exploitable by a relatively low-skilled attacker. The first could be used to cause some data leakage and the other two could be used as part of a DOS attack. The latest version of Promotic is free of these vulnerabilities and is downloadable from the MICROSYS website. The above listed CVE numbers are not yet active.

Missing Alerts


Last Sunday I noted that in addition to the WAGO vulnerability covered in an ICS-CERT alert from Friday, there were two other system vulnerability reports from DSecRG describing vulnerabilities in Tecomat PLCs and the Open Automation Software (OAS) OPC system. Both of those should have received ICS-CERT alerts on Friday or yesterday. They were still not posted as of 20:30 EST today; curiouser and curiouser.

Reader Comment: Basecamp Communications Devices

It took me a while, but I finally got a chance to ‘moderate’ a response to this weekend’s blog post on the Basecamp disclosure process from Dale Peterson; one of the drawbacks to traveling cross country by car is that you can’t do much work on the internet. Dale explains the reasoning for including the Koyo ECOM100 and notes that the Schweitzer alert was for a wireless communications device, the SEL 2032 Communications Processor.

As Dale points out, vulnerabilities in the communications nodes between the PLCs and the control system are essentially major vulnerabilities for the control system and the PLC; they can allow protected access to both. As such they were clearly fair game for analysis.

The only point that I was trying to make about the ECOM100 being a ‘ringer’ (and the same point should have been made about the Schweitzer device) is that the PLC vendors had clear public notice about what was going to happen with the research into their devices. Since they should have known about the disclosed vulnerabilities (especially the ones that were specifically designed into the systems), they have no cause to complain about the ‘uncoordinated disclosures’. They are the ones that put their customers at risk not Project Basecamp.

Unless the Project Basecamp team provided direct notification to Koyo and Schweitzer about their products being included in the evaluation, the same blanket dismissal of concerns does not apply. On the other hand, the process industry really does need to understand that these types of devices (and I assume that the same types of vulnerabilities will show up in many if not most of these types of devices currently in use) may provide a broad avenue of attack on control systems. This clearly needs to be recognized and addressed.

So with the caveat that the following does not apply if they received advanced notification of inclusion in the Project Basecamp investigation, I think that both Koyo and Schweitzer were poorly treated by an uncoordinated disclosure of their vulnerabilities. More importantly their customers may have been unduly put at risk by not allowing these two manufacturers a chance to correct the system defects before the vulnerabilities were made public.

Twenty lashes with an al dente noodle to Dale Peterson for his unsportsmanlike conduct, one for each of the uncoordinated disclosures affecting these two manufacturers (again with an immediate pardon if they received advanced notification of inclusion in the process). On the other hand, I think that it is time to look at all of the devices and systems that we employ to control critical processes, so a small quiet kudo to Dale as a salve to his wounds for his efforts (and of course the hard work of the entire Project Basecamp team and supporters) to bring formal attention to this problem.

Monday, January 23, 2012

The Disclosure Debate – Basecamp Disclosures

I have been asked to weigh in on the ongoing debate about the recent PLC vulnerability disclosures by Digital Bond’s Project Basecamp. The apparent assumption behind the request is that since I am not a cybersecurity researcher, but rather a chemical facility security advocate, I might have a different set of insights into the disclosure process. As I am almost always willing to provide my opinion on just about any topic, I could hardly turn down the request.

Ground Rules


First off I have to make clear that I have a professional relationship with Digital Bond. I periodically post on their blog about cybersecurity legislative matters. Dale Peterson has asked me to do so periodically, but he does not provide any remuneration beyond the access to a wider audience for my musings. He has personally made clear to me that I would have to really work hard to piss him off enough with any Project Basecamp criticisms to harm our professional relationship. That’s good to know, but it doesn’t really influence what I would write; people who know me well realize that I will express my professional opinions almost completely regardless of who will be upset by them or impressed by them.

Second, readers of this blog will almost certainly be aware that I generally come down on the side of full and open discussion of vulnerabilities. Over the last 4½ years I have described a number of potential physical vulnerabilities for chemical targets and discussed how they could most probably be successfully attacked by terrorists. I usually leave out critical details that only a well-trained terrorist or military man would be aware of so as not to encourage wannabes, but those details are not going to affect the response of defenders in any material fashion. And that is the key to the discussion of vulnerabilities on this blog; they are provided so that owners and operators of high-risk chemical facilities might better understand the risks they face.

Finally, I am not now, never have been, nor probably ever will be the owner of a control system. I have been a user as a process chemist, but I have never been responsible for the purchase, set up or protection of an industrial control system. It may be a subtle difference, but I don’t want anyone thinking that my musings in any way represent the opinions of any portion of the chemical security community beyond the owner of this blog.

The Vulnerabilities Exist


The vulnerabilities that were discovered by Project Basecamp exist and have existed for some time. The Project Basecamp team went looking for these specific vulnerabilities because they exist in other PLCs, specifically the Siemens PLCs. And no one was really surprised that they were able to find these particular vulnerabilities.

The designers of these PLCs knew that these vulnerabilities were there. In many cases the vulnerabilities were apparently specifically designed into the equipment. The vendors could have corrected these vulnerabilities at any time.

Finally, Project Basecamp has been in the works for some time. Dale has been talking about what the team was going to be doing for quite some time. Nobody in the vendor community or the security researcher community or in the regulatory community should have been surprised by the results or the way in which they were communicated at the end of the Project.

Systems are at Risk


The facilities that use control systems that use these PLCs are at risk for potential attacks on their facilities employing the vulnerabilities that were reported by the Project Basecamp team. They have been at risk for such attacks since they first employed these devices. There has been some incremental increase in the level of that risk since Basecamp disclosures were made; how much of an increase no one really knows for sure.

The lack of surety is due to the fact that no one knows who else has been working on discovering the details behind these vulnerabilities and has already developed specific attack vectors using these vulnerabilities. In fact, using the Stuxnet model (or even the Duqu model) we don’t know how many facilities may have already been successfully attacked using these vulnerabilities.

Dale obviously selected a good team, but I would be extremely surprised if there weren’t hundreds of security researchers out there with skills at least as good as this team’s. Yes, I said hundreds. Do not forget that China and Korea (and probably Russia and India and Israel and …) have specifically gone about developing offensive cyber-warfare capabilities, which would require developing thousands of cyber security research specialists; many of whom would of necessity be focused on industrial control systems. And that’s not even considering the cyber-criminal underground that certainly exists.

The Upside


What has certainly increased is the awareness that these specific vulnerabilities exist and the methods to exploit them are now generally available. Any cyber-security contractor, ICS owner, or government regulator can use these tools to determine if a specific ICS installation is susceptible to attack using these vulnerabilities.

There will be some installations where other security measures already in place make an outside attack very difficult or perhaps even impossible (I wouldn’t hold my breath waiting on that). There will be others where the local Junior High School computer nerd can own the facility. Most will fall somewhere in the middle between these two extremes.

Knowing the specific level of vulnerability and the mode of attack that could be employed, security controls can be put into place to mitigate (though certainly not eliminate) the risk of attack using these specific vectors. Most of these are well known and understood. ICS-CERT (and Digital Bond) have been talking about them for years.

Regulators should take specific note of the tools made available via the Project Basecamp disclosures. Any security inspection at a power transmission facility or high-risk chemical facility that does not include the use of these tools to evaluate the security of the control systems employed at that facility cannot be called a real security inspection (Congress please note that this reality should be included in any ‘comprehensive cybersecurity legislation’ being developed in this session). ICS-CERT should immediately develop a training program for Federal, State and local government security inspectors in how to utilize these readily available tools to conduct such inspections.
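
Short of running the Basecamp tools themselves, even a basic reachability check tells an inspector something useful. The sketch below is a hedged illustration: from a network segment that is supposed to be isolated from the control system, it probes well-known ICS service ports on known controller addresses. The target addresses are placeholders; the ports are the standard defaults for Modbus/TCP, Siemens S7 communications, and EtherNet/IP.

```python
# Verify that common ICS service ports are NOT reachable from a segment that
# is supposed to be isolated from the control network. Addresses below are
# documentation placeholders; substitute your own controller inventory.
import socket

ICS_PORTS = {502: "Modbus/TCP", 102: "Siemens S7", 44818: "EtherNet/IP"}

def reachable(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for host in ("192.0.2.10", "192.0.2.11"):
        for port, name in ICS_PORTS.items():
            if reachable(host, port):
                print(f"WARNING: {name} (port {port}) on {host} is reachable from this segment")
```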

The Downside


Sorry Dale. Your team has significantly lowered the knowledge threshold required to design and implement an attack on any control system using these devices. You have increased the number of potential attackers with the necessary skills to effect successful attacks using the tools that your team made possible. You are going to continue to catch some heat for that and it is certainly deserved. But you all knew that going in.

The Exception


Dale did slip a ringer in on us. Project Basecamp was advertised as a look at the vulnerabilities in PLCs. Including the Koyo ECOM100 was a bit of a surprise since it is not a PLC by any stretch of the imagination. I am surprised that no one has called Dale out on including this Ethernet connection device in the Project Basecamp investigation.

If they hadn’t found so many critical vulnerabilities in the ECOM100 I would have been one of the first to cry ‘Foul’. Realistically though, the communications between the PLCs and the control system are an important part of the operation of the PLCs. The widespread implementation of Ethernet connections has made the modern use of the PLC possible; the older method of hardwiring each PLC was just too time-consuming and the source of too much system downtime.

I only wish that Dale’s team had included a wireless server instead of an Ethernet device. These are becoming more widespread. In my opinion vulnerabilities in these servers potentially pose a much higher threat to the next generation of control systems as they may provide another undocumented link to the outside world.

The Way Forward


Cyber attackers will always respond quicker than system owners. But maybe we as a society need to have a public, very visible, successful attack on a modern control system. We need to understand that every tool has inherent risks associated with the tool. We require manufacturing facilities to have guards and safety devices in place to protect the workers from the inherent dangers associated with modern manufacturing equipment. Those guards and devices are now an integral part of the machine design, installation and maintenance process at modern manufacturing facilities. We really need to get to that same point with cyber-security tools.

So, maybe the Project Basecamp disclosures will become the ICS version of ‘Unsafe at Any Speed’ or ‘Silent Spring’ or even ‘The Jungle’; making the inherent vulnerabilities in modern industrial control systems more widely known. Industry never did appreciate Nader, Carson or Sinclair, but society owes them all a large vote of thanks.

Thanks Dale.

Sunday, January 22, 2012

ICS-CERT Publishes Five S4 Based Alerts Plus Two Other Alerts

On Friday the DHS ICS-CERT published 7 separate alerts, five of which referenced vulnerabilities that were publicly discussed at Digital Bond’s SCADA Security Scientific Symposium (S4) in Miami, FL. These alerts, combined with a similar alert published on Thursday, may mark just the tip of the iceberg, as Dale Peterson noted on the DigitalBond.com blog that 30 students at an HMI hacking class before the actual symposium “were quickly finding 0days using ActiveX and File Format Fuzzing”.

Oh yes, the two other alerts. They were based upon uncoordinated disclosures by the Digital Security Research Group (DSecRG) for systems produced by WellinTech and WAGO.

S4 Alerts


The five S4 alerts issued Friday included a general alert for disclosures made during the Project Basecamp portion of S4. The alert notes that the reported vulnerabilities in multiple vendor products included “buffer overflows, backdoors, weak authentication and encryption, and other vulnerabilities that could allow an attacker to take control of the device and interfere or halt the process it controls” (page 1). The four other S4 related alerts dealt with specific vulnerabilities in systems from four separate vendors; those vendors were:



Koyo (Note: not a PLC vendor, but an Ethernet vendor that provides communications between PLCs and the actual control system)


Project Basecamp was a detailed search for and reporting of vulnerabilities in various PLCs used by industrial control systems. Dale has become increasingly vocal over the last six months or so about his dissatisfaction with the cybersecurity community’s disregard of the consequences of the insecure design of programmable logic controllers (PLCs). In both his blog and in any other venue that would listen (or even pretend to listen) he has made it clear that everyone in the control system vendor and researcher community has known for at least 10 years that the basic PLC design has inherent cyber-security flaws that make these devices vulnerable to attack. These vulnerabilities were made painfully clear in the design of the Stuxnet worm.

Because the Stuxnet worm exploited vulnerabilities in the Siemens PLC, many of the Siemens security flaws have been publicly documented, while the rest of the industry breathed a sigh of relief that their systems weren’t being used by Iran’s nuclear program. The whole point of Project Basecamp was to formally tell the world that Siemens was not alone in its ‘insecure by design’ problems.

That the world, at least the security professional side, has taken notice cannot be doubted. There have been significant discussions in a number of forums (on LinkedIn.com and on the SCADASec list for instance) and in the cyber related press. Unfortunately, most of that discussion has been about the public disclosure of the vulnerabilities (along with some Metasploit® modules published to aid in the exploit of those vulnerabilities) rather than about the potential effects of the vulnerabilities on real world control systems. Hopefully, the fait accompli provided by Dale and the Basecamp team will eventually allow for a more detailed discussion of the vulnerabilities and how to protect control systems from attack using those vulnerabilities.

ICS-CERT does make a valuable contribution (with a forgivable sideways slap at Project Basecamp) to that inevitable discussion in the general Basecamp alert. They note (page 2):

“This public release increases the potential for cyber attack on these devices, particularly if the devices are connected to the Internet. ICS-CERT reminds users that the use of readily available and generally free search tools (such as SHODAN and ERIPP) significantly reduces time and resources required to identify Internet facing control systems. In turn, hackers can use these tools combined with the exploit modules to identify and attack vulnerable control systems. Conversely, owners and operators can also use these same tools [emphasis added] to audit their assets for unsecured Internet facing devices.”
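
A minimal sketch of that 'audit your own assets' suggestion, using the publicly available Shodan Python library, might look like the following. The API key and network range are placeholders, and the query should only be run against address space you own or are authorized to audit.

```python
# Sketch: ask Shodan what it already sees in your own address space.
# API key and network range are placeholders.
import shodan

API_KEY = "YOUR_SHODAN_API_KEY"
OWN_NETWORK = "198.51.100.0/24"   # documentation range; substitute your own

api = shodan.Shodan(API_KEY)
try:
    results = api.search(f"net:{OWN_NETWORK}")
    print(f"Internet-facing hosts Shodan has indexed in {OWN_NETWORK}: {results['total']}")
    for match in results["matches"]:
        banner = match["data"].splitlines()[0] if match["data"] else ""
        print(f"{match['ip_str']}:{match['port']}  {banner}")
except shodan.APIError as exc:
    print(f"Shodan query failed: {exc}")
```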

But, lest anyone forget, the Iranian PLCs that were the Stuxnet target were not connected to the Internet, nor were their control systems. Many of the vulnerabilities reported by the Project Basecamp team will allow an attacker to exploit them without having to target an internet-connected PLC; it will just require a higher skill level and more system knowledge. There are loads of attackers with the appropriate skills, and system knowledge can be easily obtained via social engineering attacks. Internet-isolated control systems (if there really are such things in existence) are not safe from attacks based upon these vulnerabilities.

WellinTech Alert


The WellinTech alert provides initial information on a reported password encryption vulnerability in the KingSCADA product that could allow an attacker to read and use a user password, thus gaining user level access to a control system. Exploiting this vulnerability requires access to the SCADA server.

WAGO Alert


The WAGO alert concerns multiple vulnerabilities in the I/O System 750. The vulnerabilities include:




Interestingly a DSecRG press release notes that the WAGO disclosure of the 750 series controller vulnerabilities was made in support of Project Basecamp. Additionally the DSecRG web site notes two other control system vulnerabilities released by DSecRG on the same day. One deals with a default password vulnerability on Tecomat PLCs (more Project Basecamp fallout?) and an ActiveX vulnerability on an OPC system. I expect that we’ll see ICS-Alerts on these on Monday.

Friday, January 20, 2012

Latest Edition of CRS Report on Chemical Security

Yesterday Steven Aftergood, over at Secrecy News (a publication of the Federation of American Scientists), published a link to the latest Congressional Research Service (CRS) report on Chemical Facility Security. This is a recurring report on the CFATS program, providing Congress with a summary of the issues facing the program and the options Congress has for dealing with those issues.

I’ve written about earlier versions of this report and, as is usual for the CRS, this latest version provides a good summary of the CFATS program and the political issues currently facing it. Of special interest are the funding summary chart provided on page 4 (page 8 according to Adobe) and the chart describing the current number of facilities regulated under CFATS, by tier, on page 5 (9 Adobe). The CRS researchers provide information in these charts that is not generally or readily available to the public.

The report also provides the most current numbers (2011 year-end) for the inspection process at CFATS facilities. It reports (page 7 – 11 Adobe) that DHS has conducted 180 pre-authorization inspections, has approved 50 site security plans (presumably a little over half of the current Tier 1 facilities) and has yet to complete a single implementation security inspection (ensuring compliance with the site security plan). I suppose that the 180 pre-authorization inspections means that these have started at Tier 2 facilities, but it could also mean some number of repeat inspections at Tier 1 facilities.

For the first time I find that I am going to have to criticize a portion of the report: the section dealing with the current management issues. In the single paragraph describing these problems, the CRS report relies mainly on the FoxNews.com article that most of us have also had to rely upon. The only information received from DHS on this subject was a personal communication between the report’s author and the “Department of Homeland Security” on January 5th that confirmed that Under Secretary Beers had requested the report and that “DHS expects to assess the success of the action plan and revise it as necessary” (page 8 - 12 Adobe). Obviously the CRS researcher was not given access to the DHS report, a serious DHS shortcoming in my opinion.

Given that single shortcoming (and it is certainly not the fault of the CRS author, Dana A. Shea) I still recommend that anyone interested in chemical facility security or its regulation and legislation get and read this report. Kudos to FAS for making these CRS reports readily (and freely) accessible to the public that paid for them.

EPA Sends Final Rule for 2012 Methyl Bromide Exemptions to OMB

Yesterday the Office of Management and Budget (OMB) web site announced that the Environmental Protection Agency (EPA) had submitted for approval the final rule for their 2012 Critical Use Exemption From the Phaseout of Methyl Bromide.

As has been the general practice at EPA for some time now, internal delays have pushed the publication of this rule past the time when it needs to be published to allow industry to properly plan its production and importation requirements. One would assume that the EPA has once again notified the producers and importers of methyl bromide by letter of the actual amounts that will be authorized regardless of the outcome of the rulemaking process. EPA estimates that the final rule will be published in March; I predict after June.

[Insert standard complaint about DHS not including methyl bromide in the CFATS list of chemicals of interest (COI) because EPA was supposedly phasing out the use of this chemical in 2005]

More interesting is the fact that the OMB web site provides information on this rulemaking’s progress based upon the Fall 2011 Unified Agenda of Regulatory and Deregulatory Actions. Typically, OMB and the various Executive Branch departments provide notices in the Federal Register when this updated agenda is published; that hasn’t been done yet. I will be looking at the Unified Agenda items for DHS that affect chemical and cyber security in more detail in a future blog post.

Thursday, January 19, 2012

ICS-CERT Publishes Alert for Disclosure at Digital Bond’s S4 Conference

This afternoon the DHS Industrial Control System Cyber Emergency Response Team (ICS-CERT) published an alert for a vulnerability that was disclosed during today’s presentations at the SCADA Security Scientific Symposium (S4) put on by Digital Bond (full disclosure: I have provided some blog posts for Digital Bond over the last year or so). The alert is based upon information presented by Reid Wightman about the GE D20ME PLCs.

The alert mentions two vulnerabilities: data leakage and arbitrary code execution. It does not mention the password retrieval tool described in Dale Peterson’s blog post this evening about the day’s presentations at S4 or in the press release from Rapid7.

It is almost certain that more vulnerability alerts will come out of these discussions and classes in Miami this week.

No Hearings on ISCD Issues – Really?

Alexis Rudakewych, the Government Relations Manager at SOCMA, has an interesting guest-blog post over at ChemicalProcessing.com that addresses the problems with the CFATS implementation that were made public a couple of weeks back in a FoxNews.com article. In the posting she makes the very predictable (and in many ways legitimate) argument that the current issues provide further support for giving the CFATS program a long-term extension of the current authorization without substantial modification. Her arguments are well reasoned and certainly worth reading.

She makes one comment, though, that I must take exception to. She states that:

This news could easily derail the advancement of any of the three pending CFATS bills in the House and Senate, all of which have already been approved by their respective committees, and instead redirect Congress's attention to oversight hearings on the program in lieu of a multi-year authorization.

While the CFATS program is small potatoes in the great scheme of the federal government (so small that it isn’t even a line item in the budget), it is an important part of defending the United States against potentially serious terrorist attacks. It is arguably the single most important program defending against the terrorist use of WMD against the homeland.

We now have a situation that has developed over the last couple of years where the implementation of that program has virtually stalled because of apparent management issues. I say apparent because it appears that no one, including Alexis, has seen a copy of this internal DHS report. For Congress to continue funding this program without a serious and public look at these management issues (and the Department’s plan for resolving them) would be political malfeasance of the highest order.

Industry has spent a great deal of time, money and other resources preparing for the site security plan approval process, and it is almost certainly going to have to spend even more before the process is complete. I would think that industry would want more than the unsupported assurances that the problems are being fixed, coming from the same NPPD management that apparently failed miserably in its oversight of the program in the first place.

If industry really wants to see long-term authorization of this program pass, it should be demanding an immediate hearing (maybe even a joint hearing) on this issue, along with a public reporting of the internal investigation. Hearings should go beyond the routine appearance of Under Secretary Beers and Director Anderson. They should include the full management team of ISCD, union reps (as the unions were apparently blamed for being part of the problems) and at least one regional commander of the chemical facility inspectors. It might not be a bad idea to also include some of the original management of ISCD to see if the current problems actually had their roots in the initial design of the program.

CFATS is too valuable a program to let it die from lack of attention. If something isn’t done soon to correct these problems, industry is going to reduce its support for the CFATS program. Money budgeted for security spending will be cut back so that it can be applied to money-making efforts that improve the bottom line.

I have long maintained that the failure of both sides to come to a reasonable compromise on the IST issue has doomed this program to a year-by-year reauthorization standard. This problem is going to make it more difficult to get the support necessary for the long-term reauthorization process to be completed. Failing to publicly deal with the problem will make it impossible.

ICS-CERT Upgrades Schneider Alert and Issues New Luigi Advisory

Today the DHS Industrial Control System Cyber Emergency Response Team (ICS-CERT) published two new advisories covering vulnerabilities in two ICS products. The first upgrades an alert from December concerning multiple credential vulnerabilities in various Schneider systems, and the second addresses a vulnerability in Certec’s atvise SCADA/HMI product.

Schneider Vulnerabilities


This advisory upgrades the information in an alert on various Schneider ICS products that was published last month. As noted in that alert, there were three separate hard-coded credentials in various Schneider applications, involving the Telnet port, the Windriver Debug port and the FTP service. This advisory confirms the earlier report that Schneider has developed and has now made available patches to deal with the vulnerabilities in the first two services, but the FTP service remains vulnerable to attack on some portion (maybe all; it is not clear in the advisory) of the affected systems. Schneider is continuing to work on a mitigating patch for the remaining vulnerable service.

Interestingly enough, the patches now available remove the vulnerable services (more accurately, two of the vulnerable services) from the products. Those services were apparently included to allow remote maintenance and diagnostics of the products. That, apparently, was also the reason for the hard-coded credentials; they kept the owner-operator from inadvertently locking out Schneider’s access to the system. Of course, they also kept the owner-operator from deliberately locking out Schneider, and that is the security issue: a lack of access control.
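
For owner-operators who are not sure whether these maintenance services are even reachable on their own controllers, a quick check of the Telnet and FTP ports is a reasonable first step. Here is a minimal sketch; the addresses are hypothetical, and the Windriver debug service (which typically listens on UDP) would need a different kind of probe, so it is not covered here.

# Sketch: check whether the Telnet (23) and FTP (21) services are
# reachable on a list of controller addresses. Run this only against
# equipment you own, and only from the control network itself.
import socket

CONTROLLERS = ["192.168.10.21", "192.168.10.22"]  # hypothetical controller addresses
SERVICE_PORTS = {"ftp": 21, "telnet": 23}

def check_services(hosts, ports, timeout=2.0):
    # Attempt a plain TCP connection to each service on each host and report the result.
    for host in hosts:
        for name, port in ports.items():
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
                sock.settimeout(timeout)
                is_open = sock.connect_ex((host, port)) == 0
                print(f"{host} {name}/{port}: {'OPEN' if is_open else 'closed or filtered'}")

if __name__ == "__main__":
    check_services(CONTROLLERS, SERVICE_PORTS)

An open port is not proof that a device is vulnerable, but it does tell you which devices deserve the patches first.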

Once again, I want to raise the issue about access to critical systems at high-risk chemical facilities. CFATS requires that anyone with unaccompanied access to critical systems at high-risk chemical facilities must be vetted against the Terrorist Screening Database (TSDB) and have other unspecified background checks completed before they can be given access to the critical systems at the facility. Who is going to ensure that all of the techs at Schneider (and any other vendor with remote access to control systems) have been properly vetted in accordance with the CFATS regulations?

NOTE: The CVE file for these vulnerabilities is already available.

Certec Vulnerability


The second advisory is for a newly reported vulnerability in Certec’s SCADA/HMI product, atvise. The unnamed vulnerability (the advisory actually calls it a “denial of service (DoS) vulnerability”, but that describes the result of an attack, not the vulnerability itself) was reported by our old friend Luigi. Since this is an ‘advisory’ instead of an alert and it includes a mitigation, it would appear that Luigi has completed his second or maybe third coordinated disclosure. Actually, that’s not fair; Luigi’s name appears next to a number of upcoming ZDI (Zero Day Initiative) advisories.

This Voldemort vulnerability (okay, forgive the Harry Potter® reference; Lord Voldemort is most often referred to in the series as ‘he who must not be named’ because he is soooo evil) would allow a low-skill attacker to remotely execute a DoS attack. Certec has created a new version of atvise that does not have the vulnerability; it is available on their web site.

NOTE: The CVE link for this vulnerability is provided but the file is not yet active.

Wednesday, January 18, 2012

The House Calendar

With the House now officially back in session (they so notified the President and Senate yesterday), it is appropriate to look at the official calendar for the coming year. This document is the plan for when the House will meet in Washington and when individual members will be working back home in their districts on the ‘people’s business’ and maybe spending some time on getting re-elected (sarcasm alert).

As we saw last year, the House plans on ‘working in their districts’ for at least one full week out of every month this year. Additionally, we see the Easter recess (Mar. 30th thru April 13th), the Summer recess (Aug. 6th thru Sep. 7th) and the Election recess (Oct. 8th thru Nov. 12th). All in all, the House plans on meeting in Washington for only 28 weeks this year, and only two of those will be five-day weeks (Sep. 10-14 and Oct. 1-5).

Interestingly, this being an election year, the House Majority Leader (who sets and publishes this calendar) already has plans for an extensive Lame Duck session, with Washington meetings to be held thru December 14th and a week off for Thanksgiving.

As always, circumstances alter cases. It would not be unusual for the home week at the end of September or the week before Christmas, for instance, to be interrupted for action on budget bills or continuing resolutions. It would be unusual, however, for any of the Washington days to be eliminated.

Tuesday, January 17, 2012

ICS-CERT Alert on Another Luigi Vulnerability

Today the DHS Industrial Control System Cyber Emergency Response Team (ICS-CERT) published yet another alert on multiple (two, in this case) vulnerabilities reported by Luigi. This time the affected system is the Rockwell Automation FactoryTalk SCADA/HMI. Luigi reported a malformed-packet vulnerability and a read access violation vulnerability. Either would allow a remote exploit that could result in a DoS attack. As always, Luigi has provided sample exploit code on his web site.

DHS is Watching – So What?

Last week Mark Hosenball did an article over at Reuters.com about DHS “operating a ‘Social Networking/Media Capability’”. It seems that he had discovered a Privacy Compliance Review document on the DHS site describing the fact that DHS was ‘monitoring’ a large number of blogs and social networking sites. A number of activist sites have picked up the story and are chastising DHS for the invasion of their privacy, and Cryptome.org has provided a copy of the January 2011 version of that document on their web site.

Sorry folks, this is old news. I blogged about this back in the summer of 2010 when an alert reader notified me that I was on the list of sites monitored by DHS. I wasn’t upset about it then, I am not upset about it now. In fact, I am flattered and pleased. Readers of this blog know that I have been trying to influence DHS policy on a number of matters and I can’t do that if they don’t pay attention to what I write.

Privacy Issue???


The whole point of blogging and tweeting is to share information. Placing these ramblings on the internet is done with malice aforethought. I intend for people, as many as possible, to read and think about my thoughts, opinions and insights. I want to have people read, assimilate, think about and respond to my musings; every political writer (and make no bones about it, this is at heart a political blog) does.

Does it bother me that DHS has monitored my postings about how they are doing or not doing their jobs? Of course not; I want them to. Maybe they will make some minor (or better yet major) changes in their processes and procedures based upon my ideas. Great, I will have helped to make them a better agency.

How can I be concerned about privacy issues with the information posted in this blog? I have deliberately set this up as an open communications device, broadcasting to the world. There is no requirement to sign up or to receive approval to read this stuff. I want everyone with anything to do with chemical or cybersecurity to read this blog. If my ego weren’t so big that I thought my ideas could improve the world, I wouldn’t be spending the countless hours that I do on this blog.

I have one last thing to say about privacy and the internet: there is no privacy on the internet. If you post anything on the internet, anyone will be able to see it. If you don’t know that in your soul, if you don’t realize all of the potential implications of that, if you don’t accept that, please, just blow up your computer to save yourself the ultimate embarrassment. It will come back to bite you in the most uncomfortable way possible.

Grow up, people. This is not Orwell’s 1984; this is Social Media 2012. Even DHS gets that.
 