There was an interesting commentary by Jake Brodsky on ScadaMag.Infracritical.com about understanding process data. It looks at process data from an engineer’s point of view and, as usual, Jake provides some interesting insights; it is well worth reading. It also made me think back to my beginnings as a process chemist, fresh out of college.
I started back to college (that’s a whole different story for another day) while working in a specialty chemical plant as a technician. My job at that time had little to do with process chemistry; I was shipping samples of products to customers and salespeople. But I had to collect those samples from the plant itself, so I worked closely with plant operators.
When I graduated, I moved into the research lab, supporting the toll manufacturing group in the company. Initially that meant troubleshooting process upsets, particularly out-of-spec batches. Some of those problems were easy to solve: talk to the operators, see what they did ‘wrong’, and change the batch instructions to make clear how not to do that again. Toll manufacturing frequently involves short runs of unique products where operator experience with the process is lacking; this makes many operator ‘errors’ more a problem with the instructions than with operator skills.
It quickly became apparent that better instructions depended more upon a better understanding of how the chemical reaction proceeded than could be gained in small-scale lab reactions. So, I needed to collect process data to understand what was happening in the reaction vessel. Part of this was straight chemical analysis: collecting frequent samples from the vessel (every 15 minutes over a three-hour period on one of the first products where I used this technique) and using analytical chemistry to track the changes in chemical composition.
To make this analysis worthwhile for operators, I had to tie that analytical data to process variables that the operator could observe and control. Unfortunately, at the time (the mid-90s) process measurement was very primitive by today’s standards: a circular chart recorder for temperature, on-vessel pressure gauges, and digital readouts in the control room for load cells. On top of that were operators’ written notes recording the times of manually opening and closing valves and starting and stopping pumps; in some cases ‘manually’ meant pushing buttons for air-actuated valves, in others it was turning valve handles by hand. Linking all of these operations together into a reasonably accurate timeline meant running around the control room and the plant floor, watching controls, readouts, and operator actions. All that work was worthwhile though; I was able to refine the process control instructions and reduce the batch process time by 30%. My bosses were well pleased; the operators less so, since the more detailed instructions made their jobs more complicated.
As the plant moved to an electronic control system (an interesting process itself), this data collection became much easier. I could sit in the control room, watch an operator HMI, and record the data in ‘real time’, with very little lag between recording the various variables. As I continued to refine process instructions, it became apparent that something was ‘wrong’ with some of the data I was collecting; the correlation between the process data and the analytical data was not as tight as I expected. Still, process times continued to shorten and finished product variables were tightened; the businesspeople loved it.
Process upset investigations actually became more difficult. Unless I had been watching (and recording data on) the batch at the critical point (and most of these batches ran for 24 hours), I had to rely on operator notes and recollection for problems with the increasingly detailed process control scheme. A customer suggested that we get a data historian that would be able to record the data in detail on every batch we made. They were an important customer (and they paid for their raw materials, so they had a vested interest in eliminating off-spec batches), so a well-known data historian was acquired. My job became more of a computer data analysis job, and I was able to spend less time looking over the shoulders of operators.
Still, there were data anomalies that I could not explain: flow rates did not accurately track load cell changes, there were lags between valve operations and expected changes in measured process values, and a number of other problems too numerous to mention. For the most part I could ignore these problems in the early stages of process improvement, but as the process got tighter these issues became more important. It was not until the company hired a process engineer (another interesting story) that someone was able to explain these issues (many discussed in Jake’s article), and we were able to take these ‘control variables’ into account in additional process improvements.
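As an aside for readers with historian data of their own: those lags between a recorded valve operation and the response in a measured variable are the sort of thing historian data lets you quantify rather than guess at. The sketch below is mine, not from the plant or Jake’s article; it assumes uniformly sampled data and uses made-up numbers. Cross-correlating the first differences of an event series and a process variable lines up the edges and gives a rough estimate of the response delay:

```python
import numpy as np

def estimate_lag(event, response):
    """Rough estimate of the sample lag between a recorded event series
    (e.g. a valve position) and a measured process variable, found by
    cross-correlating their first differences so the edges line up."""
    de = np.diff(event)
    dr = np.diff(response)
    corr = np.correlate(dr, de, mode="full")
    # Shift the argmax so that a value of 0 means "no lag".
    return int(np.argmax(corr) - (len(de) - 1))

# Made-up example: a valve opens at sample 100 and the temperature
# begins a first-order rise about 25 samples later.
t = np.arange(600)
valve = (t >= 100).astype(float)
temp = np.where(t >= 125, 1.0 - np.exp(-(t - 125) / 40.0), 0.0)

lag = estimate_lag(valve, temp)  # roughly 25 samples
```

Real historian data is noisier than this, so in practice the series would need smoothing or windowing first; the point is only that the delay is measurable once every batch is recorded.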
In any case, I hope that this helps to explain why the operations folks (and their managers up the line) are intimately interested in process data. I was fortunate that my managers did not demand ‘real time’ access to the raw data. They were happy with the analysis that I was able to provide from that data. Unfortunately, new management came with a new owner, and it was decided that a process chemist was no longer needed when the plant had a process engineer. But that is another story.