Enterprise Manufacturing Intelligence (EMI) has emerged as a key growth application, with both new and established vendors vying to give manufacturing executives insight into their operations. Almost every one of these EMI tools provides not only the ability to view the enterprise's key performance indicators (KPIs), but also the ability to drill down to the very data that was aggregated to produce each KPI in the first place.
The power of these tools, as claimed by the suppliers, is that they allow an executive to get to the root of a problem. For instance, Plant 6 is having a productivity issue, so use the EMI dashboard to drill down and find the cause. This leads to discovering that the turret lathe on the whatchamacallit production line has been experiencing downtime due to lubrication failures. Sounds like just the tool to put an end to the problem, right? WRONG.
In this post, we'll discuss why many organizations are using information as well as technology for monitoring, controlling, and resolving issues in the wrong way. In doing so, we'll share some perspective on how to free up executive resources, so leaders can provide the value they're capable of delivering.
There Is Too Much Focus on Senior Executives Troubleshooting Plant Floor Problems
Much of the emphasis on putting tools to diagnose plant-floor issues in the hands of executives is in response to a problem of many executives' own making. In an effort to continually "lean out" the business, they have eliminated layers of supervision, so that in many cases there are but two or three levels between the plant floor and senior corporate management. So, when production problems arise, they can't turn to the managers who used to be in place (consolidating information up from each of the manufacturing operations) and ask what is wrong and why it hasn't been fixed.
Given the speed at which business operates today, however, that's not necessarily a bad thing. In the past, many operational issues were not discovered until month-end close, and the subsequent investigation and reporting might have taken additional weeks. The corrective decision might have gotten back to the plant floor a month or more afterwards, by which time the problem may no longer have even existed.
Clearly, being able to see the issue in near real time and provide corrective action within hours or days is preferable to the old way of doing things. In fact, it is the evolution of the EMI tools we have today that has enabled this transformation in business.
But Is This the Right Way to Solve the Problem?
Taking a contrarian point of view, I am going to argue that we have been going about the role changes associated with leaning out manufacturing all wrong. Remember, we pay our senior executives as much as a thousand times more than we pay the average plant floor worker. You'd think that if we are paying executives that much, we would expect them to be making decisions a thousand times more important and valuable than the decisions plant-floor staff typically make.
They should be making decisions like “where should we site the next facility?” or “should we acquire one of our competitors?” or “should we expand into a new market?” They should not be making decisions about how the lubrication on a turret lathe is failing and what should be done to fix it.
A Short Example from the Pulp and Paper Industry
Here is a true, albeit historical, anecdote from the early days of using process data historians, about the results one company achieved turning the model on its head. It has been published in the past but it was so long ago many of today’s practitioners weren’t even in industry when it first came out. This particular case study is from the pulp and paper industry, in which I was working at the time.
In fine paper, like the kind used for copying and printing, the basic ingredients are wood pulp and several additives that give the paper whiteness, opacity, and a finish that holds ink or photocopy printing. The common additives are cornstarch, which aids in finish; a white clay called kaolin, which aids in opacity; dyes, which adjust color; and titanium dioxide (TiO2), which aids in whiteness, brightness, and opacity. Fine paper has a number of specifications, such as whiteness, opacity, and brightness, as well as ash content, which is important for disposal and recycling purposes.
In this example, whenever quality problems such as off-whiteness or off-brightness came up, the typical reaction of an operator was to adjust the TiO2, considered the "miracle drug" of papermaking since it could solve virtually any problem. Of all the ingredients listed above (wood pulp, cornstarch, clay, dye, and TiO2), care to guess which is the most expensive? Clearly titanium dioxide. This led to a situation where, at the end of every month, management would look at cost figures and complain about the high cost of titanium in the process. The word would trickle back down to operators to stop using so much titanium, but as soon as a process problem came up they would always resort to the wonder drug in order to meet production demands.
At this point a new project was introduced into the plant: the implementation of the first 32-bit CPU-based data historian on the market, with a user-configurable report builder and graphics front-end. With the project in place, one of the first applications the project team tackled was the papermaking cost reduction issue. The control system used by the operators displayed all of the set-points used to control the papermaking process in the usual units, such as pounds per hour or gallons per minute. Whenever a quality problem surfaced, the operators knew that by increasing the TiO2 slurry flow just a few liters per minute, the problem could be solved. The other ingredients could also be adjusted, but the magnitude of the changes in flow or addition rates was always larger, since the recipe was heavy on other materials, with TiO2 being one of the smallest quantities in the recipe.
Of course, this is what was leading to the very problems that were driving management to continually complain about cost. The project team took an unusual approach. Since the data historian sat between the control system and the business computer (an IBM 360), it was able to combine information from both systems. The project team extracted the costs of the various papermaking ingredients from the purchasing system and the flow rates from the control system, and created a hybrid display that showed usage not in classic process units, but in dollars per ton of production. Thus, the operators got immediate feedback on the economic implications of the adjustments they were making to the process. Since the plant, while unionized, had managed to implement a profit-sharing plan, the operators were well incentivized to help control costs.
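The arithmetic behind that hybrid display is simple enough to sketch. The following is a minimal, modern illustration of the idea, not the original system: all ingredient names, flow rates, and unit costs here are made-up numbers chosen only to show how a small TiO2 flow can still dominate the cost-per-ton view.

```python
def cost_per_ton(flows_lb_per_hr, costs_usd_per_lb, production_tons_per_hr):
    """Convert each ingredient's flow rate (process units) into
    dollars per ton of paper produced (business units)."""
    return {
        ingredient: flows_lb_per_hr[ingredient] * costs_usd_per_lb[ingredient]
                    / production_tons_per_hr
        for ingredient in flows_lb_per_hr
    }

# Illustrative figures only. Note TiO2 has the smallest flow
# but by far the highest unit cost.
flows = {"pulp": 40000, "starch": 1200, "kaolin": 3000, "tio2": 250}
costs = {"pulp": 0.03, "starch": 0.15, "kaolin": 0.05, "tio2": 1.20}

for ingredient, usd in cost_per_ton(flows, costs, 20).items():
    print(f"{ingredient}: ${usd:.2f}/ton")
```

In raw process units, a 250 lb/hr TiO2 flow looks negligible next to 40,000 lb/hr of pulp; priced out per ton, it is suddenly one of the largest line items on the display, which is exactly the feedback the operators had been missing.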
The result of giving operators business information was a savings of well over $400,000 in today's dollars. By giving the plant floor staff insight into the business drivers, a group of $30-an-hour individuals generated significant savings. The argument can be made that we need to do more of this: letting plant floor people understand the financial implications of their decisions, while incentivizing them to make the right ones, so they can prevent problems from ever occurring. Is it not far better to prevent a problem like excessive TiO2 costs by giving operators more information than to have a $2,000,000-a-year executive use a tool merely to identify the problem after the fact?
I make the case that while EMI and the IoT are providing valuable insight into manufacturing operations, there is as much or more to be gained by giving manufacturing staff financial insight into the business, at least as it relates to their specific jobs.