It’s no surprise that data-driven businesses make better decisions. So, with the amount of data surging through the manufacturing environment, professionals from the shop floor to the top floor should have everything needed to optimize production, right? Unfortunately, because of the fractured IT landscape found in many enterprises, more often than not this isn’t the case.
In today’s competitive global economy, the opportunity cost of letting manufacturing process and event data remain underutilized is significant (and growing). As we continue to speak with manufacturing industry leaders and research the space, we’re seeing more and more manufacturing data initiatives find their way onto the docket of executive priorities.
To gain more insight into this topic, in February 2014 we spoke with Tom Braydich, the former Director of Electrical and Maintenance Engineering at Campbell Soup Company. With 37 years of experience in the Food and Beverage industry, he has played a vital role in helping manufacturing operations leverage technology to improve the business by getting more out of their data.
Braydich will also be speaking at LNS Research’s Global Executive Council Meeting for Manufacturers on March 19. If you’re interested in hearing more about his experiences with transforming Campbell Soup Company’s disparate IT environment into a consistent source of actionable intelligence, you can find more information here. Below are some key takeaways from our interview.
LNS: Could you tell us a little bit about your team’s role in manufacturing performance projects while you were at Campbell Soup Company?
TB: For manufacturing performance projects, my team was responsible for several key areas. First, we identified technical solutions. Second, we developed standards to accompany those solutions. And third, we initiated and disseminated those new technologies and standards that enhanced the performance and security of the electrical and control systems throughout Campbell’s operations worldwide.
LNS: When we first spoke, you mentioned working on a Process Data Initiative at Campbell Soup Company. Could you tell us about that?
TB: Today, more than ever, people are starved for accessible data. This data might include process data, status/alarm data, or trend data. It could come in the form of real-time data or historical data. It could be delivered electronically or via hard copies. The challenge is getting that information to the requestor in a timely manner and in a format that meets his or her individual needs. That was our goal.
LNS: What were the main objectives you had to determine before proceeding with this initiative? And what were your biggest roadblocks to executing on those objectives?
TB: Before doing anything, we had to define the scope. We had to agree on the design, rollout process, and economic justification. We also had to determine whether we would execute a centralized implementation, whether the design would be top-down or bottom-up, who would own the design, who would own the data, where the data would reside (locally, in the cloud, or both), and who would be the owner or business leader for the initiative. The list is extensive. The biggest hurdle we had was agreeing on a RACI (Responsible, Accountable, Consulted, and Informed) chart, which defines everyone’s role in the project.
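To make the RACI concept concrete for readers who haven’t used one: it’s essentially a matrix mapping project activities to roles, with each role assigned exactly one letter per activity. The sketch below is a minimal, purely hypothetical example in Python; the activities and roles are illustrative, not drawn from the Campbell project.

```python
# Hypothetical RACI chart modeled as a dictionary. Keys are activities;
# values map each role to R (Responsible), A (Accountable),
# C (Consulted), or I (Informed).
raci_chart = {
    "Define data acquisition standards": {
        "Corporate Engineering": "A",
        "Plant Project Leader": "R",
        "IT": "C",
        "Operations": "I",
    },
    "Deploy plant data historian": {
        "Corporate Engineering": "C",
        "Plant Project Leader": "A",
        "IT": "R",
        "Operations": "I",
    },
}

def accountable_for(activity: str) -> list[str]:
    """Return the role(s) marked Accountable for an activity."""
    return [role for role, code in raci_chart[activity].items() if code == "A"]

print(accountable_for("Deploy plant data historian"))  # ['Plant Project Leader']
```

A common convention, and a useful sanity check when teams negotiate the chart, is that every activity has exactly one Accountable role.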
LNS: Did the execution of this initiative require any shifts in organizational leadership or culture?
TB: When you’re dealing with this type of initiative, one thing that often has to be overcome is the battle of corporate versus the plants. Too many times in the past, initiatives were forced down from “above,” and the plants had no say in the project at all. For this project, we included a plant project delivery leader. He brought some local skin in the game and ensured that the local team was engaged from the start through completion. Although this surfaced a few tough conversations, it helped eliminate hidden agendas among departments.
LNS: Which organizations at Campbell Soup Company drove the initiative?
TB: It was a combination of Supply Chain and Operations, the Maintenance and Reliability group, and Quality.
LNS: Could you briefly describe the processes and technologies that were adopted?
TB: In the scope of this project, we deployed electronic data acquisition systems (downtime system, performance improvement, asset monitoring and line setup), electronic Statistical Process Control (SPC), electronic batch records, and a process data historian.
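For readers less familiar with electronic SPC, the core calculation behind a basic individuals control chart is simple. The Python sketch below uses the standard moving-range method to compute a center line and 3-sigma control limits; the measurements are hypothetical fill weights, not Campbell data, and a production SPC package would of course do far more.

```python
# Minimal individuals (I-chart) SPC sketch using the moving-range method.
# Measurements are hypothetical fill weights in grams.
measurements = [305.2, 304.8, 305.5, 304.9, 305.1, 305.7, 304.6, 305.3]

center = sum(measurements) / len(measurements)

# Average moving range between consecutive points.
moving_ranges = [abs(b - a) for a, b in zip(measurements, measurements[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

# 2.66 = 3 / d2, where d2 = 1.128 for subgroups of size 2.
ucl = center + 2.66 * mr_bar
lcl = center - 2.66 * mr_bar

print(f"Center line: {center:.2f}")
print(f"UCL: {ucl:.2f}, LCL: {lcl:.2f}")

# Flag any out-of-control points.
for i, x in enumerate(measurements):
    if x > ucl or x < lcl:
        print(f"Point {i} ({x}) is outside the control limits")
```

The value of doing this electronically, rather than on paper charts, is that limits recalculate automatically and operators see an out-of-control signal the moment it occurs.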
LNS: What improvement metric targets and achievements can you share about your journey?
TB: The project had a number of successes. First, the downtime systems were justified on an anticipated capacity increase of 1 can per minute (a leap of faith at the time). We also moved from calendar-based to actual runtime-based scheduling for preventive maintenance. In addition, we achieved huge improvements in OEE (overall equipment effectiveness), with the ability to perform accurate line-to-line comparisons.
With data acquisition in place for asset monitoring and line setup, we were able to create more effective line ownership between maintenance and operations. And the electronic SPC system is helping operations to better understand where they are with their process.
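As a reference point for the OEE gains Braydich describes, OEE is conventionally calculated as Availability × Performance × Quality. The sketch below shows that standard calculation with hypothetical line figures, purely for illustration.

```python
# Conventional OEE calculation; all figures are hypothetical.
planned_production_time = 480.0   # minutes in the shift
downtime = 45.0                   # minutes the line was stopped
ideal_cycle_time = 0.5            # minutes per unit at rated speed
total_count = 800                 # units produced
good_count = 780                  # units passing quality checks

run_time = planned_production_time - downtime

availability = run_time / planned_production_time
performance = (ideal_cycle_time * total_count) / run_time
quality = good_count / total_count

oee = availability * performance * quality
print(f"Availability: {availability:.1%}")   # 90.6%
print(f"Performance:  {performance:.1%}")    # 92.0%
print(f"Quality:      {quality:.1%}")        # 97.5%
print(f"OEE:          {oee:.1%}")            # 81.3%
```

Because all three factors come straight from acquired process data, computing OEE the same way on every line is what makes the accurate line-to-line comparisons Braydich mentions possible.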
LNS: What are some of the main lessons learned that you can share?
TB: It truly has to be a team effort. In addition to the corporate players, all initiatives need a local project delivery leader. You need that type of check and balance in the program, as well as the perspective of different groups. It’s also important to have a resident expert to champion the project or different phases of the project. And finally, as you transition to the new system, it’s critical that you also focus on capturing tribal knowledge that exists in the manufacturing plants and corporate offices.
Join LNS Research’s Mark Davidson and Tom Braydich on March 19, as the two discuss this project in more detail, as well as best practices for executing your own Process Data Initiative. For more information, click the button below.