Within manufacturing and industrial settings, we’re seeing a shift in the dynamics of many business models toward delivering more than just finished products. Companies in equipment-focused industries such as jet engines and heavy machinery are beginning to offer services after the point of sale.
This trend seems to be emerging concurrently with the increasing use of software and diagnostics built into pieces of equipment. As a consequence of growing product complexity, many organizations are capitalizing on the need for expertise in working with the product.
With an increase in critical failure mode data coming from field technicians, how does this affect the quality of products? In this blog, we’ll discuss what technologies companies are using to sift through the data and how it’s being used to improve business performance.
Capturing Failure Mode Information
Previously, we’ve written on the importance of post-production data in identifying and resolving failure modes. By automating the flow of feedback from customer complaint management to product design, companies can create an environment for closed-loop quality management. A similar model can be adopted in the areas of service or maintenance of the product.
Trained field technicians travel on-site to investigate issues pertaining to product failures. After assessing the situation by speaking with operators and running tests, the solution can come by means of a system reset, a software upgrade, replacement of a part or the entire product, and so on. Relative to other forms of field data, the value of taking notes and talking directly with users cannot be overstated.
Finding Answers in the Sea of Data
As an engineer, you would ideally want field data codified to correlate with specific failure modes. This is unfortunately very difficult, mainly for two reasons. First, there is a vast pool of potential errors that could cause a failure mode. And second, large organizations can have hundreds or even thousands of field technicians providing data.
Massive amounts of data can quickly become overwhelming and even cause bottlenecks for companies attempting to identify common issues. This seems to be an area for improvement even for mature companies. We recently spoke with a senior person at a large equipment manufacturing firm supplying the banking industry, which once faced a similar issue. He explained how his company was leveraging Six Sigma methodologies and text mining technology to overcome it.
Through statistical analysis, a text mining application identifies trends in large volumes of information. It then categorizes the results and, in some cases, can give them context. With the use of business intelligence tools, organizations can subsequently interpret and analyze the data.
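To make this concrete, even a simple keyword-based pass can surface trends in free-text technician notes before heavier statistical tooling is applied. The sketch below is a minimal illustration, not any vendor's actual method; the category names and keyword lists are made-up assumptions standing in for a real failure-mode taxonomy. It tags each note with its closest category and counts occurrences across the whole set:

```python
from collections import Counter
import re

# Hypothetical failure-mode lexicon -- illustrative only, not a real taxonomy.
FAILURE_KEYWORDS = {
    "software": {"reset", "firmware", "software", "update", "crash"},
    "mechanical": {"bearing", "vibration", "wear", "seal", "leak"},
    "electrical": {"fuse", "short", "voltage", "wiring", "sensor"},
}

def categorize_note(note: str) -> str:
    """Tag a free-text technician note with its best-matching failure category."""
    tokens = set(re.findall(r"[a-z]+", note.lower()))
    scores = {cat: len(tokens & kws) for cat, kws in FAILURE_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "uncategorized"

def trend_report(notes: list[str]) -> Counter:
    """Count how often each failure category appears across many notes."""
    return Counter(categorize_note(n) for n in notes)

notes = [
    "Unit required firmware update after repeated crash",
    "Found worn bearing causing vibration",
    "Blown fuse traced to faulty wiring",
    "Performed software reset, issue cleared",
]
print(trend_report(notes))
# → Counter({'software': 2, 'mechanical': 1, 'electrical': 1})
```

A production text mining application would replace the hand-built keyword sets with statistical models trained on large volumes of notes, but the pipeline shape — tokenize, score against categories, aggregate into trends — is the same one a BI tool would then visualize.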
Closing the Loop on Quality Management
Text mining software is used for many purposes, from finding themes on the internet for marketing campaigns to identifying terrorist activities, but in this case it was used to decipher the notes of roughly 7,000 field technicians. By using text mining functionality, this company was able to take a set of raw, difficult-to-use data and quickly provide valuable information to engineering to resolve failure modes.
Although we want to identify and resolve issues prior to production, this doesn’t always happen, and we often need to rely on post-production information. Market-leading companies are continuously finding ways to strengthen the integrity of their data, facilitating cross-functional collaboration and communication.
Enterprise Quality Management Software (EQMS) is a great tool for this and can create a platform for improvement across the value chain, but it relies heavily on data. When a roadblock pops up, such as the one described above, the companies that stand out are the ones that find innovative ways to feed good information into the system.