KBC, a Yokogawa company, recently released the latest version of its flagship process modeling software, Petro-SIM 7, featuring tight integration with OSIsoft’s PI historian and Asset Framework (AF). With the launch of Petro-SIM 7, KBC introduced a solution spanning upstream (wellhead to production facilities), refining, and petrochemicals, and in so doing claimed Digital Twin status.
KBC and Digital Twin
KBC is the first company to tightly integrate its simulation software with a historian, using what it calls the Petro-SIM Integrator. According to KBC, the Integrator enables automatic creation of PI AF templates within the PI System from Petro-SIM and automated update of the PI AF template if the Petro-SIM model changes. AF is a component of the PI System that structures and contextualizes operational data for real-time decision-making. The Petro-SIM Integrator enables the automated population of the Petro-SIM model with current PI System data and automatic population of the PI System with Petro-SIM outputs. Any change to a PI Tag — a data point or stream tracked within the PI System — triggers an automatic notification to Petro-SIM. All outputs of Petro-SIM's models are routinely written back into the PI System in real time. PI’s considerable calculation capabilities are then leveraged for analysis, KPIs, and reporting by bringing Petro-SIM insights to PI Vision dashboards.
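The loop described above — a tag change in the historian notifying the simulator, a model run, and outputs written back to the historian — can be sketched in outline. The sketch below is a minimal, hypothetical illustration of that notify/recompute/write-back pattern only; all class and method names are invented for illustration and are not the actual Petro-SIM or PI System APIs.

```python
# Hypothetical sketch of a historian <-> simulator sync loop.
# None of these names correspond to real KBC or OSIsoft APIs.

class Historian:
    """Stands in for the PI System: a tag store plus change notifications."""
    def __init__(self):
        self.tags = {}
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def write(self, tag, value):
        self.tags[tag] = value
        for callback in self.subscribers:
            callback(tag, value)  # notify, as a PI Tag change would

class SimulatorModel:
    """Stands in for a process model: reads inputs, writes outputs back."""
    def __init__(self, historian):
        self.historian = historian
        historian.subscribe(self.on_tag_change)

    def on_tag_change(self, tag, value):
        # Only input tags trigger a recalculation; output tags are
        # ignored so a write-back cannot cause an infinite loop.
        if tag.startswith("input."):
            self.run()

    def run(self):
        # Placeholder "simulation": total feed is the sum of input tags.
        total = sum(v for t, v in self.historian.tags.items()
                    if t.startswith("input."))
        # Write the result directly (no notification) to the tag store.
        self.historian.tags["output.total_feed"] = total

historian = Historian()
model = SimulatorModel(historian)
historian.write("input.feed_a", 100.0)
historian.write("input.feed_b", 50.0)
print(historian.tags["output.total_feed"])  # 150.0
```

In a production integration the notification, buffering, and write-back would of course be handled by the historian's own client libraries rather than in-process callbacks; the sketch only shows the control flow KBC describes.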
Process data is automatically validated, mass-balanced, and reconciled through Petro-SIM, supporting comparison of measured versus simulation model versus linear programming (LP) model outputs to help track when models and actual plant performance diverge. Tracked parameters include, but are not limited to, temperature, pressure, flow, density, viscosity, stream characterization of feeds and products, catalyst activity and run length projections, catalyst circulation rate, and heat exchanger / fired heater fouling. Once in PI, data can be integrated with any number of external systems, including KBC’s cloud-based software-as-a-service Operating Goals Manager (OGM) application as well as third-party analysis tools, such as Tableau, Qlik, and Spotfire. It should be noted that KBC has an extensive OSIsoft practice dating from Yokogawa's acquisition of Industrial Evolution in 2016.
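Mass-balance data reconciliation of the kind mentioned above has a standard textbook form: adjust the raw measurements as little as possible, weighted by how much each instrument is trusted, so that the conservation constraints hold exactly. The example below is a generic illustration of that classical least-squares formulation, not KBC's implementation; the flows and variances are made up.

```python
import numpy as np

# Measured flows (t/h) around a single splitter: feed, product 1, product 2.
# As measured they violate the balance feed = p1 + p2 by 5 t/h.
# Values are illustrative only.
m = np.array([100.0, 60.0, 45.0])

# Measurement variances (a larger variance = a less trusted instrument,
# so that measurement absorbs more of the adjustment).
V = np.diag([1.0, 4.0, 4.0])

# Linear mass-balance constraint A @ x = 0, i.e. feed - p1 - p2 = 0.
A = np.array([[1.0, -1.0, -1.0]])

# Classical weighted least-squares reconciliation:
#   minimize (x - m)' V^-1 (x - m)  subject to  A @ x = 0
# which has the closed-form solution:
x = m - V @ A.T @ np.linalg.solve(A @ V @ A.T, A @ m)

print(x)      # reconciled flows
print(A @ x)  # balance residual, ~0
```

The reconciled values close the balance exactly, with the adjustment distributed in proportion to each measurement's variance; real reconciliation engines extend this idea to nonlinear balances and gross-error detection.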
In taking this architectural approach, KBC has chosen to leverage the process industry market's leading historian, rather than assembling an architecture in which data sources are either integrated on demand by the analytics tools (e.g., TrendMiner) or aggregated into an intermediate data store, such as Hadoop, within a cloud-based platform-as-a-service so that applications can then consume the data (e.g., Honeywell Connected Plant).
Upsides and Downsides of the KBC Approach
There are advantages and limitations to KBC's approach. The benefits are rich process analysis functionality, rapid time to value, and almost sure-fire acceptance among OSIsoft users; the obvious limitation is being tied to one historian. Petro-SIM can write its results back to most other historians, e.g., Honeywell's Uniformance PHD; however, this ability currently lacks the automated creation and update of contextual information available with PI AF.
A third-party tool such as Element Analytics’ Asset Hub is an alternative, but perhaps not with as rich and automated functionality. An additional stumbling block is that while historians are often the primary data source for a twin, twins will also need access to other sources and forms of data, such as real-time vibration waveform data and unstructured maintenance records. While OSIsoft points out that the PI System can store real-time waveform vibration data as a binary large object (BLOB), the system doesn't have the tools for directly analyzing waveform data. No doubt some users do this, and there are third-party systems that facilitate this in a historian, but it reinforces the architecture question. For those companies taking a more comprehensive approach to twins, the historian-centric approach may not be best.
From an analysis viewpoint, the ability to quickly get the process model aligned with the historian and tap its full visualization capabilities is advantageous. “What if?” and “What’s best?” scenarios can be run automatically to determine available strategies that maximize profitability. Moreover, planning and scheduling linear sub-models can be updated to reflect the proposed changes. KBC has chosen to focus on two key asset and value chain intersections: first, in the downstream, by tying Petro-SIM to planning and scheduling; and second, in the upstream, by linking well and production networks to production facilities. Both are of real value to end users.
However, when it comes to predictive and prescriptive analytics, KBC isn’t quite there yet, though both are on the near-term roadmap, leveraging the artificial intelligence (AI) technology of its parent, Yokogawa. Using machine learning (ML) to automate the model-versus-actual-versus-plan comparison and back-casting, and to learn from it (i.e., a self-tuning model), would go a long way toward building confidence in the Digital Twin as a useful tool. This could relieve much work from the shoulders of process engineers, planners, and schedulers. Why not let ML do the repetitive heavy lifting?
What’s the next stop for KBC? In addition to adding the advanced analytics capabilities to achieve full Digital Twin status, we would like to see how asset twins from its partner, Baker Hughes, a GE company, are incorporated with the process twin, resolving any potential overlaps and integrating their predictive and prescriptive capabilities. This may require some re-thinking of the solution architecture. Beyond that, we look forward to seeing how KBC continues to integrate the value chain with operations.
Manufacturers who are not married to the status quo and haven’t looked at what KBC can do should take a look. For vendor competitors, it’s your turn to step up and explain how you are approaching the same challenges KBC is tackling. The race for process digital twins is on.