AspenTech recently announced the general availability of the aspenONE® V12 software release, which embeds Artificial Intelligence (AI) across the portfolio and uses the Cloud to deliver enterprise-wide analytics and insights. The V12 release continues AspenTech's strategy of embedding AI and Machine Learning (ML) inside its products so that end users can take advantage of these powerful tools without having to rely on data scientists or layer on external third-party analytics solutions.
Inside aspenONE® V12
aspenONE® V12 consists of a series of product releases:
- Aspen Hybrid Models™ – bring together AspenTech's process models (e.g., HYSYS) and Machine Learning.
- Aspen Maestro™ – a new capability for Aspen DMC3™ and Aspen Mtell® that accelerates the development of better models by guiding less experienced users through building a model or agent.
- Aspen Deep-Learning IQ™ – enables the building of more accurate models and predictions.
- Aspen Verify™ for planning – uses AI to capture planner knowledge and verify plans.
- Aspen Multi-Case™ – runs thousands of simulation cases concurrently, on-premises or in the Cloud, providing a more thorough analysis.
- Aspen Event Analytics™ – provides rapid insight into production events.
- Aspen MES Collaborative™ – aggregates data to an enterprise-level historian.
- Aspen Cloud Connect™ – provides flexible connectivity to transfer data from Edge to Cloud.
- Aspen Capital Cost Estimator (ACCE) Insights™ – streamlines the user experience by integrating ACCE with Aspen Enterprise Insights to provide visualization and workflow within a hybrid Cloud environment.
So with these releases, we see continued enhancement of existing products and further integration of selected components of the overall AspenTech portfolio.
LNS Research's View
With aspenONE® V12, AspenTech is well on the way to applying AI and ML to critical processes without additional data science expertise. Moreover, it offers better support for new users without deep process knowledge or experience. With these additions, AspenTech continues to work on both sides of the performance equation: first, by improving reliability through Mtell and Fidelis to unlock capacity and increase utilization; second, by maximizing performance through planning and scheduling, hybrid process models, and advanced control and optimization, i.e., GDOT - or what LNS Research informally calls the "gas pedal side" of operations.
With Aspen Unified™, AspenTech integrated planning and scheduling. Now, with this series of additional horsepower improvements, it has further enhanced planning, process modeling, and predictive maintenance. Still, there is quite a way to go to achieve the Self-Optimizing Plant. For example, how will asset and process models integrate? And wouldn't it be part of self-optimization if planning and scheduling were integrated with Energy Trading and Risk Management (ETRM) so that feedstock selection, e.g., crudes, would be automated and optimized for market conditions? Last but not least, how will all of these processes and workflows be orchestrated, and in what data architecture? Of course, AspenTech gets this in spades and is working on all of these fronts.
Figure 1 - Performance Maximization - The "Gas Pedal"
Looking Ahead
AspenTech says that the Self-Optimizing Plant represents the future of operational excellence. LNS agrees, though there is much more to Operational Excellence 4.0 (OE 4.0) than the gas pedal side of the equation. And competitors are not resting on their laurels either. Still, the message is quite clear: existing AspenTech users should take advantage of these enhancements because even small improvements in efficiency, accuracy, and productivity can bring significant benefits. Frankly, the industry is going to have to work a lot smarter with less effort if we are to achieve OE 4.0 and eventually autonomous operations. For end users, there are only a handful of vendors capable of delivering on the promise of self-optimization. AspenTech is front and center on that list, and should be on yours.