Andy provided a great overview of why industrial companies need a 360° view of their assets, why an asset-oriented Data Hub is needed to create that view, and what requirements a Data Hub must fulfill to meet the needs of asset-intensive industries. If you haven’t read it and are struggling to enable your engineers and operators to become citizen data scientists, I highly recommend giving it a read. You’ll understand why prior analytics strategies just aren’t working.
With that background, Element’s Unify™ is the solution that fills this gap. Here’s a quick primer on Element Unify:
Element Unify allows you to build an enterprise data model of your assets, using your existing systems to create a comprehensive view of your instrumentation, equipment and processes. It does this by connecting to real-time, design, and other operationally relevant systems. It then constructs a comprehensive data representation of your assets and operations that we call an Enterprise Asset Model.
Definition: An Enterprise Asset Model is a digital representation of real-world, physical objects and how they are interconnected.
Rather than focusing on specific analytical or simulation use cases, as Digital Twins tend to, the purpose of the Enterprise Asset Model is to aggregate operational data into a common enterprise model of your assets to solve operational challenges. With all the information in one place in its raw format, it becomes easy to create new views of the data (typically tables, hierarchies, or process diagrams) to enable a given use case or group of stakeholders. This virtually eliminates the need for the bespoke data wrangling or ETL projects that are typically a prerequisite to any analytics project.
The best way to describe this is through a real world example, creating an Enterprise Asset Model for a fleet of pumps…
Pumps have a lot of relevant data tied to their operations: historian data, operator notes and rounds, design documents, operating specs, lab reports, HazOps, and more. To enable insights across these data sets, you typically have to build a custom integration to the systems relevant to your use case, then shape that data set for your analysis (an effort referred to as extract, transform, load, or ETL). The challenge with this approach is that the effort is specific to that single analytical use case. Additionally, this ETL activity is typically where 80% of any analytical effort takes place, forcing a lot of duplication and re-organization of data and consuming valuable engineering time, time that engineers should be spending improving the operation of their assets. This prevents engineers from becoming citizen data scientists.
The “Big Data” approach turns this process on its head. Rather than letting each use case define its own integration and create redundant data, Element Unify connects to the sources themselves, aggregating the raw data into one place. These direct connections ensure that your existing systems remain the systems of record, while also ensuring that data isn’t duplicated across teams.
How Element Unify builds and maintains Asset Twins to share with data consumers and applications
To construct an Enterprise Asset Model of your pump fleet, you need to define what a typical pump looks like. This is done through an Asset Template that provides a common structure for how all pumps should be defined. Without this structure, you’ll keep comparing apples to oranges, forcing a considerable amount of one-off customization work to support enterprise-scale analytics.
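To make the idea concrete, here is a minimal sketch of what an asset template might look like in code. All names, attributes, and the `conforms` helper are illustrative assumptions, not Element Unify’s actual template schema:

```python
from dataclasses import dataclass

# Illustrative sketch only: attribute names and the structure below are
# assumptions for this example, not Element Unify's template format.
@dataclass
class AssetTemplate:
    """A common structure that every asset of one class must follow."""
    asset_class: str
    design_attributes: list   # static metadata, e.g. manufacturer
    sensor_attributes: list   # time-series signals, e.g. discharge pressure

PUMP_TEMPLATE = AssetTemplate(
    asset_class="CentrifugalPump",
    design_attributes=["functional_location", "manufacturer", "rated_flow_gpm"],
    sensor_attributes=["suction_pressure", "discharge_pressure", "motor_current"],
)

def conforms(record: dict, template: AssetTemplate) -> bool:
    """Check that a candidate asset record supplies every template attribute."""
    required = set(template.design_attributes) | set(template.sensor_attributes)
    return required.issubset(record)
```

Because every pump must satisfy the same template, downstream analytics can assume a uniform set of attributes, which is exactly what makes apples-to-apples comparison possible at enterprise scale.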
With a pump template defined and connected to its data, there is now a reference framework for where to associate the data for any digitally represented asset. Historian data is associated with the sensor attributes of a template, and design data with attributes such as functional location or manufacturer name. These mappings are what define the Enterprise Asset Model of your pumps.
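The mapping step described above can be sketched as a simple translation from source-system fields to template attributes. The tag names, source records, and `instantiate` helper are hypothetical, chosen only to illustrate the idea:

```python
# Hypothetical example: tag names, field names, and the mapping format are
# assumptions for illustration, not Element Unify's API.

# Raw records as they arrive from two source systems.
design_record = {"FUNC_LOC": "1000-PMP-101", "MFR_NAME": "ACME Pumps"}
historian_tags = {"P101.PT1": 42.0, "P101.PT2": 187.5}

# The mapping layer: source field -> template attribute.
attribute_map = {
    "FUNC_LOC": "functional_location",
    "MFR_NAME": "manufacturer",
    "P101.PT1": "suction_pressure",
    "P101.PT2": "discharge_pressure",
}

def instantiate(sources: list, mapping: dict) -> dict:
    """Merge raw source records into one asset keyed by template attributes."""
    asset = {}
    for record in sources:
        for src_key, value in record.items():
            if src_key in mapping:
                asset[mapping[src_key]] = value
    return asset

pump_101 = instantiate([design_record, historian_tags], attribute_map)
```

The point of the sketch is that once every source field is mapped to a template attribute, records from different systems collapse into one consistent asset representation.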
With an Enterprise Asset Model now built, it’s important to know that you can trust it. The biggest challenge operators face is ensuring that the data underlying the Model is trustworthy: we need to know that the data values are of high integrity and whether there are any gaps. Element Unify automatically surfaces these issues within your Model.
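One common integrity check of the kind described above is detecting gaps in a sensor’s time series. The sketch below is illustrative only; the threshold, data, and function name are assumptions, not how Element Unify implements its checks:

```python
from datetime import datetime, timedelta

# Illustrative sketch of one automated integrity check: finding gaps in a
# sensor's timestamps. The 5-minute threshold is an assumed sampling interval.
def find_gaps(timestamps, max_interval=timedelta(minutes=5)):
    """Return (start, end) pairs where consecutive readings are too far apart."""
    ordered = sorted(timestamps)
    return [(a, b) for a, b in zip(ordered, ordered[1:]) if b - a > max_interval]

readings = [
    datetime(2021, 3, 1, 8, 0),
    datetime(2021, 3, 1, 8, 5),
    datetime(2021, 3, 1, 9, 30),   # 85-minute gap before this reading
    datetime(2021, 3, 1, 9, 35),
]
gaps = find_gaps(readings)
```

Surfacing gaps like this across every mapped sensor is what lets operators judge whether the Model is complete enough to base decisions on.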
Now that the Enterprise Asset Model is built, comprehensive, and assured, you can use its data model to harmonize hierarchical views across other systems, such as Honeywell Asset Sentinel, SAP Asset Registry, Aspen IP.21, and OSIsoft PI Asset Framework (AF). More importantly, you can now construct views of the aggregated data at will, enabling your use cases and the stakeholders who need to see information in a specific way, so they can continue to use the BI and analytical tools they already rely on.
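“Views at will” means the same underlying records can be projected into different shapes for different consumers. The sketch below shows the idea with hypothetical data; the records and helper functions are assumptions, not an Element Unify API:

```python
# Hypothetical asset records from an Enterprise Asset Model.
pumps = [
    {"site": "Refinery A", "unit": "Crude", "tag": "P-101", "manufacturer": "ACME"},
    {"site": "Refinery A", "unit": "Coker", "tag": "P-201", "manufacturer": "ACME"},
    {"site": "Refinery B", "unit": "Crude", "tag": "P-301", "manufacturer": "Flowmax"},
]

def hierarchy_view(assets):
    """Group the same records into a site -> unit -> [tags] hierarchy."""
    tree = {}
    for a in assets:
        tree.setdefault(a["site"], {}).setdefault(a["unit"], []).append(a["tag"])
    return tree

def table_view(assets, columns):
    """Project a flat table (list of rows) suitable for a BI tool."""
    return [[a[c] for c in columns] for a in assets]
```

A maintenance planner might consume the hierarchy while a reliability analyst pulls the flat table into their BI tool, both from the same model, with no new ETL.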
Data Hubs provide long-term value to your organization, supporting not just basic analytical activities but also machine learning, mixed reality, and other future technologies. With an aggregated data set and an enterprise data model of your fleet of pumps, you have an abstraction layer that doesn’t require bespoke connections. Additionally, these models are stored in a flexible, graph-based structure, allowing you to design and query the asset twin relationships in a multitude of ways. On top of this, Element Unify is built on a distributed compute architecture, meaning that if you need to scale elastically, the infrastructure grows with you.
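To illustrate why a graph-based structure allows querying relationships “in a multitude of ways,” here is a minimal sketch using an adjacency list and a breadth-first traversal. The asset names and edges are invented for the example and say nothing about Element Unify’s internal storage:

```python
from collections import deque

# Illustrative sketch: asset relationships stored as a graph. Node names
# and edges are assumptions for this example.
edges = {
    "Refinery A": ["Crude Unit"],
    "Crude Unit": ["P-101", "P-102"],
    "P-101": ["Motor M-101"],
    "P-102": [],
    "Motor M-101": [],
}

def descendants(graph, root):
    """Breadth-first traversal: every asset connected below a node."""
    seen, queue = [], deque([root])
    while queue:
        node = queue.popleft()
        for child in graph.get(node, []):
            seen.append(child)
            queue.append(child)
    return seen
```

The same graph answers many different questions (everything under a unit, every motor paired with a pump, and so on) without reshaping the data, which is the advantage over a fixed, single-purpose hierarchy.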
Most importantly, Element Unify has cybersecurity at its core. Built to run within your firewall, it encrypts data in motion entering and exiting the Hub, as well as data at rest inside the Hub. You hold the keys to your own firewall and your data, so you control who has access and who can interpret it.
To learn more about Element Unify, see a demo, and hear how customers are using it by building a 360-degree view of their data using Enterprise Asset Models, please view our recent webinar.