A Quick Primer on Element’s AssetHub

Andy provided a great overview of why industrial companies need a 360° view of their assets, why an asset-oriented Data Hub is needed to create that view, and what requirements a Data Hub must fulfill to meet the needs of asset-intensive industries. If you haven’t read it, and you’re struggling to enable your engineers and operators to become citizen data scientists, I highly recommend giving it a read. You’ll understand why prior analytics strategies just aren’t working.

With that background, Element’s AssetHub is the solution that fills this gap. Here’s a quick primer on Element AssetHub:

What is Element’s AssetHub?

AssetHub allows you to build an enterprise data model of your assets, using your existing systems to create a comprehensive view of your instrumentation, equipment and processes. It does this by connecting to real-time, design, and other operationally relevant systems. It then constructs a comprehensive data representation of your assets that we call Asset Twins.

Definition: Asset Twins are digital representations of real-world, physical objects

But rather than focusing on simulation use cases as Digital Twins tend to, the purpose of an Asset Twin is to aggregate operational data into a common enterprise model of your assets to solve operational challenges. With all the information in one place in its raw format, it becomes easy to create new views of the data needed to enable a given use case or group of stakeholders. This virtually eliminates the bespoke data wrangling or ETL projects that are typically a prerequisite to any analytics project.

The best way to describe this is through a real-world example: creating Asset Twins for a fleet of pumps…



Pumps have a lot of relevant data tied to their operations: historian information, operator notes and rounds, design documents, operating specs, lab reports, HazOps, and more. To enable insights across these data sets, you typically have to customize the integration to the systems relevant to your use case, then shape that data set for your analysis (an effort referred to as ETL). The challenge with this approach is that it is typically where 80% of any analytical effort takes place, forcing a lot of duplication and re-organization of data, and consuming valuable engineering time that engineers should be spending improving the operation of their assets. This is what prevents engineers from becoming citizen data scientists.

Instead, Element AssetHub turns this process on its head with a “Big Data” approach. Rather than having each use case define the integration and create redundant copies of data, AssetHub connects to the sources themselves, aggregating the raw data into one place. These direct connections ensure that your existing systems remain the systems of record, and that data isn’t duplicated across teams.

Connect, Manage, Share

How AssetHub builds and maintains Asset Twins to share with data consumers and applications


In order to construct an Asset Twin of your pump fleet, you first need to define what a typical pump looks like. This is done through an Asset Template that provides a common structure for how all pumps should be defined. Without this structure, you’ll continue to compare apples to oranges, forcing a considerable amount of one-off customization work to support enterprise-scale analytics.
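To make the idea concrete, here is a minimal sketch of what a pump Asset Template might look like as a data structure. The attribute names (discharge pressure, functional location, and so on) are illustrative assumptions for this example, not AssetHub’s actual schema or API.

```python
from dataclasses import dataclass

# Hypothetical Asset Template: a shared definition that every pump
# twin in the fleet must conform to. Attribute names are invented
# for illustration; they are not AssetHub's real schema.
@dataclass
class AssetTemplate:
    name: str
    sensor_attributes: list   # time-series signals, e.g. from a historian
    static_attributes: list   # design/reference data, one value per asset

pump_template = AssetTemplate(
    name="CentrifugalPump",
    sensor_attributes=["discharge_pressure", "suction_pressure", "motor_current"],
    static_attributes=["functional_location", "manufacturer", "rated_flow_gpm"],
)
```

Because every pump twin is instantiated from the same template, analytics written against these attribute names apply to the whole fleet, which is what removes the apples-to-oranges problem.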

With a pump template defined and connected to its data sources, each Asset Twin has a reference framework for associating its data: historian data maps to the template’s sensor attributes, and design data maps to attributes like functional location or manufacturer name. These mappings are what define the Asset Twin of your pump.
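The mapping step can be sketched as binding raw source identifiers to template attributes. The tag names, source URIs, and helper function below are assumptions made up for this sketch, not AssetHub’s interface.

```python
# Hypothetical twin-construction step: each template attribute is bound
# to a reference into an existing system of record (historian, SAP, etc.),
# rather than to a copied value. All identifiers here are illustrative.
def build_twin(asset_id, template_attrs, source_mappings):
    """Return a twin as {attribute: source reference}, flagging unmapped attributes."""
    twin = {"asset_id": asset_id, "attributes": {}, "unmapped": []}
    for attr in template_attrs:
        if attr in source_mappings:
            twin["attributes"][attr] = source_mappings[attr]
        else:
            twin["unmapped"].append(attr)
    return twin

twin = build_twin(
    "PUMP-101",
    ["discharge_pressure", "manufacturer"],
    {
        "discharge_pressure": "historian://PI/P101.DP",
        "manufacturer": "sap://EQ/P101/MFR",
    },
)
```

Note that the twin stores references into the source systems rather than copies of the data, which is how the existing systems stay the systems of record.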

With the pump twin now built, it’s important to know you can trust it. The biggest challenge operators face is ensuring that the data underlying the twin is trustworthy: we need to know that the data values are of high integrity and whether there are any gaps. With AssetHub you can easily identify these issues across your pump fleet.
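One common integrity check is scanning a sensor’s time series for gaps, i.e. stretches where samples are missing. The sketch below is a generic illustration of that idea, assuming epoch-second timestamps and a fixed expected sampling interval; it is not how AssetHub implements its checks.

```python
# Illustrative data-quality check: flag gaps where successive samples
# arrive further apart than the expected interval allows.
# Timestamps are epoch seconds; the tolerance factor is an assumption.
def find_gaps(timestamps, expected_interval, tolerance=1.5):
    """Return (start, end) pairs bounding each gap in the series."""
    gaps = []
    for prev, curr in zip(timestamps, timestamps[1:]):
        if curr - prev > expected_interval * tolerance:
            gaps.append((prev, curr))
    return gaps

# Samples every 60 s, with a missing stretch between t=120 and t=360.
samples = [0, 60, 120, 360, 420]
gaps = find_gaps(samples, expected_interval=60)
# gaps -> [(120, 360)]
```

Running a check like this per sensor, per twin, gives a fleet-wide picture of where the underlying data can and cannot be trusted.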


Now that our pump twin is operating as expected, you can use its data model to harmonize Hierarchical Views across other systems, such as Honeywell Asset Sentinel, SAP Asset Registry, Aspen IP.21, and OSIsoft PI Asset Framework (AF). More importantly, you can now construct views of the aggregated data at will, enabling your use cases and the stakeholders who need to see information in a specific way, so they can keep using the BI and analytical tools they already rely on.


Data Hubs provide long-term value to your organization, supporting not just basic analytical activities but also machine learning, mixed reality, and other future technologies. With an aggregated data set and an enterprise data model of your fleet of pumps, you have an abstraction layer that doesn’t require bespoke connections. Additionally, these models are stored in a flexible, graph-based structure, allowing you to design and query Asset Twin relationships in a multitude of ways. AssetHub is also built on a distributed compute architecture, so if you need to scale elastically, the infrastructure grows with you.
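To give a feel for what a graph-based asset model buys you, here is a minimal sketch of traversing twin relationships stored as edges. The site/unit/pump hierarchy and node names are invented for illustration; AssetHub’s actual graph model and query interface are not shown here.

```python
# Hypothetical asset graph: parent -> children edges for a small site.
# Node names are made up for this sketch.
edges = {
    "Site-A": ["Unit-1", "Unit-2"],
    "Unit-1": ["PUMP-101", "PUMP-102"],
    "Unit-2": ["PUMP-201"],
}

def descendants(graph, node):
    """Depth-first collection of every asset below `node` in the hierarchy."""
    found = []
    for child in graph.get(node, []):
        found.append(child)
        found.extend(descendants(graph, child))
    return found

# One traversal answers "every pump under Site-A" without a bespoke query
# per source system.
pumps_under_site = [n for n in descendants(edges, "Site-A") if n.startswith("PUMP")]
# pumps_under_site -> ["PUMP-101", "PUMP-102", "PUMP-201"]
```

Because relationships are first-class edges rather than columns in a fixed schema, the same graph can be walked by hierarchy, by process connection, or by any other relationship type you model.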

Most importantly, AssetHub has cybersecurity at its core. Built to be contained within your firewall, it encrypts data in motion entering and exiting the Hub, as well as data at rest inside the Hub. You hold the encryption keys, and your data stays behind your firewall, so you control who has access and who can interpret the data.

To learn more about AssetHub, see a demo, and hear how customers are building a 360-degree view of their data with Asset Twins, please view our recent webinar.