In a previous blog, the Element Unify™ Primer, we described how Element Unify connects to real-time, design, and other operationally relevant systems to aggregate data and build Asset Twins that solve operational challenges.
This blog explores how we connect to producing systems and ingest that volume and variety of data using agents and APIs.
One of the biggest challenges for industrials is the long tail of both the variety and the volume of data. It is Big Data coming in all shapes and forms: large databases in the terabytes, many smaller design files trapped in inaccessible PDFs, and medium-sized spreadsheets that contain almost anything, from planning data and work orders to tag configurations. The sheer scale of this challenge is often overwhelming enough to prevent the data from being used at all.
To address this problem, we must first define it so that we can break it down into smaller units for further inquiry. Element Unify defines data along three dimensions:
Data Categories are the broadest dimension of data: a way to refer to data based on how it is generated at the source. Categories of data include Operational Data, Maintenance and Repair, Operations Determinants, Fleet History, Tests and Inspections, Design Documents, and Notes and Rounds.
Data Sources refer to the source systems that contain the data. Some examples include Historians for Operational Data, EAMs and WAMs for Maintenance and Repair Data, and P&IDs and DataSheets for Design Documents. Some of the top vendors in these spaces include OSIsoft PI and AspenTech InfoPlus.21 for Historians, and SAP PM and IBM Maximo for EAMs.
Data Type is the lowest common denominator of the data that we have in Element. The two main data types are model data and time series data. Model data refers to data that adds to the richness of the data model; this includes tag configuration data (tag metadata), asset registries, and design documents. Time series data refers to data that changes continuously over time, and it can arrive at very different densities: vibration data sampled in the hertz range (one hertz is one cycle per second, so a 5-hertz signal yields 5 data points per second), sensor data recorded anywhere from every second to every minute to every hour depending on how the sensor is configured, and transactional data that may arrive once a day, once a week, or less often.
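To make the two data types and their densities concrete, here is a minimal sketch. The class and field names are our own illustration, not Element Unify's schema, and the helper simply works out the data volume implied by a sampling density:

```python
from dataclasses import dataclass

@dataclass
class TagMetadata:
    """Model data: tag configuration / metadata describing an asset."""
    tag_name: str
    asset: str
    unit: str

@dataclass
class TimeSeriesPoint:
    """Time series data: a timestamped measurement for a tag."""
    tag_name: str
    timestamp: float  # seconds since epoch
    value: float

def points_per_day(sample_rate_hz: float) -> int:
    """Data volume implied by a sampling density, in points per day."""
    return round(sample_rate_hz * 60 * 60 * 24)

# Densities mentioned above: 5 Hz vibration data vs. one reading per minute.
vibration_volume = points_per_day(5.0)       # 432,000 points per day
per_minute_volume = points_per_day(1 / 60)   # 1,440 points per day
```

The gap between those two numbers is the point: a single vibration tag can produce hundreds of times more data per day than a once-a-minute process sensor, which is why ingestion has to account for density, not just tag count.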
Now that we have a better understanding of the data we are working with, we can do a deeper dive on how we go about connecting to different data sources and ingesting all categories of data.
There are three components to Element Unify’s Connect capabilities - Data Agents, Connectivity APIs, and the Data Import Portal. All three components work together to deliver data to Element Unify.
Below is a simplified way of thinking of their inter-relationship:
The Data Import Portal is where an Element user can establish a connection with a data source. They can define how that dataset is named within Element Unify, add metadata for more usability (better searchability, for example), define what data to pull, and set the frequency of data collection.
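A connection definition of this kind might be sketched as follows. All field names here are hypothetical, chosen only to mirror the four choices described above; the post does not show Element Unify's actual schema:

```python
# Hypothetical connection definition, mirroring what a user specifies
# in the Data Import Portal (illustrative field names only).
connection = {
    "dataset_name": "plant1_historian_tags",  # name of the dataset in Element Unify
    "metadata": {                             # extra metadata to aid searchability
        "site": "Plant 1",
        "owner": "reliability-team",
    },
    "selection": "tags matching 'P1.*'",      # what data to pull
    "schedule": "every 15 minutes",           # frequency of data collection
}

def validate_connection(conn: dict) -> bool:
    """Check that all four pieces the portal asks for are present."""
    required = {"dataset_name", "metadata", "selection", "schedule"}
    return required.issubset(conn)
```

Validating the definition up front means downstream components (the APIs and agents) can assume a complete work order.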
Our Connectivity APIs have two functions. They interpret the request defined by the user and create the appropriate instructions so that the agent can execute the right data-collection activity. They also define how we ingest the data collected by agents and ensure it can be used within the application.
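Those two roles can be sketched as a pair of functions: one turns a user-defined connection into a concrete work order an agent can execute, and one normalizes what the agent returns so the application can ingest it. The names and shapes below are our own illustration, not Element Unify's API:

```python
def build_agent_request(connection: dict) -> dict:
    """Translate a portal connection definition into an agent work order."""
    return {
        "action": "collect",
        "query": connection["selection"],       # what to pull from the source
        "interval": connection["schedule"],     # how often to pull it
        "deliver_to": connection["dataset_name"],
    }

def normalize_payload(raw_rows: list) -> list:
    """Shape raw agent output into records the application can use."""
    return [
        {"tag": tag, "timestamp": ts, "value": val}
        for tag, ts, val in raw_rows
    ]
```

Keeping translation and normalization in the API layer means agents stay simple executors: they neither parse user intent nor know the application's internal data shapes.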
Data Agents are stand-alone agents that either live within Element Unify and pull data from source systems, or live remotely on the data source's server.
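Wherever an agent lives, its core job is the same: execute the work order it received from the Connectivity APIs and ship the results back. A minimal, illustrative-only loop (a real agent would also handle authentication, retries, and would sleep between cycles according to the collection frequency):

```python
def run_agent(request: dict, fetch, ship, cycles: int = 1) -> int:
    """Execute `fetch` against the source per the work order and
    forward each batch of rows via `ship`. Returns rows delivered.

    `fetch` and `ship` are stand-ins for source-specific collection
    and delivery logic (hypothetical, for illustration only).
    """
    delivered = 0
    for _ in range(cycles):
        rows = fetch(request["query"])          # collect from the source system
        ship(request["deliver_to"], rows)       # hand off to the Connectivity APIs
        delivered += len(rows)
    return delivered
```

The agent never decides *what* to collect; that decision was already made by the user in the portal and encoded by the APIs.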
This separation of responsibilities is advantageous in many ways as it allows us to:
An upcoming blog will detail how Element Unify connects to Historians like OSIsoft PI and AspenTech InfoPlus.21.