
Maximize the value of your Operational Data – It’s Too Valuable an Asset Not to!

September 15, 2021

Last week Joe Perino, Principal Analyst at LNS Research, joined Element CEO Andy Bane for a discussion on how to maximize the value of operational data. I found it quite engaging, and while there is no substitute for watching the recording, I wanted to write a quick recap of their discussion and some of the questions that came up.

Using operational data is not easy. Maximizing its value requires facing several challenges head-on, from connectivity, to providing easy yet secure access at scale, to establishing a sound “DataOps” infrastructure that allows analytics processes to operate effectively at speed.


The new OT ecosystem is complex.
The drive for faster decisions based on more data is fueling IT/OT convergence. Formerly air-gapped layers are increasingly being connected to improve data flows all the way from the edge to the cloud. This is a net positive, but it results in a more significant data-wrangling undertaking and an enlarged attack surface that must be protected from a cybersecurity perspective. IT/OT convergence is also manifesting in organizations and their work processes: leading organizations are embracing integrated teams that draw from both IT and OT.


Bringing the data together requires thought. There are choices to consider when bringing data together for analytics. One needs to account for the various data sources (e.g., time series data, processed data, …), where the data will be stored (a data lake?) and made available for use, and how to ensure it is properly contextualized for meaningful insights. All of this must be done with the target use case(s) in mind. Simply connecting analytics at the historian level may appear expedient in the short term but can run into difficulties when scaling beyond the local assets or site. This is where the importance of a scalable architecture with a sound foundation really comes to light.
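To make the contextualization point concrete, here is a minimal sketch of what "adding context" to time series data can look like. All tag names, asset names, and sample values are hypothetical, and this is a generic pandas join, not Element Unify's actual mechanism:

```python
import pandas as pd

# Raw historian readings: cryptic tag names, no asset context (hypothetical data)
timeseries = pd.DataFrame({
    "tag": ["FI-101.PV", "TI-204.PV", "FI-101.PV"],
    "timestamp": pd.to_datetime(
        ["2021-09-01 00:00", "2021-09-01 00:00", "2021-09-01 00:05"]
    ),
    "value": [42.1, 180.5, 43.0],
})

# Asset metadata kept in a separate system (e.g., an asset registry export)
metadata = pd.DataFrame({
    "tag": ["FI-101.PV", "TI-204.PV"],
    "asset": ["Pump-01", "HeatExchanger-03"],
    "site": ["Plant A", "Plant A"],
    "unit": ["m3/h", "degC"],
})

# Contextualize: every measurement now carries its asset, site, and unit,
# so analytics can filter and aggregate by asset rather than by raw tag name
contextualized = timeseries.merge(metadata, on="tag", how="left")
print(contextualized[["asset", "site", "timestamp", "value", "unit"]])
```

The hard part at enterprise scale is not the join itself but maintaining the tag-to-asset mapping as plants change, which is where a governed metadata layer earns its keep.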


Disunited metadata is a major hurdle to scaling analytics. The typical industrial data environment is not built for large-scale, modern analytics. Piecemeal, brittle integrations are unable to keep up with the dynamic nature of the data (e.g., plant floor changes, technology upgrades, etc.). There is limited ability to efficiently and reliably blend data from disparate sources for use by the myriad analytical applications and tools. Lack of governance limits project longevity and extensibility to other use cases, requiring repeated configuration to keep data fresh and to address new use cases. Even if a company is willing to spend large amounts of money to refresh data it is impossible to gain back the time spent re-wrangling the data. Simply putting all the data in a data lake is no panacea either without the requisite contextualization and governance. A lot of valuable data (especially time series data) remains fundamentally siloed and under-utilized due to lack of association and management of its context.


Uniting IT/OT metadata is key. Fast and dependable progress on the united-data journey requires contextualization and governance of enterprise metadata. Element Unify snaps onto a company’s data architecture to automate contextualization across IT/OT systems and provide governance while leveraging investments in existing systems. A unified metadata system of record enables analytics and application integration at the metadata layer rather than directly at the tag layer or against individual source systems. This can then power an ever-growing range of use cases of interest to the various teams (e.g., process, reliability, safety, ESG, corporate) and really help maximize the value of operational data.


Take a lifecycle view of metadata. If metadata is not managed throughout its lifecycle, it is not managed. Unify takes a complete lifecycle view integrating metadata, including existing asset models, through contextualization and provisioning all the while ensuring governance.


Is your approach driving the right value? When it comes to maximizing the value of your operational data, four key dimensions stand out. Speed: being able to quickly get data ready for analytics. Quality: using high-quality, complete data. Governance: ensuring data is up to date and maintained. Scale: being able to efficiently extend use beyond early projects to other sites and use cases.


The foregoing discussion culminated in a set of takeaways for maximizing the value of operational data. I will let them stand on their own and move on to the Q&A.


There was certainly more in the discussion than the highlights I have paraphrased above and in the Q&A below - to get the full value you will have to watch the recording.

What are leading companies doing about their DataOps efforts?

Most companies are working to address DataOps, exploring technology such as Element Unify. The IT side tends to pull together approaches by drawing on a mix of vendors, such as database vendors and hyperscalers, coupled with DIY (do-it-yourself) builds. The OT side is typically less interested in DIY and seeks solutions that address its immediate problems. A dominant architecture has not yet emerged, but the leading companies have grasped the importance of being able to scale and of having IT and OT work together; they are starting to see value from their efforts.

Is it possible to manually enter values for metadata in Element Unify? The PI System does not have an easy way to enter data for static attributes which are not linked to external systems – we currently use Excel sheets for this.

Yes, it is. More importantly, with Unify you can have a template-based common vocabulary to describe assets across plants or across the entire fleet. Not only can you eliminate the use of Excel sheets, but you also get a uniform, normalized way of dealing with your assets.
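The idea of a template-based common vocabulary can be sketched in a few lines. The template, alias table, and attribute names below are all hypothetical illustrations of the pattern, not Unify's actual data model:

```python
# A shared template defines one fleet-wide vocabulary and expected types,
# replacing per-site Excel sheets with divergent column names.
PUMP_TEMPLATE = {"rated_flow_m3h": float, "manufacturer": str, "install_year": int}

# Local records from two plants use inconsistent attribute names
plant_a = {"RatedFlow": "120", "Mfr": "ACME", "Installed": "2015"}
plant_b = {"flow_rating": "95.5", "maker": "ACME", "year_installed": "2018"}

# Alias table mapping each plant's local names onto the canonical vocabulary
ALIASES = {
    "rated_flow_m3h": {"RatedFlow", "flow_rating"},
    "manufacturer": {"Mfr", "maker"},
    "install_year": {"Installed", "year_installed"},
}

def normalize(record: dict) -> dict:
    """Map local attribute names to the template and coerce value types."""
    out = {}
    for canonical, caster in PUMP_TEMPLATE.items():
        for local, value in record.items():
            if local in ALIASES[canonical]:
                out[canonical] = caster(value)
    return out

# Both plants' records now share identical keys and types
print(normalize(plant_a))
print(normalize(plant_b))
```

Once every asset record conforms to one template, downstream comparisons across plants (benchmarking, fleet-wide analytics) stop requiring per-site reconciliation work.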

How are leading companies bridging the metadata model with transactional and real time data? Are the metadata models being imported into a data lake for merging the metadata with the transactional data?

The metadata model needs to be managed in an independent, foundational layer that is available for broad use, not embedded within a narrow-use system such as the historian, which fundamentally does not accommodate other transactional data types. Employing custom code at the visualization layer is also not the appropriate approach for this bridging. Having an independent metadata management capability that is flexible, extensible, scalable, and maintainable is the way to go.

Why does Unify use a graph model?

The relationships that Unify seeks to help our customers model are inherently “graphy” and not strictly hierarchical. The relationships span different dimensions – they are not limited to plants and assets but rather include processes, organization, regulations, etc. Employing a graph model makes it easy to build the model in the first place, and to maintain, update, and scale it as the modeled environment evolves and analytics needs change. Further, a graph model lends itself to fast visualization for big-picture analysis that guides more detailed analytics.
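A tiny sketch can show why a graph beats a strict hierarchy: the same asset node participates in several dimensions at once (location, process, regulation). The node names and relation labels below are illustrative, not Element Unify's actual model:

```python
from collections import defaultdict

# Relationships as (source, relation, target) triples spanning multiple
# dimensions; a single tree could capture at most one of these hierarchies.
edges = [
    ("Pump-01", "located_in", "Plant A"),
    ("Pump-01", "part_of_process", "Cooling Loop"),
    ("Pump-01", "regulated_by", "API 610"),
    ("HeatExchanger-03", "part_of_process", "Cooling Loop"),
]

# Build an adjacency map; store inverse edges so traversal works both ways
graph = defaultdict(list)
for src, rel, dst in edges:
    graph[src].append((rel, dst))
    graph[dst].append((f"inverse_{rel}", src))

# One asset answers questions along several dimensions at once...
print(graph["Pump-01"])

# ...and a process node fans out to every asset it touches.
assets_in_loop = [n for rel, n in graph["Cooling Loop"]
                  if rel == "inverse_part_of_process"]
print(assets_in_loop)
```

In a strict hierarchy, "Pump-01" would have to live under exactly one parent; the graph lets plant, process, and regulatory views coexist over the same nodes.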

What is the advantage of using Element Unify instead of a well-designed, templated PI asset framework that has links to external systems?

PI asset frameworks are difficult to build and maintain. Unify lets you materialize not only PI asset data but also other data (e.g., SAP data, P&ID data). Often Unify customers start by incorporating their P&ID data using the Unify P&ID Client to complement their PI data. One Unify customer has uploaded more than 200 PI asset frameworks into Element Unify, where they can enrich and manage them without spreadsheets and Python; instead, they use easy-to-use (and maintainable) drag-and-drop pipelines. Also, OSIsoft's own cloud offering, OCS, is not able to ingest PI AFs, whereas Unify can.

What makes the leaders stand out when it comes to using operational data?

Joe: Leaders test lots of approaches in an agile mode and then place bets judiciously. They pay attention to building infrastructure properly so that they can scale beyond just one analytic. They have done a good job of integrating (not just “interfacing”) IT and OT teams.

Andy: They also focus on business value first, then consider the people and processes and how technology should be applied, rather than being seduced by the next hot tech. They are fast learners, willing to iterate in an agile fashion, to operate effectively within an ecosystem, and to orchestrate solutions that work for them.

Element Unify: Uniting IT/OT metadata – bridging the IT/OT data divide

At Element we are on a mission to unify IT and OT metadata for analytics. To be clear, technology is not the whole answer for digital transformation; it is part of the answer. Our team has deep industrial and data expertise, which informs our business and product development. We believe that, given the complex industrial environment and the multitude of IT and OT data sources, it is important to have a mechanism for integrating the source data, contextualizing it appropriately, and provisioning it for consumption while ensuring governance throughout the lifecycle.

Purpose-built for industrials with a knowledge graph foundation and enterprise-grade security, Element Unify does all this. Unify snaps onto your data architecture as a foundational element and does not require “rip and replace” changes, leaving your users with the freedom of choice to use the tools they prefer but knowing that they can rely on the underlying data because their models are governed.

If you are looking to maximize the value of your operational data and believe data quality and governance can help you in your journey, I suggest checking out Element Unify and considering a free trial.