
Unify Graph - Driving Digital Twin Development

In a companion post, I wrote about why knowledge graphs in particular are central to realizing business outcomes with digital twins. Today, I want to discuss how Element Unify brings a knowledge graph-based approach to powering digital twins.

A fundamental appreciation of the relevance of graphs to analytics has underpinned our approach to architecting and developing Element Unify from the beginning. The result is customer benefits across three key stages of the digital twin journey:

  • Building (and maintaining) the graph model
  • Exploring the model locally
  • Provisioning the model for consumption by analytics systems

Building (and maintaining) the graph model

The first step, from a user or modeler’s point of view, is connecting to various data sources and building a graph model that supports the specific business outcome or decision-making objective the digital twin is supposed to enable. Element Unify provides an extensive set of pre-built connectors to facilitate quick and easy connections to myriad data sources (and also to consuming destinations such as digital twin platforms and tools).

Figure 1 - Pre-built connectors for quick and easy connections

In addition to the connectors, Unify also exposes an API for ingesting, loading, and provisioning. Unify also supports ingestion from P&IDs (piping and instrumentation diagrams). Ingested datasets are managed by the Unify Datasets Catalog, a central repository. 

Unify includes a Template Library pre-populated with hundreds of asset templates, including templates based on industry-standard ontologies. Modelers can use these templates as they are, ingest their own, modify existing ones, or create templates from scratch to meet their needs. Templates support inheritance, so modelers can derive new templates from existing ones to whatever depth is required. The template construct plays an important role in building the knowledge graph, helping ensure that graph elements conform to their intended specifications, which greatly speeds the development of useful graphs.

Element Unify provides a simple method for building the graph model that puts the various data in context. This contextualization, representing relationships across systems, is achieved quickly and automatically using Unify’s automated, no-code pipelines. Users create the pipelines by operating on the datasets and templates and leveraging hundreds of transformations, including many purpose-built for IT/OT data. This no-code approach frees users from writing (and maintaining) Python code or dealing with even more cumbersome, difficult-to-govern spreadsheets, and it can represent arbitrary entities and their associated relationships in a knowledge graph.

Figure 2 - No-code development of graph model
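To make the idea concrete, here is a hypothetical, Cypher-style sketch of the kind of entities and relationships such a pipeline might materialize in the graph. The node labels (Pump, Sensor), properties, and relationship type are illustrative assumptions for this post, not Unify’s actual schema:

```cypher
// Hypothetical asset and sensor nodes (labels and properties are illustrative)
MERGE (p:Pump {id: "P-101", site: "Plant A"})
MERGE (s:Sensor {tag: "FT-101", type: "Flow"})
// Contextualize: relate the sensor to the asset it instruments
MERGE (s)-[:MEASURES]->(p)
```

In Unify itself this structure is produced by the no-code pipelines rather than hand-written statements; the sketch simply shows the shape of the resulting graph.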

Governing your data and data models is important not only for meeting compliance obligations but also for reducing the cost, time, and friction created by using spreadsheets and Python code. Unify is built on an event-driven, reactive architecture and persists data in the Unify Graph, which serves as a data abstraction layer that can be kept up to date, aligning the digital representation of assets and operations with the ground truth of what’s happening in the physical world.

Exploring the model locally

As the model is being developed, it is possible to explore it (for debugging and for insight generation) using Unify’s visualization capabilities. Simply write Cypher queries to home in on relationships of interest, then explore by interacting graphically with the result.

Figure 3 - Local exploration of model in Element Unify
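As a flavor of what such exploration looks like, here is a hypothetical Cypher query; the labels, properties, and relationship type are illustrative assumptions, matching no particular customer model:

```cypher
// Hypothetical query: find all sensors instrumenting pumps at Plant A
// (node labels, properties, and relationship types are illustrative)
MATCH (s:Sensor)-[:MEASURES]->(p:Pump {site: "Plant A"})
RETURN p.id, s.tag
```

Running a query like this in Unify returns a subgraph you can then inspect and navigate visually.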

Provisioning the model for consumption by analytics systems

Local exploration, based on the built-in visualization and Cypher queries, is all well and good, but when it comes to building digital twins, the next step in the journey is taking the model developed in Unify and making it available to a consuming system. Unify has integrations with digital twin enablers such as AWS IoT TwinMaker, making the task of building effective digital twins a whole lot more tractable. Users configure the data model required for a target use case in Unify and then deploy it to the consuming system, for example AWS IoT TwinMaker. I recommend watching this demo to get a flavor of what can be achieved by using Element Unify together with such platforms. Unify is not limited to provisioning models for digital twin initiatives; it can feed any consuming system relevant to your analytics needs, including data lakes, data warehouses, BI tools, and custom applications, delivering a true “contextualize once, consume anywhere” capability.

Watch the Unify integration with AWS IoT TwinMaker demo video

The net result of all this is that, with Unify Graph, Element customers can more easily adopt a “think big, start small, scale fast” approach to their digital twin initiatives, starting small and then flexibly and incrementally adding more as their needs require, building on existing work.

Ready to learn more about how Element Unify can help with your digital twin initiatives? Please reach out for a discussion with our digital twin experts.

Want to Learn More?

Register to view a live demo of Unify, or sign up for a free trial to use the software for 30 days. You can also start a free trial or purchase a Personal License through the AWS Marketplace or Azure Marketplace.

Questions? Please contact us.