Andy Bane, CEO of Element Analytics, recently joined TWiT.tv’s weekly podcast This Week in Enterprise Tech to discuss bringing order to IT and OT data. The podcast host, Louis Maresca, began the conversation by enthusiastically asking Andy about his origin story.
I had been working in software for decades, primarily serving asset-intensive industries like power generation and transmission and distribution, helping companies use both IT data and operational data to solve problems. Those problems range from controlling automation, asset management, and field service management all the way to planning and forecasting. Today this work would be described as data scientists doing artificial intelligence and machine learning.
The problems are still much the same – trying to figure out how to optimize industrial operations to be more efficient, and how to keep people safer and out of harm's way. That brought me to Element, where we had an opportunity to take this work to the next level with the industrial internet of things and help industrial operators take advantage of their IT/OT data to solve some of these hard problems.
Element works across a number of sectors, such as chemicals, energy, food and agriculture, and pharmaceuticals. We are particularly focused on operational problems where there is a lot of data coming off of equipment, and on how to bring that data together in context so it can later be used for machine learning or even basic analytics. For example, two problems our customers are trying to solve are predicting equipment failure and improving uptime.
Element is focused on the problem of data contextualization, which enables companies to perform advanced analytics. One of the challenges in the industrial world is the abundance of time series data coming from sensors installed on equipment instrumentation. That data has no object model; it is stored in flat files and is very poorly labeled, giving you little more than a unit of measure and a timestamp. To do something useful with it, like predicting the failure of a piece of equipment, you must bring in data from other sources, such as your SAP system or engineering design system, to create a richer context around the time series. Today this is typically done with spreadsheets or ad hoc Python code, which doesn't scale: these methods can't consume enough data to fulfill modern use cases. We solve that problem through software.
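As a minimal sketch of what that contextualization step looks like, the join below attaches asset metadata to flat historian readings using pandas. The tag names, tables, and columns are hypothetical, chosen only to illustrate the idea, not Element's actual implementation:

```python
import pandas as pd

# Hypothetical flat historian export: just a tag name, timestamp, value, and unit.
timeseries = pd.DataFrame({
    "tag": ["PT-101", "PT-101", "TT-205"],
    "timestamp": pd.to_datetime(
        ["2023-01-01 00:00", "2023-01-01 00:01", "2023-01-01 00:00"]
    ),
    "value": [42.1, 42.3, 181.0],
    "unit": ["psi", "psi", "degF"],
})

# Hypothetical asset metadata from a maintenance or engineering system,
# mapping each sensor tag to the equipment and site it instruments.
assets = pd.DataFrame({
    "tag": ["PT-101", "TT-205"],
    "equipment": ["Pump-7", "HeatExchanger-2"],
    "site": ["Plant A", "Plant A"],
})

# Joining on the tag attaches equipment context to every reading, so
# downstream analytics can group by equipment instead of raw tag names.
contextualized = timeseries.merge(assets, on="tag", how="left")
print(contextualized[["site", "equipment", "timestamp", "value", "unit"]])
```

A spreadsheet can do this for a handful of tags; the difficulty is doing it repeatably across thousands of tags and many source systems.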
We get people off of spreadsheets and into software, using no-code/low-code data transformations built to go after this problem. We then store the data model in a knowledge graph, which is a very flexible way to build data models. The knowledge graph makes it easier to tackle complex use cases in a singular, repeatable fashion.
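To illustrate why a knowledge graph is a flexible way to model assets, the sketch below stores a toy asset model as subject-predicate-object triples and answers a question by traversing them. The entities and predicates are invented for the example and are not Element's schema:

```python
# Hypothetical asset model as subject-predicate-object triples,
# the basic shape a knowledge graph stores.
triples = [
    ("Plant-A", "contains", "Pump-7"),
    ("Pump-7", "instrumented_by", "PT-101"),
    ("Pump-7", "instrumented_by", "TT-205"),
    ("PT-101", "measures", "pressure"),
]

def neighbors(graph, subject, predicate):
    """Return all objects linked from `subject` by `predicate`."""
    return [o for s, p, o in graph if s == subject and p == predicate]

# Traversal answers questions a flat tag list cannot, e.g.
# "which sensors instrument Pump-7?"
print(neighbors(triples, "Pump-7", "instrumented_by"))  # ['PT-101', 'TT-205']
```

Because new relationships are just new triples, the model can grow (new equipment, new data sources, new predicates) without redesigning a rigid schema.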
A good example is Google Maps or Waze for traffic navigation. Under the surface, they have many layers of data: roads, bridges, buildings, etc. All those layers must be snapped together. This is easy for navigation apps because they use a geospatial key. They have access to the raw data, such as the location on earth, the price of gasoline, or the traffic pattern at the time the user is traveling.
This is not so easy for industrial companies. They don't have a geospatial key that allows them to snap together the data layers. Element becomes the glue that unifies their data:
We unify these layers of data so that you can get more of a Waze or Google Maps experience. We help them create a map of a plant or across a fleet of plants. We have customers who have plants globally, and they want to do machine learning across many use cases, but it takes a lot of data – we help them achieve that outcome.
We deploy on both AWS and Azure. Any company can go to the AWS or Azure marketplace and explore Element Unify with a free trial. With Unify, they can connect their time series systems with the other systems that bring context to the data. They can quickly bring that data together and create a digital representation (or digital twin) of physical assets and processes so that they can run root cause analysis, predictive maintenance, or whatever the use case may be.
The interview concludes with questions and comments from Oliver Rist, Executive Editor of PCMag. The three continue a lively discussion on a variety of topics, including the CEO’s perspective on AI and analytics, what data engineers can do to increase the value of their data, and why companies should treat their data as an asset. You can watch the entire podcast here on TWiT.tv.