Sensor Trust

For Data Quality, Sensor Trust Is Key

In 2009, I led R&D and Technology Operations at Piramal Sarvajal, whose mission is to provide affordable access to safe drinking water for Indians in underserved areas. The large-scale, centralized water utility model had failed to get clean water to those most in need, so we set out to create a distributed network of water purification units and water ATMs, which today provide clean drinking water to 300,000+ Indians across 12 states.

To deploy those systems cost-efficiently, we developed new technology to manage maintenance, ensure water quality, and prevent theft. We began by deploying new control systems: GSM modem-connected programmable logic controllers (PLCs). All PLC data was stored in our cloud-based Data Historian. We were one of the first to centralize operational time-series data, aggregating sensor data from 200+ geographically dispersed assets. It allowed us to remotely monitor and manage every asset from our operations center, streamlining O&M costs by nearly 60%, and even earned us recognition as one of the most innovative companies.

To support our data-driven organization, we had to ensure that our sensor data could be trusted.

We began using our sensor data to predict and prevent failures, but soon encountered a critical problem: there were false alarms and undetected failures. Our field technicians discovered that some machine sensors produced unreliable data – in fact, instrumentation was causing 20-40% of our downtime and maintenance-related issues. To support our data-driven organization, we had to ensure that our sensor data could be trusted.

To address this, our technicians started checking instrumentation readings each time they performed any kind of activity in the field – a manual effort that added 20% more time to each visit, but overall halved the total number of visits by ensuring that the instrumentation was properly reporting data. As a result, we required fewer field technicians as we grew, and existing field technicians were able to focus on high-value activities instead of responding to dozens of false alarms due to bad sensor readings.

My experience at Sarvajal foreshadowed the promise of the Industrial Internet of Things (IIoT) and raised a fundamental question: How do we continuously instill trust in our instrumentation enterprise-wide, ensuring quality data collection for analytics, while maintaining business agility and cost efficiency?

How do we continuously instill trust... while maintaining business agility and cost efficiency?

There are two categories of data quality issues:

  1. Sensor Data Value Issues:
    Instrumentation calibration may have drifted, or the instrument may have physical or configuration issues, resulting in stale, noisy, or null values. Further investigation leads to repairing or replacing the instrumentation, or to fixing connection configurations between Level 0 and Level 3 systems (e.g., the OSIsoft PI System).

  2. Data Model Issues:
    A data model represents the workings and relationships of industrial equipment and processes such as a boiler feed or water deaerator. As new tags come online and other tags are decommissioned, the data model naturally evolves. Additionally, there may be data model gaps, mismatched units of measurement, and misaligned data types. When operating geographically dispersed assets with hundreds of thousands of tags, it becomes difficult to sustain data consistency continuously across all your assets. Data model issues make it extremely difficult to create and maintain a usable asset data model, which is required for comparing and assessing equipment, processes, and assets within and across different sites.
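To make the second category concrete, here is a minimal sketch of flagging unit-of-measure mismatches across sites. The tag records and field names are invented for the example and are not the Element Platform's actual data model or logic:

```python
from collections import defaultdict

# Hypothetical tag records: (site, measurement, unit_of_measure).
# In practice these would come from a Data Historian's tag metadata.
tags = [
    ("site_a", "discharge_pressure", "bar"),
    ("site_b", "discharge_pressure", "psi"),   # unit mismatch across sites
    ("site_a", "motor_temperature", "degC"),
    ("site_b", "motor_temperature", "degC"),
    ("site_c", "motor_temperature", None),     # missing unit of measure
]

def find_unit_mismatches(tags):
    """Group tags by measurement and flag inconsistent or missing units."""
    units_by_measurement = defaultdict(set)
    for site, measurement, unit in tags:
        units_by_measurement[measurement].add(unit)
    return {m: sorted(u, key=str) for m, u in units_by_measurement.items()
            if len(u) > 1 or None in u}

print(find_unit_mismatches(tags))
# {'discharge_pressure': ['bar', 'psi'], 'motor_temperature': [None, 'degC']}
```

At scale, the same grouping idea applies across hundreds of thousands of tags, which is why doing it by hand in Excel breaks down.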

The Element Platform solves these issues through Tag Mapping and Sensor Auditing.

Tag Mapping

Today, Tag Mapping is performed as a one-time activity to prepare data for analytics. It’s an error-prone and time-consuming manual process conducted one tag at a time using Excel. As data and data models degrade over time, intensive wrangling is required to “realign” certain subsets of data that are needed for analytics.

We’ve seen companies task a site engineer or field technician to monitor and manage up to 75,000 tags. However, most of our customers operate 500,000+ tags, which is unrealistic to manage manually and therefore impossible to scale.

In the Element Platform, Tag Mapping is an automated process of mapping sensors in bulk (e.g., hundreds of thousands at a time) to assets, processes, sites, regions, and any other related metadata. Tags are mapped against a company’s own asset and equipment templates (e.g., pumps, compressors, motors) or to industry template standards. Our customers perform Tag Mapping to produce asset data models that can, for example, be exported as Asset Frameworks (AFs) to be consumed within the OSIsoft PI System.
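As a rough illustration of bulk mapping against templates, the sketch below parses tag names against a naming convention and matches them to equipment templates. The convention, template names, and rules are invented for the example; the Element Platform's actual mapping is more sophisticated than pattern matching alone:

```python
import re

# Hypothetical naming convention: SITE-ASSETID-MEASUREMENT,
# e.g. "TX01-P102-DISCH_PRESS". Real historian tag names vary widely.
TEMPLATE_RULES = [
    (re.compile(r"^P\d+"), "pump_template"),
    (re.compile(r"^C\d+"), "compressor_template"),
    (re.compile(r"^M\d+"), "motor_template"),
]

def map_tags(tag_names):
    """Map raw tag names to (site, asset, template) records in bulk."""
    mapped, unmapped = [], []
    for name in tag_names:
        try:
            site, asset, measurement = name.split("-", 2)
        except ValueError:
            unmapped.append(name)  # name doesn't follow the convention
            continue
        template = next((t for rx, t in TEMPLATE_RULES if rx.match(asset)), None)
        if template:
            mapped.append({"tag": name, "site": site, "asset": asset,
                           "measurement": measurement, "template": template})
        else:
            unmapped.append(name)
    return mapped, unmapped

mapped, unmapped = map_tags(
    ["TX01-P102-DISCH_PRESS", "TX01-C201-SUCT_TEMP", "badtag"])
```

The unmapped bucket matters as much as the mapped one: surfacing tags that fail to map is what keeps the asset data model honest as tags come and go.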

Most importantly, the Element Platform performs continuous Tag Mapping so that as data naturally degrades, it’s continuously remapped and the resulting data model is always up to date and workable – no manual wrangling required.

Sensor Auditing

The Element Platform also automatically and continuously monitors and audits sensors for both data value and data model issues. It surfaces problems such as calibration drift, stale values, null values, and gaps. Engineers can generate custom reports to view aggregated issues across numerous sites or drill down to a single sensor at one site. Additionally, the platform surfaces issues in the data model, highlighting areas where there are instrumentation coverage issues (e.g., a specific value is measured in one place, but not everywhere), mismatched units of measurement, and other inconsistencies in the data model.
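A toy sketch of the kinds of checks such an audit performs on a single sensor's time series. The sample data, thresholds, and function names below are hypothetical, not the platform's implementation:

```python
from datetime import datetime, timedelta

# Hypothetical (timestamp, value) samples from one sensor; None = null reading.
t0 = datetime(2024, 1, 1)
samples = [
    (t0 + timedelta(minutes=0), 10.1),
    (t0 + timedelta(minutes=1), 10.1),
    (t0 + timedelta(minutes=2), 10.1),   # flatline -> stale value
    (t0 + timedelta(minutes=3), None),   # null value
    (t0 + timedelta(minutes=20), 10.4),  # 17-minute reporting gap
]

def audit_sensor(samples, stale_run=3, max_gap=timedelta(minutes=5)):
    """Flag stale runs, null values, and reporting gaps in one sensor's data."""
    issues = []
    run = 1
    for i in range(1, len(samples)):
        (prev_t, prev_v), (t, v) = samples[i - 1], samples[i]
        if v is None:
            issues.append(("null_value", t))
        run = run + 1 if v is not None and v == prev_v else 1
        if run >= stale_run:
            issues.append(("stale_value", t))
        if t - prev_t > max_gap:
            issues.append(("gap", t))
    return issues

print(audit_sensor(samples))
```

Running the same checks continuously across every sensor, and aggregating the results into site- and fleet-level reports, is what turns a one-off data-cleaning exercise into an ongoing audit.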

By surfacing data quality issues, the Element Platform helps engineers become proactive, preventing false alarms and undetected issues that would otherwise have gone unnoticed until a maintenance visit or after something went wrong.

Together, Tag Mapping and Sensor Auditing enable industrial SMEs, Data Historian administrators, field technicians, and engineers to maintain sensor reliability and to collect high-quality time-series data for analytics.

Whatever the use case, from bringing clean drinking water to those who need it to preventing failures in critical equipment, establishing trust in your sensor data is key to success.

To see Tag Mapping and Sensor Auditing in action, please contact us for a demo.