Developing normalized operations data and a consistent set of operating key performance indicators (KPIs) is a basic building block of any smart manufacturing or continuous improvement initiative. Normalized KPIs can lead to baseline gains of 3%-5% in productivity as a precursor to even deeper gains that come from intentional process optimization.
Developing normalized KPIs helps you benchmark against best practices and drive toward operational excellence.
The objective is to get everyone executing as flawlessly as possible. When every operator and every plant is aligned on the same metrics and knowledge base, they can proactively take action at the first sign of any issues.
Most Smart Manufacturing or Digital Transformation initiatives start with project-based investments that support the learning and discovery needed to create pathways to operational improvement. For example, project-based investments might examine production efficiencies in individual plants, in the supply chain, or in remote operations.
However, enterprise productivity gains from these individual projects are impossible to scale until there is a strategy to connect them through a cross-enterprise data architecture and data model that can support normalized KPIs and performance benchmarking.
Traditional code-based data models are too cumbersome, cost-prohibitive, and resource-intensive to support an enterprise data model. In a code-based environment, it can take six months just to write and test the code to bring a single plant's operating data into alignment with enterprise data pipelines.
By contrast, a no-code solution like the Element Unify platform allows all IT/OT/ET data sources to be quickly tagged and brought into an Asset Hierarchy. The timeframe for a single plant to bring its operating data into alignment with the enterprise data architecture and data pipelines drops from six months to two to four weeks.
The no-code approach also shortens the time needed to normalize data into consistent KPIs or “data twins.” Normalizing data means contextualizing in-plant data at the asset level (e.g., pumps, boilers, motors) and at the process level across plants. Data twins, in turn, allow for much faster, less resource-intensive digital twin model building.
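To make the idea of normalization concrete, the sketch below shows in plain Python how plant-specific sensor tags might be mapped onto a shared asset model so one KPI definition works at every site. The tag names, plants, and readings are entirely hypothetical illustrations, not the Element Unify API.

```python
# Hypothetical sketch: map plant-specific sensor tags onto a shared asset
# model so one KPI definition works at every site. Tag names, plants, and
# readings are invented for illustration; this is not a vendor API.

# Each plant names the same measurements differently.
raw_readings = {
    "plant_a": {"PMP01_FLW": 118.0, "PMP01_RUN_HRS": 18.0},
    "plant_b": {"P-101.FlowRate": 122.0, "P-101.Runtime": 21.0},
}

# The normalization layer maps local tag names to shared asset attributes.
tag_map = {
    "plant_a": {"PMP01_FLW": "pump.flow_m3h", "PMP01_RUN_HRS": "pump.runtime_h"},
    "plant_b": {"P-101.FlowRate": "pump.flow_m3h", "P-101.Runtime": "pump.runtime_h"},
}

def normalize(plant: str) -> dict:
    """Return a plant's readings keyed by shared attribute names."""
    return {tag_map[plant][tag]: value
            for tag, value in raw_readings[plant].items()}

def utilization_pct(data: dict, period_h: float = 24.0) -> float:
    """A simple normalized KPI: pump runtime as a share of the period."""
    return round(100 * data["pump.runtime_h"] / period_h, 1)

for plant in raw_readings:
    print(plant, utilization_pct(normalize(plant)))  # plant_a 75.0, plant_b 87.5
```

Once every plant's data lands in the same attribute namespace, the KPI definition is written once and applied everywhere, which is what makes cross-plant comparison meaningful.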
Beyond speeding the time to normalized KPIs, a no-code solution makes it possible for front-line operating teams to use benchmark data to see where they can drive improvement.
Plant-level teams and shift teams within plants can now benchmark production and identify opportunities to increase efficiency across assets, shifts, and processes.
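As an illustrative sketch of that kind of benchmarking (the plant names and yield figures below are invented), a team with normalized KPIs could compare each site against the fleet median and flag the laggards:

```python
from statistics import median

# Hypothetical first-pass-yield KPI per plant, already normalized so the
# numbers are directly comparable across sites.
first_pass_yield = {
    "plant_a": 94.2,
    "plant_b": 89.1,
    "plant_c": 96.0,
    "plant_d": 91.5,
}

fleet_median = median(first_pass_yield.values())

# Flag any plant more than 2 percentage points below the fleet median.
laggards = sorted(plant for plant, fpy in first_pass_yield.items()
                  if fpy < fleet_median - 2.0)

print("fleet median:", round(fleet_median, 2))
print("below benchmark:", laggards)
```

The same comparison works shift-to-shift within a plant; the only requirement is that the KPI is computed the same way everywhere.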
One global food manufacturer found that “making KPIs based on normalized data visible to new operators would typically lead to 3% to 5% gains in productivity at each plant.” This was true across hundreds of plants running a range of different manufacturing processes.
That same normalized KPI data is being used to create advanced analytics to enable more proactive recommendations. These productivity improvements can come from catching equipment failure before it happens, mitigating human error, and more easily scaling best practices from shift to shift and plant to plant.
Up to 23% of unplanned outages across almost any manufacturing operation are caused by human error, and many more outages attributed to equipment failure can be traced back to missed opportunities or mistakes. Human error can be minimized through normalized metrics, consistent measurement, and scaled best practices.
The further out an operating team can predict or anticipate potential equipment failures or the more visibility they have on an underperforming operating process, the greater their ability to plan for and execute top operational performance.
The foundation for these gains is your operations data: transparency, measurement, and context enable you to optimize your processes to increase efficiency and productivity.
Questions? Please contact us.