
Scaling Digital Transformation: Challenges and Considerations

Digital transformation offers opportunities for improved productivity through projects focused on condition-based maintenance or overall equipment effectiveness. Analytics that provide better visibility into asset health and operations also promote safety. Today, however, these analytics are typically deployed at the local plant level for only a handful of use cases.

Existing investments in analytic tools, such as statistical modeling applications, BI reporting, and data science notebooks, make it easy to build new analytics that drive insights. Yet organizations remain stifled, splitting their effort between redundant data preparation for each new project and complex workflows to maintain existing solutions.

The Obstacles That Block Digital Transformation Initiatives

Digital transformation blockers fall into two main categories:

  1. No system of record for asset data models

    Plants are designed from a process perspective, while maintenance and reliability initiatives focus on assets. Many existing vendor applications, such as OSIsoft Asset Framework and Microsoft Power BI, support their own data models for analytics and visualizations.

    It’s well known in the industry that configuring the analytics is not the hard part: as much as 80% of project effort is typically spent on model building and the accompanying data wrangling and preparation. Yet there is no centralized system of record for asset data models to represent and integrate critical asset information. This stops companies from scaling their digital transformation initiatives, because teams have to rework and re-wrangle data for each new initiative and platform.

  2. Lack of governance controls

    Manual processes and validation: Establishing trust in data models is critical to ensuring accurate information is surfaced to the business for safe operations and decision-making. But because most asset data models are configured manually in spreadsheets, there is no digital record of the mapping logic. Compounding the challenge, the business SMEs who understand the data have limited time to conduct the data discovery needed for quality assurance and quality control (QA/QC), leading to project delays.

    Model configuration and validation often require manual transcription from graphical P&IDs to spreadsheets, with no digital reference back to the precise location in the diagrams. When scaled across tens of thousands of assets, the asset data model becomes highly prone to human error. This leads to businesses making critical decisions with incorrect information and increases the risk of health, safety, and environmental (HSE) incidents.

    Workflow-driven monitoring and tracking of asset changes: After numerous models are deployed, tracking and monitoring changes in source systems is manual, if it happens at all. Updates are reactive, and a disproportionate amount of time is spent on repetitive data discovery.

    For example, a maintenance team may create new asset records in SAP, but there is no communication or visibility to prompt the creation of associated process historian tags. This causes existing asset models to degrade and report erroneous information over time.
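
    As a minimal sketch of how such a gap could be detected, the Python below cross-checks asset identifiers between two hypothetical CSV exports (one from SAP, one from the historian's tag metadata); the file names and column names are illustrative assumptions, not any product's interface:

        import csv

        def load_ids(path, column):
            # Read one identifier column from a CSV export into a set.
            with open(path, newline="") as f:
                return {row[column].strip() for row in csv.DictReader(f)}

        # Hypothetical exports: asset records from SAP, tag metadata from the historian.
        sap_assets = load_ids("sap_assets.csv", "asset_id")
        tagged_assets = load_ids("historian_tags.csv", "asset_id")

        # Assets created in SAP that have no corresponding process historian tags.
        for asset_id in sorted(sap_assets - tagged_assets):
            print(f"ALERT: asset {asset_id} has no historian tags; dependent models will degrade")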

    User access levels: Users often have different access levels across analytics software, and without audit trails, unintended changes can be made without anyone's knowledge. Teams today must balance proactively mitigating such issues against moving on to new initiatives.

    Reactive responses to data quality: Many points of failure exist along the sensor data collection journey from field devices to the distributed control system (DCS) to multiple process historians. These failures can stem from instrumentation issues or, more commonly, from an accidental configuration change that breaks the data collection chain.

    This disruption leads to lost days spent troubleshooting long sensor lists across multiple systems to pinpoint the exact issue, putting businesses at risk of making decisions with incorrect data. Proactively surfacing these issues with reporting reduces disruptions and helps the business make accurate decisions.
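
    One simple form of such proactive reporting is a staleness check. The sketch below, which assumes a hypothetical snapshot of each tag's most recent sample time pulled from a historian, flags sensors that have stopped updating; the threshold and tag names are illustrative:

        from datetime import datetime, timedelta

        STALE_AFTER = timedelta(hours=4)  # assumed threshold; tune per sensor class

        # Hypothetical snapshot of each tag's most recent sample time in the historian.
        last_samples = {
            "FIC-101.PV": datetime(2024, 1, 15, 8, 0),   # silent for four hours
            "TI-204.PV": datetime(2024, 1, 15, 11, 55),  # healthy
        }

        now = datetime(2024, 1, 15, 12, 0)
        for tag, ts in last_samples.items():
            if now - ts >= STALE_AFTER:
                print(f"Data collection issue: {tag} last updated {ts}")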

6 Considerations to Scale Your Transformation

Scaling requires a secure, centralized system of record for asset data models, with enterprise governance controls, that integrates into existing technology stacks and provides:

  • a jumping-off point with clean, ready asset context for any analytics project,
  • ease of maintenance to sustain existing projects,
  • proactive reporting and monitoring to detect issues, and
  • tools to mitigate and roll back unwarranted changes.

  1. System of record for asset data models
    A secure repository with access controls provides clean, ready-to-go context for any analytics project. It reduces the manual contextualization work that otherwise cannot be avoided when modeling assets and processes. The repository needs controls to easily import and merge existing models from various sources into a common, easily accessible format, and access controls that let reliability, maintenance, and data science consumers connect to and query the data, shortening data discovery.
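
    As a minimal illustration of what such a repository might hold, the sketch below models one asset with its attribute-to-tag mappings and references back to source systems; the class, field names, and identifiers are illustrative assumptions, not any product's schema:

        from dataclasses import dataclass, field

        @dataclass
        class Asset:
            asset_id: str                                     # e.g., a SAP functional location
            name: str
            attributes: dict = field(default_factory=dict)    # attribute -> historian tag
            source_refs: dict = field(default_factory=dict)   # links back to source systems

        repo = {
            "P-4711": Asset(
                asset_id="P-4711",
                name="Feed Pump A",
                attributes={"discharge_pressure": "PI-4711.PV"},
                source_refs={"sap": "FLOC-0010-P4711", "af": r"\\Plant1\Pumps\P-4711"},
            ),
        }

        # Any consumer resolves asset context with one query instead of re-wrangling data.
        print(repo["P-4711"].attributes["discharge_pressure"])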
  2. Automated monitoring and tracking
    Asset data is commonly stored across several data sources at a single plant, and across many more systems at the enterprise level. Reporting and downstream notifications help to proactively identify changes across systems over time and prevent disruptions to the business.
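
    One simple form of such monitoring is periodic snapshot comparison. The sketch below, a deliberately simplified illustration with made-up identifiers, diffs two point-in-time sets of asset IDs from a source system and emits the notifications a downstream team would act on:

        def diff_snapshots(previous, current):
            # Compare two point-in-time sets of asset identifiers from a source system.
            return sorted(current - previous), sorted(previous - current)

        yesterday = {"P-4711", "E-2201", "K-1002"}
        today = {"P-4711", "E-2201", "K-1003"}  # K-1002 retired, K-1003 commissioned

        added, removed = diff_snapshots(yesterday, today)
        for asset_id in added:
            print(f"Notify: new asset {asset_id} needs model context and historian tags")
        for asset_id in removed:
            print(f"Notify: asset {asset_id} removed; review dependent models")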
  3. Software-assisted validation and QA/QC
    Software-assisted validation for QA/QC is needed to reduce the time burden of manually soliciting SME feedback. Automated context extraction and software-assisted QA/QC workflows reduce human error and provide the means to review existing models.
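
    Such checks can encode the rules an SME would otherwise apply by hand. In the sketch below, candidate asset-to-tag mappings are screened against an assumed site naming convention before they ever reach human review; both the pattern and the mappings are illustrative:

        import re

        TAG_PATTERN = re.compile(r"^[A-Z]{1,4}-\d{3,5}\.PV$")  # assumed site convention

        def validate_mapping(asset_id, tag):
            # Return a list of issues for an asset-to-tag mapping; empty means it passes.
            issues = []
            if not tag:
                issues.append("missing historian tag")
            elif not TAG_PATTERN.match(tag):
                issues.append(f"tag '{tag}' violates the naming convention")
            return issues

        mappings = {"P-4711": "PI-4711.PV", "E-2201": "temp_sensor_3"}
        for asset_id, tag in mappings.items():
            for issue in validate_mapping(asset_id, tag):
                print(f"Flag for SME review, {asset_id}: {issue}")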
  4. Rollbacks and mitigation
    A central hub of asset models with reporting on differences between systems and across time periods is invaluable. Integrations with systems such as Asset Framework to compare, migrate, and roll back versions reduce maintenance effort and simplify the workflows to manage models, identify changes, and mitigate potential issues.
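
    Versioning the model makes both comparison and rollback mechanical. The sketch below, assuming models are stored as simple per-version attribute dictionaries (an illustrative structure, not Asset Framework's API), reports what changed and restores the prior version:

        versions = {
            "v1": {"P-4711": {"discharge_pressure": "PI-4711.PV"}},
            "v2": {"P-4711": {"discharge_pressure": "PI-9999.PV"}},  # suspect change
        }

        def diff_versions(old, new):
            # Yield (asset, attribute, before, after) for every changed attribute.
            for asset, attrs in new.items():
                for attr, after in attrs.items():
                    before = old.get(asset, {}).get(attr)
                    if before != after:
                        yield asset, attr, before, after

        for asset, attr, before, after in diff_versions(versions["v1"], versions["v2"]):
            print(f"{asset}.{attr}: {before} -> {after}")

        current = versions["v1"]  # rollback: re-deploy the last validated version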
  5. Tracking data lineage
    Tracking the flow of business logic through integration and contextualization is required to ensure trust and reusability. It provides a layer of auditing that improves troubleshooting and validation, which is nearly impossible to manage with manual, error-prone workflows.
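
    A lineage record can be as simple as an ordered chain of hops, each recording the system, the identifier there, and the logic applied. The sketch below traces one historian tag back to its field device; all names and logic descriptions are illustrative:

        # Each hop records the system, the identifier there, and any logic applied.
        lineage = [
            {"system": "field device", "id": "FT-101", "logic": "4-20 mA flow transmitter"},
            {"system": "DCS", "id": "FIC101/PV", "logic": "scaled to 0-500 m3/h"},
            {"system": "historian", "id": "FIC-101.PV", "logic": "1 s scan, compression on"},
            {"system": "asset model", "id": "P-4711.feed_flow", "logic": "mapped by SME"},
        ]

        # Print the audit trail from source to consumer for troubleshooting.
        for hop in lineage:
            print(f"{hop['system']:>12}: {hop['id']}  ({hop['logic']})")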
  6. Data quality reporting
    Proactive reporting of data collection issues on troubled sensors, combined with data lineage from field devices to process historians, results in faster resolution and minimizes disruptions to the business. Decisions made through digital transformation are only as good as the quality of the underlying data.

Element AssetHub™ is a secure enterprise system of record for asset data models that integrates with existing technology investments and includes data governance capabilities. These capabilities extend to new IIoT services on Azure or AWS that securely democratize access to asset information, allowing companies to scale their transformation initiatives.