I talk to a lot of IT leaders from industrial organizations who are working to get more value from the mountains of industrial data they collect by feeding it into advanced analytics. It’s especially hard for them to chart a path forward and find the signal in the noisy bombardment from Industrial Internet of Things vendors. Pressure increases when the CEO returns from Davos wanting to know “When will Industry 4.0 be ready?”
Getting started quickly while containing risk is top of mind for these industrial IT leaders. Time and again we find ourselves discussing the same topics, especially why and how they should move analytical workloads to the cloud. The cost and performance advantages of moving analytics to the cloud are well understood, but IT leaders have concerns around how this move can be done in a scalable and non-disruptive way.
Because the cloud can unlock the greatest value from time-series sensor data and enable advanced analytics, we talk to IT leaders about these 4 strategies that we’ve adopted to help them on their way:
1. No Rip and Replace
Your organization has invested handsomely in systems to collect and store real-time, time-series sensor data. Extend and enhance these existing investments wherever possible--no rip and replace--while working to convert data into greater value. Familiar systems, infrastructures, tools and capabilities don’t require a huge investment or steep learning curve. While your cloud vendor may encourage you to start fresh, figuring out ways to extend an already familiar data collection and historian infrastructure from the plant to the enterprise to the cloud is a much easier path than having to do customized retooling.
We also believe that to do advanced industrial analytics, not only should your time-series sensor data be in the cloud, but it needs to be stored in a cloud-enabled historian rather than a data lake or blob. A data historian enables you to easily and quickly work with time-series data, and it allows for more convenient and accurate modeling.
2. Ensure Real-Time Stability
Analytics architecture can’t overload existing operational systems. You can’t have the data science team constantly hammering operational historians with queries, risking the site engineer’s ability to access the real-time data they need to operate critical assets. Likewise, you can’t have your data science team sitting idle because they’re locked out of on-premises data when the system goes into an “Operators Only” mode following a critical event at the plant. Both problems are avoided by moving all the historical time-series sensor data to the cloud once in bulk, then streaming or micro-batching subsequent time-series sensor data to the cloud on a regular interval, with little impact on real-time systems.
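As a rough illustration of that pattern, here is a minimal Python sketch of a one-time bulk backfill followed by watermark-based micro-batches. The in-memory `ARCHIVE` list and the `bulk_backfill` and `micro_batch` functions are hypothetical stand-ins for whatever historian API and cloud ingestion service you actually use; the point is that each incremental run asks the operational system only for records newer than the last timestamp already shipped.

```python
from datetime import datetime, timedelta

# Hypothetical in-memory stand-in for an on-premises historian archive.
# In practice this would be a read-only query against the historian's API.
ARCHIVE = [
    (datetime(2023, 1, 1) + timedelta(minutes=i), 20.0 + i * 0.1)
    for i in range(10)
]

def bulk_backfill(archive):
    """One-time bulk export of all historical records to the cloud store.

    Returns the exported records plus a watermark (last timestamp seen),
    so later micro-batches never re-read history from the live system.
    """
    records = list(archive)
    watermark = records[-1][0] if records else None
    return records, watermark

def micro_batch(archive, watermark):
    """Export only records newer than the watermark -- a small, cheap
    query that barely touches the operational historian."""
    batch = [(ts, v) for ts, v in archive if ts > watermark]
    new_watermark = batch[-1][0] if batch else watermark
    return batch, new_watermark

cloud_store, watermark = bulk_backfill(ARCHIVE)

# New sensor readings arrive after the bulk load...
ARCHIVE.append((datetime(2023, 1, 1, 0, 10), 21.0))

# ...and the next scheduled micro-batch picks up only the new record.
batch, watermark = micro_batch(ARCHIVE, watermark)
cloud_store.extend(batch)
```

A real deployment would persist the watermark durably and push each batch through a streaming service, but the shape of the loop is the same: one heavy read of history, then many light incremental reads.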
3. Capture Everything in the Cloud
Analytical models perform best when the greatest amount of data is available—historic and current. Having months and years of data available for every pump, compressor, process train—everything that is worthy of instrumenting—and having that data available at a fidelity equal to the sensor data being collected at the source will accelerate time to value for analytics.
Also, on-premises analysis is unique to each site and not readily shared, making it nearly impossible to compare processes and performance or share insights enterprise-wide. By storing sensor data, and other operations data, in the cloud, operators across the globe can benchmark, compare, and contribute to collective knowledge, applying best practices for process optimization.
For example, we deploy the OSIsoft PI System on Microsoft Azure as our cloud-based time-series data infrastructure, regardless of whether the data originates from an on-premises PI System, a third-party data infrastructure, or another source such as a Telematics Service Provider. Pulling data from a trusted source like your PI System is critical and simplifies data transfer. The PI System accounts for the many quirky realities of industrial time-series sensor data--interpolation across uneven sample intervals, machine turn-off periods, statistical regularities that hold at some points in time and break down at others--realities that are critical to understand when doing advanced analytics.
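To see why those quirks matter, here is a small, hypothetical Python sketch (not PI System code) of interpolation that refuses to bridge a gap longer than a threshold -- for example, a period when the machine was simply turned off. A naive linear interpolation would happily fabricate plausible-looking values across that gap and quietly poison any model trained on them.

```python
from datetime import datetime, timedelta

def interpolate_at(samples, t, max_gap=timedelta(minutes=5)):
    """Linearly interpolate a sensor value at time t, but refuse to
    bridge gaps longer than max_gap (e.g. the machine was turned off).

    samples: time-sorted list of (timestamp, value) pairs.
    Returns the interpolated value, or None inside an oversized gap.
    """
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        if t0 <= t <= t1:
            if t1 - t0 > max_gap:
                # A historian flags this region rather than inventing data.
                return None
            frac = (t - t0) / (t1 - t0)
            return v0 + frac * (v1 - v0)
    return None  # t falls outside the recorded range

base = datetime(2023, 1, 1)
samples = [
    (base, 10.0),
    (base + timedelta(minutes=1), 12.0),
    # machine off for an hour -- no samples recorded
    (base + timedelta(minutes=61), 50.0),
]

print(interpolate_at(samples, base + timedelta(seconds=30)))  # 11.0
print(interpolate_at(samples, base + timedelta(minutes=30)))  # None: gap too large
```

A cloud-enabled historian handles this kind of bookkeeping (and much more) for you; storing the same data as raw files in a data lake or blob leaves it for every analyst to reimplement.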
4. Your Cloud, Your Control
Industrial companies have fiery internal debates and struggle with the idea of moving sensitive data to a vendor-controlled cloud. Our answer: don’t struggle--keep control of your data in your cloud. Sensitive data work can be done entirely in a cloud environment that you control, without giving up the IT benefits of traditional SaaS solutions, where the care and feeding of the platform is left to the vendor.
For example, if you run a PI System today, deploy a PI System on your company’s Azure tenant and stay in control of who may access both your Azure tenant and the PI System data, and how. You already have PI System admins familiar with deploying and supporting that system—why make them struggle to learn something new that will likely require a lot of customized tooling before it’s useful? Also, take advantage of existing capabilities like the PI Asset Framework to support your asset data model (managed directly from the PI System) and to control data access for internal data consumers and the third-party vendors supporting your operations.
There’s a lot on the line for IT leaders because the decisions made today will live on in their enterprises for years to come. These 4 strategies can accelerate the delivery of advanced analytics to the business, increase trust among operators skeptical of data quality and the resulting analytics, and ease the pain for IT leaders endeavoring to deliver on the promise of getting greater value out of their industrial data.