
IoT, Streaming Data, Analytics, and a Little Design Thinking (Part 2)

A Microsoft and Element Podcast

I recently participated in a discussion with the Microsoft industry experiences team (David Starr and Diego Tamburini) to talk all things IoT, including the cultural changes that data-informed decision making brings to organizations.

The discussion was recorded as a podcast. This is Part 2. 

Listen to the full podcast here.

Part 2 covers rethinking analytics as a capability model, and the importance of self-service and design thinking as part of digital transformation.

Diego: Sameer, one question specific to customers that want to use machine learning on IoT data. I’ve heard that the main roadblock is not having enough data science expertise to go around, plus figuring out how to select the right machine learning algorithm and tune it with the right parameters. It’s not trivial at all! So how do your tools help address that problem, especially for those who don’t necessarily have a data scientist in-house?

Sameer: My recommendation is to think about analytics not as a maturity model, but as a capability model. You want to increase your capability to deliver analytics to your organization. Just as automation and control added a new capability to industrials in Industry 3.0, analytics, data, and other digital capabilities are driving adoption of Industry 4.0. There are three stages of the journey to build that analytics capability.

The first stage is descriptive analytics. This is where people should stop looking at data sensor by sensor, comparing a temperature sensor at one location against the temperature sensor at another, or looking at one piece of equipment or one process. What you need to do is look at the fleet level. Try to standardize all your pumps using functional location or make-model, or whatever it might be. But at least get all your pumps, or some other class of equipment, and standardize that. That way you can benchmark your pump efficiency across the entire fleet and start to ask, “Why is this group worse than the others?”

The second stage in the analytics journey is what we call diagnostic analytics. That means labelling your data with key pieces of information about what happened and when, so you can ask, “These pumps are suffering a recurring failure, maybe every month. Can we start to diagnose what’s going on right before that?” Because once you have that labelled information in your time series, you can say, “OK, let’s look at what maintenance activity and what operator round activity is going on right before that issue.” That way we can diagnose these issues using data, not just a fishbone diagram, to understand why this is happening.

The third and final stage is what I call predictive analytics, and that is where machine learning comes into play. For predictive analytics you need a high degree of data quality. You have standardized, contextualized data. You have labeled data. Now you can say, “OK, I know what’s going on in this data set and I have enough statistics, either historical or from like equipment, to apply machine learning to tackle that.” And yes, many folks might not have a data scientist, but companies like Element, and many other service providers, can provide data-scientist-as-a-service. However, data scientists can’t tackle the problem unless the data is in the right form. There’s a frequently referenced statistic from Harvard Business Review that about 80% of the time spent adopting machine learning or AI goes into readying the data. And that’s only one part of the equation. You also need the right infrastructure in place, and you need to make sure the data is in a form that people can understand. You need to resolve these areas before you can think about bringing in a data scientist to solve your problems.
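
To make the descriptive stage concrete, here is a minimal sketch of fleet-level benchmarking in pandas. The DataFrame columns (asset_id, make_model, efficiency) are hypothetical placeholders, not a schema from any specific product:

```python
import pandas as pd

# Hypothetical time-series readings: one row per pump per hour.
# Column names (asset_id, make_model, efficiency) are illustrative only.
readings = pd.read_csv("pump_readings.csv", parse_dates=["timestamp"])

# Standardize: group every pump by make/model so like equipment
# is compared against like equipment across the whole fleet.
fleet = (
    readings
    .groupby(["make_model", "asset_id"])["efficiency"]
    .mean()
    .rename("avg_efficiency")
    .reset_index()
)

# Benchmark each pump against its make/model peer group.
fleet["peer_median"] = fleet.groupby("make_model")["avg_efficiency"].transform("median")
fleet["gap_vs_peers"] = fleet["avg_efficiency"] - fleet["peer_median"]

# Surface the worst performers: "Why is this group worse than the others?"
print(fleet.sort_values("gap_vs_peers").head(10))
```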

Q. And we haven't even mentioned yet that you have a product that lives in this space - Element Unify™. Can you tell us about that?

Sameer: Yes. Element Unify is all about connecting to different brownfield and new IoT data sources. We bring all that data together and help create a digital model of the physical environment, or what many people call the digital twin. You encode the physics into the model. You encode the people who were working on that equipment. You start to create a fully standardized, fully contextualized model of the data.

On top of that, you start to ensure data integrity: making sure the data model is accurate, making sure the units of measure are consistent and converting them where needed, and identifying where there are bad signals in the data, whether mis-calibrated or noisy. We’re constantly checking to make sure we can maintain the data model if things get moved around or replaced, and whether the data is up to date; we keep that history as well. Then we push that to Azure Data Lake so customers can use Azure Databricks, SQL Data Warehouse, Power BI, Azure ML, or their favorite data science and BI tools. Or if you want to run SQL queries, you’re able to pull that information out of the data because you’ve created the schemas for whatever insight you want to go after.

What we’re seeing when customers explore analytics is that it’s not just the one use case today but the thousands of use cases they need tomorrow. So how do you make sure your organization and your IT capabilities are able to feed all those insights? Able to let your newly minted engineers run Python or R queries? Able to do all the various analyses they need to do? You need to make sure your data is ready for that. Element Unify really helps organizations do this: getting the data into a standardized, contextualized model for generating insights. Element Unify also creates a graph database that lets you easily slice and dice the data, helping you connect to the raw data stored in all those source systems.
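
As a toy illustration of the unit-of-measure integrity check described above, here is a minimal sketch in plain Python and pandas. The tag names, units, and canonical unit are all hypothetical; this is not Element Unify’s actual implementation, just the general idea:

```python
import pandas as pd

# Hypothetical sensor metadata: the same physical quantity arrives
# from different source systems in different units of measure.
sensors = pd.DataFrame({
    "tag":   ["TI-101", "TI-102", "TI-103"],
    "value": [72.5, 295.3, 23.1],
    "uom":   ["degF", "K", "degC"],
})

# Normalize everything to a single canonical unit (Celsius here)
# so fleet-level comparisons are apples-to-apples.
TO_CELSIUS = {
    "degC": lambda v: v,
    "degF": lambda v: (v - 32.0) * 5.0 / 9.0,
    "K":    lambda v: v - 273.15,
}

unknown = set(sensors["uom"]) - set(TO_CELSIUS)
if unknown:
    # Flag rather than silently drop: an integrity check, not a silent fix.
    raise ValueError(f"Unrecognized units of measure: {unknown}")

sensors["value_degC"] = [
    TO_CELSIUS[u](v) for v, u in zip(sensors["value"], sensors["uom"])
]
print(sensors)
```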

Q. So once the data is there, do you then leave it up to the customer to come to the data and put whatever reporting mechanisms on top of it they want?

Sameer: Yes and no. Some customers are more mature in their digital capabilities and want to be more self-serve. For others, this is their first step on the analytics journey. We have a team focused on making sure their deployment goes well and that they can utilize the technology on Azure. We also have people who are focused on analytics. So if you are looking for data science or machine learning help, or help pulling data into Power BI, we ultimately want that to be self-serve. We teach people how to fish. We have a model where we’ll first be a player for them, then a player-coach, and finally a coach, to make sure they can be self-running with a new analytics environment. And then we also have people who are more strategic in their focus, who say, “OK, maybe we are helping build this discipline of analytics in one part of the organization. But let’s help push this to other parts of the organization, for example the downstream business, making sure people are much more self-serve, and turning us into a more analytically driven organization going forward.”

Q. And so it truly changes the culture of the company?

A. Oh yeah, for sure. When you talk about digital transformation, you need to be thinking about change management. And when you think about change management, there are technology changes, organizational changes, and process changes. Yes, our software can help with the technology pieces, but you need to be thinking about this more holistically. We provide delivery services to make sure we can help customers do analytics. Most of our team have done this in the AdTech and commerce industries, and as we all know, those have been massively transformed by analytics. Our teams know how to deal with digital transformation there and are now helping to guide industry by applying our design thinking and machine learning experience.

Q. So not to go too far on a tangent, but are you guys using design thinking in your solutions?

A. Yes. I usually help folks go through a design thinking workshop. Workshops run from one to three days, depending on the scope of the organization and what they are looking for. What we typically do is go through a process of identifying the use cases, resulting in a lot of post-it notes on the whiteboard! We make sure we have people from the business side, the IT side, and the OT side, and that there’s alignment, as well as an executive sponsor who is saying, “Yes, we need to pursue this.” Typically we have 50+ use cases in about half an hour, so we whittle that down to the low-hanging fruit of about 20 or so. Then we go through a set of criteria, asking, “Is this the right thing to be going after right now? Do we have the data available today? Do we have the right people on board with this?” All the checkpoints that need to happen to make sure there is proper alignment.

We then go through a value modeling exercise and Crazy Eights. Crazy Eights is where everyone sketches what they want the solution to look like from a visualization or process standpoint. Then we say, “OK, let’s pull on these two or three ideas” to start the organizational buy-in, and finally narrow to one idea that we can tackle.

We go through this process with new and production customers. For production customers we have a checkpoint every six months to ask, “What are the objectives for the next six months? What are the key results, and what are the tactical items we need to get done in order to achieve them?” It’s really helpful for folks to keep pushing along the analytics journey, using design thinking as an approach to percolate the best ideas up to the top, so we can tackle them and keep pushing forward on this capability.

David: That's fantastic. I'll put links to the Stanford program for design thinking in the show notes.

Sameer: That’s where a lot of the thinking has come from for us as well. We have quite a few folks from the Stanford D.School (https://dschool.stanford.edu/) on the team.

Diego: Could you tell us a little more about the data ingestion side? I understand that Element Unify helps customers by doing data joins and digital twin mappings. On the data ingest side, how close to the device do you get? And what assumptions do you make about the protocols devices speak and the way they tag data? Can you elaborate on that side, closer to where the rubber meets the road, if you will?

Sameer: Yes. We only connect to systems like an enterprise asset management system or a process historian; we’re not connecting to the actual edge devices or the gateways directly. Say someone has a new set of Raspberry Pis and they want to send that data to their gateway and then to Azure IoT Hub. We connect to IoT Hub and pull that information in. We also connect to data historians and process historians like the OSIsoft PI System, Honeywell PHD, AspenTech IP21, or Schneider Wonderware. I could go through the whole long list of systems we connect to directly.
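
For readers curious what the hand-off at IoT Hub looks like in practice, here is a minimal sketch of reading device-to-cloud telemetry from an IoT Hub’s built-in Event Hub-compatible endpoint using the azure-eventhub Python SDK. The connection string, entity name, and JSON payload shape are placeholders, and this is the standard Azure consumption pattern rather than Element’s implementation:

```python
from azure.eventhub import EventHubConsumerClient

# Placeholders: your IoT Hub's Event Hub-compatible connection string
# and entity name, found under the hub's "Built-in endpoints" blade.
CONNECTION_STR = "<event-hub-compatible-connection-string>"
EVENTHUB_NAME = "<event-hub-compatible-name>"

def on_event(partition_context, event):
    # Each event is one device-to-cloud message; a JSON payload is
    # assumed here purely for illustration.
    telemetry = event.body_as_json()
    # IoT Hub stamps the sending device's ID into system properties
    # (keys are bytes in this SDK).
    device_id = event.system_properties.get(b"iothub-connection-device-id")
    print(device_id, telemetry)

client = EventHubConsumerClient.from_connection_string(
    CONNECTION_STR,
    consumer_group="$Default",
    eventhub_name=EVENTHUB_NAME,
)

with client:
    # Read from the start of each partition; blocks until interrupted.
    client.receive(on_event=on_event, starting_position="-1")
```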

Diego: That answers my question: you assume that when it comes to IoT telemetry or data records, you start from the Azure IoT Hub. That’s where you pick up the data, and Element focuses on what happens to it from that point onwards.

Sameer: Exactly. Our take on this is that there are a lot of folks building out sensors, devices, and gateways to push that data to the cloud. That’s great, as there are enough use cases around that. Others are building bespoke analytics for specific use cases, around stick-slip for oil and gas drilling or whatever it might be. That’s great as well, and those can be applied as models. But you also need to connect that data in the context of everything that’s going on at the edge: not just one sensor or one piece of equipment, but the whole process, the whole facility. Things are upstream and downstream of each other, working together. You can’t just look at one manufacturer’s application or one sensor’s analytics; you’ve got to look at the whole thing in context. So we bring all that data together to make sure you can go off and tackle the first one or two use cases, but also look at things at a broader scale as well.
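
A toy sketch of that “whole process in context” idea, using networkx to model upstream/downstream relationships between assets. The topology and asset names are invented for illustration; a real model would come from P&IDs or a digital twin:

```python
import networkx as nx

# Toy asset topology: edges point downstream (feed -> pump -> exchanger -> reactor).
process = nx.DiGraph([
    ("feed_tank", "pump_P101"),
    ("pump_P101", "heat_exchanger_E201"),
    ("heat_exchanger_E201", "reactor_R301"),
    ("pump_P102", "heat_exchanger_E201"),
])

# An anomaly on the exchanger can't be judged in isolation: pull in
# everything upstream that could be causing it, and everything
# downstream that it could affect.
asset = "heat_exchanger_E201"
print("upstream of", asset, ":", nx.ancestors(process, asset))
print("downstream of", asset, ":", nx.descendants(process, asset))
```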

David: I’d love to keep going with this, but I’m afraid we’ve reached the end of our time. So I want to thank Sameer and Diego very much for joining the show today. Appreciate it, you guys; it’s been very informative for me.