Across every industry segment, the Internet of Things (IoT) generates amounts of data that would have been inconceivable to CIOs a decade ago. For utilities, IoT data from smart meters and other connected devices that make up the Smart Grid holds the potential to improve operational efficiencies. In fact, utility CIOs are placing big bets on the promise of this technology. In the 2018 Gartner CIO Survey, utility CIOs mentioned IoT as a key technology enabler more often than their peers in other industries. Utility CIOs also prioritize Business Intelligence and Analytics, but are they missing out on a required capability to extract the most value from all this data and to turn the potential into reality?
With the high volume and velocity of data in an increasingly sensor-rich smart grid, today’s utilities need a different type of data management solution than the one required for traditional, stationary transactional data. Two critical differences between management systems for streaming and stationary data are:
• IoT data needs to be managed and analyzed in real time rather than relying exclusively on historical reporting and analysis.
• Streaming IoT data can deliver additional value when analyzed in multiple phases versus once or twice with stationary systems.
Real-time analytics versus after-the-fact analysis
Imagine that you are monitoring the performance of a gas-fired power plant. IoT sensors continuously monitor temperature, vibration, humidity, speed, and other metrics of turbine performance. While the turbine still operates within expected bounds, trends in the sensor data indicate that contaminants are building up on compressor surfaces. If you can react before a failure, you can prevent a costly and potentially dangerous situation. If you can’t, the sensor data has delivered no value.
Event stream processing systems enable you to act on this information in a timely fashion through real-time data cleansing and analytics.
When event stream processing systems manage data from IoT sensors, they perform processes that turn raw data into useful information in real time. As large amounts of data rapidly stream into the system, event stream processing systems cleanse, normalize and aggregate data immediately in-memory. Simultaneously, real-time analytics models applied to these streams evaluate incoming events to determine whether a particular event is relevant and generate instant alerts when urgent action is needed.
When the turbine is in danger of a safety shutdown (or worse), real-time analytics at the edge immediately alert the production operators to take the appropriate action.
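The kind of in-stream check described above can be sketched in a few lines. This is a minimal illustration, not a real event stream processing engine: the rolling window size, the drift limit, and the efficiency readings are all invented for the example.

```python
from collections import deque

# Hypothetical sketch: watch a rolling window of compressor efficiency
# readings and raise an alert when a sustained downward trend suggests
# contaminant buildup. The window size and drift limit are invented.
WINDOW = 5
DRIFT_LIMIT = -0.012  # alert if average per-reading change falls below this

def detect_drift(readings, window=WINDOW, limit=DRIFT_LIMIT):
    """Return (index, drift) pairs where the recent trend breaches the limit."""
    recent = deque(maxlen=window)
    alerts = []
    for i, value in enumerate(readings):
        recent.append(value)
        if len(recent) == window:
            drift = (recent[-1] - recent[0]) / (window - 1)
            if drift < limit:
                alerts.append((i, round(drift, 4)))
    return alerts

efficiency = [0.91, 0.91, 0.90, 0.89, 0.87, 0.85, 0.84]
print(detect_drift(efficiency))  # alerts fire as the decline steepens
```

A production system would evaluate a model like this in-memory as events arrive, rather than over a Python list, but the logic is the same: the alert fires while there is still time to act.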
Event stream processing systems also filter data in real time. Because the memory in which these systems initially store data is limited, the event stream processing system decides what data to discard and what to keep long term.
For example, if a sensor is tracking temperature and the temperature stays steady, the system doesn’t store ongoing readings. Instead, it might retain only the readings that indicate a change.
In contrast, traditional relational database management systems (RDBMSs) store all data and perform cleansing and analysis after the fact. These systems collect data from predefined sources and store it in a persistent storage system, such as a data mart. Once in storage, data is cleansed, normalized and consolidated into a data warehouse or Hadoop. Only then can users derive meaning from the data through reporting, historical analysis, and even predictive analysis and machine learning.
Multi-phase analytics
Another difference between event stream processing and traditional, stationary data analytics is that the former gives you multiple opportunities to extract value from your data. With traditional data management, data is historical and does not change; it is typically analyzed only once or twice after the fact.
As discussed previously, event stream processing systems first analyze data in real time, enabling an immediate response to events.
Additionally, in real time or near real time, you can bring a subset of the data from multiple sensors back to the cloud or on-premises for cross-sensor analysis.
You might perform analysis across your entire fleet of turbines to determine fault conditions occurring under certain environmental conditions. If the system detects a problem, it could trigger a proactive maintenance procedure.
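A fleet-level check like this amounts to grouping fault events by environmental condition and comparing rates. The sketch below is a toy illustration; the turbine IDs, condition labels, and events are all invented.

```python
from collections import defaultdict

# Hypothetical cross-fleet analysis: group fault observations by ambient
# condition to see whether faults cluster under particular environments.
def fault_rate_by_condition(events):
    """events: iterable of (turbine_id, condition, faulted) tuples."""
    totals = defaultdict(lambda: [0, 0])  # condition -> [faults, observations]
    for _, condition, faulted in events:
        totals[condition][0] += int(faulted)
        totals[condition][1] += 1
    return {cond: faults / n for cond, (faults, n) in totals.items()}

events = [
    ("t1", "high-humidity", True),
    ("t2", "high-humidity", True),
    ("t3", "high-humidity", False),
    ("t1", "normal", False),
    ("t2", "normal", False),
]
rates = fault_rate_by_condition(events)
print(rates)  # in this toy data, faults cluster under high humidity
```

If a condition shows an elevated fault rate, the system could trigger a proactive maintenance procedure for turbines currently operating under it.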
Finally, the event stream processing system also stores specified data in a data warehouse or Hadoop. There you can perform analytics on the now historical data. In this case, you could use machine learning algorithms for predictive maintenance. Over time, machine learning algorithms can learn patterns that indicate when turbines will soon require maintenance to catch failures before they happen.
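The historical phase can be sketched as learning an alert threshold from past failures and applying it to live readings. This toy example stands in for a real machine learning model trained in the warehouse; the vibration histories, the mean-of-peaks rule, and the safety margin are all invented for illustration.

```python
from statistics import mean

# Hypothetical sketch: learn a vibration level that preceded past
# failures, then flag turbines approaching it. A real system would
# train a predictive model on far richer historical data.
def learn_alert_level(failure_histories, margin=0.9):
    """Average the peak vibration seen before each past failure,
    then apply a safety margin."""
    peaks = [max(history) for history in failure_histories]
    return margin * mean(peaks)

def needs_maintenance(current_reading, alert_level):
    return current_reading >= alert_level

histories = [[2.1, 3.4, 5.0], [2.0, 4.2, 5.4], [1.8, 3.9, 5.2]]
level = learn_alert_level(histories)
print(round(level, 2))               # learned alert threshold: 4.68
print(needs_maintenance(4.9, level)) # True: schedule maintenance early
```

The point is the feedback loop: thresholds (or full models) learned offline from historical data feed back into the real-time phase, catching failures before they happen.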
In all steps of multi-phase analytics, machine learning can train the system to better predict outcomes. As the model changes, the stream processing solution can update the models at the edge, on-premises or in the cloud when necessary.
Maximizing value from Smart Grid data
Management systems for streaming data allow you to glean insights—when and where you need them—from IoT data throughout the smart grid. Not only can you use event stream processing systems to respond to events in real time, you can also combine streaming data with offline sensor data to build more robust models. Bringing analytics to the data, whether it is in motion or at rest, will be key for utility CIOs who want to turn potential value into realized value.