By Alan McCord, Space-Time Insight
Data volumes are increasing at an exponential pace in practically all areas of science, business and technology. The multiple sources and levels of information sophistication are unprecedented. For the power industry, the age of complexity has officially arrived.
Data, Data Everywhere
The intricacy and volume of data which electric utilities need to understand is challenging even for the most technology-savvy operators. With the deployment of smart meters and advanced metering infrastructure (AMI), providers have to process massive volumes of meter events and interval reads, as well as access and backhaul communication feeds, tariff and demand management information, and service order and asset management systems. New data domains must be combined with existing feeds from energy distribution networks, SCADA systems, weather forecasting and other varied sources. In the face of this data explosion, power providers are concurrently dealing with mandates for renewable energy sources, deregulation, increasing load, demand response, plug-in electric vehicles and charging infrastructure, and aging infrastructure efficiency. (These items also result in the need to understand even more data.) Information management has a big role to play in addressing the complexity inherent in next-generation utility ecosystems.
Utilities can store, compress and move data around faster than ever. These speeds and feeds alone, however, will not address the fundamental issue of complexity. The most critical challenge is not in getting the data; it’s doing something with the data. In a world of lightning-speed decisions, putting information rapidly into a now-what context is key. The software systems, architectures and analytic methods used to understand data have not evolved, however. Instead, they remain stuck in the 30-year-old world of data warehousing and traditional business intelligence.
In one example, a user may want to look at the revenue generated by his organization’s smart meters for a specific ZIP code last month. This analysis could be performed using traditional business intelligence, as the parameters are known in advance. What happens, however, if a power outage strikes several areas at once? What if the user needs to immediately determine where the outage is, what areas are affected, what the possible causes of the outage could be and what revenue is at risk? The user needs the flexibility to define the search criteria on the fly because he may not know what he’s looking for, and the search for correlations needs to include both structured and unstructured data. As data domains are increasingly interrelated and dynamically connected, the complexity demands a new data management paradigm—one that’s based on the situation at hand and not on yesterday’s data.
In response to data growth and complexity, the information management landscape is changing. One breakthrough lies in having a more logical (rather than physical) approach to data warehousing. Essentially, this can include a federation of different types of data stores, such as a column-store database management system (DBMS), MapReduce-based systems like Hadoop, and in-memory databases. With these types of solutions, data can be loaded temporarily into one part of the system for analysis, then moved into another for a different analysis or for storage.
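The staging pattern described above can be sketched in miniature. The following is a hypothetical illustration only, assuming an in-memory database as the fast ad-hoc tier and a second store as the longer-term tier; the table names, schema and meter readings are invented for this sketch, and both tiers are stood in for here by SQLite.

```python
import sqlite3

# Fast tier: stage raw interval reads in an in-memory database
# for ad-hoc analysis. (Schema and values are invented.)
raw = sqlite3.connect(":memory:")
raw.execute("CREATE TABLE reads (meter_id TEXT, kwh REAL, zip TEXT)")
raw.executemany("INSERT INTO reads VALUES (?, ?, ?)", [
    ("m1", 1.2, "94105"), ("m2", 0.8, "94105"), ("m3", 2.0, "94110"),
])

# Ad-hoc analysis in the fast tier: total usage per ZIP code
per_zip = dict(raw.execute(
    "SELECT zip, SUM(kwh) FROM reads GROUP BY zip"))

# Move the derived result into a second store (the reporting tier),
# leaving the raw data free to be discarded or archived.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE zip_usage (zip TEXT, total_kwh REAL)")
warehouse.executemany("INSERT INTO zip_usage VALUES (?, ?)", per_zip.items())
```

The point of the sketch is the movement of data between tiers, not any particular engine: the raw reads live only as long as the analysis needs them, while the derived aggregate lands in a store suited to reporting.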
Another leap forward is an approach to analysis known as situational intelligence.
Situational intelligence proposes the use of multidimensional spatial maps to combine visualizations of structured and unstructured data from different underlying domains so that users can see and understand a situation at a glance.
Humans like to see where things are happening, how they are happening, and in what time frame they are happening—rather than search through static text data or numbers on spreadsheets to connect the dots. In the context of the utility industry, this means being able to bring in multiple streams of data from meters, sensors, weather forecasts, ERP systems, traditional and next-generation databases, complex event processing and more—and getting a single, intuitive view of this information in real time.
Some software, as an example, enables situational intelligence by integrating, normalizing and correlating data from multiple sources, such as spreadsheets, databases, weather services, social media, SCADA devices and more. This information is then made available through a combination of geospatial displays, using 3-D visualizations, color-coding, alerts and other techniques that help users quickly and easily understand their data.
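The normalize-then-color-code step can be sketched as follows. This is a minimal illustration, not any vendor's actual pipeline: the field names, the kW-to-MW conversion, the 10 MW alert threshold and the color choices are all invented assumptions.

```python
# Normalize two invented feeds into one common record shape.

def from_scada(raw):
    # Hypothetical SCADA feed: reports load in MW under different keys
    return {"asset": raw["tag"], "load_mw": raw["mw"]}

def from_spreadsheet(row):
    # Hypothetical spreadsheet rows carry kW, so convert to MW
    return {"asset": row["name"], "load_mw": row["kw"] / 1000.0}

def color(record, limit_mw=10.0):
    # Simple invented alerting rule: red above the limit, green otherwise
    return "red" if record["load_mw"] > limit_mw else "green"

records = [from_scada({"tag": "xfmr_7", "mw": 12.5}),
           from_spreadsheet({"name": "xfmr_9", "kw": 4200.0})]
colors = [color(r) for r in records]
```

Once every source lands in the same shape and units, a single display rule can color-code records from any of them, which is what lets one map carry data from spreadsheets, SCADA and the rest side by side.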
In addition to integrating and visualizing disparate real-time data streams, situational intelligence is also highly flexible in how data is presented, letting users easily turn information on or off as it becomes relevant or irrelevant. For example, a user investigating the cause of a power outage might first overlay a map of her company’s electric grid with weather information to see if a storm is the culprit. Alternatively, she can overlay the map with the location of maintenance crews over the same period. This is an intuitive approach similar to how the human brain actually works. Humans don’t often process 10 pieces of information simultaneously; an individual typically looks at a few at a time.
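The overlay behavior described above amounts to independently toggleable layers composed into one view. The sketch below is a hypothetical illustration; the layer names, features and dictionary representation are invented for it.

```python
# Invented layers for a map-based display: grid assets, weather,
# and crew locations, each switchable on or off independently.
layers = {
    "grid":    {"visible": True,  "features": ["feeder_12", "substation_A"]},
    "weather": {"visible": False, "features": ["storm_cell_7"]},
    "crews":   {"visible": False, "features": ["crew_3"]},
}

def toggle(name):
    # Flip a single layer's visibility without touching the others
    layers[name]["visible"] = not layers[name]["visible"]

def visible_features():
    # Compose only the layers the user has switched on
    return [f for layer in layers.values() if layer["visible"]
            for f in layer["features"]]

toggle("weather")  # the user overlays the storm data on the grid map
```

Because each layer carries its own visibility flag, swapping the weather overlay for the crew overlay is two toggles; the underlying data never has to be reloaded, only recomposed.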
Finally, being able to look at information across time is another important aspect of situational intelligence. If, for example, an outage occurs and operators want to find out why, they could rewind their data over the past day or hour to see what happened.
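One minimal way to sketch that rewind, assuming events are kept as timestamped records: select everything in the hour (or day) leading up to the moment under investigation. The event names and timestamps below are invented for illustration.

```python
from datetime import datetime, timedelta

# Hypothetical timestamped events from the grid, oldest first
events = [
    (datetime(2012, 2, 1, 13, 5),  "breaker_trip feeder_12"),
    (datetime(2012, 2, 1, 13, 40), "voltage_sag zone_4"),
    (datetime(2012, 2, 1, 14, 10), "outage reported zip_94110"),
]

def rewind(events, at, window=timedelta(hours=1)):
    """Return the events in the window leading up to `at`, oldest first."""
    return [e for t, e in events if at - window <= t <= at]

# Replay the hour before the reported outage
prior_hour = rewind(events, datetime(2012, 2, 1, 14, 10))
```

Widening the window to `timedelta(days=1)` would pull in the earlier breaker trip as well, which is the "past day or hour" choice the operator makes when deciding how far back to look.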
Though still in its early stages, situational intelligence has some exciting new directions on the horizon. As artificial intelligence (AI) and machine learning technologies enter the mainstream, opportunities arise to combine situational intelligence with automated machine learning algorithms that can look for significant patterns in the data. This augmented situational intelligence offers great potential to take yet another leap forward in getting data complexity under control. Together with emerging technologies for databases, data storage and processing, complex event processing and machine learning, situational intelligence will help leaders in the power industry meet many of the data challenges of the present and the future with confidence.
Alan McCord is Space-Time Insight’s director of product management. He has a Ph.D. in computational physics and has worked in power system modeling, chaos theory, analytics and machine learning systems.
Editor’s note: Space-Time Insight’s situational modeling system helped propel the California ISO to a Projects of the Year award in 2012 to be announced at DistribuTECH 2012’s keynote in San Antonio. More information will be available online at http://distributech.com and in the April issue of POWERGRID International.