Architecture, Analytics Key to Smart Grid Performance

By David L Haak, Accenture Smart Grid Services

Data are the fundamental currency of the smart grid, and they will hit utility systems in a deluge. For context, think of grid data in terms of famous novels. If, every second, a legacy grid produces the data equivalent of one copy of “A Tale of Two Cities” by Charles Dickens, then a smart grid can produce the equivalent of some 846 copies of Leo Tolstoy’s “War and Peace,” a novel more than four times the length of Dickens’ tome, according to playful calculations by Accenture staff.

Meanwhile, utilities have high hopes for smart grid performance. They’re hoping a smarter grid will help them tackle a host of problems, including an aging work force, carbon concerns, reliability issues, demand growth, capacity constraints, competition and more.

To tackle these issues, the grid—and the utility’s business as a whole—must become more observable, controllable, automated and integrated. The smart grid also must facilitate improved asset and work management, as well as integration of renewable energy sources, distributed generation and storage as components of the supply mix. And it will, provided utilities plan well.

Dissecting the Data Deluge

The smart grid utilizes sophisticated sensing, embedded processing, digital communications and software designed to generate, manage and respond to network-derived information. But the assets that make the grid smart—namely, data—also make it difficult to manage, which means utilities must implement data management solutions to turn a potentially bewildering flood of data into useful operational information. To do this, utilities and their stakeholders need a holistic view of the data components and characteristics.

This view starts with a clear understanding of how smart grid data are generated, what those data consist of and what benefits they deliver. Optimally, utilities will design smart grid functions with business objectives in mind, rather than designing a grid first and seeking potential benefits after the fact.

In general, data management design should extract clean, consistent and well-understood information that drives targeted benefits for the utility business. In addition, utilities must ensure that grid data are governed, readily measurable and observable. Because utilities historically have been unable to observe power distribution grids, developing a true smart grid requires the creation of an explicit grid-observation strategy. Parts of this strategy development already exist in most utilities, but the design will need to close the loop to optimize grid performance on a continual basis. (See Figure 1 on Page 44.)

Creating such a strategy requires a solid understanding of master data, as well as the nature and flow of smart grid data through the organization. As Figure 2 on Page 48 illustrates, data are initially generated by network devices such as meters and sensors, then transported for storage and processing by various applications (the persistence phase). Next, they are transformed into actionable, operations-oriented information for network and technical analysis, which requires new visualization capabilities. Finally, the resulting analytics suitable for non-real-time operational consumption are integrated at the enterprise level to drive strategic decision making.
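The four phases above can be sketched as a minimal pipeline. All function and field names here are illustrative only, not part of any real utility data platform:

```python
# Sketch of the four smart grid data phases:
# generation -> persistence -> transformation -> enterprise integration.

def generate():
    """Network devices (meters, sensors) emit raw readings."""
    yield {"device": "meter-7", "raw_kwh": 12.4}
    yield {"device": "sensor-3", "raw_voltage": 239.1}

def persist(readings):
    """Transport and store readings; here, simply collect them."""
    return list(readings)

def transform(stored):
    """Turn raw readings into operations-oriented information."""
    return [
        {"device": r["device"],
         "signals": {k: v for k, v in r.items() if k != "device"}}
        for r in stored
    ]

def integrate(information):
    """Aggregate analytics at the enterprise level for decision making."""
    return {"device_count": len(information)}

summary = integrate(transform(persist(generate())))
```

In a real deployment each phase would be a separate system (head-end collection, historian, analytics engine, enterprise bus), but the shape of the flow is the same.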

What composes smart grid data? There are five classes of smart grid data, each with its own characteristics.

Operational data. Representing the electrical behavior of the grid, these include voltage and current phasors, real and reactive power flows, demand response capacity, distributed energy capacity and power flows, plus forecasts for any of these data items.

Nonoperational data. Reflecting the condition and behavior of assets, these include master data, data on power quality and reliability, asset stressors, utilization and telemetry from instruments not directly associated with grid power delivery.

Meter data. These data show total power usage and demand values such as average, peak and time of day. They don’t include items such as voltages, power flows, power factor or power quality data, which are sourced at meters but fall into other data classes.

Event message data. These comprise asynchronous event messages from smart grid devices, including meter voltage loss/restoration messages, fault detection event messages and event outputs from various technical analytics.

Metadata. These overarching data are necessary to organize and interpret all the other data classes. Metadata include information on grid connectivity, network addresses, point lists, calibration constants, normalizing factors, element naming, and network parameters and protocols.
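As a rough sketch, the five data classes could be modeled as an enumeration with a routing table from measurement tags to classes. The tag names below are hypothetical; real tag schemas vary by vendor and head-end system:

```python
from enum import Enum

class GridDataClass(Enum):
    """The five smart grid data classes described above."""
    OPERATIONAL = "operational"        # voltages, currents, power flows
    NONOPERATIONAL = "nonoperational"  # asset condition, power quality
    METER = "meter"                    # usage and demand values
    EVENT_MESSAGE = "event_message"    # asynchronous device events
    METADATA = "metadata"              # connectivity, calibration, naming

# Hypothetical mapping from raw measurement tags to data classes.
TAG_CLASS_MAP = {
    "voltage_phasor": GridDataClass.OPERATIONAL,
    "reactive_power": GridDataClass.OPERATIONAL,
    "asset_temperature": GridDataClass.NONOPERATIONAL,
    "kwh_total": GridDataClass.METER,
    "voltage_loss_event": GridDataClass.EVENT_MESSAGE,
    "calibration_constant": GridDataClass.METADATA,
}

def classify(tag: str) -> GridDataClass:
    """Route a raw measurement tag to its data class, or raise if unknown."""
    try:
        return TAG_CLASS_MAP[tag]
    except KeyError:
        raise ValueError(f"unclassified tag: {tag}")
```

The point of such a table is governance: every incoming tag is forced into exactly one class, so downstream storage and analytics can be designed per class rather than per device.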

Utilities face significant challenges across all five classes in applying smart grid data to their processes. Because raw data from smart grid devices and systems aren’t directly usable, or even comprehensible, they must be transformed into useful information before they can be acted upon. Further complications exist: some information must be consumed directly by automated systems, while other information must be presented to people in forms they can easily understand. Data also must be used on many different time scales, with cycle times ranging from milliseconds to months. And information must be managed in a way that matches the local industry structure and regulatory requirements.

Given these factors, most utilities face four major data management challenges in developing smart grids.

Four Major Data Management Challenges

The first challenge is matching the data acquisition infrastructure to the required outcomes. This includes issues such as the number, kind and placement of data measurement devices, the use of communication networks and data collection engines, and the chosen data persistence architectures.

The second challenge is in learning to apply new tools, standards and architectures to manage grid data at scale. This involves pursuing the development and adoption of new open standards for interoperability, creating and managing distributed data architectures, and applying new analytics tools to make sense of the data flood.

The third challenge is transforming processes throughout the business to take advantage of smart grid technology. Accenture research suggests that smarter grids will impact up to 70 percent of retail/customer and transmission and distribution processes.

The fourth challenge is managing master data to enable the benefits of smart grid capabilities. As utilities improve the customer experience through channel management, outage notifications and energy advice, effective master data management becomes the central nervous system that supports success and growth.

Considering the deluge of smart grid data ahead, what approach should utilities take? First, utilities should ensure that the five data classes previously highlighted are reflected in the data integration architecture. Second, they must use the right analytics to turn the mass of data into usable information and business intelligence.

If designed properly, the data architecture will give utilities the capabilities they need to deal with future change and evolution in their smart grids and business environment. To do this, the architecture must include not just data stores but also elements such as master data management, services and integration buses that share data and information effectively.

Smart grid data architecture must provide a sound platform on which to apply relevant and sophisticated data analytics. Grid data are too voluminous for people to comprehend directly, and a large amount of data will be used by systems without human intervention. Technical analytics are critical software tools and processes that transform raw data into useful, comprehensible information for operations decision-making.

To date, Accenture has catalogued more than 200 smart grid analytics and several classes of technical analytics such as:

  • Electrical and device states (including traditional, renewables and distributed energy resources),
  • Power quality,
  • Reliability and operational effectiveness (system performance),
  • Asset health and stress (for asset management),
  • Asset utilization (e.g., transformer loading), and
  • Customer behavior (especially in demand response).

When incorporating analytics into the data management design, two major considerations are data time scales (latency) and volume scalability. Because of varying application requirements, some analytics must be available at high speed and with low latency (milliseconds), primarily at the level of grid sensors and devices. Others fall into the seconds-to-minutes range, including those for operational processes such as operational efficiency verification, real-time utilization optimization (load balancing) and outage management, and some may play out over hours, days, weeks or months.

To incorporate varying levels of latency accurately into the data management architecture, utilities should construct a data latency hierarchy. This enables data to be treated and analyzed differently on the basis of their latency and applicability, ranging from the lowest-latency data (where real-time technical analytics feed into protection and control systems) to the highest-latency data (where operational analytics feed into business intelligence management dashboards and reporting).
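A latency hierarchy of this kind can be sketched as a small routing table. The three tiers and the latency budgets below are illustrative assumptions based on the ranges mentioned above, not an industry standard:

```python
from enum import IntEnum

class LatencyTier(IntEnum):
    """Illustrative data latency hierarchy, fastest to slowest."""
    PROTECTION = 0   # milliseconds: protection and control systems
    OPERATIONAL = 1  # seconds to minutes: outage mgmt, load balancing
    BUSINESS = 2     # hours to months: BI dashboards and reporting

# Hypothetical maximum-latency budgets per tier, in seconds.
LATENCY_BUDGET_S = {
    LatencyTier.PROTECTION: 0.01,
    LatencyTier.OPERATIONAL: 60.0,
    LatencyTier.BUSINESS: 86_400.0,
}

def route(required_latency_s: float) -> LatencyTier:
    """Pick the fastest tier whose budget covers the requirement;
    anything slower than all budgets lands in the business tier."""
    for tier in LatencyTier:
        if required_latency_s <= LATENCY_BUDGET_S[tier]:
            return tier
    return LatencyTier.BUSINESS
```

For example, a 5 ms protection signal routes to the protection tier, while a 30-second outage-management update routes to the operational tier.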

There are a number of techniques utilities can use to drive benefits from the smart grid. One is complex event processing, a computing platform that continually runs static queries against multiple dynamic data streams. This enables a utility to manage the bursts of asynchronous event messages generated by smart grid devices and systems when an event (usually a problem) arises on the grid. Complex event processing is not widely used in the utility industry, and it differs from the standard transaction management approach used universally today.
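The core idea, a static query running continually over a dynamic stream, can be illustrated with a toy query that fires when a burst of matching events arrives within a time window. The event shapes and thresholds here are hypothetical, a stand-in for a real complex event processing engine:

```python
from collections import deque

class BurstQuery:
    """A static query evaluated continually over a dynamic event stream:
    fires when `threshold` matching events arrive within `window_s` seconds."""

    def __init__(self, event_type: str, threshold: int, window_s: float):
        self.event_type = event_type
        self.threshold = threshold
        self.window_s = window_s
        self.times = deque()  # timestamps of matching events in the window

    def feed(self, event: dict) -> bool:
        """Consume one event; return True if it completes a burst."""
        if event["type"] != self.event_type:
            return False
        t = event["time_s"]
        self.times.append(t)
        # Drop matching events that have aged out of the window.
        while self.times and t - self.times[0] > self.window_s:
            self.times.popleft()
        return len(self.times) >= self.threshold

# Three voltage-loss events within 5 seconds trigger the query.
query = BurstQuery("voltage_loss", threshold=3, window_s=5.0)
stream = [
    {"type": "voltage_loss", "time_s": 0.0},
    {"type": "meter_read", "time_s": 1.0},
    {"type": "voltage_loss", "time_s": 2.0},
    {"type": "voltage_loss", "time_s": 4.0},  # third loss within the window
]
alerts = [query.feed(e) for e in stream]
```

Unlike a transactional system, which stores each message and queries later, the query here is fixed and the data flow past it, which is what lets such engines keep up with event bursts.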

Another valuable technique for consideration is visualization, a direct extension of analytics for the human eye and brain. By replacing hard-to-understand columns of streaming numbers with well-considered graphic depictions integrated from multiple sources, visualization platforms can provide instant comprehension and avoid “swivel-chair integration,” the process in which a human user re-keys information from one computer system to another.

David L Haak is Accenture Smart Grid Services’ lead for North America. Accenture is a global management consulting, technology services and outsourcing company.

Achieving Excellence: Seven Top Tips

Based on its experience with smart grid data management, Accenture developed the following list of best practices for developing and implementing smart grid solutions.

1. Recognize smart grid data classes and their characteristics to develop comprehensive smart grid data management and governance capabilities.
2. Consider how data sources can support multiple outcomes via analytics and visualization to realize the maximum value from the sensing infrastructure.
3. Consider distributed data, event processing and analytics architectures to help resolve latency, scale and robustness challenges.
4. Consider the whole smart grid challenge when planning data management, analytics and visualization capabilities—not just advanced metering infrastructure—to avoid stranded investments or capability impediment.
5. Design data architectures that leverage quality master data to match data classes and analytics/application characteristics. A giant data warehouse is rarely maintainable.
6. Look to new tools such as complex event processing to handle challenges around processing new data classes. Managing the new smart grid data deluge via historical transaction processing approaches is likely not scalable.
7. Develop business process transformation plans at the same time as—and in alignment with—smart grid designs.


Find more information from Accenture online by going to the website and typing “Accenture” into the search engine. You’ll find:

  • “Building Brains Behind the Smart Grid” by Hormoz Kazemzadeh,
  • “Are GIS and EAM Systems Smart Grid-ready?” by Jeff Hanna and Paul Yarka,
  • Details on the company’s recent corporate changes, joint ventures and contracts, including one with Pepco Holdings,
  • And more.

