Non-operational Data: The Untapped Value of Substation Automation

By David Kreiss, Kreiss Johnson Technologies

Utilities have spent millions over the past decade on substation automation, but in many cases they have neither realized the projected return on investment nor taken advantage of the improved system performance these projects could provide. Making more productive use of non-operational data is one way to maximize the return on those investments.


The automation of substations and related infrastructure is geared toward providing a steady stream of operational information in real time that enables the utility to remotely control and fine-tune system performance. What few utilities realize is that this flow of operational data tells only part of the story. Most substation measurement devices also measure and store what is called non-operational data–details of events such as faults, power grid fluctuations and voltage disturbances. While automated devices routinely identify these events when they happen, non-operational data can reveal why they occur and predict future incidents.


The blackout of 2003 serves as a cautionary tale of the dangers utilities face when they do not–or cannot–recognize the subtle early warning signs of system weakness. Non-operational data has the potential to reveal operational and infrastructure flaws during normal operating situations, giving utilities a chance to respond to them before they cascade into catastrophic events.

“Utilities should monitor all distribution substations in a simple way that allows them to identify disturbing trends,” said Jim Burke, executive consultant for Synergetic Design in Cary, N.C. “Historical data should be analyzed to flag potential trouble that would not otherwise be obvious.”

Automating the Substation

Substation automation focuses on improving the utility’s bottom line. Automation technology has enabled utilities to monitor the performance of their distribution infrastructure more closely, restoring power more quickly, squeezing extra years out of aging equipment, and operating that equipment closer to its rated limits. Under the watchful eyes of investors, customers and public utility commissions, power companies have embraced substation automation as a major source of cost reduction and improved operations.

Automation includes a variety of components designed to remotely measure and monitor conditions in the substation and deliver this data back to a central office. The range of devices includes intelligent electronic devices (IEDs), digital fault recorders, substation power monitors, power quality monitors, equipment measurement devices (internal and external), environmental monitors and online services.

Despite the promise of quick return on investment, utilities have been slow in adopting these beneficial technologies. Further, when a piece of substation automation equipment is installed, it often stands alone as a vertical application feeding information to a single department. The generated data is rarely integrated with other information and shared across departmental boundaries. Even worse, the equipment is sometimes forgotten when its primary user within the utility moves on to another department or position.

Benefiting from Non-Operational Data

All of these devices–regardless of which vendor developed them and whether they measure power components or equipment status–have one often-overlooked capability in common. They acquire both operational and non-operational data, according to John McDonald, manager of automation, reliability and asset management at KEMA Inc., an international utility consulting firm headquartered in Fairfax, Va.

“Operational data consists of the instantaneous measurements of volts, currents and breaker status made by the automated devices and transmitted continuously as data points back to the control center,” he said.

Non-operational data, on the other hand, must be retrieved manually and consists of records, or logs, of multiple events such as a series of faults, power fluctuations, disturbances and lightning strikes. Modern IEDs routinely record historical logs of power data and can be programmed to monitor threshold exceptions for volts, amps and flicker. Among the most valuable non-operational data are digitized waveforms that reveal precisely what occurred during a voltage disturbance or fault (see sidebar, “Non-operational Data Sets,” Page 32).
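As an illustration, threshold-exception monitoring of this kind can be sketched in a few lines; the readings, voltage limits and function name below are hypothetical, not any vendor's actual format:

```python
# Hypothetical sketch of IED-style threshold-exception logging: flag voltage
# readings that leave a configured band. Limits and sample data are invented.

def threshold_exceptions(readings, low=114.0, high=126.0):
    """Return (timestamp, value) pairs that fall outside the [low, high] band."""
    return [(t, v) for t, v in readings if v < low or v > high]

# Illustrative log of (timestamp, volts) samples
log = [(0, 120.1), (1, 127.3), (2, 119.8), (3, 110.2)]
events = threshold_exceptions(log)
```

In a real device the band, deadband and logging interval would be configuration settings rather than function defaults.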

The logged data also shows event history, such as repeated faults, which a human might otherwise overlook. In a best-case scenario, assessing a series of faults could point to an underlying problem before it gets worse; at the very least, such information can identify the need to service or replace a worn-out or troubled breaker ahead of schedule, thus avoiding a serious malfunction.
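A minimal sketch of how logged fault history might be mined for repeated faults, assuming each log entry simply names the breaker involved (the entries and the review threshold are illustrative):

```python
# Hypothetical recurrence check: count faults per breaker and flag any breaker
# whose fault count reaches a review threshold. Breaker IDs are invented.
from collections import Counter

def recurring_faults(fault_log, min_count=3):
    """Return breakers whose fault count in the log meets the threshold."""
    counts = Counter(entry["breaker"] for entry in fault_log)
    return {b: n for b, n in counts.items() if n >= min_count}

log = [{"breaker": "CB-12"}, {"breaker": "CB-12"},
       {"breaker": "CB-07"}, {"breaker": "CB-12"}]
flagged = recurring_faults(log)
```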

The real value in non-operational data comes from integrating data sets and sharing the information across multiple utility departments. For example, transformer technicians who once looked only at dissolved gas data can also examine power waveforms to see how power fluctuations affect transformer health, which helps them plan preventive maintenance.

Integrating data sets can also help determine what caused a fault or a transformer trip. If the trip coincided with a lightning strike-induced fault and the dissolved gas reading indicates normal conditions, a control center technician can conclude the transformer is safe to operate. The transformer can be restored remotely without the need for a time-consuming onsite visit by a field crew, putting customers back on-line in minutes instead of hours.
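That decision logic can be sketched as a single hedged rule; the coincidence window, gas limit and function name are assumptions for illustration, not an actual utility's restore criteria:

```python
# Hypothetical restore rule: permit remote restoration only when the trip
# coincided with a lightning strike AND dissolved-gas levels look normal.
# The one-second window and 100 ppm limit are invented for illustration.

def safe_to_restore(trip_time, lightning_strikes, gas_ppm,
                    gas_limit=100.0, window_s=1.0):
    """Combine lightning-strike and dissolved-gas data into one decision."""
    coincident = any(abs(trip_time - t) <= window_s for t in lightning_strikes)
    return coincident and gas_ppm < gas_limit
```

A production system would of course weigh many more inputs, but the value of the fused data sets is already visible at this scale.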

“The bottom line for non-operational data is that it gives utility personnel the information they need to take action that keeps equipment healthy and prevents a problem from getting worse,” said McDonald.

Overcoming Barriers to Data Use

With so much to gain from this non-operational data, why aren’t utilities taking advantage of it? Put simply, there has been no unified system architecture to overcome the four primary barriers of data access, integration, storage and analysis.

The first major impediment is access. Unlike operational data, which flows from IEDs over standard protocols such as Modbus or DNP3, non-operational data must be retrieved manually using vendor-proprietary protocols. Although vendors provide software capable of collecting and analyzing data from their own systems, this leaves the utility with silos of disparate non-operational data sets that cannot be integrated.

Efforts have been made to develop standard protocols for non-operational data that would allow easy integration. UCA2 MMS is one example, but it is not in widespread use and may not address every need for retrieving non-operational data.

Until such standards mature, the only practical solution is to obtain the proprietary and standard protocols from the major automation vendors, build a protocol library, and incorporate drivers for each into a single software package designed for non-operational data retrieval, integration and analysis.
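One possible shape for such a protocol library is a driver registry: one retrieval function per protocol behind a common interface. The protocol names and payloads here are purely illustrative, not real vendor APIs:

```python
# Hypothetical driver registry for a non-operational data retrieval package.
# Each protocol registers a fetch function; callers use one uniform entry point.

DRIVERS = {}

def register(protocol):
    """Decorator that files a fetch function under its protocol name."""
    def wrap(fn):
        DRIVERS[protocol] = fn
        return fn
    return wrap

@register("vendor_a")           # invented vendor protocol name
def fetch_vendor_a(device):
    return {"device": device, "protocol": "vendor_a", "records": []}

@register("dnp3_files")         # invented file-transfer driver name
def fetch_dnp3_files(device):
    return {"device": device, "protocol": "dnp3_files", "records": []}

def retrieve(protocol, device):
    """Single interface the rest of the package calls, regardless of vendor."""
    return DRIVERS[protocol](device)
```

Adding support for a new vendor then means writing one driver, not a new application.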

A key to integrating various data sources successfully is to time synchronize the data collection devices in the substation. This ensures that events occurring simultaneously can be correlated. The most inexpensive way to accomplish this is by installing a GPS receiver in the substation and synchronizing all automated equipment to it.
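Once devices share a GPS time base, correlating events across them reduces to matching timestamps within a tolerance. A minimal sketch, with hypothetical event data from two devices:

```python
# Hypothetical correlation of events from two GPS-synchronized devices.
# Events are (timestamp_seconds, label) tuples; the tolerance is invented.

def correlate(events_a, events_b, tolerance_s=0.05):
    """Pair events whose timestamps agree within the tolerance."""
    pairs = []
    for ta, la in events_a:
        for tb, lb in events_b:
            if abs(ta - tb) <= tolerance_s:
                pairs.append((la, lb))
    return pairs

dfr_events = [(10.00, "fault"), (42.00, "sag")]      # digital fault recorder
pq_events = [(10.02, "dip"), (300.00, "swell")]      # power quality monitor
matches = correlate(dfr_events, pq_events)
```

Without the shared clock, the 20-millisecond agreement that links these two records would be lost in device clock drift.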

To integrate data prior to analysis, the architecture must handle and process data sets of many different types and formats. Data display and analysis tools must also be data independent.

Data storage is another obstacle precluding practical use of non-operational data. Large data sets have to be archived and retrieved. A dedicated data warehouse must be established, and it must be capable of handling the wide variety of data types represented by different non-operational measurements, such as temperatures, lightning strikes and waveforms. In addition, the database must store and maintain relationships among measurements and related properties. For instance, a voltage reading cannot be analyzed automatically if the software doesn’t know the location and duration of the recording.
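One way to keep a measurement tied to the metadata analysis needs is to store them together as a single record; this sketch assumes a simple in-memory structure rather than any particular data warehouse schema, and all field names are illustrative:

```python
# Hypothetical archived record: the measurement values travel with the
# location and duration that automated analysis requires.
from dataclasses import dataclass

@dataclass
class Measurement:
    kind: str          # e.g. "voltage_waveform", "temperature"
    substation: str    # where it was recorded
    start: float       # recording start, epoch seconds
    duration_s: float  # recording length
    values: list       # the samples themselves

rec = Measurement("voltage_waveform", "Sub-14", 1000.0, 0.2, [1.0, -1.0])
```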

Telecommunication is another major stumbling block. Roughly 75 percent of utilities still use standard telephone connections to transmit data from their substations to the control center. Because non-operational data is typically transferred as large archived files, it is too big to move quickly over a telephone connection.

“The data sets coming out of substations are huge, and phone lines are unreliable,” said John Scholz, vice president of sales and marketing at RFL Electronics, a manufacturer of protection communications systems based in Boonton, N.J.

Fortunately, utilities have many options to enhance their pipelines from the substation, such as fiber, frame relay, microwave and other technologies.

Scholz adds that bandwidth issues can wreak havoc inside the substation as well. Utilities must find practical and inexpensive methods of gathering data from multiple automated mechanisms within the substation. “A lot of our clients are installing Ethernet hubs (as part of their substation automation projects),” he said.

Integrating, Analyzing and Using Data

The goal of using non-operational data should be to combine and process the intelligence found in disparate data, such as historical power parameter logs, waveforms, dissolved gases and lightning strikes, to generate information that can help the utility save money. The utility industry has historically lacked a means of analyzing these data sets intelligently. The challenge lies in the size and complexity of non-operational data.

Therefore, a viable analysis solution requires development of knowledge-based systems with embedded expert reasoning to continuously review and correlate this data. This process creates an intelligent filter to recognize and extract data sets of particular interest and apply appropriate analyses to them. The objective is to streamline the process of delivering useful information to the engineer, so he or she does not have to collect and review it manually. Neural nets, fuzzy logic, expert- and rule-based analysis have been used to perform this raw data processing.
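Such an intelligent filter can be sketched as a small rule table: each rule inspects a data set and, on a match, names the analysis to apply. The rules, field names and analysis labels below are invented for illustration:

```python
# Hypothetical rule-based filter: predicates select data sets of interest
# and route them to a named analysis. All thresholds and labels are invented.

RULES = [
    (lambda d: d["type"] == "waveform" and d["max_v"] > 1.1,
     "voltage-disturbance analysis"),
    (lambda d: d["type"] == "gas_log" and d["ppm"] > 100,
     "transformer-health review"),
]

def filter_and_route(data_sets):
    """Return (data set id, analysis) pairs for every matching rule."""
    routed = []
    for d in data_sets:
        for match, analysis in RULES:
            if match(d):
                routed.append((d["id"], analysis))
    return routed

sets = [{"id": 1, "type": "waveform", "max_v": 1.3},
        {"id": 2, "type": "waveform", "max_v": 1.0},
        {"id": 3, "type": "gas_log", "ppm": 250}]
routed = filter_and_route(sets)
```

The engineer then reviews only the routed results instead of every raw record.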

The next step is to deliver this information automatically. Routine reports should be delivered to control center personnel via an easy-to-understand user interface or integrated into the interface of an existing monitoring application in the control room. Abnormal events identified by the system should trigger alarms and create immediate messages for delivery by e-mail or pagers.


In the spirit of enterprise-wide access, the information generated from non-operational data analysis must be available to personnel in departments throughout the utility–such as operations and maintenance, planning and asset management, and marketing and customer support–either through direct login or online. In addition, analysis results should also be transmitted directly to other automated applications, including outage and work order management, so these systems can take the first steps to rectify a problem.

Testing the Solution

Cooperation among utilities and various technology vendors has recently paid off with the development of applications that allow utilities to leverage the full power of non-operational data. Working under a common platform, these applications are capable of locating faults, predicting maintenance needs and clearing faults automatically. Based on utility feedback, the power industry is poised to make non-operational data use a core feature in daily operations.

A follow-up article in an upcoming issue of Utility Automation will examine how engineers at utilities around the country have applied non-operational data analysis to address specific issues that enabled them to take corrective actions. These actions paid off for the utility in terms of improved systems performance, reduced maintenance costs and other bottom line benefits.

David Kreiss is president of Kreiss Johnson Technologies, a developer of substation and power analysis software for the utility industry based in Del Mar, Calif. The company specializes in web-based open architecture power analysis systems that employ artificial intelligence components for data analysis. David has spent most of his career designing hardware and software products, including relay test, substation and power quality monitoring systems, as a vice president with Dranetz Technologies for 17 years and with Kreiss Johnson Technologies since 1994. He is a member of many industry standards committees, including IEEE 1159, IEEE 1205 and Standards Coordinating Committee 22. He is co-author of the Dranetz Power Quality Handbook and a regular author, speaker and panelist at industry conferences. David can be contacted at (858) 535-2088 or
