By Steven M. Brown, Senior Associate Editor
The sequence of events appears to have begun at approximately noon Eastern Daylight Time on Aug. 14, 2003, with generating unit trips in Ohio and Michigan. What would ultimately become the blackout of 2003 soon escalated into widespread transmission line disconnections across the upper Midwest, Northeast and parts of Canada. It culminated with thousands of New Yorkers spilling into the streets for a long, dark walk home from work while the rest of the nation watched on cable news networks and gave thanks for the wonders of air conditioning and incandescent light bulbs.
That four-hour sequence of events, beginning shortly after noon and ending about 4:13 p.m., could shape regulatory, financial and technological developments in electric power delivery for years to come.
For a week after the blackout, the power grid received the kind of around-the-clock attention generally reserved for war, celebrity murder trials and political scandal. Now that the dust has settled and mass media attention has turned back toward subjects that don’t require a working knowledge of physics to understand, the real power industry experts–and politicians–have turned their attention toward investigating the events of Aug. 14 and preventing their recurrence.
Michehl Gent, president of the North American Electric Reliability Council (NERC), has already said the final report from the official blackout investigation may not be complete for a year. But while the exact root causes, and the circumstances that allowed the blackout to spread across such a wide area, were still being investigated at the time of this writing, the usual suspects had already taken their places in the lineup of likely contributing factors. Meager investment in the transmission system, overly congested power lines, poor coordination between individual utility companies and grid operators, and faulty technology, or a lack of technology, all stood to absorb some portion of the blame.
A Statistical Inevitability
A blackout like the one that occurred Aug. 14 is never the result of a single, isolated event. While it may be possible to trace a blackout’s genesis to a single point–a transmission line sagging into a tree, for instance–experts like Damir Novosel, senior vice president of KEMA’s T&D consulting business, note that cascading outages are the result of multiple low-probability events occurring in conjunction with one another. It’s not just a matter of a line sagging into a tree; it is, for example, a matter of a line sagging into a tree during a period of heavy grid congestion, coupled with a key generating unit being off-line for routine maintenance, compounded by the malfunctioning of a protective relay or two.
“It sounds like bad luck, but it’s purely statistical,” Novosel said. “It (a cascading outage) may not happen tomorrow. It may not happen next month or next year, but it will happen. As a matter of statistics, these low-probability events will eventually happen in parallel.”
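Novosel’s point can be illustrated with a back-of-the-envelope calculation. The per-day probabilities below are invented for illustration, not drawn from any utility’s data; the arithmetic simply shows how independent low-probability events become close to a certainty over a long enough horizon.

```python
# Illustrative only: hypothetical per-day probabilities for three independent
# contingencies (a critical line faulting into a tree, a key generating unit
# off-line for maintenance, a protective relay misoperating).
p_fault = 0.02    # assumed chance a critical line faults on a given day
p_outage = 0.10   # assumed chance a key generator is off-line that day
p_relay = 0.01    # assumed chance a protective relay misoperates

# Probability all three coincide on any single day (independence assumed):
p_coincide = p_fault * p_outage * p_relay    # two in 100,000

# Probability of at least one such coincidence over a 30-year horizon:
days = 30 * 365
p_eventually = 1 - (1 - p_coincide) ** days

print(f"per-day coincidence probability: {p_coincide:.1e}")
print(f"chance of at least one coincidence in 30 years: {p_eventually:.0%}")
```

Even though these hypothetical events coincide on any given day only about two times in 100,000, over 30 years the chance of at least one coincidence approaches 20 percent–the sense in which Novosel calls a cascading outage “purely statistical.”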
Novosel noted that the likelihood of low-probability events escalating into a cascading outage increases when the grid is already under stress due to certain preconditions, such as congestion, aging infrastructure or lack of investment in the grid.
Not coincidentally, it is widely recognized that all of those preconditions currently exist in the U.S. electric power grid–and not just in the region affected by the ’03 blackout.
A recent analysis of historical transmission and distribution (T&D) capital expenditures by U.S. electric utilities found that current T&D expenditures are at early 1960s levels. The analysis, performed by The C Three Group, also showed that current T&D spending is $10 billion less than it was in 1970, and transmission-specific spending is 65 percent lower than in 1970 (Figure 1, Page 12). At the same time, end-use electricity consumption has almost tripled since 1970.
The analysis also found that much of the existing transmission infrastructure in the United States is well over 30 years old.
Despite the historic lack of investment, and despite growing demands on the grid, a recent report from the Electric Power Research Institute (EPRI) noted that, according to utility company capital expenditure plans, “the underfunding trend is not being reversed.” The EPRI report, titled “Electricity Sector Framework for the Future,” shows that projected transmission system expansion through 2007 is roughly one-quarter of the projected growth in demand over that same time period (Figure 2, Page 12).
The discrepancy between demand growth and grid expansion makes it likely that grid congestion will continue to be a problem across the United States. Excessive congestion is a common precondition for blackouts, according to ICF Consulting, a management, technology and policy consulting firm with expertise in the energy industry.
Likening the congested grid to a busy highway, Philip Mihlmester, senior vice president and managing director of ICF Consulting, said “If you have a thousand cars on an interstate as opposed to a hundred cars, there’s going to be a higher probability for accidents.”
Mihlmester and his company have forecast the number of hours in which congested conditions are likely to be recorded across the interconnected transmission grid in 2004 (Figure 3, Page 14). ICF’s findings indicate that congestion will be most prevalent in California and New York, but that even less densely populated regions may experience congested grid conditions more than 40 percent of the time in 2004.
“Congestion is a problem across the United States,” Mihlmester said. “It’s not just in the Northeast. An important precondition (for blackouts) exists in a lot of different places in the United States.”
ICF has recommended a number of potential actions to bolster the transmission grid and, hopefully, stave off future blackouts. Among ICF’s recommendations are implementation of mandatory, enforceable transmission reliability standards, use of technologies such as flexible AC transmission systems (FACTS) and superconductivity to push more power through existing corridors, and adoption of dynamic programming methodologies–as opposed to using static load flow snapshots–in transmission planning.
“In the past, you knew exactly where the generated power was coming from, and transmission planning was done largely in conjunction with generation expansion,” said Judah Rose, senior vice president and managing director of ICF. “Now, if you don’t use these dynamic, probabilistic modeling frameworks to anticipate where the power flows are going to be … you’re constantly going to be surprised.”
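The contrast Rose describes between a static load-flow snapshot and a probabilistic study can be sketched in a few lines. The line rating, flow sensitivity factor and load distribution below are invented for illustration; actual planning studies use full network models rather than a single linearized line.

```python
# A minimal sketch: a static snapshot checks one operating point, while a
# probabilistic study samples many. All numbers here are hypothetical.
import random

random.seed(42)

RATING_MW = 500.0   # hypothetical thermal rating of one transmission line
PTDF = 0.35         # hypothetical sensitivity of line flow to area load

def line_flow(load_mw):
    """Linearized flow on the line for a given area load (DC approximation)."""
    return PTDF * load_mw

# Static snapshot: at the forecast peak load, the line looks fine.
snapshot_load = 1300.0
print(f"snapshot flow: {line_flow(snapshot_load):.0f} MW (within rating)")

# Probabilistic view: sample loads around the forecast and count overloads.
trials = 10_000
overloads = 0
for _ in range(trials):
    load = random.gauss(1300.0, 150.0)   # assumed forecast-error distribution
    if line_flow(load) > RATING_MW:
        overloads += 1

print(f"overload probability: {overloads / trials:.1%}")
```

The single snapshot reports a comfortable margin, yet sampling the load uncertainty reveals the line overloaded in roughly one scenario in five–exactly the kind of surprise Rose says static planning invites.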
In addition to dynamic transmission planning methodologies, utilities may need to begin to take a more dynamic approach with their protective relay settings. KEMA’s Novosel noted that relays are set for certain system configurations and operating conditions, but that the settings are not always reviewed as conditions change–for instance, when new load comes into an area.
Novosel said protection system failures, malfunctions and outdated settings have been noted in 70 percent of past North American blackouts.
Smart Asset Management
Estimates from organizations such as EPRI, the Edison Electric Institute and various utility industry consulting firms, place the needed transmission system investment anywhere from $5 billion to $10 billion per year over the next decade. It is important to note that most of those estimates were made before the Aug. 14 blackout.
The events of Aug. 14 underscored–in bold, indelible ink–the need for increased investment, but Novosel believes any investment should be preceded by a prudent analysis of which investments are most necessary.
“Investments need to be made. There’s no question about it,” Novosel said. “But we have to be careful and really analyze which are the best investments to make.”
Novosel believes that, if the climate for transmission system investment improves, early investments will likely go toward improved monitoring and diagnostic equipment, control center improvements, operator training and studies on protection coordination. He believes investment in physical infrastructure–lines, cable, new substations, etc.–will come at a later date, along with increased spending on more advanced technologies, such as FACTS and, eventually, wide-area measurement systems (WAMS).
“You can’t solve all the problems with the latest and greatest protection schemes and new technologies,” Novosel said. “You also must have a transmission infrastructure that can sustain the power flow. It’s a balance. You have to have a combination of investment in infrastructure and technology to support that infrastructure.
“But we don’t have to jump right now and make investments left and right without making sure we can get a return on those investments,” Novosel warned. “A really detailed analysis needs to be done.”
(Editor’s note: Read more about Dr. Novosel’s vision of smart asset management in the Perspectives column on Page 48.)
The Role of Technology
The experts interviewed for this article agree that it will take some combination of new physical infrastructure and properly applied technology to bring the power delivery system to a point where it can adequately support current and future demands. They also agree that the siting of new lines and substations will remain a contentious issue–even in a post-blackout environment.
“Despite the great injury that was done on Aug. 14, in the end all politics are local, and the impacts of power line construction remain local,” said John Howe, American Superconductor’s vice president of electric industry affairs. “I don’t think the blackout will significantly mitigate community opposition to hosting overhead power line infrastructure.”
Howe, whose company manufactures superconductor wire and markets FACTS and other power electronics solutions to the power industry, said he believes the right technological solutions could substantially reduce the projected $50 billion to $100 billion cost of transmission system upgrades.
His company, for instance, is currently analyzing situations where, through the use of a compact, underground very low-impedance cable, large amounts of power could be injected directly into the heart of a congested load pocket.
Howe also noted that the typical power line is loaded, on average, to only about a third of its thermal transfer capacity over the course of a year. He believes there is a lot of room, in some parts of the country, to extract more transfer capacity from existing lines by using voltage stabilization equipment, which can be installed in existing substations without environmental impact.
“The conventional solutions might involve several overhead circuits running through new rights of way, which can be very difficult to obtain,” Howe said. “With a new way of thinking, it may be possible to tackle these problems at much lower costs and with much less of an environmental impact.”
Researchers at the Department of Energy’s Pacific Northwest National Laboratory (PNNL) are also investigating innovative uses of technology that could mitigate cascading outages, while also lessening environmental impact. One of PNNL’s initiatives aims to take blackout prevention right down to the residential consumer level.
Rob Pratt is program manager for a PNNL initiative called GridWise. As part of GridWise, PNNL engineers are designing smart chips with the ability to detect the frequency of the power grid. These chips would be fitted into household appliances, such as water heaters, air conditioners or clothes dryers. When frequency on the grid deviates from 60 Hz, the chip would detect the deviation and shut down the appliance for a brief period.
“If you drop the load right away, the stress on the system will be dramatically reduced very quickly–almost instantaneously,” Pratt said. “It gives the grid operator maybe 10 minutes to try to get this bucking horse back under control.”
Pratt said that while the cost, on a per-appliance basis, for this technology is very small, the cumulative value to the grid could be huge. Of course, that would be contingent on getting millions of these smart appliances into use. PNNL is currently working with appliance makers to get the chips into appliances. The next step will be getting the smart devices into consumers’ homes.
“What we’re really talking about is saving energy with bytes instead of steel,” Pratt said. “We’re not trying to say investment in traditional infrastructure will be completely displaced by information technologies. We’re making the case that with some reasonable penetration levels we could save maybe 10 percent of the needed infrastructure over the next 20 years.”
PNNL is expected to begin a demonstration project this fall to test its grid-friendly appliances.
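The control logic Pratt describes can be sketched as follows. GridWise is a real PNNL initiative, but the trip threshold and holdoff period here are assumed values for illustration, not PNNL’s actual design.

```python
# A minimal sketch of the grid-friendly appliance logic described above.
# The threshold and holdoff values are assumptions, not PNNL's design.

NOMINAL_HZ = 60.0
TRIP_THRESHOLD_HZ = 0.1   # assumed deviation indicating grid stress
HOLDOFF_SECONDS = 600     # a brief shutdown ("maybe 10 minutes," per Pratt)

def should_shed(measured_hz):
    """Return True if the appliance should briefly shut down to relieve the grid."""
    return abs(measured_hz - NOMINAL_HZ) >= TRIP_THRESHOLD_HZ

for hz in (60.00, 59.98, 59.85):
    action = "shed load" if should_shed(hz) else "run normally"
    print(f"{hz:.2f} Hz -> {action}")
```

Because every device on an interconnection sees the same system frequency, a chip like this needs no communications link: the measurement itself is the signal, which is part of what keeps the per-appliance cost small.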
Another innovation from PNNL’s laboratories is the wide-area measurement system (WAMS), which the laboratory developed in conjunction with Bonneville Power Administration (BPA). WAMS can monitor fluctuations in transmission lines, predict failure-level problems and analyze outage causes. It has been reported that the prototype WAMS in place at BPA during the Aug. 10, 1996, Western states blackout may have provided up to a six-minute warning of the events that eventually led to that blackout. It has been speculated that, had operators been able to act on information from the prototype system, they might have been able to largely mitigate the 1996 cascading outage.
“Over the past couple of years, we’ve been discussing implementation of WAMS on the East Coast with several utilities,” said Jeff Dagle, PNNL chief electrical engineer and grid reliability expert. “The goal is to incorporate information technologies to better predict events, match power need and demands, and consider real-time needs in hopes of improving management of the grid.”
Clark Gellings, EPRI’s vice president of power delivery and markets, is another proponent of applying some type of wide-area measurement and control to the electric power system. Current state estimation technology lacks the immediacy needed to manage a modern power grid, he said.
“We need to be able to understand the condition of the power system at any moment in time,” Gellings said. “Right now we assess its condition through a limited set of sensors and some communications; we take that data, and we make some computational estimates of the system’s condition.
“It takes nearly a minute, though, for us to estimate the state of a system that’s moving at the speed of light.”
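The estimation process Gellings describes, in which redundant sensor readings are combined into a single best estimate of the system’s condition, can be sketched with a tiny linear example. The two-bus DC model and the measurement values below are invented for illustration; real state estimators handle thousands of measurements and a nonlinear AC model.

```python
# A minimal linear state-estimation sketch. State: voltage angles at two
# buses (radians). Measurements: two redundant line-flow readings plus one
# angle reading at each bus, each a linear function of the state (DC model).
import numpy as np

H = np.array([[ 1.0, -1.0],    # flow bus 1 -> bus 2
              [-1.0,  1.0],    # flow bus 2 -> bus 1
              [ 1.0,  0.0],    # angle measurement at bus 1
              [ 0.0,  1.0]])   # angle measurement at bus 2
z = np.array([0.21, -0.19, 0.30, 0.10])   # noisy sensor readings
W = np.diag([1.0, 1.0, 4.0, 4.0])         # weights: trust angle sensors more

# Weighted least-squares estimate: solve (H' W H) x = H' W z
x_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
print("estimated bus angles (rad):", x_hat)
```

The estimate reconciles the slightly inconsistent flow readings with the angle measurements; it is this reconciliation step, run over an entire interconnection, that Gellings says currently takes nearly a minute.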
Gellings advocates putting technology in place to develop what EPRI refers to as a “self-healing grid” capable of automatically anticipating and responding to disturbances, while continually optimizing its own performance.
The first step toward making the self-healing grid a reality, Gellings said, is acquiring detailed real-time information about the system, which requires sensors integrated with communications capabilities. Next comes the need for very fast simulation and modeling to estimate the system’s condition every second. To that end, the Electricity Innovation Institute (E2I), an EPRI affiliate, issued a request for proposal on Sept. 2 to solicit assistance in developing a software system designed to provide faster-than-real-time simulation and modeling of grid dynamics at different levels of topological detail over a range of different time domains. (Information about the RFP is available at the E2I website at www.e2i.org.)
Once the fast simulation and modeling are available, Gellings said there is a need to more fully integrate power electronics devices, such as FACTS, into the system.
“If you have the capability to sense system condition, and you have some control hierarchy and the ability to switch with power electronic speed, you could have a self-healing system–or at least one that isolates disturbances very locally or reconfigures the system in real-time to minimize disruption,” Gellings said.
While some of the technology is already available, Gellings estimated it would take another four or five years to have all the technology in place to make the self-healing grid a reality. He estimated it would then take another 10 years to see the technology deployed.
“What’s impeding the use of FACTS and other power electronics solutions right now is cost,” Gellings said. “We have to get aggressive about getting the costs down.”
Gellings said he hopes the lasting legacy of the blackout will be a greater appreciation for what properly implemented technology can do for the grid.
“Support, in general, for science and technology has waned,” Gellings said. “It’s waned both philosophically and practically. It’s both a matter of a loss of idealism as well as not having the resources.
“What I hope comes of this (the blackout) is a realization that technology is important, and perhaps a renewal of will to aggressively study and investigate technology,” he said. “All the rest–the research and development funding, the investments in the grid–will follow if we once again begin to recognize the importance of science and technology.”