By Tim Taylor, Ventyx
While reading the other day, I discovered that voltage drop and real power losses contributed to the invention of the incandescent lightbulb.
In the late 1870s, most research on suitable lightbulb filaments revolved around low-resistance materials. Thomas Edison instead focused on high-resistance filament materials, reasoning that an economical power system would require small-diameter conductors (which were much less expensive than large-diameter conductors).
Edison knew, however, that a system built with small conductors would require low load currents to avoid excessive voltage drop and power losses. To get low currents, the lightbulb, one of the killer apps of the electric world at that time, would have to have a high-resistance filament. This reasoning led to Edison's eventual discovery of the carbonized cotton-thread filament, and subsequently the carbonized cardboard filament, that evolved into the commercially viable incandescent light that changed the world.
Edison's approach resulted from a need to counter the still-present problems of voltage drop and real power losses. So why all the attention on drops, losses and system volt/VAR control today, when the industry has been addressing this problem since Edison's time?
First, operating objectives have changed. During the past few years, minimizing customer demand and losses has become more important in many locations. Improved volt/VAR control is a cost-effective means to that end, with a high benefit-cost ratio. For a distribution organization with a 5,000-MW peak that pays an equivalent capitalized charge of $750 per kilowatt for incremental capacity, a 1 percent reduction in peak demand is worth $37.5 million.
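The arithmetic behind that figure can be verified directly; the sketch below simply restates the article's stated inputs (a 5,000-MW system, $750 per kilowatt of incremental capacity, a 1 percent demand reduction):

```python
# Verify the capitalized value of a 1 percent peak-demand reduction,
# using the figures cited in the article.
system_peak_mw = 5_000              # distribution system peak, MW
capacity_cost_per_kw = 750          # equivalent capitalized charge, $/kW
demand_reduction = 0.01             # 1 percent reduction in peak demand

kw_avoided = system_peak_mw * 1_000 * demand_reduction  # avoided capacity, kW
savings = kw_avoided * capacity_cost_per_kw             # capitalized value, $

print(f"Avoided capacity: {kw_avoided:,.0f} kW")   # 50,000 kW
print(f"Capitalized value: ${savings:,.0f}")       # $37,500,000
```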
Second, model-based volt/VAR optimization addresses deficiencies in prior methods. Older methods have several weaknesses, chief among them difficulty keeping up with the continual changes made to a distribution system: the planning and design changes that happen year to year and the operating changes that occur day to day. Loads and capacitor banks are routinely transferred between feeders, rendering the older volt/VAR control methods less effective.
Older control methods were also based on heuristics, and not formal mathematical optimization, so results were almost always less than optimal. Today’s model-based volt/VAR system considers the as-operated state of the distribution system and can apply true mathematical optimization to achieve maximum reduction of real power losses and customer demand.
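To make the contrast with heuristics concrete, the toy sketch below frames volt/VAR control as an optimization over the as-operated state: given a feeder's load, choose which switchable capacitor banks to energize so that I²R losses are minimized. This is an illustrative simplification, not any vendor's algorithm, and all numeric values are made-up example data:

```python
# Toy volt/VAR optimization: pick capacitor-bank states that minimize
# feeder I^2*R losses for the current (as-operated) load.
# All feeder parameters below are assumed example values.
from itertools import product

V_KV = 12.47                   # feeder line-to-line voltage, kV (assumed)
R_OHM = 1.2                    # lumped feeder resistance per phase, ohms (assumed)
P_MW, Q_MVAR = 6.0, 3.0        # current feeder load (assumed)
BANKS_MVAR = [0.6, 1.2, 1.2]   # switchable capacitor bank sizes (assumed)

def losses_mw(q_comp_mvar):
    """Three-phase I^2*R losses given total reactive compensation."""
    s_mva = (P_MW**2 + (Q_MVAR - q_comp_mvar)**2) ** 0.5
    i_ka = s_mva / (3**0.5 * V_KV)                # line current, kA
    return 3 * (i_ka * 1_000) ** 2 * R_OHM / 1e6  # losses, MW

def total_comp(states):
    return sum(size * on for size, on in zip(BANKS_MVAR, states))

# Exhaustive search over all on/off combinations (fine for a handful
# of banks; real systems use formal optimization at scale).
best = min(product([0, 1], repeat=len(BANKS_MVAR)),
           key=lambda s: losses_mw(total_comp(s)))

print("Best bank states:", best)
print("Losses:", round(losses_mw(total_comp(best)), 4), "MW")
```

With these example numbers, switching all three banks in cancels the 3-MVAR reactive load exactly, so the optimizer selects every bank; a heuristic rule tuned for last year's feeder configuration has no such guarantee.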
Third, leading distribution organizations are increasingly implementing a technology platform that volt/VAR control can leverage while sharing the platform's expense with other distribution applications. The platform includes a geographic information system (GIS), which provides the basis of the operating model for distribution management system (DMS) applications such as model-based volt/VAR optimization; the DMS and supervisory control and data acquisition (SCADA) systems themselves; and improved two-way communications. The same platform can also support other applications, including outage management, feeder monitoring, fault location and restoration switching, work management and equipment condition monitoring. When a utility weighs the benefits of implementing these applications against the cost of the infrastructure shared among them, the business case for model-based volt/VAR control is particularly strong.
The change in operating objectives, the development of model-based volt/VAR optimization, and the adoption of broad technology platforms for distribution operations together explain the increased interest in distribution volt/VAR control today.
Tim Taylor is industry solutions executive at Ventyx, an ABB company.