decision making powered by real-time information

At what price should I purchase coal? From which source can I purchase enough natural gas for the next four months of load? With forecasts predicting a long, cold winter, what will the peak load be? Is it cheaper to purchase power on the local ISO exchange or to spin up a second generator?

These are some of the critical questions facing public utilities on a daily basis, impacting tens of millions of dollars in assets. How well these questions are answered has a direct impact on both a company’s financial performance and the price the public pays for increasingly expensive electricity.

Unfortunately, the data used to make critical decisions is often obsolete and sourced from ill-documented processes. Even in the most automated systems, information can come from complex Excel spreadsheet models populated with data from weekly batch jobs. At the other extreme, traders answer questions solely through personal experience and instinct, as opposed to hard analysis.

Weather forecasts, power wholesale market pricing, oil/natural gas/coal market pricing, and real-time metering are all data elements critical to the models used by most utilities. These elements are all highly active data streams that rapidly update over time. Until now, however, utilities have been unable to keep pace with this ever-changing flow of information. They’ve been guided by static snapshots of dynamic data.

Such static snapshots result in obsolete data, with any decisions necessarily made with old, inaccurate information. In addition, trends in these data streams cannot be captured with a single value; by definition, a set of values over a specified time is required to capture this information.

event stream processing

The scalability and flexibility of utility systems are often crucial. Companies must manage event streams arriving at velocities approaching thousands of events per second. The algorithms used to process the streams are often dynamic themselves and must be easily modifiable as improvements are made.

Utilities want to improve this situation in critical areas such as weather forecasting and wholesale market pricing. With Event Stream Processing (ESP), an emerging class of software that monitors, manages and analyzes real-time data, important trends and thresholds can be identified and acted upon instantly.

Unlike traditional software environments, which respond after events have occurred, ESP allows utilities to continuously assess key performance indicators. These systems can then combine this derived information with more static data, perhaps load forecast models stored in Excel, to produce meaningful business decisions.
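To make the pattern concrete, here is a minimal Python sketch of the idea (not any particular vendor's product): a sliding window over a simulated price feed yields a continuously updated average, which is combined with a static load forecast to flag a purchase decision. The feed values, window size and 5 percent threshold are all assumptions for illustration.

    import collections
    import statistics

    # Static inputs that would normally come from a spreadsheet or database.
    forecast_load_mw = 950      # hypothetical day-ahead load forecast
    committed_mw = 900          # power already generated or under contract

    window = collections.deque(maxlen=60)   # sliding window over the price feed

    for price in [41.0, 43.5, 39.8, 38.2, 37.9]:   # stand-in for a live $/MWh feed
        window.append(price)                 # each event updates the window...
        avg = statistics.mean(window)        # ...and the derived indicator
        shortfall_mw = forecast_load_mw - committed_mw
        # Combine the stream-derived average with the static forecast:
        if shortfall_mw > 0 and price < 0.95 * avg:
            print(f"Buy {shortfall_mw} MW at ${price}/MWh (below recent average ${avg:.2f})")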

power trading

ESP is especially relevant to the power industry for two primary reasons. First, the data used to drive business decisions is inherently dynamic and volatile. Second, utilities often make short-term business decisions that impact millions of dollars in assets. Minor improvements in decision-making can translate into significant returns.


The classic application of ESP in energy is in the wholesale power markets. The speculative trading frenzy driven in the past by Enron and Dynegy has been replaced by more controlled strategies used primarily to adjust power inventories to forecasted load.

Utilities will typically satisfy their load requirements from three sources: internal generation, human-negotiated bilateral agreements and electronic markets such as regional Independent System Operators (ISOs) and the Intercontinental Exchange (ICE).

While electronic trading makes up a relatively small percentage of sourcing today, many deregulated utilities are trending toward more automated trading. Power is increasingly traded electronically over several exchanges, most notably ICE and local ISO markets. The ISOs typically offer “real-time” as well as day-ahead markets while other exchanges tend toward slightly longer timeframes.

ESP, common in financial trading, can be applied in energy markets to search for optimal pricing across multiple exchanges or to combine historical information with current trends to identify potential low and high pricing over a given time period. This is especially true as power exchanges continue to grow and large institutional investors (outside of energy) move in to speculate on energy futures.
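A rough sketch of the optimal-pricing search, with hypothetical exchange names and quotes standing in for live feeds: each incoming quote updates a per-exchange view, and the cheapest available source is re-evaluated on every event.

    # Hypothetical quote events: (exchange, $/MWh offer for the same delivery hour)
    quote_events = [
        ("ICE", 52.25),
        ("ISO-RT", 54.10),
        ("ICE", 51.80),
        ("ISO-DA", 50.95),
    ]

    best_offer = {}  # latest offer seen per exchange

    for exchange, price in quote_events:
        best_offer[exchange] = price
        # Re-evaluate the cheapest source every time a new quote arrives.
        venue, cheapest = min(best_offer.items(), key=lambda kv: kv[1])
        print(f"After {exchange} @ {price}: cheapest is {venue} @ {cheapest}")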

load forecasting

A critical variable for any utility is the accuracy of its day-ahead, week-ahead and month-ahead load forecasts. Estimate too high and excess power generated or acquired can cost the utility hundreds of thousands of dollars. Estimate too low and blackouts or brownouts are possible.

Accurate load forecasting is especially valuable for utilities operating multiple generation plants. Load forecasts will frequently be used to drive the decision to take generators on- and off-line. Bringing a plant on-line often entails a significant fixed cost on top of the variable generation costs. Inaccurate load estimates that needlessly drive expanded generation capacity can also result in significant costs.
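The underlying comparison can be illustrated with a short sketch; all figures here are hypothetical, and a real unit-commitment decision would weigh many more factors such as ramp rates and reserve requirements.

    # All figures are hypothetical and for illustration only.
    shortfall_mw = 120          # forecast load minus committed generation
    hours = 16                  # hours the extra capacity would be needed
    start_cost = 40_000         # fixed cost to bring the second unit online ($)
    marginal_cost = 38.0        # generator marginal cost ($/MWh)
    market_price = 55.0         # current ISO real-time price ($/MWh)

    energy_mwh = shortfall_mw * hours
    generate_cost = start_cost + marginal_cost * energy_mwh
    purchase_cost = market_price * energy_mwh

    if generate_cost < purchase_cost:
        print(f"Spin up the unit: ${generate_cost:,.0f} vs ${purchase_cost:,.0f} to buy")
    else:
        print(f"Buy on the exchange: ${purchase_cost:,.0f} vs ${generate_cost:,.0f} to generate")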

The most accurate modeling will use some static information, such as historical load levels, together with event-oriented data, such as up-to-date weather forecasts and real-time metering that updates as often as once a minute.

Weather is probably the single most important variable in determining future load: high temperatures drive heavy air conditioning usage and the associated load. Several weather feed formats are typically used, ranging from EarthStat to homegrown feeds scraped from HTML websites. Real-time metering, meanwhile, is increasingly available as utilities upgrade their distribution networks, and dips and spikes in current power usage can often be superior predictors of next-day load.
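A simple illustrative blend of these inputs might look like the following sketch; the sensitivity of load to temperature, the baseline values and the meter readings are all assumed for the example, not drawn from any real utility.

    # Hypothetical blend of static history with streaming inputs.
    historical_load_mw = 1_000      # average load for similar days (static)
    mw_per_degree = 25              # assumed sensitivity of load to temperature
    baseline_temp_f = 75

    def next_day_forecast(forecast_temp_f, recent_meter_readings_mw):
        # Temperature adjustment from the latest weather feed.
        weather_adj = mw_per_degree * (forecast_temp_f - baseline_temp_f)
        # Trend adjustment from real-time metering (updated as often as once a minute).
        trend_adj = recent_meter_readings_mw[-1] - recent_meter_readings_mw[0]
        return historical_load_mw + weather_adj + trend_adj

    readings = [980, 990, 1_005, 1_020]    # stand-in for a live metering stream
    print(f"Estimated next-day peak: {next_day_forecast(92, readings)} MW")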

generation

A large percentage of a utility’s controlled assets are in the form of raw materials required to power its generation plants, typically coal, natural gas and oil. Often utilities will keep six to 12 months of supply in inventory to hedge against fuel price fluctuations.

Minor savings in the unit cost of these raw materials can translate into millions of dollars. Algorithmic trading, common in financial services and based on ESP, can easily be applied to the commodities markets. Several common exchanges, such as NYMEX and ICE, offer fully electronic platforms to engage in automated trading.

ESP technology can deliver real-time price comparisons of all fuels across several exchanges, or implement more complex market arbitrage strategies for traders looking to profit from speculation beyond the fulfillment of normal load requirements.
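As a sketch of the simplest case, a hypothetical spread check between two venues quoting the same natural gas contract; the prices and threshold are illustrative only.

    # Hypothetical natural gas quotes ($/MMBtu) for the same contract on two venues.
    nymex_quote = 7.42
    ice_quote = 7.35
    spread_threshold = 0.05     # minimum spread worth acting on (assumed)

    spread = nymex_quote - ice_quote
    if abs(spread) > spread_threshold:
        buy_on, sell_on = ("ICE", "NYMEX") if spread > 0 else ("NYMEX", "ICE")
        print(f"Spread of ${abs(spread):.2f}/MMBtu: buy on {buy_on}, sell on {sell_on}")
    else:
        print("No actionable spread between exchanges")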

deregulation to accelerate ESP adoption

Deregulation has been a powerful transformative force across the power generation, distribution and retail industries. Competition is real. Market economic realities must be considered and the fight for customers is often intense.

As in other more traditionally competitive industries, advances in information processing and the resulting improvements in business decision-making are crucial to any enterprise looking to remain strong. ESP technology offers utilities a way to maintain this competitive edge and grow market share.

Sundeep Goel is an Event Stream Processing technology evangelist for the Real Time Division of Progress Software. During his professional career, he has been responsible for building enterprise software systems in both leadership and individual roles. This experience has spanned several generations of technology, ranging from early Java client/server systems to large-scale EAI implementations.
