In the United States and around the world, the energy landscape is changing at a dizzying pace, as solar farms replace fossil fuel plants, building owners install their own “behind-the-meter” solar arrays, and unexpected disasters create major disruptions in power networks.
The way to ensure a steady flow of electricity — but not waste billions on producing excessive power — is to modernize load forecasting techniques to reflect a reality that is very different from what it was just a few years ago.
“If you do what utilities have always done, the stakes for getting it wrong are pretty high,” said Mark Dyson, a principal with RMI’s electricity practice, in an interview. Relying on outdated forecasting models has led utilities to waste billions of dollars each year paying for unnecessary capacity. Against that gloomy backdrop, here is some good news: sophisticated new load forecasting techniques can save money and allocate power with greater precision. By drawing data from millions of meters and analyzing it with artificial intelligence and machine learning, utilities, energy retailers, smart communities and other new players in the emerging energy ecosystem can build a truly intelligent grid.
Renewables go mainstream, bring new challenges
Remember when solar and wind power were considered virtuous but expensive alternatives to oil, coal and nuclear power — something utilities could use to bulk up their renewables portfolio but not rely on as a mainstay?
Those days are gone. Rising worry about climate change, and the way carbon fuels worsen it, has combined with technological advances to make renewable power not only a viable alternative but a mainstream resource.
Take Texas, for instance. The Lone Star State not only generates more electricity than any other American state, it also leads the nation in moving toward renewable power generation, according to a Dallas Morning News column by Fred Beach of the University of Texas Energy Institute.
Far from being the most costly source of electricity, renewables are now meeting — and beating — the price of conventional power supplies, according to Deloitte. In areas with abundant wind, wind power has become the least expensive source of all, with utility-scale solar a strong second. Lower prices and increased availability become even more appealing during times of crisis, such as the recent attacks on Saudi oil production facilities or the growing threat that hurricanes and other extreme weather events pose to power supplied by fossil fuel plants.
But all of that creates a load management challenge for utilities: how do you manage the mix of conventional and renewable power supplies on the grid, especially when wind and solar production peaks at times of day that don’t line up with demand?
Part of the answer lies in improved batteries and other ways to store energy for later use. But a key part lies in tapping the rich new sources of user data that allow utilities to more accurately forecast their mix of power sources, and to make sure they’re not buying more than they need.
Old models lead to excessive power costs
In the decades following World War II, electricity demand grew each year at 5 percent or more, often faster than the American economy as a whole. But from the 1980s onward, demand began to lag, driven by factors ranging from the Middle Eastern oil crisis to declining industrialization to the growth in conservation and renewable technologies.
Today, the growth in electricity demand has dropped to less than 1 percent each year, with some regional suppliers anticipating a steady decline.
But a reliance on outdated forecasting models has led utilities to consistently purchase more power than necessary, according to the Rocky Mountain Institute — saddling power companies and their customers with billions in unnecessary costs.
Dyson believes that utilities must give up the notion of planning power needs for five years or more. Rapid changes in the energy economy have made those long-term projections unreliable, and the development of solar, wind and other fast-to-deploy generation technology has made shorter-term forecasting a better bet.
Demand may surge in coming years as more and more electric vehicles take to the roads. Utilities must deploy more accurate load forecasts as that trend and others develop and, ideally, fine-tune them for the needs of particular geographic regions.
Building an intelligent grid from the bottom up
One key to success is leveraging the usage data supplied by millions of smart meters to build a bottom-up source of data — which can then be analyzed in real time with artificial intelligence, machine learning and deep learning to better predict and manage ebbs and flows in supply and demand.
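The bottom-up idea can be sketched in a few lines: sum individual meter readings into a system load, derive simple calendar and lag features, and let a learned model predict demand. The synthetic meter data, the feature choices and the gradient-boosted model below are illustrative assumptions, not any utility’s actual pipeline.

```python
# Minimal sketch of bottom-up load forecasting from smart-meter data.
# All data here is synthetic; feature and model choices are assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(42)

# Synthetic hourly readings from 1,000 meters over 60 days,
# each with a daily usage cycle plus noise.
hours = np.arange(60 * 24)
daily = 1.0 + 0.4 * np.sin(2 * np.pi * (hours % 24) / 24)
meters = daily + 0.1 * rng.standard_normal((1000, hours.size))

# Bottom-up step: system load is the sum of individual meters.
system_load = meters.sum(axis=0)

# Simple features: hour of day, day of week, same hour yesterday.
X = np.column_stack([
    hours % 24,
    (hours // 24) % 7,
    np.roll(system_load, 24),   # lagged load (same hour, previous day)
])[24:]                          # drop the first day, which has no lag
y = system_load[24:]

split = len(y) - 7 * 24          # hold out the final week
model = GradientBoostingRegressor(random_state=0)
model.fit(X[:split], y[:split])

mape = np.mean(np.abs(model.predict(X[split:]) - y[split:]) / y[split:])
print(f"holdout MAPE: {mape:.1%}")
```

Because each meter contributes its own signal, the aggregate forecast can be re-cut by feeder, neighborhood or customer class — the granularity that top-down models lack.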
Ken Seiden, a prominent energy consultant with Navigant, has advocated for moving beyond the load forecasting models in place for the past 50 years to ones that better take into account recent shifts in the energy economy. The key, he said, is to adopt a “bottom-up, micro-oriented approach.”
In practice, that means starting with how people actually use electricity now: obtaining the detailed data provided by millions of smart meters and applying the best available tools to analyze it. Doing so can empower utilities not only to make the grid more cost-effective, but also to make it more secure. With machine learning and deep learning, power companies can mine this newfound data for patterns that provide an early warning of power supply disruptions created by bad weather or other problems.
“With these insights into grid vulnerabilities, network operators could take immediate steps to pre-empt equipment failure, and thereby reduce the likelihood that low-impact disruptions could snowball into a widespread grid breakdown,” according to an article published by the Atlantic Council. As they are built with increasingly authoritative data, such algorithms can also help power suppliers more accurately predict the electricity supply and demand needs of an ever-changing power landscape.
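One simple form such an early-warning check could take is statistical anomaly detection on a feeder’s load history: flag any reading that deviates sharply from its recent baseline. The feeder data, the two-hour window and the 3-sigma cutoff below are illustrative assumptions for a sketch, not an operator’s actual monitoring logic.

```python
# Sketch of early-warning anomaly detection on feeder load readings.
# Data is synthetic; the window and threshold are illustrative choices.
import numpy as np

rng = np.random.default_rng(7)

# One day of 5-minute feeder readings, with a fault-like jump injected.
load = 50 + rng.normal(0, 1, 288)
load[200] += 12                     # sudden spike, e.g. failing equipment

window = 24                         # two hours of history at 5-minute steps
alerts = []
for t in range(window, len(load)):
    hist = load[t - window:t]
    z = (load[t] - hist.mean()) / hist.std()
    if abs(z) > 3:                  # 3-sigma early-warning threshold
        alerts.append(t)

print("alert intervals:", alerts)
```

Flagging the deviation within minutes, rather than after an outage, is what gives operators the chance to intervene before a local fault cascades.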