Solar forecasting: The next big thing for solar power?

By Jeff Postelwait,
Online Editor

Electric Light & Power, POWERGRID International magazines

People who know about renewable energy are probably also aware of wind forecasting — workers who set up meteorological equipment to measure when, where and how strongly the wind will blow at the site of a proposed wind farm. Knowing this information can make wind turbines and the power they produce more economical and reliable.

The same can be said of knowing when and where the sun will shine, if you are the developer of a solar energy facility. The people you sell your energy to need to know they can rely on the resource you are responsible for — something you might not be too certain of yourself, if you don’t have the kind of information that solar forecasting can put at your fingertips.

Still, solar forecasting as a technology is in its infancy, and one place where it is being developed is at the University of California’s San Diego campus, where the Department of Energy is taking an interest (to the tune of a $1.93 million grant in 2010 with $500,000 cost share from the California Energy Commission) in developing ways to make solar energy more reliable through the use of forecasting techniques.

Over the past few years, UC-San Diego has built a smart microgrid on its campus, explains Byron Washom, director of strategic energy initiatives at the university. A significant part of this grid is the 1.2 MW of photovoltaic solar the campus generates at seven different locations. Another 840 kW is being installed at off-campus facilities, he said.

“We’re not lowering the cost of installed photovoltaics. We’re increasing the value that they will give to the market by increasing the predictability of the output,” Washom said. “Large solar systems in the future will be bidding into an economic system, so if you’re wrong, it costs you money. This is why an annual average approach isn’t good enough.”

Measuring the solar resource requires equipment that is neither new nor especially expensive. The clouds and sky over the entire San Diego campus can be measured with a single sky imager, which costs about $12,000 and can be installed in about 48 hours.

“The sky imager is a hemispherical mirrored bowl on the highest building on campus. It holds the reflection of every cloud in a 360-degree radius. An arm comes out over this bowl and takes a picture of it,” he said. “It’s a fisheye mirror that is looking at every cloud in the sky at once. By doing this series of photos every few seconds you can determine the speed, direction, type and opacity of the clouds moving toward your solar field.”
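The cloud-tracking step Washom describes — comparing successive sky photos to get cloud speed and direction — can be sketched as a motion search between two frames. This is a toy illustration of the general technique, not the UCSD team's actual algorithm: it brute-forces the pixel shift that best aligns two binary cloud masks; dividing that shift by the frame interval would give a speed.

```python
import numpy as np

def estimate_cloud_motion(frame_a, frame_b, max_shift=5):
    """Estimate the (dy, dx) pixel shift between two binary cloud masks
    by brute-force search for the overlap-maximizing translation."""
    best_score, best_shift = -1, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(frame_a, dy, axis=0), dx, axis=1)
            score = int(np.sum(shifted & frame_b))
            if score > best_score:
                best_score, best_shift = score, (dy, dx)
    return best_shift

# Toy example: a cloud blob that drifts 2 pixels east between frames.
a = np.zeros((20, 20), dtype=bool)
a[8:12, 4:8] = True
b = np.roll(a, 2, axis=1)
print(estimate_cloud_motion(a, b))  # (0, 2)
```

Real sky-imager pipelines work on fisheye imagery and must also classify cloud type and opacity, which this sketch ignores.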

What is more cutting edge than the camera itself, however, is the ability to process the massive amount of data generated by climate sensors and cameras, he said.

“The cost of measurement is dropping dramatically,” he said. “Also we’re seeing the costs of wireless communications dropping dramatically. Independent of what we’re doing, the price of computing power and terabytes of storage is dropping and will get lower. So we’re taking advantage of a globally developing market movement in this field.”

Using the data coming into the system from this hardware, a computerized model is created. This model can tell the grid operators how much sun can be expected to make it through the clouds and hit the installed PV panels.

“We know where the panels are because they’re fixed. We know where the sun is down to the nanosecond. So the only thing between our panels and the sun would be a cloud. Other factors do come into play, but cloud cover is the biggest factor,” Washom said.
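The "we know where the sun is" part of the model is straightforward astronomy. As a hedged sketch (a textbook approximation using Cooper's declination formula and the hour-angle relation, not the project's precise ephemeris), the sun's elevation above the horizon can be computed from latitude, day of year and solar time:

```python
import math

def solar_elevation(lat_deg, day_of_year, solar_hour):
    """Solar elevation angle (degrees) from a textbook approximation:
    Cooper's declination formula plus the hour-angle relation."""
    decl = math.radians(23.45) * math.sin(
        math.radians(360.0 / 365.0 * (284 + day_of_year)))
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))  # 15 deg per hour
    lat = math.radians(lat_deg)
    sin_elev = (math.sin(lat) * math.sin(decl)
                + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    return math.degrees(math.asin(sin_elev))

# San Diego (~32.7 deg N) at solar noon on the June solstice (day 172):
print(round(solar_elevation(32.7, 172, 12.0), 1))  # roughly 80.7 degrees
```

With panel locations fixed and sun position computable to any precision, cloud cover really is the dominant unknown, as Washom says.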

The lead researcher and individual processing this data is Jan Kleissl, who in addition to working on this project is also a professor of environmental engineering at UC-San Diego.

Kleissl said there is a lot more involved in solar forecasting than just taking pictures of clouds.

“People have been taking pictures of the sky for as long as we’ve had electronics. But what we’ve been working on is computers that can tell us about these clouds. We have to train the computers to do that. Where are the clouds and more importantly where are they moving. Then comes the earth-sun geometry,” Kleissl said.

Knowing how clouds are going to behave is key to forecasting how much energy can be extracted from the sun in a given minute, hour or even day, he said. And how do clouds behave?

“They behave badly,” Kleissl said with a laugh. “In San Diego, the clouds are better behaved than in areas that might have a little more meteorological intensity to them.”

In places like Arizona, where utilities are also investing in solar energy, clouds are rare, but when they do appear they dissipate, thicken, roll over and change shape more unpredictably than they do in the relatively placid weather of sunny San Diego, he said.

To meet the challenge of stubbornly inconsistent weather patterns, a variety of technologies have to work in tandem. Satellite imagery and computerized forecasting models are also part of the system, developed under a $548,000 grant from the California Public Utilities Commission through the California Solar Initiative, he said.

“There is no one tool that can do everything,” he said. “Think of it as a hand-off.”

For example, the sky imager can be relied on to forecast the sunlight that will be available over the next five to 20 minutes. After that, satellite imagery can predict the level of sun from 30 minutes to a few hours into the future. Beyond that, a computerized forecasting model that simulates physical weather patterns can look up to 24 hours into the future.
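The hand-off Kleissl describes can be sketched as a simple dispatcher that picks a forecasting tool by look-ahead horizon. The exact cut-offs used here (20 and 180 minutes) are illustrative assumptions drawn from the ranges in the article, not the project's operational settings:

```python
def forecast_source(horizon_minutes):
    """Pick the forecasting tool for a given look-ahead horizon,
    following the sky-imager / satellite / weather-model hand-off."""
    if horizon_minutes <= 20:
        return "sky imager"
    elif horizon_minutes <= 180:            # roughly "a few hours"
        return "satellite imagery"
    elif horizon_minutes <= 24 * 60:        # out to the 24-hour window
        return "numerical weather model"
    else:
        raise ValueError("beyond the 24-hour forecasting window")

print(forecast_source(10))    # sky imager
print(forecast_source(90))    # satellite imagery
print(forecast_source(720))   # numerical weather model
```

In practice the hand-off would blend overlapping forecasts rather than switch abruptly, but the principle — no one tool covers every horizon — is the same.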

Washom said that when these measurement methods work together on a single system, a clearer picture of how much power can be generated and transmitted emerges.

“We anticipate the ability to forecast intra-hour what your system will produce with up to a 90 percent degree of certainty,” Washom said. “So this will be firm power instead of non-firm power, and firm power is of a higher value because you minimize the penalties of not meeting what you bid into the market that you were going to deliver.”

This is music to the ears of power utilities, which in California are being asked to add an increasingly large amount of renewable energy to their generation portfolios — yet may still not be convinced that they can rely on renewable power to meet demand.

With solar forecasting, Washom said, grid operators can smooth out solar energy by ramping spinning reserves up or down as the grid demands.

“If you have too much PV, the ramp rates up and down can wreak havoc on the line voltage and power electronics used to stabilize the voltage. If on the other hand you know in advance about a lot of ramp ups or a lot of ramp downs, you can have some mitigating measures,” he said. “In doing real-time measurements of the actual ramp rates that are being incurred (both up and down), you can then begin to reexamine the standards and rules that limit the amount of PV on a distribution circuit before the host utility requires another engineering study.”
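The real-time ramp-rate measurement Washom mentions amounts to differencing a PV output time series and flagging excursions. A minimal sketch, with a hypothetical 100 kW/min limit chosen purely for illustration:

```python
import numpy as np

def ramp_rates(power_kw, interval_min=1.0):
    """Per-interval ramp rates (kW/min) from a PV output time series."""
    return np.diff(power_kw) / interval_min

def flag_violations(power_kw, limit_kw_per_min, interval_min=1.0):
    """Indices of intervals where the up- or down-ramp exceeds the limit."""
    rates = ramp_rates(power_kw, interval_min)
    return np.where(np.abs(rates) > limit_kw_per_min)[0]

# A cloud passage: output drops 400 kW in one minute, then recovers.
output = np.array([950.0, 960.0, 560.0, 580.0, 940.0])
print(flag_violations(output, limit_kw_per_min=100.0))  # [1 3]
```

Logging where and how often such excursions actually occur is what lets the standards limiting PV on a distribution circuit be reexamined with data rather than rules of thumb.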

What this also means is that greater amounts of solar energy than previously thought possible can be safely and reliably included on a grid, he said.

Kevin Meagher, chief technology officer at Power Analytics, the company whose software analyzes the incoming data on solar energy availability, agrees that solar forecasting could change the way people view solar energy.

“One of the things that had been poorly understood about PV is how to accurately understand how it’s going to perform. Until this year, the rule of thumb has been more than 15 percent penetration of PV on a distribution circuit triggers expensive engineering studies to determine if additional PV will disrupt your grid,” Meagher said. “What we’re finding is that the level of generation can be even greater than 15 percent on most circuits.”

If grid operators know the energy potential of grid-tied solar assets with granularity down to one minute, the operator knows what grid impact to expect. This makes solar energy a resource people can take more seriously, Meagher said.

“With more accurate data, the long-term potential of this technology is to treat solar energy as a more tangible resource. Like you would with a pile of coal, you’ll know how much energy there will be to draw upon,” he said.

Kleissl said it’s desirable to include even more solar energy onto the grid, but there are still changes that need to be made, and further advances in forecasting technology are still around the corner.

“So far, some solar projects have energy storage and some have forecasting, but integrating the two is what you want,” Kleissl said, adding that with energy storage as part of the system, an operator could store up and later dispatch solar energy during a predicted period of heavy cloud cover.
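The storage-plus-forecasting integration Kleissl wants can be sketched as a greedy dispatch loop: charge the battery with forecast surpluses, discharge it to cover forecast shortfalls, and deliver a firmed output. All numbers and the target level here are illustrative assumptions, not any real project's parameters:

```python
def dispatch(forecast_kw, target_kw, battery_kwh, max_power_kw,
             interval_h=1.0 / 60):
    """Greedy sketch: use a battery to firm a forecast PV series
    toward a constant target output. Returns (firmed series, final SOC)."""
    firmed = []
    for pv in forecast_kw:
        gap = target_kw - pv                  # + shortfall, - surplus
        if gap > 0:                           # discharge to cover shortfall
            draw = min(gap, max_power_kw, battery_kwh / interval_h)
            battery_kwh -= draw * interval_h
            firmed.append(pv + draw)
        else:                                 # charge with surplus
            store = min(-gap, max_power_kw)
            battery_kwh += store * interval_h
            firmed.append(pv - store)
    return firmed, battery_kwh

# A predicted cloud passage drops PV from 100 kW to 40 kW for two minutes;
# the battery rides through it at a firm 80 kW.
out, soc = dispatch([100, 40, 40, 100], target_kw=80,
                    battery_kwh=5.0, max_power_kw=60)
print(out)  # [80, 80, 80, 80]
```

The point of the forecast is in the first argument: knowing the cloud passage is coming lets the operator hold back enough charge to ride through it.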

Over the next few years, researchers and engineers will work to continue to improve this technology — but don’t expect any huge breakthroughs, he said.

“The next step is to better merge and integrate these different models. This is all going to be incremental improvements,” he said. “Each step forward will bring about a lot more work.”
