Major Electric Power Holding Company Conducts Stochastic Load Forecasting using The DecisionTools Suite

  • Industry: Utilities
  • Product(s): DecisionTools Suite
  • Application: Load Forecasting

Summary

To tackle new challenges for accurate decision-making around load forecasting, an electric power holding company based in the Eastern United States used the DecisionTools Suite to make better predictions.


Accurate models for electricity sales are critical to the planning and operations of a regulated utility, especially when multi-billion-dollar decisions will shape the economic viability of a region for decades. In the past, load forecasting was fairly straightforward: a continuation of past trends in population growth and housing expansion. Modern trends have made traditional load forecasting methods less reliable, however. To tackle these new challenges for accurate decision-making, an electric power holding company based in the Eastern United States used the DecisionTools Suite to make better predictions. “Palisade’s DecisionTools Suite provides us with a richer picture of how multiple variables can contribute to future volatility in revenues,” says David R., a risk management manager for the company.

Background

For one of the largest electric power holding companies in the United States, making decisions on where to expand their facilities and reach can prove challenging. Clearly, the range of potential outcomes in sales volumes can greatly impact planning and financial modeling. However, predicting what those sales volumes will be for any given market or area is a complex task. Previously, traditional load forecasting was done by reviewing historical trends and developing a deterministic model based on multivariate regression that incorporated forecasted time series purchased from a vendor. However, new technology (such as variable speed compressors, direct current motors, and LED lighting), evolving demographic preferences for ‘green’ energy options, and highly unpredictable weather patterns have obscured the clarity of the old modeling techniques.

Approach

To tackle this new complexity, the firm’s risk management team developed models for various service areas across the nation and different customer classes (e.g., retail, industrial), plus the number of new customer additions, using a variety of end-use variables. These variables include:

  • heating and cooling degree days
  • lighting hours
  • end-use efficiencies
  • population growth and net migration
  • industrial output for key sectors
  • industrial energy intensity

The data for these variables was gathered from a number of sources, including industry journals, research papers, and subject matter experts. “When accepting inputs from subject matter experts, the PERT distribution [in @RISK] is my go-to distribution,” says David R. “It’s accessible and intuitive to non-statisticians.” The firm uses manufacturing GDP figures from the past 5+ years to model future economic uncertainty, and samples from distributions of historical weather data to better inform its modeling of weather variability.
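The PERT distribution mentioned above is a rescaled Beta distribution whose shape follows from an expert's minimum, most-likely, and maximum estimates. A minimal NumPy sketch (a stand-in for the @RISK PERT function; the efficiency-gain figures are purely illustrative, not the firm's data):

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_pert(minimum, mode, maximum, size=10_000, rng=rng):
    """Sample a PERT distribution: a Beta distribution with shape
    parameters derived from min / most-likely / max expert estimates,
    rescaled onto [minimum, maximum]."""
    span = maximum - minimum
    alpha = 1 + 4 * (mode - minimum) / span
    beta = 1 + 4 * (maximum - mode) / span
    return minimum + span * rng.beta(alpha, beta, size=size)

# Hypothetical expert estimate: annual efficiency gain between 1% and 3%,
# most likely 1.5%
gains = sample_pert(0.01, 0.015, 0.03)
print(f"mean gain: {gains.mean():.4f}")
```

The PERT mean is (min + 4·mode + max) / 6, which weights the expert's most-likely value heavily, one reason it is popular for eliciting judgments from non-statisticians.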

As the graph below illustrates, introducing weather variability into the model based on observed weather patterns allows the firm to gain perspective on how much their forecast would be expected to vary in any given period:

Additionally, end-use efficiency projections are modeled based upon subject-matter expertise:

Combining the uncertainty in the independent variables resulted in a distribution of outcomes in the sales forecast:
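The combination step described above is a Monte Carlo simulation: sample each uncertain input from its distribution and propagate the draws through the sales model. A toy sketch of the idea (all distributions, coefficients, and units are invented placeholders, not the firm's model):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Hypothetical input distributions (illustrative values only)
cooling_dd = rng.normal(1_200, 150, n)             # cooling degree days, from weather history
efficiency = rng.triangular(0.90, 0.95, 1.00, n)   # end-use efficiency multiplier, expert-based
gdp_growth = rng.normal(0.02, 0.01, n)             # manufacturing GDP growth

# Toy linear sales model (GWh); coefficients are placeholders
base = 10_000
sales = (base * (1 + 1.5 * gdp_growth) * efficiency
         + 2.0 * (cooling_dd - 1_200))

# The forecast is now a distribution, not a single number
p5, p50, p95 = np.percentile(sales, [5, 50, 95])
print(f"P5={p5:,.0f}  P50={p50:,.0f}  P95={p95:,.0f} GWh")
```

The spread between the 5th and 95th percentiles is exactly the "distribution of outcomes in the sales forecast" that a deterministic model cannot show.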

Using StatTools, the firm’s team ran multivariate regressions. They then fit the residuals to a normal distribution to replicate the error term using @RISK. (StatTools and @RISK are two of the software components of the DecisionTools Suite.) The resulting model depicts the variability of the dependent variable in a much more realistic way than merely ignoring the residual (as is typically done in a deterministic multivariate regression).
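The regression-plus-simulated-residual approach can be sketched with NumPy in place of StatTools and @RISK (the data below is synthetic and the variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Synthetic history: sales driven by degree days and GDP, plus noise
degree_days = rng.normal(1_000, 100, n)
gdp = rng.normal(100, 5, n)
sales = 5.0 * degree_days + 20.0 * gdp + rng.normal(0, 300, n)

# Multivariate regression (the StatTools step), ordinary least squares
X = np.column_stack([np.ones(n), degree_days, gdp])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)

# Fit the residuals to a normal distribution (the @RISK error term)
residuals = sales - X @ coef
mu, sigma = residuals.mean(), residuals.std(ddof=3)  # ddof = fitted parameters

# Stochastic forecast: deterministic prediction plus simulated error
x_new = np.array([1.0, 1_050, 102])
sims = x_new @ coef + rng.normal(mu, sigma, 10_000)
print(f"forecast mean={sims.mean():,.0f}  sd={sims.std():,.0f}")
```

Adding the simulated error term back in is what restores the realistic variability of the dependent variable that a purely deterministic regression forecast discards.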

“Another extremely useful feature of @RISK is the ability to determine which distribution ‘fits’ a dataset through easy use of the Akaike Information Criterion (AIC),” says David R. “This distribution-fitting feature recommended the normal distribution in the majority of the equations after assessing the residuals. This also gave me greater confidence that the equations weren’t skewed towards over- or under-prediction.”
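AIC-based distribution selection amounts to fitting each candidate distribution by maximum likelihood and comparing 2k − 2·log-likelihood, where k is the number of fitted parameters. A small SciPy sketch of the idea (stand-in data and candidate set; not @RISK's internal fitter):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
residuals = rng.normal(0, 1, 1_000)  # stand-in for regression residuals

candidates = {"normal": stats.norm,
              "logistic": stats.logistic,
              "laplace": stats.laplace}

aic = {}
for name, dist in candidates.items():
    params = dist.fit(residuals)             # maximum-likelihood fit
    loglik = dist.logpdf(residuals, *params).sum()
    k = len(params)                          # number of fitted parameters
    aic[name] = 2 * k - 2 * loglik           # Akaike Information Criterion

best = min(aic, key=aic.get)                 # lowest AIC wins
print(best, {name: round(v, 1) for name, v in aic.items()})
```

Because every candidate here has two parameters, the AIC ranking reduces to a likelihood comparison; the penalty term matters when candidates differ in parameter count.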

The graphs produced by these models display several important pieces of information: how much each variable contributes to sales; potential growth rates in each of the input and output variables; and how much upside or downside can reasonably be expected based on weather and efficiency trends.

For David R., there are many techniques the DecisionTools Suite provides that make for improved risk assessment. “I employed multivariate regression to develop a relationship between the variables, distribution fitting to specify an expectation for weather variability based on observed volatility, Visual Basic to accelerate the computations of the resulting equations, and time-series modeling to create a plausible path for economic expectations,” he says.
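The "plausible path for economic expectations" mentioned above is a time-series simulation. One simple form it could take is a geometric random walk for a GDP index, sketched below (drift, volatility, and horizon are illustrative assumptions, not the firm's parameters):

```python
import numpy as np

rng = np.random.default_rng(3)

def gdp_paths(start=100.0, years=20, drift=0.02, vol=0.015,
              n_paths=5_000, rng=rng):
    """Simulate GDP-index paths as a geometric random walk:
    each year's growth rate is drawn independently around a drift."""
    growth = rng.normal(drift, vol, size=(n_paths, years))
    return start * np.cumprod(1 + growth, axis=1)

paths = gdp_paths()
final = paths[:, -1]
print(f"median year-20 GDP index: {np.median(final):.1f}")
```

Each simulated path can feed the GDP term of the sales regression, so economic uncertainty compounds over the planning horizon rather than being frozen at a single point forecast.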

On top of the sophisticated modeling and number-crunching the DecisionTools Suite can do, David R. notes that a huge benefit of the software is its presentation and sharing capabilities. “I find the swap-out @RISK function to be immensely useful to share the model with people who don’t have @RISK on their machine,” he says. “Another key benefit is the ease of switching between various components of the DecisionTools Suite. The resulting models look and feel like Excel, which helps when explaining the modeling process to management. Finally, the ability to save an accompanying .RSK5 file allows me to analyze and present the model results from my machine live, as running the model may consume too much time for an audience to observe.”
