Load Forecasting by Ashok Kumar, B.K. Birla Institute of Engineering & Technology, Pilani


    Load Forecasting

    Load forecasting: introduction; factors affecting load forecasting; types of load and types of load forecasting; difficulties in load forecasting; advantages and disadvantages of load forecasting.

    2012

    Hitesh Goyal, B.K. Birla Institute of Engineering & Technology, Pilani

    8/22/2012


    INTRODUCTION

    Accurate models for electric power load forecasting are essential to the operation and planning of

    a utility company. Load forecasting helps an electric utility to make important decisions

    including decisions on purchasing and generating electric power, load switching, and

    infrastructure development. Load forecasts are extremely important for energy suppliers, ISOs,

    financial institutions, and other participants in electric energy generation, transmission,

    distribution, and markets. Load forecasts can be divided into three categories: short-term

    forecasts which are usually from one hour to one week, medium forecasts which are usually from

    a week to a year, and long-term forecasts which are longer than a year. The forecasts for different

    time horizons are important for different operations within a utility company. The natures of

    these forecasts are different as well. For example, for a particular region, it is possible to predict

    the next day load with an accuracy of approximately 1-3%. However, it is impossible to predict

    the next year peak load with similar accuracy since accurate long-term weather forecasts are

    not available. For the next year peak forecast, it is possible to provide the probability distribution

    of the load based on historical weather observations. It is also possible, according to the industry

    practice, to predict the so-called weather normalized load, which would take place for average

    annual peak weather conditions or worse than average peak weather conditions for a given area.

    Weather normalized load is the load calculated for the so-called normal weather conditions

    which are the average of the weather characteristics for the peak historical loads over a certain

    period of time. The duration of this period varies from one utility to another. Most companies

    take the last 25-30 years of data. Load forecasting has always been important for planning and

    operational decisions made by utility companies. However, with the deregulation of the

    energy industries, load forecasting is even more important. With supply and demand fluctuating,

    and with weather conditions changing and energy prices increasing by a factor of ten or more

    during peak situations, load forecasting is vitally important for utilities. Short-term load

    forecasting can help to estimate load flows and to make decisions that can prevent overloading.

    Timely implementations of such decisions lead to the improvement of network reliability and to

    the reduced occurrences of equipment failures and blackouts. Load forecasting is also important

    for contract evaluations and evaluations of various sophisticated financial products on energy

    pricing offered by the market. In the deregulated economy, decisions on capital expenditures

    based on long-term forecasting are also more important than in a non-deregulated economy where

    rate increases could be justified by capital expenditure projects.


    FACTORS AFFECTING LOAD FORECASTING

    For short-term load forecasting several factors should be considered, such as time factors,

    weather data, and possible customer classes. The medium- and long-term forecasts take into

    account the historical load and weather data, the number of customers in different categories, the

    appliances in the area and their characteristics including age, the economic and demographic data

    and their forecasts, the appliance sales data, and other factors. The time factors include the time

    of the year, the day of the week, and the hour of the day. There are important differences in load

    between weekdays and weekends. The load on different weekdays can also behave differently.

    For example, Mondays and Fridays, being adjacent to weekends, may have structurally different

    loads than Tuesday through Thursday. This is particularly true during the summer time. Holidays

    are more difficult to forecast than non-holidays because of their relatively infrequent occurrence.

    Weather conditions influence the load. In fact, forecasted weather parameters are the most

    important factors in short-term load forecasts. Various weather variables could be considered for

    load forecasting. Temperature and humidity are the most commonly used load predictors. An

    electric load prediction survey published in [17] indicated that of the 22 research reports

    considered, 13 made use of temperature only, 3 made use of temperature and humidity, 3 utilized

    additional weather parameters, and 3 used only load parameters. Among the weather variables

    listed above, two composite weather variable functions, the THI (temperature-humidity index)

    and WCI (wind chill index), are broadly used by utility companies.
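
    As a concrete illustration (not taken from the report or any particular utility's practice), the sketch below computes one common form of each composite index in Python; the exact coefficients are assumptions, since several variants of the THI and WCI are in use.

```python
# Illustrative sketch: two composite weather variables often used as load-forecast inputs.
# The THI form below follows Thom's temperature-humidity index and the WCI form follows
# the 2001 NWS wind-chill formula; both are assumptions here, since utilities use several
# variants of these indices.

def thi(temp_f: float, rel_humidity: float) -> float:
    """Temperature-humidity index; temperature in deg F, relative humidity in [0, 1]."""
    return temp_f - 0.55 * (1.0 - rel_humidity) * (temp_f - 58.0)

def wci(temp_f: float, wind_mph: float) -> float:
    """Wind chill index; temperature in deg F, wind speed in mph."""
    return (35.74 + 0.6215 * temp_f
            - 35.75 * wind_mph ** 0.16
            + 0.4275 * temp_f * wind_mph ** 0.16)

if __name__ == "__main__":
    print(round(thi(90.0, 0.70), 1))   # hot, humid afternoon
    print(round(wci(20.0, 15.0), 1))   # cold, windy morning
```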

    Weather influence: - Electric load has an obvious correlation to weather. The most important

    variables responsible for load changes are:

    Dry and wet bulb temperature

    Humidity

    Wind Speed / Wind Direction

    Sky Cover

    Sunshine

    Time factors:- In the forecasting model, we should also consider time factors such as:

    The day of the week

    The hour of the day & holiday

    Customer classes:-Electric utilities usually serve different types of customers such as

    residential, commercial, and industrial. The following graphs show the load behavior in the

    above classes by showing the amount of peak load per customer, and the total energy.
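
    As a small, hypothetical sketch of how these factors become model inputs (the column names and synthetic numbers below are assumptions, not data from the report), an hourly load series can be turned into a feature table of time, weather and lagged-load variables:

```python
# Hypothetical sketch: turning the factors discussed above (time of day, day of week,
# weekday/weekend, weather) into a numeric feature table for a short-term load model.
# The synthetic data and column names are illustrative only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
idx = pd.date_range("2012-01-01", periods=24 * 28, freq="h")   # four weeks, hourly

df = pd.DataFrame({
    "load_mw": 500 + 80 * np.sin(2 * np.pi * idx.hour / 24) + rng.normal(0, 10, len(idx)),
    "temp_c": 20 + 8 * np.sin(2 * np.pi * idx.dayofyear / 365) + rng.normal(0, 2, len(idx)),
    "rel_humidity": rng.uniform(0.3, 0.9, len(idx)),
}, index=idx)

# Time factors
df["hour"] = df.index.hour
df["day_of_week"] = df.index.dayofweek            # Monday = 0
df["is_weekend"] = (df["day_of_week"] >= 5).astype(int)

# Lagged load: yesterday's and last week's value at the same hour
df["load_lag_24h"] = df["load_mw"].shift(24)
df["load_lag_168h"] = df["load_mw"].shift(168)

features = df.dropna()[["hour", "day_of_week", "is_weekend",
                        "temp_c", "rel_humidity",
                        "load_lag_24h", "load_lag_168h"]]
target = df.dropna()["load_mw"]
print(features.head())
```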


    TYPES OF LOAD

    Residential Forecast: -

    Residential Forecast Comparison: -The 2010 Forecast is higher than the 2009 Forecast over the

    entire forecast period. The differences in the residential sales forecast compared to the previous

    forecast are: up 204 GWh (1.1%) in F2011, up 340 GWh (1.7%) in F2015, up 409 GWh (1.9%)

    in F2021 and up 153 GWh (0.6%) in F2030. See Figure 6.1 below for a comparison between the

    2010 and 2009 annual residential sales forecasts. The difference in the sales forecasts is

    attributed to a higher number of accounts forecast. See Figure 6.2 for a comparison between the

    2010 and 2009 residential accounts forecasts. The ending number of accounts for F2010 was

    1,633,558. This was 9,844 accounts or 0.61% above the 2009 forecast of 1,623,714. A higher

    starting point in accounts also contributed towards the 2010 residential sales forecast being

    higher than the 2009 Forecast. The 2010 accounts forecast is based on a housing starts forecast

    which is growing faster relative to the 2009 Forecast. In the 2009 forecast, the 5, 11, and 21 year

    growth rates for number of accounts were 1.3%, 1.3%, and 1.2% respectively. In the current

    forecast, the 5, 11, and 21 year growth rates for number of accounts are 2.0%, 1.9%, and 1.7%

    respectively.

    Fig: - Comparison of Forecasts for Residential Sales before DSM and Rate Impacts (Excluding

    EVs and Adjustments for Codes and Standards) (GWh)


    Fig: - Comparison of Forecasts of Number of Residential Accounts

    Risks and Uncertainties: -

    Uncertainty in the residential sales forecast is due to uncertainty in three factors: forecast for

    number of accounts, forecast for use per account, and weather.

    Number of Accounts: -In the short-term, an error in the forecast for account growth

    would not result in a significant error in the forecast for total number of accounts. This is

    because account growth is on average 1.7% per year, so in the first year, an error of 1%

    in the forecast for account growth would contribute an error of about 0.017% to the

    forecast for total number of accounts. However, in the long-term, there is a risk from the

    cumulative effect of errors in the forecast for account growth.

    Use per Account: Most of the risk in the residential forecast is due to the forecast for use

    per account. This is due to two reasons:

    Unlike the forecast for account growth, an error of 1% in the forecast for use per account

    in any year would contribute an error of 1% to the forecast for residential sales for that

    year.

    The forecast for use per account is the net result of many conflicting forces. Some of the

    forces working to increase use rate are:

    Increases in home sizes

    If natural gas prices increase faster than electricity prices

    Increases in electric space heating share

    Increases in real disposable income

    Increases in saturation levels of appliances


    Commercial Forecast: -

    Commercial Forecast Comparison: - Table 7.1 provides a summary of historical and forecast sales by

    the four regions before DSM and rate impacts. The forecast sales in Table 7.1 below and the

    comparison to the 2009 Forecast exclude adjustments for electric vehicles (EVs) and adjustments for

    load forecast/DSM integration for codes and standards. The impact of EVs and of codes and standards on

    the commercial forecast is shown in Appendix 4 and 5. When compared with the 2009 Forecast,

    the 2010 commercial sales forecast before DSM and before rate impacts is 93 GWh higher

    (0.6%) for F2011, 1,028 GWh higher (5.8%) for F2015, 1,159 GWh higher (5.9%) for F2021

    and 560 GWh higher (2.4%) for F2030. Commercial distribution sales, in the short and medium

    term, are very close to last year's forecast. Over the long term, the current commercial

    distribution forecast is lower than last year's forecast mainly due to revised assumptions of

    average stock efficiency.

    Fig: -Comparison of Forecasts for Commercial Billed Sales before DSM and Rate Impacts

    (GWh)

    Risk and Uncertainties: - Commercial sales models are dependent on the outcome of the

    regional economic drivers and efficiency projections. In the SAE model, a rolling 10 year

    average of heating degree days and cooling degree days is used to calculate the normalized

    heating and cooling variables. Total commercial sales are not as sensitive to weather as

    compared to residential sales. As such, variation in temperatures from normal weather is less of a

    risk in the commercial forecast. There is some uncertainty around any individual large project

    completing all requirements before it begins to take electrical service. As such, delays or

    cancellations of major projects will have an impact.


    Factors Leading to Lower than Forecast Commercial Sales: -

    The pine beetle infestation will cause forestry employment to decline in the long term.

    This might impact regional commercial sales.

    Improved equipment efficiency.

    The aging provincial population will decrease future employment growth.

    A high value of the Canadian dollar over a sustained period will tend to slow US-based

    tourism and exports.

    Factors Leading to Higher than Forecast Commercial Sales: -

    A robust economic recovery and tourism activity that would create additional demands

    for commercial services.

    Lowering of interest rates may encourage consumer expenditures.

    Substantially warmer summers (increasing air conditioning loads) or colder winters

    (increasing heating loads) relative to historical patterns.

    Industrial Forecast

    Industrial Forecast Comparison: - The figure below shows the 2009 Forecast (before DSM and rate impacts)

    compared to the 2010 Forecast. In the first three years, the 2010 Forecast is lower because of the

    forestry sector. Forestry sales were reduced in the 2010 Forecast mainly as a result of the permanent

    closure of a pulp and paper mill announced in 2010.

    Fig: - Comparison of Forecasts for Industrial Billed Sales before DSM And Rate Impacts


    From F2014 to F2023, the 2010 sales forecast is above the 2009 Forecast. The increase is due to

    expected growth in the mining and oil and gas sectors; a key forecast driver of the expected

    growth is the recent strong Asian economic recovery, which has improved expectations for

    mining in B.C.'s Northwest and oil and gas in B.C.'s northeast. After F2023, total sales are

    roughly the same as the F2009 forecast because changes between the two forecasts for mining

    and oil and gas are offset by a reduction in the forestry sector sales. The following sections

    describe the outlook, drivers, projections for the major industrial sub-sectors and a comparison to

    the 2009 Forecast. Unless otherwise stated in the following sections, the comparison between the

    2010 and the 2009 Forecast is on a transmission sales basis only.

    TYPES OF LOAD FORECASTING

    SHORT TERM LOAD FORECASTING:-

    STLF (short-term load forecasting) plays an important role in the operation of electric power

    systems, especially in a deregulated power market, where many systems are being pushed into a

    stressed situation close to their security margin, so system security is of greater concern in the

    electricity industry. It is urgent to take operational planning seriously, based on accurate and

    effective short-term load forecasting, to enforce system security. The aim of short-term load

    forecasting is to predict future electricity demands, based usually on historical data and other

    factors, such as weather conditions. In general the techniques for STLF have two approaches;

    they are regression analysis and time series analysis, which only take historical data. In this

    paper, we consider a 48-point-per-day electricity demand data series. In view of these, we will

    employ the prediction techniques related to time series.


    Characteristics of the Power System Load

    The system load is the sum of all the consumers' loads at the same time. The objective of system

    STLF is to forecast the future system load. A good understanding of the system characteristics

    helps to design reasonable forecasting models and select appropriate models operating in

    different situations. The various factors that influence the system load behavior can be classified

    into the following major categories:

    Weather

    Time

    Economy

    Random disturbance

    The effects of all these factors are introduced in the remaining part of this section to provide a

    basic understanding of the load characteristics.

    Weather

    Weather factors include temperature, humidity, precipitation, wind speed, cloud cover, light

    intensity, etc. A change in the weather changes consumers' comfort feeling

    and in turn the usage of some appliances such as space heaters, water heaters and air conditioners.

    Weather-sensitive load also includes appliances for agricultural irrigation, due to the need to

    irrigate cultivated plants. In the areas where summer and winter have great meteorological

    difference, the load patterns differ greatly. Normally the intraday temperatures are the most

    important weather variables in terms of their effects on the load; hence they are often selected as

    the independent variables in STLF.


    Temperatures of the previous days also affect the load profile. For example, continuous high

    temperature days might lead to heat buildup and in turn a new demand peak. Humidity is also

    an important factor, because it affects human beings' comfort feeling greatly. People feel

    hotter in an environment of 35 °C with 70% relative humidity than in an environment of 37 °C

    with 50% relative humidity. That's why the temperature-humidity index (THI) is sometimes

    employed as an affecting factor of load forecasting. Furthermore, wind chill index (WCI) is

    another factor that measures the cold feeling. It is a meaningful topic to select the appropriate

    weather variables as the inputs of STLF.

    Time

    Time factors influencing the load include the time point of the day, holiday, weekday/weekend property

    and season property. From the observation of the load curves it can be seen that there are certain

    rules of the load variation with the time point of the day. For example, the typical load curve of

    the normal winter weekdays (from Monday to Saturday) of the Orissa Grid is shown in Fig. 2.1,

    with the sample interval of 1 hour, i.e. there are altogether 24 sample points in one day.

    Fig. 2.1. 24 Hour Load Profile of Orissa Grid for a week of December-2006

    Typically load is low and stable from 0:00 to 6:00; it rises from around 6:00 to 9:00 and then

    becomes flat again until around 12:00; then it descends gradually until 17:00; thereafter it rises

    again until 19:00; it descends again until the end of the day. Actually this load variation with

    time reflects people's daily routine: working time, leisure time and sleeping time.


    There are also some other rules of load variation with time. The weekend or holiday load curve is

    lower than the weekday curve, due to the decrease of working load. Shifts to and from daylight

    saving time and the start of the school year also contribute to significant changes from the previous

    load profiles.

    Periodicity is another property of the load curve. There is very strong daily, weekly, seasonal and

    yearly periodicity in the load data. Making good use of this property can benefit the load

    forecasting result.

    Economy

    Electricity is a kind of commodity. The economic situation also influences the utilization of this

    commodity. Economic factors, such as the degree of industrialization, price of electricity and

    load management policy have significant impacts on the system load growth/decline trend. With

    the development of modern electricity markets, the relationship between electricity price and

    load profile is even stronger. Although time-of-use pricing and demand side management had

    arrived before deregulation, the volatility of spot markets and incentives for consumers to adjust

    loads are potentially of a much greater magnitude. At low prices, elasticity is still negligible, but

    at times of extreme conditions, price-induced rationing is a much more likely scenario in a

    deregulated market compared to that under central planning.

    Random Disturbance

    The modern power system is composed of numerous electricity users. Although it is not possible

    to predict how each individual user consumes the energy, the total load of all the

    small users shows good statistical regularity and in turn leads to smooth load curves. This is the

    groundwork of the load forecasting work. But the startup and shutdown of large loads, such

    as steel mills, synchrotrons and wind tunnels, always lead to an obvious impulse in the load curve.

    This is a random disturbance, since for the dispatchers, the startup and shutdown time of these

    users is quite random, i.e. there is no obvious rule of when and how they get power from the

    grid. When the data from such a load curve are used in load forecasting training, the impulse

    component of the load adds to the difficulty of load forecasting. Special events, which are known

    in advance but whose effect on load is not quite certain, are another source of random

    disturbance. A typical special event is, for example, a World Cup football match, which the

    dispatchers know for sure will cause increased television usage, but whose magnitude they cannot

    precisely estimate. Other typical events include strikes and the government's compulsory

    demand-side management due to forecasted electricity shortage.

    Classification of Developed STLF Methods

    In terms of lead time, load forecasting is divided into four categories:

    Long-term forecasting with the lead time of more than one year

    Mid-term forecasting with the lead time of one week to one year

    Short-term load forecasting with the lead time of 1 to 168 hours

    Very short-term load forecasting with the lead time shorter than one hour

    Different categories of forecasting serve different purposes. In this thesis the focus is on short-term

    load forecasting, which serves the next day(s) unit commitment and reliability analysis.

    The research approaches of short-term load forecasting can be mainly divided into two

    categories: statistical methods and artificial intelligence methods [1]. In statistical methods,

    equations can be obtained showing the relationship between load and its related factors after

    training on the historical data, while artificial intelligence methods try to imitate human beings' way

    of thinking and reasoning to get knowledge from the past experience and forecast the future load.

    The statistical category includes multiple linear regression [2], stochastic time series [3], general

    exponential smoothing [4], state space [5], etc. Recently support vector regression (SVR) [6, 7],

    which is a very promising statistical learning method, has also been applied to short-term load

    forecasting and has shown good results. Usually statistical methods can predict the load curve of

    ordinary days very well, but they lack the ability to analyze the load property of holidays and

    other anomalous days, due to the inflexibility of their structure. Expert system [8], artificial

    neural network (ANN) [9], fuzzy inference [10], and evolutionary algorithm belong to the

    computational intelligence category. Expert systems try to get the knowledge of experienced

    operators and express it in if-then rules, but the difficulty is that sometimes the experts'

    knowledge is intuitive and cannot easily be expressed. An artificial neural network doesn't need

    the expression of the human experience and aims to establish a network between the input data


    set and the observed outputs. It is good at dealing with the nonlinear relationship between the

    load and its related factors, but the shortcoming lies in overfitting and long training times. Fuzzy

    inference is an extension of expert systems. It constructs an optimal structure of the simplified

    fuzzy inference that minimizes model errors and the number of membership functions in order to

    grasp the nonlinear behavior of short-term loads, yet it still needs the experts' experience to

    generate the fuzzy rules. Evolutionary algorithms have been proved to be very useful in

    multiobjective function optimization; this aspect is used to train the neural network to obtain

    better results. Generally computational intelligence

    methods are flexible in finding the relationship between load and its relative factors, especially

    for anomalous load forecasting. Some main STLF methods are introduced as follows:

    Regression Methods

    Regression is one of the most widely used statistical techniques. For load forecasting, regression

    methods are usually employed to model the relationship of load consumption and other factors

    such as weather, day type and customer class. Engle et al. [11] presented several regression

    models for the next day load forecasting. Their models incorporate deterministic influences such

    as holidays, stochastic influences such as average loads, and exogenous influences such as

    weather. [12 - 15] describe other applications of regression models applied to load forecasting.
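
    A minimal sketch of this kind of regression model, under stated assumptions (synthetic data, and forecast temperature plus a weekend dummy as the only regressors), is shown below; it is not the specific model of Engle et al.:

```python
# Minimal sketch of a regression-based forecaster in the spirit described above:
# next-day peak load regressed on forecast temperature and a weekend dummy.
# The data are synthetic and the chosen regressors are assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_days = 200
temp = rng.uniform(15, 35, n_days)                       # forecast daily max temperature
is_weekend = (np.arange(n_days) % 7 >= 5).astype(float)
peak_load = 900 + 12 * temp - 60 * is_weekend + rng.normal(0, 20, n_days)

# Design matrix [1, temperature, weekend dummy]; ordinary least squares fit
X = np.column_stack([np.ones(n_days), temp, is_weekend])
coef, *_ = np.linalg.lstsq(X, peak_load, rcond=None)

# Forecast tomorrow's peak for a 30 degC weekday
x_new = np.array([1.0, 30.0, 0.0])
print("coefficients:", np.round(coef, 2))
print("forecast peak load:", round(float(x_new @ coef), 1), "MW")
```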

    Time Series

    Time series methods are based on the assumption that the data have an internal structure, such as

    autocorrelation, trend or seasonal variation. The methods detect and explore such a structure.

    Time series have been used for decades in such fields as economics, digital signal processing, as

    well as electric load forecasting. In particular, ARMA (autoregressive moving average), ARIMA

    (autoregressive integrated moving average) and ARIMAX (autoregressive integrated moving

    average with exogenous variables) are the most often used classical time series methods. ARMA

    models are usually used for stationary processes while ARIMA is an extension of ARMA to non

    stationary processes. ARMA and ARIMA use the time and load as the only input parameters.

    Since load generally depends on the weather and time of the day, ARIMAX is the most natural

    tool for load forecasting among the classical time series models.

    Fan and McDonald [16] and Cho et al. [17] described implementations of ARIMAX models for

    load forecasting. Yang et al. [18] used an evolutionary programming (EP) approach to identify

    the ARMAX model parameters for one day to one week ahead hourly-load-demand forecasting.

    The evolutionary programming is a method for simulating evolution and constitutes a stochastic

    optimization algorithm. Yang and Huang [19] proposed a fuzzy autoregressive moving average

    with exogenous input variables (FARMAX) for one day ahead hourly load forecasting.
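
    The sketch below illustrates an ARIMAX-style fit using the SARIMAX class from the statsmodels package (assumed to be installed); the model orders, the synthetic hourly data and the use of temperature as the only exogenous variable are illustrative assumptions rather than the models of the cited papers.

```python
# Hedged sketch of an ARIMAX-style model via statsmodels' SARIMAX: hourly load with
# temperature as an exogenous regressor. Orders and data are illustrative only.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(2)
idx = pd.date_range("2012-06-01", periods=24 * 21, freq="h")        # three weeks
temp = 25 + 6 * np.sin(2 * np.pi * (idx.hour - 15) / 24) + rng.normal(0, 1, len(idx))
load = 400 + 5 * temp + 40 * np.sin(2 * np.pi * idx.hour / 24) + rng.normal(0, 5, len(idx))

endog = pd.Series(load, index=idx)
exog = pd.Series(temp, index=idx, name="temp")

# The ARMA part handles autocorrelation; the seasonal part captures the daily cycle
model = SARIMAX(endog, exog=exog, order=(1, 0, 1), seasonal_order=(1, 1, 0, 24))
result = model.fit(disp=False)

# A one-day-ahead forecast needs a temperature forecast for the same 24 hours
future_temp = exog.iloc[-24:].to_numpy().reshape(-1, 1)
forecast = result.forecast(steps=24, exog=future_temp)
print(forecast.round(1).head())
```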


    Neural Networks

    The use of artificial neural networks (ANN or simply NN) has been a widely studied load

    forecasting technique since 1990. Neural networks are essentially non-linear circuits that have

    the demonstrated capability to do non-linear curve fitting. The outputs of an artificial neural

    network are some linear or non-linear mathematical function of its inputs. The inputs may be the

    outputs of other network elements as well as actual network inputs. In practice network elements

    are arranged in a relatively small number of connected layers of elements between network

    inputs and outputs. Feedback paths are sometimes used. In applying a neural network to load

    forecasting, one must select one of a number of architectures (e.g. Hopfield, back propagation,

    Boltzmann machine), the number and connectivity of layers and elements, use of bi-directional

    or uni-directional links and the number format (e.g. binary or continuous) to be used by inputs

    and outputs. The most popular artificial neural network architecture for load forecasting is back

    propagation. This network uses continuously valued functions and supervised learning. That is,

    under supervised learning, the actual numerical weights assigned to element inputs are

    determined by matching historical data (such as time and weather) to desired outputs (such

    as historical loads) in a pre-operational training session. Artificial neural networks with

    unsupervised learning do not require pre-operational training.

    One study developed an ANN-based short-term load forecasting model for the Energy Control Center of

    the Greek Public Power Corporation. In the development they used a fully connected three-layer

    feed forward ANN and a back propagation algorithm was used for training. Input variables

    include historical hourly load data, temperature, and the day of week. The model can forecast

    load profiles from one to seven days. Also Papalexopoulos et al. developed and implemented a

    multi-layered feed forward ANN for short-term system load forecasting. In the model three types

    of variables are used as inputs to the neural networks: seasonal related inputs, weather related

    inputs, and historical loads. Khotanzad et al. described a load forecasting system known as

    ANNSTLF. It is based on a multiple-ANN strategy that captures various trends in the data. In the

    development they used a multilayer perceptron trained with an error back-propagation algorithm.

    ANNSTLF can consider the effect of temperature and relative humidity on the load. It also

    contains forecasters that can generate the hourly temperature and relative humidity forecasts

    needed by the system. An improvement of the above system was described later. In the new

    generation, ANNSTLF includes two ANN forecasters: one predicts the base load and the other

    forecasts the change in load. The final forecast is computed by adaptive combination of these

    forecasts. The effect of humidity and wind speed are considered through a linear transformation

    of temperature. At the time it was reported, ANNSTLF was being used by 35 utilities across

    the USA and Canada. Another study developed a three-layer fully connected feed-forward neural network

    and a back propagation algorithm was used as the training method. Their ANN, though, considers

    electricity price as one of the main characteristics of the system load. Many published studies use

    artificial neural networks in conjunction with other forecasting techniques such as time series

    and fuzzy logic. A recently developed and published recurrent neural network is a

    good bet for the purpose too. The same was applied to STLF and interesting results were found.
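
    As an illustrative sketch only (not the ANNSTLF system or any of the published models above), a small feed-forward network trained by back-propagation can be set up with scikit-learn's MLPRegressor; the synthetic data, layer sizes and features are assumptions:

```python
# Illustrative sketch: a small feed-forward network mapping time and weather inputs
# to the hourly load. Data, features and layer sizes are assumptions for demonstration.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
n = 24 * 60                                                # 60 days of hourly samples
hour = np.tile(np.arange(24), n // 24)
dow = np.repeat(np.arange(n // 24) % 7, 24)
temp = 20 + 8 * np.sin(2 * np.pi * (hour - 15) / 24) + rng.normal(0, 1, n)
load = 300 + 6 * temp + 50 * np.sin(2 * np.pi * hour / 24) - 25 * (dow >= 5) + rng.normal(0, 8, n)

X = np.column_stack([hour, dow, temp])
split = 24 * 53                                            # last week held out
X_train, X_test, y_train, y_test = X[:split], X[split:], load[:split], load[split:]

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), activation="relu",
                 max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)
print("test MAPE: %.2f%%" % (100 * np.mean(np.abs(model.predict(X_test) - y_test) / y_test)))
```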


    Similar Day Approach

    This approach [28] is based on searching historical data for days within one, two or three years

    with similar characteristics to the forecast day. Similar characteristics include weather, day of the

    week and the date. The load of a similar day is considered as a forecast. Instead of a single

    similar day load, the forecast can be a linear combination or regression procedure that can

    include several similar days. The trend coefficients can be used for similar days in the previous

    years.
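
    A minimal sketch of the similar-day idea, under the assumption that days are matched by weekday type and daily temperature only, is given below; real implementations typically use richer weather and calendar similarity measures:

```python
# Minimal sketch of the similar-day approach: historical days are compared to the
# forecast day by weekday type and daily temperature, and the loads of the closest
# matches are averaged. The data and similarity measure are assumptions.
import numpy as np

rng = np.random.default_rng(4)
n_days = 365
day_of_week = np.arange(n_days) % 7
daily_temp = 20 + 10 * np.sin(2 * np.pi * np.arange(n_days) / 365) + rng.normal(0, 2, n_days)
daily_load = 800 + 15 * daily_temp - 70 * (day_of_week >= 5) + rng.normal(0, 25, n_days)  # MW

def similar_day_forecast(target_dow, target_temp, k=5):
    """Average load of the k historical days closest in temperature, same weekday type."""
    same_type = (day_of_week >= 5) == (target_dow >= 5)
    candidates = np.where(same_type)[0]
    distance = np.abs(daily_temp[candidates] - target_temp)
    nearest = candidates[np.argsort(distance)[:k]]
    return daily_load[nearest].mean()

print(round(similar_day_forecast(target_dow=2, target_temp=28.0), 1), "MW")
```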

    Expert Systems

    Rule-based forecasting makes use of rules, which are often heuristic in nature, to do accurate

    forecasting. Expert systems incorporate rules and procedures used by human experts in the field of

    interest into software that is then able to automatically make forecasts without human assistance.

    Ho et al. proposed a knowledge-based expert system for the short-term load forecasting of the

    Taiwan power system. Operators' knowledge and the hourly observations of system load over the

    past five years are employed to establish eleven day types. Weather parameters were also

    considered. Rahman and Hazim [30] developed a site-independent technique for short-term load

    forecasting. Knowledge about the load and the factors affecting it is extracted and represented in

    a parameterized rule base. This rule-based system is complemented by a parameter database that

    varies from site to site. The technique is tested in different sites in the United States with low

    forecasting errors. The load model, the rules and the parameters presented in the paper have been

    designed using no specific knowledge about any particular site. Results improve if operators at a

    particular site are consulted.

    Fuzzy Logic

    Fuzzy logic is a generalization of the usual Boolean logic used for digital circuit design. An input

    under Boolean logic takes on a value of True or False. Under fuzzy logic an input is

    associated with certain qualitative ranges. For instance the temperature of a day may be low,

    medium or high. Fuzzy logic allows one to logically deduce outputs from fuzzy inputs. In

    this sense fuzzy logic is one of a number of techniques for mapping inputs to outputs.

    Among the advantages of the use of fuzzy logic are the absence of a need for a mathematical

    model mapping inputs to outputs and the absence of a need for precise inputs. With such generic

    conditioning rules, properly designed fuzzy logic systems can be very robust when used for

    forecasting. Of course in many situations an exact output is needed. After the logical processing

    of fuzzy inputs, a defuzzification step can be used to produce such a precise output.
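
    The toy sketch below illustrates the fuzzify-infer-defuzzify steps described above; the membership functions, rules and load levels are invented for illustration and are not a real utility rule base:

```python
# Toy sketch of fuzzy-logic forecasting (assumed rules): temperature is mapped to
# low/medium/high memberships, each rule suggests a load level, and a weighted-average
# defuzzification returns a crisp forecast.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_load_forecast(temp_c):
    # Fuzzify the input temperature
    mu = {
        "low": tri(temp_c, -10, 5, 18),
        "medium": tri(temp_c, 10, 20, 30),
        "high": tri(temp_c, 25, 35, 45),
    }
    # Each rule maps a temperature class to a representative load level (MW, assumed)
    rule_output = {"low": 950, "medium": 700, "high": 1050}   # heating / mild / cooling
    weights = np.array([mu[k] for k in rule_output])
    outputs = np.array([rule_output[k] for k in rule_output])
    # Weighted-average (centroid-style) defuzzification
    return float((weights * outputs).sum() / weights.sum())

print(round(fuzzy_load_forecast(28.0), 1), "MW")
```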

    Data mining

    Data mining is the process that explores data in a large database to discover rules,

    knowledge, etc. Hiroyuki Mori et al. proposed a data mining method for discovering STLF rules.

    The method is based on a hybrid technique of an optimal regression tree and an artificial neural

    network. It classifies the load range into several classes, and decides which class the forecasted

    load belongs to according to the classification rules. Then a multilayer perceptron (MLP) is used to

    train the sample in every class. The paper puts an emphasis on clarifying the nonlinear

    relationship between input and output variables in a prediction model.
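
    The following hedged sketch mimics that two-stage idea with scikit-learn (a decision tree for the load-range classification rules, then one MLP per class); the bin edges, features and synthetic data are assumptions, not the method of the cited paper:

```python
# Hedged sketch of the hybrid idea described above: a decision tree first assigns each
# hour to a load-range class, then a separate MLP regressor is trained per class.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
n = 24 * 90
hour = np.tile(np.arange(24), n // 24)
temp = 22 + 7 * np.sin(2 * np.pi * (hour - 15) / 24) + rng.normal(0, 1.5, n)
load = 350 + 8 * temp + 60 * np.sin(2 * np.pi * hour / 24) + rng.normal(0, 10, n)
X = np.column_stack([hour, temp])

# Step 1: classification rules for the load range (here: three quantile bins)
bins = np.quantile(load, [1 / 3, 2 / 3])
load_class = np.digitize(load, bins)
classifier = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, load_class)

# Step 2: one MLP regressor per load class
regressors = {}
for c in np.unique(load_class):
    mask = load_class == c
    regressors[c] = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                                 random_state=0).fit(X[mask], load[mask])

# Forecast: classify the new point first, then use that class's regressor
x_new = np.array([[18, 27.0]])
c_new = int(classifier.predict(x_new)[0])
print("class:", c_new, "forecast:", round(float(regressors[c_new].predict(x_new)[0]), 1), "MW")
```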

    Wavelets

    A STLF model of wavelet-based networks is proposed to model the highly nonlinear, dynamic

    behavior of the system loads and to improve the performance of traditional ANNs. The three-

    layer networks of the wavelet, the weighting, and the summing nodes are built by an

    evolutionary computing algorithm. Basically, the first layer of wavelet nodes decomposes the

    input signals into diverse scales of signals, to which different weighting values are given by the

    second layer of weighting nodes. Finally the third layer of summing nodes combines the

    weighted scales of signals into the output. In the evolutionary computing constructive algorithm,

    the parameters to be tuned in the networks are compiled into a population of vectors. The

    populations are evolved according to the stochastic procedure of the offspring creation, the

    competition of the individuals, and the mutation. To investigate the performance of the proposed

    evolving wavelet-based networks on load forecasting, the practical load and weather data for the

    Taiwan power systems were employed. Used as a reference for determining the input variables

    of the networks, a statistical analysis of correlation functions between the historical load and

    weather variables was conducted a priori. For comparison, the existing ANNs approach for the

    STLF, using a back propagation training algorithm, was adopted. The comparison shows

    wavelet-based ANN forecasting has a more accurate forecasting result and faster speed.

    Evolutionary Algorithms

    Evolutionary algorithms like genetic algorithm (GA) [38 - 43], particle swarm optimization

    (PSO) [44 - 46], artificial immune system (AIS) [47], and ant colony optimization (ACO) [48]

    have been used for training neural networks in short term load forecasting applications. These

    algorithms are better than back-propagation in convergence and search space capability.

    Requirements of the STLF Process

    In nearly all the energy management systems of the modern control centers, there is a short-term

    load forecasting module. A good STLF system should fulfill the requirement of accuracy, fast

    speed, automatic bad data detection, friendly interface, automatic data access and automatic

    forecasting result generation.

    Accuracy

    The most important requirement of the STLF process is its prediction accuracy. As discussed earlier,

    good accuracy is the basis of economic dispatch, system reliability and electricity markets. The

    main goal of most of the STLF literature, and also of this thesis, is to make the forecasting result as

    accurate as possible.

    Fast Speed

    Employment of the latest historical data and weather forecast data helps to increase the accuracy.

    When the deadline of the forecasted result is fixed, the longer the runtime of the STLF program

    is, the earlier historical data and weather forecast data can be employed by the program.

    Therefore the speed of the forecasting is a basic requirement of the forecasting program.

    Programs with too long training time should be abandoned and new techniques shortening the

    training time should be employed. Normally the basic requirement is that a 24-hour forecast

    should take less than 20 minutes.

    Automatic Bad Data Detection

    In the modern power systems, the measurement devices are located over the system and the

    measured data are transferred to the control centre by communication lines. Due to the sporadic

    failure of measurement or communication, sometimes the load data that arrive in the dispatch

    centre are wrong, but they are still recorded in the historical database. In the early days, the STLF

    systems relied on the power system operators to identify and get rid of the bad data. The new

    trend is to let the system itself do this instead of the operators, to decrease their work burden and

    to increase the detection rate.

    Friendly Interface

    The interface of the load forecasting should be easy, convenient and practical. The users can

    easily define what they want to forecast, whether through graphics or tables. The output should

    also be available in graphical and numerical formats, so that the users can access it easily.

    Automatic Data Access

    The historical load, weather and other load-relevant data are stored in the database. The STLF

    system should be able to access it automatically and get the needed data. It should also be able to

    get the forecasted weather automatically online, through the Internet or through specific

    communication lines. This helps to decrease the burden of the dispatchers.

    Automatic Forecasting Result Generation

    To reduce the risk of individual imprecise forecasting, several models are often included in one

    STLF system. In the past such a system always needed the operator's intervention. In other words,

    the operators had to decide a weight for every model to get the combined outcome. To be

    more convenient, the system should generate the final forecasting result according to the

    forecasting behavior of the historical days.


    Portability

    Different power systems have different properties of load profiles. Therefore a normal STLF

    software application is only suitable for the area for which it has been developed. If a general

    STLF software application, which is portable from one grid to another, can be developed, the

    effort of developing different software for different areas can be greatly saved. This is a very

    high-level requirement for the load forecasting, which has not been well realized up until today.

    Difficulties in the STLF

    Several difficulties exist in short-term load forecasting. This section introduces them separately.

    Precise Hypothesis of the Input-output Relationship

    Most of the STLF methods hypothesize a regression function (or a network structure, similar to

    ANN) to represent the relationship between the input and output variables. How to hypothesize

    the regression form or the network structure is a major difficulty because it needs detailed a priori

    knowledge of the problem. If the regression form or the network structure were improperly

    selected, the prediction result would be unsatisfactory. For example, when a problem itself is

    quadratic, the prediction result will be very poor if a linear input-output relationship is assumed.

    Another similar problem is parameter selection: not only the form of the regression function (or

    the network structure), but also the parameters of it should be well selected to get a good

    prediction. Moreover, it is always difficult to select the input variables. Too many or too few

    input variables would decrease the accuracy of prediction. It should be decided which variables

    are influential and which are trivial for a certain situation. Trivial ones that do not affect the load

    behavior should be abandoned.

    Because it is hard to represent the input-output relationship in one function, the pattern recognition

    tool, clustering, has been introduced to STLF [49]. It divides the sample data into several

    clusters. Each cluster has a unique function or network structure to represent the input and

    output relationship. This method tends to have better forecasting results because it reveals the

    system property more precisely. But a priori knowledge is still required to do the clustering and

    determine the regression form (or network structure) for every cluster.

    Generalization of Experts' Experience

    Many experienced working staff in power grids are good at manual load forecasting. They are

    often even better than computer forecasting. So it is very natural to use expert systems and

    fuzzy inference for load forecasting. But transforming the experts' experience into a rule database

    is a difficult task, since the experts' forecasting is often intuitive.

    The Forecasting of Anomalous Days

    Loads of anomalous days are also not easy to predict precisely, due to the dissimilar load

    behavior compared with those of ordinary days during the year, as well as the lack of sufficient

    samples. These days include public holidays, consecutive holidays, days preceding and following


    the holidays, days with extreme weather or sudden weather change and special event days.

    Although the sample number can be greatly enhanced by including days that are far away

    from the target day, e.g. the past 5 years' historical data can be employed rather than only one or

    two years, the load growth through the years might lead to dissimilarity of two sample days.

    From the experimental results it is found that days with sudden weather change are extremely

    hard to forecast. This sort of day has two kinds of properties: the property of the previous

    neighboring days and the property of the previous similar days. How to combine these two

    properties is a challenging task.

    Inaccurate or Incomplete Forecasted Weather Data

    As weather is a key factor that influences the forecasting result, it is employed in many models.

    Although the technique of weather forecasting, like the load forecasting, has been improved in

    the past several decades, sometimes it is still not accurate enough. The inaccurate weather report

    data employed in the STLF would cause large errors. Another problem is that sometimes the detailed

    forecasted weather data cannot be provided. The normal one day ahead weather report

    information includes highest temperature, lowest temperature, average humidity, precipitation

    probability, maximum wind speed of the day, and the weather conditions for three periods of the day

    (morning, afternoon and evening). Usually the number of the load forecasting points in a day is

    96. If the forecasted weather data of these points can be known in advance, it would greatly

    increase the precision. However, normal weather reports do not provide such detailed

    information, especially when the lead time is long. This is a bottleneck of load forecasting.

    Less Generalization Ability Caused by Overfitting

    Overfitting is a technical problem that needs to be solved for load forecasting. Load forecasting

    is basically a training and predicting problem, which is related to two datasets: training data

    and testing data. The proposed model is trained on the historical training data; a basic

    representation is obtained and in turn used to predict the testing data. For the resulting

    trained model, if the error for the training data is low but the error for the testing data

    is high, overfitting is said to have occurred. A significant disadvantage of neural networks is

    overfitting; they show perfect performance for training data prediction but much poorer

    performance for future data prediction. Since the goal of STLF is to predict the future

    unknown data, technical solutions should be applied to avoid overfitting.
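
    A small sketch of this training-data/testing-data check is shown below; comparing the two errors of the same (deliberately oversized, assumed) network makes the overfitting gap visible:

```python
# Small sketch of the train/test check described above: a large gap between the two
# errors signals overfitting. Data and model settings are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(6)
temp = rng.uniform(10, 35, 300)
load = 600 + 10 * temp + rng.normal(0, 30, 300)
X = temp.reshape(-1, 1)

X_train, X_test, y_train, y_test = train_test_split(X, load, test_size=0.25, random_state=0)

# Deliberately oversized network to make the effect visible
model = MLPRegressor(hidden_layer_sizes=(200, 200), max_iter=5000, random_state=0)
model.fit(X_train, y_train)

train_err = mean_absolute_percentage_error(y_train, model.predict(X_train))
test_err = mean_absolute_percentage_error(y_test, model.predict(X_test))
print("train MAPE: %.2f%%  test MAPE: %.2f%%" % (100 * train_err, 100 * test_err))
# A much lower training error than testing error indicates overfitting.
```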


    MID TERM LOAD FORECASTING

    For electric utilities it is important to have accurate load forecasting for different time periods.

    With the deregulation of the energy industries, load forecasting is even more important

    especially for dispatchers, who can make better decisions and comply with them. Thus, electric

    utilities reduce occurrences of equipment failures and blackouts. Forecasting depending on a

    time period can be generally divided into three types: long term, medium term and short term.

    Each type has an important role.

    Medium-term forecasting must take into account seasonal patterns (such as the July average load

    being larger than February's).


    VARIOUS METHODS OF MID TERM LOAD FORECASTING:-

    Time series prediction:

    -Predicting time series is of great importance in many domains of science and engineering, such

    as financial, electricity, environment and ecology. A time series is a sequence of observations

    made through time, in the form of vectors or scalars [5]. Time series prediction can be considered

    as a problem of a model formation that establishes a mapping between the input and output

    values. After such a model is formed, it can be used to predict the future values based on the

    previous and current values. The previous and the current values of the time series are used as

    inputs for the prediction model:

    y(t+h) = F(y(t), y(t-1), ..., y(t-m+1))  (1)

    where h represents the number of steps ahead to be predicted, F is the prediction model and m is

    the size of the regressor. Time series prediction can be divided into two categories depending on prediction

    time period: short term and long term [6]. Short term prediction is related to one step ahead

    prediction. The goal of long term prediction is to predict values for several steps ahead. In long

    term prediction propagation of errors and the deficiency of information occur, which makes the

    prediction more difficult. For long-term prediction there are two different approaches that can be

    used: direct and recursive. In following section, an approach for recursive prediction is

    presented.

    Recursive prediction strategy: -

    Recursive prediction strategy uses the predicted values as known data to predict the next ones

    [7]. The model can be constructed by making one-step ahead prediction:

    y(t+1) = F(y(t), y(t-1), ..., y(t-m+1))  (2)

    The regressor of the model is defined as the vector of inputs

    y(t), y(t-1), ..., y(t-m+1),

    where m is the size of the regressor. To predict the next value, the same model is used:

    y(t+2) = F(y(t+1), y(t), ..., y(t-m+2))  (3)

    and, in general, for the h-step-ahead prediction:

    y(t+h) = F(y(t+h-1), y(t+h-2), ..., y(t-m+h))  (4)

    It is important to notice in (3) that the predicted value of y(t+1) is used instead of the true value,

    which is unknown. Then the h steps-ahead predictions, from y(t+2) to y(t+h), are computed

    iteratively. Thus, when the regressor length m is larger than h, there are m-h real data points in

    the regressor to predict the step. But when h becomes larger than m, all the inputs are the

    predicted values. Usage of the predicted values as inputs affects the accuracy of the prediction

    in cases when h significantly exceeds m.
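
    A minimal sketch of the recursive strategy of equations (2)-(4) is given below; the linear one-step model and the synthetic monthly series are assumptions chosen only to show how predicted values are fed back as inputs:

```python
# Minimal sketch of the recursive strategy: a one-step model F is fitted on lagged
# values, then its own predictions are fed back to reach h steps ahead.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
t = np.arange(120)                                          # ten years of monthly loads
y = 500 + 0.8 * t + 60 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 10, len(t))

m = 12                                                      # regressor size
X = np.column_stack([y[i:len(y) - m + i] for i in range(m)])    # rows: [y(t-m+1) ... y(t)]
target = y[m:]
F = LinearRegression().fit(X, target)                       # one-step-ahead model, eq. (2)

def recursive_forecast(history, h):
    """Apply equations (3)-(4): reuse predicted values as inputs for later steps."""
    window = list(history[-m:])
    out = []
    for _ in range(h):
        y_next = float(F.predict(np.array(window[-m:]).reshape(1, -1))[0])
        out.append(y_next)
        window.append(y_next)                               # predicted value becomes an input
    return np.array(out)

print(np.round(recursive_forecast(y, h=6), 1))              # six months ahead
```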

    SVM:

    SVMs were developed based on the statistical learning theory given by Vapnik [8] in 1995 to

    resolve the issue of data classification. Two years later, a version of SVM was proposed that

    can be successfully applied to the data regression

    problem. This method is called Support Vector Regression (SVR) and it is the most common

    form of SVMs that is applied in practice [9]. SVMs are based on the principle of structural risk

    minimization (SRM), which is proved to be more efficient than the empirical risk minimization

    (ERM), which is used in neural networks. SRM minimizes an upper bound of expected risk as

    opposed to ERM that minimizes the error on the training data [10]. SVMs implement a learning

    algorithm that performs learning from examples in order to predict the values on previously

    unseen data. The goal of SVR is to generate a model which will perform prediction of unknown

    output values based on the known input parameters. In the learning phase, the formation of the

    model is performed based on the known training data where xi are

    input vectors, and outputs associated to them. Each input vector consists of numeric features.

    In the phase of application, the trained model on the basis of new inputs makes

    prediction of output values .. SVR is an optimization problem [9], in which is needed

    to determine the parameters and b to minimize:

    where xi is mapped in the multi-dimensional vector space with non linear mapping . i is the

    upper limit of training error and i the lower. The idea of SVR is based on the computation of a

    linear regression function in a high dimensional feature space where the input data are mapped

    via a nonlinear function. The parameters that control the quality of the regression are kernel

    function, C and . C is parameter which determines the cost of error, i.e. determines the

    tradeoffs between the model complexity and the degree to which deviations larger than are

    tolerated [11]. A larger values for C reduces the error on training data but yields a more complex

    forecasting function that is more likely to overt on the training data [12]. Parameter controls the

    width of insensitive zone, and hence the number of support vectors (vectors that lying on a

    margin of the tube) and errors that lying outside zone [13]. The goal of SVR is to place as

  • 22

    many input data inside the tube |y + b)| , which is shown in Fig. 1. If xi is not inside

    the tube, an error occurs or Mid-Term Load Forecasting . Loss function assigns errors only

    to those xi for which or [12], and it is dened with:

    The problem can be solved using Lagrange multipliers [9], and the solution is dened with:

    Where represents kernel function, dened as dot product between (xi)T and (x). More

    about SVR can be found in [9], [12].For the experiments we used a publicly available library

    Lissom - A Library for Support Vector Machines [14] which we integrated in our software for

    recursive time series prediction.
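
    The sketch below shows an epsilon-SVR fit to a lagged monthly load series using scikit-learn's SVR (used here in place of the LIBSVM C library named in the text; both implement epsilon-SVR); the kernel, C, epsilon and the synthetic data are assumptions:

```python
# Hedged sketch of epsilon-SVR applied to a monthly load series with a 12-month regressor.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(8)
t = np.arange(96)                                           # eight years of monthly peaks
y = 700 + 1.2 * t + 80 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 15, len(t))

m = 12                                                      # regressor of the last 12 months
X = np.column_stack([y[i:len(y) - m + i] for i in range(m)])
target = y[m:]

# C trades model complexity against training error; epsilon sets the tube width
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=5.0))
model.fit(X[:-12], target[:-12])                            # hold out the last year

pred = model.predict(X[-12:])
mape = 100 * np.mean(np.abs(pred - target[-12:]) / target[-12:])
print("held-out MAPE: %.2f%%" % mape)
```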

    LONG TERM LOAD FORECASTING

    Load forecasting is one of the most difficult problems in distribution system planning and analysis.

    However, not only do historical load data of a distribution system play a very important role in

    peak load forecasting, but the impacts of meteorological and demographic factors must also be

    taken into consideration [1, 2]. For planning, management and effective operation of electric

    power systems, load forecasting should be accomplished over a broad spectrum of time intervals

    [3]. Load forecasting methods are distinguished on the basis of forecasting periods. In general,

    the required load forecasting can be categorized into short, medium and long term forecasts.

    Short term forecasting (half hour to one week ahead) represents a great saving potential for

    economic and secure operation of power systems. Medium term forecasting (one day to several


    months) deals with the scheduling of fuel supplies and maintenance operations, and long term

    forecasting (more than a year ahead) is useful for planning operations [1-3]. The majority of the

    electric load forecasting methods are dedicated to short term forecasting and relatively less work

    has been done on long or medium term load forecasting.

    VARIOUS METHODS OF LONG TERM LOAD FORECASTING:-

    Least Squares Regression Methods: -

    In regression-based models, the prediction error is minimised by using the least squares

    approach as given in Equation (1):

    S = Σ (yi - f(xi))^2, for i = 1, ..., n  (1)

    In the equation, n is the number of data points, yi is the existing recorded (real) value, f is the

    type of function used and S is the sum of the squared prediction errors. In this method, the

    derivative of S with respect to each coefficient is set equal to zero; in this way the normal

    equations are attained [13, 14]. In the least squares method, the regression models which are

    explained in detail below are used as f in Equation (1).

    The simple linear regression: -The simple linear regression model is based on the linear

    relationship between the dependent variable y and independent variable x as shown in Equation

    (2)

    y = b·x + a  (2)

    This is the equation of a straight line which intercepts the y axis at a and has a slope of b [14].

    By applying the least squares method, the normal equations (3) and (4) are

    obtained.

    In the equations the variables y, x and n represent the peak load, the years and the number of

    years on which the forecasting is based, respectively. The a and b coefficients are calculated from

    Equations (3) and (4) and replaced in Equation (2) for the load forecasting [14].

    Multiple linear regression: - This approach describes a plane in three-dimensional space, which

    can be expressed as given in Equation (5):

    y = a + b·x1 + c·x2  (5)

    In the equation, a, b, and c are regression parameters relating the mean value of y to x1

    and x2. When the least squares method is applied, the normal equations of Equation (6) are obtained

    [15].

    By solving Equation (6), the a, b and c parameters are calculated, where y is the peak load, x1i

    is the temperature, x2i is the population data and n is the number of years the forecasting

    algorithm is based on. By replacing the regression parameters in Equation (5), the peak load

    forecasting is performed.

    The quadratic regression: - In this approach the parabolic (quadratic) trend function given in

    Equation (7), y = a + b·x + c·x^2, is used.

    The a, b and c coefficients of the parabolic function can be obtained from Equation (8), which is

    written in matrix form.

    The load forecasting is performed by replacing the calculated coefficients in Equation (7).

    The exponential regression: -In this approach, the trend equation is formed by using an

    exponential function as given in Equation (9).

    By writing Equation (9) in logarithmic form and then applying the least squares approach,

    Equations (10), (11) and (12) are formed.

    Since Equation (10) is linear, the a and b coefficients are found by applying linear trend

    analysis, as shown in Equations (13) and (14).

    By replacing the a and b coefficients in Equation (9), the peak loads are predicted.
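
    The three trend fits above (linear, quadratic and exponential, the last taken here in the assumed form y = a·e^(b·x) and fitted in logarithmic form) can all be solved by least squares with NumPy, as in the sketch below; the yearly peak-load figures are placeholders:

```python
# Sketch of the linear, quadratic and exponential trend fits, all solved by least squares.
# The yearly peak-load data are synthetic placeholders.
import numpy as np

years = np.arange(2000, 2012, dtype=float)
peak = np.array([610, 628, 655, 670, 690, 718, 742, 775, 801, 835, 868, 905], dtype=float)
x = years - years[0]                                     # count years from the first one

b_lin, a_lin = np.polyfit(x, peak, 1)                    # linear trend y = b*x + a
c_q, b_q, a_q = np.polyfit(x, peak, 2)                   # quadratic trend y = a + b*x + c*x^2
b_exp, ln_a = np.polyfit(x, np.log(peak), 1)             # exponential trend y = a*exp(b*x), fitted in log form
a_exp = np.exp(ln_a)

x_future = 2015 - years[0]
print("linear      :", round(b_lin * x_future + a_lin, 1))
print("quadratic   :", round(a_q + b_q * x_future + c_q * x_future ** 2, 1))
print("exponential :", round(a_exp * np.exp(b_exp * x_future), 1))
```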

    ANN: - Figure 1 depicts the architecture of a typical feed-forward multilayered neural network,

    consisting of an input layer, (one or more) hidden layers and an output layer. The number of hidden

    layers and of neurons per layer depends on the problem studied and is decided by trial and error.

    The input layer receives the signal from the outer environment and distributes it to the neurons in

    the hidden layers. The hidden layers have computational neurons, and the number of layers

    depends on the functions to be used. The network computes the actual outputs of the neurons in

    the hidden and output layers by using the activation function. The error gradient for the neurons

    in the output layer is calculated, and the weights of the back-propagation network are adjusted by

    propagating backward the errors associated with the output neurons. The total error at the output

    layer is then reduced by redistributing the error backwards through the hidden layers until the

    input layer is reached. The process of updating the weights until the desired output is reached is

    defined as training. This process is called the generalised delta rule and is repeated until the

    error criterion for all datasets is reached. In general, each ANN is trained on a different 80% of

    the training data and

    then validated on the remaining 20%. Since each additional layer exponentially increases the

    computing load, in practice mostly 3-layer ANNs are preferred [3, 12, and 15]. In this work, in

    the implementation stage of the ANN mat lab 6.5 software is used. In the program; three layers

    ANN model including one hidden layer with feed forward and back-propagation algorithm has

    been trained by using Liebenberg Marquardt (LM) algorithm. The network used in this study has

    12 neurons in the hidden layer with the logarithmic sigmoid activation function which is non-

    linear continuous function between 0 and 1 as expressed in Equation (15) where, is the slope

    constant and in general assumed equal to 1.
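As a rough illustration of the network just described, the sketch below builds a comparable three-layer model in Python with scikit-learn. Since scikit-learn does not offer Levenberg-Marquardt training, the quasi-Newton 'lbfgs' solver is used here as a stand-in, and all input data are synthetic placeholders rather than the study's dataset.

```python
# Rough sketch of a comparable 3-layer network: one hidden layer of 12 neurons,
# logistic (log-sigmoid) activation, inputs scaled to [0, 1], and an 80/20
# train/validation split as described in the text. Data are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 120  # e.g. ten years of monthly records
temperature = rng.uniform(5, 40, n)
population = np.linspace(50.0, 56.0, n)          # slowly growing population (thousands)
peak_load = 4.0 * temperature + 3.0 * population + rng.normal(0, 5, n)

X = np.column_stack([temperature, population])
y = peak_load

# Scale inputs and targets, then hold out 20% of the data for validation.
x_scaler, y_scaler = MinMaxScaler(), MinMaxScaler()
Xs = x_scaler.fit_transform(X)
ys = y_scaler.fit_transform(y.reshape(-1, 1)).ravel()
X_tr, X_val, y_tr, y_val = train_test_split(Xs, ys, test_size=0.2, random_state=0)

# Feed-forward network: 12 hidden log-sigmoid neurons; 'lbfgs' stands in for LM.
net = MLPRegressor(hidden_layer_sizes=(12,), activation="logistic",
                   solver="lbfgs", max_iter=2000, random_state=0)
net.fit(X_tr, y_tr)
print("validation R^2:", round(net.score(X_val, y_val), 3))
```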

For inputs, along with the peak load dataset, monthly temperature and population growth are taken into account. In this study, the average monthly temperature values are obtained from the regional meteorological record office. The monthly population growth is calculated from the 1997 and 2000 national population statistics using Equation (16), which gives the population growth on a monthly basis [16].

In this equation, P0 is the first and Pn is the second of the two consecutive population statistics, n is the time interval between the statistics, and r is the population growth rate. For Cudahy, the value of r is calculated by taking the 1997 and 2000 population statistics, and by using these values the population is calculated on a monthly basis. All the data used for the training and testing of the ANN have been scaled to an interval suitable for the activation function.
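Equation (16) is not shown above; the sketch below assumes it follows the common compound-growth form Pn = P0*(1 + r)^n with n counted in months, and uses placeholder census figures because the document leaves the actual source and value of r blank.

```python
# Hedged sketch of the monthly population interpolation; Equation (16) is assumed
# to follow the compound-growth form Pn = P0 * (1 + r)**n with n counted in months.
# The census figures below are placeholders, not the values used in the study.
P0 = 50_000        # hypothetical 1997 census population
Pn = 53_000        # hypothetical 2000 census population
n_months = 36      # months between the two statistics

# Monthly growth rate solved from Pn = P0 * (1 + r)**n
r = (Pn / P0) ** (1.0 / n_months) - 1.0

# Population series on a monthly basis between the two census points
monthly_population = [P0 * (1.0 + r) ** m for m in range(0, n_months + 1)]
print(f"monthly growth rate r = {r:.5f}")
print(f"interpolated population after 12 months = {monthly_population[12]:.0f}")
```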


    Difficulties in the LF:

Several difficulties exist in short-term load forecasting (STLF). This section introduces them separately.

    Precise Hypothesis of the Input-output Relationship: -

    Most of the STLF methods hypothesize a regression function (or a network structure, e.g. in

    ANN) to represent the relationship between the input and output variables. How to hypothesize

the regression form or the network structure is a major difficulty, because it needs detailed a priori knowledge of the problem. If the regression form or the network structure were improperly chosen, the prediction result would be unsatisfactory. For example, when a problem itself is quadratic, the prediction result will be very poor if a linear input-output relationship is supposed. Another similar problem is parameter selection: not

    only the form of the regression function (or the network structure), but also the parameters of it

    should be well selected to get a good prediction. Moreover, it is always difficult to select the

    input variables. Too many or too few input variables would decrease the accuracy of prediction.

    It should be decided which variables are influential and which are trivial for a certain situation.

    Trivial ones that do not affect the load behaviour should be abandoned. Because it is hard to

represent the input-output relationship in one function, the pattern recognition tool of clustering has been introduced to STLF [54]. It divides the sample data into several clusters. Each cluster has a unique function or network structure to represent the input-output relationship. This method tends to have better forecasting results because it reveals the system property more precisely. But a priori knowledge is still required to do the clustering and determine the regression form (or network structure) for every cluster.
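A minimal sketch of this cluster-then-regress idea is given below. It is not the method of reference [54]; it simply groups sample days by k-means on their input features and fits a separate linear model per cluster, using synthetic data and hypothetical feature choices.

```python
# Minimal sketch of cluster-then-regress STLF (not the exact method of [54]):
# group the sample days by k-means on their inputs, then fit one linear model
# per cluster and use the matching model at prediction time. Data are synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 365
features = np.column_stack([
    rng.uniform(-5, 35, n),          # daily mean temperature
    rng.integers(0, 7, n),           # day of week
])
load = 900 + 8 * np.abs(features[:, 0] - 18) - 40 * (features[:, 1] >= 5) + rng.normal(0, 15, n)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=1).fit(features)
models = {c: LinearRegression().fit(features[kmeans.labels_ == c],
                                    load[kmeans.labels_ == c])
          for c in range(3)}

# Forecast one new day: pick its cluster, then apply that cluster's model.
new_day = np.array([[28.0, 2]])
cluster = int(kmeans.predict(new_day)[0])
print("forecast load:", round(float(models[cluster].predict(new_day)[0]), 1))
```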

Generalization of Experts' Experience: -

Many experienced working staff in power grids are good at manual load forecasting; they are often even better than computer forecasting. So it is very natural to use expert systems and fuzzy inference for load forecasting. But transforming the experts' experience into a rule database is a difficult task, since the experts' forecasting is often intuitive.

    Forecasting of Anomalous Days:

Loads of anomalous days are also not easy to predict precisely, due to their dissimilar load behaviour compared with that of ordinary days during the year, as well as the lack of sufficient samples. These days include public holidays, consecutive holidays, days preceding and following the holidays, days with extreme weather or sudden weather change, and special event days.

Although the sample number can be greatly enhanced by including days that are far away from the target day, e.g. the past 5 years' historical data can be employed rather than only one or two years, the load growth through the years might lead to dissimilarity between two sample days.

    From the experimental results it is found that days with sudden weather change are extremely

    hard to forecast. This sort of day has two kinds of properties: the property of the previous


    neighbouring days and the property of the previous similar days. How to combine these two

    properties is a challenging task.

    Inaccurate or Incomplete Forecasted Weather Data:

    As weather is a key factor that influences the forecasting result, it is employed in many models.

    Although the technique of weather forecasting, like the load forecasting, has been improved in

    the past several decades, sometimes it is still not accurate enough. The inaccurate weather report

data employed in the STLF would cause large errors. Another problem is that sometimes detailed forecasted weather data cannot be provided. The normal one-day-ahead weather report

    information includes highest temperature, lowest temperature, average humidity, precipitation

probability, maximum wind speed of the day, and the weather condition for the three periods of the day

    (morning, afternoon and evening). Usually the number of the load forecasting points in a day is

    96. If the forecasted weather data of these points can be known in advance, it would greatly

    increase the precision. However, normal weather reports do not provide such detailed

    information, especially when the lead time is long. This is a bottleneck of load forecasting.

Overfitting:

It is a technical problem that needs to be solved for load forecasting. Load forecasting is

    basically a training and predicting problem, which is related to two datasets: training data and

testing data. The proposed model is trained on the historical training data to obtain a basic representation, which is in turn used to predict the testing data. For the resulting trained model, if the training error for the training data is low but the error for the testing data is high, overfitting is said to have occurred. Fig. shows the regression curve of the 1-dimensional input to illustrate the effect of overfitting. The round dots represent the testing data and the triangle dots represent the training data. In (a) both the training error and the testing error are low. In (b), where overfitting exists, although the training error is almost zero, the testing error is quite high.

A significant disadvantage of neural networks is overfitting: a network can show perfect performance for training data prediction but much poorer performance for future data prediction. Since the goal of STLF is to predict future unknown data, techniques that reduce overfitting are required.
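Since the figure referred to above is not reproduced, the small sketch below recreates the effect it describes on synthetic 1-dimensional data: a low-degree fit keeps both errors moderate, while a high-degree fit drives the training error towards zero and typically leaves a much larger testing error.

```python
# Small sketch reproducing the overfitting effect described above: a high-degree
# polynomial lowers the training error while the testing error grows.
# The underlying 1-dimensional data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 24)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.15, x.size)

x_train, y_train = x[::2], y[::2]     # triangles in the figure: training data
x_test, y_test = x[1::2], y[1::2]     # round dots in the figure: testing data

for degree in (3, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train MSE={train_err:.4f}, test MSE={test_err:.4f}")
```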


The Destruction of the Load Curve's Natural Shape by Compulsory Demand-side Management:-

With economic development and the relative lag in power investment, energy shortages have appeared in many countries. To avoid reliability problems and assure the power supply of very important users, compulsory demand-side management is often executed. This compulsory intervention destroys the natural properties of the load curve. When this kind of load curve is included in training, it serves as noise and deteriorates the final results.

    Motivation:-

Load forecasting is an important component of a power system's energy management system. Precise load forecasting helps the electric utility to make unit commitment decisions, reduce spinning reserve capacity and schedule device maintenance plans properly. Besides playing a key

    role in reducing the generation cost, it is also essential to the reliability of power systems. The

    system operators use the load forecasting result as a basis of off-line network analysis to

    determine if the system might be vulnerable. If so, corrective actions should be prepared, such

    as load shedding, power purchases and bringing peaking units on line. Since in power systems

the next day's power generation must be scheduled every day, day-ahead short-term load forecasting (STLF) is a necessary daily task for power dispatch. Its accuracy greatly affects the economic operation and reliability of the system. Under-prediction of STLF leads to

    insufficient reserve capacity preparation and in turn, increases the operating cost by using

expensive peaking units. On the other hand, over-prediction of STLF leads to unnecessarily large reserve capacity, which is also related to high operating cost. In spite of the numerous literature on STLF published since the 1960s, the research work in this area is still a challenge to

    the electrical engineering scholars because of its high complexity. How to estimate the future

    load with the historical data has remained a difficulty up to now, especially for the load

    forecasting of holidays, days with extreme weather and other anomalous days. With the

    recent development of new mathematical, data mining and artificial intelligence tools, it is

    potentially possible to improve the forecasting result. With the recent trend of deregulation of

    electricity markets, STLF has gained more importance and greater challenges. In the market

    environment, precise forecasting is the basis of electrical energy trade and spot price

    establishment for the system to gain the minimum electricity purchasing cost. In the real-time

dispatch operation, forecasting errors cause higher electricity purchasing costs or contract-breaking penalty costs in keeping the electricity supply and consumption in balance. There are also some

    modifications of STLF models due to the implementation of the electricity market. For

example, demand-side management and the volatility of spot markets cause the consumers' active response to the electricity price. This should be considered in the forecasting model in the market environment.


    Advantages and Disadvantages of Forecasting Methods of Production and

    Operations Management:-

    Organizations use forecasting methods of production and operations management to implement

    production strategies. Forecasting involves using several different methods of estimating to

    determine possible future outcomes for the business. Planning for these possible outcomes is the

    job of operations management. Additionally, operations management involves the managing of

    the processes required to manufacture and distribute products. Important aspects of operations

    management include creating, developing, producing and distributing products for the

organization.

    Advantages of Forecasting: -

    An organization uses a variety of forecasting methods to assess possible outcomes for the

    company. The methods used by an individual organization will depend on the data available and

    the industry in which the organization operates. The primary advantage of forecasting is that it

    provides the business with valuable information that the business can use to make decisions

    about the future of the organization. In many cases forecasting uses qualitative data that depends

    on the judgment of experts.

    Disadvantages of Forecasting:-

    It is not possible to accurately forecast the future. Because of the qualitative nature of

    forecasting, a business can come up with different scenarios depending upon the interpretation of

    the data. For this reason, organizations should never rely 100 percent on any forecasting method.

    However, an organization can effectively use forecasting with other tools of analysis to give the

    organization the best possible information about the future. Making a decision on a bad forecast

    can result in financial ruin for the organization, so an organization should never base decisions

    solely on a forecast.

    Advantages of Operations Management: -

    Operations management can help an organization implement strategic objectives, strategies,

    processes, planning and controlling. One of the primary focuses of operations management is to

    effectively manage the resources of an organization so that the organization can maximize the

    potential of the products or services produced or offered by the company. Depending on the

    organization, operations management can include managing human resources, materials,

    information, production, inventory, transportation, logistics, purchasing and procurement.


    Disadvantages of Operations Management:-

    Operations management depends on many different components within the organization working

    together to achieve success. Even if operations management implements an effective plan, if

    operations management does not carry out the plan properly, the plan will most likely fail.

    Within an organization, mistakes often occur during the chain of events from manufacturing to

    sale. Therefore, operations management requires the coordination of operation functions,

    marketing, finance, accounting, engineering, information systems and human resources to have

    success within the organization. This poses the primary disadvantage of operations management

    because if an organization's individual components do not work well together, operations

    management will have limited success within the organization.