Page 1: final ppt

IMPLEMENTATION OF SHORT-TERM LOAD FORECASTING USING NEURAL NETWORKS AND ANFIS

By

P.Uday Kiran
M.Ajay Kumar
M.Suresh

4/4 B.Tech (EEE)

Page 2: final ppt

What is Load Forecasting?

• Predicting the load.

• It is important for maintaining the power plant.

• The forecaster ascertains the estimated load for the required hour.

Page 3: final ppt

Importance of Load Forecasting

• Load forecasting has always been an essential part of efficient power system planning and operation.

• Several electric power companies now forecast load power based on conventional methods.

• However, since the relationship between load power and the factors influencing it is non-linear, it is difficult to identify this non-linearity using conventional methods.

Page 4: final ppt

Types of Load Forecasting

• Short-Term Load Forecasting: the load prediction period is a week or shorter.

• Medium-Term Load Forecasting: the load prediction period is a few months.

• Long-Term Load Forecasting: the load prediction period is more than a year.

Page 5: final ppt

Factors Affecting Load Forecasting

• Seasonal changes

• Daily changes

• Temperature

• Humidity

• Clouds

• Random events

• Economic or environmental changes

Page 6: final ppt

Need for Forecasting the load

• Planning of power generation

• Scheduling of fuel supplies and maintenance

• For minimizing the operation costs

• Important for the supplier: with the forecasted load, the number of generating units in operation can be controlled

Page 7: final ppt

Artificial Neural Network

• A neural network is a massively parallel distributed processor made up of simple processing units, known as neurons.

• It resembles the brain in two respects:

1. Knowledge is acquired by the network from its environment through a learning process.

2. Inter-neuron connection strengths, known as synaptic weights, are used to store the acquired knowledge.

• The procedure used to set the connection strengths is called learning.

Page 8: final ppt

Basic Elements In Neural Network Structure

• An ANN performs in a way fundamentally similar to the human brain: the cell body of a biological neuron receives incoming impulses via its dendrites.

• A neuron in an ANN consists of three main components: the weights connecting the nodes, the summation function within the node, and the transfer function.
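As a small illustration of these three components, a single neuron can be written out directly; a minimal MATLAB sketch with made-up inputs, weights, and bias, assuming a log-sigmoid transfer function:

% One artificial neuron: weighted sum of the inputs followed by a transfer function
x = [0.6; 0.1; 0.8];        % example input vector (hypothetical values)
w = [0.4, -0.7, 0.2];       % synaptic weights connecting the inputs to the node
b = 0.1;                    % bias of the node
net_in = w*x + b;           % summation function within the node
y = 1/(1 + exp(-net_in))    % log-sigmoid transfer function gives the neuron output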

Page 9: final ppt

COMPONENTS OF A NEURON

THE NEURON MODEL

Page 10: final ppt

• After the training process, the ANN can generalize. Generalization refers to the neural network producing reasonable outputs for inputs not encountered during training (learning).

• These two information-processing capabilities, learning and generalization, make it possible for neural networks to solve complex problems.

Page 11: final ppt

Model of an Artificial Neuron

Page 12: final ppt

Properties of neural networks

Non-linearity

Input-Output mapping

Adaptivity

Fault tolerance

These are the properties most desirable for solving the problems at hand.

Page 13: final ppt

Taxonomy

Neural networks are broadly classified into:

• Feed-forward networks

• Feedback networks

Page 14: final ppt

Feed-Forward Neural Network

Page 15: final ppt

Back Propagation Network

• Back propagation is a systematic method for training multi-layer artificial neural networks using the back-propagation-of-errors rule.

• The aim is to train the network to achieve a balance between the ability to respond correctly to the input patterns used for training and the ability to give reasonable responses to inputs that are similar, but not identical, to the training inputs.

Page 16: final ppt

Back Propagation neural network

• The BP network consists of one input layer, one or more hidden layers, and one output layer.

• The learning process has two phases: the input information is transmitted in the forward direction, and the error is transmitted in the backward direction.

Page 17: final ppt

Training Algorithm

The training algorithm of back propagation involves 4 stages:

Initialization of weights

Feed-forward

Back propagation of errors

Updating of weights and biases
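A minimal MATLAB sketch of these four stages for a single-hidden-layer network follows; the toy data, layer sizes, and learning rate are made-up values, and the bias of each layer is folded into its weights through a constant input of 1:

% Minimal sketch of the four training stages for one hidden layer (toy data)
N  = 10;                                    % number of training samples (assumed)
X  = [rand(3, N); ones(1, N)];              % 3 inputs per sample plus a constant bias input
T  = rand(1, N);                            % target outputs (made-up values)
nh = 5;  lr = 0.1;                          % hidden nodes and learning rate (assumed)

W1 = rand(nh, 4) - 0.5;                     % 1) initialization of weights (biases included)
W2 = rand(1, nh + 1) - 0.5;

for epoch = 1:1000
    H  = 1./(1 + exp(-(W1*X)));             % 2) feed-forward through the sigmoid hidden layer
    Ha = [H; ones(1, N)];                   %    append the bias input for the output layer
    Y  = W2*Ha;                             %    linear output node
    E  = T - Y;                             %    output error
    dH = (W2(:, 1:nh)' * E) .* H .* (1-H);  % 3) back propagation of the error to the hidden layer
    W2 = W2 + (lr/N) * E  * Ha';            % 4) updating of weights and biases
    W1 = W1 + (lr/N) * dH * X';
end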

Page 18: final ppt

APPLICATION OF ANNs

• The application of ANNs to short-term load forecasting has gained a lot of attention recently.

• The availability of historical data is most important for applying ANNs to this field.

Page 19: final ppt

MATLAB

An interactive system whose basic data element is an array that does not require dimensioning

It allows

• Graphics

• Computation

• External interface

It has built-in toolboxes (e.g. SIMULINK), which can also be extended by user programming

Page 20: final ppt

NEURAL NETWORK TOOLBOX

STEPS INVOLVED

• Assemble the training data

• Create the network object

• Train the network

• Simulate the network response to new inputs
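A minimal sketch of these four steps, assuming the newff/train/sim interface of the older Neural Network Toolbox; the data here is only a random placeholder, the layer sizes (30 inputs, 50 hidden nodes) follow the later slides, and the training settings are arbitrary:

P = rand(30, 10);   T = rand(1, 10);   % 1) assemble the training data (placeholder values)
net = newff(P, T, 50);                 % 2) create a two-layer feed-forward network object
net.trainParam.epochs = 1000;          %    training settings (arbitrary)
net = train(net, P, T);                % 3) train the network with backpropagation
Pnew = rand(30, 5);                    %    new (test) input patterns
Lhat = sim(net, Pnew)                  % 4) simulate the network response to the new inputs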

Page 21: final ppt

Preprocessing and Postprocessing

• It is useful to scale the inputs and outputs so that they fall within a specified range.

• The NN output therefore needs de-scaling to generate the forecasted loads.
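A sketch of this scaling with the toolbox's mapminmax function (which maps each row to [-1, 1] by default), continuing the variables P, T, net, and Pnew from the previous sketch:

[Pn, ps] = mapminmax(P);                        % scale the inputs and remember the settings
[Tn, ts] = mapminmax(T);                        % scale the targets
net = train(net, Pn, Tn);                       % train on the scaled data
Yn  = sim(net, mapminmax('apply', Pnew, ps));   % scale new inputs with the same settings
Lhat = mapminmax('reverse', Yn, ts)             % de-scale the output to forecasted loads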

Page 22: final ppt

LOAD FORECASTING USING ANN

• In this work, NPDCL Warangal distribution system load data is considered for forecasting.

• Temperature data was taken from the NITW weather station.

• A two-layer feed-forward backpropagation network structure with a non-linear sigmoid transfer function is chosen.

Page 23: final ppt

Network Properties

• The two layers are the hidden layer and the output layer.

• The connection weights can be real numbers or integers. They are adjustable during training, but some can be deliberately fixed. When training is complete, all of them are fixed.

Page 24: final ppt

Load Forecasting with only loads as inputs

Input Variables

• Different sets of lagged loads have been proposed as input features for load prediction in the electricity market.

• Bearing in mind the daily and weekly periodicity and trend of the load signal, the set {L(h-1), L(h-2), L(h-3), L(h-4), L(h-5), L(h-6), L(h-24), L(h-25), L(h-26), L(h-48), L(h-49), L(h-50), L(h-72), L(h-73), L(h-74), L(h-96), L(h-97), L(h-98), L(h-120), L(h-121), L(h-122), L(h-144), L(h-145), L(h-146), L(h-168), L(h-169), L(h-170), L(h-192), L(h-193), L(h-194)}, a total of 30 inputs, is used at the input layer.
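A sketch of how this lagged input set might be assembled in MATLAB, assuming a column vector load_series of hourly loads (the variable name is hypothetical):

lags = [1:6, 24:26, 48:50, 72:74, 96:98, 120:122, 144:146, 168:170, 192:194];  % the 30 lags above
first = max(lags) + 1;                 % earliest hour with a complete lagged history
nsamp = numel(load_series) - first + 1;
P = zeros(numel(lags), nsamp);         % input patterns, one column per target hour
T = zeros(1, nsamp);                   % target loads
for k = 1:nsamp
    h = first + k - 1;
    P(:, k) = load_series(h - lags');  % L(h-1) ... L(h-194)
    T(k)    = load_series(h);          % target load L(h)
end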

Page 25: final ppt

Topology Of ANN

(Figure: network topology. The 30 lagged loads L(h-1) ... L(h-6), L(h-24) ... L(h-26), L(h-48) ... L(h-50), ..., L(h-192) ... L(h-194), drawn from days D-8 through D, feed the input layer; the single output is the load L(h) at interval 'h'.)

Page 26: final ppt

Training:

1) The back propagation algorithm is used here to train the neural network.

2) The implementation of back propagation involves a forward pass through the network to estimate the error, and then a backward pass that modifies the synaptic weights to decrease the error.

Simulation:

Using the trained neural network, the forecasting output is simulated using the test input patterns.

Page 27: final ppt

Simulation Results

• Number of input nodes =30;

• Number of hidden nodes=50;

• Number of output nodes=1;

• Number of training samples=10;

• Number of testing samples=5;

Page 28: final ppt

ACTUAL LOAD    FORECASTED LOAD    % ERROR
1256           1417.7             12.8742
1307           1390               6.3504
1153           1270.68            10.2064
992            910.73             8.1925
1028           920.87             10.42
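The % ERROR column is consistent with the absolute percentage error, |actual - forecast| / actual x 100; for example, for the first test sample:

actual = 1256;  forecast = 1417.7;
pct_err = abs(actual - forecast) / actual * 100   % gives 12.8742, matching the table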

Page 29: final ppt
Page 30: final ppt

Load Forecasting By Considering Temperature Effect

INPUT VARIABLES:

Hourly load data for the month of January 2007 was collected from NPDCL (Northern Power Distribution Corporation Limited), and hourly temperature values for the same month were taken from the NITW weather station.

We used this data to train the network and test its performance.

Page 31: final ppt

Topology Of ANN

(Figure: network topology. The inputs are the lagged load and temperature pairs L(h-1), T(h-1) ... L(h-6), T(h-6); L(h-24), T(h-24) ... L(h-26), T(h-26); L(h-48), T(h-48) ... L(h-50), T(h-50); L(h-72), T(h-72) ... L(h-74), T(h-74), drawn from days D-3 through D, together with the hour 'h' and its temperature T(h); the output is the load L(h) to be forecasted at hour 'h'.)

Page 32: final ppt

Simulation Results:

• Number of input nodes=32;

• Number of hidden nodes=50;

• Number of output nodes=1;

• Number of training samples=10;

• Number of testing samples=5;

Page 33: final ppt

ACTUAL LOAD    FORECASTED LOAD    % ERROR
1028           1059.5             3.06
1200           1164.4             2.697
1185           1192.5             0.632
1159           1208               4.227
1174           1128.9             3.841

Page 34: final ppt
Page 35: final ppt

Adaptive neuro-fuzzy inference system

• In ANFIS, the membership function (MF) parameters are chosen so as to tailor them to a set of input-output data.

• A first-order Sugeno fuzzy model with a hybrid learning algorithm is used.

• It constructs a set of fuzzy if-then rules with appropriate membership functions to generate the stipulated input-output pairs.

• The parameters of the MFs and the rules change through the learning process.

Page 36: final ppt

Fuzzy Inference System

1. Fuzzification interface: transforms crisp input values into fuzzy values.

Page 37: final ppt

2. Knowledge base: a combination of a rule base and a database.

(i) The rule base contains a number of fuzzy if-then rules.

(ii) The database defines the membership functions of the fuzzy sets used in the fuzzy rules.

3. Decision-making logic: performs inference for fuzzy control actions.

4. Defuzzification interface: transforms fuzzy values into a crisp value.

Page 38: final ppt

Membership functions for seven linguistic variables

(Figure: membership functions labeled NB, NM, NS, PS, PM, PB plotted over the range Xmin to Xmax)

Page 39: final ppt

ANFIS Architecture

• Rule 1: If x is A1 and y is B1, then f1 = p1x + q1y + r1

• Rule 2: If x is A2 and y is B2, then f2 = p2x + q2y + r2

Page 40: final ppt

Layer 1: Every node i in this layer is an adaptive node with a node function.

O1,i = μAi(x), for i = 1, 2, or

O1,i = μBi-2(y), for i = 3, 4,

where x (or y) is the input to node i and Ai (or Bi-2) is a linguistic label ("small" or "large") associated with the node.

Here a Gaussian membership function can be used.
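For reference, the Gaussian membership function has the standard form μAi(x) = exp(-(x - ci)^2 / (2*σi^2)), where ci and σi are the premise parameters adjusted during learning.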

Page 41: final ppt

Layer 2: Every node in this layer is a fixed node labeled Π, whose output is the product of all the incoming signals.

Layer 3: Here, the ith node calculates the ratio of the ith rule's firing strength to the sum of all rules' firing strengths.
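Written in the same notation as layer 1 (for the two-rule case above), the standard node functions of these two fixed layers are O2,i = wi = μAi(x) · μBi(y), for i = 1, 2, and O3,i = w̄i = wi / (w1 + w2).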

Page 42: final ppt

Layer 4: Every node i in this layer is an adaptive node with a node function.

Here w̄i is the normalized firing strength from layer 3 and {pi, qi, ri} is the parameter set of the node. These parameters are referred to as consequent parameters.

Layer 5: The single node in this layer is a fixed node labeled Σ, which computes the overall output as the summation of all incoming signals:
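In the same notation, the standard first-order Sugeno node functions for these two layers are O4,i = w̄i · fi = w̄i · (pi·x + qi·y + ri) for layer 4, and O5,1 = Σi w̄i · fi = (Σi wi · fi) / (Σi wi) for layer 5.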

Page 43: final ppt

Load Forecasting Using ANFIS:

• INPUT VARIABLES: Hourly load data for the month of January 2007 was collected from NPDCL (Northern Power Distribution Corporation Limited), and hourly temperature values for the same month were taken from the NITW weather station.

• We used this data to train the ANFIS and test its performance. Our focus is on a normal weekday.

Page 44: final ppt

Topology Of ANFIS

(Figure: ANFIS topology. The inputs are the load and temperature at hours h-2 and h-1, i.e. L(h-2), T(h-2), L(h-1), T(h-1), together with the hour 'h' of the predicted load and its temperature T(h); the output is the load L(h) to be forecasted at hour 'h'.)

Page 45: final ppt

ANFIS editor: anfisedit
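The same workflow can also be scripted rather than driven through the anfisedit GUI; a minimal sketch using the Fuzzy Logic Toolbox functions genfis1, anfis, and evalfis, assuming the training and test matrices hold the six inputs in the first columns and the target load L(h) in the last column (the data here is only a placeholder, with the sample counts taken from the next slide):

trnData = rand(38, 7);   chkData = rand(5, 7);    % placeholder training and test sets
inFis  = genfis1(trnData, 3, 'gaussmf');          % grid partition: 3 Gaussian MFs per input
outFis = anfis(trnData, inFis, 100);              % hybrid learning for 100 epochs (arbitrary)
Lhat   = evalfis(chkData(:, 1:6), outFis)         % forecasted loads for the test patterns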

Page 46: final ppt

Simulation Results:

• Number of input nodes=6;

• Number of Membership functions for each input=3;

• Number of output nodes=1;

• Number of Membership function for output=1;

• Number of training samples=38;

• Number of testing samples=5;

Page 47: final ppt

Simulation Block Diagram

Page 48: final ppt

ACTUAL LOAD    FORECASTED LOAD    % ERROR
1028           1035.5             0.727
1200           1192.4             0.633
1185           1179.5             0.464
1159           1167               0.69
1174           1169.9             0.349

Page 49: final ppt

Conclusions

From the simulation results obtained, we observed that the load forecasting error decreased to a great extent when the temperature effect was considered.

The error decreased further when simulated with ANFIS.

Page 50: final ppt

References

• [1] G. Gross, F. D. Galiana, "Short-Term Load Forecasting," Proceedings of the IEEE, vol. 75, no. 12, Dec. 1987, pp. 1558-1573.

• [2] S. Rahman, R. Bhatnagar, "An Expert System Based Algorithm for Short Term Load Forecast," IEEE Trans. on Power Systems, vol. 3, no. 2, May 1988, pp. 392-399.

• [3] D. C. Park, M. A. El-Sharkawi, R. J. Marks II, "Electric Load Forecasting Using an Artificial Neural Network," IEEE Trans. on Power Systems, vol. 6, no. 2, May 1991, pp. 442-449.

Page 51: final ppt

[4] K. Y. Lee, Y. T. Cha, J. H. Park, "Short-Term Load Forecasting Using An Artificial Neural Network," IEEE Trans. on Power Systems, vol. 7, no. 1, Feb. 1992, pp. 124-132.

[5] S. T. Chen, D. C. Yu, A. R. Moghaddamjo, "Weather Sensitive Short-Term Load Forecasting Using Nonfully Connected Artificial Neural Network," IEEE Trans. on Power Systems, vol. 7, no. 3, Aug. 1992, pp. 1098-1105.

[6] Y. T. Park, J. K. Park, "An Expert System for Short Term Load Forecasting by Fuzzy Decision," Proc. of 2nd Symp. on Expert System Application to Power Systems, July 1989, Washington, USA, pp. 244-250.

Page 52: final ppt

THANK YOU

