Discussion of Various Techniques for Solving Economic Load Dispatch


International Journal of Enhanced Research in Science, Technology & Engineering
ISSN: 2319-7463, Vol. 4, Issue 7, July-2015

Veerpal Kaur Sran 1, Harpreet Kaur 2
1 BBSBEC, Fatehgarh Sahib
2 GNDEC, Ludhiana

Abstract: Economic Load Dispatch (ELD) is an important problem in the operation of power plants and helps in building effective generation management plans. The ELD problem has a non-smooth cost function with equality and inequality constraints, which makes it difficult to solve effectively. This paper presents various techniques for solving economic load dispatch. A large number of iterations and oscillations are the major concerns with several of these methods. Each method has its own merits and demerits for solving ELD.

Keywords: Economic Load Dispatch; Lambda Iteration Method; Newton Method.

Introduction

The main objective of Economic Load Dispatch (ELD) is to minimize the fuel cost while satisfying the load demand. Traditionally, the cost function of each generator has been approximated by a single quadratic function. In practice, the operating conditions of many generating units require that the generation cost be segmented as piecewise quadratic functions; it is therefore more realistic to represent the generation cost as a piecewise quadratic cost function. There are several classical methods for solving economic load dispatch, such as lambda iteration, the Newton method, the gradient method, linear programming, and the base point and participation factor method. Complex constrained ELD is addressed by intelligent methods such as evolutionary programming (EP), simulated annealing (SA), tabu search (TS), dynamic programming (DP), the Hopfield neural network (HNN), the genetic algorithm (GA), differential evolution (DE), particle swarm optimization (PSO), the bacterial foraging algorithm (BFA), quantum-inspired particle swarm optimization (QPSO), and biogeography-based optimization (BBO).
Problem Formulation

The ELD problem is a general constrained minimization problem and can be written in the following form:

Minimize f(x)
Subject to: g(x) = 0, h(x) ≤ 0

where f(x) is the objective function, and g(x) and h(x) are the sets of equality and inequality constraints, respectively. x is the vector of control and state variables. The control variables are the generator active and reactive power outputs, bus voltages, shunt capacitors/reactors, and transformer tap settings. The state variables are the voltages and angles of the load buses.

Objective Function

The economic load dispatch problem can be described as an optimization (minimization) process with the following objective function:

Min F = Σ_{j=1..n} FC_j(P_j)    (1)

where FC_j(P_j) is the cost function of the j-th unit and P_j is the power generated by the j-th unit, subject to the power balance equation

Σ_{j=1..n} P_j = D + P_L    (2)

where D is the system demand and P_L is the transmission loss, and to the generating capacity constraints

P_j^min ≤ P_j ≤ P_j^max for j = 1, 2, 3, ..., n    (3)

where P_j^min and P_j^max are the minimum and maximum power outputs of the j-th unit. The fuel cost function of a generating unit without valve-point loading is given by

f(P_j) = a_j + b_j P_j + c_j P_j^2

where n is the number of thermal units, P_j is the active power generation at unit j, and a_j, b_j and c_j are the cost coefficients of the j-th generator. Methods for solving this ELD problem are discussed below.

Lambda Iteration

In the lambda iteration method, lambda is the variable introduced in solving the constrained optimization problem; it is called the Lagrange multiplier [1]. For small systems lambda can be found by hand by solving the system of equations; since all the inequality constraints must be satisfied in each trial, the equations are in general solved iteratively. This method uses the equal incremental cost criterion for systems without transmission losses, and penalty factors based on the B matrix to account for losses.

Gradient Search Method

This method works on the principle that the minimum of a function f(x) can be found by a series of steps that always move in a downward direction. In this method the fuel cost function is chosen to be of quadratic form. However, the fuel cost function becomes more nonlinear when valve-point loading effects are included.

Newton Method

Newton's method goes a step beyond the simple gradient method and tries to solve the economic dispatch by observing that the aim is to drive the gradient of the function to zero. Generally, Newton's method will compute a correction that is much closer to the minimum generation cost in one step than the gradient method would.
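As an illustration of the lambda iteration method for a lossless system with quadratic costs, the sketch below bisects on lambda until total generation matches the demand in equation (2). The three-unit data are made-up textbook-style example values, not taken from this paper.

```python
# Lambda iteration sketch: quadratic costs f_j(P) = a_j + b_j*P + c_j*P^2,
# no transmission losses, so equation (2) reduces to sum(P_j) = D.

def dispatch(lam, units):
    """Equal incremental cost: each unit runs where df/dP = lambda,
    clipped to its capacity limits from equation (3)."""
    out = []
    for a, b, c, pmin, pmax in units:
        p = (lam - b) / (2.0 * c)            # solves b + 2*c*P = lambda
        out.append(min(max(p, pmin), pmax))  # enforce P_min <= P <= P_max
    return out

def lambda_iteration(units, demand, tol=1e-8):
    """Bisect on lambda until total generation matches the demand D."""
    lo, hi = 0.0, 1000.0
    while hi - lo > tol:
        lam = 0.5 * (lo + hi)
        if sum(dispatch(lam, units)) < demand:
            lo = lam                          # too little generation: raise lambda
        else:
            hi = lam
    return lam, dispatch(lam, units)

units = [
    # (a, b, c, Pmin, Pmax) -- illustrative unit data only
    (561.0, 7.92, 0.001562, 150.0, 600.0),
    (310.0, 7.85, 0.001940, 100.0, 400.0),
    (78.0,  7.97, 0.004820,  50.0, 200.0),
]
lam, P = lambda_iteration(units, 850.0)
print(round(lam, 3), [round(p, 1) for p in P])
```

Because total output is monotone non-decreasing in lambda, the bisection always converges when the demand lies between the sum of the minimum and maximum unit limits.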
Linear Programming

Linear programming (LP) is a technique for the optimization of a linear objective function subject to linear equality and linear inequality constraints. Informally, linear programming determines the way to achieve the best outcome (such as maximum profit or lowest cost) in a given mathematical model and for some list of requirements represented by linear relations. A linear programming method finds a point of the feasible region where this function has the smallest (or largest) value. Such a point may not exist, but if it does, searching through the vertices of the feasible region is guaranteed to find at least one of them.

Base Point and Participation Factor

This method assumes that the economic dispatch problem has to be solved repeatedly by moving the generators from one economically optimum schedule to another as the load changes by a reasonably small amount. It starts from a given schedule, called the base point. It then assumes a load change and investigates how much each generating unit needs to be moved so that the new load is served at the most economic operating point.

Intelligent Methods

Evolutionary Programming (EP), Simulated Annealing (SA), Tabu Search (TS)

Although heuristic methods do not always guarantee discovery of globally optimal solutions in finite time, they often provide a fast and reasonable solution. EP can be a quite powerful evolutionary approach; however, it converges rather slowly to a near-optimum for some problems. Both SA and TS can be quite useful in solving complex reliability

optimization problems; however, SA is very time consuming, and the control parameters of its annealing schedule cannot be tuned easily. For TS it is difficult to define effective memory structures and strategies, which are problem dependent.

Dynamic Programming (DP)

When the cost functions are non-convex, the equal incremental cost methodology cannot be applied. Under such circumstances, an optimum dispatch can still be found using the dynamic programming method. Dynamic programming is an optimization technique that transforms a maximization (or minimization) problem involving n decision variables into n problems having only one decision variable each. This is done by defining a sequence of value functions V1, V2, ..., Vn, with an argument y representing the state of the system. Vi(y) is defined as the maximum obtainable if decisions 1, 2, ..., i are available and the state of the system is y. The function V1 is easy to find. For i = 2, ..., n, Vi at any state y is calculated from Vi-1 by maximizing, over the i-th decision, a simple function (usually the sum) of the gain of decision i and the value of Vi-1 at the new state of the system if this decision is made. Since Vi-1 has already been calculated for the needed states, this operation yields Vi for all the needed states. Finally, Vn at the initial state of the system is the value of the optimal solution. The optimal values of the decision variables can be recovered one by one by tracing back through the calculations already performed.

Hopfield Neural Network (HNN)

Since Hopfield introduced them in 1982 [4] and 1984 [5], Hopfield neural networks have been used in many different applications. An important property of the Hopfield neural network is that its energy decreases by a finite amount whenever there is any change in its inputs; thus, the Hopfield neural network can be used for optimization.
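Returning to the dynamic programming method, its value-function recursion Vi(y) can be sketched for a demand-discretized, lossless ELD: units are committed one at a time, and Vi(y) stores the minimum cost of serving y MW with the first i units. The unit data below are illustrative only.

```python
# DP sketch for discretized ELD: minimization, so V_i(y) is a minimum here.

def dp_dispatch(units, demand, step=5):
    """units: list of (a, b, c, pmin, pmax) with limits on the step grid."""
    INF = float("inf")
    V = {0: 0.0}              # V_0(y): zero cost to serve zero load
    decisions = []            # decisions[i][y]: best output of unit i+1 at state y
    for a, b, c, pmin, pmax in units:
        nV, nD = {}, {}
        for y, v in V.items():
            for p in range(pmin, pmax + 1, step):
                yy = y + p
                if yy > demand:
                    break
                total = v + a + b * p + c * p * p   # V_{i-1}(y) + cost of decision p
                if total < nV.get(yy, INF):
                    nV[yy], nD[yy] = total, p
        V = nV
        decisions.append(nD)
    # V_n(demand) is the optimal value; trace the decisions back to a schedule
    schedule, y = [], demand
    for nD in reversed(decisions):
        p = nD[y]
        schedule.append(p)
        y -= p
    schedule.reverse()
    return V[demand], schedule

units = [
    # (a, b, c, Pmin, Pmax) -- illustrative unit data only
    (561.0, 7.92, 0.001562, 150, 600),
    (310.0, 7.85, 0.001940, 100, 400),
    (78.0,  7.97, 0.004820,  50, 200),
]
best_cost, schedule = dp_dispatch(units, 850)
print(best_cost, schedule)
```

Unlike the equal incremental cost rule, this recursion never assumes convexity of the cost curves, which is exactly why DP remains applicable when valve-point effects make the costs non-convex.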
Tank and Hopfield [13] described how several optimization problems can be rapidly solved by highly interconnected networks of simple analog processors, an implementation of the Hopfield neural network. Park et al. [6] presented economic load dispatch for piecewise quadratic cost functions using the Hopfield neural network; the results of this method compared very well with those of a numerical method in a hierarchical approach. King et al. [12] applied the Hopfield neural network to the economic and environmental dispatching of electric power systems. These applications, however, involved a large number of iterations and often showed oscillations during transients. This suggests a need to improve convergence through an adaptive approach, such as the adaptive learning rate method developed by Ku and Lee [2] for a diagonal recurrent neural network.

Genetic Algorithm (GA), Differential Evolution (DE)

GA ensures that the colony evolves and the solutions change continually; however, it sometimes lacks a strong capacity for producing better offspring, which causes slow convergence near the global optimum, and it may be trapped in a local optimum. Due to premature convergence, the performance of GA degrades and its search capability is reduced. Storn and Price [11] invented differential evolution (DE). It involves three basic operations, namely mutation, crossover, and selection, in order to reach an optimal solution. DE has been found to yield better and faster solutions, satisfying all the constraints, for both uni-modal and multi-modal systems, using its different crossover strategies. But when system complexity and size increase, DE is unable to map all of its unknown variables together in a better way. In DE all variables are changed together during the crossover operation; the individual variables are not tuned separately.
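The three DE operations can be sketched on a toy minimization problem as follows; the DE/rand/1/bin variant and the settings F = 0.5, CR = 0.9 are common defaults, not values taken from this paper.

```python
# Minimal DE/rand/1/bin sketch: mutation, crossover, selection.
import random

def cost(x):
    # toy separable quadratic with minimum at x_i = 3; stands in for an ELD objective
    return sum((xi - 3.0) ** 2 for xi in x)

def de_step(pop, F=0.5, CR=0.9):
    """One DE generation over the whole population."""
    new_pop = []
    for i, target in enumerate(pop):
        r1, r2, r3 = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        # mutation: donor = r1 + F * (r2 - r3)
        donor = [a + F * (b - c) for a, b, c in zip(r1, r2, r3)]
        # binomial crossover: all variables are perturbed in one operation,
        # the behaviour the text criticizes for large systems
        jrand = random.randrange(len(target))
        trial = [d if (random.random() < CR or j == jrand) else t
                 for j, (t, d) in enumerate(zip(target, donor))]
        # selection: greedy, keep the better of target and trial
        new_pop.append(trial if cost(trial) <= cost(target) else target)
    return new_pop

random.seed(1)
pop = [[random.uniform(-10, 10) for _ in range(4)] for _ in range(20)]
for _ in range(200):
    pop = de_step(pop)
best = min(pop, key=cost)
print(cost(best))  # near zero after convergence
```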
Thus in the early stages the solutions move very fast towards the optimal point, but in the later stage, when fine tuning is required, DE fails to give better performance.

Particle Swarm Optimization (PSO)

In the mid 1990s, Kennedy and Eberhart invented PSO [10]. In PSO there are only a few parameters to be adjusted, which makes PSO attractive. A simple concept, easy implementation, robustness, and computational efficiency are the main advantages of the PSO algorithm. A closer examination of the operation of the algorithm indicates that, once inside the optimum region, the algorithm progresses slowly due to its inability to adjust the velocity step size to continue the search at a finer grain; so for multi-modal functions, the particles sometimes fail to reach the global optimum. Compared with other methods, PSO is computationally inexpensive in terms of memory and speed. Its most attractive features can be summarized as: simple concept, easy implementation, fast computation, and robust search ability.

Artificial Immune System (AIS)

The Artificial Immune System (AIS) [8] is another population-based (or network-based) soft computing technique in the field of optimization that has been successfully applied to various power system optimization problems. In each iteration of

AIS, many operations such as affinity calculation, cloning, hyper-mutation, and selection are performed. During the cloning operation the size of the population also increases. Due to the larger number of operations and the larger population size, the convergence speed of AIS is much slower than that of DE or PSO.

Bacterial Foraging Algorithm (BFA)

Inspired by the survival mechanisms of bacteria such as E. coli, an optimization algorithm called the Bacterial Foraging Algorithm (BFA) [7] has been developed. Chemotaxis, reproduction, and dispersion are the three processes through which the global searching capability of this algorithm is achieved. These properties have helped BFA to be applied successfully to several kinds of power system optimization problems, but constraint satisfaction causes some trouble in BFA.

Quantum-Inspired Evolutionary Algorithms (QEAs)

Quantum-inspired evolutionary algorithms (QEAs) [9] are based on the concepts and principles of quantum computing, and they can strike the right balance between exploration and exploitation more easily than conventional EAs. The QEAs can also explore the search space with a smaller number of individuals and find the global solution within a short span of time. From research on QEAs and PSO, quantum-inspired particle swarm optimization (QPSO) has been proposed. Two main definitions used in the QEAs are the quantum bit and the quantum rotation gate. A quantum bit, defined as the smallest information unit, is used as a probabilistic representation of a particle; a string of quantum bits constitutes a quantum-bit individual. The quantum rotation gate is defined as an operation that drives individuals toward better solutions and eventually finds the global optimum. Furthermore, three definitions from immunology are introduced: individual affinity, individual concentration, and selection possibility.
They are used in the implementation of self-adaptive probability selection and chaotic sequence mutation, which improve the search performance of the algorithm by increasing population diversity and preventing premature convergence. It should be noted that the proposed QPSO is a novel PSO, not a quantum algorithm.

Biogeography-Based Optimization (BBO)

Very recently, a new optimization concept based on biogeography has been proposed by Simon [3]. Biogeography is nature's way of distributing species. Consider an optimization problem with a set of trial solutions. In BBO, a good solution is analogous to an island with a high Habitat Suitability Index (HSI), and a poor solution represents an island with a low HSI. High-HSI solutions resist change more than low-HSI solutions, while low-HSI solutions tend to copy good features from high-HSI solutions. The shared features remain in the high-HSI solutions while also appearing as new features in the low-HSI solutions, as if some representatives of a species migrated to a new habitat while others remained in the original one. Poor solutions thus accept many new features from good solutions, and this addition of new features may raise their quality. This approach to solving a problem is known as biogeography-based optimization (BBO). BBO works through two mechanisms: migration and mutation. Like other biology-based algorithms such as GA and PSO, BBO shares information between solutions. Besides, the algorithm has certain unique features which overcome several demerits of the conventional methods, as mentioned below:

1) In BBO and PSO, the solutions survive forever, although their characteristics change as the optimization process progresses; solutions of evolutionary algorithms like GA, EP, and DE die at the end of each generation.
Due to the crossover operation in evolutionary algorithms, many solutions whose fitness is initially good lose their quality in the later stages of the process. In BBO there is no crossover-like operation; solutions are fine-tuned gradually through the migration operation as the process goes on, and elitism makes the algorithm more efficient in this respect. This gives BBO an edge over the techniques mentioned above.

2) In PSO, solutions are more likely to clump together in similar groups, while in BBO, solutions do not have a tendency to cluster, thanks to its new type of mutation operation. This is an added advantage of BBO over PSO.

3) BBO involves fewer computational steps per iteration than AIS, which results in faster convergence.

4) In BBO, poor solutions accept many new features from good ones, which may improve their quality. This is a unique feature of the BBO algorithm compared to other techniques; at the same time it makes constraint satisfaction much easier than in BFA.

References

[1]. A. A. El-Keib, H. Ma, and J. L. Hart, "Environmentally constrained economic dispatch using the Lagrangian relaxation method," IEEE Trans. Power Systems, vol. 9, no. 4, pp. 1723-1729, Nov. 1994.
[2]. C. C. Ku and K. Y. Lee, "Diagonal recurrent neural networks for dynamic systems control," IEEE Trans. Neural Networks, vol. 6, no. 1, pp. 144-156, Jan. 1995.
[3]. D. Simon, "Biogeography-based optimization," IEEE Trans. Evol. Comput., vol. 12, no. 6, pp. 702-713, Dec. 2008.
[4]. J. J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities," Proc. National Academy of Sciences USA, vol. 79, pp. 2554-2558, Apr. 1982.
[5]. J. J. Hopfield, "Neurons with graded response have collective computational properties like those of two-state neurons," Proc. National Academy of Sciences USA, vol. 81, pp. 3088-3092, May 1984.
[6]. J. H. Park, Y. S. Kim, I. K. Eom, and K. Y. Lee, "Economic load dispatch for piecewise quadratic cost function using Hopfield neural networks," IEEE Trans. Power Systems, vol. 8, no. 3, pp. 1030-1038, Aug. 1993.
[7]. K. M. Passino, "Biomimicry of bacterial foraging for distributed optimization and control," IEEE Control Systems Magazine, vol. 22, no. 3, pp. 52-67, Jun. 2002.
[8]. L. N. De Castro and J. Timmis, Artificial Immune Systems: A New Computational Intelligence Approach. London, U.K.: Springer-Verlag, 1996.
[9]. M. Moore and A. Narayanan, Quantum-Inspired Computing, Dept. of Computer Science, University of Exeter, Exeter, U.K., 1995.
[10]. R. C. Eberhart and Y. Shi, "Comparison between genetic algorithms and particle swarm optimization," in Proc. IEEE Int. Conf.
Evolutionary Computation, May 1998, pp. 611-616.
[11]. R. Storn and K. Price, "Differential Evolution - A Simple and Efficient Adaptive Scheme for Global Optimization over Continuous Spaces," International Computer Science Institute, Berkeley, CA, Tech. Rep. TR-95-012, 1995.
[12]. T. D. King, M. E. El-Hawary, and F. El-Hawary, "Optimal environmental dispatching of electric power systems via an improved Hopfield neural network model," IEEE Trans. Power Systems, vol. 10, no. 3, pp. 1559-1565, Aug. 1995.
[13]. D. W. Tank and J. J. Hopfield, "Simple 'neural' optimization networks: an A/D converter, signal decision circuit, and a linear programming circuit," IEEE Trans. Circuits and Systems, vol. CAS-33, no. 5, pp. 533-541, May 1986.