Generating Uniformly Distributed Pareto Optimal Points for Constrained and Unconstrained Multicriteria Optimization

Crina Grosan, Department of Computer Science, Babes-Bolyai University, Cluj-Napoca, Romania, cgrosan@cs.ubbcluj.ro
Ajith Abraham, Norwegian Center of Excellence, Q2S, Norwegian University of Science and Technology, Trondheim, Norway, ajith.abraham@ieee.org

Abstract

This paper proposes a novel approach for generating a uniform distribution of optimal solutions along the Pareto frontier. We take line search, a standard mathematical optimization technique, and adapt it so that it generates a set of solutions uniformly distributed along the Pareto front. To validate the method, numerical bi-criteria examples are considered. The method handles both unconstrained and constrained multicriteria optimization problems.

1 Introduction

Multiobjective optimization problems (MOPs) are among the hardest optimization problems and arise in many real-world applications. Due to the increasing interest in multiobjective optimization, several algorithms for dealing with such problems have been proposed recently [1][2][3][6][7][8]. Most of these algorithms use an iterative method to generate multiple points approximating the Pareto set.

A Pareto optimal set is regarded as the mathematical solution to a multicriteria problem. In continuous problems, the number of Pareto optimal solutions is usually infinite. Only in relatively simple cases can the entire Pareto optimal set be determined analytically. In a typical problem, we must be satisfied with obtaining enough Pareto optima to cover the minimal set in the criteria space properly. This computed subset of Pareto optima can be called a representative Pareto optimal set, and its quality can be judged, for example, by its ability to cover the whole minimal set evenly. In some problems, however, the cost of generating even one Pareto optimum may become so high that the designer can afford only a few Pareto optimal solutions. Before performing numerical optimization, a suitable generation strategy must therefore be selected which guarantees that only Pareto optima are obtained.

This paper deals with the generation of uniformly distributed Pareto points. A scalarization of the objectives is used to transform the multiobjective optimization problem into a single objective optimization problem. A line search based technique is then used to obtain one efficient solution. Starting from this solution, a set of further efficient points, widely distributed along the Pareto frontier, is generated using again a line search based method, this time involving the Pareto dominance relationship.

The rest of the paper is organized as follows. The multiobjective algorithm is presented in Section 2. The numerical examples used to demonstrate the efficiency of the proposed approach are presented in Section 3, followed by conclusions in Section 4.

2 The Multicriteria Approach

Line search [3] is a well established optimization technique. The standard line search technique is modified here so that it is able to generate the set of nondominated solutions of a multiobjective optimization problem (MOP) [4]. The approach comprises two phases: first, the problem is transformed into a single objective optimization problem (SOP) and a solution is found using a line search based approach; this is called the convergence phase. Second, a set of Pareto solutions is generated starting from the solution obtained at the end of the convergence phase; this is called the spreading phase.
Consider the MOP formulated as follows. Let R^m and R^n be Euclidean vector spaces referred to as the decision space and the objective space, respectively. Let X ⊆ R^m be the feasible set and let f: R^m → R^n be a vector-valued objective function composed of n real-valued objective functions, f = (f_1, f_2, ..., f_n), where f_k: R^m → R for k = 1, 2, ..., n. The MOP is given by:

min (f_1(x), f_2(x), ..., f_n(x)), subject to x ∈ X.

The MOP is further transformed into a single objective optimization problem (SOP) by considering:

min F(x) = Σ_{i=1}^{n} f_i^3(x), subject to x ∈ X.

Direction and step setting

The direction is set as a random number between 0 and 1. The step is set as:

α_k = 2 + 3 / (2^{2k} + 1)    (1)

where k is the iteration number.

Convergence phase

Initially, a set of points is randomly generated within the given boundaries. The line search, with direction and step set as above, is applied to each of these points for a given number of iterations. Afterwards, the boundaries of the definition domain are modified taking the partial derivatives into account, as follows. Let x be the best result obtained in the previous set of iterations. For each dimension i of the point x, the first partial derivative of the objective function with respect to this dimension is calculated; the resulting gradient is denoted by g. The bounds of the definition domain for each dimension are then re-calculated as:

if g_i = ∂F/∂x_i < 0 then lower bound = x_i;
if g_i = ∂F/∂x_i > 0 then upper bound = x_i.

The search process is then re-started by re-initializing a new arbitrary point between the newly obtained boundaries.

Spreading phase

At the end of the convergence phase, a solution is obtained. This solution is considered an efficient (or Pareto) solution. During the spreading phase, starting from this existing solution, more efficient solutions are generated so as to obtain a thorough distribution of good solutions along the Pareto frontier. To this end, the line search technique is used to generate one solution at the end of each set of iterations, and the procedure is applied several times in order to obtain a larger set of nondominated solutions. The following steps are repeated in order to obtain one nondominated solution:

Step 1. The set of nondominated solutions found so far is archived; denote it by NonS. Initially, this set has size one and contains only the solution obtained at the end of the convergence phase.

Step 2. Line search is applied to one solution and one dimension of this solution at a time. For this:

Step 2.1. A random integer i between 1 and |NonS| (|·| denotes cardinality) is generated. Denote the corresponding solution by nons_i.

Step 2.2. A random integer j between 1 and the number of dimensions (the number of decision variables) is generated. Denote the corresponding component of nons_i by nons_ij.

Step 3. Line search is applied to nons_ij:

Step 3.1. Set p = random.

Step 3.2. Set α (its value depends on the problem, on the total number of nondominated solutions to be generated, etc.).

Step 3.3. The newly obtained solution new_sol is identical to nons_i in all dimensions except dimension j, for which:

new_sol_j = nons_ij + α · p

Step 3.4. If (new_sol_j > upper bound) or (new_sol_j < lower bound) then new_sol_j = lower bound + random · (upper bound − lower bound).

Step 4. If F(new_sol) > F(nons_1) then discard new_sol; otherwise, if new_sol is nondominated with respect to the set NonS, then add new_sol to NonS and increase the size of NonS by 1. Go to Step 2.

Step 5. Stop.
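To make the two phases concrete, the following Python sketch shows one way the procedure described above could be implemented. It is a minimal illustration under stated assumptions, not the authors' code: the function names, the component-wise random direction, the finite-difference gradient estimate, the acceptance of only improving line search steps, the clamping of candidates to the current bounds, the reading of "nondominated with respect to NonS" as "not dominated by any archived solution", and the stopping rule (target archive size with a trial budget) are all choices made here where the paper leaves details open.

```python
import random

def scalarize(objectives, x):
    # Scalarized objective F(x) = sum_i f_i(x)^3, as in the SOP above.
    return sum(f(x) ** 3 for f in objectives)

def step_length(k):
    # Step setting of Eq. (1): alpha_k = 2 + 3 / (2^(2k) + 1).
    return 2.0 + 3.0 / (2 ** (2 * k) + 1)

def partial_derivative(F, x, i, h=1e-6):
    # Forward-difference estimate of dF/dx_i (the paper does not fix a formula).
    xp = list(x)
    xp[i] += h
    return (F(xp) - F(x)) / h

def convergence_phase(objectives, lower, upper, restarts=10, iters=10):
    # Phase 1: line search on the scalarized problem with shrinking bounds.
    F = lambda x: scalarize(objectives, x)
    dim = len(lower)
    lower, upper = list(lower), list(upper)
    best = None
    for _ in range(restarts):
        # Re-initialise an arbitrary point inside the current bounds.
        x = [random.uniform(lower[i], upper[i]) for i in range(dim)]
        for k in range(1, iters + 1):
            alpha = step_length(k)
            p = [random.random() for _ in range(dim)]      # direction: random in (0, 1)
            cand = [min(max(x[i] + alpha * p[i], lower[i]), upper[i])
                    for i in range(dim)]                   # kept inside bounds (our choice)
            if F(cand) < F(x):                             # keep only improving steps (our choice)
                x = cand
        if best is None or F(x) < F(best):
            best = x
        # Shrink the bounds around the best point using the sign of the gradient.
        for i in range(dim):
            g_i = partial_derivative(F, best, i)
            if g_i < 0:
                lower[i] = best[i]
            elif g_i > 0:
                upper[i] = best[i]
    return best, lower, upper

def dominates(a, b, objectives):
    # Pareto dominance: a is no worse in all objectives and strictly better in one.
    fa = [f(a) for f in objectives]
    fb = [f(b) for f in objectives]
    return all(u <= v for u, v in zip(fa, fb)) and any(u < v for u, v in zip(fa, fb))

def spreading_phase(objectives, seed, lower, upper, alpha,
                    n_points=100, max_trials=100000):
    # Phase 2 (Steps 1-5): perturb archived solutions one coordinate at a time.
    F = lambda x: scalarize(objectives, x)
    non_s = [list(seed)]                                   # Step 1: archive NonS
    dim = len(seed)
    for _ in range(max_trials):
        if len(non_s) >= n_points:
            break
        i = random.randrange(len(non_s))                   # Step 2.1: pick a solution
        j = random.randrange(dim)                          # Step 2.2: pick a dimension
        p = random.random()                                # Step 3.1
        new_sol = list(non_s[i])
        new_sol[j] = non_s[i][j] + alpha * p               # Step 3.3
        if not (lower[j] <= new_sol[j] <= upper[j]):       # Step 3.4: random repair
            new_sol[j] = lower[j] + random.random() * (upper[j] - lower[j])
        if F(new_sol) > F(non_s[0]):                       # Step 4: discard if worse than nons_1
            continue
        if not any(dominates(s, new_sol, objectives) for s in non_s):
            non_s.append(new_sol)                          # accept a nondominated candidate
    return non_s
```

For the bi-criteria problems of Section 3, objectives would simply be a list of two callables taking a decision vector, e.g. objectives = [f1, f2].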

3 Experiment Results

In order to assess the ability of the technique described above to generate uniformly distributed Pareto points, we use two multiobjective optimization test problems: an unconstrained MOP and a constrained MOP.

3.1 Unconstrained example (Problem 1)

Consider the following problem [9]:

min f = {f_1(x), f_2(x)}
f_1(x) = cosh(x)
f_2(x) = x^2 − 12x + 35

The multiobjective approach used the following parameters: number of re-starts: 10; number of iterations per re-start: 10; α for the spreading phase (its value depends on the number of Pareto solutions to be generated). The Pareto frontier generated using 100 solutions is depicted in Figure 1 and using 1000 solutions in Figure 2. The value of α is 4.9 when the Pareto front consists of 100 points and 9.5 when it has 1000 points.

Figure 1. Pareto front obtained with 100 solutions for Problem 1.
Figure 2. Pareto front obtained with 1000 solutions for Problem 1.

3.2 Constrained example (Problem 2)

Consider the following test function with equality and inequality constraints [9]:

min f = {f_1(x), f_2(x)}
f_1(x) = (x_1 − 2)^2 + (x_2 − 1)^2
f_2(x) = x_1^2 + (x_2 − 6)^2
subject to
g_1(x) = x_1^2 − x_2 ≤ 0
g_2(x) = 5x_1^2 + x_2 ≤ 10
g_3(x) = x_2 ≤ 5
g_4(x) = x_1 ≥ 0

The Pareto frontier generated using 100 solutions is illustrated in Figure 3 and using 1000 solutions in Figure 4. The value of α is 5.4 when the Pareto front consists of 100 points and 7.2 when it has 1000 points.
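As a usage illustration, the two test problems could be coded as follows and passed to the routines sketched at the end of Section 2. This is a hedged sketch: the helper names, the illustrative variable bounds, and the rejection-based constraint handling in p2_feasible are assumptions made here; the paper does not state how infeasible candidates are treated.

```python
import math

# Problem 1 (unconstrained), Section 3.1: a single decision variable x[0].
def p1_f1(x):
    return math.cosh(x[0])

def p1_f2(x):
    return x[0] ** 2 - 12 * x[0] + 35

# Problem 2 (constrained), Section 3.2: two decision variables x[0], x[1].
def p2_f1(x):
    return (x[0] - 2) ** 2 + (x[1] - 1) ** 2

def p2_f2(x):
    return x[0] ** 2 + (x[1] - 6) ** 2

def p2_feasible(x):
    # Constraints g1-g4; rejecting infeasible points is an assumed handling.
    x1, x2 = x
    return (x1 ** 2 - x2 <= 0 and
            5 * x1 ** 2 + x2 <= 10 and
            x2 <= 5 and
            x1 >= 0)

# Example run for Problem 1 with the settings reported above
# (10 re-starts, 10 iterations each, alpha = 4.9 for 100 points);
# the decision-variable bounds below are illustrative, not taken from the paper.
# seed, lo, up = convergence_phase([p1_f1, p1_f2], lower=[0.0], upper=[10.0])
# front = spreading_phase([p1_f1, p1_f2], seed, lo, up, alpha=4.9, n_points=100)
```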

Figure 3. Pareto front obtained with 100 solutions for Problem 2.
Figure 4. Pareto front generated using 1000 points for Problem 2.
Figure 5. Distributions of solutions along the Pareto frontier for Problem 2 for: (a) α = 1.5 and (b) α = 9.7.

One of the challenges of this approach is finding an adequate value for α. The values used in our examples were set based on several experiments. For example, Figure 5 depicts two other distributions of the solutions along the Pareto front, obtained for α = 1.5 (Figure 5(a)) and α = 9.7 (Figure 5(b)), which are not uniform.

4 Conclusions

This paper proposed a novel method for finding a set of uniformly distributed Pareto solutions along the Pareto frontier. A line search based technique is first applied in order to obtain one solution. Starting from this solution, a simplified version of the initial line search is used to generate solutions that are well distributed along the Pareto frontier. Numerical experiments performed on unconstrained as well as constrained multiobjective optimization test problems show that the approach converges very fast and provides a very good distribution of solutions along the Pareto frontier. The approach uses a parameter whose setting leads to a better or worse distribution. The value of this parameter was set here based on experimental tests; one of our future research plans is to find a better way of setting the right value for this parameter.

References

[1] Das, I. and Dennis, J., A Closer Look at Drawbacks of Minimizing Weighted Sums of Objectives for Pareto Set Generation in Multicriteria Optimization Problems, Structural Optimization, 14, pp. 63-69, 1997.

[2] Dumitrescu, D., Grosan, C. and Oltean, M., Evolving Continuous Pareto Regions, in: Evolutionary Multiobjective Optimization: Theoretical Advances and Applications, Abraham, A., Jain, L. and Goldberg, R. (Eds.), Springer Verlag, London, ISBN 1852337877, pp. 167-199, 2005.

[3] Ehrgott, M. and Wiecek, M.M., Multiobjective Programming, in: Multiple Criteria Decision Analysis: State of the Art Surveys, Figueira, J. et al. (Eds.), International Series in Operations Research & Management Science, Volume 78, Springer, New York, 2005.

[4] Gould, N., An Introduction to Algorithms for Continuous Optimization, Oxford University Computing Laboratory Notes, 2006.

[5] Grosan, C. and Abraham, A., Hybrid Line Search for Multiobjective Optimization, 2007 International Conference on High Performance Computing and Communications (HPCC-07), Houston, USA, Perrott, R. et al. (Eds.), LNCS 4782, Springer Verlag, Germany, pp. 62-73, 2007.

[6] Grosan, C., Abraham, A. and Nicoara, M., Search Optimization Using Hybrid Particle Sub-Swarms and Evolutionary Algorithms, International Journal of Simulation Systems, Science & Technology, 6(10-11), pp. 60-79, 2005.

[7] Messac, A. and Mattson, C.A., Generating Well-Distributed Sets of Pareto Points for Engineering Design Using Physical Programming, Optimization and Engineering, 3, pp. 431-450, 2002.

[8] Messac, A., Ismail-Yahaya, A. and Mattson, C.A., The Normalized Normal Constraint Method for Generating the Pareto Frontier, Structural and Multidisciplinary Optimization, 25(2), pp. 86-98, 2003.

[9] Zhang, W.H. and Gao, T., A Min-Max Method with Adaptive Weightings for Uniformly Spaced Pareto Optimum Points, Computers and Structures, 84, pp. 1760-1769, 2006.