Multicriterial Optimization Using Genetic Algorithm
- Joel Dickerson
1 Multicriterial Optimization Using Genetic Algorithm [Figure: best fitness and mean fitness versus generations]
2 Contents Optimization, Local and Global Optimization; Multicriterial Optimization; Constraints; Methods of Solution; Examples; Task of the Decision Maker
3 Global optimization Global optimization is the process of finding the global extreme value (minimum or maximum) within some search space S. The single-objective global optimization problem can be formally defined as follows:
4 Global optimization Then x* is the global solution (or solutions), f is the objective function, and the set Ω is the feasible region (Ω ⊆ S). The problem of finding the minimum solution(s) is called the global optimization problem. Maximization can be expressed in terms of minimization by the formula: max{ f(x) } = -min{ -f(x) }
5 Optimization [Figure: local optima and the global optimum of an objective function]
6 Multicriterial optimization Although a single-objective optimization problem may have a unique optimal solution (the global optimum), Multiobjective Optimization Problems (MOPs) as a rule present an uncountable set of solutions, which, when evaluated, produce vectors whose components represent trade-offs in objective space. A decision maker (DM) then implicitly chooses an acceptable solution (or solutions) by selecting one or more of these vectors.
7 Multicriterial optimization [Figure: objective space with objective functions F = [f1, f2]; ● = best solutions, + = normal solutions]
8 Multicriterial optimization The Multiobjective Optimization Problem, also called the multicriteria optimization or vector optimization problem, can then be defined (in words) as the problem of finding a vector of decision variables which satisfies the constraints and optimizes a vector function whose elements represent the objective functions. These functions form a mathematical description of performance criteria which are usually in conflict with each other. Hence the term optimize means finding a solution which gives values of all the objective functions acceptable to the decision maker.
9 Decision variables The decision variables are the numerical quantities for which values are to be chosen in an optimization problem.
10 Constraints In most optimization problems there are restrictions imposed by the particular characteristics of the environment or the resources available (e.g. physical limitations, time restrictions, etc.). These restrictions must be satisfied for a solution to be considered acceptable. All these restrictions are in general called constraints, and they describe dependences among the decision variables and the constants (or parameters) involved in the problem.
11 Constraints These constraints are expressed in the form of mathematical inequalities gi(x) <= 0, i = 1, 2, ..., m, or equalities hj(x) = 0, j = 1, 2, ..., p, where p < n and n is the size of the decision vector.
12 Constraints The number p of equality constraints must be less than n, the number of decision variables, because if p >= n the problem is said to be overconstrained, since there are no degrees of freedom left for optimizing (no fewer equations than unknowns). The number of degrees of freedom is given by (n - p). Constraints can also be explicit (i.e. given in algebraic form) or implicit, in which case the algorithm to compute gi(x) for any given vector x must be known.
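A minimal sketch of how such constraints can be checked in code, assuming the standard form gi(x) <= 0 and hj(x) = 0; the example problem and all names are illustrative, not from the presentation:

```python
# Feasibility check for a candidate solution against explicit constraints.
# The example constraints below are illustrative only.

def is_feasible(x, inequality_constraints, equality_constraints, tol=1e-9):
    """x is feasible if every g_i(x) <= 0 and every h_j(x) = 0 (within tol)."""
    return (all(g(x) <= tol for g in inequality_constraints) and
            all(abs(h(x)) <= tol for h in equality_constraints))

# Example: n = 2 decision variables and p = 1 equality constraint, so
# n - p = 1 degree of freedom remains for optimization.
g = [lambda x: x[0]**2 + x[1]**2 - 4.0]   # stay inside a circle of radius 2
h = [lambda x: x[0] - x[1]]               # the two variables must be equal

print(is_feasible((1.0, 1.0), g, h))   # True
print(is_feasible((3.0, 3.0), g, h))   # False: violates the inequality
```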
13 Objective Functions In order to know how good a certain solution is, it is necessary to have some criteria to evaluate it (for example profit, number of employees, etc.). These criteria are expressed as computable functions of the decision variables, which are called objective functions. In real-world problems some of them conflict with others, and some have to be minimized while others are maximized. Objective functions may be commensurable (measured in the same units) or non-commensurable (measured in different units).
14 Types of Multicriterial Optimization Problem In multiobjective optimization problems there are three possible situations: minimize all objective functions; maximize all objective functions; minimize some and maximize others.
15 Objective Functions The multiple objectives being optimized almost always conflict, placing a partial, rather than total, ordering on the search space. In fact, finding the global optimum of a general MOP is NP-complete (Bäck 1996).
16 Attributes, Criteria, Objectives and Goals Attributes are often thought of as differentiating aspects, properties or characteristics of alternatives or consequences. Criteria generally denote evaluative measures, dimensions or scales against which alternatives may be gauged in a value or worth sense. Objectives are sometimes viewed in the same way, but also denote specific desired levels of attainment or vague ideals. Goals usually indicate either of the latter notions. A distinction commonly made in Operations Research is to use the term goal to designate potentially attainable levels, and objective to designate unattainable ideals.
17 Attributes, Criteria, Objectives and Goals The convention adopted in this presentation is the same one assumed by several researchers (Horn 1997; Fishburn 1978) of using the terms objective, criterion, and attribute interchangeably to represent an MOP's goals or objectives (i.e. distinct mathematical functions) to be achieved. The terms objective space or objective function space are also used to denote the coordinate space within which vectors resulting from evaluating an MOP are plotted.
18 Objective Functions
19 Euclidean space The set of all n-tuples of real numbers, denoted by R^n, is called Euclidean n-space. Two Euclidean spaces are considered: the n-dimensional space of decision variables, in which each coordinate axis corresponds to a component of the vector x; and the k-dimensional space of objective functions, in which each coordinate axis corresponds to a component of the vector f(x).
20 Euclidean space Every point in the first space (decision variables) represents a solution and gives a certain point in the second space (objective functions), which determines the quality of the solution in terms of the values of the objective functions.
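This mapping can be sketched as follows; the two objective functions are illustrative, not from the presentation:

```python
# Mapping a point of the n-dimensional decision variable space to the
# k-dimensional objective function space. Objectives are illustrative.

def F(x):
    """Vector objective function F(x) = (f1(x), f2(x)) for x in R^2."""
    f1 = x[0]**2 + x[1]**2          # e.g. one cost to minimize
    f2 = (x[0] - 1.0)**2 + x[1]**2  # e.g. a conflicting cost
    return (f1, f2)

# Each decision vector yields one point in objective space, whose coordinates
# express the quality of the solution with respect to each criterion.
for x in [(0.0, 0.0), (1.0, 0.0), (0.5, 0.0)]:
    print(x, "->", F(x))
```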
21 Euclidean space [Figure: mapping from the decision variable space to the objective function space]
22 General Multicriterial Optimization Problem
23 General Multicriterial Optimization Problem
24 Conversion of the Multicriterial Optimization Problem For simplicity, all functions are normally converted to either maximization or minimization form. For example, the following identity may be used to convert all functions which are to be maximized into a form which allows their minimization: min{ f(x) } = -max{ -f(x) }
25 Conversion of the Multicriterial Optimization Problem Similarly, inequality constraints of the form gi(x) >= 0, i = 1, 2, ..., m can be converted to the form (1.8) by multiplying by -1 and changing the direction of the inequality. Thus the previous equation is equivalent to -gi(x) <= 0, i = 1, 2, ..., m
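Both conversions are mechanical; a small sketch with illustrative functions (not from the slides):

```python
# Converting a problem to pure minimization form: a maximized objective is
# negated, and a constraint g(x) >= 0 becomes -g(x) <= 0.

def to_minimization(f_max):
    """max f(x) is equivalent to min -f(x)."""
    return lambda x: -f_max(x)

def flip_constraint(g_ge):
    """g(x) >= 0 is equivalent to -g(x) <= 0."""
    return lambda x: -g_ge(x)

profit = lambda x: 10.0 - (x - 3.0)**2     # objective to maximize
g = lambda x: x - 1.0                      # original form: g(x) >= 0

cost = to_minimization(profit)             # now minimize cost instead
g_le = flip_constraint(g)                  # now require g_le(x) <= 0

print(cost(3.0))    # -10.0: the maximum of profit is the minimum of cost
print(g_le(2.0))    # -1.0 <= 0, so x = 2 satisfies the flipped constraint
```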
26 Multicriterial Optimization Problem Ideal Solution
27 Multicriterial Optimization Problem Ideal Solution
28 Multicriterial Optimization Problem Ideal Solution [Figure 1.1: a single optimal solution, with the single solution vector x* attaining f1(x*) and f2(x*) simultaneously]
29 Multicriterial Optimization Problem Ideal Vector
30 Multicriterial Optimization Problem Convexity
31 Multicriterial Optimization Problem Convex Sets
32 Multicriterial Optimization Problem Non-convex Sets
33 Multicriterial Optimization Problem Pareto Optimality
34 Multicriterial Optimization Problem Pareto Optimality In words, this definition says that x* is Pareto optimal if there exists no feasible vector which would decrease some criterion without causing a simultaneous increase in at least one other criterion. The phrase Pareto optimal is taken to mean with respect to the entire decision variable space unless otherwise specified.
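The definition can be sketched directly in code; a minimal dominance test for minimization, with illustrative example vectors:

```python
# Pareto dominance for minimization: a dominates b if a is no worse in every
# objective and strictly better in at least one.

def dominates(a, b):
    """Objective vectors a and b; all objectives are minimized."""
    return (all(ai <= bi for ai, bi in zip(a, b)) and
            any(ai < bi for ai, bi in zip(a, b)))

print(dominates((1.0, 2.0), (2.0, 3.0)))  # True: better in both objectives
print(dominates((1.0, 3.0), (2.0, 2.0)))  # False: a trade-off, neither dominates
print(dominates((1.0, 2.0), (1.0, 2.0)))  # False: equal vectors do not dominate
```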
35 Multicriterial Optimization Problem Pareto Optimality [Figure: objective space with objective functions F = [f1, f2]; ● = Pareto optimal set, + = normal solutions]
36 Multicriterial Optimization Problem Pareto Optimality
37 Multicriterial Optimization Problem Pareto Front The minima in the Pareto sense lie on the boundary of the design region, or on the locus of the tangent points of the objective functions. In Figure 1.6 a bold curve marks this boundary for a bi-objective problem. The set of points defined by this bold curve is called the Pareto front.
38 Multicriterial Optimization Problem Pareto Front [Figure: objective space F = [f1, f2] with the Pareto front marked in bold on the boundary of the feasible region]
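For a finite population, the Pareto front follows directly from the dominance relation; a minimal sketch (all objectives minimized, example population illustrative):

```python
# Extracting the nondominated (Pareto optimal) set from a finite population
# of objective vectors; all objectives are minimized.

def dominates(a, b):
    return (all(x <= y for x, y in zip(a, b)) and
            any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Keep the points that no other point in the population dominates."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

population = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0), (4.0, 4.0)]
print(pareto_front(population))  # [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0)]
```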
39 Multicriterial Optimization Problem Global Optimization Defining an MOP global optimum is not a trivial task, as the best compromise solution depends on the specific preferences (or biases) of the (human) decision maker. Solutions may also have some temporal dependences (e.g. acceptable resource expenditures may vary from month to month). Thus there is no universally accepted definition of the MOP global optimization problem (although more and more individual solutions are being implemented...).
40 General Optimization Algorithms Overview General search and optimization techniques are classified into three categories: enumerative, deterministic and stochastic (random) (Figure 1.11 on the next page). As many real-world problems are computationally intensive, some means of limiting the search space must be implemented to find acceptable solutions in acceptable time (Michalewicz and Fogel 2000). Deterministic algorithms attempt this by incorporating problem domain knowledge; many graph/tree search algorithms are known and applied.
41 General Optimization Algorithms Overview
42 General Optimization Algorithms Genetic Algorithm
43 General Optimization Algorithms Genetic Algorithm
44 General Optimization Algorithms Genetic Algorithm
45 General Optimization Algorithms Genetic Algorithm [Figure: floating-point coded chromosomes; for each generation, the individuals' objective values are mapped to fitness values, which drive selection]
46 General Optimization Algorithms Genetic Algorithm [Figure: crossover produces children from parents; mutation then perturbs them]
47 General Optimization Algorithms Genetic Algorithm Mutation [Figure: parents, crossover and children shown in the two-dimensional decision variable space (Variable_1, Variable_2)]
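The crossover and mutation steps for floating-point chromosomes can be sketched as follows; arithmetic (blend) crossover and Gaussian mutation are assumed here, since the slides do not specify the exact operators:

```python
# Floating-point crossover and mutation for real-coded chromosomes.
# Arithmetic crossover and Gaussian mutation are one common choice.
import random

def crossover(parent1, parent2, alpha=None):
    """Arithmetic crossover: children are convex combinations of the parents."""
    a = random.random() if alpha is None else alpha
    child1 = [a * x + (1 - a) * y for x, y in zip(parent1, parent2)]
    child2 = [(1 - a) * x + a * y for x, y in zip(parent1, parent2)]
    return child1, child2

def mutate(individual, sigma=0.1, rate=0.2):
    """Gaussian mutation: perturb each gene with probability `rate`."""
    return [g + random.gauss(0.0, sigma) if random.random() < rate else g
            for g in individual]

c1, c2 = crossover([0.0, 0.0], [1.0, 1.0], alpha=0.25)
print(c1, c2)  # [0.75, 0.75] [0.25, 0.25]
```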
48 MOGA Optimization Algorithms Genetic Algorithm [Figure: objective function space with nondominated ranking; Rank = 1 individuals lie on the Pareto front, Rank = 2 and Rank = 3 individuals on successive local Pareto fronts]
49 MOGA Optimization Algorithms Genetic Algorithm [Figure: nonlinear assignment of (dummy) fitness as a decreasing function of rank, from Rank = 1 to Rank_MAX]
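The ranking step can be sketched as follows, assuming the Fonseca and Fleming MOGA convention in which an individual's rank is one plus the number of population members dominating it (the exact ranking rule is not spelled out in the transcription); the dummy fitness is then assigned as a decreasing function of this rank:

```python
# MOGA-style ranking: rank(i) = 1 + (number of individuals dominating i),
# so every nondominated individual receives rank 1. Minimization assumed.

def dominates(a, b):
    return (all(x <= y for x, y in zip(a, b)) and
            any(x < y for x, y in zip(a, b)))

def moga_ranks(objectives):
    return [1 + sum(dominates(q, p) for q in objectives) for p in objectives]

population = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0), (4.0, 4.0)]
print(moga_ranks(population))  # [1, 1, 1, 2, 5]
```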
50 MOGA Optimization Algorithms Genetic Algorithm [Figure: MOGA operation (theoretically), the population converging onto the Pareto front in objective space]
51 MOGA Optimization Algorithms Genetic Algorithm [Figure: genetic drift (real operation of MOGA), the population clustering on only part of the Pareto front]
52 MOGA Optimization Algorithms Genetic Algorithm Breaking Genetic Drift with Fitness Correction [Figure: step 1, normalization of the objective space onto [0,1]x[0,1]; step 2, fitness correction]
53 MOGA Optimization Algorithms Genetic Algorithm [Figure: the niche count falls from 1 at zero distance to 0 at distance σshare; the sum of the niche counts gives 1/Wi, the reciprocal of the fitness correction factor]
54 MOGA Optimization Algorithms Genetic Algorithm Calculation of Fitness Correction Factors
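One common way to compute such correction factors is fitness sharing; a sketch assuming a triangular sharing kernel of radius σshare applied in the normalized objective space (the exact kernel used in the presentation is not specified):

```python
# Fitness sharing to counter genetic drift: each individual's niche count is
# the sum of a sharing function over its distances to all individuals, and
# its fitness is multiplied by the correction factor W_i = 1 / (niche count).
import math

def sharing(d, sigma_share):
    """Sharing function: 1 at distance 0, falling linearly to 0 at sigma_share."""
    return max(0.0, 1.0 - d / sigma_share)

def niche_counts(points, sigma_share):
    def dist(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    return [sum(sharing(dist(p, q), sigma_share) for q in points)
            for p in points]

# Two crowded points share a niche; the isolated one keeps a niche count of 1,
# so after correction it is favoured by selection.
pts = [(0.0, 0.0), (0.05, 0.0), (1.0, 1.0)]
print(niche_counts(pts, sigma_share=0.1))  # approximately [1.5, 1.5, 1.0]
```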
55 MOGA Optimization Algorithms Genetic Algorithm [Figure: fitness versus rank values after breaking genetic drift with fitness correction]
56 MOGA Optimization Examples Examples:
MOP-1: f1(x) = x^2, f2(x) = (x - 2)^2, where -10 <= x <= 10. [Figures: MOP-1 normal; MOP-1 with Drift Break]
MOP-2: f1(x, y) = x; f2(x, y) involves the term x sin(8πx), where 0 <= x, y <= 1 [equations only partially transcribed]. [Figures: MOP-2 normal; MOP-2 with Drift Break]
57 MOGA Optimization Examples MOP-3:
f1(x, y) = 1 + (A1 - B1)^2 + (A2 - B2)^2
f2(x, y) = 1 + (x + 3)^2 + (y + 1)^2
where
A1 = 0.5 sin(1) - 2 cos(1) + sin(2) - 1.5 cos(2)
A2 = 1.5 sin(1) - cos(1) + 2 sin(2) - 0.5 cos(2)
B1 = 0.5 sin(x) - 2 cos(x) + sin(y) - 1.5 cos(y)
B2 = 1.5 sin(x) - cos(x) + 2 sin(y) - 0.5 cos(y)
subject to -π <= x, y <= π. [Figures: MOP-3 normal; MOP-3 with Drift Break]
58 MOGA Optimization Examples MOP-4:
f1(x) = 1 - exp( -Σ_{i=1..n} (x_i - 1/√n)^2 )
f2(x) = 1 - exp( -Σ_{i=1..n} (x_i + 1/√n)^2 )
where i = 1, 2 (n = 2) and -4 <= x_i <= 4. [Figures: MOP-4 normal; MOP-4 with Drift Break]
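MOP-4 matches the well-known Fonseca and Fleming test function and can be evaluated directly; a sketch assuming n = 2 as on the slide:

```python
# Evaluating MOP-4: f1 is zero exactly when every x_i = 1/sqrt(n) and f2 when
# every x_i = -1/sqrt(n), so the two optima conflict and a Pareto front
# stretches between them.
import math

def mop4(x):
    n = len(x)
    c = 1.0 / math.sqrt(n)
    f1 = 1.0 - math.exp(-sum((xi - c) ** 2 for xi in x))
    f2 = 1.0 - math.exp(-sum((xi + c) ** 2 for xi in x))
    return f1, f2

x = [1.0 / math.sqrt(2)] * 2          # n = 2, the minimizer of f1
f1, f2 = mop4(x)
print(f1, f2)  # f1 = 0.0 and f2 = 1 - exp(-4), about 0.9817
```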
59 Decision Maker Mathematically, every Pareto optimal point is an equally acceptable solution of the multiobjective optimization problem. However, it is generally desirable to obtain one point as the solution. Selecting one of the Pareto optimal solutions calls for information that is not contained in the objective functions. That is why, compared to single-objective optimization, a new element is added in multiobjective optimization.
60 Decision Maker We need a decision maker to make the selection. The decision maker is a person (or a group of persons) who is supposed to have better insight into the problem and who can express preference relations between different solutions. Usually the decision maker is responsible for the final solution.
61 Decision Maker Solving a multiobjective optimization problem calls for the co-operation of the decision maker and an analyst. By an analyst we mean a person or a computer program responsible for the mathematical side of the solution process. The analyst generates information for the decision maker to consider, and the solution is selected according to the preferences of the decision maker.
62 Decision Maker It is assumed in the following that we have a single decision maker or a unanimous group of decision makers. Generally, group decision making is a world of its own; it calls for negotiations and specific methods when searching for compromises between different interest groups.
63 Thank you for your attention Questions?
Topological Machining Fixture Layout Synthesis Using Genetic Algorithms Necmettin Kaya Uludag University, Mechanical Eng. Department, Bursa, Turkey Ferruh Öztürk Uludag University, Mechanical Eng. Department,
More informationNon-convex Multi-objective Optimization
Non-convex Multi-objective Optimization Multi-objective Optimization Real-world optimization problems usually involve more than one criteria multi-objective optimization. Such a kind of optimization problems
More informationBackground for Surface Integration
Background for urface Integration 1 urface Integrals We have seen in previous work how to define and compute line integrals in R 2. You should remember the basic surface integrals that we will need to
More informationPerformance Assessment of DMOEA-DD with CEC 2009 MOEA Competition Test Instances
Performance Assessment of DMOEA-DD with CEC 2009 MOEA Competition Test Instances Minzhong Liu, Xiufen Zou, Yu Chen, Zhijian Wu Abstract In this paper, the DMOEA-DD, which is an improvement of DMOEA[1,
More informationDOWNLOAD PDF BIG IDEAS MATH VERTICAL SHRINK OF A PARABOLA
Chapter 1 : BioMath: Transformation of Graphs Use the results in part (a) to identify the vertex of the parabola. c. Find a vertical line on your graph paper so that when you fold the paper, the left portion
More informationEfficient Non-domination Level Update Approach for Steady-State Evolutionary Multiobjective Optimization
Efficient Non-domination Level Update Approach for Steady-State Evolutionary Multiobjective Optimization Ke Li 1, Kalyanmoy Deb 1, Qingfu Zhang 2, and Sam Kwong 2 1 Department of Electrical and Computer
More informationEC5555 Economics Masters Refresher Course in Mathematics September Lecture 6 Optimization with equality constraints Francesco Feri
EC5555 Economics Masters Refresher Course in Mathematics September 2013 Lecture 6 Optimization with equality constraints Francesco Feri Constrained optimization The idea of constrained optimisation is
More informationComparison of Some Evolutionary Algorithms for Approximate Solutions of Optimal Control Problems
Australian Journal of Basic and Applied Sciences, 4(8): 3366-3382, 21 ISSN 1991-8178 Comparison of Some Evolutionary Algorithms for Approximate Solutions of Optimal Control Problems Akbar H. Borzabadi,
More informationLagrange Multipliers. Lagrange Multipliers. Lagrange Multipliers. Lagrange Multipliers. Lagrange Multipliers. Lagrange Multipliers
In this section we present Lagrange s method for maximizing or minimizing a general function f(x, y, z) subject to a constraint (or side condition) of the form g(x, y, z) = k. Figure 1 shows this curve
More informationExploration vs. Exploitation in Differential Evolution
Exploration vs. Exploitation in Differential Evolution Ângela A. R. Sá 1, Adriano O. Andrade 1, Alcimar B. Soares 1 and Slawomir J. Nasuto 2 Abstract. Differential Evolution (DE) is a tool for efficient
More informationSimulation. Lecture O1 Optimization: Linear Programming. Saeed Bastani April 2016
Simulation Lecture O Optimization: Linear Programming Saeed Bastani April 06 Outline of the course Linear Programming ( lecture) Integer Programming ( lecture) Heuristics and Metaheursitics (3 lectures)
More information1 Linear programming relaxation
Cornell University, Fall 2010 CS 6820: Algorithms Lecture notes: Primal-dual min-cost bipartite matching August 27 30 1 Linear programming relaxation Recall that in the bipartite minimum-cost perfect matching
More informationMachine Learning Classifiers and Boosting
Machine Learning Classifiers and Boosting Reading Ch 18.6-18.12, 20.1-20.3.2 Outline Different types of learning problems Different types of learning algorithms Supervised learning Decision trees Naïve
More informationMetaheuristic Development Methodology. Fall 2009 Instructor: Dr. Masoud Yaghini
Metaheuristic Development Methodology Fall 2009 Instructor: Dr. Masoud Yaghini Phases and Steps Phases and Steps Phase 1: Understanding Problem Step 1: State the Problem Step 2: Review of Existing Solution
More informationPreferences in Evolutionary Multi-Objective Optimisation with Noisy Fitness Functions: Hardware in the Loop Study
Proceedings of the International Multiconference on ISSN 1896-7094 Computer Science and Information Technology, pp. 337 346 2007 PIPS Preferences in Evolutionary Multi-Objective Optimisation with Noisy
More informationPreprint Stephan Dempe, Alina Ruziyeva The Karush-Kuhn-Tucker optimality conditions in fuzzy optimization ISSN
Fakultät für Mathematik und Informatik Preprint 2010-06 Stephan Dempe, Alina Ruziyeva The Karush-Kuhn-Tucker optimality conditions in fuzzy optimization ISSN 1433-9307 Stephan Dempe, Alina Ruziyeva The
More information14.5 Directional Derivatives and the Gradient Vector
14.5 Directional Derivatives and the Gradient Vector 1. Directional Derivatives. Recall z = f (x, y) and the partial derivatives f x and f y are defined as f (x 0 + h, y 0 ) f (x 0, y 0 ) f x (x 0, y 0
More informationMathematical Optimization in Radiotherapy Treatment Planning
1 / 35 Mathematical Optimization in Radiotherapy Treatment Planning Ehsan Salari Department of Radiation Oncology Massachusetts General Hospital and Harvard Medical School HST S14 May 13, 2013 2 / 35 Outline
More informationMulti-Objective Memetic Algorithm using Pattern Search Filter Methods
Multi-Objective Memetic Algorithm using Pattern Search Filter Methods F. Mendes V. Sousa M.F.P. Costa A. Gaspar-Cunha IPC/I3N - Institute of Polymers and Composites, University of Minho Guimarães, Portugal
More informationData Mining Chapter 8: Search and Optimization Methods Fall 2011 Ming Li Department of Computer Science and Technology Nanjing University
Data Mining Chapter 8: Search and Optimization Methods Fall 2011 Ming Li Department of Computer Science and Technology Nanjing University Search & Optimization Search and Optimization method deals with
More informationOptimizations and Lagrange Multiplier Method
Introduction Applications Goal and Objectives Reflection Questions Once an objective of any real world application is well specified as a function of its control variables, which may subject to a certain
More informationMathematics. Linear Programming
Mathematics Linear Programming Table of Content 1. Linear inequations. 2. Terms of Linear Programming. 3. Mathematical formulation of a linear programming problem. 4. Graphical solution of two variable
More informationGENETIC ALGORITHM with Hands-On exercise
GENETIC ALGORITHM with Hands-On exercise Adopted From Lecture by Michael Negnevitsky, Electrical Engineering & Computer Science University of Tasmania 1 Objective To understand the processes ie. GAs Basic
More informationLecture
Lecture.. 7 Constrained problems & optimization Brief introduction differential evolution Brief eample of hybridization of EAs Multiobjective problems & optimization Pareto optimization This slides mainly
More informationAdaptive Crossover in Genetic Algorithms Using Statistics Mechanism
in Artificial Life VIII, Standish, Abbass, Bedau (eds)(mit Press) 2002. pp 182 185 1 Adaptive Crossover in Genetic Algorithms Using Statistics Mechanism Shengxiang Yang Department of Mathematics and Computer
More informationStandard Optimization Techniques
12 Standard Optimization Techniques Peter Marwedel TU Dortmund, Informatik 12 Germany Springer, 2010 2012 年 12 月 19 日 These slides use Microsoft clip arts. Microsoft copyright restrictions apply. Structure
More informationDM545 Linear and Integer Programming. Lecture 2. The Simplex Method. Marco Chiarandini
DM545 Linear and Integer Programming Lecture 2 The Marco Chiarandini Department of Mathematics & Computer Science University of Southern Denmark Outline 1. 2. 3. 4. Standard Form Basic Feasible Solutions
More informationLinear Programming Problems: Geometric Solutions
Linear Programming Problems: Geometric s Terminology Linear programming problems: problems where we must find the optimum (minimum or maximum) value of a function, subject to certain restrictions. Objective
More informationINTERACTIVE MULTI-OBJECTIVE GENETIC ALGORITHMS FOR THE BUS DRIVER SCHEDULING PROBLEM
Advanced OR and AI Methods in Transportation INTERACTIVE MULTI-OBJECTIVE GENETIC ALGORITHMS FOR THE BUS DRIVER SCHEDULING PROBLEM Jorge PINHO DE SOUSA 1, Teresa GALVÃO DIAS 1, João FALCÃO E CUNHA 1 Abstract.
More informationSome Advanced Topics in Linear Programming
Some Advanced Topics in Linear Programming Matthew J. Saltzman July 2, 995 Connections with Algebra and Geometry In this section, we will explore how some of the ideas in linear programming, duality theory,
More information