Multi-objective Optimization Using Adaptive Explicit Non-Dominated Region Sampling

11th World Congress on Structural and Multidisciplinary Optimisation, 07th - 12th June 2015, Sydney, Australia

Anirban Basudhar
Livermore Software Technology Corporation, Livermore, CA, USA, abasudhar@lstc.com

1. Abstract
A new method to efficiently perform multi-objective optimization (MOO), referred to as Adaptive Explicit Multi-objective Optimization (AEMOO), is presented. Unlike existing methods, it uses binary classification to explicitly define the decision boundary between dominated and non-dominated (ND) regions in the design space. An adaptively refined support vector machine (SVM) is used to define the boundary. AEMOO has several advantages that stem from the availability of the estimated explicit boundary bounding the ND design space, which represents Pareto-optimal (PO) designs at convergence. It allows for an effective adaptive sampling strategy that samples "important" regions in the design space. Additionally, explicit knowledge of the PO design space facilitates efficient real-time Pareto-optimality decisions. AEMOO uses a hybrid approach that considers the distribution of samples in both the design and objective spaces. Two variants of AEMOO are presented - one based purely on classification and the other based on both classification and metamodel approximation. The results are compared to the widely used NSGA-II method and to Pareto Domain Reduction (PDR) using test problems of up to 30 variables. AEMOO shows significantly better efficiency and robustness than these existing methods.

2. Keywords: support vector machine, hybrid adaptive sampling, real-time optimality decision, binary response.

3. Introduction
Multi-objective optimization (MOO) often involves locating a set of Pareto-optimal points that are ND over the entire design space (x-space). The multiple objectives typically represent quantities with different units that are not directly comparable. Additionally, finding a solution set that can represent the complete Pareto front is more challenging than finding a single solution.
Therefore, efficient and accurate solution of MOO problems is overall much more challenging than single-objective optimization (SOO), and is still an evolving research area. Classical MOO approaches involve scalarization of the objectives to convert MOO into several SOO problems [1], but these approaches are not well suited to finding a set of points that can represent the complete PO front. Evolutionary algorithms, e.g. SPEA and NSGA-II, are extensively used to solve MOO problems in a true multi-objective sense [2,3]. These methods are applied in a direct optimization framework or in conjunction with metamodel approximations to alleviate the cost of potentially expensive evaluations (e.g. crashworthiness) [4]. Metamodel-based methods are further classified based on their sampling schemes. In some of these, sampling is based on direct optimization (e.g. genetic operators) [5]. The metamodel's accuracy estimate is used to determine which samples need to be evaluated using the expensive evaluator, and the rest of the samples are evaluated using the metamodel. In other methods, sampling is directed towards obtaining accurate metamodels, and a method like NSGA-II is then applied to the metamodels to find PO points. A basic approach is to use sequential space-filling samples for metamodeling [4]. While this approach is global, filling the space is prohibitively expensive in higher dimensions. Another metamodel-based MOO algorithm is ParEGO [6], which is an extension of its SOO counterpart. ParEGO involves scalarization of objectives; additionally, it was tested only up to eight variables in [6]. One major limitation of commonly used MOO methods is the lack of an explicit definition of the PO regions of the design space. To determine whether a given sample is ND, one needs to evaluate the objective functions and constraints not only at that point but over the entire design space. ND points are determined using pairwise comparison in the objective (f) space before mapping them back to the design space. For large sample sizes, these calculations can add significant overhead.
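To make this overhead concrete, the pairwise comparison used to identify ND points can be sketched as follows. This is a minimal illustration in Python (not the paper's implementation); the naive check is O(N^2) in the number of samples N, which is the cost that grows with sample size.

```python
import numpy as np

def nondominated_mask(F):
    """Return a boolean mask of the non-dominated rows of F (N x n_obj),
    assuming every objective is minimized. Naive O(N^2) pairwise check."""
    N = F.shape[0]
    mask = np.ones(N, dtype=bool)
    for i in range(N):
        for j in range(N):
            # j dominates i if it is no worse in all objectives, better in one
            if i != j and np.all(F[j] <= F[i]) and np.any(F[j] < F[i]):
                mask[i] = False
                break
    return mask

F = np.array([[1.0, 4.0], [2.0, 2.0], [3.0, 3.0], [4.0, 1.0]])
print(nondominated_mask(F))  # [ True  True False  True]
```

Here [3, 3] is dominated by [2, 2]; the other three points are mutually non-dominated.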
Additionally, in the absence of an explicit definition of the ND regions of the design space, it is difficult to adaptively add samples in those regions of particular interest. A sequential metamodel-based method known as Pareto Domain Reduction (PDR) was recently developed that focuses on sampling based on design space sparsity in the vicinity of the predicted front [7]. Hyperboxes constructed around previous ND points are sampled. However, the performance of PDR depends on the number and size of such hyperboxes.

This paper presents a novel classification-based adaptive explicit MOO approach (AEMOO) that mitigates the aforementioned limitations of existing methods by providing an efficient sampling scheme. This is made possible by constructing an explicit definition of the boundary separating dominated and ND samples in the design space using a support vector machine (SVM) [8], and by using a hybrid sampling scheme that accounts for sparsity in both spaces (x and f space). Treatment of MOO as a binary classification problem involves a paradigm shift in its solution approach. The goal in MOO is to determine whether a design is PO or not, and to find a set of such designs. Therefore, it is naturally suited for treatment as binary classification. Using the proposed approach, a single evaluation of the trained classifier can determine if a design is ND. This makes it especially attractive for real-time Pareto-optimality decisions. Explicit definition of the ND regions also facilitates implementation of a proposed adaptive sampling scheme that selects samples in these regions. Restricting the sampling region using the decision boundary improves the efficiency of the PO front search. AEMOO can be implemented either as a classifier-assisted direct method or in conjunction with metamodel approximations. It should be noted that in the latter approach, constraints can be handled using metamodel approximations or by approximating their zero level using a classifier. Classifier-based handling allows the handling of discontinuous and binary constraint functions [9].

4. Adaptive Explicit Multi-objective Optimization (AEMOO)
Two variants of the proposed AEMOO method are presented in this section. In Section 4.1 the basic idea of classification-based MOO is presented. In Section 4.2 a classifier-assisted direct optimization method is presented. A second method that utilizes classification as well as metamodel approximation is presented in Section 4.3.

4.1. Classification Approach for MOO - Dynamic Classifier
Typically, the solution of MOO is a set of PO points. Part of the design space is PO while the other regions are dominated. Such a problem is ideal for applying classification methods, the two classes being dominated (-1 class) and ND (+1 class) (Fig. 1). Once a classifier separating dominated and ND samples is trained, the Pareto-optimality of any design is determined using a single classifier evaluation, in contrast with existing methods. SVM is used to construct the decision boundary in this work (Eq. (1)). It constructs the optimal boundary that maximizes the margin between the two sample classes (±1) in a feature space. A kernel K maps the design and feature spaces. In this work, a Gaussian kernel is used to construct the SVM boundaries (s(x) = 0). The spread of the kernel is assigned the largest value that does not result in any training misclassification.
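As an illustration of this kind of classifier, a Gaussian-kernel SVM decision function s(x) = b + sum_i alpha_i y_i K(x_i, x) can be evaluated as in the sketch below. The support points, multipliers alpha and bias b here are hand-picked toy values for illustration; in practice they come from an SVM training solver.

```python
import numpy as np

def gaussian_kernel(xi, x, sigma=1.0):
    # K(x_i, x) = exp(-||x_i - x||^2 / (2 sigma^2)); sigma is the kernel spread
    return np.exp(-np.sum((xi - x) ** 2) / (2.0 * sigma ** 2))

def svm_decision(x, X_sv, y_sv, alpha, b, sigma=1.0):
    """s(x) = b + sum_i alpha_i y_i K(x_i, x).
    s(x) > 0 -> predicted non-dominated (+1); s(x) < 0 -> dominated (-1)."""
    return b + sum(a * y * gaussian_kernel(xi, x, sigma)
                   for a, y, xi in zip(alpha, y_sv, X_sv))

# toy "trained" classifier: one sample of each class
X_sv = np.array([[0.2, 0.2], [0.8, 0.8]])
y_sv = np.array([+1, -1])          # +1 = non-dominated, -1 = dominated
alpha = np.array([1.0, 1.0])
b = 0.0
print(svm_decision(np.array([0.1, 0.1]), X_sv, y_sv, alpha, b) > 0)  # True
```

A single call to `svm_decision` is all that is needed to classify a new design, which is what makes real-time Pareto-optimality decisions cheap once the boundary is trained.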
The SVM is defined as:

s(x) = b + Σ_{i=1}^{N} α_i y_i K(x_i, x)    (1)

Here, y_i = ±1 is the class label, α_i is the Lagrange multiplier of the i-th sample and b is the bias. Each sample's class needs to be assigned prior to the construction of the SVM boundary. An issue in using classification for MOO is that although it is known that dominated samples (class -1) cannot be PO, the opposite is not true; being ND among the current samples is not sufficient to be PO. However, the +1 samples represent PO designs if the data is sufficient. A decision boundary obtained by assigning the +1 class to the current ND samples represents an estimate of the PO front, and is refined adaptively. As points are added, samples may switch from +1 to -1, since earlier ND samples may be dominated by newer samples until convergence. As the class definitions of existing samples can change during the course of AEMOO, the classifier is referred to as dynamic.

The classification-based AEMOO method has several advantages:
- Explicit definition of the ND region facilitates implementation of an efficient sampling scheme
- It facilitates efficient real-time Pareto-optimality decisions
- It uses information in both the design and objective spaces to enhance its efficiency and robustness
- The classification approach allows the handling of binary and discontinuous constraint functions
- As the ND region is explicitly defined in the design space, AEMOO is unaffected by the number of objectives

Figure 1: Boundary defining ND regions in design space (left). Design-objective space mapping (right).

4.2. Direct AEMOO Method
Direct AEMOO is based on SVM classification only and does not approximate function values. It evaluates more samples in the explicitly defined SVM-based ND regions to avoid wasting samples and to increase efficiency. The sign of the SVM value s(x) determines whether a sample is predicted ND or not, and is straightforward to evaluate. A fraction of the samples per iteration are selected within the s(x) > 0 ND region of the design space. Details are shown in

Fig. 2. Sampling the important regions of the design space allows a faster increase in the SVM accuracy, as sampling based only on the objective space can lead to clustering of samples in the design space, where the SVM is constructed. Maximizing the value of the SVM, one of the sampling criteria, is equivalent to maximizing the probability of locating an ND sample [8,10]. Additionally, one generation of NSGA-II is also used to select samples, first within the s(x) > 0 regions and then using an unconstrained formulation. Using NSGA-II-based samples, the algorithm ensures that the effects of sparsity in the objective function space and genetic operator-based evolution are also accounted for. In order to ensure a global search, a small fraction of samples is added based on the maximum minimum distance in the entire unconstrained design space. Such samples are not expected to provide an efficient sampling scheme, and are therefore optional, but they guarantee the location of the complete Pareto front when the method is run for a sufficient time.

Figure 2: Summary of direct AEMOO method (left) and sampling scheme (right)

4.3. Metamodel-assisted AEMOO Method
In this approach, metamodel approximation and SVM are used together. The basic idea is the same as in direct AEMOO - to consider ND ranking along with sample sparsity in both the x and f spaces. However, the single generation of direct NSGA-II samples is replaced with converged predicted PO samples obtained using metamodel-based NSGA-II. The metamodel approximation and the SVM-based classification serve as complementary approaches that help enhance accuracy by accounting for the distribution of samples in both spaces. The methodology is shown in Fig. 3.

Figure 3: Summary of metamodel-assisted AEMOO method (left) and sampling scheme (right)

5. Results
Several examples are presented to validate the efficacy and efficiency of AEMOO. A 10-variable analytical example using direct AEMOO is presented, followed by three 30-variable examples using metamodel-assisted AEMOO. Efficiency is compared to the existing methods PDR and NSGA-II. Finally, AEMOO is used for tolerance-based MOO of a Chevrolet truck [11].
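Two of the sampling ingredients described above - picking candidates inside the predicted ND region s(x) > 0, and adding a global space-filling point by maximum-minimum distance - can be sketched as follows. This is an illustrative Python sketch, not the paper's code; the toy decision function `s` and all function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def maximin_sample(X_existing, candidates):
    """Pick the candidate maximizing its minimum distance to the existing
    samples (the small global space-filling fraction of the scheme)."""
    d = np.linalg.norm(candidates[:, None, :] - X_existing[None, :, :], axis=2)
    return candidates[np.argmax(d.min(axis=1))]

def sample_nd_region(s, candidates, k):
    """Keep up to k candidates inside the predicted ND region s(x) > 0,
    ranked by SVM value (larger s ~ higher chance of being ND)."""
    vals = np.array([s(x) for x in candidates])
    top = np.argsort(-vals)[:k]
    return candidates[top[vals[top] > 0]]

# toy decision function: predicted ND region is the corner x1 + x2 < 1
s = lambda x: 1.0 - x[0] - x[1]
cand = rng.random((200, 2))            # candidate pool in [0,1]^2
X = rng.random((10, 2))                # previously evaluated samples
new_nd = sample_nd_region(s, cand, 5)  # samples in the predicted ND region
new_global = maximin_sample(X, cand)   # one space-filling sample
```

In the actual method these two pools are complemented by one generation of NSGA-II-selected samples, so that objective-space sparsity is accounted for as well.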
The sample type fractions are η1 = η2 = 0.2 and η3 = 0.1 (Figs. 2 and 3). Unless mentioned otherwise, floor(1.5*(m+1)) + 1 samples are used per iteration, m being the number of variables. AEMOO showed comparatively higher efficiency also when larger sample sizes were used, but those results are not shown here. Radial basis function metamodels have been used for the function approximations, but others can also be used. For Examples 1-4, one of the objectives is f1 = x1. The second objective f2 is provided with the individual examples.

5.1. Example 1. Analytical example with ten variables and two objectives - ZDT3 (Direct AEMOO)
This example has 10 variables (m = 10) and 2 objectives. The second objective f2 is:

f2 = g(x) h(f1, g), where g(x) = 1 + (9/(m-1)) Σ_{i=2}^{m} x_i, h(f1, g) = 1 - sqrt(f1/g) - (f1/g) sin(10π f1)    (2)

The Pareto fronts at successive iterations are plotted in Fig. 4. The front at iteration 125 (2250 points) is quite close to the actual one, and shows that AEMOO can locate disjoint PO fronts even when only classification is used. NSGA-II found four of the disjoint regions on the front accurately, but completely missed one region. This can be attributed to sampling based on the f-space only, without considering design space sparsity, unlike in AEMOO.

Figure 4: Results for Example 1. Direct AEMOO (left) and direct NSGA-II (right).

5.2. Example 2. Analytical example with 30 variables and 2 objectives - ZDT1 (Metamodel-assisted AEMOO)
This problem consists of two objectives f1 and f2. The second objective is:

f2 = g(x) h(f1, g), where g(x) = 1 + (9/(m-1)) Σ_{i=2}^{m} x_i, h(f1, g) = 1 - sqrt(f1/g)    (3)

Optimization results are shown in Fig. 5 using trade-off plots at different iterations, at intervals of 10. The results shown are the evaluated Pareto-optimal points. The proposed AEMOO method is able to locate the entire spread of the Pareto front by the 10th iteration itself (470 samples), after which it adds diversity. The samples on the front are very well diversified by the 20th iteration. In comparison, the performance of direct NSGA-II is much slower, and it takes 40-50 generations (1920-2400 samples) to obtain a diversified front. Even at the 50th generation, the Pareto front using NSGA-II is not as accurate as at the 10th iteration of AEMOO. PDR performs more efficiently than direct NSGA-II, but still takes 20-30 iterations (940-1410 samples) to obtain a well diversified, accurate front.

Figure 5: Example 2. Metamodel-assisted AEMOO (left), PDR (center) and direct NSGA-II (right).

5.3. Example 3. Analytical example with 30 variables and 2 objectives - ZDT2 (Metamodel-assisted AEMOO)
This problem consists of 30 variables (m = 30) and two objective functions. The second objective is:

f2 = g(x) h(f1, g), where g(x) = 1 + (9/(m-1)) Σ_{i=2}^{m} x_i, h(f1, g) = 1 - (f1/g)^2    (4)

Optimization results are shown using computed trade-off plots in Fig. 6.
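For reference, the ZDT1-ZDT3 objectives used in Examples 1-4 can be written compactly as below. This sketch follows the standard ZDT definitions (both objectives minimized over x in [0,1]^m); the function name is hypothetical.

```python
import numpy as np

def zdt(x, variant):
    """ZDT1/2/3 objectives for x in [0,1]^m, standard definitions.
    Returns (f1, f2); both objectives are minimized."""
    x = np.asarray(x, dtype=float)
    m = x.size
    f1 = x[0]
    g = 1.0 + 9.0 * np.sum(x[1:]) / (m - 1)
    r = f1 / g
    if variant == 1:       # ZDT1: convex front
        h = 1.0 - np.sqrt(r)
    elif variant == 2:     # ZDT2: concave front
        h = 1.0 - r ** 2
    else:                  # ZDT3: disconnected (disjoint) front
        h = 1.0 - np.sqrt(r) - r * np.sin(10.0 * np.pi * f1)
    return float(f1), float(g * h)

# on the Pareto set (x2 = ... = xm = 0), g = 1 and f2 = h(f1, 1)
print(zdt([0.25, 0.0, 0.0], 1))  # (0.25, 0.5)
```

The disjoint front of ZDT3 comes from the sin(10π f1) term in h, which is what makes Examples 1 and 4 a test of whether a method can recover all five disconnected regions.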
AEMOO is able to find a very well diversified and accurate front before the 10th iteration itself (470 samples). On the contrary, with a comparable population size of 48, direct NSGA-II failed to obtain the Pareto front even after 50 generations. The population size for NSGA-II had to be increased to find the actual front. PDR was able to locate the Pareto front with a sample

size of 47 per iteration, but was slower than AEMOO, as it took 30 iterations (1410 samples) to obtain a front of comparable (but not quite as good) accuracy and diversity.

Figure 6: Example 3. Metamodel-assisted AEMOO (left), PDR (center) and direct NSGA-II (right).

5.4. Example 4. Analytical example with 30 variables and 2 objectives - ZDT3 (Metamodel-assisted AEMOO)
This optimization problem is similar to Example 1 (Eq. (2)), except that it has 30 variables instead of 10. The PO fronts using the three methods are shown in Fig. 7, along with the actual front. AEMOO located all five disjoint regions on the front within the first 10 iterations, following which it further added samples on the front to increase diversity. Both NSGA-II and PDR were significantly slower and had lower accuracy. Using a population size of 48, direct NSGA-II completely missed 2 out of the 5 regions. PDR was able to sample 4 of the regions satisfactorily at 40-50 iterations (1880-2350 samples). It found one ND sample close to the fifth region, but not on it.

Figure 7: Example 4. Metamodel-assisted AEMOO (left), PDR (center) and direct NSGA-II (right).

5.5. Example 5. 7-variable tolerance optimization of Chevrolet C2500 truck (Metamodel-assisted AEMOO)
AEMOO is used for multi-objective tolerance optimization of a Chevrolet truck. LS-OPT is used to build global metamodels, based on the truck's responses at 1500 samples (using LS-DYNA). MOO is run on the metamodels. Mass is minimized while the tolerance is maximized (Eq. (5)). The relative tolerance and 6 thicknesses x are the variables.

min { -tolerance, scaled_mass(x, tolerance) }
s.t. P(scaled_mass(x) ≤ 0.9) ≥ Ptarget
     P(scaled_stage1_pulse(x) ≤ 1) ≥ Ptarget
     P(scaled_stage2_pulse(x) ≤ 1) ≥ Ptarget
     P(scaled_disp(x) ≤ 1) ≥ Ptarget    (5)
where Ptarget = 0 in this work

In Figure 8, the vehicle parts to be optimized are shown along with the optimization results. The Pareto fronts obtained using AEMOO and NSGA-II are shown. 100 samples per iteration are used to solve this example, to ensure at least one sample of each class in the initial sampling.
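The radial basis function metamodels used in the metamodel-assisted examples can be illustrated with a minimal Gaussian-basis interpolation sketch. This is a generic illustration of RBF approximation, not LS-OPT's implementation; the function names and the fixed spread `sigma` are assumptions.

```python
import numpy as np

def fit_rbf(X, y, sigma=1.0):
    """Fit a Gaussian RBF interpolant to (X, y): solve Phi w = y with
    Phi_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    Phi = np.exp(-d2 / (2.0 * sigma ** 2))
    return np.linalg.solve(Phi, y)

def rbf_predict(x, X, w, sigma=1.0):
    # weighted sum of Gaussian bases centered at the training points
    phi = np.exp(-np.sum((X - x) ** 2, axis=1) / (2.0 * sigma ** 2))
    return phi @ w

X = np.array([[0.0], [0.5], [1.0]])
y = np.array([0.0, 0.25, 1.0])          # samples of y = x^2
w = fit_rbf(X, y)
print(abs(rbf_predict(np.array([0.5]), X, w) - 0.25) < 1e-9)  # True (interpolation)
```

An interpolating fit reproduces the training responses exactly (up to round-off), which is the property exploited when NSGA-II is run on the metamodels instead of the expensive solver.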
Other approaches to avoid this restriction are possible, but are outside the scope of this paper. AEMOO results are provided for the 30 iterations that were completed at the time of writing this paper, and are compared to NSGA-II run up to 50 generations. The PO front using AEMOO has a better spread and diversity than the NSGA-II front even at the 50th generation, which shows its superior performance. At the same stage (30th iteration), the PO front using AEMOO is clearly better. The PO front contains a knee at 6% tolerance, suggesting it to be a good design, as there is a rapid mass increase beyond it.

Figure 8: Truck to be optimized (top), AEMOO (left), PDR (center), NSGA-II (center) and overlaid fronts (right).

6. Concluding Remarks
A new classification-based adaptive sampling approach to solve MOO problems is presented. The proposed AEMOO method has several advantages compared to existing methods due to this radically different approach. Two variants of the method are presented - direct and metamodel-assisted. The method's efficacy is validated using standard examples of up to 30 variables. It has been shown to clearly outperform existing methods like NSGA-II and PDR in terms of efficiency. Additionally, the ability to locate disjoint Pareto fronts has been shown. The ability to solve constrained MOO has also been shown, using a tolerance-based crashworthiness optimization example. As AEMOO explicitly defines the ND region boundaries in the design space, it also naturally facilitates real-time Pareto-optimality decisions. This work is expected to open new avenues for research in the field of MOO. As the sampling scheme is partly based on design space classification, which discards significant regions of the space as dominated, it is expected to be affected by objective space dimensionality to a lesser extent. Future work is needed to validate this hypothesis. Additionally, there is scope for further improvement in constraint handling.

7. Acknowledgements
The author is thankful to Dr. Nielen Stander and Mr. Imtiaz Gandikota of LSTC for their valuable feedback and for allowing the use of their machines for running some of the comparative studies.

8. References
[1] Deb, K. Multi-objective Optimization Using Evolutionary Algorithms (Vol. 16). John Wiley & Sons, 2001.
[2] Laumanns, M. SPEA2: Improving the Strength Pareto Evolutionary Algorithm. 2001.
[3] Deb, K., Pratap, A., Agarwal, S., and Meyarivan, T. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, 6(2), 182-197, 2002.
[4] Stander, N., Roux, W.J., Basudhar, A., Eggleston, T., Goel, T., Craig, K.J. LS-OPT User's Manual Version 5.0, April 2013.
[5] Li, M., Li, G., and Azarm, S.
(2008). A kriging metamodel assisted multi-objective genetic algorithm for design optimization. Journal of Mechanical Design, 130(3), 031401.
[6] Knowles, J. ParEGO: A hybrid algorithm with on-line landscape approximation for expensive multiobjective optimization problems. Technical report TR-COMPSYSBIO-2004-01, September 2004.
[7] Stander, N. An adaptive surrogate-assisted strategy for multi-objective optimization. 10th World Congress on Structural and Multidisciplinary Optimization, Orlando, Florida, USA, 2013.
[8] Vapnik, V.N. Statistical Learning Theory. Vol. 1. New York: Wiley, 1998.
[9] Basudhar, A., and Missoum, S. Adaptive explicit decision functions for probabilistic design and optimization using support vector machines. Computers & Structures, 86(19), 1904-1917, 2008.
[10] Basudhar, A., Dribusch, C., Lacaze, S., and Missoum, S. Constrained efficient global optimization with support vector machines. Structural and Multidisciplinary Optimization, 46(2), 201-221, 2012.
[11] National Crash Analysis Center. Finite element model archive, www.ncac.gwu.edu/vml/models.html, 2008.