Optimization of Axle NVH Performance Using the Cross Entropy Method
Glenn Meinhardt, Department of Industrial and Systems Engineering, Oakland University, Rochester, Michigan
Sankar Sengupta, Department of Industrial and Systems Engineering, Oakland University, Rochester, Michigan

Abstract

An approach to optimization of automobile axles for noise, vibration and harshness (NVH) performance based on end-of-line testing is presented. The method used, the cross-entropy method, iteratively solves an objective function based on statistical distributions of the independent variables of the objective function. A Matlab program written by the authors is presented and discussed. The algorithm used within the method is presented along with solutions under different convergence criteria.

Introduction

Noise, vibration and harshness (NVH) performance is a critical quality characteristic for automobile manufacturers (original equipment manufacturers, or OEMs) and driveline component manufacturers alike. A major component of the driveline is the axle, which transfers torque from the engine and driveshaft to the wheels. For axle manufacturers, one of the primary NVH metrics is gear whine [1]. To ensure satisfactory gear whine performance when the automobile leaves the factory, many OEMs now require axle assemblies to be tested for gear whine performance at the end of the assembly line using an end-of-line NVH test (EOLT) prior to shipment to their assembly plants. It is in the best interest of both the OEMs and the axle manufacturers to ensure that the vibration levels of axles not only meet the requirement at the EOLT, but are as low as possible [2]. One way to control the levels at the EOLT is to understand the correlation of the upstream performance variables to the EOLT result. A previous work by the authors examined one such correlation [3, 4] involving the assembly parameters of the axle and the resulting coast-side vibration.
This work illustrates the use of the cross-entropy method to minimize the EOLT result, with the regression equation presented in [4] used as the objective function. Solutions of the same problem using Particle Swarm Optimization [5] and a Genetic Algorithm [6] are presented in other works by the authors.
The Optimization Problem

The desire is to minimize the coast-side vibration given by the regression equation from [4]:

Y(X) = β0 + β1·a1 + β2·a5 + β3·c6 + β4·c7 + β5·d3 + β6·d6   (1)

where the coefficients βj are the regression coefficients derived in [4]. Therefore, the optimization problem is written

γ* = min Y(X) = Y(X*)   (2)

where γ* is the optimum value of Y, X = [ a1 a5 c6 c7 d3 d6 ], and X* is the value of X associated with γ*. The regression equation was derived from 21 samples of data collected from the assembly line, and it is valid only for the range of data from which it was derived. Therefore, the boundary conditions (constraints) for the optimization problem are the ranges of the variables over which the objective function was derived. The constraints are taken from the range of data in [4] and are summarized in Table 1. Equations (1) and (2), along with Table 1, completely define the optimization problem. This problem can be solved very easily deterministically, and that solution is given in Table 2. The purpose of this work is to illustrate how the cross-entropy method can be used to solve optimization problems; the simple problem presented above and its deterministic solution serve as the basis for that illustration, with the results of the optimization compared to the deterministic solution.

Table 1  The Constraints for the Parameters, X, of the Regression Equation (Various Units, dB)

Variable   Lower Bound   Upper Bound
a1
a5
c6
c7
d3
d6

The Cross-Entropy Method

The cross-entropy method (CE) was first introduced by Rubinstein [7] and is a well-known evolutionary algorithm involving variance minimization. The name cross-entropy is derived
from its use of the Kullback-Leibler divergence, which is a measure of the loss of information when one probability distribution is used to approximate another.

Table 2  The Solution to the Deterministic Form of the Optimization Problem

Optimization Method   a1   a5   c6   c7   d3   d6   Optimum Solution Y(X*)
Deterministic

In CE, the relationship between the fitness value to be optimized, Y, and the controlling variables, X, is not evaluated deterministically, but with probability density functions, f( ; v), representing X (the associated stochastic problem). The Kullback-Leibler divergence is employed to iteratively update the parameters of the probability density functions, minimizing the loss of information as the solution converges to minimum variance and the optimum value. The basic algorithm for CE is a simple two-step iterative process [8, 9]:

1. Define the associated stochastic problem by generating appropriate random samples representing each of the variables of the objective function.
2. Update the parameters of the sampling distributions for the next iteration to move the solution closer to the optimum value.

In Step 2, CE utilizes rare-event estimation and importance sampling to converge each of the parameters of the probability density functions to its optimum value. A detailed derivation of CE can be found in Rubinstein and Kroese [10], with an excellent example presented by Kothari and Kroese [11]. To solve the problem with CE, it is first necessary to define the associated stochastic problem by replacing the static variables with their stochastic counterparts. It is appropriate to define the stochastic counterparts based on data collected from the assembly line. In previous work [3] it was shown that each variable can be represented by a normal distribution, except d3, due to the bi-modal nature of its distribution. It was explained in [3] that the bi-modal nature of d3 was likely due to the process producing d3 from two sources.
For the purposes of this work, the optimization method will assume that d3 can be represented by one normal distribution. Thus Equation (1) becomes

Y(v) = Y( N(μ_a1, σ_a1), N(μ_a5, σ_a5), N(μ_c6, σ_c6), N(μ_c7, σ_c7), N(μ_d3, σ_d3), N(μ_d6, σ_d6) )   (3)
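Equation (3) can be made concrete with a short sketch (Python rather than the authors' Matlab; the two data columns below are hypothetical stand-ins for the assembly-line measurements of Table 3) that fits a normal distribution to each variable and draws one candidate vector from the fits:

```python
import random
import statistics

# Hypothetical assembly-line measurements for two of the six variables;
# in the paper these come from the 21 samples in Table 3.
data = {
    "a1": [10.1, 9.8, 10.4, 10.0, 9.9],
    "a5": [5.2, 5.5, 5.1, 5.4, 5.3],
}

# Fit a normal distribution N(mu, sigma) to each column, as in Equation (3).
params = {k: (statistics.mean(v), statistics.stdev(v)) for k, v in data.items()}

# Draw one random candidate vector from the fitted distributions.
random.seed(1)
candidate = {k: random.gauss(mu, sigma) for k, (mu, sigma) in params.items()}
print(candidate)
```

Each draw replaces the fixed parameter vector X with one realization of its stochastic counterpart; repeating the draw N times produces the population used in Step 1 of the CE iteration.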
To solve Equation (2) given Equation (3), the CE method employs rare-event estimation such that

l(γ) = P_u(Y(X) ≤ γ) = E_u[ I{Y(X) ≤ γ} ]   (4)

where E is the expected-value operator and X is a random vector with probability distribution functions, f( ; u), for u ϵ v, where

v = [ μ_a1, σ_a1, μ_a5, σ_a5, μ_c6, σ_c6, μ_c7, σ_c7, μ_d3, σ_d3, μ_d6, σ_d6 ]   (5)

Now l(γ) → 0 as γ → γ*. This is the rare event that is estimated under the importance sampling of X → X*. CE then adaptively updates γ and v until the solution converges to the tuple (γ*, v*). From Kothari and Kroese [11], for each iteration, i, with known values of v_(i-1) and with γ_i assigned to be a known quantile of Y(X) under v_(i-1), a value of γ_i is selected such that

P_v(i-1)( Y(X) ≤ γ_i ) ≥ ρ   (6)

and

P_v(i-1)( Y(X) ≥ γ_i ) ≥ 1 − ρ   (7)

The parameter ρ defines the elite samples from the current population that will be used to estimate γ*. For this work, ρ is chosen to be 0.01, the value listed in Table 4. Again from Kothari and Kroese [11], v is updated in each iteration, i, by deriving ṽ_i from the cross-entropy program given by

max_v D(v) = max_v (1/N) Σ_{k=1}^{N} I{Y(X_k) ≤ γ_i} ln f(X_k; v)   (8)

where

I{Y(X_k) ≤ γ_i} = 1 if Y(X_k) ≤ γ_i, and 0 if Y(X_k) > γ_i   (9)

To avoid a sub-optimal solution, the convergence is slowed with a slow factor, α, such that the updated value of v is given as

v_i = α ṽ_i + (1 − α) v_(i-1)   (10)

with α usually defined to be 0.7 < α < 1.0. Here α is assigned the value 0.75, as listed in Table 4. From Kothari and Kroese [11], for normally distributed values of X, the solution of Equation (8) at each iteration i yields
ṽ_i = [μ_i, σ_i]   (11)

with

μ_ij = ( Σ_{k=1}^{N_Elite} X_kj ) / N_Elite,   j = 1, 2, …, p   (12)

σ_ij = sqrt( Σ_{k=1}^{N_Elite} (X_kj − μ_ij)² / N_Elite ),   j = 1, 2, …, p   (13)

where the sums run over the N_Elite elite samples and j indexes the p parameters. The procedure continues until a stopping criterion is met. CE is easily adapted to a spreadsheet, but is more practical within a mathematical programming package such as Matlab. The next section reviews the numerical solution of the optimization problem using CE.

Numerical Solution Using CE

The solution of the optimization problem by CE is conducted within Matlab. The algorithm used to write the program is given in the Appendix. The data used to initialize the program are the data used in [4] to derive the regression equation; these data are shown in Table 3. The program is initialized by establishing the parameters for the CE method: the number of samples to generate with each generation, N; the percentile defining the elite sample, ρ; the slowing factor, α; and the stopping criteria. These values, summarized in Table 4, are the input to the program. The program outputs the average values and standard deviations for each parameter at each iteration, the setup parameters, the CPU time required, and the number of iterations required to converge. The solutions from each of five runs of the CE program using the parameters in Table 4 are shown in Table 5. In addition, Figures 1, 2 and 3 show, for each run, the optimum value, the iterations required to converge to the optimum solution, and the solver time required. From Figure 1, it is clear that the cross-entropy method successfully identifies the optimum solution for this problem. Figures 2 and 3 show that the stricter convergence criterion does increase the solver time and the number of iterations required to converge; however, the penalty is insignificant compared to the increased precision in the result, if increased precision is desired.
In this work, it is desired to achieve precision to four decimal places. CE demonstrates the ability to do this under the stricter convergence criterion. This suggests that when high precision is desired, a good approach is to begin with a strict convergence criterion.
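The iteration described above, Equations (6) through (13) with the Table 4 settings ρ = 0.01 and α = 0.75, can be sketched end-to-end as follows (a Python sketch rather than the authors' Matlab program; the one-dimensional toy objective is a stand-in for the regression equation):

```python
import random
import statistics

random.seed(0)

def objective(x):
    # Toy stand-in for the regression Y(X); its minimum is at x = 2.
    return (x - 2.0) ** 2

N, rho, alpha = 1000, 0.01, 0.75      # settings from Table 4
mu, sigma = 0.0, 5.0                  # initial sampling parameters v_0
n_elite = max(2, int(rho * N))

iterations = 0
while sigma > 1e-3:                   # stop when the sampling std dev collapses
    iterations += 1
    # Step 1: sample the associated stochastic problem.
    xs = sorted((random.gauss(mu, sigma) for _ in range(N)), key=objective)
    elite = xs[:n_elite]              # the rho-quantile elite sample
    # Step 2: refit the parameters to the elite (Eqs. (12)-(13)) and
    # smooth the update with the slow factor (Eq. (10)).
    mu = alpha * statistics.mean(elite) + (1 - alpha) * mu
    sigma = alpha * statistics.pstdev(elite) + (1 - alpha) * sigma

print(iterations, mu)                 # mu converges near the true minimum, 2.0
```

Because the elite sample is always tighter than the population it came from, the standard deviation shrinks every iteration, so the maximum-standard-deviation stopping rule is guaranteed to trigger.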
Table 3  The Initial Data from the Assembly Line (Various Units, dB)

Sample   a1   a5   c6   c7   d3   d6   NVH
Avg
St Dev
Max
Min
Table 4  Initialization Parameters for the CE Method

Description                                    Parameter   Value
Number of samples in each generation           N           1000
Elite percentile of the population             ρ           0.01
Slowing factor                                 α           0.75
Stopping criteria (convergence), Trial 1       Maximum sample standard deviation
Stopping criteria (convergence), Trial 2       Maximum sample standard deviation

Table 5  The Average Value of the Parameters and Solution of the Optimization Problem for Each Run

Run ID   Iterations Required to Converge   Solver Time (sec)   a1   a5   c6   c7   d3   d6   Optimum Solution Y(X*)
Figure 1  The Optimum Value Identified by the Cross-Entropy Method, by Convergence Criterion and Run Number (the dashed line is the optimum value found deterministically)

Figure 2  The Number of Iterations Required to Identify the Optimum Value Using the Cross-Entropy Method, by Convergence Criterion and Run Number
Figure 3  The Solver Time Required to Identify the Optimum Value Using the Cross-Entropy Method, by Convergence Criterion and Run Number

Summary

The cross-entropy (CE) method is a relatively new optimization method. This paper illustrates its application to a very simple linear regression model. The deterministic solution is used as a known solution against which the CE results are compared; Table 6 shows the comparison, and the agreement between the classical solution and the one found through CE is nearly exact. As discussed above, tightening the convergence criterion further will improve the precision. It remains, for future work, to confirm that axles built to the optimum conditions indeed produce improved vibration performance. Other papers by the authors illustrate solving the same optimization problem using a Genetic Algorithm [6] and Particle Swarm Optimization [5].

Table 6  A Comparison of the Deterministic Solution of the Optimization Problem to the Best Performance of CE

Optimization Method   N   Iterations to Solve   Solver Time (sec)   a1   a5   c6   c7   d3   d6   Optimum Solution Y(X*)
Deterministic         0   0
Cross-Entropy
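The deterministic solution that Table 6 compares against follows directly from the linearity of Equation (1): over box constraints, a linear objective attains its minimum with each variable at the bound dictated by the sign of its coefficient. A sketch (Python; the coefficients and bounds shown are hypothetical placeholders, not the values from [4] and Table 1):

```python
def minimize_linear(coeffs, bounds):
    """Minimize c0 + sum(c_j * x_j) over a box: pick the lower bound when
    c_j > 0 and the upper bound when c_j < 0 (either bound ties at c_j = 0)."""
    return [lo if c > 0 else hi for c, (lo, hi) in zip(coeffs, bounds)]

# Hypothetical coefficients and bounds for the six parameters.
coeffs = [0.9, -2.789, 1.1, -0.4, 0.6, -1.2]
bounds = [(0.0, 1.0)] * 6
x_star = minimize_linear(coeffs, bounds)
print(x_star)  # each entry sits at a bound: [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
```

This is why the deterministic row of Table 6 requires zero iterations: each X* entry is read directly off a bound of Table 1.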
Appendix

A detailed algorithm / pseudocode for the cross-entropy method:

1. Initialize the program:
   a. Define the number of random samples, N, for X.
   b. Define the percent of the solutions of Y(X), ρ, that will comprise the elite sample.
   c. Define the slow factor, α.
   d. Define the stopping criteria. This is chosen to be when the maximum standard deviation across all X and Y(X) is 0.001 dB, or after 1,000 iterations.
2. Initialize μ0j and σ0j for j = 1, 2, …, p.
   a. Import the raw data from the assembly line for each parameter.
   b. Calculate μ0j and σ0j from the data.
3. Generate N samples Xi from μ(i-1)j and σ(i-1)j.
   a. Evaluate Xi against the constraints and discard infeasible solutions.
4. Calculate Yi(Xi) using Equation (1).
   a. Calculate μij and σij and compare to the stopping criteria (σij < 0.001?).
   b. If true, γ* = Yi(Xi) and X* = Xi.
5. Sort Yi(Xi) and select the ρ-percentile elite solutions.
   a. The number of elite solutions = N_Elite.
6. Calculate μij and σij from Equations (8) and (9) using the elite samples.
7. Calculate v_i from Equation (10).
8. Increment the iteration number and repeat from Step 3 until the stopping criterion is met.

The Matlab code used for the cross-entropy method:

% Open the data files
O = load('nvhdata.mat', '-ASCII');
constraint = load('const.mat', '-ASCII');

% Establish the number of variables (c) and the number of samples (r)
[r, c] = size(O);
Sample(r, c) = 0;
SampleStDev(1, c) = 100;
SolutionAverage(1, 1) = 0;
SolutionStDev(1, 1) = 0;

% Initialize the counters
err = 0;
x = 0;

% Calculate the averages
j = 1;
while j < c + 1
    i = 1;
    while i < r + 1
        x = x + O(i, j);
        i = i + 1;
    end
    Average(j) = x / r;
    x = 0;
    j = j + 1;
end
Average

% Re-initialize counters and calculate the standard deviations
j = 1;
while j < c + 1
    i = 1;
    while i < r + 1
        err = err + (O(i, j) - Average(j))^2;
        i = i + 1;
    end
    StDev(j) = sqrt(err / (r - 1));
    err = 0;
    j = j + 1;
end
StDev

% Perform the optimization
n = input('How many random samples shall we generate? ');
pElite = input('What percentile of feasible solutions shall we use as the Elite Sample? ');
stop = input('What is the maximum Standard Deviation desired to achieve the optimum solution? This will apply to all variables. ');
slow = input('What weight shall we apply to the new parameters (slow factor)? ');
plots = input('Shall we create plots at the end? ', 's');

% Begin the iterations
numIterations = 0;
while max(SampleStDev) > stop
    numIterations = numIterations + 1;
    infeasible = 0;

    % Generate the new population
    j = 1;
    while j < 7
        i = 1;
        while i < n + 1
            Sample(i, j) = random('norm', Average(j), StDev(j));
            i = i + 1;
        end
        j = j + 1;
    end

    % Evaluate the regression, Equation (1); b0 through b6 denote the
    % regression coefficients taken from [4]
    i = 1;
    while i < n + 1
        Sample(i, 7) = b0 + b1*Sample(i, 1) - 2.789*Sample(i, 2) + ...
            b3*Sample(i, 3) + b4*Sample(i, 4) + b5*Sample(i, 5) + b6*Sample(i, 6);
        i = i + 1;
    end

    % Check feasibility of the solutions
    i = 1;
    while i < n + 1
        j = 1;
        while j < 7
            if Sample(i, j) > constraint(2, j) && Sample(i, j) < constraint(1, j)
                Sample(i, 8) = 2;
            else
                Sample(i, 8) = 1;
                j = 7;
                infeasible = infeasible + 1;
            end
            j = j + 1;
        end
        i = i + 1;
    end

    % Create the array of feasible solutions: sort the sample first by
    % Column 8 (the feasibility flag, descending order) and then by
    % Column 7 (the fitness value, ascending order)
    SortedSample = sortrows(Sample, [-8, 7]);
    NumFeasible = n - infeasible - 1;
    i = 1;
    while i < NumFeasible + 1
        j = 1;
        while j < 8
            Feasible(i, j) = SortedSample(i, j);
            j = j + 1;
        end
        i = i + 1;
    end

    % Calculate the averages of the feasible solutions
    j = 1;
    while j < 8
        FeasibleAverage = 0;
        i = 1;
        while i < NumFeasible + 1
            FeasibleAverage = FeasibleAverage + Feasible(i, j);
            i = i + 1;
        end
        SampleAverage(j) = FeasibleAverage / NumFeasible;
        SolutionAverage(numIterations, j) = SampleAverage(j);
        j = j + 1;
    end

    % Calculate the standard deviations of the feasible solutions
    j = 1;
    while j < 8
        FeasibleStdError = 0;
        i = 1;
        while i < NumFeasible + 1
            FeasibleStdError = FeasibleStdError + (Feasible(i, j) - SampleAverage(j))^2;
            i = i + 1;
        end
        SampleStDev(j) = sqrt(FeasibleStdError / (NumFeasible - 1));
        SolutionStDev(numIterations, j) = SampleStDev(j);
        j = j + 1;
    end

    % Calculate the elite averages
    nElite = round(NumFeasible * pElite);
    j = 1;
    while j < 8
        nEliteAverage = 0;
        i = 1;
        while i < nElite + 1
            nEliteAverage = nEliteAverage + Feasible(i, j);
            i = i + 1;
        end
        EliteAverage(j) = nEliteAverage / nElite;
        j = j + 1;
    end

    % Calculate the elite standard deviations
    j = 1;
    while j < 8
        nEliteStdError = 0;
        i = 1;
        while i < nElite + 1
            nEliteStdError = nEliteStdError + (Feasible(i, j) - EliteAverage(j))^2;
            i = i + 1;
        end
        EliteStDev(j) = sqrt(nEliteStdError / (nElite - 1));
        j = j + 1;
    end

    % Update the average and standard deviation for the next population
    NumFeasible = 0;
    Average = slow * EliteAverage + (1 - slow) * SampleAverage;
    StDev = slow * EliteStDev + (1 - slow) * SampleStDev;
end

% After the stopping criterion has been reached, display the results from
% each iteration along with the optimum
SolutionAverage
SolutionStDev
numIterations
SampleAverage
SampleStDev

% Write the results to an Excel file
xlswrite('solution.xls', SolutionAverage, 'Averages');
xlswrite('solution.xls', SolutionStDev, 'StDevs');

if plots == 'y'
    figure
    plot(SolutionAverage)
    figure
    plot(SolutionStDev)
end

% End of test. Ask to clear the memory.
reply = input('Do you want to clear everything? (y/n)[n] ', 's');
if reply == 'y'
    clear
    clc
elseif isempty(reply)
    reply = 'n';
end

Bibliography

1. Sun, Z., et al., "NVH Robustness Design of Axle Systems," SAE Transactions, v. 112.
2. Steyer, G., et al., "The Future of NVH Testing: An End-User's Perspective," SAE Technical Paper.
3. Meinhardt, G. and Sengupta, S., "Correlation of Axle Build Parameters to End-of-Line NVH Test Performance, Part I: Preparing the Multivariate Data for Regression Analysis," SAE Technical Paper.
4. Meinhardt, G. and Sengupta, S., "Correlation of Axle Build Parameters to End-of-Line NVH Test Performance, Part II: Multivariate Regression Analysis," SAE Technical Paper.
5. Meinhardt, G. and Sengupta, S., "Optimization of Axle NVH Performance Using Particle Swarm Optimization," Proceedings of the ICAM 2014, May 28-30, 2014.
6. Meinhardt, G. and Sengupta, S., "Optimization of Axle NVH Performance Using a Genetic Algorithm," Proceedings of the ICAM 2014, May 28-30, 2014.
7. Rubinstein, R., "Optimization of Computer Simulation Models with Rare Events," European Journal of Operations Research, v. 99.
8. Kroese, D., et al., "The Cross-Entropy Method for Continuous Multi-Extremal Optimization," Methodology and Computing in Applied Probability, v. 8.
9. De Boer, P., et al., "A Tutorial on the Cross-Entropy Method," Annals of Operations Research, v. 134.
10. Rubinstein, R. and Kroese, D., The Cross-Entropy Method, Springer-Verlag.
11. Kothari, R. and Kroese, D., "Optimal Generation Expansion Planning via the Cross-Entropy Method," Proceedings of the 2009 Winter Simulation Conference, IEEE.
More informationOptimal Design of a Parallel Beam System with Elastic Supports to Minimize Flexural Response to Harmonic Loading
11 th World Congress on Structural and Multidisciplinary Optimisation 07 th -12 th, June 2015, Sydney Australia Optimal Design of a Parallel Beam System with Elastic Supports to Minimize Flexural Response
More informationStochastic branch & bound applying. target oriented branch & bound method to. optimal scenario tree reduction
Stochastic branch & bound applying target oriented branch & bound method to optimal scenario tree reduction Volker Stix Vienna University of Economics Department of Information Business Augasse 2 6 A-1090
More informationInternational Journal of Digital Application & Contemporary research Website: (Volume 1, Issue 7, February 2013)
Performance Analysis of GA and PSO over Economic Load Dispatch Problem Sakshi Rajpoot sakshirajpoot1988@gmail.com Dr. Sandeep Bhongade sandeepbhongade@rediffmail.com Abstract Economic Load dispatch problem
More informationTraffic Signal Control Based On Fuzzy Artificial Neural Networks With Particle Swarm Optimization
Traffic Signal Control Based On Fuzzy Artificial Neural Networks With Particle Swarm Optimization J.Venkatesh 1, B.Chiranjeevulu 2 1 PG Student, Dept. of ECE, Viswanadha Institute of Technology And Management,
More informationDr.-Ing. Johannes Will CAD-FEM GmbH/DYNARDO GmbH dynamic software & engineering GmbH
Evolutionary and Genetic Algorithms in OptiSLang Dr.-Ing. Johannes Will CAD-FEM GmbH/DYNARDO GmbH dynamic software & engineering GmbH www.dynardo.de Genetic Algorithms (GA) versus Evolutionary Algorithms
More informationA Simplex Based Parametric Programming Method for the Large Linear Programming Problem
A Simplex Based Parametric Programming Method for the Large Linear Programming Problem Huang, Rujun, Lou, Xinyuan Abstract We present a methodology of parametric objective function coefficient programming
More informationGLOBAL LIKELIHOOD OPTIMIZATION VIA THE CROSS-ENTROPY METHOD WITH AN APPLICATION TO MIXTURE MODELS. Zdravko Botev Dirk P. Kroese
Proceedings of the 2004 Winter Simulation Conference R. G. Ingalls, M. D. Rossetti, J. S. Smith, and B. A. Peters, eds. GLOBAL LIKELIHOOD OPTIMIZATION VIA THE CROSS-ENTROPY METHOD WITH AN APPLICATION TO
More informationApproximate Evolution Strategy using Stochastic Ranking
Approximate Evolution Strategy using Stochastic Ranking Thomas Philip Runarsson, Member, IEEE Abstract The paper describes the approximation of an evolution strategy using stochastic ranking for nonlinear
More informationThe optimum design of a moving PM-type linear motor for resonance operating refrigerant compressor
International Journal of Applied Electromagnetics and Mechanics 33 (2010) 673 680 673 DOI 10.3233/JAE-2010-1172 IOS Press The optimum design of a moving PM-type linear motor for resonance operating refrigerant
More informationPrepared By. Handaru Jati, Ph.D. Universitas Negeri Yogyakarta.
Prepared By Handaru Jati, Ph.D Universitas Negeri Yogyakarta handaru@uny.ac.id Chapter 8 Using The Excel Solver To Solve Mathematical Programs Chapter Overview 8.1 Introduction 8.2 Formulating Mathematical
More informationLuo, W., and Li, Y. (2016) Benchmarking Heuristic Search and Optimisation Algorithms in Matlab. In: 22nd International Conference on Automation and Computing (ICAC), 2016, University of Essex, Colchester,
More informationMulti-Objective Memetic Algorithm using Pattern Search Filter Methods
Multi-Objective Memetic Algorithm using Pattern Search Filter Methods F. Mendes V. Sousa M.F.P. Costa A. Gaspar-Cunha IPC/I3N - Institute of Polymers and Composites, University of Minho Guimarães, Portugal
More informationBayesian Estimation for Skew Normal Distributions Using Data Augmentation
The Korean Communications in Statistics Vol. 12 No. 2, 2005 pp. 323-333 Bayesian Estimation for Skew Normal Distributions Using Data Augmentation Hea-Jung Kim 1) Abstract In this paper, we develop a MCMC
More informationARMA MODEL SELECTION USING PARTICLE SWARM OPTIMIZATION AND AIC CRITERIA. Mark S. Voss a b. and Xin Feng.
Copyright 2002 IFAC 5th Triennial World Congress, Barcelona, Spain ARMA MODEL SELECTION USING PARTICLE SWARM OPTIMIZATION AND AIC CRITERIA Mark S. Voss a b and Xin Feng a Department of Civil and Environmental
More informationSımultaneous estımatıon of Aquifer Parameters and Parameter Zonations using Genetic Algorithm
Sımultaneous estımatıon of Aquifer Parameters and Parameter Zonations using Genetic Algorithm M.Tamer AYVAZ Visiting Graduate Student Nov 20/2006 MULTIMEDIA ENVIRONMENTAL SIMULATIONS LABORATORY (MESL)
More informationTime-Domain Dynamic Analysis of Helical Gears with Reduced Housing Model
2013-01-1898 Published 05/13/2013 Copyright 2013 SAE International doi:10.4271/2013-01-1898 saeaero.saejournals.org Time-Domain Dynamic Analysis of Helical Gears with Reduced Housing Model Vijaya Kumar
More information3 Interior Point Method
3 Interior Point Method Linear programming (LP) is one of the most useful mathematical techniques. Recent advances in computer technology and algorithms have improved computational speed by several orders
More informationAutomatically Balancing Intersection Volumes in A Highway Network
Automatically Balancing Intersection Volumes in A Highway Network Jin Ren and Aziz Rahman HDR Engineering, Inc. 500 108 th Avenue NE, Suite 1200 Bellevue, WA 98004-5549 Jin.ren@hdrinc.com and 425-468-1548
More informationOptimization of Noisy Fitness Functions by means of Genetic Algorithms using History of Search with Test of Estimation
Optimization of Noisy Fitness Functions by means of Genetic Algorithms using History of Search with Test of Estimation Yasuhito Sano and Hajime Kita 2 Interdisciplinary Graduate School of Science and Engineering,
More informationAnalysis of Directional Beam Patterns from Firefly Optimization
Analysis of Directional Beam Patterns from Firefly Optimization Nicholas Misiunas, Charles Thompson and Kavitha Chandra Center for Advanced Computation and Telecommunications Department of Electrical and
More informationSynthesis of Planar Mechanisms, Part XI: Al-Jazari Quick Return-Motion Mechanism Galal Ali Hassaan Emeritus Professor, Mechanical Design & Production
Synthesis of Planar Mechanisms, Part XI: Al-Jazari Quick Return-Motion Mechanism Galal Ali Hassaan Emeritus Professor, Mechanical Design & Production Department. Faculty of Engineering, Cairo University,
More informationGRASP. Greedy Randomized Adaptive. Search Procedure
GRASP Greedy Randomized Adaptive Search Procedure Type of problems Combinatorial optimization problem: Finite ensemble E = {1,2,... n } Subset of feasible solutions F 2 Objective function f : 2 Minimisation
More informationArgha Roy* Dept. of CSE Netaji Subhash Engg. College West Bengal, India.
Volume 3, Issue 3, March 2013 ISSN: 2277 128X International Journal of Advanced Research in Computer Science and Software Engineering Research Paper Available online at: www.ijarcsse.com Training Artificial
More informationEE 553 Term Project Report Particle Swarm Optimization (PSO) and PSO with Cross-over
EE Term Project Report Particle Swarm Optimization (PSO) and PSO with Cross-over Emre Uğur February, 00 Abstract In this work, Particle Swarm Optimization (PSO) method is implemented and applied to various
More informationDealing with Categorical Data Types in a Designed Experiment
Dealing with Categorical Data Types in a Designed Experiment Part II: Sizing a Designed Experiment When Using a Binary Response Best Practice Authored by: Francisco Ortiz, PhD STAT T&E COE The goal of
More informationImmune Optimization Design of Diesel Engine Valve Spring Based on the Artificial Fish Swarm
IOSR Journal of Computer Engineering (IOSR-JCE) e-issn: 2278-661, p- ISSN: 2278-8727Volume 16, Issue 4, Ver. II (Jul-Aug. 214), PP 54-59 Immune Optimization Design of Diesel Engine Valve Spring Based on
More informationRecent advances in Metamodel of Optimal Prognosis. Lectures. Thomas Most & Johannes Will
Lectures Recent advances in Metamodel of Optimal Prognosis Thomas Most & Johannes Will presented at the Weimar Optimization and Stochastic Days 2010 Source: www.dynardo.de/en/library Recent advances in
More informationStatistical Pattern Recognition
Statistical Pattern Recognition Features and Feature Selection Hamid R. Rabiee Jafar Muhammadi Spring 2013 http://ce.sharif.edu/courses/91-92/2/ce725-1/ Agenda Features and Patterns The Curse of Size and
More informationEfficient Resources Allocation in Technological Processes Using an Approximate Algorithm Based on Random Walk
Efficient Resources Allocation in Technological Processes Using an Approximate Algorithm Based on Random Walk M.M. Bayas 1,2, V.M. Dubovoy 1 1 Department Computer Control Systems, Institute for Automatics,
More informationA heuristic approach of the estimation of process capability indices for non-normal process data using the Burr XII distribution
Noname manuscript No. (will be inserted by the editor) A heuristic approach of the estimation of process capability indices for non-normal process data using the Burr XII distribution Andrea Molina-Alonso
More informationJednociljna i višeciljna optimizacija korištenjem HUMANT algoritma
Seminar doktoranada i poslijedoktoranada 2015. Dani FESB-a 2015., Split, 25. - 31. svibnja 2015. Jednociljna i višeciljna optimizacija korištenjem HUMANT algoritma (Single-Objective and Multi-Objective
More informationImproving Convergence in Cartesian Genetic Programming Using Adaptive Crossover, Mutation and Selection
2015 IEEE Symposium Series on Computational Intelligence Improving Convergence in Cartesian Genetic Programming Using Adaptive Crossover, Mutation and Selection Roman Kalkreuth TU Dortmund University Department
More informationABC Optimization: A Co-Operative Learning Approach to Complex Routing Problems
Progress in Nonlinear Dynamics and Chaos Vol. 1, 2013, 39-46 ISSN: 2321 9238 (online) Published on 3 June 2013 www.researchmathsci.org Progress in ABC Optimization: A Co-Operative Learning Approach to
More informationMultidisciplinary Analysis and Optimization
OptiY Multidisciplinary Analysis and Optimization Process Integration OptiY is an open and multidisciplinary design environment, which provides direct and generic interfaces to many CAD/CAE-systems and
More informationOPTIMIZATION FOR SURFACE ROUGHNESS, MRR, POWER CONSUMPTION IN TURNING OF EN24 ALLOY STEEL USING GENETIC ALGORITHM
Int. J. Mech. Eng. & Rob. Res. 2014 M Adinarayana et al., 2014 Research Paper ISSN 2278 0149 www.ijmerr.com Vol. 3, No. 1, January 2014 2014 IJMERR. All Rights Reserved OPTIMIZATION FOR SURFACE ROUGHNESS,
More informationArtificial Bee Colony (ABC) Optimization Algorithm for Solving Constrained Optimization Problems
Artificial Bee Colony (ABC) Optimization Algorithm for Solving Constrained Optimization Problems Dervis Karaboga and Bahriye Basturk Erciyes University, Engineering Faculty, The Department of Computer
More informationExcel Scientific and Engineering Cookbook
Excel Scientific and Engineering Cookbook David M. Bourg O'REILLY* Beijing Cambridge Farnham Koln Paris Sebastopol Taipei Tokyo Preface xi 1. Using Excel 1 1.1 Navigating the Interface 1 1.2 Entering Data
More informationA NEW SEQUENTIAL CUTTING PLANE ALGORITHM FOR SOLVING MIXED INTEGER NONLINEAR PROGRAMMING PROBLEMS
EVOLUTIONARY METHODS FOR DESIGN, OPTIMIZATION AND CONTROL P. Neittaanmäki, J. Périaux and T. Tuovinen (Eds.) c CIMNE, Barcelona, Spain 2007 A NEW SEQUENTIAL CUTTING PLANE ALGORITHM FOR SOLVING MIXED INTEGER
More informationAn efficient algorithm for sparse PCA
An efficient algorithm for sparse PCA Yunlong He Georgia Institute of Technology School of Mathematics heyunlong@gatech.edu Renato D.C. Monteiro Georgia Institute of Technology School of Industrial & System
More informationTwo-phase strategies for the bi-objective minimum spanning tree problem
Two-phase strategies for the bi-objective minimum spanning tree problem Lavinia Amorosi a,, Justo Puerto b a Department of Statistical Sciences, Sapienza University of Rome, Italy b Department of Statistical
More informationQUANTUM BASED PSO TECHNIQUE FOR IMAGE SEGMENTATION
International Journal of Computer Engineering and Applications, Volume VIII, Issue I, Part I, October 14 QUANTUM BASED PSO TECHNIQUE FOR IMAGE SEGMENTATION Shradha Chawla 1, Vivek Panwar 2 1 Department
More informationStochastic global optimization using random forests
22nd International Congress on Modelling and Simulation, Hobart, Tasmania, Australia, 3 to 8 December 27 mssanz.org.au/modsim27 Stochastic global optimization using random forests B. L. Robertson a, C.
More informationDevelopment of a tool for the easy determination of control factor interaction in the Design of Experiments and the Taguchi Methods
Development of a tool for the easy determination of control factor interaction in the Design of Experiments and the Taguchi Methods IKUO TANABE Department of Mechanical Engineering, Nagaoka University
More informationComputational study of the step size parameter of the subgradient optimization method
1 Computational study of the step size parameter of the subgradient optimization method Mengjie Han 1 Abstract The subgradient optimization method is a simple and flexible linear programming iterative
More informationThe picasso Package for High Dimensional Regularized Sparse Learning in R
The picasso Package for High Dimensional Regularized Sparse Learning in R X. Li, J. Ge, T. Zhang, M. Wang, H. Liu, and T. Zhao Abstract We introduce an R package named picasso, which implements a unified
More informationData Preprocessing. Why Data Preprocessing? MIT-652 Data Mining Applications. Chapter 3: Data Preprocessing. Multi-Dimensional Measure of Data Quality
Why Data Preprocessing? Data in the real world is dirty incomplete: lacking attribute values, lacking certain attributes of interest, or containing only aggregate data e.g., occupation = noisy: containing
More informationA Taguchi Approach to Parameter Setting in a Genetic Algorithm for General Job Shop Scheduling Problem
IEMS Vol. 6, No., pp. 9-4, December 007. A Taguchi Approach to Parameter Setting in a Genetic Algorithm for General Job Shop Scheduling Problem Ji Ung Sun School of Industrial & Managment Engineering Hankuk
More informationA penalty based filters method in direct search optimization
A penalty based filters method in direct search optimization Aldina Correia CIICESI / ESTG P.PORTO Felgueiras, Portugal aic@estg.ipp.pt João Matias CM-UTAD UTAD Vila Real, Portugal j matias@utad.pt Pedro
More information