1 Lab + Hwk 5: Particle Swarm Optimization


This laboratory requires the following equipment:
- C programming tools (gcc, make)
- Webots simulation software
- Webots User Guide
- Webots Reference Manual

The laboratory duration is about 3 hours. Homework will be due on the 7th day after your lab session, at 12 noon. Please format your solution as a PDF file with the name [name]_lab[#].pdf, where [name] is your account user name and [#] is the number of the lab+homework assignment, and upload it as the Report submission onto the student section of the course webpage. All additional material (movies, code, and any other relevant files) should be archived as a gzipped tar file with the name [name]_lab[#].tgz (hint: tar cvzf [name]_lab[#].tgz [directory]) and uploaded as the Additional Material submission onto the student section of the course webpage. Every misnamed file will result in a 1-point penalty to your assignment grade.

Keep in mind that no late solutions will be accepted unless supported by health reasons (and a note from the doctor). If there are extreme circumstances which will prevent you from attending a particular lab session, it may be possible to make arrangements with the TAs before the lab in question takes place. The more advance notice you can give in these situations, the more likely it is that we will be able to work something out.

1.1 Grading

In the following text you will find several exercises and questions. The notation S x means that the question can be solved using only additional simulation; you are not required to write or produce anything for these questions. The notation Q x means that the question can be answered theoretically, without any simulation; your answers to these questions must be submitted in your Report. The length of answers should be approximately two sentences unless otherwise noted.
The notation I x means that the problem has to be solved by implementing a piece of code and performing a simulation; the code written for these questions must be submitted in your Additional Material. Please use this notation in your answer file! The number of points for each exercise is given between parentheses. The combined total number of points for the laboratory and homework exercises is 100.

JP; TL, Lab+Hwk 5: Particle Swarm Optimization

1.2 Optimization

In many instances, we want to find either the minimum or maximum value of some function. If the function in question is complex or even unknown, it is impossible to accomplish this task analytically. Furthermore, if the function parameter space is very large, a systematic search of this space is often too computationally expensive. Therefore, a number of techniques have been developed to find near-optimal solutions (i.e. local minima and maxima), such as genetic algorithms and particle swarm optimization.

1.3 The Particle Swarm

To find minima/maxima, repeated evaluations of the function in question must be made at different points. A good optimization algorithm uses the information obtained from these evaluations to choose the locations of future evaluations, and eventually to decide where the minimum/maximum is. In Particle Swarm Optimization, a set of particles is initialized with a random position for each particle. Each particle is also assigned a random velocity in the search space. Particles are then flown through the search space, moving at their respective velocities. Optimization is achieved by each particle remembering the position at which it achieved its best evaluation of the function, called the particle's personal best. Particles also remember the best-achieving position of their neighborhood. The neighborhood for some particle A is a group of particles to which A belongs. This group can be topological (whichever particles are closest) or fixed throughout the algorithm. It can consist of either a subset of the particle swarm (local neighborhood) or the entire swarm (global neighborhood). The neighborhood best of a neighborhood is the position that yielded the best evaluation by any particle in the neighborhood. At each generation of the algorithm, the velocity of each particle is updated using a randomized attraction to both the particle best and the neighborhood best. This allows particles to move towards areas of the search space that have yielded good results and, in doing so, discover nearby better results. In this way, the particles will eventually converge on (possibly global) optima.
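The update step described above can be sketched in C as follows. This is a minimal sketch only: the parameter names w, c1, c2 and the array layout are illustrative and not taken from the lab's pso.c (w is the inertia coefficient introduced later in the lab; setting w = 1 gives the basic update with only the VMAX clamp).

```c
#include <stdlib.h>
#include <assert.h>

#define DIM   10   /* dimensions per particle */
#define SWARM 30   /* swarm size */

double pos[SWARM][DIM], vel[SWARM][DIM];
double pbest[SWARM][DIM];  /* personal-best position of each particle */
double nbest[SWARM][DIM];  /* best position found in each particle's neighborhood */

static double urand(void)  /* uniform random number in [0, 1] */
{
    return (double)rand() / RAND_MAX;
}

/* One update step for particle p: randomized attraction to the personal
 * best and the neighborhood best, with the velocity clamped to vmax. */
void update_particle(int p, double w, double c1, double c2, double vmax)
{
    for (int d = 0; d < DIM; d++) {
        vel[p][d] = w * vel[p][d]
                  + c1 * urand() * (pbest[p][d] - pos[p][d])
                  + c2 * urand() * (nbest[p][d] - pos[p][d]);
        if (vel[p][d] >  vmax) vel[p][d] =  vmax;
        if (vel[p][d] < -vmax) vel[p][d] = -vmax;
        pos[p][d] += vel[p][d];
    }
}
```

Because the attraction terms are scaled by fresh random numbers in every dimension and generation, two particles starting at the same position generally follow different trajectories.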
The idea for this type of behavior took inspiration from the way birds act while flying in a flock.

2 Lab: Using PSO

2.1 PSO on the Sphere Function

The Sphere function is defined as follows:

f(x) = \sum_{i=1}^{n} x_i^2

We want to find the set of x_i that gives the value closest to 0 (achieved by x_i = 0 for all i). The values for x_i are initially randomly distributed between … and … . This function is often used as a simple test for optimization algorithms; although it may seem trivial to a human observer, it can be difficult for an optimization algorithm to figure out that the best solution is all zeroes. We will use PSO with a swarm size of 30 and the neighborhood of a particle given by the three nearest particles on each side (e.g. particle 8 has neighborhood {5, 6, 7, 8, 9, 10, 11}), with particle 29 being next to particle 0. We will optimize the Sphere function with n = 10 and, therefore, ten components in the vector representing each particle. In other words, a particle is a set of 10 numbers, and the closer to 0 all the numbers become, the better the performance.

Copy the pso directory from the student section of the course webpage into your home directory. Compile the PSO program using make. You can execute the program with ./main <number of iterations>.

S: Run the program with 10 iterations, 100 iterations, 1000 iterations, and … iterations. At the end of each trial, both the achieved value and the found solution for 10 runs will be printed, along with the average value. Lower values mean better solutions.

Q 1 (5): What do you observe?

S: Try the simulation with … iterations.

Q 2 (5): How many iterations do you think will be necessary to get the exact answer? Why?

S: The definition VMAX in main.c controls the maximum velocity that particles can achieve in the simulation. Using 100 iterations, try varying VMAX from 0.5 to 4.0.

Q 3 (5): What do you observe?

S: Try varying VMAX from … to … .

Q 4 (5): What happens now? Why?

It can be very inconvenient to have to tune VMAX to give optimal performance for a simulation. A feature that was developed for PSO soon after its invention is the inertia coefficient: a coefficient, typically less than 1, which is multiplied with the velocity at each iteration in order to naturally slow particles down without imposing a hard velocity threshold.

S: Uncomment line 77 of pso.c (it will be marked in the code), set VMAX to 4.0, and run the program again. Try increasing the number of iterations.

Q 5 (5): What do you observe now?

2.2 PSO for Unsupervised Learning

We've established that PSO can do a good job of optimizing simple mathematical functions. The next step is to test it in a more demanding environment, in particular in the presence of noisy functions.
In "Evolution of Homing Navigation in a Real Mobile Robot" (Floreano and Mondada, 1996), a genetic algorithm was used to train a robot to move through a maze and avoid obstacles. We want to use Webots to evolve robotic controllers for the mobile robots, comparing the performance of GA and PSO. For the experiments in this lab, we use the e-puck robot instead of the Khepera that was used in the paper. The behavior we wish to achieve on the robots is obstacle avoidance, the same as in the Floreano and Mondada paper.

Copy the ga_obs_sup, pso_obs_sup and obs_con controllers and the ga_obs and pso_obs worlds from the student section of the course webpage. Compile all controllers. These worlds are simple walled arenas with no obstacles and twenty e-puck robots. We use a two-neuron, single-layer neural network to control each robot, with the proximity sensors and the relative position of the center of mass as inputs and the motor speeds as outputs. There are recursive and lateral connections from the outputs of both neurons back into each other. The controller we evolve is the set of weights for these neurons (8 proximity sensors + 2 recursive/lateral connections + 1 threshold = 11 weights for each neuron => 22 weights total). The fitness of a controller is measured with the equation:

F = V (1 - \sqrt{v}) (1 - i),   V, v, i \in [0, 1]

where V is a measure of the average rotation speed of the two wheels over the trial, v is a measure of the average difference in wheel speed of the two wheels over the trial, and i is the average activation value of the proximity sensor with the highest activity. These terms in the fitness function encourage the robot to go fast, go straight, and not stay close to walls, respectively. The default time for a single evaluation of fitness is ~30 s.

The genetic algorithm we're using is similar to that used in 1996 by Floreano and Mondada. It selects the best-performing half of the population as potential parents. Out of these, parents are selected using a roulette-wheel scheme. Crossover occurs with probability 0.2 and mutation occurs with probability 0.15 to create children that replenish the missing half of the population.

2.3 Fast Heterogeneous Learning

Using PSO or GA on a single robot, it can take a long time to evolve high-performing controllers. This is because the robot has to test every member of the population/swarm one after another for each iteration of the algorithm. For example, doing 100 iterations of PSO with 20 particles, with each evaluation taking 1 minute, means we need 2000 minutes, or over 33 hours, to complete! One way to speed up this process is to use multiple robots doing heterogeneous learning. In heterogeneous learning, each robot is using a different controller than the others.
By using multiple robots, we can distribute the particles among the robots and evaluate their performance in parallel, thus saving a lot of time. Going back to our previous example, if we have 20 robots each evaluating a different particle, it will only take 1 minute to evaluate 20 particles, and the total evolution time goes down to about 1.7 hours. We will now test evolution of obstacle avoidance behavior using heterogeneous learning with GA and PSO. In the following questions, you will be asked to run simulations that may take several minutes to complete. In order to save time, you may want to look ahead to the next question to see if it is something you can work on while you wait.

S: Load the ga_obs world. Run the world to simulate the evolution. You will need to use fast mode to simulate quickly enough, but you may want to observe occasionally how the robots are behaving (pay attention to whether they are near the beginning or end of an evolutionary run). The simulation will do 10 separate evolutionary runs and print the final fitness for each, as well as the average fitness at the end. When the evolution is finished, observe the performance of the robots (they will be running the best found controller).
Q 6 (5): What kind of fitness values do you get?

S: Now load the pso_obs world, and run the simulations again.

Q 7 (5): How does the performance compare?

We now switch to using a modified version of our genetic algorithm specifically designed for noisy fitness function evaluation (see Zhang et al., 2003). At each iteration, the performance of the potential parent set is re-evaluated, and the new fitness value is combined with previous values to get a combined estimate.

Q 8 (5): Why might this modification help to improve performance?

S: In the ga.c file, set the NOISY definition from 0 to 1. This enables the modification. Run the simulations again.

Q 9 (5): Record the final fitness values. How do the new values compare?

The modification of GA for handling noisy fitness function evaluations was adapted for PSO in "Particle Swarm Optimization for Unsupervised Robotic Learning" (Pugh et al., 2005). Here, at each iteration, the previous best positions of the particles are re-evaluated, and the new fitness value is combined with previous ones.

S: In the pso.c file, set the NOISY definition from 0 to 1. This enables the modification. Run the simulations again.

Q 10 (5): Record the final fitness values. How do the new fitness values compare?

Q 11 (5): In both modifications, the number of iterations run by the algorithm is divided by two. Why does this make sense?
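One plausible way to combine a re-evaluated fitness with earlier estimates is a running average, sketched below. This is an assumption for illustration only; the actual combination scheme in the lab's ga.c and pso.c may differ.

```c
/* Combine a new, noisy fitness sample with the running estimate built
 * from n_prev earlier evaluations of the same candidate. */
double combine_fitness(double prev_estimate, int n_prev, double new_sample)
{
    return (prev_estimate * n_prev + new_sample) / (n_prev + 1);
}
```

Each re-evaluation then refines the estimate, so a single lucky (or unlucky) noisy evaluation has progressively less influence on which candidates survive.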
3 Hwk: Analyzing PSO

Depending on the function being optimized, different parameter settings will give better results in PSO. We will now explore the influence of parameters on a more complex numerical function, the Rastrigin function, given by:

f(x) = \sum_{i=1}^{n} [x_i^2 - 10 \cos(2 \pi x_i) + 10]

This function contains many local optima, which can cause premature convergence of particles.

S: Go back to the pso directory. Rerun optimization of the Sphere function (with the inertia coefficient), setting the number of particles to 3, 5, 10, 15, 30, and 60. For each number of particles n, set the number of iterations to 3000/n (i.e. 1000, 600, 300, 200, 100, 50; this will keep the computational time approximately constant). You can specify the number of particles from the command line with ./main <number of iterations> <number of particles>.

Q 12 (5): What performances did you get for each number of particles? Which number is best?

S: In pso.c, change the definition of FUNCTION from 1 to 2. This selects the Rastrigin function. Try running the simulation again while varying the number of particles.

Q 13 (5): What performances did you get now for each number of particles? Which number is best?

Q 14 (10): How do the results for the Sphere and Rastrigin functions compare? Why is this happening? You may need up to five lines to fully explain this.

Next, we will explore when it is useful to use a global neighborhood, where all the particles are attracted to the single best found position, and when it is useful to use a local neighborhood, where different groups of particles are attracted to different positions.

S: On the Sphere function, rerun the optimization, setting the definition of NB (number of neighbors on each side) to 1 (local neighborhood) and 15 (global neighborhood), with 15 particles and 100 iterations.

Q 15 (5): What performances did you get? Which neighborhood type is best?
S: Now set the number of iterations to 1000, and run the same experiments on the Rastrigin function.

Q 16 (5): What performances did you get now? Which neighborhood type is best?

Q 17 (5): For evolving aggregation in Webots, robots can cluster well either by moving forwards towards the group or by moving backwards towards the group. However, robots are better able to navigate into a tighter cluster going forwards, as they have more proximity sensors to see what they are approaching, and this will therefore give a slightly better fitness overall. Would a global or local neighborhood be preferable here?

Q 18 (10): What if you wanted to optimize the Sphere function, but with Gaussian noise added? Would this affect your neighborhood choice? Why?
4 References

Floreano, D. and Mondada, F., "Evolution of Homing Navigation in a Real Mobile Robot", IEEE Transactions on Systems, Man, and Cybernetics, Part B, 26(3), 1996.

Zhang, Y., Martinoli, A., and Antonsson, E. K., "Evolutionary Design of a Collective Sensory System", in H. Lipson, E. K. Antonsson, and J. R. Koza, editors, Proc. of the 2003 AAAI Spring Symposium on Computational Synthesis, AAAI Press, Stanford, CA, March 2003.

Pugh, J., Zhang, Y., and Martinoli, A., "Particle Swarm Optimization for Unsupervised Robotic Learning", in Proc. of the IEEE Swarm Intelligence Symposium 2005, Pasadena, CA.
More informationlab A.3: introduction to RoboLab vocabulary materials cc30.03 Brooklyn College, CUNY c 2006 Name: RoboLab communication tower canvas icon
cc30.03 Brooklyn College, CUNY c 2006 lab A.3: introduction to RoboLab Name: vocabulary RoboLab communication tower canvas icon draganddrop function palette tools palette program algorithm syntax error
More informationNew Method for Accurate Parameter Estimation of Induction Motors Based on Artificial Bee Colony Algorithm
New Method for Accurate Parameter Estimation of Induction Motors Based on Artificial Bee Colony Algorithm Mohammad Jamadi Zanjan, Iran Email: jamadi.mohammad@yahoo.com Farshad MerrikhBayat University
More informationCS5401 FS2015 Exam 1 Key
CS5401 FS2015 Exam 1 Key This is a closedbook, closednotes exam. The only items you are allowed to use are writing implements. Mark each sheet of paper you use with your name and the string cs5401fs2015
More informationModeling and Simulating Social Systems with MATLAB
Modeling and Simulating Social Systems with MATLAB Lecture 7 Game Theory / AgentBased Modeling Stefano Balietti, Olivia Woolley, Lloyd Sanders, Dirk Helbing Computational Social Science ETH Zürich 02112015
More informationSegmentation of Noisy Binary Images Containing Circular and Elliptical Objects using Genetic Algorithms
Segmentation of Noisy Binary Images Containing Circular and Elliptical Objects using Genetic Algorithms B. D. Phulpagar Computer Engg. Dept. P. E. S. M. C. O. E., Pune, India. R. S. Bichkar Prof. ( Dept.
More informationEvolutionary Computation Algorithms for Cryptanalysis: A Study
Evolutionary Computation Algorithms for Cryptanalysis: A Study Poonam Garg Information Technology and Management Dept. Institute of Management Technology Ghaziabad, India pgarg@imt.edu Abstract The cryptanalysis
More informationAn Improved Genetic Algorithm based Fault tolerance Method for distributed wireless sensor networks.
An Improved Genetic Algorithm based Fault tolerance Method for distributed wireless sensor networks. Anagha Nanoti, Prof. R. K. Krishna M.Tech student in Department of Computer Science 1, Department of
More informationModule 1 Lecture Notes 2. Optimization Problem and Model Formulation
Optimization Methods: Introduction and Basic concepts 1 Module 1 Lecture Notes 2 Optimization Problem and Model Formulation Introduction In the previous lecture we studied the evolution of optimization
More informationIntroduction to Genetic Algorithms. Based on Chapter 10 of Marsland Chapter 9 of Mitchell
Introduction to Genetic Algorithms Based on Chapter 10 of Marsland Chapter 9 of Mitchell Genetic Algorithms  History Pioneered by John Holland in the 1970s Became popular in the late 1980s Based on ideas
More informationJob Scheduling on Computational Grids Using Fuzzy Particle Swarm Algorithm
Job Scheduling on Computational Grids Using Fuzzy Particle Swarm Algorithm Ajith Abraham 1,3, Hongbo Liu 2, and Weishi Zhang 3 1 School of Computer Science and Engineering, ChungAng University, Seoul,
More informationBenchmark Functions for the CEC 2008 Special Session and Competition on Large Scale Global Optimization
Benchmark Functions for the CEC 2008 Special Session and Competition on Large Scale Global Optimization K. Tang 1, X. Yao 1, 2, P. N. Suganthan 3, C. MacNish 4, Y. P. Chen 5, C. M. Chen 5, Z. Yang 1 1
More informationGeneral Instructions. Questions
CS246: Mining Massive Data Sets Winter 2018 Problem Set 2 Due 11:59pm February 8, 2018 Only one late period is allowed for this homework (11:59pm 2/13). General Instructions Submission instructions: These
More informationApplication of Genetic Algorithms to CFD. Cameron McCartney
Application of Genetic Algorithms to CFD Cameron McCartney Introduction define and describe genetic algorithms (GAs) and genetic programming (GP) propose possible applications of GA/GP to CFD Application
More informationOverlapping Swarm Intelligence for Training Artificial Neural Networks
Overlapping Swarm Intelligence for Training Artificial Neural Networks Karthik Ganesan Pillai Department of Computer Science Montana State University EPS 357, PO Box 173880 Bozeman, MT 597173880 k.ganesanpillai@cs.montana.edu
More informationPin type Clamping Attachment for Remote Setup of Machining Process
Pin type Clamping Attachment for Remote Setup of Machining Process Afzeri, R. Muhida, Darmawan, and A. N. Berahim Abstract Sharing the manufacturing facility through remote operation and monitoring of
More informationGENETIC ALGORITHM with HandsOn exercise
GENETIC ALGORITHM with HandsOn exercise Adopted From Lecture by Michael Negnevitsky, Electrical Engineering & Computer Science University of Tasmania 1 Objective To understand the processes ie. GAs Basic
More informationWhat Makes A Successful Society?
What Makes A Successful Society? Experiments With Population Topologies in Particle Swarms Rui Mendes and José Neves Departamento de Informática Universidade do Minho Portugal Abstract. Previous studies
More informationClustering Color/Intensity. Group together pixels of similar color/intensity.
Clustering Color/Intensity Group together pixels of similar color/intensity. Agglomerative Clustering Cluster = connected pixels with similar color. Optimal decomposition may be hard. For example, find
More informationModified Particle Swarm Optimization with Novel Modulated Inertia for Velocity Update
Modified Particle Swarm Optimization with Novel Modulated Inertia for Velocity Update Abdul Hadi Hamdan #1, Fazida Hanim Hashim #2, Abdullah Zawawi Mohamed *3, W. M. Diyana W. Zaki #4, Aini Hussain #5
More informationAssessing Particle Swarm Optimizers Using Network Science Metrics
Assessing Particle Swarm Optimizers Using Network Science Metrics Marcos A. C. OliveiraJúnior, Carmelo J. A. BastosFilho and Ronaldo Menezes Abstract Particle Swarm Optimizers (PSOs) have been widely
More informationOptimization of Benchmark Functions Using Genetic Algorithm
Optimization of Benchmark s Using Genetic Algorithm Vinod Goyal GJUS&T, Hisar Sakshi Dhingra GJUS&T, Hisar Jyoti Goyat GJUS&T, Hisar Dr Sanjay Singla IET Bhaddal Technical Campus, Ropar, Punjab Abstrat
More informationCSE 114, Computer Science 1 Course Information. Spring 2017 Stony Brook University Instructor: Dr. Paul Fodor
CSE 114, Computer Science 1 Course Information Spring 2017 Stony Brook University Instructor: Dr. Paul Fodor http://www.cs.stonybrook.edu/~cse114 Course Description Procedural and objectoriented programming
More informationEE 589 INTRODUCTION TO ARTIFICIAL NETWORK REPORT OF THE TERM PROJECT REAL TIME ODOR RECOGNATION SYSTEM FATMA ÖZYURT SANCAR
EE 589 INTRODUCTION TO ARTIFICIAL NETWORK REPORT OF THE TERM PROJECT REAL TIME ODOR RECOGNATION SYSTEM FATMA ÖZYURT SANCAR 1.Introductıon. 2.Multi Layer Perception.. 3.Fuzzy CMeans Clustering.. 4.Real
More informationGenetic Algorithm, Extremal Optimization, and Particle Swarm Optimization Applied to the Discrete Network Configuration Problem
Genetic Algorithm, Extremal Optimization, and Particle Swarm Optimization Applied to the Discrete Network Configuration Problem M. Martin, E. Drucker, and W.D. Potter Artificial Intelligence Center, University
More informationParticle Swarm Optimization For NQueens Problem
Journal of Advanced Computer Science and Technology, 1 (2) (2012) 5763 Science Publishing Corporation www.sciencepubco.com/index.php/jacst Particle Swarm Optimization For NQueens Problem Aftab Ahmed,
More informationGenetic algorithm based on number of children and height task for multiprocessor task Scheduling
Genetic algorithm based on number of children and height task for multiprocessor task Scheduling Marjan Abdeyazdan 1,Vahid Arjmand 2,Amir masoud Rahmani 3, Hamid Raeis ghanavati 4 1 Department of Computer
More informationLab 1 Implementing a Simon Says Game
ECE2049 Embedded Computing in Engineering Design Lab 1 Implementing a Simon Says Game In the late 1970s and early 1980s, one of the first and most popular electronic games was Simon by Milton Bradley.
More informationAUTONOMOUS CONTROL OF AN OMNIDIRECTIONAL MOBILE ROBOT
Projects, Vol. 11, 2004 ISSN 11728426 Printed in New Zealand. All rights reserved. 2004 College of Sciences, Massey University AUTONOMOUS CONTROL OF AN OMNIDIRECTIONAL MOBILE ROBOT C. J. Duncan Abstract:
More informationA Genetic Algorithm for Graph Matching using Graph Node Characteristics 1 2
Chapter 5 A Genetic Algorithm for Graph Matching using Graph Node Characteristics 1 2 Graph Matching has attracted the exploration of applying new computing paradigms because of the large number of applications
More informationClustering. Lecture 6, 1/24/03 ECS289A
Clustering Lecture 6, 1/24/03 What is Clustering? Given n objects, assign them to groups (clusters) based on their similarity Unsupervised Machine Learning Class Discovery Difficult, and maybe illposed
More informationYou may refer to the lesson on data structures (Introduction to Data Structures) as necessary.
The Science of Computing I Living with Cyber Raspberry Pi Activity 4: Arraynging Things In this activity, you will implement the insertion sort. You will need the following items: Raspberry Pi B v2 with
More informationPath Planning Optimization Using Genetic Algorithm A Literature Review
International Journal of Computational Engineering Research Vol, 03 Issue, 4 Path Planning Optimization Using Genetic Algorithm A Literature Review 1, Er. Waghoo Parvez, 2, Er. Sonal Dhar 1, (Department
More informationDefining a Standard for Particle Swarm Optimization
Defining a Standard for Particle Swarm Optimization Daniel Bratton Department of Computing Goldsmiths College University of London London, UK Email: dbratton@gmail.com James Kennedy US Bureau of Labor
More informationSUPERVISED classification techniques classify the input
JOURNAL OF L A TEX CLASS FILES, VOL 6, NO 1, JANUARY 2007 1 Feature Selection Based on Hybridization of Genetic Algorithm and Particle Swarm Optimization Pedram Ghamisi, Student Member, IEEE and Jon Atli
More informationCOMBINED METHOD TO VISUALISE AND REDUCE DIMENSIONALITY OF THE FINANCIAL DATA SETS
COMBINED METHOD TO VISUALISE AND REDUCE DIMENSIONALITY OF THE FINANCIAL DATA SETS Toomas Kirt Supervisor: Leo Võhandu Tallinn Technical University Toomas.Kirt@mail.ee Abstract: Key words: For the visualisation
More informationGenetic Algorithm Performance with Different Selection Methods in Solving MultiObjective Network Design Problem
etic Algorithm Performance with Different Selection Methods in Solving MultiObjective Network Design Problem R. O. Oladele Department of Computer Science University of Ilorin P.M.B. 1515, Ilorin, NIGERIA
More informationIntelligent Reduction of Tire Noise
Intelligent Reduction of Tire Noise Matthias Becker and Helena Szczerbicka University Hannover Welfengarten 3067 Hannover, Germany xmb@sim.unihannover.de Abstract. In this paper we report about deployment
More informationArtificial Intelligence
Artificial Intelligence Information Systems and Machine Learning Lab (ISMLL) Tomáš Horváth 10 rd November, 2010 Informed Search and Exploration Example (again) Informed strategy we use a problemspecific
More information