Natural Computing. Lecture 13: Particle swarm optimisation (INFR09038, 5/11/2010)


Natural Computing, Lecture 13: Particle swarm optimisation. Michael Herrmann, mherrman@inf.ed.ac.uk, phone: 0131 6 517177, Informatics Forum 1.42, INFR09038, 5/11/2010

Swarm intelligence. Collective intelligence: a super-organism emerges from the interaction of individuals. The super-organism has abilities that are not present in the individuals (it "is more intelligent"); the whole is more than the sum of its parts. Mechanisms: cooperation and competition, self-organisation, and communication. Examples: social animals (incl. ants), smart mobs, immune system, neural networks, internet, swarm robotics. Beni, G., Wang, J.: Swarm Intelligence in Cellular Robotic Systems, Proc. NATO Adv. Workshop on Robots and Biological Systems, Tuscany, Italy, 26-30/6 (1989)

Swarm intelligence: application areas. Biological and social modelling; movie effects; dynamic optimization (routing optimization, structure optimization, data mining, data clustering); organic computing; swarm robotics.

Swarms in robotics and biology. Robotics/AI: main interest in pattern synthesis: self-organization, self-reproduction, self-healing, self-configuration, construction. Biology/Sociology: main interest in pattern analysis: recognizing the best pattern, optimizing a path, minimal conditions ("not what, but why"), modelling. "Dumb parts, properly connected into a swarm, yield smart results." (Kevin Kelly)

Complex behaviour from simple rules. Rule 1: Separation: avoid collision with neighbouring agents. Rule 2: Alignment: match the velocity of neighbouring agents. Rule 3: Cohesion: stay near neighbouring agents. (A minimal velocity-update sketch of these rules follows below.)
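The three rules can be combined into a single velocity update per agent. The following sketch is not from the slides; the function name boids_velocity_update and the weights w_sep, w_align, w_coh are illustrative assumptions for one agent given the positions and velocities of its neighbours:

import numpy as np

def boids_velocity_update(pos, vel, neighbour_pos, neighbour_vel,
                          w_sep=1.5, w_align=1.0, w_coh=1.0):
    # pos, vel: (m,) arrays for one agent; neighbour_*: (k, m) arrays for its k neighbours
    # Rule 1: Separation: steer away from the average offset towards the neighbours
    separation = -np.mean(neighbour_pos - pos, axis=0)
    # Rule 2: Alignment: steer towards the average neighbour velocity
    alignment = np.mean(neighbour_vel, axis=0) - vel
    # Rule 3: Cohesion: steer towards the centre of mass of the neighbours
    cohesion = np.mean(neighbour_pos, axis=0) - pos
    return vel + w_sep * separation + w_align * alignment + w_coh * cohesion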

Towards a computational principle. Evaluate your present position; compare it to your previous best and your neighbourhood best; imitate self and others. Hypothesis: there are two major sources of cognition, namely own experience and communication from others. (Leon Festinger, 1954/1999, Social Communication and Cognition)

Particle Swarm Optimization (PSO). A method for finding an optimal solution to an objective function. Direct search, i.e. gradient free. Simple and quasi-identical units. Asynchronous; decentralized control. Intermediate number of units: ~ 10^1 - 10^2 (<< 10^23). Redundancy leads to reliability and adaptation. PSO is one of the computational algorithms in the field of swarm intelligence (another one is ACO). J. Kennedy and R. Eberhart, Particle swarm optimization, in Proc. IEEE Int. Conf. on Neural Networks, Piscataway, NJ, pp. 1942-1948, 1995.

PSO algorithm: Initialization. Fitness function f: R^m -> R. Number of particles n = 20, ..., 200. Particle positions x_i in R^m, i = 1, ..., n. Particle velocities v_i in R^m, i = 1, ..., n. Current best of each particle x̂_i ("simple nostalgia"). Global best ĝ ("group norm"). Initialize constants ω, α_1, α_2.

The canonical PSO algorithm. For each particle, i.e. for all members of the swarm i = 1, ..., n:
create random vectors r_1, r_2 with components drawn from U[0,1];
update velocities: v_i <- ω v_i + α_1 r_1 ∘ (x̂_i - x_i) + α_2 r_2 ∘ (ĝ - x_i), where ∘ denotes componentwise multiplication;
update positions: x_i <- x_i + v_i;
update local bests: x̂_i <- x_i if f(x_i) < f(x̂_i);
update global best: ĝ <- x_i if f(x_i) < f(ĝ) (minimization problem!).

Comparison of GA and PSO. Generally similar:
1. Random generation of an initial population.
2. Calculation of a fitness value for each individual.
3. Reproduction of the population based on fitness values.
4. If requirements are met, then stop. Otherwise go back to 2.
Modification of individuals: in GA by genetic operators; in PSO particles update themselves with their internal velocity, and they also have memory.
Sharing of information: mutual in GA, where the whole population moves as a group towards the optimal area; one-way in PSO, where the source of information is only gbest (or lbest), and all particles tend to converge to the best solution quickly.
Representation: GA discrete, PSO continuous.
www.swarmintelligence.org/tutorials.php

PSO as MBS (model-based search). As in GA, the model is actually a population (which can be represented by a probabilistic model). Generate new samples from the individual particles of the previous iteration by random modifications. Use memory of the global, neighbourhood or personal best for learning.

Initialization

import numpy as np

# Parameters
n_particles = 25              # typical range 20 - 200
m_dimensions = 2              # dimension of the search space (example value)
w = 0.1                       # omega, typical range 0.01 - 0.7
a1 = a2 = 2.0                 # alpha range 0 - 4, both equal
# maximum velocity: no larger than the range of x per step, or 10-20% of this range
lower_limit, upper_limit = -5.0, 5.0   # search range (example values, not given on the slide)

# Initialize the particle positions and their velocities
X = lower_limit + (upper_limit - lower_limit) * np.random.rand(n_particles, m_dimensions)
assert X.shape == (n_particles, m_dimensions)
V = np.zeros(X.shape)

# Initialize the global and local bests to the worst possible fitness
fitness_gbest = np.inf
fitness_lbest = fitness_gbest * np.ones(n_particles)
X_lbest = X.copy()            # personal best positions (needed by the main loop)
X_gbest = X[0, :].copy()      # global best position (placeholder until the first update)

Main loop (next page)

for k in range(T_iterations):                       # loop until convergence
    fitness_X = evaluate_fitness(X)                 # evaluate fitness of each particle

    for i in range(n_particles):                    # update local bests
        if fitness_X[i] < fitness_lbest[i]:
            fitness_lbest[i] = fitness_X[i]
            X_lbest[i, :] = X[i, :]

    min_fitness_index = np.argmin(fitness_X)        # update global best
    min_fitness = fitness_X[min_fitness_index]
    if min_fitness < fitness_gbest:
        fitness_gbest = min_fitness
        X_gbest = X[min_fitness_index, :].copy()

    for i in range(n_particles):                    # update velocities and positions
        for j in range(m_dimensions):
            R1 = np.random.rand()                   # uniform random numbers in [0, 1]
            R2 = np.random.rand()
            V[i, j] = (w * V[i, j]
                       + a1 * R1 * (X_lbest[i, j] - X[i, j])
                       + a2 * R2 * (X_gbest[j] - X[i, j]))
            X[i, j] = X[i, j] + V[i, j]
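As a usage sketch (not part of the original slides), the two code blocks above can be run on the sphere function f(x) = sum_j x_j^2; evaluate_fitness and T_iterations are the placeholder names assumed by the pseudocode and must be defined before the main loop:

def evaluate_fitness(X):
    # sphere function: sum of squares per particle, minimum 0 at the origin
    return np.sum(X**2, axis=1)

T_iterations = 100

# After running the initialization block and then the main loop,
# X_gbest should lie close to the origin and fitness_gbest close to 0.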

Illustrative example (figures from Marco A. Montes de Oca, PSO Introduction).

Convergence: how does it work? Exploratory behaviour: search a broad region of space. Exploitative behaviour: locally oriented search to approach a (possibly local) optimum. Parameters have to be chosen to properly balance exploration and exploitation, i.e. to avoid premature convergence to a local optimum yet still ensure a good rate of convergence to the optimum. Exploration: the swarm collapses (or rather diverges, oscillates, or is critical). Exploitation: the global best approaches the global optimum (or rather, for a collapse of the swarm, a local optimum). Mathematical attempts (typically oversimplified): convergence to the global optimum for a 1-particle swarm after infinite time (F. van den Bergh, 2001); see PSO at en.wikipedia.org.

Repulsive PSO algorithm. For each particle i = 1, ..., n: create random vectors r_1, r_2, r_3 with components drawn from U[0,1]; update velocities: v_i <- ω v_i + α_1 r_1 ∘ (x̂_i - x_i) + α_2 r_2 ∘ (ŷ - x_i) + α_3 r_3 ∘ z, where ŷ is the personal best of a randomly chosen other particle (a random neighbour), α_2 < 0, z is a random velocity, and ∘ denotes componentwise multiplication; update positions etc. as before. Properties: sometimes slower, but more robust and efficient.
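A sketch of the repulsive velocity update for one particle, in the same style as the main loop above; the helper name repulsive_velocity_update and the coefficient names a1, a2 (< 0), a3 are illustrative, and the update is a reconstruction rather than a quotation of the slide:

def repulsive_velocity_update(i, X, V, X_lbest, w, a1, a2, a3):
    # a2 < 0: repulsion from the personal best of a randomly chosen other particle
    n_particles, m_dimensions = X.shape
    r1 = np.random.rand(m_dimensions)
    r2 = np.random.rand(m_dimensions)
    r3 = np.random.rand(m_dimensions)
    j = np.random.randint(n_particles)           # random other particle ("random neighbour")
    y_hat = X_lbest[j, :]                        # its personal best position
    z = np.random.rand(m_dimensions) - 0.5       # random velocity component
    return (w * V[i, :]
            + a1 * r1 * (X_lbest[i, :] - X[i, :])
            + a2 * r2 * (y_hat - X[i, :])
            + a3 * r3 * z)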

Constriction factor in canonical PSO. Introduced by Clerc (1999). Simplest form: see the formula below. May replace the inertia ω. Meant to improve convergence by an enforced decay (more about this later).
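For reference (this is the standard form added here, not quoted from the slide), Clerc's constriction coefficient is commonly written as

χ = 2 / | 2 - φ - sqrt(φ^2 - 4φ) |,   with φ = α_1 + α_2 > 4,

and the velocity update becomes v_i <- χ ( v_i + α_1 r_1 ∘ (x̂_i - x_i) + α_2 r_2 ∘ (ĝ - x_i) ). For the popular choice φ = 4.1 this gives χ ≈ 0.73, which plays a role similar to the inertia weight ω.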

Topology: restricted competition/coordination. The topology determines with whom to compare, and thus how solutions spread through the population. Traditional ones: gbest, lbest. The global version is faster but might converge to a local optimum for some problems. The local version is somewhat slower but is not easily trapped in a local optimum. Combination: use the global version to get a rough estimate, then use the local version to refine the search. For some topologies this is analogous to islands in GA. (A small lbest sketch follows below.)
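As a sketch of the lbest idea (not from the slides; the function ring_neighbourhood_best and the ring size k are illustrative assumptions), the global best X_gbest in the velocity update can be replaced by the best personal best within a small ring neighbourhood of each particle:

def ring_neighbourhood_best(X_lbest, fitness_lbest, i, k=1):
    # best personal best among particle i and its k neighbours on each side of a ring
    n = len(fitness_lbest)
    idx = [(i + d) % n for d in range(-k, k + 1)]
    best = min(idx, key=lambda j: fitness_lbest[j])
    return X_lbest[best, :]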

Innovative topologies. Specified by: mean degree, clustering, heterogeneity etc.


Literature on swarms.
Eric Bonabeau, Marco Dorigo, Guy Theraulaz: Swarm Intelligence: From Natural to Artificial Systems (Santa Fe Institute Studies on the Sciences of Complexity). OUP USA (1999).
J. Kennedy and R. Eberhart: Particle swarm optimization. In Proc. of the IEEE Int. Conf. on Neural Networks, Piscataway, NJ, pp. 1942-1948, 1995.
Y. Shi, R. C. Eberhart (1999): Parameter selection in particle swarm optimization. Springer.
R. Eberhart, Y. Shi (2001): PSO: developments, applications, resources. IEEE.
www.engr.iupui.edu/~eberhart/web/psobook.html
Tutorials: www.particleswarm.info/
Bibliography: icdweb.cc.purdue.edu/~hux/pso.shtml