A multi-level thresholding approach using a hybrid optimal estimation algorithm


Pattern Recognition Letters 28 (2007) 662-669
www.elsevier.com/locate/patrec

A multi-level thresholding approach using a hybrid optimal estimation algorithm

Shu-Kai S. Fan *, Yen Lin

Department of Industrial Engineering and Management, Yuan Ze University, No. 135, Yuandong Road, Jhongli City, Taoyuan County 320, Taiwan, ROC

Received 13 January 2006; received in revised form 25 August 2006; available online 22 December 2006
Communicated by L. Younes

Abstract

This paper presents a hybrid optimal estimation algorithm for solving multi-level thresholding problems in image segmentation. The distribution of image intensity is modeled as a random variable, which is approximated by a mixture Gaussian model. The Gaussian parameter estimates are computed iteratively by the proposed PSO + EM algorithm, which consists of two main components: (i) global search using particle swarm optimization (PSO); (ii) an update of the best particle through expectation maximization (EM), which leads the remaining particles toward the optimal solution in the search space. In the PSO + EM algorithm, the parameter estimates fed into the EM procedure are obtained from the global search performed by PSO, which is expected to provide a suitable starting point for EM while fitting the mixture Gaussian model. Preliminary experimental results show that the hybrid PSO + EM algorithm can solve the multi-level thresholding problem quite swiftly and also provide quality thresholding outputs for complex images.
© 2006 Elsevier B.V. All rights reserved.

Keywords: Multi-level thresholding; Mixture Gaussian curve fitting; Expectation maximization (EM); Particle swarm optimization (PSO)

1. Introduction

Image thresholding is very useful for separating objects from the background, or for discriminating objects from other objects that have distinct gray levels. Sezgin and Sankur (2004) presented a thorough survey of a variety of thresholding techniques, among which global histogram-based algorithms, such as minimum error thresholding (Kittler and Illingworth, 1986), are employed to determine the threshold. In parametric approaches (Weszka and Rosenfeld, 1979; Snyder et al., 1990), the gray-level distribution of each class has a probability density function (PDF) that is assumed to obey a (mixture) Gaussian distribution. An estimate of the distribution parameters that best fits the given histogram data is then sought by the least-squares estimation (LSE) method. Typically, this leads to a nonlinear optimization problem whose solution is computationally expensive and time-consuming. Besides, Snyder et al. (1990) presented an alternative method for fitting curves based on a heuristic called tree annealing; Yin (1999) proposed a fast scheme for optimal thresholding using genetic algorithms; and Yen et al. (1995) proposed a new criterion for multi-level thresholding, termed the automatic thresholding criterion (ATC), to deal with the automatic selection of a robust, optimum threshold for image segmentation. More recently, Zahara et al. (2005) proposed a hybrid Nelder-Mead particle swarm optimization (NM-PSO) method to solve the objective function of Gaussian curve fitting for multi-level thresholding. The NM-PSO method is efficient in solving the multi-level thresholding problem and provides better effectiveness than other traditional methods in terms of visualization, object size and image contrast.

* Corresponding author. Tel.: +886 3 4638800x2510; fax: +886 3 4638907. E-mail address: simonfan@saturn.yzu.edu.tw (S.-K.S. Fan).

0167-8655/$ - see front matter © 2006 Elsevier B.V. All rights reserved. doi:10.1016/j.patrec.2006.11.005

However, curve fitting is usually time-consuming for multi-level thresholding. To further enhance efficiency while maintaining quality effectiveness, an improved method is warranted. For further details of the NM-PSO method, see Zahara et al. (2005) and Fan and Zahara (2006). In this paper, an improvement upon Gaussian curve fitting is reported so that the efficiency of image thresholding methods can be enhanced in the case of multi-level thresholding. We present a hybrid expectation maximization (EM) and particle swarm optimization (PSO + EM) algorithm to solve the objective function of Gaussian parameter estimation. The PSO + EM algorithm is applied to image thresholding with multi-modal histograms, and its performance on Gaussian curve fitting is compared to that of the NM-PSO method. Section 2 introduces the parametric objective functions that need to be solved in image thresholding. In Section 3, the proposed hybrid PSO + EM algorithm is detailed. Section 4 gives the experimental results and compares the performance of the two methods. Lastly, Section 5 concludes this research.

2. Minimum error thresholding by Gaussian model

One of the most advanced techniques in image thresholding is minimum error thresholding, proposed by Kittler and Illingworth (1986) and highlighted in the survey presented by Sezgin and Sankur (2004). To begin, we consider an image whose pixels assume gray-level values x from the interval [0, n]. It is convenient to summarize the distribution of gray levels in the form of a histogram h(x), which gives the frequency of occurrence of each gray level in the image (Kittler and Illingworth, 1986). The distribution of image intensity is then modeled as a random variable obeying a mixture Gaussian model. A properly normalized multi-modal histogram p(x) of an image can be fitted with the sum of d probability density functions (PDFs) to find the optimal thresholds used in image segmentation (Snyder et al., 1990). With Gaussian PDFs, the model takes the form

p(x) = \sum_{i=1}^{d} \frac{P_i}{\sqrt{2\pi}\,\sigma_i} \exp\left[ -\frac{(x-\mu_i)^2}{2\sigma_i^2} \right],   (1)

where P_i is the a priori probability with \sum_{i=1}^{d} P_i = 1, d is the number of thresholding levels, \mu_i is the mean, and \sigma_i^2 is the variance of mode i. A PDF model must be fitted to the histogram data, typically by the maximum likelihood or mean-squared error approach, in order to locate the optimal thresholds. Given the histogram data h(x) (the observed frequency of gray level x), it can be defined as

h(x) = \frac{J(x)}{\sum_{x=0}^{L-1} J(x)},   (2)

where J(x) denotes the occurrence of gray level x over a given image range [0, L-1], and L is the total number of gray levels. It is desired to find a set of parameters, denoted by \Theta, that minimizes the fitting error:

\text{Minimize } H(\Theta) = \sum_{x} [h(x) - p(x; \Theta)]^2.   (3)

Here, H is the objective function (i.e., the Hamming distance) to be minimized with respect to \Theta, the set of parameters defining the mixture Gaussian PDFs and the probabilities, given by \Theta = \{P_i, \mu_i, \sigma_i;\ i = 1, 2, \ldots, d\}. Given p(x|i) and P_i, there exists a gray level \tau such that gray level x satisfies (for the case of two adjacent Gaussian PDFs)

P_1 p(x|1) > P_2 p(x|2)  when x \le \tau,
P_1 p(x|1) < P_2 p(x|2)  when x > \tau.   (4)

\tau is the Bayes minimum error threshold at which the image should be binarized. Now, using the model p(x|i, T), i = 1, 2, the conditional probability e(x, T) of gray level x being replaced in the image by a correct binary value is given by

e(x, T) = p(x|i, T)\, P_i(T) / p(x),  with i = 1 for x \le T and i = 2 for x > T.   (5)
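For concreteness, the quantities in Eqs. (1)-(3) can be evaluated directly from an image histogram. The sketch below is written in Python rather than the Matlab used for the paper's experiments; the function names and the packing of \Theta into a single vector theta = [P_1,...,P_{d-1}, mu_1,...,mu_d, sigma_1,...,sigma_d] (so that N = 3d - 1, cf. Section 3.3) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def mixture_pdf(x, P, mu, sigma):
    """Mixture Gaussian model of Eq. (1): p(x) = sum_i P_i * N(x; mu_i, sigma_i^2)."""
    x = np.asarray(x, dtype=float)[:, None]               # shape (len(x), 1)
    P, mu, sigma = map(np.asarray, (P, mu, sigma))        # each of shape (d,)
    comps = P / (np.sqrt(2.0 * np.pi) * sigma) * np.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2))
    return comps.sum(axis=1)                              # shape (len(x),)

def normalized_histogram(image, L=256):
    """Eq. (2): h(x) = J(x) / sum_x J(x), the observed gray-level frequencies."""
    J, _ = np.histogram(image.ravel(), bins=L, range=(0, L))
    return J / J.sum()

def fitting_error(theta, h, L=256):
    """Eq. (3): sum over gray levels of [h(x) - p(x; Theta)]^2 (the paper's fitting error)."""
    theta = np.asarray(theta, dtype=float)
    d = (len(theta) + 1) // 3                             # N = 3d - 1 parameters
    P = np.append(theta[:d - 1], 1.0 - theta[:d - 1].sum())  # last prior from the sum-to-one constraint
    mu, sigma = theta[d - 1:2 * d - 1], theta[2 * d - 1:]
    x = np.arange(L)
    return np.sum((h - mixture_pdf(x, P, mu, sigma)) ** 2)
```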
Taking the logarithm of the numerator in Eq. (5) and multiplying the result by -2, the following quantity is obtained:

e(x, T) = \left( \frac{x - \mu_i(T)}{\sigma_i(T)} \right)^2 + 2\log \sigma_i(T) - 2\log P_i(T),  with i = 1 for x \le T and i = 2 for x > T.   (6)

The average performance over the whole image can be characterized by the criterion function

J(T) = \sum_{x} p(x)\, e(x, T).   (7)

Thus, the problem of minimum error threshold selection can be formulated as one of minimizing J(T), i.e.,

J(\tau) = \min_{T} J(T).   (8)

After fitting the multi-modal histogram, the optimal threshold can be determined by minimizing Eq. (7); for the mixture Gaussian model it is found that

J(T) = 1 + 2[P_1 \log \sigma_1(T) + P_2 \log \sigma_2(T)] - 2[P_1 \log P_1(T) + P_2 \log P_2(T)].   (9)

Eq. (9) can be computed easily and its minimum located. Likewise, it can be extended to the multi-level thresholding case:

J(T_1, \ldots, T_{d-1}) = 1 + 2 \sum_{i=1}^{d} P_i(T)\,[\log \sigma_i(T) - \log P_i(T)],   (10)

where the image intensity contains d modes, i.e., it is a mixture of d Gaussian densities. Here, the number of candidate thresholds to be evaluated is considerably greater. The number of points at which the criterion function must be computed is (L+1)!/[(L+2-d)!(d-1)!], where L is the largest gray level starting from 0. For instance, when d = 3 the number of points for function evaluation is 32,640.
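As an illustration of Eq. (10), the Python sketch below computes the multi-level criterion from per-class histogram statistics and locates its minimizer by brute-force enumeration of threshold tuples. This is meant only to make the criterion concrete; it is not the fast procedure of Kittler and Illingworth (1986) that the paper actually adopts, and the function names are illustrative.

```python
import numpy as np
from itertools import combinations

def multilevel_criterion(h, thresholds):
    """Eq. (10): J(T_1,...,T_{d-1}) = 1 + 2 * sum_i P_i [log(sigma_i) - log(P_i)],
    with P_i, mu_i, sigma_i taken from the histogram mass between consecutive thresholds."""
    h = np.asarray(h, dtype=float)
    L = len(h)
    x = np.arange(L)
    edges = [0] + [t + 1 for t in thresholds] + [L]       # class i covers gray levels edges[i]..edges[i+1]-1
    J = 1.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        P = h[lo:hi].sum()
        if P <= 0:
            return np.inf                                  # empty class: reject this candidate
        mu = (x[lo:hi] * h[lo:hi]).sum() / P
        var = ((x[lo:hi] - mu) ** 2 * h[lo:hi]).sum() / P
        sigma = np.sqrt(max(var, 1e-12))                   # guard against zero variance
        J += 2.0 * P * (np.log(sigma) - np.log(P))
    return J

def best_thresholds_bruteforce(h, d):
    """Exhaustively evaluate all threshold tuples and keep the minimizer of Eq. (10).
    Only practical for small d; the paper uses Kittler and Illingworth's fast procedure instead."""
    L = len(h)
    best_T, best_J = None, np.inf
    for T in combinations(range(L - 1), d - 1):
        J = multilevel_criterion(h, list(T))
        if J < best_J:
            best_T, best_J = T, J
    return best_T, best_J
```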

Kittler and Illingworth (1986) provided a fast procedure for computing the threshold value; this procedure is adopted in this paper after the estimates of the mixture model have been obtained.

3. Hybrid PSO + EM method

In this paper, the hybrid algorithm is developed to improve the efficiency of the optimal thresholding techniques currently applied in practice. The goal of integrating the expectation maximization (EM) algorithm and particle swarm optimization (PSO) is to combine their advantages and compensate for each other's disadvantages. For instance, the EM algorithm is efficient but excessively sensitive to the starting point (i.e., the initial estimates). A poor starting point might make the EM algorithm terminate prematurely or get stuck due to computational difficulties. On the other hand, PSO belongs to the class of global, population-based search procedures but requires much more computational effort than classical search procedures. Similar hybridization ideas have been discussed for hybrid methods combining genetic algorithms and direct search techniques, emphasizing the trade-off between solution quality, reliability and computation time in global optimization (Renders and Flasse, 1996; Yen et al., 1998). This section starts by introducing the procedures of the EM algorithm and PSO, followed by a description of the proposed hybrid method.

3.1. The EM algorithm

For the mixture Gaussian PDF (see Eq. (1)), the likelihood function is defined as

\Lambda(X; \Theta) = \prod_{n=1}^{N} \sum_{i=1}^{d} P_i\, p(x_n; \mu_i, \sigma_i^2).   (11)

Thus, the logarithm of the likelihood function \Lambda(X; \Theta) is given by

\lambda(X; \Theta) = \sum_{n=1}^{N} \log \sum_{i=1}^{d} P_i\, p(x_n; \mu_i, \sigma_i^2).

In general, the parameter estimation problem can be stated as

\hat{\Theta} = \arg\max_{\Theta} \lambda(X; \Theta).   (12)

Note that any maximization algorithm could be used to find \hat{\Theta}, the maximum likelihood estimate of the parameter vector \Theta (P_i, \mu_i, and \sigma_i) (Tomasi, 2005). The EM algorithm assumes that approximate (initial) estimates P_i^{(k)}, \mu_i^{(k)}, and \sigma_i^{(k)} are available for the parameters of the likelihood function \Lambda(X; \Theta^{(k)}) or its logarithm \lambda(X; \Theta^{(k)}). Better estimates P_i^{(k+1)}, \mu_i^{(k+1)}, and \sigma_i^{(k+1)} can then be computed by first using the previous estimates to construct a lower bound b(\Theta^{(k)}) on the likelihood function, and then maximizing the bound with respect to \Theta. Construction of the bound b(\Theta^{(k)}) is called the E step, in that the bound is the expectation of a logarithm. The maximization of b(\Theta^{(k)}) that yields the new estimates P_i^{(k+1)}, \mu_i^{(k+1)}, and \sigma_i^{(k+1)} is called the M step. The EM algorithm iterates these two computations until it converges to a local maximum of the likelihood function, yielding an estimated optimal solution. In detail, for mixture Gaussian parameter estimation the membership probabilities P^{(k)}(i|n) are computed as

P^{(k)}(i|n) = \frac{P_i^{(k)}\, p(x_n; \mu_i^{(k)}, \sigma_i^{(k)})}{\sum_{m=1}^{d} P_m^{(k)}\, p(x_n; \mu_m^{(k)}, \sigma_m^{(k)})},   (13)

where the previous parameter estimates P_i^{(k)}, \mu_i^{(k)}, and \sigma_i^{(k)} are required. This is the actual computation performed in the E step.
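A minimal Python sketch of the E step of Eq. (13) follows. The vectorized layout (an N x d array of membership probabilities whose rows sum to one) is an implementation choice for illustration, not something prescribed by the paper.

```python
import numpy as np

def gaussian(x, mu, sigma):
    """Component density p(x; mu_i, sigma_i^2) used throughout Section 3.1."""
    return np.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2)) / (np.sqrt(2.0 * np.pi) * sigma)

def e_step(x, P, mu, sigma):
    """Eq. (13): membership probabilities P^(k)(i|n) from the current estimates.
    x is a 1-D array of N gray values; returns an (N, d) array whose rows sum to one."""
    x = np.asarray(x, dtype=float)[:, None]                                 # (N, 1)
    num = np.asarray(P) * gaussian(x, np.asarray(mu), np.asarray(sigma))    # (N, d)
    den = num.sum(axis=1, keepdims=True)
    return num / np.maximum(den, 1e-300)                                    # guard against underflow
```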
The rest of the construction of the bound b(\Theta^{(k)}) uses Jensen's inequality to bound the logarithm \lambda(X; \Theta) of the likelihood function:

\lambda(X; \Theta) = \sum_{n=1}^{N} \log \sum_{i=1}^{d} q(i, n) \ge \sum_{n=1}^{N} \sum_{i=1}^{d} P^{(k)}(i|n) \log \frac{q(i, n)}{P^{(k)}(i|n)} = b(\Theta^{(k)}),   (14)

where q(i, n) = P_i\, p(x_n; \mu_i, \sigma_i). The bound b(\Theta^{(k)}) obtained can be rewritten as

b(\Theta^{(k)}) = \sum_{n=1}^{N} \sum_{i=1}^{d} P^{(k)}(i|n) \log q(i, n) - \sum_{n=1}^{N} \sum_{i=1}^{d} P^{(k)}(i|n) \log P^{(k)}(i|n).

Since the old membership probabilities P^{(k)}(i|n) are known, maximizing b(\Theta^{(k)}) is the same as maximizing the first of the two summations:

\hat{b}(\Theta^{(k)}) = \sum_{n=1}^{N} \sum_{i=1}^{d} P^{(k)}(i|n) \log q(i, n).   (15)

\hat{b}(\Theta^{(k)}) contains a linear combination of d logarithms, and this breaks the coupling of the equations obtained by setting the derivatives of b(\Theta^{(k)}) with respect to the parameters equal to zero. The derivative of b(\Theta^{(k)}) with respect to \mu_i is easily found to be

\frac{\partial b_k}{\partial \mu_i} = \sum_{n=1}^{N} P^{(k)}(i|n)\, \frac{x_n - \mu_i}{\sigma_i^2}.   (16)

Upon setting this expression to zero, the variance \sigma_i can be cancelled, and the remaining equation contains only \mu_i as the unknown:

\mu_i \sum_{n=1}^{N} P^{(k)}(i|n) = \sum_{n=1}^{N} P^{(k)}(i|n)\, x_n.   (17)

This equation can be solved immediately to yield the new estimate of the mean,

\mu_i^{(k+1)} = \frac{\sum_{n=1}^{N} P^{(k)}(i|n)\, x_n}{\sum_{n=1}^{N} P^{(k)}(i|n)},   (18)

which is a function of old values only (with superscript k). The resulting value \mu_i^{(k+1)} is plugged into the expression for b(\Theta^{(k)}), which can now be differentiated with respect to \sigma_i through a very similar manipulation to yield

\sigma_i^{(k+1)} = \sqrt{\frac{1}{d}\, \frac{\sum_{n=1}^{N} P^{(k)}(i|n)\, \lVert x_n - \mu_i^{(k+1)} \rVert^2}{\sum_{n=1}^{N} P^{(k)}(i|n)}}.   (19)

The derivative of b(\Theta^{(k)}) with respect to P_i, subject to the constraint that the P_i add up to one, can again be handled through the soft-max function. This yields the new estimate of the mixing probabilities as a function of the old membership probabilities:

P_i^{(k+1)} = \frac{1}{N} \sum_{n=1}^{N} P^{(k)}(i|n).   (20)

Eventually, the new estimates (with superscript k + 1) are all solved and are fed into the next EM iteration until the likelihood converges. In this paper, the termination condition is satisfied when the difference between the current and previous likelihood values is smaller than a tolerance of 10^{-6}.
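The M step of Eqs. (18)-(20) and the overall EM loop with the 10^{-6} likelihood tolerance can be sketched as follows (Python, self-contained). One assumption is flagged: the 1/d factor inside the square root of Eq. (19) comes from the multivariate form of the estimator, where it is the data dimension; since gray levels are scalar here, the sketch uses the usual one-dimensional update (dimension factor equal to one).

```python
import numpy as np

def m_step(x, resp):
    """Eqs. (18)-(20): new means, standard deviations and mixing probabilities
    from the membership probabilities resp of shape (N, d)."""
    x = np.asarray(x, dtype=float)
    Nk = np.maximum(resp.sum(axis=0), 1e-12)                    # effective sample count per mode
    mu = (resp * x[:, None]).sum(axis=0) / Nk                   # Eq. (18)
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / Nk      # Eq. (19), scalar-data form
    sigma = np.sqrt(np.maximum(var, 1e-6))
    P = Nk / len(x)                                             # Eq. (20)
    return P, mu, sigma

def em(x, P, mu, sigma, tol=1e-6, max_iter=500):
    """Iterate E and M steps until the log-likelihood improves by less than tol (Section 3.1)."""
    x = np.asarray(x, dtype=float)
    P, mu, sigma = (np.asarray(v, dtype=float) for v in (P, mu, sigma))
    prev = -np.inf
    for _ in range(max_iter):
        # E step (Eq. (13)): weighted component densities, normalized per sample.
        dens = P * np.exp(-(x[:, None] - mu) ** 2 / (2.0 * sigma ** 2)) \
               / (np.sqrt(2.0 * np.pi) * sigma)
        resp = dens / np.maximum(dens.sum(axis=1, keepdims=True), 1e-300)
        # Log-likelihood lambda(X; Theta^(k)) of Eq. (12), evaluated at the current estimates.
        cur = np.log(dens.sum(axis=1) + 1e-300).sum()
        P, mu, sigma = m_step(x, resp)                          # M step (Eqs. (18)-(20))
        if abs(cur - prev) < tol:
            break
        prev = cur
    return P, mu, sigma, cur
```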

3.2. The PSO algorithm

Particle swarm optimization (PSO), developed by Kennedy and Eberhart (1995), is one of the latest evolutionary optimization algorithms. PSO simulates a commonly observed social behavior in which members of a group tend to follow the leader of the group. Similar to genetic algorithms (GA), PSO is population-based and evolutionary in nature, with one major difference from GA: it does not implement selection; that is, all particles in the population survive through the entire search process. The description of PSO here is necessarily brief since the complete procedure is detailed in Zahara et al. (2005). The particle's velocity and position are iteratively updated by

v_d^{new} = w\, v_d^{old} + C_1 r_1 (x_{pd} - x_d^{old}) + C_2 r_2 (x_{gd} - x_d^{old}),   (21)
x_d^{new} = x_d^{old} + v_d^{new},   (22)

where the acceleration coefficients C_1 and C_2 are two positive constants, w is an inertia weight, and r_1, r_2 are random numbers generated uniformly from the range [0, 1], independently for every iteration. For further details regarding PSO, see Eberhart and Shi (2001) and Hu and Eberhart (2001).

3.3. Hybrid PSO + EM algorithm

Initialization. Generate a population of size 3N + 1.
Repeat
(1) Evaluation and ranking. Evaluate and rank the fitness of all particles. From the population, select the global best particle and record its location (parameters).
(2) EM update. Apply the EM approach to the global best particle and replace its location with the updated parameters (note that the global best particle of each iteration is updated only once). Exception: if the EM update fails, generate a new solution (particle) randomly and go back to step (1).
(3) PSO update. Apply the velocity update to the remaining 3N particles according to Eqs. (21) and (22).
Until a termination condition (the EM algorithm converges) is met.

Fig. 1. The hybrid PSO + EM algorithm procedure.

Unlike the real synergy of hybridization in NM-PSO, the proposed PSO + EM algorithm is more like a two-step search procedure carried out by two different methods. The population size of this hybrid PSO + EM approach is set at 3N + 1 when solving an N-dimensional problem, in order to make a fair comparison with the NM-PSO method. Note that in Gaussian curve fitting we need to estimate N = 3d - 1 parameters. For example, in bi-level thresholding there are five parameters, \Theta = \{P_1, \mu_1, \sigma_1, \mu_2, \sigma_2\}, with P_2 = 1 - P_1. The initial population is created from a randomized starting point; that is, all particles are randomly generated in the search space. A total of 3N + 1 particles are sorted by fitness, and the top particle (i.e., the elite) is then fed into the EM algorithm to update its location (solution) once in parameter space. The remaining 3N particles are adjusted by PSO, taking into account the position of this elite particle if it is updated successfully. This procedure for adjusting the 3N particles involves selection of the global best particle and finally the velocity updates. The global best particle of the population is determined according to the sorted fitness values. Note that if the global best particle cannot be updated through the EM algorithm, the PSO + EM algorithm generates a new particle to replace the old one and reselects the global best particle in the new swarm; so selection (also called filtering) is possible in PSO + EM. A velocity update for each of the 3N particles is then carried out by Eqs. (21) and (22). The 3N + 1 particles are sorted in preparation for the next iteration. The process terminates when a certain convergence criterion is satisfied. Fig. 1 summarizes the procedure of PSO + EM and Fig. 2 shows the flowchart.
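A compact Python sketch of the Fig. 1 loop is given below. Several details are assumptions made only for illustration: the log-likelihood of Eq. (12) is used as the PSO fitness (the paper mentions a fitness function without spelling it out, and the fitting error of Eq. (3) would be another natural choice), typical values are chosen for w, C_1 and C_2, particles are encoded as [P_1,...,P_{d-1}, mu_1,...,mu_d, sigma_1,...,sigma_d], and the exception branch of Fig. 1 (regenerating a particle when the EM update fails) is omitted. All function names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def unpack(theta, d):
    """Particle encoding assumed here: [P_1..P_{d-1}, mu_1..mu_d, sigma_1..sigma_d] (N = 3d - 1)."""
    P = np.append(theta[:d - 1], 1.0 - theta[:d - 1].sum())
    P = np.clip(P, 1e-6, None); P = P / P.sum()             # keep mixing weights valid under PSO moves
    mu = theta[d - 1:2 * d - 1]
    sigma = np.maximum(np.abs(theta[2 * d - 1:]), 1e-3)     # keep standard deviations positive
    return P, mu, sigma

def loglik(theta, x, d):
    """Assumed fitness: the log-likelihood of Eq. (12)."""
    P, mu, sigma = unpack(theta, d)
    dens = P * np.exp(-(x[:, None] - mu) ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)
    return np.log(dens.sum(axis=1) + 1e-300).sum()

def em_once(theta, x, d):
    """One E step (Eq. (13)) followed by one M step (Eqs. (18)-(20)), applied to the elite particle."""
    P, mu, sigma = unpack(theta, d)
    dens = P * np.exp(-(x[:, None] - mu) ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)
    resp = dens / np.maximum(dens.sum(axis=1, keepdims=True), 1e-300)
    Nk = np.maximum(resp.sum(axis=0), 1e-12)
    mu = (resp * x[:, None]).sum(axis=0) / Nk
    sigma = np.sqrt(np.maximum((resp * (x[:, None] - mu) ** 2).sum(axis=0) / Nk, 1e-6))
    P = Nk / len(x)
    return np.concatenate([P[:-1], mu, sigma])

def pso_em(x, d, w=0.7, c1=1.5, c2=1.5, L=256, tol=1e-6, max_iter=200):
    """Fig. 1 skeleton: rank, EM-update the elite once, PSO-update the remaining 3N particles."""
    x = np.asarray(x, dtype=float).ravel()
    N = 3 * d - 1
    swarm = np.column_stack([rng.uniform(0, 1, (3 * N + 1, d - 1)),     # priors
                             rng.uniform(0, L, (3 * N + 1, d)),         # means
                             rng.uniform(1, L / 4, (3 * N + 1, d))])    # standard deviations
    vel = np.zeros_like(swarm)
    pbest = swarm.copy()
    pbest_fit = np.array([loglik(p, x, d) for p in swarm])
    prev = -np.inf
    for _ in range(max_iter):
        fit = np.array([loglik(p, x, d) for p in swarm])
        improved = fit > pbest_fit
        pbest[improved], pbest_fit[improved] = swarm[improved], fit[improved]
        g = int(np.argmax(pbest_fit))                       # global best (elite) particle
        swarm[g] = em_once(pbest[g], x, d)                  # EM update of the elite, applied once
        cur = loglik(swarm[g], x, d)
        if abs(cur - prev) < tol:                           # likelihood-convergence termination
            break
        prev = cur
        r1, r2 = rng.uniform(size=swarm.shape), rng.uniform(size=swarm.shape)
        mask = np.arange(len(swarm)) != g                   # remaining 3N particles
        vel[mask] = (w * vel[mask] + c1 * r1[mask] * (pbest[mask] - swarm[mask])
                     + c2 * r2[mask] * (swarm[g] - swarm[mask]))        # Eq. (21), attractor = EM-updated elite
        swarm[mask] += vel[mask]                                        # Eq. (22)
    return unpack(swarm[g], d)
```

For an 8-bit image one would typically call pso_em(image.ravel(), d=4) and then pass the returned estimates to the threshold computation of Section 2; this usage pattern is an illustration, not a prescription from the paper.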
4. Experimental results

In this section, we evaluate and compare the performance of two methods, the NM-PSO algorithm and the proposed PSO + EM algorithm, in Gaussian curve fitting for multi-level thresholding. Both methods are implemented on an Intel Celeron 2.8 GHz platform with 512 MB DDR-RAM using Matlab. The test images are of size 256 x 256 pixels with 8-bit gray levels, taken under natural room lighting without the support of any special light source. The stopping criterion of NM-PSO, following Zahara et al. (2005), is 10N iterations when solving an N-dimensional problem. The PSO + EM algorithm, however, is halted when the improvement of the likelihood function falls within 10^{-6}.

Fig. 2. Flowchart of the PSO + EM algorithm.

To fairly compare the efficiency and effectiveness of these two parametric methods, the initial parameter estimates used in Zahara et al. (2005) are followed for the NM-PSO algorithm. For the PSO + EM algorithm, the prior estimates are set at P_i = 1/d, and the remaining parameters are all generated at random. The experiment starts with the oxidation-induced stacking fault (OISF) image taken under un-central lighting, as shown in Fig. 3(a). It aims to verify that the PSO + EM algorithm can deliver satisfactory performance. Note again that the minimum error thresholding technique of Eq. (10) is employed here. The CPU times recorded in all tables do not include computation of the threshold values. As can be seen from Fig. 3(b) and (c), the NM-PSO and PSO + EM algorithms both perform 4-level mixture Gaussian curve fitting yet produce quite different threshold values (see Fig. 3(f) and (g)). Obviously, the PSO + EM algorithm handles the low-contrast parts in the upper-right and lower-left corners of the image well, whereas the NM-PSO algorithm does not. Comparing computation times for the 4-level fitting, the PSO + EM algorithm takes only one twenty-fifth of the time of the NM-PSO algorithm. The PSO + EM algorithm also returns a much smaller Hamming distance than the NM-PSO algorithm, i.e., a 40% reduction in fitting error. If a higher fitting quality is desired, a more accurate result using seven levels is obtained while the computation time increases only to 5.84 s from 3.75 s.

Fig. 3. Experimental results for the OISF image.

To be precise, the further improvement in Hamming distance from 4- to 7-level fitting is about 5%. The second experiment, on the PCB image in Fig. 4(a), produces results similar to those of the OISF example. Despite the complex pattern, the PSO + EM algorithm still outperforms the NM-PSO algorithm in Hamming distance and computation time for the 4-level fitting, achieving 9% and 93% improvements, respectively. As for thresholding effectiveness, the threshold values obtained by the PSO + EM algorithm are totally different from those of the NM-PSO algorithm (see Fig. 4(f) and (g)). This might partly explain why the proposed algorithm can separate pinholes from the background in the lower part of the PCB image while the NM-PSO algorithm cannot. As the number of fitting levels increases to 7, the fitting error in terms of Hamming distance is reduced by 68%, which makes the local patterns appear much more clearly. Based on the above two examples, it is perhaps surprising that the PSO + EM algorithm is very quick, a fact that can be attributed to the superior speed gained by incorporating EM into the algorithm; the minimum fitting errors achieved are largely due to the global search capability provided by PSO.

The third experiment is conducted on the RAM image displayed in Fig. 5(a). The speed advantage of the PSO + EM algorithm still prevails for this multi-level thresholding case, but the segmentation quality is no longer acceptable for industrial applications, since the mark on the module is not cleanly identified. Looking at the original histogram, there are two high peaks between gray levels 50 and 100, which complicates the situation considerably when only 4-level fitting is chosen. Thanks to the excellent speed of PSO + EM, the fitting level can be increased to gain better results without sacrificing too much computation time. Fortunately, full details of the RAM image are revealed when the fitting level is raised to 7, and the Hamming distance is further reduced to four digits after the decimal point (see Fig. 5(g) and (h)). This implies that the PSO + EM algorithm can serve as a high-dimensional tool for thresholding situations in which other algorithms fail to yield quality outcomes within a reasonable computation time. The proposed PSO + EM algorithm thus offers a degree of efficiency in practical applications that is unattainable with the NM-PSO algorithm.

Table 1 summarizes the estimation results of the three experiments, including the Hamming distance, the set of parameter estimates, and the CPU time. One scenario, 4-level fitting, is given for the NM-PSO algorithm; two scenarios, 4- and 7-level fitting, are given for the PSO + EM algorithm. With a view to evaluating the computational robustness of the algorithms, the last two columns of the table report the average Hamming distance (H) and CPU time (T) statistics computed over 10 independent runs; the number in parentheses indicates the standard deviation of the associated statistic. Concerning the Hamming distance statistic, the PSO + EM algorithm yields smaller average fitting errors and standard deviations than the NM-PSO algorithm on all three test images. These results clearly demonstrate the proposed algorithm's edge over the NM-PSO algorithm in the consistency and stability of fitting accuracy.

Fig. 4. Experimental results for the PCB image.

Fig. 5. Experimental results for the RAM image.

Table 1. Experimental results of the multi-level thresholding approaches and their statistics. CPU times are in seconds; the H and T statistics are averages (standard deviations) computed over 10 independent runs.

Image | Method | Level (d) | Hamming distance (H) | Parameter set {P_1,...,P_d; mu_1,...,mu_d; sigma_1,...,sigma_d} | CPU time (T) | H statistic | T statistic
OISF | NM-PSO | 4 | 1.9200e-04 | {0.083, 0.020, 0.740, 0.157; 49.084, 70.410, 91.157, 142.785; 15.292, 3.293, 20.361, 15.088} | 94.50 | 1.19e-03 (1.47e-03) | 93.7455 (0.5742)
OISF | PSO + EM | 4 | 1.1356e-04 | {0.258, 0.037, 0.106, 0.599; 126.002, 152.721, 31.592, 86.047; 22.213, 7.876, 12.641, 16.982} | 3.75 | 1.29e-04 (3.74e-05) | 6.6297 (3.9456)
OISF | PSO + EM | 7 | 1.0751e-04 | {0.032, 0.001, 0.074, 0.186, 0.031, 0.000, 0.676; 115.185, 174.554, 37.148, 141.526, 18.892, 195.752, 88.116; 9.959, 17.053, 11.041, 15.766, 5.267, 6.612, 17.901} | 5.84 | 1.18e-04 (2.70e-05) | 8.7423 (5.0104)
PCB | NM-PSO | 4 | 9.3501e-04 | {0.441, 0.106, 0.100, 0.353; 49.063, 72.174, 98.686, 141.448; 9.143, 8.371, 6.172, 19.073} | 95.26 | 4.32e-03 (1.71e-03) | 93.3529 (4.2062)
PCB | PSO + EM | 4 | 8.5164e-04 | {0.311, 0.313, 0.027, 0.349; 88.780, 144.521, 240.593, 50.480; 22.313, 17.817, 18.409, 6.864} | 5.95 | 8.60e-04 (9.68e-05) | 5.2547 (2.2355)
PCB | PSO + EM | 7 | 2.7661e-04 | {0.199, 0.053, 0.103, 0.082, 0.173, 0.214, 0.176; 121.638, 199.606, 97.487, 42.160, 150.991, 51.168, 66.318; 19.514, 48.881, 16.237, 1.559, 11.735, 4.398, 10.503} | 7.54 | 2.79e-04 (3.88e-05) | 12.9260 (3.4487)
RAM | NM-PSO | 4 | 1.3254e-03 | {0.260, 0.330, 0.391, 0.019; 35.676, 53.921, 85.728, 132.711; 7.569, 4.665, 4.696, 4.373} | 97.15 | 3.68e-03 (2.99e-03) | 95.0010 (1.4501)
RAM | PSO + EM | 4 | 1.9656e-03 | {0.521, 0.132, 0.066, 0.281; 48.623, 96.024, 185.986, 86.439; 12.666, 12.285, 49.848, 4.311} | 29.01 | 2.14e-03 (2.33e-04) | 23.3814 (5.1679)
RAM | PSO + EM | 7 | 1.6471e-04 | {0.020, 0.056, 0.241, 0.103, 0.143, 0.221, 0.216; 241.183, 151.083, 48.460, 32.853, 55.044, 89.073, 86.404; 14.504, 44.647, 10.732, 7.122, 2.099, 12.768, 3.541} | 70.78 | 1.60e-04 (2.57e-05) | 90.6654 (17.4288)

Regarding the CPU time statistic, the PSO + EM algorithm basically requires far less CPU time on average than the NM-PSO algorithm.

There is, however, an exception when 7-level fitting is applied to the RAM image. In this case, the average CPU time required by the PSO + EM algorithm is somewhat less than that of the NM-PSO algorithm, but the standard deviation of the former is far larger than that of the latter. Yet efficiency is apparently not the major concern in this instance, so the PSO + EM algorithm with 7 levels is still recommended for providing quality outputs despite its instability in CPU time. As can be seen from Table 1, the NM-PSO algorithm exhibits better stability of computation time than the PSO + EM algorithm (see the standard deviations). The reason is that, to maintain reasonable efficiency, the NM-PSO algorithm has to use a fixed-iteration stopping criterion. By contrast, the PSO + EM algorithm terminates the search with a quite small likelihood tolerance of 1 x 10^{-6}, and all the initial parameter estimates, except the prior probabilities, are generated randomly.

A careful observation should be made about the third experiment. For the RAM image, the PSO + EM algorithm is forced to spend many times the computational effort of the other two examples to obtain a quality fitting result. The fundamental message is that the mixture Gaussian model applied in this paper may not be the best fit for a Laplace-distribution-type histogram, like the one shown in Fig. 5. A similar difficulty can also arise when the PSO + EM algorithm is applied to a uniform-distribution-type histogram for multi-level curve fitting. Therefore, we feel safe to suggest that a mixture generalized Gaussian distribution (GGD) model may be a better choice for the RAM image, for which a 4-level fitting would then suffice.

5. Concluding remarks

The NM-PSO algorithm has been verified to be effective in the bi-level and tri-level thresholding cases, but its computation time becomes aggravated in higher-dimensional thresholding. In order to make the evolutionary computation method more practical for on-line applications, a much faster search procedure called the PSO + EM algorithm is proposed, which solves the objective function of mixture Gaussian curve fitting by combining the EM and PSO algorithms. Experimental results on three test images show that the PSO + EM algorithm converges much faster than the NM-PSO algorithm in multi-level Gaussian curve fitting. In addition, comparisons between the PSO + EM and NM-PSO algorithms demonstrate that the new algorithm provides better quality in the visualization, object shape and contrast of image segmentation, particularly when the image has a very complex pattern. Thus, the PSO + EM algorithm can be considered a promising and viable method for on-line multi-level thresholding owing to its superior efficiency.

Acknowledgements

The authors wish to thank the two anonymous referees for providing many constructive suggestions that greatly improved the quality of an earlier version of this paper.

References

Eberhart, R.C., Shi, Y., 2001. Tracking and optimizing dynamic systems with particle swarms. In: Proc. of the Congress on Evolutionary Computation, Seoul, Korea, pp. 94-97.
Fan, S.-K.S., Zahara, E., 2006. A hybrid simplex search and particle swarm optimization for unconstrained optimization. Eur. J. Oper. Res. doi:10.1016/j.ejor.2006.06.034.
Hu, X., Eberhart, R.C., 2001. Tracking dynamic systems with PSO: Where's the cheese? In: Proc. of the Workshop on Particle Swarm Optimization, Indianapolis, IN, USA.
Kennedy, J., Eberhart, R.C., 1995. Particle swarm optimization. In: Proc. of the IEEE Int. Conf. on Neural Networks, Piscataway, NJ, USA, pp. 1942-1948.
Kittler, J., Illingworth, J., 1986. Minimum error thresholding. Pattern Recognition Lett. 19, 41-47.
Renders, J.M., Flasse, S.P., 1996. Hybrid methods using genetic algorithms for global optimization. IEEE Trans. Systems, Man Cybernet. Part B: Cybernetics 26, 243-258.
Sezgin, M., Sankur, B., 2004. Survey over image thresholding techniques and quantitative performance evaluation. J. Electron. Imaging 13 (1), 146-165.
Snyder, W., Bilbro, G., Logenthiran, A., Rajala, S., 1990. Optimal thresholding - a new approach. Pattern Recognition Lett. 11, 803-810.
Tomasi, C., 2005. Estimating Gaussian Mixture Densities with EM - A Tutorial. <http://www.cs.duke.edu/courses/spring04/cps196.1/handouts/EM/tomasiEM.pdf>.
Weszka, J., Rosenfeld, A., 1979. Histogram modifications for threshold selection. IEEE Trans. Systems, Man Cybernet. 9, 38-52.
Yen, J.C., Chang, F.J., Chang, S., 1995. A new criterion for automatic multilevel thresholding. IEEE Trans. Image Process. 4 (3), 370-378.
Yen, J., Liao, J.C., Lee, B., Randolph, D., 1998. A hybrid approach to modeling metabolic systems using a genetic algorithm and simplex method. IEEE Trans. Systems, Man Cybernet. Part B: Cybernetics 28, 173-191.
Yin, P.Y., 1999. A fast scheme for optimal thresholding using genetic algorithms. Signal Process. 72, 85-95.
Zahara, E., Fan, S.-K.S., Tsai, D.-M., 2005. Optimal multi-thresholding using a hybrid optimization approach. Pattern Recognition Lett. 26, 1082-1095.