FINDING INITIAL PARAMETERS OF NEURAL NETWORK FOR DATA CLUSTERING


ABSTRACT

Suneetha Chittineni 1 and Raveendra Babu Bhogapathi 2
1 R.V.R. & J.C. College of Engineering, Guntur, India. chittineni_es@yahoo.com
2 VNR Vignana Jyothi Institute of Engineering & Technology, Hyderabad, India. rbhogapathi@yahoo.com

The K-means fast learning artificial neural network (K-FLANN) algorithm begins with the initialization of two parameters, vigilance and tolerance, which are the key to obtaining an optimal clustering outcome. The optimization task is to adjust these parameters so that a desired mapping between the inputs and outputs (clusters) of the K-FLANN is achieved. This study presents a method for finding the behavioral parameters of K-FLANN that yield good clustering performance, using an optimization method known as Differential Evolution (DE). The DE algorithm is a simple, efficient meta-heuristic for global optimization over continuous spaces. The K-FLANN algorithm is also modified in how it selects the winning neuron (centroid) for a data member, in order to improve the matching rate from input to output. Experiments were performed to evaluate the proposed work using artificial data sets from a machine learning repository for classification problems, as well as synthetic data sets. The simulation results reveal that optimization of K-FLANN gives quite promising results in terms of convergence rate and accuracy when compared with other algorithms. Comparisons are also made between K-FLANN and modified K-FLANN.

KEYWORDS

Optimization, Clustering, Differential Evolution, Artificial neural network

1. INTRODUCTION

Optimization is a procedure used to make a clustering mechanism as effective as possible using mathematical techniques. It amounts to finding the alternative with the highest achievable performance under the given constraints. Clustering is the process of organizing objects into groups whose members are similar in some way. A cluster is therefore a collection of objects which are similar to each other and dissimilar to the objects belonging to other clusters. In the context of machine learning, clustering is an unsupervised task.
The input to a clustering problem is a set of objects, where each object is described by its relevant attributes. The desired output is a partitioning of the objects into disjoint groups that minimizes the objective function. Clustering problems are of two types. In hard clustering, an object is assigned to exactly one cluster. In fuzzy clustering, an object is assigned to more than one cluster with some membership degree. An artificial neural network is composed of inter-connected artificial neurons that mimic the properties of biological neural networks. The major applications of artificial neural networks to clustering in different domains are image segmentation (grouping similar images), data mining (dividing a data set into different groups), and bioinformatics (grouping genes having similar characteristics). DOI : /ijaia

In this paper, clustering is carried out using the modified K-means fast learning artificial neural network. The evolution of fast learning artificial neural networks began with the two basic models proposed by Tay and Evans: FLANN I and FLANN II. The FLANN I model is a modification of the Adaptive Resonance Theory network (ART 1) developed by Carpenter and Grossberg [6]. The design of FLANN II was based on the Kohonen LVQ network, with Euclidean distance as the similarity metric [7]. A later improvement of FLANN II allowed continuous-valued input [8]. K-means calculations were then included in the algorithm, leading to the development of K-FLANN [9]. K-FLANN was subsequently modified to include data point reshuffling, which resolves the data sequence sensitivity and creates stable clusters [10].

The most popular and well known algorithm, the genetic algorithm (GA), developed by John Holland in 1975, is a model or abstraction of biological evolution based on Charles Darwin's theory of natural selection [2]. Biological crossover and mutation of chromosomes were later simulated in the algorithm by Holland and his collaborators to improve the quality of solutions over successive iterations [3][24].

Differential evolution was first popularized by R. Storn and K. Price in 1997 [1]. It is a vector-based evolutionary algorithm that is highly efficient for global optimization problems [12-15], and it can be considered a further development of the genetic algorithm. Storn and Price stated that DE was more efficient than simulated annealing and genetic algorithms [1][18][19]. Compared to GA, DE is very attractive due to its simple design and good performance. The implementation of DE is easy because simple mathematical operators are applied to evolve the population with few control parameters [1]. The most important features of DE noticed by several researchers are [11]: fast convergence, speed, strong stability, ease of implementation, and easy adaptability to integer and discrete optimization. The same work was also carried out with GA.
When GA is combined with these algorithms, it was noticed that GA is efficient for data sets of small size, but when the size of the data set is large, GA requires more execution time than DE to perform data clustering [24]. In DE, all solutions have the same chance of being selected as parents, independent of their fitness value. The better of a new solution and its parent wins the competition, providing a significant advantage in convergence performance over genetic algorithms (GA) [11].

2. K-MEANS FAST LEARNING ARTIFICIAL NEURAL NETWORK

K-FLANN is an atomic module suitable for creating scalable networks that mimic the behavior of biological neural systems. K-FLANN clusters data efficiently and consistently, with minimal variation in cluster centroids, and is not sensitive to the data presentation sequences (DPS) caused by randomly ordering the arrival of data [10].

2.1. Architecture

K-FLANN consists of two layers, an input layer and an output layer. The number of nodes in the input layer is determined by the dimension of the data. Input nodes are directly connected to output nodes by a set of weights. The number of nodes in the output layer grows as new groups are formed during the clustering process. This dynamic formation of output nodes makes K-FLANN a self-organized clustering ANN model. The architecture of K-FLANN is shown in Figure 1.

Figure 1. K-FLANN architecture

Algorithm:

Step 1: Initialize the network parameters.
Step 2: Present the pattern to the input layer. If there is no output node, GO TO Step 6.
Step 3: Determine all possible matching output nodes using Eq. (1):

    (1/n) Σ_{i=1..n} D[δ_i² − (w_ij − x_i)²] ≥ ρ    (1)

    where D[a] = 1 if a > 0, and D[a] = 0 if a ≤ 0.

Step 4: Determine the winner from all matching output nodes using Eq. (2):

    Winner = min_j Σ_{i=1..n} (w_ij − x_i)²    (2)

Step 5: A matching node is found. Assign the pattern to the matching output node. Go to Step 2.
Step 6: Create a new output node. Perform a direct mapping of the input vector into the weight vector.
Step 7: After completing a single epoch, compute the cluster centroids. If the centroid points of all clusters are unchanged, terminate; else GO TO Step 2.
Step 8: Find the pattern closest to each centroid and re-shuffle it to the top of the data set list. Go to Step 2.

Note: ρ is the vigilance value, δ_i is the tolerance for the i-th feature of the input space, w_ij is the cluster center of the corresponding output node (the weight connection of the j-th output to the i-th input), and x_i represents the i-th feature.

The K-FLANN algorithm consists of two phases. Prior to the clustering process, Step 1, called the initialization phase, determines the vigilance and tolerance. Its computational complexity depends on the size and the dimensionality of the data set. The value of the vigilance is usually between 0.5 and 1. Since the ρ value determines the ratio of the number of features required to be within tolerance to the total number of features, cluster formation becomes more stringent as the vigilance setting approaches 1. This parameter was adopted from [26]. The tolerance setting of a feature is a measurement of the attribute's dispersion, and thus a computation is

performed for every feature of the training data at the initial stage. Several methods were investigated to deal with the tolerance setting [10]. The formula is given by Eq. (3):

    δ_i = (max_diff_i + min_diff_i) / 2    (3)

where
min_diff_i : the minimum difference in attribute values for attribute i, i.e. the difference between the smallest and second smallest values of the attribute;
max_diff_i : the maximum difference in attribute values for attribute i, i.e. the difference between the smallest and largest values of the attribute.

In the second phase, the clustering process begins with the presentation of the data patterns in sequence. Step 3 involves two types of computation, called tolerance filtering and vigilance screening. Tolerance filtering captures local variation in the input features, i.e. it determines whether a feature is within the acceptable tolerance or not. It is implemented using the discriminating function D[a] in Step 3. A positive value of D[a] indicates that the particular attribute is within the bounds of the corresponding attribute of the cluster in question. Vigilance screening captures global variation in the input features, i.e. it determines the number of features within the localized input space that have passed tolerance tuning.

Competitive learning begins from Step 4. After satisfying the screening criteria, there are still chances of an exemplar being mapped to more than one cluster. The final decision of ownership is made by the winner-take-all (WTA) property of competitive learning: the cluster centroid bearing the strongest weight absorbs the exemplar into its cluster. The winning neuron is chosen as the one with the closest distance from the data point to the centroid. The exemplar is assigned as a new cluster member of the selected winner centroid in Step 5. In Step 6, if the new exemplar does not meet the screening criteria, a new output node is created to represent a new cluster. After a single epoch, cluster stability is checked.
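For concreteness, the tolerance initialization of Eq. (3) and the tolerance-filtering, vigilance-screening and winner-selection computations of Steps 3-4 can be sketched in Python as follows. This is a minimal sketch of the equations above, not the authors' implementation; all function and variable names are illustrative.

```python
import numpy as np

def init_tolerance(data):
    """Eq. (3): delta_i = (max_diff_i + min_diff_i) / 2, per feature."""
    tol = np.empty(data.shape[1])
    for i, col in enumerate(data.T):
        s = np.sort(col)
        min_diff = s[1] - s[0]      # smallest vs. second smallest value
        max_diff = s[-1] - s[0]     # smallest vs. largest value
        tol[i] = (max_diff + min_diff) / 2.0
    return tol

def match_nodes(x, centroids, tol, rho):
    """Step 3: tolerance filtering D[.] followed by vigilance screening."""
    matches = []
    for j, w in enumerate(centroids):
        passed = (tol ** 2 - (w - x) ** 2) > 0   # D[a] = 1 if a > 0 else 0
        if passed.mean() >= rho:                 # fraction of features in tolerance
            matches.append(j)
    return matches

def winner(x, centroids, matches):
    """Step 4 (Eq. 2): the closest matching centroid wins (WTA)."""
    return min(matches, key=lambda j: np.sum((centroids[j] - x) ** 2))
```

A pattern that passes the screening of `match_nodes` is then assigned to the node returned by `winner`; if `match_nodes` returns an empty list, a new output node is created (Step 6).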
New cluster centroids are calculated in Step 7, and the data pattern closest to each centroid is determined. Re-shuffling the seed points closest to the cluster centroids to the top of the list creates a new list of patterns for the next iterative clustering pass in Step 8.

2.2. Modifications in the K-FLANN Algorithm

The improvements in the K-FLANN algorithm are used to select the best matching unit for the formation of consistent clusters. Step 4 of the K-FLANN algorithm is modified as follows:

Step 4: Determine the winner from all matched output nodes using the following criteria:

If the same match is found

    Winner = min_j Σ_{i=1..n} (w_ij − x_i)²    (4)

Else

    Winner = max_j (1/n) Σ_{i=1..n} D[δ_i² − (w_ij − x_i)²]    (5)

In the modified K-FLANN, a data point may successfully pass the screening process and be well suited as a member of more than one cluster, with different match values to the centroids. The data point is assigned as a new cluster member of the centroid for which it has the maximum match value out of all matches. The winner centroid is the one with the highest match value for a data point, even if it is not the closest to the data point. In the original K-FLANN algorithm, by contrast, a data point that successfully passes the screening process and is well suited as a member of more than one cluster is assigned to the selected winner centroid, where the centroid closest to the data point is the winner whenever it has the same match value as the other centroids.

3. DIFFERENTIAL EVOLUTION

Differential evolution (DE) is a novel parallel direct search method which utilizes NP parameter vectors as a population for each generation G. DE can be categorized as a floating-point encoded evolutionary optimization algorithm. Currently, there are several variants of DE. The variant used throughout this investigation is the DE/rand/1/bin scheme [1][17]. The DE algorithm is a population-based optimization method that simulates natural evolution, combined with a mechanism to generate multiple search directions based on the distribution of solutions (vectors) in the current population. The main strategy of DE is to generate new individuals by calculating vector differences between other randomly selected individuals of the population.

Main steps of the DE algorithm:

1) Definition of the encoding scheme
2) Definition of the selection criterion (fitness function)
3) Creation of the population of chromosomes
4) Evaluation
   Repeat
     Mutation
     Crossover
     Selection
   Until (termination criteria are met)

In DE, a population of solution vectors is randomly created at the start. This population is successively improved by applying mutation, crossover and selection operators.
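As an illustration of this loop, a minimal DE/rand/1/bin sketch on a toy objective (the sphere function) is shown below. The control-parameter values F, CR, the population size and all names are illustrative choices for this sketch, not values taken from the paper.

```python
import numpy as np

def de_rand_1_bin(objective, bounds, np_size=20, F=0.8, CR=0.9, gens=100, seed=0):
    """Minimal DE/rand/1/bin: mutation, binomial crossover, greedy selection."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    D = len(lo)
    pop = rng.uniform(lo, hi, size=(np_size, D))      # random initial population
    fit = np.array([objective(x) for x in pop])
    for _ in range(gens):
        for i in range(np_size):
            # pick r1, r2, r3 distinct from each other and from i
            r1, r2, r3 = rng.choice([j for j in range(np_size) if j != i],
                                    size=3, replace=False)
            v = pop[r1] + F * (pop[r2] - pop[r3])      # mutation
            jrand = rng.integers(D)                    # index that always crosses
            cross = rng.random(D) <= CR
            cross[jrand] = True
            u = np.where(cross, v, pop[i])             # binomial crossover
            fu = objective(u)
            if fu <= fit[i]:                           # greedy selection
                pop[i], fit[i] = u, fu
    return pop[fit.argmin()], fit.min()

# Toy usage: minimize the sphere function sum(x^2) over [-5, 5]^2
best_x, best_f = de_rand_1_bin(lambda x: float(np.sum(x * x)),
                               (np.array([-5., -5.]), np.array([5., 5.])))
```

In the paper's setting, the objective evaluated for each vector would instead be the fitness of a K-FLANN run configured with that vector's vigilance and tolerance values.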
Mutation and crossover are used to generate new vectors (trial vectors), and selection is then used to determine whether or not the newly generated vectors can survive to the next iteration [1]. The operators and equations are described in the following subsections.

3.1. Definition of the encoding scheme

The initial population is a set of NP D-dimensional parameter vectors, known as genomes or chromosomes. Each solution vector consists of two components, representing the vigilance and tolerance parameters of K-FLANN. Binary encoding is used for the vigilance and floating-point encoding is used for the tolerances. Each parameter is of length D in a solution vector. For example, consider a data set consisting of 5 features. The representation of a solution vector is shown below.

[ Bits of the vigilance parameter | Feature tolerances (δ) | Vigilance (ρ) ]

Figure 2. Structure of a solution vector

So the total length of a solution vector is ((2*D) + 1), where D is the number of features in the data set. For a feature, the tolerance is the difference between the maximum value and the minimum value within the feature. In the above example, if a feature is to be matched, the corresponding bit is 1; otherwise the corresponding bit is 0. So, with four features matched, the corresponding bits are 1 and the vigilance is 4/5 = 0.8.

3.2. Definition of the fitness function

The main goal of clustering algorithms is to maximize the between-group variance and minimize the within-group variance. Any cluster validity measure can be used as the fitness function. In this paper, a new cluster validity measure, called the CS measure, proposed by Chou in 2004 [20], is used as the fitness function to evaluate the results of clustering and to select the best vectors in the population. The CS measure deals with clusters of different densities and/or sizes and is more efficient than other cluster validity indices. The CS measure is a function of the ratio of the sum of within-cluster scatter to between-cluster separation, similar to the DB or DI indices. If the clusters are compact, the within-cluster scatter is minimal, and for well-separated clusters the between-cluster separation is maximal. So the CS measure is defined as follows:

    CS(K) = [ (1/K) Σ_{q=1..K} { (1/N_q) Σ_{X_i ∈ C_q} max_{X_j ∈ C_q} d(X_i, X_j) } ]
            / [ (1/K) Σ_{q=1..K} min_{j ∈ {1..K}, j ≠ q} d(m_q, m_j) ]    (6)

          = [ Σ_{q=1..K} { (1/N_q) Σ_{X_i ∈ C_q} max_{X_j ∈ C_q} d(X_i, X_j) } ]
            / [ Σ_{q=1..K} min_{j ∈ {1..K}, j ≠ q} d(m_q, m_j) ]    (7)

where K is the number of clusters,

d(X_i, X_j) : the distance metric between any two data points X_i and X_j;
d(m_i, m_j) : the distance metric between any two centroids m_i and m_j.

The cluster centroid is calculated by taking the average of all data points that lie in a cluster. It is denoted by m_i:

    m_i = (1/N_i) Σ_{x_j ∈ C_i} x_j    (8)

The fitness function is the CS measure combined with the misclassification rate. The error rate is computed by comparing the cluster structure produced by K-FLANN with externally supplied class labels. The error rate of a solution vector is the ratio of the number of misclassified samples to the total number of samples:

    errorrate = (mc / total) × 100    (9)

where mc is the number of misclassified samples and total is the total number of samples.

Eq. (10) gives the fitness function of a solution vector X_i based on the result of K-FLANN:

    f(X_i) = 1 / (CS(K) * errorrate + eps)    (10)

where
CS(K) : the CS measure calculated over the K partitions;
errorrate : the misclassification rate of X_i;
eps : a small bias term equal to 2×10⁻⁶, which prevents the denominator of the R.H.S. from being equal to zero.

3.3. Mutation

To create V_{i,G+1} for each i-th member of the current population (also called the target vector), three other distinct parameter vectors, say the vectors X_{r1,G}, X_{r2,G}, and X_{r3,G}, are picked randomly from the current population. For a solution vector, both binary mutation and floating-point mutation are applied to create the trial vector.

3.3.1. Binary mutation

In binary mutation, the logical OR operator is used to add a vector to the weighted difference between two vectors. The difference of two vectors is computed by the exclusive-OR (XOR) operation, and multiplication is performed by the logical AND operation. So the mutation operation for binary vectors is defined as follows. For each binary target vector X_{i,G}, i = 1, 2, 3, ..., NP (population size), a binary mutant vector is produced by

    V_{i,G+1} = X_{r1,G} OR (F AND (X_{r2,G} XOR X_{r3,G}))    (11)
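Reading the binary vectors as Boolean arrays, Eq. (11) can be sketched with element-wise logical operators. How the scale factor F acts on bits is not fully specified above, so treating F as a fixed bit mask is an assumption of this sketch, and all names are illustrative.

```python
import numpy as np

def binary_mutation(x_r1, x_r2, x_r3, f_mask):
    """Eq. (11): V = X_r1 OR (F AND (X_r2 XOR X_r3)), element-wise on bits."""
    diff = np.logical_xor(x_r2, x_r3)       # "difference" of two binary vectors
    scaled = np.logical_and(f_mask, diff)   # "multiplication" by F (bit mask)
    return np.logical_or(x_r1, scaled)      # "addition" to the base vector

x1 = np.array([1, 0, 0, 1], dtype=bool)
x2 = np.array([1, 1, 0, 0], dtype=bool)
x3 = np.array([0, 1, 1, 0], dtype=bool)
f  = np.array([1, 1, 0, 1], dtype=bool)    # F as a fixed bit mask (assumption)
v = binary_mutation(x1, x2, x3, f)         # -> [True, False, False, True]
```

Because OR can only set bits, the mutant keeps every bit of X_r1 and may additionally switch on bits where the two difference vectors disagree and F permits it.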

3.3.2. Floating-point mutation

DE generates new parameter vectors by adding the weighted difference between two population vectors to a third vector. This operation is called mutation. The mutated vector's parameters are then mixed with the parameters of another predetermined vector, the target vector, to yield the so-called trial vector. For each target vector X_{i,G}, i = 1, 2, 3, ..., NP (population size), a mutant vector is produced by

    V_{i,G+1} = X_{r1,G} + F (X_{r2,G} − X_{r3,G})    (12)

where r1, r2, r3 ∈ {1, 2, ..., NP} are randomly chosen and must be different from each other. F is the scaling factor, which controls the magnitude of the differential variation (X_{r2,G} − X_{r3,G}). NP is the size of the population, which remains constant during the search process [1].

3.4. Crossover

The parent vector is combined with the mutated vector to produce a trial vector through a crossover operation, which combines components from the current element and the mutant vector according to a control parameter CR. Two types of crossover schemes can be used in the DE family of algorithms. In this work, binomial crossover is performed on each of the D variables whenever a randomly picked number between 0 and 1 is within the CR value:

    If (rnd_j ≤ CR) or j = r_n
        U_{j,i,G+1} = V_{j,i,G+1}
    Else
        U_{j,i,G+1} = X_{j,i,G}
    End

where j = 1, 2, ..., D; rnd_j is a random number generated in the interval [0, 1]; CR is the crossover constant ∈ [0, 1]; r_n ∈ (1, 2, ..., D) is a randomly chosen index; and D is the number of dimensions of a vector.

3.5. Selection

Selection is the process of deciding the better vector by comparing the performance of the trial vector and the target vector. The newly formed trial vector is evaluated to decide whether it will survive to the next generation. If the trial vector produces a better fitness value, i.e. a lower value of the objective function, it is copied to the next generation; otherwise the target vector is passed to the next generation [1]. The pseudo-code for selection is as follows:

    If f(U_{i,G+1}) ≥ f(X_{i,G})
        X_{i,G+1} = U_{i,G+1}
    Else
        X_{i,G+1} = X_{i,G}
    End

4. K-FLANN AND DIFFERENTIAL EVOLUTION

K-FLANN is a competent clustering algorithm that provides consistent clusters within a small number of epochs. To partition a data set, K-FLANN needs optimal values of vigilance and tolerance. To initialize these parameters, a large search space is explored and the best parameter values are determined using the differential evolution method. The encoding scheme is discussed in Section 3.1. For each solution vector, K-FLANN runs and exits when stable centroids are formed. The results of K-FLANN are evaluated using the fitness function discussed in Section 3.2. During a generation, new individuals are created using the differential evolution operators mutation and crossover, discussed in Sections 3.3 and 3.4. The fitness values of the trial vectors are compared with the fitness values of the target vectors using the selection step explained in Section 3.5.

5. EXPERIMENTAL RESULTS

5.1. Data sets used

The data sets used to carry out this work are listed in Table 1, which gives detailed descriptions of both artificial data sets and synthetic data sets. The artificial data sets were downloaded from the UCI machine learning repository [27].

Table 1. Data sets description.

S.No. | Data set | Size (class distribution) | # of features | # of clusters
1 | Iris | 150 (class 1: 50, class 2: 50, class 3: 50) | 4 | 3
2 | Wine | 178 (class 1: 59, class 2: 70, class 3: 49) | 13 | 3
3 | New Thyroid | 215 (class 1: 150, class 2: 35, class 3: 30) | 5 | 3
4 | Haberman | 306 (class 1: 227, class 2: 79) | 3 | 2
5 | Glass | 214 (class 1: 70, class 2: 76, class 3: 17, class 4: 0, class 5: 13, class 6: 9, class 7: 29) | 9 | 7
6 | Synthetic data set 1 | class 1: 500, class 2: … | 2 | …
7 | Synthetic data set 2 | 1000 (class 1: 500, class 2: 500) | 2 | 2
8 | Synthetic data set 3 | 1000 (class 1: 500, class 2: 500) | 2 | 2
9 | Synthetic data set 4 | class 1: 250, class 2: 150, class 3: … | 8 | …
10 | Synthetic data set 5 | class 1: 150, class 2: 150, class 3: … | 8 | …
11 | Synthetic data set 6 | 350 (class 1: 100, class 2: 150, class 3: 100) | 8 | 3
12 | Synthetic data set 7 | 450 (class 1: 150, class 2: 100, class 3: 100, class 4: 50, class 5: 50) | 12 | 5
13 | Synthetic data set 8 | 530 (class 1: 100, class 2: 150, class 3: 200, class 4: 80) | 10 | 4

5.2. Scatter plots for data sets

Figure 2 shows the scatter plots of the 8 synthetic data sets.

5.3. Comparison of modified K-FLANN and K-FLANN

The results of modified K-FLANN and K-FLANN are shown in two separate tables (Table 2 and Table 3) due to their large size. The following sections give detailed information about them.

5.3.1. Results of DE applied to modified K-FLANN

Tables 2 and 3 show the average values of the neural network parameters when DE is executed for 100 runs, each run consisting of a maximum of 20 generations, using K-FLANN and modified K-FLANN.

Table 2. Mean values of neural network parameters using DE and modified K-FLANN

5.3.2. Results of DE applied to K-FLANN

Table 3. Mean values of neural network parameters using DE and K-FLANN

S.No | Data set | Avg. no. of clusters | Avg. error rate | Avg. fitness | Avg. tolerance | Avg. vigilance
1 | Iris | …, 0.4657, 0.5569, …
2 | Wine | …, 0.3196, 0.3752, 0.3378, …, 0.2596, 0.2985, 0.3448, …, 0.3413, 0.3139, 0.3429, …
3 | New Thyroid | …, 0.2742, 0.2539, …
4 | Glass | …, 0.0841, 0.0644, 0.0405, …, 0.0824, 0.1306, 0.0398, …
5 | Haberman | …, 0.0492, …
6 | Syndata1 | …
7 | Syndata2 | …
8 | Syndata3 | …
9 | Syndata4 | …, 3.4220, 3.6010, 3.6698, …, 3.1782, 2.9961, …
10 | Syndata5 | …, 3.2593, 3.5499, 3.5297, …, 3.1813, 3.3929, …
11 | Syndata6 | 3.89 (actual number is 3) | …, 1.7823, 2.0226, 2.0770, …, 2.0789, …
12 | Syndata7 | …, 3.8599, 3.8655, 3.0607, …, 4.1074, 3.8569, 3.3853, …, 3.5799, 3.7644, …
13 | Syndata8 | 4.75 (actual number is 4) | …, 9.0517, 10.054, 10.901, …, 11.417, 9.6887, 9.3184, …

The results also compare both versions of K-FLANN combined with the optimization algorithm DE to determine the best values of the parameters vigilance and feature tolerance. For most of the data sets used in this work, it was observed that modified K-FLANN is more efficient than the original K-FLANN in terms of error rate and fitness. Both algorithms were able to discover the number of clusters actually present in the data set, shown in the first column of the two tables above. The performance of modified K-FLANN is better than that of the original K-FLANN in terms of the average error rate obtained over 100 runs. In each generation, the tolerance and vigilance values are recorded, and the mean value of each parameter is taken at the end, as shown in Tables 2 and 3.

5.4. Results based on maximum vigilance

Table 4 shows the maximum vigilance and corresponding tolerance values obtained by DE over 100 runs.

Table 4. Comparison between modified K-FLANN and K-FLANN based on maximum vigilance

S.No | Data set | Algorithm | Maximum vigilance (ρ) | Number of clusters | Feature tolerances (δ) | Error rate (%)
1 | Iris | Modified K-FLANN | 0.75 (3/4) | … | …, 0.6788, 0.5608, … | …
  |      | K-FLANN | 0.75 (3/4) | … | …, 0.3327, 0.5247, … | …
2 | Wine | Modified K-FLANN | (9/13) | … | …, 0.3101, 0.2929, 0.3157, 0.3823, …, 0.2822, 0.4702, 0.4058, 0.3388, …, 0.4672, … | …
  |      | K-FLANN | (9/13) | … | …, 0.2005, 0.2188, 0.4898, 0.4101, …, 0.3386, 0.3851, 0.3528, 0.3982, …, 0.4017, … | …
3 | New Thyroid | Modified K-FLANN | 0.8 (4/5) | … | …, 0.2167, 0.4847, 0.3654, … | …
  |      | K-FLANN | 0.8 (4/5) | … | …, 0.4982, 0.2404, 0.3778, … | …
4 | Glass | Modified K-FLANN | … | … | …, 0.0205, …, 0.0189, 0.0911, …, 0.0191, … | …
  |      | K-FLANN | … | … | …, 0.0938, 0.1048, 0.0613, 0.0269, …, 0.2059, … | …
5 | Haberman | Modified K-FLANN | (1/3) | … | …, 0.0517, … | …
  |      | K-FLANN | (1/3) | … | …, 0.0355, … | …
6 | Syndata1 | Modified K-FLANN | 0.5 (1/2) | … | … | …
  |      | K-FLANN | 0.5 (1/2) | … | … | …
7 | Syndata2 | Modified K-FLANN | 0.5 (1/2) | … | … | …
  |      | K-FLANN | 0.5 (1/2) | … | … | …
8 | Syndata3 | Modified K-FLANN | 1 (2/2) | … | … | …
  |      | K-FLANN | 1 (2/2) | … | … | …
9 | Syndata4 | Modified K-FLANN | 1 (8/8) | … | …, 5.3906, 5.0819, 4.3982, 3.1581, …, 4.9923, … | …
  |      | K-FLANN | 1 (8/8) | … | …, 2.3999, 3.7814, 5.1546, 5.4707, … | …
10 | Syndata5 | Modified K-FLANN | 1 (8/8) | … | …, 4.1205, 4.6169, 5.1569, 2.6966, …, 4.7333, … | …
  |      | K-FLANN | 1 (8/8) | … | …, 2.8447, 5.1821, 4.7824, 5.6415, …, 5.3356, … | …
11 | Syndata6 | Modified K-FLANN | 0.75 (6/8) | 4 (actual number is 3) | …, 2.3158, 2.6881, 0.8054, 1.9819, …, 1.7463, … | …
  |      | K-FLANN | 0.75 (6/8) | 4 (actual number is 3) | …, 1.4159, 2.4478, 2.8806, 1.3764, …, 1.0430, … | …
12 | Syndata7 | Modified K-FLANN | (10/12) | 4 (actual number is 5) | …, 3.4565, 3.9432, 4.0429, 5.9774, …, 4.3376, 0.4221, 6.3798, 5.6094, … | …
  |      | K-FLANN | (10/12) | … | …, 6.2038, 3.4297, 3.1956, 5.0315, …, 3.7991, 2.8922, 6.3156, 4.8360, … | …
13 | Syndata8 | Modified K-FLANN | (8/10) | 5 (actual number is 4) | …, 8.6679, 12.899, …, 6.855, 9.6163, 13.619, 13.702, … | …
  |      | K-FLANN | 0.8 (8/10) | 6 (actual number is 4) | … | …

The error rate is shown for the maximum vigilance because the vigilance indicates the number of features that are within their tolerance out of the total features in the data set. The values in bold show the minimum error rate obtained by the two variations of K-FLANN. It was observed that modified K-FLANN performs well for iris, wine, new thyroid, Glass, etc. It was also shown that the performance of both algorithms, i.e. the error rate, is the same for all well-separated data sets

(syndata1, syndata2, syndata4 and syndata5). For the overlapped data sets syndata7 and syndata8, K-FLANN is more efficient, due to the lower error rate for maximum vigilance observed over all 100 runs. For most of the data sets, the maximum vigilance value is the same with different feature tolerances and misclassification rates.

5.5. Results based on minimum error rate

Table 5 shows the minimum error rate obtained for each data set over all 100 runs. For each data set, the number of clusters formed, the tolerances of each feature and the vigilance value are recorded for the minimum error rate. For the iris data set, the vigilance parameter value is the same for both algorithms, with different tolerance values, but the misclassification rate is minimized when modified K-FLANN is used. In the case of wine, the vigilance value is small (i.e. 6 features matched out of 13) with a low misclassification rate, whereas using K-FLANN the vigilance value is higher, i.e. 9 features matched out of 13, but the error rate is high. The analysis can be done in a similar way for the other data sets. For a data set, the obtained feature tolerances and vigilance values can be used in the K-FLANN algorithms to determine the groups in the data set.

Table 5. Comparison between modified K-FLANN and K-FLANN based on minimum error rate

6. CONCLUSIONS

In this work, an optimization algorithm is applied to a neural network to set the initial parameters, called vigilance (ρ) and tolerance (δ). From the above results, it was observed that the number of output nodes (centroids) generated depends on these parameters. The nodes created in the output layer give the number of clusters present in a data set, so using the efficient parameter values found by DE, optimal clustering is achieved. The original K-FLANN algorithm is also modified to determine the best matching unit out of the total matched nodes, to improve accuracy. The above results show that modified K-FLANN efficiently discovers the actual number of clusters present in a data set with the best values of vigilance and tolerance carried in a solution vector. In this paper, a comparison is also made between the original K-FLANN and modified K-FLANN. The results show that the percentage of tuples misclassified (error rate) is lower with modified K-FLANN than with the original K-FLANN. In some cases, however, the original K-FLANN is more proficient than modified K-FLANN, so in future work modified K-FLANN can be improved for highly overlapped data sets. In the future, the algorithm can also be extended to medical imaging applications.

REFERENCES

[1] R. Storn and K. Price (1997), "Differential Evolution: A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces", Journal of Global Optimization, vol. 11, no. 4.
[2] J.H. Holland (1975), Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor.
[3] D.E. Goldberg (1989), Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Reading, MA.
[4] Brian Hegerty, Chih-Cheng Hung and Kristen Kasprak (2009), "A Comparative Study on Differential Evolution and Genetic Algorithms for Some Combinatorial Problems".
[5] Xin-She Yang (2011), "Metaheuristic optimization", Scholarpedia.
[6] G.A. Carpenter and S. Grossberg (1987), "A Massively Parallel Architecture for a Self-Organizing Neural Pattern Recognition Machine", Computer Vision, Graphics, and Image Processing, vol. 37.
[7] L.P. Tay and D.J. Evans (1994), "Fast Learning Artificial Neural Network Using Nearest Neighbors Recall", Neural Parallel and Scientific Computations, 2(1).
[8] D.J. Evans and L.P. Tay (1995), "Fast Learning Artificial Neural Network for Continuous Input Applications", Kybernetes, 24(3).
[9] T.L.P. Alex and Sandeep Prakash (2002), "K-Means Fast Learning Artificial Neural Network, an Alternate Network for Classification", ICONIP'02, vol. 2.
[10] L.P. Wong and T.L.P. Alex (2003), "Centroid Stability with K-Means Fast Learning Artificial Neural Networks", International Joint Conference on Neural Networks (IJCNN).
[11] J. Vesterstrom and R. Thomsen (2004), "A comparative study of differential evolution, particle swarm optimization and evolutionary algorithms on numerical benchmark problems", Congress on Evolutionary Computation.
[12] U. Chakraborty (2008), Advances in Differential Evolution (Studies in Computational Intelligence, vol. 142), Heidelberg, Germany: Springer-Verlag.
[13] V. Feoktistov (2006), Differential Evolution: In Search of Solutions (Springer Optimization and Its Applications, vol. 5), Heidelberg, Germany: Springer-Verlag.
[14] K.V. Price, R.M. Storn and J.A. Lampinen, Differential Evolution: A Practical Approach to Global Optimization (Natural Computing Series), Heidelberg, Germany: Springer-Verlag.
[15] J. Zhang and A.C. Sanderson (2009), "JADE: Adaptive differential evolution with optional external archive", IEEE Transactions on Evolutionary Computation, vol. 13, no. 5.
[16] Daniela Zaharie (2009), "Influence of crossover on the behavior of Differential Evolution Algorithms", Applied Soft Computing, vol. 9, issue 3.
[17] Godfrey Onwubolu and Donald Davendra (2006), "Scheduling flow shops using differential evolution algorithm", European Journal of Operational Research, vol. 171.
[18]
[19] K.V. Price (1999), "An Introduction to Differential Evolution", in D. Corne, M. Dorigo and F. Glover (eds), New Ideas in Optimization, McGraw-Hill, London.
[20] C.H. Chou, M.C. Su and E. Lai (2004), "A new cluster validity measure and its application to image compression", Pattern Analysis and Applications, vol. 7.
[21] S. Bandyopadhyay and U. Maulik (2002), "An evolutionary technique based on K-means algorithm for optimal clustering in R^N", Information Sciences, vol. 146.
[22] R. Storn and K. Price (1995), "Differential Evolution: a Simple and Efficient Adaptive Scheme for Global Optimization over Continuous Spaces", Technical Report, ICSI.
[23] U. Maulik and S. Bandyopadhyay (2000), "Genetic algorithm based clustering technique", Pattern Recognition, vol. 33.
[24] Yin Xiang and Alex Tay Leng Phuan (2004), "Genetic algorithm based K-means fast learning artificial neural network", AI'04: Proceedings of the 17th Australian Joint Conference on Advances in Artificial Intelligence.
[25] Swagatam Das, Ajith Abraham and Amit Konar (2009), Metaheuristic Clustering (Studies in Computational Intelligence, vol. 178), Heidelberg, Berlin: Springer-Verlag.
[26] S. Grossberg (1987), "Competitive Learning: From Interactive Activation to Adaptive Resonance", Cognitive Science, 11.
[27] C.L. Blake and C.J. Merz, UCI Repository of Machine Learning Databases, University of California, Irvine, CA.


More information

Classifying Acoustic Transient Signals Using Artificial Intelligence

Classifying Acoustic Transient Signals Using Artificial Intelligence Classfyng Acoustc Transent Sgnals Usng Artfcal Intellgence Steve Sutton, Unversty of North Carolna At Wlmngton (suttons@charter.net) Greg Huff, Unversty of North Carolna At Wlmngton (jgh7476@uncwl.edu)

More information

Cluster Analysis of Electrical Behavior

Cluster Analysis of Electrical Behavior Journal of Computer and Communcatons, 205, 3, 88-93 Publshed Onlne May 205 n ScRes. http://www.scrp.org/ournal/cc http://dx.do.org/0.4236/cc.205.350 Cluster Analyss of Electrcal Behavor Ln Lu Ln Lu, School

More information

Content Based Image Retrieval Using 2-D Discrete Wavelet with Texture Feature with Different Classifiers

Content Based Image Retrieval Using 2-D Discrete Wavelet with Texture Feature with Different Classifiers IOSR Journal of Electroncs and Communcaton Engneerng (IOSR-JECE) e-issn: 78-834,p- ISSN: 78-8735.Volume 9, Issue, Ver. IV (Mar - Apr. 04), PP 0-07 Content Based Image Retreval Usng -D Dscrete Wavelet wth

More information

A Binarization Algorithm specialized on Document Images and Photos

A Binarization Algorithm specialized on Document Images and Photos A Bnarzaton Algorthm specalzed on Document mages and Photos Ergna Kavalleratou Dept. of nformaton and Communcaton Systems Engneerng Unversty of the Aegean kavalleratou@aegean.gr Abstract n ths paper, a

More information

Maximum Variance Combined with Adaptive Genetic Algorithm for Infrared Image Segmentation

Maximum Variance Combined with Adaptive Genetic Algorithm for Infrared Image Segmentation Internatonal Conference on Logstcs Engneerng, Management and Computer Scence (LEMCS 5) Maxmum Varance Combned wth Adaptve Genetc Algorthm for Infrared Image Segmentaton Huxuan Fu College of Automaton Harbn

More information

Sum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints

Sum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints Australan Journal of Basc and Appled Scences, 2(4): 1204-1208, 2008 ISSN 1991-8178 Sum of Lnear and Fractonal Multobjectve Programmng Problem under Fuzzy Rules Constrants 1 2 Sanjay Jan and Kalash Lachhwan

More information

An Optimal Algorithm for Prufer Codes *

An Optimal Algorithm for Prufer Codes * J. Software Engneerng & Applcatons, 2009, 2: 111-115 do:10.4236/jsea.2009.22016 Publshed Onlne July 2009 (www.scrp.org/journal/jsea) An Optmal Algorthm for Prufer Codes * Xaodong Wang 1, 2, Le Wang 3,

More information

An Efficient Genetic Algorithm with Fuzzy c-means Clustering for Traveling Salesman Problem

An Efficient Genetic Algorithm with Fuzzy c-means Clustering for Traveling Salesman Problem An Effcent Genetc Algorthm wth Fuzzy c-means Clusterng for Travelng Salesman Problem Jong-Won Yoon and Sung-Bae Cho Dept. of Computer Scence Yonse Unversty Seoul, Korea jwyoon@sclab.yonse.ac.r, sbcho@cs.yonse.ac.r

More information

An Evolvable Clustering Based Algorithm to Learn Distance Function for Supervised Environment

An Evolvable Clustering Based Algorithm to Learn Distance Function for Supervised Environment IJCSI Internatonal Journal of Computer Scence Issues, Vol. 7, Issue 5, September 2010 ISSN (Onlne): 1694-0814 www.ijcsi.org 374 An Evolvable Clusterng Based Algorthm to Learn Dstance Functon for Supervsed

More information

An Iterative Solution Approach to Process Plant Layout using Mixed Integer Optimisation

An Iterative Solution Approach to Process Plant Layout using Mixed Integer Optimisation 17 th European Symposum on Computer Aded Process Engneerng ESCAPE17 V. Plesu and P.S. Agach (Edtors) 2007 Elsever B.V. All rghts reserved. 1 An Iteratve Soluton Approach to Process Plant Layout usng Mxed

More information

Extraction of Fuzzy Rules from Trained Neural Network Using Evolutionary Algorithm *

Extraction of Fuzzy Rules from Trained Neural Network Using Evolutionary Algorithm * Extracton of Fuzzy Rules from Traned Neural Network Usng Evolutonary Algorthm * Urszula Markowska-Kaczmar, Wojcech Trelak Wrocław Unversty of Technology, Poland kaczmar@c.pwr.wroc.pl, trelak@c.pwr.wroc.pl

More information

The Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique

The Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique //00 :0 AM Outlne and Readng The Greedy Method The Greedy Method Technque (secton.) Fractonal Knapsack Problem (secton..) Task Schedulng (secton..) Mnmum Spannng Trees (secton.) Change Money Problem Greedy

More information

NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS

NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS ARPN Journal of Engneerng and Appled Scences 006-017 Asan Research Publshng Network (ARPN). All rghts reserved. NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS Igor Grgoryev, Svetlana

More information

Hierarchical clustering for gene expression data analysis

Hierarchical clustering for gene expression data analysis Herarchcal clusterng for gene expresson data analyss Gorgo Valentn e-mal: valentn@ds.unm.t Clusterng of Mcroarray Data. Clusterng of gene expresson profles (rows) => dscovery of co-regulated and functonally

More information

A PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION

A PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION 1 THE PUBLISHING HOUSE PROCEEDINGS OF THE ROMANIAN ACADEMY, Seres A, OF THE ROMANIAN ACADEMY Volume 4, Number 2/2003, pp.000-000 A PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION Tudor BARBU Insttute

More information

Unsupervised Learning and Clustering

Unsupervised Learning and Clustering Unsupervsed Learnng and Clusterng Why consder unlabeled samples?. Collectng and labelng large set of samples s costly Gettng recorded speech s free, labelng s tme consumng 2. Classfer could be desgned

More information

Determining the Optimal Bandwidth Based on Multi-criterion Fusion

Determining the Optimal Bandwidth Based on Multi-criterion Fusion Proceedngs of 01 4th Internatonal Conference on Machne Learnng and Computng IPCSIT vol. 5 (01) (01) IACSIT Press, Sngapore Determnng the Optmal Bandwdth Based on Mult-crteron Fuson Ha-L Lang 1+, Xan-Mn

More information

Support Vector Machines

Support Vector Machines Support Vector Machnes Decson surface s a hyperplane (lne n 2D) n feature space (smlar to the Perceptron) Arguably, the most mportant recent dscovery n machne learnng In a nutshell: map the data to a predetermned

More information

Course Introduction. Algorithm 8/31/2017. COSC 320 Advanced Data Structures and Algorithms. COSC 320 Advanced Data Structures and Algorithms

Course Introduction. Algorithm 8/31/2017. COSC 320 Advanced Data Structures and Algorithms. COSC 320 Advanced Data Structures and Algorithms Course Introducton Course Topcs Exams, abs, Proects A quc loo at a few algorthms 1 Advanced Data Structures and Algorthms Descrpton: We are gong to dscuss algorthm complexty analyss, algorthm desgn technques

More information

GA-Based Learning Algorithms to Identify Fuzzy Rules for Fuzzy Neural Networks

GA-Based Learning Algorithms to Identify Fuzzy Rules for Fuzzy Neural Networks Seventh Internatonal Conference on Intellgent Systems Desgn and Applcatons GA-Based Learnng Algorthms to Identfy Fuzzy Rules for Fuzzy Neural Networks K Almejall, K Dahal, Member IEEE, and A Hossan, Member

More information

Support Vector Machines

Support Vector Machines /9/207 MIST.6060 Busness Intellgence and Data Mnng What are Support Vector Machnes? Support Vector Machnes Support Vector Machnes (SVMs) are supervsed learnng technques that analyze data and recognze patterns.

More information

Clustering Algorithm Combining CPSO with K-Means Chunqin Gu 1, a, Qian Tao 2, b

Clustering Algorithm Combining CPSO with K-Means Chunqin Gu 1, a, Qian Tao 2, b Internatonal Conference on Advances n Mechancal Engneerng and Industral Informatcs (AMEII 05) Clusterng Algorthm Combnng CPSO wth K-Means Chunqn Gu, a, Qan Tao, b Department of Informaton Scence, Zhongka

More information

Research of Neural Network Classifier Based on FCM and PSO for Breast Cancer Classification

Research of Neural Network Classifier Based on FCM and PSO for Breast Cancer Classification Research of Neural Network Classfer Based on FCM and PSO for Breast Cancer Classfcaton Le Zhang 1, Ln Wang 1, Xujewen Wang 2, Keke Lu 2, and Ajth Abraham 3 1 Shandong Provncal Key Laboratory of Network

More information

Comparison of Heuristics for Scheduling Independent Tasks on Heterogeneous Distributed Environments

Comparison of Heuristics for Scheduling Independent Tasks on Heterogeneous Distributed Environments Comparson of Heurstcs for Schedulng Independent Tasks on Heterogeneous Dstrbuted Envronments Hesam Izakan¹, Ath Abraham², Senor Member, IEEE, Václav Snášel³ ¹ Islamc Azad Unversty, Ramsar Branch, Ramsar,

More information

SLAM Summer School 2006 Practical 2: SLAM using Monocular Vision

SLAM Summer School 2006 Practical 2: SLAM using Monocular Vision SLAM Summer School 2006 Practcal 2: SLAM usng Monocular Vson Javer Cvera, Unversty of Zaragoza Andrew J. Davson, Imperal College London J.M.M Montel, Unversty of Zaragoza. josemar@unzar.es, jcvera@unzar.es,

More information

KOHONEN'S SELF ORGANIZING NETWORKS WITH "CONSCIENCE"

KOHONEN'S SELF ORGANIZING NETWORKS WITH CONSCIENCE Kohonen's Self Organzng Maps and ther use n Interpretaton, Dr. M. Turhan (Tury) Taner, Rock Sold Images Page: 1 KOHONEN'S SELF ORGANIZING NETWORKS WITH "CONSCIENCE" By: Dr. M. Turhan (Tury) Taner, Rock

More information

Design of Structure Optimization with APDL

Design of Structure Optimization with APDL Desgn of Structure Optmzaton wth APDL Yanyun School of Cvl Engneerng and Archtecture, East Chna Jaotong Unversty Nanchang 330013 Chna Abstract In ths paper, the desgn process of structure optmzaton wth

More information

Vectorization of Image Outlines Using Rational Spline and Genetic Algorithm

Vectorization of Image Outlines Using Rational Spline and Genetic Algorithm 01 Internatonal Conference on Image, Vson and Computng (ICIVC 01) IPCSIT vol. 50 (01) (01) IACSIT Press, Sngapore DOI: 10.776/IPCSIT.01.V50.4 Vectorzaton of Image Outlnes Usng Ratonal Splne and Genetc

More information

The Research of Support Vector Machine in Agricultural Data Classification

The Research of Support Vector Machine in Agricultural Data Classification The Research of Support Vector Machne n Agrcultural Data Classfcaton Le Sh, Qguo Duan, Xnmng Ma, Me Weng College of Informaton and Management Scence, HeNan Agrcultural Unversty, Zhengzhou 45000 Chna Zhengzhou

More information

A Notable Swarm Approach to Evolve Neural Network for Classification in Data Mining

A Notable Swarm Approach to Evolve Neural Network for Classification in Data Mining A Notable Swarm Approach to Evolve Neural Network for Classfcaton n Data Mnng Satchdananda Dehur 1, Bjan Bhar Mshra 2 and Sung-Bae Cho 1 1 Soft Computng Laboratory, Department of Computer Scence, Yonse

More information

CHAPTER 4 OPTIMIZATION TECHNIQUES

CHAPTER 4 OPTIMIZATION TECHNIQUES 48 CHAPTER 4 OPTIMIZATION TECHNIQUES 4.1 INTRODUCTION Unfortunately no sngle optmzaton algorthm exsts that can be appled effcently to all types of problems. The method chosen for any partcular case wll

More information

Term Weighting Classification System Using the Chi-square Statistic for the Classification Subtask at NTCIR-6 Patent Retrieval Task

Term Weighting Classification System Using the Chi-square Statistic for the Classification Subtask at NTCIR-6 Patent Retrieval Task Proceedngs of NTCIR-6 Workshop Meetng, May 15-18, 2007, Tokyo, Japan Term Weghtng Classfcaton System Usng the Ch-square Statstc for the Classfcaton Subtask at NTCIR-6 Patent Retreval Task Kotaro Hashmoto

More information

CS434a/541a: Pattern Recognition Prof. Olga Veksler. Lecture 15

CS434a/541a: Pattern Recognition Prof. Olga Veksler. Lecture 15 CS434a/541a: Pattern Recognton Prof. Olga Veksler Lecture 15 Today New Topc: Unsupervsed Learnng Supervsed vs. unsupervsed learnng Unsupervsed learnng Net Tme: parametrc unsupervsed learnng Today: nonparametrc

More information

A New Approach For the Ranking of Fuzzy Sets With Different Heights

A New Approach For the Ranking of Fuzzy Sets With Different Heights New pproach For the ankng of Fuzzy Sets Wth Dfferent Heghts Pushpnder Sngh School of Mathematcs Computer pplcatons Thapar Unversty, Patala-7 00 Inda pushpndersnl@gmalcom STCT ankng of fuzzy sets plays

More information

Fuzzy C-Means Initialized by Fixed Threshold Clustering for Improving Image Retrieval

Fuzzy C-Means Initialized by Fixed Threshold Clustering for Improving Image Retrieval Fuzzy -Means Intalzed by Fxed Threshold lusterng for Improvng Image Retreval NAWARA HANSIRI, SIRIPORN SUPRATID,HOM KIMPAN 3 Faculty of Informaton Technology Rangst Unversty Muang-Ake, Paholyotn Road, Patumtan,

More information

Multi-objective Optimization Using Self-adaptive Differential Evolution Algorithm

Multi-objective Optimization Using Self-adaptive Differential Evolution Algorithm Mult-objectve Optmzaton Usng Self-adaptve Dfferental Evoluton Algorthm V. L. Huang, S. Z. Zhao, R. Mallpedd and P. N. Suganthan Abstract - In ths paper, we propose a Multobjectve Self-adaptve Dfferental

More information

Module Management Tool in Software Development Organizations

Module Management Tool in Software Development Organizations Journal of Computer Scence (5): 8-, 7 ISSN 59-66 7 Scence Publcatons Management Tool n Software Development Organzatons Ahmad A. Al-Rababah and Mohammad A. Al-Rababah Faculty of IT, Al-Ahlyyah Amman Unversty,

More information

Skew Angle Estimation and Correction of Hand Written, Textual and Large areas of Non-Textual Document Images: A Novel Approach

Skew Angle Estimation and Correction of Hand Written, Textual and Large areas of Non-Textual Document Images: A Novel Approach Angle Estmaton and Correcton of Hand Wrtten, Textual and Large areas of Non-Textual Document Images: A Novel Approach D.R.Ramesh Babu Pyush M Kumat Mahesh D Dhannawat PES Insttute of Technology Research

More information

MULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION

MULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION MULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION Paulo Quntlano 1 & Antono Santa-Rosa 1 Federal Polce Department, Brasla, Brazl. E-mals: quntlano.pqs@dpf.gov.br and

More information

EVALUATION OF THE PERFORMANCES OF ARTIFICIAL BEE COLONY AND INVASIVE WEED OPTIMIZATION ALGORITHMS ON THE MODIFIED BENCHMARK FUNCTIONS

EVALUATION OF THE PERFORMANCES OF ARTIFICIAL BEE COLONY AND INVASIVE WEED OPTIMIZATION ALGORITHMS ON THE MODIFIED BENCHMARK FUNCTIONS Academc Research Internatonal ISS-L: 3-9553, ISS: 3-9944 Vol., o. 3, May 0 EVALUATIO OF THE PERFORMACES OF ARTIFICIAL BEE COLOY AD IVASIVE WEED OPTIMIZATIO ALGORITHMS O THE MODIFIED BECHMARK FUCTIOS Dlay

More information

An Improved Image Segmentation Algorithm Based on the Otsu Method

An Improved Image Segmentation Algorithm Based on the Otsu Method 3th ACIS Internatonal Conference on Software Engneerng, Artfcal Intellgence, Networkng arallel/dstrbuted Computng An Improved Image Segmentaton Algorthm Based on the Otsu Method Mengxng Huang, enjao Yu,

More information

Natural Computing. Lecture 13: Particle swarm optimisation INFR /11/2010

Natural Computing. Lecture 13: Particle swarm optimisation INFR /11/2010 Natural Computng Lecture 13: Partcle swarm optmsaton Mchael Herrmann mherrman@nf.ed.ac.uk phone: 0131 6 517177 Informatcs Forum 1.42 INFR09038 5/11/2010 Swarm ntellgence Collectve ntellgence: A super-organsm

More information

Compiler Design. Spring Register Allocation. Sample Exercises and Solutions. Prof. Pedro C. Diniz

Compiler Design. Spring Register Allocation. Sample Exercises and Solutions. Prof. Pedro C. Diniz Compler Desgn Sprng 2014 Regster Allocaton Sample Exercses and Solutons Prof. Pedro C. Dnz USC / Informaton Scences Insttute 4676 Admralty Way, Sute 1001 Marna del Rey, Calforna 90292 pedro@s.edu Regster

More information

A Two-Stage Algorithm for Data Clustering

A Two-Stage Algorithm for Data Clustering A Two-Stage Algorthm for Data Clusterng Abdolreza Hatamlou 1 and Salwan Abdullah 2 1 Islamc Azad Unversty, Khoy Branch, Iran 2 Data Mnng and Optmsaton Research Group, Center for Artfcal Intellgence Technology,

More information

An Improved Particle Swarm Optimization for Feature Selection

An Improved Particle Swarm Optimization for Feature Selection Journal of Bonc Engneerng 8 (20)?????? An Improved Partcle Swarm Optmzaton for Feature Selecton Yuannng Lu,2, Gang Wang,2, Hulng Chen,2, Hao Dong,2, Xaodong Zhu,2, Sujng Wang,2 Abstract. College of Computer

More information

Smoothing Spline ANOVA for variable screening

Smoothing Spline ANOVA for variable screening Smoothng Splne ANOVA for varable screenng a useful tool for metamodels tranng and mult-objectve optmzaton L. Rcco, E. Rgon, A. Turco Outlne RSM Introducton Possble couplng Test case MOO MOO wth Game Theory

More information

High-Boost Mesh Filtering for 3-D Shape Enhancement

High-Boost Mesh Filtering for 3-D Shape Enhancement Hgh-Boost Mesh Flterng for 3-D Shape Enhancement Hrokazu Yagou Λ Alexander Belyaev y Damng We z Λ y z ; ; Shape Modelng Laboratory, Unversty of Azu, Azu-Wakamatsu 965-8580 Japan y Computer Graphcs Group,

More information

Machine Learning 9. week

Machine Learning 9. week Machne Learnng 9. week Mappng Concept Radal Bass Functons (RBF) RBF Networks 1 Mappng It s probably the best scenaro for the classfcaton of two dataset s to separate them lnearly. As you see n the below

More information

Machine Learning. Topic 6: Clustering

Machine Learning. Topic 6: Clustering Machne Learnng Topc 6: lusterng lusterng Groupng data nto (hopefully useful) sets. Thngs on the left Thngs on the rght Applcatons of lusterng Hypothess Generaton lusters mght suggest natural groups. Hypothess

More information

A Deflected Grid-based Algorithm for Clustering Analysis

A Deflected Grid-based Algorithm for Clustering Analysis A Deflected Grd-based Algorthm for Clusterng Analyss NANCY P. LIN, CHUNG-I CHANG, HAO-EN CHUEH, HUNG-JEN CHEN, WEI-HUA HAO Department of Computer Scence and Informaton Engneerng Tamkang Unversty 5 Yng-chuan

More information

K-means and Hierarchical Clustering

K-means and Hierarchical Clustering Note to other teachers and users of these sldes. Andrew would be delghted f you found ths source materal useful n gvng your own lectures. Feel free to use these sldes verbatm, or to modfy them to ft your

More information

S1 Note. Basis functions.

S1 Note. Basis functions. S1 Note. Bass functons. Contents Types of bass functons...1 The Fourer bass...2 B-splne bass...3 Power and type I error rates wth dfferent numbers of bass functons...4 Table S1. Smulaton results of type

More information

X- Chart Using ANOM Approach

X- Chart Using ANOM Approach ISSN 1684-8403 Journal of Statstcs Volume 17, 010, pp. 3-3 Abstract X- Chart Usng ANOM Approach Gullapall Chakravarth 1 and Chaluvad Venkateswara Rao Control lmts for ndvdual measurements (X) chart are

More information

MIXED INTEGER-DISCRETE-CONTINUOUS OPTIMIZATION BY DIFFERENTIAL EVOLUTION Part 1: the optimization method

MIXED INTEGER-DISCRETE-CONTINUOUS OPTIMIZATION BY DIFFERENTIAL EVOLUTION Part 1: the optimization method MIED INTEGER-DISCRETE-CONTINUOUS OPTIMIZATION BY DIFFERENTIAL EVOLUTION Part : the optmzaton method Joun Lampnen Unversty of Vaasa Department of Informaton Technology and Producton Economcs P. O. Box 700

More information

Programming in Fortran 90 : 2017/2018

Programming in Fortran 90 : 2017/2018 Programmng n Fortran 90 : 2017/2018 Programmng n Fortran 90 : 2017/2018 Exercse 1 : Evaluaton of functon dependng on nput Wrte a program who evaluate the functon f (x,y) for any two user specfed values

More information

Helsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr)

Helsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr) Helsnk Unversty Of Technology, Systems Analyss Laboratory Mat-2.08 Independent research projects n appled mathematcs (3 cr) "! #$&% Antt Laukkanen 506 R ajlaukka@cc.hut.f 2 Introducton...3 2 Multattrbute

More information

K-means Optimization Clustering Algorithm Based on Hybrid PSO/GA Optimization and CS validity index

K-means Optimization Clustering Algorithm Based on Hybrid PSO/GA Optimization and CS validity index Orgnal Artcle Prnt ISSN: 3-6379 Onlne ISSN: 3-595X DOI: 0.7354/jss/07/33 K-means Optmzaton Clusterng Algorthm Based on Hybrd PSO/GA Optmzaton and CS valdty ndex K Jahanbn *, F Rahmanan, H Rezae 3, Y Farhang

More information

NGPM -- A NSGA-II Program in Matlab

NGPM -- A NSGA-II Program in Matlab Verson 1.4 LIN Song Aerospace Structural Dynamcs Research Laboratory College of Astronautcs, Northwestern Polytechncal Unversty, Chna Emal: lsssswc@163.com 2011-07-26 Contents Contents... 1. Introducton...

More information

The Codesign Challenge

The Codesign Challenge ECE 4530 Codesgn Challenge Fall 2007 Hardware/Software Codesgn The Codesgn Challenge Objectves In the codesgn challenge, your task s to accelerate a gven software reference mplementaton as fast as possble.

More information

Biological Sequence Mining Using Plausible Neural Network and its Application to Exon/intron Boundaries Prediction

Biological Sequence Mining Using Plausible Neural Network and its Application to Exon/intron Boundaries Prediction Bologcal Sequence Mnng Usng Plausble Neural Networ and ts Applcaton to Exon/ntron Boundares Predcton Kuochen L, Dar-en Chang, and Erc Roucha CECS, Unversty of Lousvlle, Lousvlle, KY 40292, USA Yuan Yan

More information

A Fast Content-Based Multimedia Retrieval Technique Using Compressed Data

A Fast Content-Based Multimedia Retrieval Technique Using Compressed Data A Fast Content-Based Multmeda Retreval Technque Usng Compressed Data Borko Furht and Pornvt Saksobhavvat NSF Multmeda Laboratory Florda Atlantc Unversty, Boca Raton, Florda 3343 ABSTRACT In ths paper,

More information

An Improved Neural Network Algorithm for Classifying the Transmission Line Faults

An Improved Neural Network Algorithm for Classifying the Transmission Line Faults 1 An Improved Neural Network Algorthm for Classfyng the Transmsson Lne Faults S. Vaslc, Student Member, IEEE, M. Kezunovc, Fellow, IEEE Abstract--Ths study ntroduces a new concept of artfcal ntellgence

More information

Fitting: Deformable contours April 26 th, 2018

Fitting: Deformable contours April 26 th, 2018 4/6/08 Fttng: Deformable contours Aprl 6 th, 08 Yong Jae Lee UC Davs Recap so far: Groupng and Fttng Goal: move from array of pxel values (or flter outputs) to a collecton of regons, objects, and shapes.

More information

Optimal Design of Nonlinear Fuzzy Model by Means of Independent Fuzzy Scatter Partition

Optimal Design of Nonlinear Fuzzy Model by Means of Independent Fuzzy Scatter Partition Optmal Desgn of onlnear Fuzzy Model by Means of Independent Fuzzy Scatter Partton Keon-Jun Park, Hyung-Kl Kang and Yong-Kab Km *, Department of Informaton and Communcaton Engneerng, Wonkwang Unversty,

More information

Chinese Word Segmentation based on the Improved Particle Swarm Optimization Neural Networks

Chinese Word Segmentation based on the Improved Particle Swarm Optimization Neural Networks Chnese Word Segmentaton based on the Improved Partcle Swarm Optmzaton Neural Networks Ja He Computatonal Intellgence Laboratory School of Computer Scence and Engneerng, UESTC Chengdu, Chna Department of

More information

Unsupervised Learning and Clustering

Unsupervised Learning and Clustering Unsupervsed Learnng and Clusterng Supervsed vs. Unsupervsed Learnng Up to now we consdered supervsed learnng scenaro, where we are gven 1. samples 1,, n 2. class labels for all samples 1,, n Ths s also

More information

Optimizing Document Scoring for Query Retrieval

Optimizing Document Scoring for Query Retrieval Optmzng Document Scorng for Query Retreval Brent Ellwen baellwe@cs.stanford.edu Abstract The goal of ths project was to automate the process of tunng a document query engne. Specfcally, I used machne learnng

More information

BOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET

BOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET 1 BOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET TZU-CHENG CHUANG School of Electrcal and Computer Engneerng, Purdue Unversty, West Lafayette, Indana 47907 SAUL B. GELFAND School

More information

Hybridization of Expectation-Maximization and K-Means Algorithms for Better Clustering Performance

Hybridization of Expectation-Maximization and K-Means Algorithms for Better Clustering Performance BULGARIAN ACADEMY OF SCIENCES CYBERNETICS AND INFORMATION TECHNOLOGIES Volume 16, No 2 Sofa 2016 Prnt ISSN: 1311-9702; Onlne ISSN: 1314-4081 DOI: 10.1515/cat-2016-0017 Hybrdzaton of Expectaton-Maxmzaton

More information

Steps for Computing the Dissimilarity, Entropy, Herfindahl-Hirschman and. Accessibility (Gravity with Competition) Indices

Steps for Computing the Dissimilarity, Entropy, Herfindahl-Hirschman and. Accessibility (Gravity with Competition) Indices Steps for Computng the Dssmlarty, Entropy, Herfndahl-Hrschman and Accessblty (Gravty wth Competton) Indces I. Dssmlarty Index Measurement: The followng formula can be used to measure the evenness between

More information

Imperialist Competitive Algorithm with Variable Parameters to Determine the Global Minimum of Functions with Several Arguments

Imperialist Competitive Algorithm with Variable Parameters to Determine the Global Minimum of Functions with Several Arguments Fourth Internatonal Conference Modellng and Development of Intellgent Systems October 8 - November, 05 Lucan Blaga Unversty Sbu - Romana Imperalst Compettve Algorthm wth Varable Parameters to Determne

More information

Assignment # 2. Farrukh Jabeen Algorithms 510 Assignment #2 Due Date: June 15, 2009.

Assignment # 2. Farrukh Jabeen Algorithms 510 Assignment #2 Due Date: June 15, 2009. Farrukh Jabeen Algorthms 51 Assgnment #2 Due Date: June 15, 29. Assgnment # 2 Chapter 3 Dscrete Fourer Transforms Implement the FFT for the DFT. Descrbed n sectons 3.1 and 3.2. Delverables: 1. Concse descrpton

More information

Collaboratively Regularized Nearest Points for Set Based Recognition

Collaboratively Regularized Nearest Points for Set Based Recognition Academc Center for Computng and Meda Studes, Kyoto Unversty Collaboratvely Regularzed Nearest Ponts for Set Based Recognton Yang Wu, Mchhko Mnoh, Masayuk Mukunok Kyoto Unversty 9/1/013 BMVC 013 @ Brstol,

More information

Cracking of the Merkle Hellman Cryptosystem Using Genetic Algorithm

Cracking of the Merkle Hellman Cryptosystem Using Genetic Algorithm Crackng of the Merkle Hellman Cryptosystem Usng Genetc Algorthm Zurab Kochladze 1 * & Lal Besela 2 1 Ivane Javakhshvl Tbls State Unversty, 1, I.Chavchavadze av 1, 0128, Tbls, Georga 2 Sokhum State Unversty,

More information

Unsupervised Neural Network Adaptive Resonance Theory 2 for Clustering

Unsupervised Neural Network Adaptive Resonance Theory 2 for Clustering 3 rd Internatonal G raduate Conference on Engneerng, Scence and Humantes (IGCESH) School of Graduate Studes Unverst Teknolog Malaysa 4 November 00 Unsupervsed Neural Network Adaptve Resonance Theory for

More information

Active Contours/Snakes

Active Contours/Snakes Actve Contours/Snakes Erkut Erdem Acknowledgement: The sldes are adapted from the sldes prepared by K. Grauman of Unversty of Texas at Austn Fttng: Edges vs. boundares Edges useful sgnal to ndcate occludng

More information

Determining Fuzzy Sets for Quantitative Attributes in Data Mining Problems
