IJCSI International Journal of Computer Science Issues, Vol. 9, Issue 3, No 2, May 2012
ISSN (Online):

Classifier Ensemble Design using Artificial Bee Colony based Feature Selection

Shunmugapriya Palanisamy 1 and Kanmani S 2

1 Research Scholar, Department of Computer Science and Engineering, Pondicherry Engineering College, Puducherry, India
2 Professor, Department of Information Technology, Pondicherry Engineering College, Puducherry, India

Abstract
Artificial Bee Colony (ABC) is a popular meta-heuristic search algorithm used in solving numerous combinatorial optimization problems. Feature Selection (FS) helps to speed up the process of classification by extracting the relevant and useful information from the dataset. FS is seen as an optimization problem because selecting the appropriate feature subset is very important. A classifier ensemble is the best solution for the pitfall of accuracy lag in a single classifier. This paper proposes a novel hybrid algorithm, ABCE, the combination of the ABC algorithm and a classifier ensemble (CE). A classifier ensemble consisting of Support Vector Machine (SVM), Decision Tree and Naïve Bayes performs the task of classification, and ABCE is used as a feature selector to select the most informative features as well as to increase the overall classification accuracy of the classifier ensemble. Ten UCI (University of California, Irvine) benchmark datasets have been used for the evaluation of the proposed algorithm. Three ensembles, ABC-CE, ABC-Bagging and ABC-Boosting, have been constructed from the finally selected feature subsets. From the experimental results, it can be seen that these ensembles have shown up to a 12% increase in classification accuracy compared to the constituent classifiers and the standard ensembles Bagging, Boosting, ACO-Bagging and ACO-Boosting.

Keywords: Feature Selection, Classification, Classifier Ensemble, Ant Colony Optimization, Bee Colony Optimization, Artificial Bee Colony, Meta-heuristic search.
1. Introduction

Ensemble Learning has been a great topic of research during the last decade, and a vast amount of work has been carried out in the domain of Classifier Ensembles (CE) [2], [3]. A classifier ensemble is the combination of two or more classification algorithms, and it is formed as a solution to overcome the accuracy limitations of a single classifier. Bagging, Boosting, Stacking, Majority Voting, Behavioral Knowledge Space and Wernecke's are some popular ensemble techniques. When the constituent classifiers in a CE are of the same type, it is called a homogeneous CE; otherwise, a heterogeneous CE [2], [3]. In a CE, each constituent classifier is trained over the entire feature space. Sometimes the feature space is noisy, consisting of irrelevant and redundant data [1]. In such cases, the classifier consumes more time in training and its misclassification rates are higher. Feature Selection (FS) is a possible solution to this problem. Feature selection extracts the necessary and relevant data from the feature space without affecting the originality of its representation. With FS, the performance of the classifier is improved, thereby improving the efficiency of the ensemble [10]. Feature selection has been widely used in the construction of ensembles [2]. While employing FS for ensemble construction, results are better when the FS is optimized. Swarm and evolutionary algorithms are used for optimizing feature selection, resulting in an optimal feature subset. In the literature, Genetic Algorithm, Ant Colony Optimization, Bee Colony Optimization and Particle Swarm Optimization have been used in numerous applications for optimizing FS [8], [9], [17], [18]. ABC is a stochastic, swarm-intelligent algorithm proposed by Karaboga et al. for constrained optimization problems [4]. Since its proposal, ABC has proved successful in solving optimization problems in numerous application domains. ABC has also been shown to give promising and enhanced results in areas where Genetic Algorithm and Ant Colony Optimization have already given results [4], [5], [6], [7].
In order to enhance the classification accuracy, different algorithms for pattern classification [1], different techniques for feature selection and a number of classifier ensemble methodologies [2], [3] have been proposed and implemented so far. The main limitation of these methods is that none of them gives a consistent performance over all datasets [8]. The proposed method is likewise an effort towards efficient feature selection optimization and ensemble construction. In this study, classifier ensembles are constructed using the optimal feature subset obtained from the combination of a classifier ensemble and the Artificial Bee Colony (ABC) algorithm. ABC is used to select the features and generate the feature subsets, and these feature subsets are evaluated for efficiency by an ensemble made up of the classifiers Decision Tree (DT), Naïve Bayes (NB) and Support Vector Machine (SVM). Each time ABC generates different feature subsets, the CE uses the average of the mean accuracy of the ensemble and the consensus as the fitness measure to select the feature subset. This paper is organized in six sections. Section 2 describes feature selection and its types. In Section 3, a brief description of the Artificial Bee Colony algorithm is presented. Section 4 outlines the proposed method ABCE: the ABC based feature selection and the ensemble construction are explained in this section. The experiments and results are discussed in Section 5, and the paper is concluded in Section 6.

Copyright (c) 2012 International Journal of Computer Science Issues. All Rights Reserved.

2. FEATURE SELECTION (FS)

Feature selection is viewed as an important preprocessing step for different data mining tasks, especially pattern classification [10], [11], [12]. When the dimensionality of the feature space is very high, FS is used to extract the informative features from the feature space, and the uninformative ones are removed. Otherwise, the uninformative features tend to increase the complexity of computation by introducing noisy and redundant data into the process. With FS, the features are ranked based on their importance, making the feature set more suitable for classification without affecting the original feature representation or the accuracy of prediction [10].
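As an illustration of ranking features by importance, a simple filter-style score can be sketched as follows; this uses the absolute Pearson correlation of each feature with the class label, and the toy data values are made up for the example (the paper itself ranks features with the ensemble, not with this score).

```python
def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

def rank_features(columns, labels):
    """Rank feature columns by |correlation with the label|, best first."""
    scores = [(abs(pearson_r(col, labels)), i) for i, col in enumerate(columns)]
    return [i for _, i in sorted(scores, reverse=True)]

# Toy data: feature 0 tracks the label closely, feature 1 is mostly noise.
f0 = [0.1, 0.9, 0.2, 0.8, 0.15, 0.85]
f1 = [0.5, 0.4, 0.6, 0.5, 0.55, 0.45]
y  = [0, 1, 0, 1, 0, 1]
order = rank_features([f0, f1], y)  # feature 0 ranks ahead of feature 1
```

A filter score like this is cheap but classifier-agnostic; the wrapper approach discussed next replaces the score with the accuracy of an actual classifier.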
It has been shown in the literature that classifications done with feature subsets obtained by FS have higher prediction accuracy than classifications carried out without FS [19]. A number of algorithms have been proposed to implement FS. FS algorithms related to pattern classification fall into two categories: the filter approach and the wrapper approach. When the process of FS is independent of any learning algorithm, it is called a filter approach. It depends on the general characteristics of the training data and uses measures such as distance, information, dependency and consistency to evaluate the selected feature subsets [10], [20]. On the other hand, when a classifier is involved, it is called a wrapper approach. The feature subset that results from a wrapper approach depends on the classification method used, and two different classifiers can lead to two different feature subsets. Compared to the filter approach, the feature subsets obtained through a wrapper are generally more effective, but wrapper selection is a time-consuming process [20], [24]. Independently of the filter and wrapper approaches, evolutionary algorithms are also used for searching for the best subset of features through the entire feature space [8], [9], [17], [18].

3. The Artificial Bee Colony Algorithm (ABC)

ABC is a swarm-intelligent, meta-heuristic search algorithm proposed by Karaboga [4], and since then it has been used widely in many fields for solving optimization problems [5], [6], [7], [17], [18]. ABC is inspired by the foraging behavior of honey bee swarms. The ABC algorithm employs three types of bees in the colony: employed bees, onlooker bees and scout bees. Initially, the food source positions (N) are generated, and the population of employed bees is equal to the number of food sources. Each food source represents a solution of the optimization problem. Each employed bee is assigned a food source; the employed bees exploit the food sources and pass the information about nectar content to the onlookers. The number of onlookers is equal to the number of employed bees.
Based on the information gained, the onlookers exploit the food sources and their neighborhoods until the food sources become exhausted. The employed bee of an exhausted food source becomes a scout. Scouts then start searching for new food source positions. The nectar information represents the quality of the solution available from the food source. An increased amount of nectar increases the probability of a particular food source being selected by the onlookers [5]. The ABC algorithm is given in Fig. 1 [4].

1. Initialize the food source positions
2. Evaluate the food sources
3. Produce new food sources (solutions) for the employed bees
4. Apply greedy selection
5. Calculate the fitness and probability values
6. Produce new food sources for the onlookers
7. Apply greedy selection
8. Determine the food source to be abandoned and allocate its employed bee as a scout for searching for new food sources
9. Memorize the best food source found
10. Repeat steps 3-9 for a pre-determined number of iterations

Fig. 1 Steps of the ABC algorithm
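The loop of Fig. 1 can be sketched in Python for a generic minimization problem. This is a minimal illustration, not the paper's implementation: the objective, bounds, colony size, iteration count and the fitness mapping 1/(1+cost) (which assumes non-negative costs) are all illustrative assumptions.

```python
import random

def abc_minimize(objective, dim, lower, upper, n_sources=10, n_iter=100, limit=20):
    # Step 1: initialize food source positions (candidate solutions).
    sources = [[random.uniform(lower, upper) for _ in range(dim)]
               for _ in range(n_sources)]
    costs = [objective(s) for s in sources]      # Step 2: evaluate the sources
    trials = [0] * n_sources                     # stagnation counter per source
    best = min(sources, key=objective)[:]

    def neighbour(i):
        # Candidate v_ij = x_ij + phi * (x_ij - x_kj), perturbing one dimension.
        k = random.choice([s for s in range(n_sources) if s != i])
        j = random.randrange(dim)
        v = sources[i][:]
        v[j] += random.uniform(-1.0, 1.0) * (sources[i][j] - sources[k][j])
        v[j] = min(max(v[j], lower), upper)
        return v

    def greedy(i, v):
        # Steps 4 and 7: keep the better of the current source and the candidate.
        nonlocal best
        cv = objective(v)
        if cv < costs[i]:
            sources[i], costs[i], trials[i] = v, cv, 0
        else:
            trials[i] += 1
        if costs[i] < objective(best):
            best = sources[i][:]

    for _ in range(n_iter):
        for i in range(n_sources):               # Step 3: employed bee phase
            greedy(i, neighbour(i))
        fit = [1.0 / (1.0 + c) for c in costs]   # Step 5: fitness values
        total = sum(fit)
        for _ in range(n_sources):               # Step 6: onlooker bee phase
            r, acc, pick = random.uniform(0, total), 0.0, n_sources - 1
            for i, f in enumerate(fit):          # roulette-wheel selection
                acc += f
                if acc >= r:
                    pick = i
                    break
            greedy(pick, neighbour(pick))
        for i in range(n_sources):               # Step 8: scout phase
            if trials[i] > limit:
                sources[i] = [random.uniform(lower, upper) for _ in range(dim)]
                costs[i], trials[i] = objective(sources[i]), 0
    return best                                  # Step 9: best source memorized
```

For example, `abc_minimize(lambda x: sum(v * v for v in x), dim=3, lower=-5.0, upper=5.0)` drives a sphere objective towards the origin.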
4. The Artificial Bee Classifier Ensemble (ABCE)

In the proposed study, the ensemble is constructed as a combination of the ABC algorithm and a CE consisting of the classifiers Decision Tree, Naïve Bayes and Support Vector Machine [1], [16]. In ABCE, the ABC algorithm is used as the feature selector and feature subset generator; the ensemble classifier is used as the evaluator of the generated feature subsets. The classifier ensemble helps ABC in picking the best feature subset by evaluating each configuration suggested by ABC. The ABC algorithm helps in efficient CE construction by suggesting the best feature subset for the ensemble to work with. Hence both ABC and the CE try to enhance the performance of each other in the proposed method. The steps of ABCE are given in Fig. 2.

4.1 ABC Feature Selector

The ABC algorithm is used to optimize the process of feature selection and to increase the predictive accuracy of the classifier ensemble. First, the classifier ensemble (made up of DT, SVM and NB) is used to evaluate the discriminating ability of each feature F_i in the dataset. Then, the ensemble accuracy (x_i) of each feature F_i is calculated by employing 10-fold cross-validation [2], [3] for each of the classifiers. Each employed bee is assigned a binary bit string made of 0s and 1s. The length of the binary bit string is equal to the number of features in the dataset, and it is used to represent the feature selection made by each employed bee: a 1 means the feature is selected and a 0 means the feature is not selected. The populations of the employed bees and the onlooker bees are equal to the feature size (m) of the dataset, as features are considered as food sources here.

1. Cycle = 1
2. Initialize the ABC parameters
3. Evaluate the fitness of each individual feature
4. Repeat
5. Construct solutions by the employed bees
      For i = 1 to m
         Assign a feature subset configuration (binary bit string) to each employed bee
         Produce new feature subsets
         Pass the produced feature subset to the classifier ensemble
         Evaluate the fitness (fit_i) of the feature subset based on the ensemble's mean accuracy and consensus
         Calculate the probability p_i of the feature subset solution
6. Construct solutions by the onlookers
      For i = 1 to m
         Select a feature j based on the probability p_i
         Compute v_i using x_i and x_j
         Apply greedy selection between v_i and x_i
7. Determine the scout bee and the abandoned solution
8. Calculate the best feature subset of the cycle
9. Memorize the best optimal feature subset
10. Cycle = Cycle + 1
11. Until the pre-determined number of cycles is reached
12. Employ the same searching procedure of the bees to generate the optimal feature subset configurations
13. Construct the ensembles ABC-CE, ABC-Bagging and ABC-Boosting using the best optimal feature subset

Fig. 2 Steps of ABCE

Each employed bee is allocated a feature, and it evaluates the fitness of the feature by using the mean accuracy of the ensemble and the consensus [8]. The fitness of each feature subset s pointed to by an employed bee is calculated using equations (1), (2) and (3):

   fitness1(s) = (1/m) * sum_{i=1..m} accuracy_i(s)    (1)
   fitness2(s) = consensus(s)                           (2)
   fit(s) = (fitness1(s) + fitness2(s)) / 2             (3)

where accuracy_i(s) is the predictive accuracy of the i-th classifier in the ensemble and consensus(s) specifies the classification accuracy by consensus upon the feature subset s [8]. The first part of the fitness (the mean accuracy) checks whether the feature subset has superior power for accurate classification with the whole classifier ensemble and aims to optimize it; the mean accuracy thus helps in increasing the generalization ability of the feature subset. The second part of the fitness (the consensus) checks the optimality of the feature subset in producing high-consensus classification [8].
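As a sketch, the binary subset encoding and the fitness of equations (1)-(3) can be written as below; the accuracy values passed in are stand-ins for the per-classifier 10-fold accuracies that the paper obtains from DT, NB and SVM, and the numbers in the example are made up.

```python
def subset_fitness(bits, classifier_accuracies, consensus_accuracy):
    """Fitness of a binary-encoded feature subset, as in equations (1)-(3).

    bits                  -- list of 0/1 flags, one per feature (1 = selected)
    classifier_accuracies -- accuracy of each ensemble member on this subset
    consensus_accuracy    -- accuracy of the consensus classification
    """
    assert any(bits), "an empty feature subset cannot be evaluated"
    fitness1 = sum(classifier_accuracies) / len(classifier_accuracies)  # eq. (1)
    fitness2 = consensus_accuracy                                       # eq. (2)
    return (fitness1 + fitness2) / 2.0                                  # eq. (3)

# Hypothetical accuracies for the subset selecting features 0, 2 and 3:
fit = subset_fitness([1, 0, 1, 1], [0.82, 0.78, 0.86], 0.75)  # ~0.785
```

The averaging in equation (3) weights generalization (mean accuracy) and agreement (consensus) equally, which matches the paper's description of the fitness measure.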
The onlooker bee gains information from the employed bee and calculates the probability of selecting a feature using equation (4). Then the onlooker computes the new solution v_i using the ensemble accuracies of the feature the employed bee is pointing to and the feature the onlooker bee has selected. If the new solution v_i is greater than x_i, the employed bee will point to the feature subset consisting of the feature it was previously pointing to together with the newly selected feature. If v_i is not greater than x_i, the employed bee's feature is retained and the newly selected feature is discarded. The new solution v_i is computed using equation (5).

   p_i = fit_i / sum_{j=1..m} fit_j        (4)
   v_i = x_i + phi * (x_i - x_j)           (5)

where x_i is the ensemble accuracy of the feature allocated to the employed bee and x_j is the ensemble accuracy of the feature the onlooker has selected. phi is a uniformly distributed real random number in the range [0, 1]. In this way, each time the employed bee is assigned a new feature subset, the onlooker exploits it and tries to produce a new feature subset configuration. After all possible features have been exploited for forming the feature subset, the nectar content accumulates towards the better feature subset configurations. If any employed bee has not improved, it becomes a scout. The scout is assigned a new binary feature set based on equation (6):

   x_i = x_min + rand[0, 1] * (x_max - x_min)    (6)

where x_min and x_max represent the lower and upper bounds of the dimension of the population. The bees keep executing the same procedure for a pre-determined number of runs to form the best feature subset. Hence ABC is used to select and rank different features based on their importance, so relevant features are extracted and the computational complexity due to irrelevant and noisy features is greatly reduced.
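Equations (4)-(6) amount to fitness-proportional (roulette-wheel) selection plus a bounded random move and a random re-initialization. A small sketch under those definitions, with the function names and example values being illustrative rather than the paper's code:

```python
import random

def select_index(fitnesses):
    """Roulette-wheel selection with p_i = fit_i / sum(fit), as in equation (4)."""
    total = sum(fitnesses)
    r = random.uniform(0, total)
    acc = 0.0
    for i, f in enumerate(fitnesses):
        acc += f
        if acc >= r:
            return i
    return len(fitnesses) - 1  # guard against floating-point round-off

def onlooker_update(x, i, j):
    """Candidate solution v_i = x_i + phi * (x_i - x_j), as in equation (5)."""
    phi = random.random()      # uniform in [0, 1], as stated in the text
    return x[i] + phi * (x[i] - x[j])

def scout_reinit(x_min, x_max):
    """Scout re-initialization of equation (6) within the population bounds."""
    return x_min + random.random() * (x_max - x_min)
```

Note that with phi drawn from [0, 1], the candidate in `onlooker_update` always lies between x_i and x_i + (x_i - x_j), so the move is bounded by the two accuracies involved.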
Apart from this, for large datasets, especially those with a large number of features, the performance of classifiers is affected because more features must be handled. By using ABC, the number of features is scaled down based on their importance, and the computational speed of the classifier is increased.

4.2 The Ensemble Classifier

The classifiers Decision Tree, Naïve Bayes and SVM are put together to form the classifier ensemble in the proposed method. As the bees execute their searching procedure, the feature subset selected by each bee is input to the classifier ensemble. The three classifiers consider the candidate feature subsets one at a time, are trained with the combination of features, and classify the test set. After the classifiers have finished, the ABC algorithm calculates the mean accuracy of the classifier ensemble and the consensus using equations (1) and (2). The fitness is then calculated as the average of the mean accuracy and the consensus. The fitness (fit_i) is used as the evaluation criterion for selecting the best feature subset combination. In the proposed method, the classifier ensembles ABC-CE, ABC-Bagging and ABC-Boosting are constructed using the finally selected feature subset. ABC-CE is formed by the majority vote [2] of the three classifiers: Decision Tree, Naïve Bayes and SVM. Bagging [2], [3], [13] and Boosting are well-known CE methods which have been used in numerous pattern classification domains [2], [3], [14]. ABC-Bagging is constructed by combining C4.5 Bagging with the ABC-selected feature subset. ABC-Boosting is constructed by boosting the C4.5 decision tree along with the ABC-selected feature subset.

5. Experiments and Results

The datasets used, the implementation and the results of ABCE are discussed in this section.

5.1 Datasets

The performance of the proposed method ABCE discussed in this study has been implemented and tested using 10 different medical datasets.
Heart-C, Dermatology, Hepatitis, Lung Cancer, Pima Indian Diabetes, Iris, Wisconsin Breast Cancer, Lymphography, Diabetes and Statlog-Heart are the datasets used. These datasets are taken from the UCI machine learning repository
[15] and their description is given in Table 1. The reason for selecting these datasets is that they have been used in numerous classifier ensemble and feature selection proposals for experimental evaluation. The datasets are chosen such that the number of features varies over a wide range and the number of instances is large, so that the effect of feature selection by ABCE is easily visible.

Table 1: Datasets Description

Dataset / Instances / Features / Classes
Heart-C
Dermatology
Hepatitis
Lung Cancer
Pima
Iris
Wisconsin
Lymph
Diabetes
Heart-Statlog

5.2 Implementation of ABCE

Classifications of the datasets are implemented using the WEKA software from Waikato University [16], and feature selection using ABC has been implemented using the NetBeans IDE. Decision Tree is implemented using the J48 algorithm, SVM by the LIBSVM package, and Naïve Bayes by the Naïve Bayes classification algorithm from WEKA. The artificial bees search for the best feature subset configuration with the following parameter initializations for ABC:

   Population size (p): 2 * number of features in the dataset
   Dimension of the population: p x N
   Lower bound: 1
   Upper bound: N
   Maximum number of iterations: equal to the number of features
   Number of runs: 10
   phi: 0.3

With these parameter settings, the best optimal feature subset is recorded after executing the specified number of cycles. After every iteration, the employed bees pass the selected features to the classifier ensemble for evaluation. The mean accuracy of the classifiers in the ensemble and the consensus upon the feature subset are calculated using equations (1) and (2). The fitness measure for each feature subset, the average of the mean accuracy and the consensus, is calculated using equation (3). The onlookers decide upon a feature subset with a probability which depends on the fitness. The number of features selected and the ensemble accuracy of ABCE are given in Table 2.

Table 2: Feature Selection and Ensemble Accuracy Achieved through ABCE
Dataset / No. of Features / Features Selected by ABCE / Predictive Accuracy (ABCE) (%)
Heart-C
Dermatology
Hepatitis
Lung Cancer
Pima
Iris
Wisconsin
Lymph
Diabetes
Heart-Statlog

Three classifier ensembles, ABC-CE, ABC-Bagging and ABC-Boosting, are then constructed using the optimal feature subset selected by the proposed ABCE method. The classification accuracies achieved by these three ensembles are given in Table 3. In Table 3, the performance of ABCE is also compared with the ACO-based ensembles, Bagging C4.5 and Boosting C4.5. 10-fold cross-validation has been used to evaluate the accuracy of the constructed ensembles [1], [2], [3]. When the ABCE method is applied to the datasets and the ensembles are constructed using the features output by ABCE, the recognition rates for all ten datasets improve significantly, as shown in Fig. 3. From the data represented in Table 2, Table 3 and Fig. 3, it can be inferred that:

i. Feature selection definitely increases the classification accuracy and speeds up the process of classification.
ii. For all datasets except Hepatitis and Diabetes, ABCE has given the highest recognition rates.
iii. For Hepatitis, Boosting has given the highest accuracy, and ABCE has performed better than ACO.
iv. For Diabetes, ACO has the leading performance, and the accuracy of ABCE is marginally lower than that of ACO.
v. For Heart-C, Iris, Pima and Wisconsin, the feature subset obtained is almost of the same size as with ACO.
vi. For Lung Cancer, Lymph and Statlog, the size of the feature set is minimized to a much greater degree with good prediction accuracies, which clearly demonstrates the effectiveness of the proposed method.
vii. Convergence over the search space is achieved quickly.
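The 10-fold cross-validation used to score the ensembles can be sketched as follows; `evaluate` is a placeholder callback standing in for training the ensemble on the training folds and measuring accuracy on the held-out fold.

```python
def k_fold_indices(n_samples, k=10):
    """Split range(n_samples) into k contiguous, near-equal folds."""
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0)
                  for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def cross_val_accuracy(evaluate, n_samples, k=10):
    """Mean accuracy over k folds; evaluate(train_idx, test_idx) -> accuracy."""
    folds = k_fold_indices(n_samples, k)
    scores = []
    for i, test_idx in enumerate(folds):
        # Train on every fold except the i-th, test on the i-th.
        train_idx = [j for f in folds[:i] + folds[i + 1:] for j in f]
        scores.append(evaluate(train_idx, test_idx))
    return sum(scores) / k
```

In practice the folds would be shuffled (and usually stratified by class) before splitting; the contiguous split here keeps the sketch short.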
Table 3: Classification Accuracy of the Ensembles by 10-Fold Cross-Validation

Dataset / Bagging (C4.5) / Boosting (C4.5) / ACO-Bagging / ACO-Boosting / ABC-CE / ABC-Bagging / ABC-Boosting
Heart-C
Dermatology
Hepatitis
Lung Cancer
Pima
Iris
Lymph
Wisconsin
Diabetes
Heart-Statlog
(All numbers are in percentages.)

Fig. 3 Graph showing the comparison of predictions for the ten UCI datasets by the constituent classifiers, traditional ensembles and the ensembles constructed using ACO and ABCE
viii. In the graphical representation, the curve tends to rise for the ABCE methods for most of the datasets, which shows the improvement given by the proposed method.

6. Conclusion

In this paper, a new classifier ensemble method, ABCE, has been proposed and implemented. ABCE is proposed by combining the multi-objective ABC with a Classifier Ensemble (CE) and has been used to optimize the feature selection process. This method has resulted in the optimal selection of feature subsets, and the effectiveness of the proposed method can be seen from the results obtained. The ensembles ABC-CE, ABC-Bagging and ABC-Boosting developed using the selected feature subsets have given classification accuracies up to 12% higher than the constituent classifiers and the ensembles Bagging, Boosting, ACO-Bagging and ACO-Boosting.

References
[1] R.O. Duda, P.E. Hart and D.G. Stork, Pattern Recognition, John Wiley & Sons, Inc., 2nd edition.
[2] L.I. Kuncheva, Combining Pattern Classifiers: Methods and Algorithms, Wiley Interscience.
[3] R. Polikar, "Ensemble based systems in decision making", IEEE Circuits and Systems Magazine, Vol. 6, No. 3.
[4] D. Karaboga and B. Basturk, "A Powerful and Efficient Algorithm for Numerical Function Optimization: Artificial Bee Colony (ABC) Algorithm", Journal of Global Optimization, Springer Netherlands, Vol. 39, No. 3, 2007.
[5] D. Karaboga, "An Idea Based on Honey Bee Swarm for Numerical Optimization", Technical Report TR06, Erciyes University, Engineering Faculty, Computer Engineering Department.
[6] D. Karaboga and B. Basturk, "Artificial Bee Colony (ABC) Optimization Algorithm for Solving Constrained Optimization Problems", LNCS: Advances in Soft Computing: Foundations of Fuzzy Logic and Soft Computing, Springer-Verlag, Vol. 4529/2007.
[7] Fei Kang, Junjie Li, Haojin Li, Zhenyue Ma and Qing Xu, "An Improved Artificial Bee Colony Algorithm", Proc. IEEE International Workshop on Intelligent Systems and Applications, 2010, pp. 1-4.
[8] Zili Zhang and Pengyi Yang, "An Ensemble of Classifiers with Genetic Algorithm Based Feature Selection", IEEE Intelligent Informatics Bulletin, Vol. 9, No. 1, 2008.
[9] Nadia Abd-Alsabour and Marcus Randall, "Feature Selection for Classification Using an Ant Colony System", Proc. Sixth IEEE International Conference on e-Science Workshops, 2010.
[10] M. Dash and H. Liu, "Feature Selection for Classification", Intelligent Data Analysis, Vol. 39, No. 1, 1997.
[11] I.H. Witten and E. Frank, Data Mining: Practical Machine Learning Tools and Techniques, San Francisco: Morgan Kaufmann, 2005.
[12] J. Han and M. Kamber, Data Mining: Concepts and Techniques, Academic Press.
[13] L. Breiman, "Bagging predictors", Machine Learning, Vol. 24, No. 2, 1996.
[14] Y. Freund and R.E. Schapire, "Experiments with a new boosting algorithm", Proc. Thirteenth International Conference on Machine Learning, 1996.
[15] A. Frank and A. Asuncion, UCI Machine Learning Repository, Irvine, CA: University of California, School of Information and Computer Science, 2010.
[16] WEKA: A Java Machine Learning Package, ml/weka/.
[17] N. Suguna and K.G. Thanushkodi, "A Novel Rough Set Reduct Algorithm for Medical Domain based on Bee Colony Optimization", Journal of Computing, Vol. 2(6), 2010.
[18] N. Suguna and K.G. Thanushkodi, "An Independent Rough Set Approach Hybrid with Artificial Bee Colony Algorithm for Dimensionality Reduction", American Journal of Applied Sciences, 8(3), 2011.
[19] Laura E.A. Santana, Ligia Silva, Anne M.P. Canuto, Fernando Pintro and Karliane O. Vale, "A Comparative Analysis of Genetic Algorithm and Ant Colony Optimization to Select Attributes for a Heterogeneous Ensemble of Classifiers", Proc. IEEE Congress on Evolutionary Computation, 2010.
[20] Shixin Yu, Feature Selection and Classifier Ensembles: A Study on Hyperspectral Remote Sensing Data, Ph.D. Thesis, The University of Antwerp.

P. Shunmugapriya received her M.E. degree in 2006 from the Department of Computer Science and Engineering, FEAT, Annamalai University, Chidambaram.
She had been working as a Senior Lecturer for the past 7 years in the Department of Computer Science and Engineering, Sri Manakula Vinayagar Engineering College, affiliated to Pondicherry University, Puducherry. Currently she is working towards her Ph.D. degree in the optimal design of classifier ensembles using swarm-intelligent, meta-heuristic search algorithms. Her areas of interest are Artificial Intelligence, Ontology based Software Engineering, Classifier Ensembles and Swarm Intelligence.

Dr. S. Kanmani received her B.E. and M.E. in Computer Science and Engineering from Bharathiyar University and her Ph.D. from Anna University, Chennai. She has been a faculty member of the Department of Computer Science and Engineering, Pondicherry Engineering College since 1992. Presently she is a Professor in the Department of Information Technology, Pondicherry Engineering College, Puducherry. Her research interests are Software Engineering, Software Testing, Object Oriented Systems, and Data Mining. She is a member of the Computer Society of India, ISTE and the Institution of Engineers, India. She has published about 50 papers in various international conferences and journals.
More informationStructural optimization using artificial bee colony algorithm
2 nd Internatonal Conference on Engneerng Optmzaton September 6-9, 2010, Lsbon, ortugal Structural optmzaton usng artfcal bee colony algorthm Al Hadd 1, Sna Kazemzadeh Azad 2, Saed Kazemzadeh Azad Department
More informationAn Improved Particle Swarm Optimization for Feature Selection
Journal of Bonc Engneerng 8 (20)?????? An Improved Partcle Swarm Optmzaton for Feature Selecton Yuannng Lu,2, Gang Wang,2, Hulng Chen,2, Hao Dong,2, Xaodong Zhu,2, Sujng Wang,2 Abstract. College of Computer
More informationIncremental Learning with Support Vector Machines and Fuzzy Set Theory
The 25th Workshop on Combnatoral Mathematcs and Computaton Theory Incremental Learnng wth Support Vector Machnes and Fuzzy Set Theory Yu-Mng Chuang 1 and Cha-Hwa Ln 2* 1 Department of Computer Scence and
More informationChinese Word Segmentation based on the Improved Particle Swarm Optimization Neural Networks
Chnese Word Segmentaton based on the Improved Partcle Swarm Optmzaton Neural Networks Ja He Computatonal Intellgence Laboratory School of Computer Scence and Engneerng, UESTC Chengdu, Chna Department of
More informationOutline. Type of Machine Learning. Examples of Application. Unsupervised Learning
Outlne Artfcal Intellgence and ts applcatons Lecture 8 Unsupervsed Learnng Professor Danel Yeung danyeung@eee.org Dr. Patrck Chan patrckchan@eee.org South Chna Unversty of Technology, Chna Introducton
More informationSmoothing Spline ANOVA for variable screening
Smoothng Splne ANOVA for varable screenng a useful tool for metamodels tranng and mult-objectve optmzaton L. Rcco, E. Rgon, A. Turco Outlne RSM Introducton Possble couplng Test case MOO MOO wth Game Theory
More informationProblem Definitions and Evaluation Criteria for Computational Expensive Optimization
Problem efntons and Evaluaton Crtera for Computatonal Expensve Optmzaton B. Lu 1, Q. Chen and Q. Zhang 3, J. J. Lang 4, P. N. Suganthan, B. Y. Qu 6 1 epartment of Computng, Glyndwr Unversty, UK Faclty
More informationA Two-Stage Algorithm for Data Clustering
A Two-Stage Algorthm for Data Clusterng Abdolreza Hatamlou 1 and Salwan Abdullah 2 1 Islamc Azad Unversty, Khoy Branch, Iran 2 Data Mnng and Optmsaton Research Group, Center for Artfcal Intellgence Technology,
More informationCHAPTER 3 SEQUENTIAL MINIMAL OPTIMIZATION TRAINED SUPPORT VECTOR CLASSIFIER FOR CANCER PREDICTION
48 CHAPTER 3 SEQUENTIAL MINIMAL OPTIMIZATION TRAINED SUPPORT VECTOR CLASSIFIER FOR CANCER PREDICTION 3.1 INTRODUCTION The raw mcroarray data s bascally an mage wth dfferent colors ndcatng hybrdzaton (Xue
More informationContent Based Image Retrieval Using 2-D Discrete Wavelet with Texture Feature with Different Classifiers
IOSR Journal of Electroncs and Communcaton Engneerng (IOSR-JECE) e-issn: 78-834,p- ISSN: 78-8735.Volume 9, Issue, Ver. IV (Mar - Apr. 04), PP 0-07 Content Based Image Retreval Usng -D Dscrete Wavelet wth
More informationSolving two-person zero-sum game by Matlab
Appled Mechancs and Materals Onlne: 2011-02-02 ISSN: 1662-7482, Vols. 50-51, pp 262-265 do:10.4028/www.scentfc.net/amm.50-51.262 2011 Trans Tech Publcatons, Swtzerland Solvng two-person zero-sum game by
More informationConcurrent Apriori Data Mining Algorithms
Concurrent Apror Data Mnng Algorthms Vassl Halatchev Department of Electrcal Engneerng and Computer Scence York Unversty, Toronto October 8, 2015 Outlne Why t s mportant Introducton to Assocaton Rule Mnng
More informationYan et al. / J Zhejiang Univ-Sci C (Comput & Electron) in press 1. Improving Naive Bayes classifier by dividing its decision regions *
Yan et al. / J Zhejang Unv-Sc C (Comput & Electron) n press 1 Journal of Zhejang Unversty-SCIENCE C (Computers & Electroncs) ISSN 1869-1951 (Prnt); ISSN 1869-196X (Onlne) www.zju.edu.cn/jzus; www.sprngerlnk.com
More informationClassification / Regression Support Vector Machines
Classfcaton / Regresson Support Vector Machnes Jeff Howbert Introducton to Machne Learnng Wnter 04 Topcs SVM classfers for lnearly separable classes SVM classfers for non-lnearly separable classes SVM
More informationNon-Negative Matrix Factorization and Support Vector Data Description Based One Class Classification
IJCSI Internatonal Journal of Computer Scence Issues, Vol. 9, Issue 5, No, September 01 ISSN (Onlne): 1694-0814 www.ijcsi.org 36 Non-Negatve Matrx Factorzaton and Support Vector Data Descrpton Based One
More informationRecommended Items Rating Prediction based on RBF Neural Network Optimized by PSO Algorithm
Recommended Items Ratng Predcton based on RBF Neural Network Optmzed by PSO Algorthm Chengfang Tan, Cayn Wang, Yuln L and Xx Q Abstract In order to mtgate the data sparsty and cold-start problems of recommendaton
More informationOptimal Design of Nonlinear Fuzzy Model by Means of Independent Fuzzy Scatter Partition
Optmal Desgn of onlnear Fuzzy Model by Means of Independent Fuzzy Scatter Partton Keon-Jun Park, Hyung-Kl Kang and Yong-Kab Km *, Department of Informaton and Communcaton Engneerng, Wonkwang Unversty,
More informationA Serial and Parallel Genetic Based Learning Algorithm for Bayesian Classifier to Predict Metabolic Syndrome
A Seral and Parallel Genetc Based Learnng Algorthm for Bayesan Classfer to Predct Metabolc Syndrome S. Dehur Department of Informaton and Communcaton Technology Fakr Mohan Unversty, Vyasa Vhar Balasore-756019,
More informationEdge Detection in Noisy Images Using the Support Vector Machines
Edge Detecton n Nosy Images Usng the Support Vector Machnes Hlaro Gómez-Moreno, Saturnno Maldonado-Bascón, Francsco López-Ferreras Sgnal Theory and Communcatons Department. Unversty of Alcalá Crta. Madrd-Barcelona
More informationUsing Fuzzy Logic to Enhance the Large Size Remote Sensing Images
Internatonal Journal of Informaton and Electroncs Engneerng Vol. 5 No. 6 November 015 Usng Fuzzy Logc to Enhance the Large Sze Remote Sensng Images Trung Nguyen Tu Huy Ngo Hoang and Thoa Vu Van Abstract
More informationComparison of Heuristics for Scheduling Independent Tasks on Heterogeneous Distributed Environments
Comparson of Heurstcs for Schedulng Independent Tasks on Heterogeneous Dstrbuted Envronments Hesam Izakan¹, Ath Abraham², Senor Member, IEEE, Václav Snášel³ ¹ Islamc Azad Unversty, Ramsar Branch, Ramsar,
More informationDetection of an Object by using Principal Component Analysis
Detecton of an Object by usng Prncpal Component Analyss 1. G. Nagaven, 2. Dr. T. Sreenvasulu Reddy 1. M.Tech, Department of EEE, SVUCE, Trupath, Inda. 2. Assoc. Professor, Department of ECE, SVUCE, Trupath,
More informationTerm Weighting Classification System Using the Chi-square Statistic for the Classification Subtask at NTCIR-6 Patent Retrieval Task
Proceedngs of NTCIR-6 Workshop Meetng, May 15-18, 2007, Tokyo, Japan Term Weghtng Classfcaton System Usng the Ch-square Statstc for the Classfcaton Subtask at NTCIR-6 Patent Retreval Task Kotaro Hashmoto
More informationComplexity Analysis of Problem-Dimension Using PSO
Proceedngs of the 7th WSEAS Internatonal Conference on Evolutonary Computng, Cavtat, Croata, June -4, 6 (pp45-5) Complexty Analyss of Problem-Dmenson Usng PSO BUTHAINAH S. AL-KAZEMI AND SAMI J. HABIB,
More informationParameters Optimization of SVM Based on Improved FOA and Its Application in Fault Diagnosis
Parameters Optmzaton of SVM Based on Improved FOA and Its Applcaton n Fault Dagnoss Qantu Zhang1*, Lqng Fang1, Sca Su, Yan Lv1 1 Frst Department, Mechancal Engneerng College, Shjazhuang, Hebe Provnce,
More informationCorrelative features for the classification of textural images
Correlatve features for the classfcaton of textural mages M A Turkova 1 and A V Gadel 1, 1 Samara Natonal Research Unversty, Moskovskoe Shosse 34, Samara, Russa, 443086 Image Processng Systems Insttute
More informationSkew Angle Estimation and Correction of Hand Written, Textual and Large areas of Non-Textual Document Images: A Novel Approach
Angle Estmaton and Correcton of Hand Wrtten, Textual and Large areas of Non-Textual Document Images: A Novel Approach D.R.Ramesh Babu Pyush M Kumat Mahesh D Dhannawat PES Insttute of Technology Research
More informationSpam Filtering Based on Support Vector Machines with Taguchi Method for Parameter Selection
E-mal Spam Flterng Based on Support Vector Machnes wth Taguch Method for Parameter Selecton We-Chh Hsu, Tsan-Yng Yu E-mal Spam Flterng Based on Support Vector Machnes wth Taguch Method for Parameter Selecton
More informationNUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS
ARPN Journal of Engneerng and Appled Scences 006-017 Asan Research Publshng Network (ARPN). All rghts reserved. NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS Igor Grgoryev, Svetlana
More informationFuzzy Modeling of the Complexity vs. Accuracy Trade-off in a Sequential Two-Stage Multi-Classifier System
Fuzzy Modelng of the Complexty vs. Accuracy Trade-off n a Sequental Two-Stage Mult-Classfer System MARK LAST 1 Department of Informaton Systems Engneerng Ben-Guron Unversty of the Negev Beer-Sheva 84105
More informationAn Improvement to Naive Bayes for Text Classification
Avalable onlne at www.scencedrect.com Proceda Engneerng 15 (2011) 2160 2164 Advancen Control Engneerngand Informaton Scence An Improvement to Nave Bayes for Text Classfcaton We Zhang a, Feng Gao a, a*
More informationCHAPTER 2 PROPOSED IMPROVED PARTICLE SWARM OPTIMIZATION
24 CHAPTER 2 PROPOSED IMPROVED PARTICLE SWARM OPTIMIZATION The present chapter proposes an IPSO approach for multprocessor task schedulng problem wth two classfcatons, namely, statc ndependent tasks and
More informationA MODIFIED K-NEAREST NEIGHBOR CLASSIFIER TO DEAL WITH UNBALANCED CLASSES
A MODIFIED K-NEAREST NEIGHBOR CLASSIFIER TO DEAL WITH UNBALANCED CLASSES Aram AlSuer, Ahmed Al-An and Amr Atya 2 Faculty of Engneerng and Informaton Technology, Unversty of Technology, Sydney, Australa
More informationUnsupervised Learning
Pattern Recognton Lecture 8 Outlne Introducton Unsupervsed Learnng Parametrc VS Non-Parametrc Approach Mxture of Denstes Maxmum-Lkelhood Estmates Clusterng Prof. Danel Yeung School of Computer Scence and
More informationA Clustering Algorithm Solution to the Collaborative Filtering
Internatonal Journal of Scence Vol.4 No.8 017 ISSN: 1813-4890 A Clusterng Algorthm Soluton to the Collaboratve Flterng Yongl Yang 1, a, Fe Xue, b, Yongquan Ca 1, c Zhenhu Nng 1, d,* Hafeng Lu 3, e 1 Faculty
More informationTHE CONDENSED FUZZY K-NEAREST NEIGHBOR RULE BASED ON SAMPLE FUZZY ENTROPY
Proceedngs of the 20 Internatonal Conference on Machne Learnng and Cybernetcs, Guln, 0-3 July, 20 THE CONDENSED FUZZY K-NEAREST NEIGHBOR RULE BASED ON SAMPLE FUZZY ENTROPY JUN-HAI ZHAI, NA LI, MENG-YAO
More informationFace Recognition Based on SVM and 2DPCA
Vol. 4, o. 3, September, 2011 Face Recognton Based on SVM and 2DPCA Tha Hoang Le, Len Bu Faculty of Informaton Technology, HCMC Unversty of Scence Faculty of Informaton Scences and Engneerng, Unversty
More informationMeasure optimized cost-sensitive neural network ensemble for multiclass imbalance data learning
easure optmzed cost-senstve neural network ensemble for multclass mbalance data learnng Peng Cao, Dazhe Zhao Key Laboratory of edcal Image Computng of nstry of Educaton, Northeastern Unversty Shenyang,
More informationImproving Classifier Fusion Using Particle Swarm Optimization
Proceedngs of the 7 IEEE Symposum on Computatonal Intellgence n Multcrtera Decson Makng (MCDM 7) Improvng Classfer Fuson Usng Partcle Swarm Optmzaton Kalyan Veeramachanen Dept. of EECS Syracuse Unversty
More informationFrom Comparing Clusterings to Combining Clusterings
Proceedngs of the Twenty-Thrd AAAI Conference on Artfcal Intellgence (008 From Comparng Clusterngs to Combnng Clusterngs Zhwu Lu and Yuxn Peng and Janguo Xao Insttute of Computer Scence and Technology,
More informationCourse Introduction. Algorithm 8/31/2017. COSC 320 Advanced Data Structures and Algorithms. COSC 320 Advanced Data Structures and Algorithms
Course Introducton Course Topcs Exams, abs, Proects A quc loo at a few algorthms 1 Advanced Data Structures and Algorthms Descrpton: We are gong to dscuss algorthm complexty analyss, algorthm desgn technques
More informationLoad Balancing for Hex-Cell Interconnection Network
Int. J. Communcatons, Network and System Scences,,, - Publshed Onlne Aprl n ScRes. http://www.scrp.org/journal/jcns http://dx.do.org/./jcns.. Load Balancng for Hex-Cell Interconnecton Network Saher Manaseer,
More informationSubspace clustering. Clustering. Fundamental to all clustering techniques is the choice of distance measure between data points;
Subspace clusterng Clusterng Fundamental to all clusterng technques s the choce of dstance measure between data ponts; D q ( ) ( ) 2 x x = x x, j k = 1 k jk Squared Eucldean dstance Assumpton: All features
More informationA mathematical programming approach to the analysis, design and scheduling of offshore oilfields
17 th European Symposum on Computer Aded Process Engneerng ESCAPE17 V. Plesu and P.S. Agach (Edtors) 2007 Elsever B.V. All rghts reserved. 1 A mathematcal programmng approach to the analyss, desgn and
More informationExtraction of Fuzzy Rules from Trained Neural Network Using Evolutionary Algorithm *
Extracton of Fuzzy Rules from Traned Neural Network Usng Evolutonary Algorthm * Urszula Markowska-Kaczmar, Wojcech Trelak Wrocław Unversty of Technology, Poland kaczmar@c.pwr.wroc.pl, trelak@c.pwr.wroc.pl
More informationLearning to Project in Multi-Objective Binary Linear Programming
Learnng to Project n Mult-Objectve Bnary Lnear Programmng Alvaro Serra-Altamranda Department of Industral and Management System Engneerng, Unversty of South Florda, Tampa, FL, 33620 USA, amserra@mal.usf.edu,
More informationParallel Artificial Bee Colony Algorithm for the Traveling Salesman Problem
Parallel Artfcal Bee Colony Algorthm for the Travelng Salesman Problem Kun Xu, Mngyan Jang, Dongfeng Yuan The School of Informaton Scence and Engneerng Shandong Unversty, Jnan, 250100, Chna E-mal: xukun_sdu@163.com,
More informationFeature Subset Selection Based on Ant Colony Optimization and. Support Vector Machine
Proceedngs of the 7th WSEAS Int. Conf. on Sgnal Processng, Computatonal Geometry & Artfcal Vson, Athens, Greece, August 24-26, 27 182 Feature Subset Selecton Based on Ant Colony Optmzaton and Support Vector
More informationClassifier Swarms for Human Detection in Infrared Imagery
Classfer Swarms for Human Detecton n Infrared Imagery Yur Owechko, Swarup Medasan, and Narayan Srnvasa HRL Laboratores, LLC 3011 Malbu Canyon Road, Malbu, CA 90265 {owechko, smedasan, nsrnvasa}@hrl.com
More informationAssociative Based Classification Algorithm For Diabetes Disease Prediction
Internatonal Journal of Engneerng Trends and Technology (IJETT) Volume-41 Number-3 - November 016 Assocatve Based Classfcaton Algorthm For Dabetes Dsease Predcton 1 N. Gnana Deepka, Y.surekha, 3 G.Laltha
More informationAvailable online at Available online at Advanced in Control Engineering and Information Science
Avalable onlne at wwwscencedrectcom Avalable onlne at wwwscencedrectcom Proceda Proceda Engneerng Engneerng 00 (2011) 15000 000 (2011) 1642 1646 Proceda Engneerng wwwelsevercom/locate/proceda Advanced
More informationCSCI 5417 Information Retrieval Systems Jim Martin!
CSCI 5417 Informaton Retreval Systems Jm Martn! Lecture 11 9/29/2011 Today 9/29 Classfcaton Naïve Bayes classfcaton Ungram LM 1 Where we are... Bascs of ad hoc retreval Indexng Term weghtng/scorng Cosne
More informationData Mining: Model Evaluation
Data Mnng: Model Evaluaton Aprl 16, 2013 1 Issues: Evaluatng Classfcaton Methods Accurac classfer accurac: predctng class label predctor accurac: guessng value of predcted attrbutes Speed tme to construct
More informationOptimizing SVR using Local Best PSO for Software Effort Estimation
Journal of Informaton Technology and Computer Scence Volume 1, Number 1, 2016, pp. 28 37 Journal Homepage: www.jtecs.ub.ac.d Optmzng SVR usng Local Best PSO for Software Effort Estmaton Dnda Novtasar 1,
More informationClassifying Acoustic Transient Signals Using Artificial Intelligence
Classfyng Acoustc Transent Sgnals Usng Artfcal Intellgence Steve Sutton, Unversty of North Carolna At Wlmngton (suttons@charter.net) Greg Huff, Unversty of North Carolna At Wlmngton (jgh7476@uncwl.edu)
More informationA Powerful Feature Selection approach based on Mutual Information
6 IJCN Internatonal Journal of Computer cence and Network ecurty, VOL.8 No.4, Aprl 008 A Powerful Feature electon approach based on Mutual Informaton Al El Akad, Abdelall El Ouardgh, and Drss Aboutadne
More informationSpecialized Weighted Majority Statistical Techniques in Robotics (Fall 2009)
Statstcal Technques n Robotcs (Fall 09) Keywords: classfer ensemblng, onlne learnng, expert combnaton, machne learnng Javer Hernandez Alberto Rodrguez Tomas Smon javerhe@andrew.cmu.edu albertor@andrew.cmu.edu
More informationIntelligent Information Acquisition for Improved Clustering
Intellgent Informaton Acquston for Improved Clusterng Duy Vu Unversty of Texas at Austn duyvu@cs.utexas.edu Mkhal Blenko Mcrosoft Research mblenko@mcrosoft.com Prem Melvlle IBM T.J. Watson Research Center
More informationCAN COMPUTERS LEARN FASTER? Seyda Ertekin Computer Science & Engineering The Pennsylvania State University
CAN COMPUTERS LEARN FASTER? Seyda Ertekn Computer Scence & Engneerng The Pennsylvana State Unversty sertekn@cse.psu.edu ABSTRACT Ever snce computers were nvented, manknd wondered whether they mght be made
More informationExperiments in Text Categorization Using Term Selection by Distance to Transition Point
Experments n Text Categorzaton Usng Term Selecton by Dstance to Transton Pont Edgar Moyotl-Hernández, Héctor Jménez-Salazar Facultad de Cencas de la Computacón, B. Unversdad Autónoma de Puebla, 14 Sur
More informationSRBIR: Semantic Region Based Image Retrieval by Extracting the Dominant Region and Semantic Learning
Journal of Computer Scence 7 (3): 400-408, 2011 ISSN 1549-3636 2011 Scence Publcatons SRBIR: Semantc Regon Based Image Retreval by Extractng the Domnant Regon and Semantc Learnng 1 I. Felc Raam and 2 S.
More informationFace Recognition University at Buffalo CSE666 Lecture Slides Resources:
Face Recognton Unversty at Buffalo CSE666 Lecture Sldes Resources: http://www.face-rec.org/algorthms/ Overvew of face recognton algorthms Correlaton - Pxel based correspondence between two face mages Structural
More informationA PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION
1 THE PUBLISHING HOUSE PROCEEDINGS OF THE ROMANIAN ACADEMY, Seres A, OF THE ROMANIAN ACADEMY Volume 4, Number 2/2003, pp.000-000 A PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION Tudor BARBU Insttute
More informationA Genetic Programming-PCA Hybrid Face Recognition Algorithm
Journal of Sgnal and Informaton Processng, 20, 2, 70-74 do:0.4236/jsp.20.23022 Publshed Onlne August 20 (http://www.scrp.org/journal/jsp) A Genetc Programmng-PCA Hybrd Face Recognton Algorthm Behzad Bozorgtabar,
More informationAn Application of the Dulmage-Mendelsohn Decomposition to Sparse Null Space Bases of Full Row Rank Matrices
Internatonal Mathematcal Forum, Vol 7, 2012, no 52, 2549-2554 An Applcaton of the Dulmage-Mendelsohn Decomposton to Sparse Null Space Bases of Full Row Rank Matrces Mostafa Khorramzadeh Department of Mathematcal
More informationA Novel Term_Class Relevance Measure for Text Categorization
A Novel Term_Class Relevance Measure for Text Categorzaton D S Guru, Mahamad Suhl Department of Studes n Computer Scence, Unversty of Mysore, Mysore, Inda Abstract: In ths paper, we ntroduce a new measure
More informationCollaboratively Regularized Nearest Points for Set Based Recognition
Academc Center for Computng and Meda Studes, Kyoto Unversty Collaboratvely Regularzed Nearest Ponts for Set Based Recognton Yang Wu, Mchhko Mnoh, Masayuk Mukunok Kyoto Unversty 9/1/013 BMVC 013 @ Brstol,
More informationTHE PATH PLANNING ALGORITHM AND SIMULATION FOR MOBILE ROBOT
Journal of Theoretcal and Appled Informaton Technology 30 th Aprl 013. Vol. 50 No.3 005-013 JATIT & LLS. All rghts reserved. ISSN: 199-8645 www.jatt.org E-ISSN: 1817-3195 THE PATH PLANNING ALGORITHM AND
More informationA hybrid sequential approach for data clustering using K-Means and particle swarm optimization algorithm
MultCraft Internatonal Journal of Engneerng, Scence and Technology Vol., No. 6, 00, pp. 67-76 INTERNATIONAL JOURNAL OF ENGINEERING, SCIENCE AND TECHNOLOGY www.jest-ng.com 00 MultCraft Lmted. All rghts
More informationHigh-Boost Mesh Filtering for 3-D Shape Enhancement
Hgh-Boost Mesh Flterng for 3-D Shape Enhancement Hrokazu Yagou Λ Alexander Belyaev y Damng We z Λ y z ; ; Shape Modelng Laboratory, Unversty of Azu, Azu-Wakamatsu 965-8580 Japan y Computer Graphcs Group,
More informationBAYESIAN MULTI-SOURCE DOMAIN ADAPTATION
BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION SHI-LIANG SUN, HONG-LEI SHI Department of Computer Scence and Technology, East Chna Normal Unversty 500 Dongchuan Road, Shangha 200241, P. R. Chna E-MAIL: slsun@cs.ecnu.edu.cn,
More informationEYE CENTER LOCALIZATION ON A FACIAL IMAGE BASED ON MULTI-BLOCK LOCAL BINARY PATTERNS
P.G. Demdov Yaroslavl State Unversty Anatoly Ntn, Vladmr Khryashchev, Olga Stepanova, Igor Kostern EYE CENTER LOCALIZATION ON A FACIAL IMAGE BASED ON MULTI-BLOCK LOCAL BINARY PATTERNS Yaroslavl, 2015 Eye
More informationMaximum Variance Combined with Adaptive Genetic Algorithm for Infrared Image Segmentation
Internatonal Conference on Logstcs Engneerng, Management and Computer Scence (LEMCS 5) Maxmum Varance Combned wth Adaptve Genetc Algorthm for Infrared Image Segmentaton Huxuan Fu College of Automaton Harbn
More informationX- Chart Using ANOM Approach
ISSN 1684-8403 Journal of Statstcs Volume 17, 010, pp. 3-3 Abstract X- Chart Usng ANOM Approach Gullapall Chakravarth 1 and Chaluvad Venkateswara Rao Control lmts for ndvdual measurements (X) chart are
More information