Consensus-Based Combining Method for Classifier Ensembles


The International Arab Journal of Information Technology, Vol. 15, No. 1, January 2018

Consensus-Based Combining Method for Classifier Ensembles

Omar Alzubi 1, Jafar Alzubi 2, Sara Tedmori 3, Hasan Rashaideh 4, and Omar Almomani 4
1 Computer and Network Security, Al-Balqa Applied University, Jordan
2 Computer Engineering Department, Al-Balqa Applied University, Jordan
3 Computer Science Department, Princess Sumaya University, Jordan
4 Information Technology, Al-Balqa Applied University, Jordan

Abstract: In this paper, a new method for combining an ensemble of classifiers, called the Consensus-based Combining Method (CCM), is proposed and evaluated. As in most other combination methods, the outputs of multiple classifiers are weighted and summed into a single final classification decision. Unlike other methods, however, CCM adjusts the weights iteratively after comparing all of the classifiers' outputs. Ultimately, all the weights converge to a final set of weights, and the combined output reaches a consensus. The effectiveness of CCM is evaluated by comparing it with popular linear combination methods (majority voting, product, and average method). Experiments are conducted on 14 public data sets and on a blog spam data set created by the authors. Experimental results show that CCM provides a significant improvement in classification accuracy over the product and average methods. Moreover, results show that CCM's classification accuracy is better than or comparable to that of majority voting.

Keywords: Artificial intelligence, classification, machine learning, pattern recognition, classifier ensembles, consensus theory, combining methods, majority voting, mean method, product method.

Received June 3, 2015; accepted January 13

1. Introduction

Ensemble classifiers combine the decisions of multiple independent base classifiers (machine learners) in an attempt to increase the classification accuracy over that of individual classifiers [12, 23]. This increase in classification accuracy has been observed by many researchers in various domains [9, 10, 13, 24]. Unfortunately, combining multiple classifiers does not guarantee an improvement in accuracy, as in the case when the majority of classifiers agree on an incorrect classification, leading to an incorrect classification decision. An ensemble classifier requires careful selection and training of base classifiers so that the base classifiers do not make errors simultaneously [17]. Over the years, a great deal of research has focused on improving machine learning results with the ensemble methods of boosting and bagging. Boosting aims to sequentially add and train an ensemble of classifiers until the desired number of models or accuracy is attained. Bagging, on the other hand, aims to generate multiple base classifiers by training the classifiers on different training data sets. The results of these multiple classifiers are then combined. Given an ensemble of classifiers, the best decision will depend on optimally combining the individual decisions. Hence another direction of research has focused on the classifier combination mechanism, e.g., majority voting, linear combination, super-kernel nonlinear fusion, or SVM-based meta-classification. In traditional combination mechanisms, the base classifiers are viewed as independent and diverse, since lack of independence and diversity could lead to the undesirable situation where the classifiers make the same classification error. Treating base classifiers as completely independent, on the other hand, will result in the loss of potentially useful information that one classifier might learn from the others.
For instance, a classifier might learn that another classifier is more confident in its decision, e.g., because it was trained on a different data set or on a more useful set of features. The authors of this research view the ensemble of classifiers as a collaborative society in which members learn from each other. Each base classifier produces an initial classification of the object under consideration, but after communicating with the other classifiers it has the opportunity to change its classification. Through iterations, the classifiers eventually reach a consensus on the best classification decision. There are numerous models of how to conduct a consensus-based decision-making process for classification. In this research, the authors focus on a new Consensus-based Combining Method (CCM) that adaptively iterates the weights in the combiner. In each iteration, information about each base classifier in the form of an uncertainty estimate is utilised. Two types of uncertainty are utilised: the self-uncertainty of a classifier and its conditional uncertainties with respect to the other classifiers. The pooled uncertainty estimates are used to revise the weights in the combiner, and the process is repeated until a consensus is reached.

Section 2 provides a review of related work. In Section 3, the central idea of consensus-based decision making is explained in terms of the stationary probabilities of a Markov chain. The CCM algorithm is described in detail in Section 4. Experimental results are presented in Section 5, and a conclusion is provided in Section 6. For experimentation purposes, the authors used a variety of publicly available data sets, in addition to a blog spam data set constructed by crawling the web. CCM was compared with three popular combination mechanisms. Results show that classification accuracy is significantly improved or no worse across all data sets.

2. Related Work

The combination approach proposed in this research falls under the linear classification combination mechanism [15]. Linear combination is intuitive and simply equates to the sum of the weighted outputs from the base classifiers (a minimal sketch of such a combiner is given at the end of this section). The most obvious concern with linear combination is the choice of the best weights. Multiresponse linear regression, one of the popular linear combination methods, calculates optimal weights in order to obtain high classification accuracy. Many attempts have been made at nonlinear methods to improve performance in comparison with linear methods. For instance, multivariate polynomial regression can be unsuitable for high-dimensional and high-order problems because of the high number of product terms. Later, an attempt to overcome the dimensionality problem of polynomial regression was made by Toh et al. [21]; results show that accuracy was compromised. The proposed combination approach is also an example of measurement-level combination, where each base classifier provides, in addition to a label for the object under consideration, a measurement value (score) which represents the degree to which the object is associated with the label. This information can be helpful for classification and thus improve the overall classification accuracy compared with the accuracy attained when utilising only the classifier decisions. The combination approach proposed in this research takes advantage of consensus theory principles, widely used in many fields such as statistics and the social, political, and management sciences. Consensus theory, which enables members of a group of experts to methodically reach an agreement, was first introduced to the arena of artificial intelligence in 1985 by Berenstein et al. [5]. Benediktsson and Swain [4] applied consensus theory principles to multi-sensor fusion, where data from various locations are integrated to extract more valuable information. The idea behind their research was to use a Logarithmic Opinion Pool (LOP) to fuse data source outputs by assigning different weights to the data sources according to their reliability. Shaban et al. [19] introduced a framework for aggregating cooperative agents' decisions with respect to their uncertainty. The framework, which models the interaction between the group members, was essentially designed to solve some contemporary Web information retrieval problems. The authors of this research believe that Shaban et al.'s [19] framework presents a comprehensive and practical implementation of consensus theory concepts. Although the framework was not designed for classifier ensemble systems, some of its general guidelines are adopted in the design of the proposed CCM.
Kim and Hong [11] presented a multi-classifier system comprising multiple base classifiers. Each base classifier in turn consists of a general classifier, responsible for the classification, and a meta-classifier, whose job is to evaluate the classification result of its corresponding general classifier and decide whether the base classifier participates in the final decision-making process or not. Li et al. [14] proposed AMCE, a multi-classifier system for remotely sensed images. AMCE is in essence an aggregative model-based classifier ensemble with two main components, namely ensemble learning and predictions combination. In ensemble learning, and for the purpose of improving the performance of single classifiers, the authors employed two ensemble algorithms (Bagging and AdaBoost.M1). With regard to the predictions combination, diversity measurements with an averaged double-fault indicator and different combination strategies were taken into consideration when integrating the results from the single classifiers. Fersini et al. [8] tackled the task of classifying the polarity of texts, i.e., positive vs. negative, by proposing an ensemble learning model based on Bayesian Model Averaging. Details related to the design of, and the idea behind, the proposed CCM are provided in the next section. The paper also includes details of the proposed algorithm and shows the calculation of the various stages of the combination process. The experiments conducted demonstrate the performance of CCM by presenting experimental comparisons with the majority vote, average, and product methods.
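As a point of reference for the linear methods surveyed above, a measurement-level linear combiner simply computes a weighted sum of the base classifiers' soft outputs and predicts the class with the highest combined score. The following is a minimal illustrative sketch, not taken from the paper; the array names and the uniform default weights are assumptions made for the example.

```python
import numpy as np

def linear_combiner(scores, weights=None):
    """Combine soft outputs of L base classifiers by a weighted sum.

    scores  : array of shape (L, c) -- scores[i, k] is classifier i's
              support for class k (e.g., a posterior probability).
    weights : array of shape (L,); uniform weights are used if omitted.
    Returns the combined support vector and the index of the winning class.
    """
    scores = np.asarray(scores, dtype=float)
    L = scores.shape[0]
    if weights is None:
        weights = np.full(L, 1.0 / L)      # uniform weighting (illustrative assumption)
    combined = weights @ scores            # weighted sum over classifiers
    return combined, int(np.argmax(combined))

# Example: three classifiers, two classes (e.g., spam / non-spam).
outputs = [[0.9, 0.1],
           [0.6, 0.4],
           [0.2, 0.8]]
support, label = linear_combiner(outputs)
print(support, label)
```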

3. Design of the CCM Algorithm

The relationship and interaction between the classifiers in CCM can be considered a recursive arrangement. Each classifier in the ensemble can engage in a discourse with the other classifiers and hence is capable of viewing and checking the other classifiers' decisions. This approach can be a powerful method for resolving or decreasing the level of uncertainty associated with the classifiers' decision-making processes, making recursive groups one of the best information fusion methods. As DeGroot [7] suggested, this type of modelling can be referred to as group consensus, which is basically the result of aggregating multiple, different opinions into a single decision that represents the group's consensus. As shown in Figure 1, this design allows linear pooling of the classifiers' opinions in a recursive process in order to reach a consensus. It is a simple and intuitive, yet powerful, decision-making design.

Figure 1. Diagram of the Consensus-based Combining Method (CCM).

In this design, each classifier in the ensemble must present its own expected decision, which is a soft value that represents the membership of the data point x in the data set Z to one of the classes in Ω. This value is denoted by y_i(ω_k), for each ω_k ∈ Ω. It is then confronted with the decision profiles of the other classifiers in the ensemble, and the classifier revises its own decision by making an assessment of each classifier given its accuracy and its decision on the current data point. The formula used to calculate the revised expected ranking is of the form:

y_i(ω_k) = \sum_{j=1}^{L} w_{ij} y_j(ω_k)                                  (1)

where w_{ij} is a positive weight given by the i-th classifier to the j-th classifier, and the weights sum to one over all classifiers i, j ∈ E. The process of opinion revision continues in this manner, where each classifier updates its own decision whenever it is informed of the revisions made by the other classifiers. The process terminates when each classifier no longer expects to change the ranking of any other classifier, meaning that no change of decisions is expected. The output of this process is an N × N stochastic matrix denoted by W. This matrix can be viewed as the one-step transition probability matrix of a Markov chain with stationary probabilities and N stages. Because of this property, it is possible to use the limit theorem of Markov chains to determine whether the ensemble will converge to a common ranking, which represents the ensemble consensus, and if so, what the value of this ranking will be. DeGroot [7] and Berger [6] proved that such an ensemble will converge to a common ranking only if there exists a vector π such that

π W = π                                                                    (2)

subject to:

\sum_{i=1}^{L} π_i = 1                                                     (3)

and the common group ranking, denoted y_g(ω_k) for each ω_k ∈ Ω, k = 1, ..., c, is given by:

y_g(ω_k) = \sum_{i=1}^{L} π_i y_i(ω_k)                                     (4)

The classifiers subjectively calculate the weights w_{ij} to reflect their accuracy and the confidence of the decisions they have made. The classifiers also represent their level of uncertainty about such decisions. As expected, each classifier will have a different level of uncertainty in different situations, and this level also differs from that of the other classifiers. The weight calculation stage can be described as a dynamic process with an adaptive property, where the weights constantly change as the classifier's state of knowledge changes. In summary, these weights show the level of confidence that a classifier has in its own decision and in the other classifiers' decisions as well.
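To make the pooling of Equations (1)-(4) concrete, the sketch below repeatedly applies a fixed row-stochastic weight matrix W to the classifiers' soft outputs until the rankings stop changing, which is the DeGroot-style convergence described above. It is an illustrative sketch only: the matrix W and the classifier outputs are hypothetical, and CCM itself recomputes W at every iteration, as detailed in Section 4.

```python
import numpy as np

def degroot_consensus(Y, W, tol=1e-9, max_iter=1000):
    """Iterate y_i <- sum_j w_ij * y_j (Equation (1)) until the rows agree.

    Y : (L, c) matrix of soft outputs, one row per classifier.
    W : (L, L) row-stochastic weight matrix (each row sums to one).
    Returns the common group ranking of Equation (4) once the rows converge.
    """
    Y = np.asarray(Y, dtype=float)
    W = np.asarray(W, dtype=float)
    for _ in range(max_iter):
        Y_next = W @ Y                      # every classifier pools the others' opinions
        if np.max(np.abs(Y_next - Y)) < tol:
            break
        Y = Y_next
    return Y[0]                             # all rows are (near-)identical at consensus

# Hypothetical example: three classifiers, two classes, fixed trust weights.
Y0 = np.array([[0.9, 0.1],
               [0.6, 0.4],
               [0.2, 0.8]])
W = np.array([[0.5, 0.3, 0.2],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])
print(degroot_consensus(Y0, W))             # common ranking over the two classes
```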
4. CCM Algorithm Description

Our CCM algorithm is composed of the following main stages. The first stage involves building the decision profile for each classifier in the ensemble, denoted by DP(x). In the second stage the uncertainty matrix is calculated; this stage is composed of two sub-stages (self-uncertainty and conditional uncertainty). The weight calculation stage is essential and is the cornerstone of the algorithm, and in the next stage a diffusion of the decisions is performed in order to reach the consensus. The final stage is the update stage, where each classifier updates its decision given the final decisions of the other classifiers. The next subsections include a detailed explanation of these various stages, with code fragments written in Matlab.

Algorithm 1: Consensus-based Combining Method (CCM)
# Inputs:
Z: data set
x: data point
E: ensemble of classifiers
e_i: a classifier in E
Δ: vector of classifier accuracies
DP: decision profile
U_mat: uncertainty matrix
W_mat: weight matrix
Ω: vector of classes
π: stationary probability vector
# Output:
y: prediction of x by E
1: for each x_i in Z do
2:   for each e_j in E do
3:     DP_j ← classify(e_j, x_i)
4:   end for
5: end for
6: Exchange DP between the classifiers in E
7: U_mat ← computed by Equations (7) and (8)
8: W_mat ← computed by Equation (17)
9: Calculate π by Equation (2)
10: y ← diffuse(DP, Δ, Ω, W_mat)
11: Update DP by Equation (20)
12: Repeat steps 7 to 10
13: Return the ω ∈ Ω with the maximum support

A. Building Decision Profiles: suppose that an ensemble E consists of L classifiers, denoted e_1, e_2, ..., e_L, performing a classification of a data point x_i that belongs to a data set Z. Here a classifier e_i observes a subset of the feature space θ_i (which represents x_i) over the universal feature space of Z, which is given by Θ. A hypothesis h_i is used to relate θ_i to a belief ψ_i. This relationship is summarized by the following equation:

ψ_i = h_i(θ_i)                                                             (5)

where ψ_i ∈ Ψ and Ψ is the knowledge space. Upon receiving a new x_i, e_i not only chooses a class ω_k from the set of possible classes Ω = {ω_1, ω_2, ..., ω_c} but also provides the probability of each ω_k ∈ Ω. This output is related to the belief ψ_i by a decision function δ_i as:

y_i(ω_k) = δ_i(ψ_i)                                                        (6)

Each e_i in E will produce its own outputs based on its own hypothesis, which might differ from the outputs of the other classifiers, and will give the probability of each class as a soft output decision. The final decision integration problem can then be presented as finding the correct ω among the group of classes ω_k ∈ Ω, which should constitute the group preference.

B. Calculation of the Classifiers' Uncertainty Estimates: this stage involves finding a function by which each classifier's uncertainty can be computed. The intuition here is to assign higher weights to classifiers that are less uncertain, and vice versa. The weights should therefore reflect the contrast between the classifiers' decisions. During this stage, uncertainty is divided into two types: self (or local) uncertainty and conditional (or global) uncertainty. Self-uncertainty is related to the quality of the classifier's own decisions, whereas conditional uncertainty emerges as the result of the collaboration between classifiers that takes place in the form of decision profile exchange. In this stage a classifier is able to review its uncertainty level and modify it given its own decision as well as the decisions of the other classifiers. This provides the classifier with a way to improve its decision when the other classifiers' decisions become available. In this paper, local uncertainty and global uncertainty are referred to as self-uncertainty and conditional uncertainty, respectively.

1. Self-Uncertainty: self-uncertainty is a measure of how much doubt a classifier has in its own decision and also how much randomness is involved in that decision. Let U_i denote the self-uncertainty of classifier e_i. It is calculated by the following equation:

U_i = -\sum_{k=1}^{c} y_i(ω_k) \log_c y_i(ω_k)                             (7)

where c represents the number of labels or classes.

2. Conditional Uncertainty: the conditional uncertainty is a measure of how much doubt a classifier has in its own decision after observing the decisions of the other classifiers. This reflects how much knowledge can be inferred from the others' decisions. The conditional uncertainty of classifier e_i given classifier e_j is computed by:

U_{i|j} = -\sum_{k=1}^{c} y_i(ω_k | j) \log_c y_i(ω_k | j)                 (8)

For an ensemble composed of L classifiers the uncertainties are presented in the following matrix form:

U_mat = \begin{pmatrix} U_1 & U_{1|2} & \cdots & U_{1|L} \\ U_{2|1} & U_2 & \cdots & U_{2|L} \\ \vdots & \vdots & \ddots & \vdots \\ U_{L|1} & U_{L|2} & \cdots & U_L \end{pmatrix}          (9)

where the diagonal entries of the matrix represent the self-uncertainties and the off-diagonal entries represent the conditional uncertainties.
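The paper's own code fragments are in Matlab; as an illustrative stand-in, the sketch below builds the uncertainty matrix of Equation (9) from normalized (base-c) entropies in the spirit of Equations (7) and (8). The layout of the revised supports y_i(ω_k | j) passed to the function is an assumption made for the sketch, not a detail taken from the paper.

```python
import numpy as np

def normalized_entropy(p, eps=1e-12):
    """Entropy with logarithm base c (Equations (7)/(8)), so values lie in [0, 1]."""
    p = np.clip(np.asarray(p, dtype=float), eps, 1.0)
    p = p / p.sum()                              # ensure the supports form a distribution
    c = p.size
    return float(-(p * (np.log(p) / np.log(c))).sum())

def uncertainty_matrix(revised_supports):
    """Build the L x L matrix of Equation (9).

    revised_supports : array of shape (L, L, c), where revised_supports[i, j]
        is classifier i's support vector after seeing classifier j's decision;
        the diagonal revised_supports[i, i] holds classifier i's own supports.
        (This layout is an assumption of the sketch.)
    """
    R = np.asarray(revised_supports, dtype=float)
    L = R.shape[0]
    U = np.zeros((L, L))
    for i in range(L):
        for j in range(L):
            U[i, j] = normalized_entropy(R[i, j])   # diagonal = self, off-diagonal = conditional
    return U

# Hypothetical example: two classifiers, three classes.
R = np.array([[[0.8, 0.1, 0.1], [0.6, 0.2, 0.2]],
              [[0.4, 0.3, 0.3], [0.5, 0.4, 0.1]]])
print(uncertainty_matrix(R))
```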
C. Calculation of the Classifiers' Weights: after constructing the uncertainty matrix, it is now possible for each classifier to assign weights to itself and to the other classifiers in the ensemble. Here, minimization of the sum of squares of the self-uncertainty and the conditional uncertainties of the other classifiers is used. In this way, classifiers with low conditional uncertainty are given higher weights, while the ones with higher conditional uncertainty receive low weights.
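The constrained minimization formulated next leads to the closed-form weights of Equation (17), in which each weight is proportional to the inverse squared uncertainty. A minimal sketch of that final weighting step is given below; it assumes the uncertainty matrix built in the previous stage and is illustrative rather than a transcription of the paper's Matlab fragments.

```python
import numpy as np

def weight_matrix(U, eps=1e-12):
    """Compute w_ij = U_ij^{-2} / sum_k U_ik^{-2} (the form of Equation (17)).

    U : (L, L) uncertainty matrix from Equation (9); small entries are clipped
        to avoid division by zero. Each row of the result sums to one, so the
        output is the row-stochastic matrix W used in the diffusion step.
    """
    U = np.clip(np.asarray(U, dtype=float), eps, None)
    inv_sq = 1.0 / (U ** 2)                      # low uncertainty -> large raw weight
    return inv_sq / inv_sq.sum(axis=1, keepdims=True)

# Hypothetical example: classifier 0 is quite certain, classifier 1 much less so.
U = np.array([[0.2, 0.6],
              [0.5, 0.9]])
print(weight_matrix(U))                          # rows sum to one
```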

Formally, the weights for classifier e_i are obtained by minimizing:

\sum_{j=1}^{L} w_{ij}^2 U_{i|j}^2                                          (10)

subject to the constraint that:

\sum_{j=1}^{L} w_{ij} = 1,  w_{ij} ≥ 0                                     (11)

The above two equations are equivalent to minimizing the following equation:

ν_i = \sum_{j=1}^{L} w_{ij}^2 U_{i|j}^2 - ρ (\sum_{j=1}^{L} w_{ij} - 1)    (12)

where ρ denotes the Lagrange multiplier. Taking the partial derivative of ν_i with respect to w_{ij} and setting the equation to zero results in:

w_{ij} = ρ / (2 U_{i|j}^2)                                                 (13)

Also, taking the partial derivative of ν_i with respect to the Lagrange multiplier ρ and equating it to zero yields:

\sum_{j=1}^{L} w_{ij} = 1                                                  (14)

Substituting Equation (13) into Equation (14) gives:

\sum_{j=1}^{L} ρ / (2 U_{i|j}^2) = 1                                       (15)

It follows that:

ρ = 2 / \sum_{j=1}^{L} U_{i|j}^{-2}                                        (16)

Substituting Equations (13) and (16) produces the classifier weighting coefficient w_{ij}, computed by:

w_{ij} = U_{i|j}^{-2} / \sum_{k=1}^{L} U_{i|k}^{-2}                        (17)

D. Decision Updates: the idea of this stage is inspired by a suggestion from DeGroot [7], who in his famous paper entitled "Reaching a Consensus" briefly raised the question of what the output might be if e_i wished to change the weights that it assigns to the other classifiers after it has learned their initial decisions, or after it has observed how much their decisions differ from the consensus decision. The authors of this research have explored this possibility by taking advantage of this idea in the design, as an update to each e_i's decision profile. By using this update, e_i is able to revise all rankings that have been given to the other classifiers. These new rankings subsequently lead to a new calculation of the uncertainty matrix and, as a result, a new weight calculation. Details of this one-loop process are provided in the next paragraph. The initial consensus decisions are presented in a vector Γ = {γ_1, γ_2, ..., γ_M} and the decisions of a classifier are presented by Θ = {θ_1, θ_2, ..., θ_M}. Then, for each e_i, a discrepancy value is calculated by summing, over j = 1, ..., M, a function of the differences between γ_j and θ_j (Equation (18)). For each such value, another value denoted by α is calculated (Equation (19)), and from α a new coefficient ϕ is obtained (Equation (20)). Finally, each e_i is able to give new rankings to its fellow classifiers which reflect the update that has been received.
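Putting the stages together, the combiner repeatedly exchanges decision profiles, recomputes the uncertainty and weight matrices, and diffuses the decisions until they stop changing (steps 6-12 of Algorithm 1). The sketch below is an illustrative end-to-end loop under assumptions already noted: entropy-based uncertainties, the inverse-squared-uncertainty weights of Equation (17), a simple pairwise approximation of the conditional terms, and a plain convergence test in place of the α/ϕ update coefficients of Equations (18)-(20), whose exact form is not reproduced here.

```python
import numpy as np

def entropy(p, eps=1e-12):
    """Normalized (base-c) entropy of a support vector, as in Equations (7)/(8)."""
    p = np.clip(np.asarray(p, dtype=float), eps, 1.0)
    p = p / p.sum()
    return float(-(p * np.log(p) / np.log(p.size)).sum())

def ccm_combine(DP, max_iter=50, tol=1e-6, eps=1e-12):
    """Consensus-style combining of an (L, c) decision profile DP.

    Each iteration: build an uncertainty matrix (the conditional term is
    approximated here by the entropy of the averaged supports of classifiers
    i and j -- an assumption of this sketch), convert it to weights as in
    Equation (17), diffuse the decisions, and repeat until the profile stops
    changing. Returns the index of the class with maximum combined support.
    """
    Y = np.asarray(DP, dtype=float)
    L = Y.shape[0]
    for _ in range(max_iter):
        U = np.empty((L, L))
        for i in range(L):
            for j in range(L):
                U[i, j] = entropy(Y[i]) if i == j else entropy((Y[i] + Y[j]) / 2.0)
        W = 1.0 / np.clip(U, eps, None) ** 2     # Equation (17): inverse squared uncertainty
        W = W / W.sum(axis=1, keepdims=True)     # rows sum to one (stochastic matrix)
        Y_next = W @ Y                           # diffusion step, Equation (1)
        if np.max(np.abs(Y_next - Y)) < tol:     # consensus reached
            Y = Y_next
            break
        Y = Y_next
    return int(np.argmax(Y.mean(axis=0)))

# Hypothetical decision profile: three classifiers, two classes.
DP = [[0.9, 0.1], [0.7, 0.3], [0.3, 0.7]]
print(ccm_combine(DP))
```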

5. Experimental Results

The performance of the algorithm has been evaluated by running experiments on 14 representative data sets from the University of California-Irvine (UCI) repository [16]. These data sets have been used in similar studies [1, 18]. Table 1 presents a summary of these data sets. Binary and multi-class data sets were considered when choosing these data sets. In addition, variation in the number of attributes and examples (data items) was also considered. After the ensembles have been trained, the predictions are obtained by using the majority vote combining method. In addition to the aforementioned data sets, the experiments were also run on a blog spam detection data set compiled by the authors of this research. Part of its raw data was obtained from Defensio, a company that specialized in providing security against threats targeting social media, and the rest of the data was collected by the authors themselves. In order to collect raw data from the web, the authors designed and built a web crawler using the Perl programming language. The aim of this data set is to distinguish between spam blog comments and non-spam ones. Thus the comments were divided into two classes, spam and non-spam. The data set comprises 56,000 blog comments, of which 30,000 are spam comments and the rest are non-spam comments.

Table 1. Summary of data sets (columns: Name, Examples, Classes, Attributes).
Blog Spam, Breast Cancer, Letter Recognition, Iris, Segment, Ionosphere, Auto (Statlog), Haberman's, Contraceptive, Isolet, Glass, Colic, Heart-c, Splice, Anneal.

For experimental purposes, each data set was divided into a training set and a validation set. A ten-fold cross-validation method was used to test the classifiers and the proposed method in comparison with the other methods.

Table 2. Summary of base classifiers.
k-nearest neighbour
binary decision tree
Naive Bayes
support vector
normal-densities-based linear
linear perceptron
normal-densities-based quadratic
logistic linear
neural network classifier trained by back-propagation
nearest mean
radial basis neural network
k-centres clustering
radial basis support vector
Parzen
minimum least square linear

A set of base classifiers, listed in Table 2, was used in building the ensembles. Each of these classifiers was evaluated using 10 complete runs of 10-fold cross-validation. In each 10-fold cross-validation, each data set is randomly divided into 10 equal-size segments and the results are averaged over thirty trials. For each trial, nine segments are used for training, while the remaining segment is reserved for testing. To perform testing on varying amounts of training data, learning curves were generated by testing the ensembles after training on increasing subsets of the overall training data. In order to summarise the results over data sets of varying sizes, different percentages of the total training-set size were chosen as the points on the learning curve. CCM was compared with three different combining methods: the voting combining method, the averaging combining method, and the product combining method. These methods have been widely used as ensemble combining methods [20]. The results of comparing CCM with the majority voting (voting), mean, and product methods are presented in three formats: tables, scatter plots, and line graphs. For the purpose of comparing CCM with the other algorithms across all domains, the statistics used in [2, 3, 22] were implemented, specifically the win/draw/loss record and the geometric mean error ratio. The statistics (win/draw/loss, significant win/draw/loss, and geometric mean error ratio) are summarized at the bottom of each table. The simple win/draw/loss record is computed by counting the number of data sets for which CCM obtained better, equal, or worse performance than the other algorithm with respect to the ensemble classification accuracy. In addition, another record represents the statistically significant win/draw/loss; according to this record, a win or loss is only counted if the difference between the two values is significant at the 0.05 level, as determined by a paired Student's t-test. The Geometric Mean (GM) error ratio was computed by:

GM = (\prod_{i=1}^{n} E_A^i / E_B^i)^{1/n}                                 (21)

where E_A^i and E_B^i denote the mean errors of our algorithm and of the algorithm being compared, respectively. For the proposed algorithm to outperform the other algorithms, the geometric mean error ratio must be less than one. The error ratio computation captures the degree to which algorithms outperform each other in win or loss outcomes. The scatter plots present a clear visualization of the performance of CCM and the method with which it is being compared; they compare the accuracy on all data sets at a selected training size. In each scatter plot, the data points represent the data sets; a point located above the diagonal indicates that the performance of CCM is higher, and otherwise it is not. The line graphs show comparisons between all combining methods on selected data sets over all training data sizes.
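The two summary statistics are straightforward to compute from per-data-set results; the sketch below shows the simple win/draw/loss count and the geometric mean error ratio of Equation (21), with SciPy's paired t-test used for the significance-filtered record. The accuracy and error values are hypothetical.

```python
import numpy as np
from scipy import stats

def win_draw_loss(acc_a, acc_b):
    """Count data sets where method A is better than, equal to, or worse than B."""
    a, b = np.asarray(acc_a, float), np.asarray(acc_b, float)
    return int((a > b).sum()), int((a == b).sum()), int((a < b).sum())

def significant_wdl(folds_a, folds_b, alpha=0.05):
    """Win/draw/loss counted only when a paired t-test over folds is significant.

    folds_a, folds_b : lists of per-fold accuracy arrays, one entry per data set.
    """
    win = draw = loss = 0
    for fa, fb in zip(folds_a, folds_b):
        _, p = stats.ttest_rel(fa, fb)
        if p >= alpha:
            draw += 1
        elif np.mean(fa) > np.mean(fb):
            win += 1
        else:
            loss += 1
    return win, draw, loss

def gm_error_ratio(err_a, err_b):
    """Geometric mean of per-data-set error ratios E_A / E_B (Equation (21))."""
    r = np.asarray(err_a, float) / np.asarray(err_b, float)
    return float(np.exp(np.mean(np.log(r))))

# Hypothetical per-data-set error rates for CCM (A) and majority voting (B).
err_ccm, err_vote = [0.10, 0.20, 0.15], [0.12, 0.22, 0.14]
print(win_draw_loss(1 - np.array(err_ccm), 1 - np.array(err_vote)))
print(gm_error_ratio(err_ccm, err_vote))   # < 1 means CCM has lower error on average
```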
5.1. Comparison of CCM with the Majority Voting Method

The results shown in Table 3 illustrate the comparison of CCM with the majority voting method. It is clear that combining the predictions of the ensemble using CCM will, on average, improve the accuracy of the ensemble. CCM has more significant wins than losses over majority voting at all points along the learning curve. The geometric mean error ratio in Table 3 shows that CCM outperforms majority voting. It suggests that even in situations where majority voting beats CCM, the gain in accuracy is less than the gain obtained by CCM over voting in the remaining cases.

Table 3. CCM vs. majority voting method (accuracy, CCM/voting, at each training size).
Dataset 10% 20% 30% 40% 50%
Breast Cancer 82.67/ / / / /93.61
Ionosphere 84.02/ / / / /86.59
Auto (Statlog) 63.46/ / / / /72.12
Haberman's 66.28/ / / / /74.48
Contraceptive 40.52/ / / / /61.24
Blog Spam 85.71/ / / / /92.37
Letter Recognition 78.90/ / / / /83.87
Iris 79.19/ / / / /86.33
Segment 77.23/ / / / /85.77
Isolet 78.60/ / / / /80.76
Glass 50.93/ / / / /71.39
Colic 66.85/ / / / /71.39
Heart-c 64.29/ / / / /84.72
Splice 68.13/ / / / /82.14
Anneal 77.56/ / / / /83.26
Win/Draw/Loss 12/0/3 12/0/3 11/0/4 12/0/3 12/0/3
Sig. W/D/L 10/3/2 11/2/3 9/3/3 9/3/3 6/7/3
GM error ratio

The GM error ratio also suggests that, with higher training data set sizes, majority voting performs considerably better, as the base classifiers' performance improved.

Figure 2. Comparing CCM with majority voting on 15 data sets given 10% of the data as training.

Figure 3. Comparing CCM with majority voting on 15 data sets given 30% of the data as training.

The scatter plots in Figures 2 and 3 show the outputs of the 15 data sets at 10% and 30% training sizes. CCM has 10 and 9 significant wins, compared to 2 and 3 wins for the majority voting method. The superiority in the gain can also be seen clearly, as the wins of CCM are far above the diagonal, whereas voting's wins are close to the diagonal.

5.2. Comparison of CCM with the Mean Method

The numerical results presented in Table 4 support our assumption that CCM achieves better performance in comparison with similar linear methods such as the mean method. The statistics of Table 4 demonstrate that CCM significantly outperforms the mean method early on the learning curve, both on the significant win/draw/loss records and on the geometric mean error ratio.

Table 4. CCM vs. mean method (accuracy, CCM/mean, at each training size).
Dataset 10% 20% 30% 40% 50%
Breast Cancer 82.67/ / / / /90.14
Ionosphere 84.02/ / / / /90.81
Auto (Statlog) 63.46/ / / / /72.51
Haberman's 66.28/ / / / /74.83
Contraceptive 40.52/ / / / /56.62
Blog Spam 85.71/ / / / /96.81
Letter Recognition 78.90/ / / / /82.47
Iris 79.19/ / / / /86.51
Segment 77.23/ / / / /78.66
Isolet 78.60/ / / / /78.30
Glass 50.93/ / / / /59.13
Colic 66.85/ / / / /69.32
Heart-c 64.29/ / / / /81.46
Splice 68.13/ / / / /81.55
Anneal 77.56/ / / / /81.95
Win/Draw/Loss 13/0/2 13/0/2 12/0/3 13/0/3 13/0/2
Sig. W/D/L 12/2/1 11/2/2 11/2/2 10/3/2 8/5/2
GM error ratio

Figure 4. Comparing CCM with the mean method on 15 data sets given 10% of the data as training.

Figure 5. Comparing CCM with the mean method on 15 data sets given 40% of the data as training.

However, the trend becomes considerably less obvious at larger training sizes. For instance, the GM error ratio given 50% of the training data is higher (closer to one) than at 10% training data. The statistically significant win/draw/loss records follow the same pattern; for example, CCM achieves 12 wins given 10% of the data compared to 8 wins at 50%. The scatter plot in Figure 4 illustrates that CCM produces higher accuracy on 12 out of 15 data sets, compared to only 1 out of 15 in favour of the mean method, given 10% training data. Similarly, in Figure 5, CCM won on 10 out of 15 data sets, while the mean method won on 3 out of 15. However, the gain obtained in Figure 4 is larger than the one achieved in Figure 5 (see the location of the data points in relation to the diagonal).

5.3. Comparison of CCM with the Product Method

The results in Table 5 show the comparison between CCM and the product method. They indicate that CCM exhibits higher performance across almost all data sets, at various training sizes.

Table 5. CCM vs. product method (accuracy, CCM/product, at each training size).
Dataset 10% 20% 30% 40% 50%
Breast Cancer 82.67/ / / / /94.59
Ionosphere 84.02/ / / / /90.76
Auto (Statlog) 63.46/ / / / /71.13
Haberman's 66.28/ / / / /73.31
Contraceptive 40.52/ / / / /56.54
Blog Spam 85.71/ / / / /92.47
Letter Recognition 78.90/ / / / /84.97
Iris 79.19/ / / / /86.19
Segment 77.23/ / / / /78.22
Isolet 78.60/ / / / /79.57
Glass 50.93/ / / / /59.29
Colic 66.85/ / / / /71.98
Heart-c 64.29/ / / / /79.14
Splice 68.13/ / / / /81.28
Anneal 77.56/ / / / /83.42
Win/Draw/Loss 14/0/1 14/0/1 14/0/1 13/0/2 12/0/3
Sig. W/D/L 12/3/0 13/2/0 12/2/1 11/3/3 10/2/3
GM error ratio

The significant superiority in performance of CCM over the product method is clearly visible on some data sets, i.e., the Auto, Contraceptive, and Colic data sets. For instance, on the Colic data set at the 10% training size the gain is over 10 points and continues through the rest of the training data sizes.

Figure 6. Comparing CCM with the product method on 15 data sets given 40% of the data as training.

Figure 7. Comparing CCM with the product method on 15 data sets given 50% of the data as training.

In addition, the summarised statistics support the authors' claim, in particular the GM error ratio. The trend discussed above in the comparisons with the voting and mean methods can also be clearly seen in Figures 6 and 7. The advantage of CCM over the product method is not limited to early points on the learning curve; CCM significantly outperforms the product method on large training sizes as well, with 11 wins out of 15 data sets and 10 wins out of 15 data sets at the 40% and 50% training sizes, respectively (see Figures 6 and 7).

5.4. Comparison of the CCM, Voting, Mean, and Product Methods

This subsection presents a visual comparison of the four combining methods on selected domains that were experimented on. Generally speaking, all of the combining methods yield some increase in the accuracy of the ensemble over the base classifiers. However, the improvements in performance achieved when using CCM are, on average, much higher than those obtained by the majority voting, mean, and product methods. The increase in accuracy achieved by CCM is also more obvious when the amount of training data is small, as is clearly exhibited by the GM error ratio. The results in Figures 8, 9, 10, and 11 demonstrate that CCM is fairly robust to variation in domain properties, in particular the number of features, number of examples, and number of labels. The figures also show that CCM performs well at various training sizes and consistently beats the other three methods at different training data sizes.

Figure 8. Comparing CCM with the majority voting, mean, and product methods on the Anneal data set given all training sizes.

Figure 9. Comparing CCM with the majority voting, mean, and product methods on the Isolet data set given all training sizes.

Figure 10. Comparing CCM with the majority voting, mean, and product methods on the Splice data set given all training sizes.

Figure 11. Comparing CCM with the majority voting, mean, and product methods on the Iris data set given all training sizes.

6. Conclusions

In this paper, CCM, which represents a new theoretical framework for a linear combining method, was developed. The effectiveness of CCM was evaluated by comparing its performance with that of existing combining methods (majority voting, product, and average). Experimental results on 14 public data sets from the UCI machine learning repository and on a blog spam data set that we created show that CCM is a quite competitive method for classification. It significantly improves the average classification accuracy compared to the product and average methods, and its average classification accuracy is better than or comparable to that of the majority voting method. The authors of this research believe that the proposed CCM provides an important contribution to the state of the art of ensemble systems, as it provides a competitive alternative to existing popular linear combination methods.

References

[1] Alzubi J., "Diversity Based Improved Bagging Algorithm," in Proceedings of the International Conference on Engineering and MIS, Istanbul.
[2] Alzubi J., "Optimal Classifier Ensemble Design Based on Cooperative Game Theory," Research Journal of Applied Sciences, Engineering and Technology, vol. 11, no. 12.
[3] Alzubi O., "An Empirical Study of Irregular AG Block Turbo Codes over Fading Channels," Research Journal of Applied Sciences, Engineering and Technology, vol. 11, no. 12.
[4] Benediktsson J. and Swain P., "Consensus Theoretic Classification Methods," IEEE Transactions on Systems, Man and Cybernetics, vol. 22, no. 4.
[5] Berenstein C., Kanal L., and Lavine D., "Consensus Rules," in Proceedings of the Uncertainty in Artificial Intelligence Conference, Los Angeles.
[6] Berger R., "A Necessary and Sufficient Condition for Reaching a Consensus Using DeGroot's Method," Journal of the American Statistical Association, vol. 76, no. 374, 1981.
[7] DeGroot M., "Reaching a Consensus," Journal of the American Statistical Association, vol. 69, no. 345.
[8] Fersini E., Messina E., and Pozzi F., "Sentiment Analysis: Bayesian Ensemble Learning," Decision Support Systems, vol. 68.
[9] Hou S., Chen L., Tas E., Demhovsky I., and Ye Y., "Cluster-Oriented Ensemble Classifiers for Intelligent Malware Detection," in Proceedings of the International Conference on Semantic Computing, Anaheim.
[10] Kai L. and Ping Z., "Using an Ensemble Classifier on Learning Evaluation for E-learning System," in Proceedings of Computer Science and Service System, Nanjing.
[11] Kim Y. and Hong C., "A Meta-Learning Approach for Combining Multiple Classifiers," Advanced Science and Technology Letters, vol. 29.
[12] Kuncheva L., Combining Pattern Classifiers: Methods and Algorithms, John Wiley and Sons.
[13] Leila C., Maâmar K., and Salim C., "Combining Neural Networks for Arabic Handwriting Recognition," in Proceedings of the International Symposium on Programming and Systems, Algiers.
[14] Li X., Liu X., and Yu L., "Aggregative Model-Based Classifier Ensemble for Improving Land-Use/Cover Classification of Landsat TM Images," International Journal of Remote Sensing, vol. 35, no. 4.
[15] Liu M., Yuan B., Shu F., and Feng C., "A New Classifier Combination Method Based on TSK-type Fuzzy System," in Proceedings of the International Conference on Wireless, Mobile and Multimedia Networks, Hangzhou, pp. 1-4, 2006.

[16] Newman J., Hettich S., Blake L., and Merz J., "{UCI} Repository of Machine Learning Databases," Technical report.
[17] Polikar R., "Ensemble Based Systems in Decision Making," IEEE Circuits and Systems Magazine, vol. 6, no. 3.
[18] Quinlan J., "Bagging, Boosting, and C4.5," in Proceedings of the Thirteenth National Conference on Artificial Intelligence, Portland, 1996.
[19] Shaban K., Basir O., Kamel M., and Hassanein K., "Intelligent Information Fusion Approach in Cooperative Multiagent Systems," in Proceedings of the 5th Biannual World Automation Congress, Orlando, 2002.
[20] Tax D., Van Breukelen M., Duin R., and Kittler J., "Combining Multiple Classifiers by Averaging or by Multiplying?," Pattern Recognition, vol. 33, no. 9.
[21] Toh K., Yau W., and Jiang X., "A Reduced Multivariate Polynomial Model for Multimodal Biometrics and Classifiers Fusion," IEEE Transactions on Circuits and Systems for Video Technology, vol. 14, no. 2.
[22] Webb G., "MultiBoosting: A Technique for Combining Boosting and Wagging," Machine Learning, vol. 40, no. 2.
[23] Zhou Z., Ensemble Methods: Foundations and Algorithms, CRC Press.
[24] Zou Q., Guo J., Ju Y., Wu M., Zeng X., and Hong Z., "Improving tRNAscan-SE Annotation Results via Ensemble Classifiers," Molecular Informatics, vol. 34, 2015.

Omar Alzubi was born in Allan, Jordan. He received a Master's degree with distinction in Computer and Network Security from the New York Institute of Technology (New York, USA), and holds a Ph.D. degree in Computer and Network Security from Swansea University (Swansea, UK), awarded in 2013. He joined Al-Balqa Applied University in 2013 as an assistant professor in computer and network security. Dr. Alzubi's research interests include network security, cloud security, the application of algebraic-geometric theory to channel coding, machine learning, and elliptic curve cryptosystems. He is also involved in UK-Turkey Higher Education Partnership Program projects, in which he proposed a cryptosystem based on elliptic curves.

Jafar Alzubi received a B.Sc. (Hons) in Electrical Engineering, majoring in Electronics and Communications, from the University of Engineering and Technology, Lahore, Pakistan. In 2005 he received an M.Sc. (Hons) in Electrical and Computer Engineering from the New York Institute of Technology, New York, USA. He then became a full-time lecturer in the School of Engineering at Al-Balqa Applied University. In 2008, he joined the Wireless Communications Research Laboratory at Swansea University (Swansea, UK), completing his PhD in Advanced Telecommunications Engineering. He is now an assistant professor in the Computer Engineering Department, Al-Balqa Applied University, and deputy dean of the Engineering Faculty. His research interests include elliptic curve cryptography and cryptosystems, classification, and coding. As part of his research, he designed the first regular and first irregular block turbo codes using algebraic geometry codes and investigated their performance across various computer and wireless networks.

Sara Tedmori received her BSc degree in Computer Science from the American University of Beirut, Lebanon, in 2001. In 2003, she obtained her MSc degree in Multimedia and Internet Computing from Loughborough University. In 2008, she received her Engineering Doctorate in Computer Science from Loughborough University, UK. Currently she is an assistant professor in the Computer Science Department at Princess Sumaya University for Technology, Jordan. Her research interests include sentiment analysis, image processing, knowledge extraction, classification, knowledge sharing, privacy, and software engineering.

Hasan Rashaideh received his Bachelor's and Master's degrees in computer science and information technology from Yarmouk University in 1999 and 2002, respectively. In 2008 he obtained his PhD degree in computer science from Saint Petersburg Electrotechnical State University. He then joined the Department of Computer Science at the Prince Abdullah Bin Ghazi Faculty of Information Technology, Al-Balqa Applied University, Jordan, as an assistant professor, and in 2015 he was appointed Head of the department. His research interests include machine learning, image processing and computer vision, information retrieval, and optimization.

Omar Almomani received his Bachelor's and Master's degrees in Telecommunication Technology from the Institute of Information Technology, University of Sindh, in 2002 and 2003, respectively. He received his PhD in computer networks from Universiti Utara Malaysia. Currently he is an assistant professor and Vice Dean of the Information Technology Faculty at the World Islamic Sciences and Education University. His research interests involve mobile ad hoc networks, network performance, multimedia networks, network Quality of Service (QoS), IP multicast, network modelling and simulation, and grid computing.


More information

UB at GeoCLEF Department of Geography Abstract

UB at GeoCLEF Department of Geography   Abstract UB at GeoCLEF 2006 Mguel E. Ruz (1), Stuart Shapro (2), June Abbas (1), Slva B. Southwck (1) and Davd Mark (3) State Unversty of New York at Buffalo (1) Department of Lbrary and Informaton Studes (2) Department

More information

Investigating the Performance of Naïve- Bayes Classifiers and K- Nearest Neighbor Classifiers

Investigating the Performance of Naïve- Bayes Classifiers and K- Nearest Neighbor Classifiers Journal of Convergence Informaton Technology Volume 5, Number 2, Aprl 2010 Investgatng the Performance of Naïve- Bayes Classfers and K- Nearest Neghbor Classfers Mohammed J. Islam *, Q. M. Jonathan Wu,

More information

Fuzzy Filtering Algorithms for Image Processing: Performance Evaluation of Various Approaches

Fuzzy Filtering Algorithms for Image Processing: Performance Evaluation of Various Approaches Proceedngs of the Internatonal Conference on Cognton and Recognton Fuzzy Flterng Algorthms for Image Processng: Performance Evaluaton of Varous Approaches Rajoo Pandey and Umesh Ghanekar Department of

More information

EECS 730 Introduction to Bioinformatics Sequence Alignment. Luke Huan Electrical Engineering and Computer Science

EECS 730 Introduction to Bioinformatics Sequence Alignment. Luke Huan Electrical Engineering and Computer Science EECS 730 Introducton to Bonformatcs Sequence Algnment Luke Huan Electrcal Engneerng and Computer Scence http://people.eecs.ku.edu/~huan/ HMM Π s a set of states Transton Probabltes a kl Pr( l 1 k Probablty

More information

APPLICATION OF MULTIVARIATE LOSS FUNCTION FOR ASSESSMENT OF THE QUALITY OF TECHNOLOGICAL PROCESS MANAGEMENT

APPLICATION OF MULTIVARIATE LOSS FUNCTION FOR ASSESSMENT OF THE QUALITY OF TECHNOLOGICAL PROCESS MANAGEMENT 3. - 5. 5., Brno, Czech Republc, EU APPLICATION OF MULTIVARIATE LOSS FUNCTION FOR ASSESSMENT OF THE QUALITY OF TECHNOLOGICAL PROCESS MANAGEMENT Abstract Josef TOŠENOVSKÝ ) Lenka MONSPORTOVÁ ) Flp TOŠENOVSKÝ

More information

12/2/2009. Announcements. Parametric / Non-parametric. Case-Based Reasoning. Nearest-Neighbor on Images. Nearest-Neighbor Classification

12/2/2009. Announcements. Parametric / Non-parametric. Case-Based Reasoning. Nearest-Neighbor on Images. Nearest-Neighbor Classification Introducton to Artfcal Intellgence V22.0472-001 Fall 2009 Lecture 24: Nearest-Neghbors & Support Vector Machnes Rob Fergus Dept of Computer Scence, Courant Insttute, NYU Sldes from Danel Yeung, John DeNero

More information

Online Detection and Classification of Moving Objects Using Progressively Improving Detectors

Online Detection and Classification of Moving Objects Using Progressively Improving Detectors Onlne Detecton and Classfcaton of Movng Objects Usng Progressvely Improvng Detectors Omar Javed Saad Al Mubarak Shah Computer Vson Lab School of Computer Scence Unversty of Central Florda Orlando, FL 32816

More information

Wavefront Reconstructor

Wavefront Reconstructor A Dstrbuted Smplex B-Splne Based Wavefront Reconstructor Coen de Vsser and Mchel Verhaegen 14-12-201212 2012 Delft Unversty of Technology Contents Introducton Wavefront reconstructon usng Smplex B-Splnes

More information

Scheduling Remote Access to Scientific Instruments in Cyberinfrastructure for Education and Research

Scheduling Remote Access to Scientific Instruments in Cyberinfrastructure for Education and Research Schedulng Remote Access to Scentfc Instruments n Cybernfrastructure for Educaton and Research Je Yn 1, Junwe Cao 2,3,*, Yuexuan Wang 4, Lanchen Lu 1,3 and Cheng Wu 1,3 1 Natonal CIMS Engneerng and Research

More information

Load-Balanced Anycast Routing

Load-Balanced Anycast Routing Load-Balanced Anycast Routng Chng-Yu Ln, Jung-Hua Lo, and Sy-Yen Kuo Department of Electrcal Engneerng atonal Tawan Unversty, Tape, Tawan sykuo@cc.ee.ntu.edu.tw Abstract For fault-tolerance and load-balance

More information

A Fast Content-Based Multimedia Retrieval Technique Using Compressed Data

A Fast Content-Based Multimedia Retrieval Technique Using Compressed Data A Fast Content-Based Multmeda Retreval Technque Usng Compressed Data Borko Furht and Pornvt Saksobhavvat NSF Multmeda Laboratory Florda Atlantc Unversty, Boca Raton, Florda 3343 ABSTRACT In ths paper,

More information

For instance, ; the five basic number-sets are increasingly more n A B & B A A = B (1)

For instance, ; the five basic number-sets are increasingly more n A B & B A A = B (1) Secton 1.2 Subsets and the Boolean operatons on sets If every element of the set A s an element of the set B, we say that A s a subset of B, or that A s contaned n B, or that B contans A, and we wrte A

More information

Collaboratively Regularized Nearest Points for Set Based Recognition

Collaboratively Regularized Nearest Points for Set Based Recognition Academc Center for Computng and Meda Studes, Kyoto Unversty Collaboratvely Regularzed Nearest Ponts for Set Based Recognton Yang Wu, Mchhko Mnoh, Masayuk Mukunok Kyoto Unversty 9/1/013 BMVC 013 @ Brstol,

More information

A New Feature of Uniformity of Image Texture Directions Coinciding with the Human Eyes Perception 1

A New Feature of Uniformity of Image Texture Directions Coinciding with the Human Eyes Perception 1 A New Feature of Unformty of Image Texture Drectons Concdng wth the Human Eyes Percepton Xng-Jan He, De-Shuang Huang, Yue Zhang, Tat-Mng Lo 2, and Mchael R. Lyu 3 Intellgent Computng Lab, Insttute of Intellgent

More information

Associative Based Classification Algorithm For Diabetes Disease Prediction

Associative Based Classification Algorithm For Diabetes Disease Prediction Internatonal Journal of Engneerng Trends and Technology (IJETT) Volume-41 Number-3 - November 016 Assocatve Based Classfcaton Algorthm For Dabetes Dsease Predcton 1 N. Gnana Deepka, Y.surekha, 3 G.Laltha

More information

An Optimal Algorithm for Prufer Codes *

An Optimal Algorithm for Prufer Codes * J. Software Engneerng & Applcatons, 2009, 2: 111-115 do:10.4236/jsea.2009.22016 Publshed Onlne July 2009 (www.scrp.org/journal/jsea) An Optmal Algorthm for Prufer Codes * Xaodong Wang 1, 2, Le Wang 3,

More information

NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS

NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS ARPN Journal of Engneerng and Appled Scences 006-017 Asan Research Publshng Network (ARPN). All rghts reserved. NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS Igor Grgoryev, Svetlana

More information

NAG Fortran Library Chapter Introduction. G10 Smoothing in Statistics

NAG Fortran Library Chapter Introduction. G10 Smoothing in Statistics Introducton G10 NAG Fortran Lbrary Chapter Introducton G10 Smoothng n Statstcs Contents 1 Scope of the Chapter... 2 2 Background to the Problems... 2 2.1 Smoothng Methods... 2 2.2 Smoothng Splnes and Regresson

More information

Implementation Naïve Bayes Algorithm for Student Classification Based on Graduation Status

Implementation Naïve Bayes Algorithm for Student Classification Based on Graduation Status Internatonal Journal of Appled Busness and Informaton Systems ISSN: 2597-8993 Vol 1, No 2, September 2017, pp. 6-12 6 Implementaton Naïve Bayes Algorthm for Student Classfcaton Based on Graduaton Status

More information

SVM-based Learning for Multiple Model Estimation

SVM-based Learning for Multiple Model Estimation SVM-based Learnng for Multple Model Estmaton Vladmr Cherkassky and Yunqan Ma Department of Electrcal and Computer Engneerng Unversty of Mnnesota Mnneapols, MN 55455 {cherkass,myq}@ece.umn.edu Abstract:

More information

Problem Definitions and Evaluation Criteria for Computational Expensive Optimization

Problem Definitions and Evaluation Criteria for Computational Expensive Optimization Problem efntons and Evaluaton Crtera for Computatonal Expensve Optmzaton B. Lu 1, Q. Chen and Q. Zhang 3, J. J. Lang 4, P. N. Suganthan, B. Y. Qu 6 1 epartment of Computng, Glyndwr Unversty, UK Faclty

More information

A User Selection Method in Advertising System

A User Selection Method in Advertising System Int. J. Communcatons, etwork and System Scences, 2010, 3, 54-58 do:10.4236/jcns.2010.31007 Publshed Onlne January 2010 (http://www.scrp.org/journal/jcns/). A User Selecton Method n Advertsng System Shy

More information

High-Boost Mesh Filtering for 3-D Shape Enhancement

High-Boost Mesh Filtering for 3-D Shape Enhancement Hgh-Boost Mesh Flterng for 3-D Shape Enhancement Hrokazu Yagou Λ Alexander Belyaev y Damng We z Λ y z ; ; Shape Modelng Laboratory, Unversty of Azu, Azu-Wakamatsu 965-8580 Japan y Computer Graphcs Group,

More information

Steps for Computing the Dissimilarity, Entropy, Herfindahl-Hirschman and. Accessibility (Gravity with Competition) Indices

Steps for Computing the Dissimilarity, Entropy, Herfindahl-Hirschman and. Accessibility (Gravity with Competition) Indices Steps for Computng the Dssmlarty, Entropy, Herfndahl-Hrschman and Accessblty (Gravty wth Competton) Indces I. Dssmlarty Index Measurement: The followng formula can be used to measure the evenness between

More information

Recommended Items Rating Prediction based on RBF Neural Network Optimized by PSO Algorithm

Recommended Items Rating Prediction based on RBF Neural Network Optimized by PSO Algorithm Recommended Items Ratng Predcton based on RBF Neural Network Optmzed by PSO Algorthm Chengfang Tan, Cayn Wang, Yuln L and Xx Q Abstract In order to mtgate the data sparsty and cold-start problems of recommendaton

More information

Type-2 Fuzzy Non-uniform Rational B-spline Model with Type-2 Fuzzy Data

Type-2 Fuzzy Non-uniform Rational B-spline Model with Type-2 Fuzzy Data Malaysan Journal of Mathematcal Scences 11(S) Aprl : 35 46 (2017) Specal Issue: The 2nd Internatonal Conference and Workshop on Mathematcal Analyss (ICWOMA 2016) MALAYSIAN JOURNAL OF MATHEMATICAL SCIENCES

More information

MULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION

MULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION MULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION Paulo Quntlano 1 & Antono Santa-Rosa 1 Federal Polce Department, Brasla, Brazl. E-mals: quntlano.pqs@dpf.gov.br and

More information

Sum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints

Sum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints Australan Journal of Basc and Appled Scences, 2(4): 1204-1208, 2008 ISSN 1991-8178 Sum of Lnear and Fractonal Multobjectve Programmng Problem under Fuzzy Rules Constrants 1 2 Sanjay Jan and Kalash Lachhwan

More information

Local Quaternary Patterns and Feature Local Quaternary Patterns

Local Quaternary Patterns and Feature Local Quaternary Patterns Local Quaternary Patterns and Feature Local Quaternary Patterns Jayu Gu and Chengjun Lu The Department of Computer Scence, New Jersey Insttute of Technology, Newark, NJ 0102, USA Abstract - Ths paper presents

More information

Outline. Self-Organizing Maps (SOM) US Hebbian Learning, Cntd. The learning rule is Hebbian like:

Outline. Self-Organizing Maps (SOM) US Hebbian Learning, Cntd. The learning rule is Hebbian like: Self-Organzng Maps (SOM) Turgay İBRİKÇİ, PhD. Outlne Introducton Structures of SOM SOM Archtecture Neghborhoods SOM Algorthm Examples Summary 1 2 Unsupervsed Hebban Learnng US Hebban Learnng, Cntd 3 A

More information

Simulation Based Analysis of FAST TCP using OMNET++

Simulation Based Analysis of FAST TCP using OMNET++ Smulaton Based Analyss of FAST TCP usng OMNET++ Umar ul Hassan 04030038@lums.edu.pk Md Term Report CS678 Topcs n Internet Research Sprng, 2006 Introducton Internet traffc s doublng roughly every 3 months

More information

Helsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr)

Helsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr) Helsnk Unversty Of Technology, Systems Analyss Laboratory Mat-2.08 Independent research projects n appled mathematcs (3 cr) "! #$&% Antt Laukkanen 506 R ajlaukka@cc.hut.f 2 Introducton...3 2 Multattrbute

More information

TN348: Openlab Module - Colocalization

TN348: Openlab Module - Colocalization TN348: Openlab Module - Colocalzaton Topc The Colocalzaton module provdes the faclty to vsualze and quantfy colocalzaton between pars of mages. The Colocalzaton wndow contans a prevew of the two mages

More information

Related-Mode Attacks on CTR Encryption Mode

Related-Mode Attacks on CTR Encryption Mode Internatonal Journal of Network Securty, Vol.4, No.3, PP.282 287, May 2007 282 Related-Mode Attacks on CTR Encrypton Mode Dayn Wang, Dongda Ln, and Wenlng Wu (Correspondng author: Dayn Wang) Key Laboratory

More information

Spam Filtering Based on Support Vector Machines with Taguchi Method for Parameter Selection

Spam Filtering Based on Support Vector Machines with Taguchi Method for Parameter Selection E-mal Spam Flterng Based on Support Vector Machnes wth Taguch Method for Parameter Selecton We-Chh Hsu, Tsan-Yng Yu E-mal Spam Flterng Based on Support Vector Machnes wth Taguch Method for Parameter Selecton

More information

FEATURE EXTRACTION. Dr. K.Vijayarekha. Associate Dean School of Electrical and Electronics Engineering SASTRA University, Thanjavur

FEATURE EXTRACTION. Dr. K.Vijayarekha. Associate Dean School of Electrical and Electronics Engineering SASTRA University, Thanjavur FEATURE EXTRACTION Dr. K.Vjayarekha Assocate Dean School of Electrcal and Electroncs Engneerng SASTRA Unversty, Thanjavur613 41 Jont Intatve of IITs and IISc Funded by MHRD Page 1 of 8 Table of Contents

More information

The Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique

The Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique //00 :0 AM Outlne and Readng The Greedy Method The Greedy Method Technque (secton.) Fractonal Knapsack Problem (secton..) Task Schedulng (secton..) Mnmum Spannng Trees (secton.) Change Money Problem Greedy

More information

A Multivariate Analysis of Static Code Attributes for Defect Prediction

A Multivariate Analysis of Static Code Attributes for Defect Prediction Research Paper) A Multvarate Analyss of Statc Code Attrbutes for Defect Predcton Burak Turhan, Ayşe Bener Department of Computer Engneerng, Bogazc Unversty 3434, Bebek, Istanbul, Turkey {turhanb, bener}@boun.edu.tr

More information

Mathematics 256 a course in differential equations for engineering students

Mathematics 256 a course in differential equations for engineering students Mathematcs 56 a course n dfferental equatons for engneerng students Chapter 5. More effcent methods of numercal soluton Euler s method s qute neffcent. Because the error s essentally proportonal to the

More information

Parallel matrix-vector multiplication

Parallel matrix-vector multiplication Appendx A Parallel matrx-vector multplcaton The reduced transton matrx of the three-dmensonal cage model for gel electrophoress, descrbed n secton 3.2, becomes excessvely large for polymer lengths more

More information