Improving Combination Methods of Neural Classifiers Using NCL
International Journal of Computer Information Systems and Industrial Management Applications, Volume 4. MIR Labs.

Improving Combination Methods of Neural Classifiers Using NCL

Arash Iranzad, Saeed Masoudnia, Fatemeh Cheraghchi, Abbas Nowzari-Dalini and Reza Ebrahimpour

Department of Computer Science, School of Mathematics, Statistics and Computer Science, University of Tehran, Tehran, Iran
{ranzad}@sad.ut.r, {masoudna; xeraghch; nowzar}@ut.ac.r

Electrical and Computer Engineering Department, Shahid Rajaee University, Tehran, Iran
ebrahmpour@pm.r

Abstract: In this paper the effect of the diversity caused by Negative Correlation Learning (NCL) on the combination of neural classifiers is investigated, and an efficient way to improve combining performance is presented. Decision Templates and Averaging, as two non-trainable combining methods, and Stacked Generalization, as a trainable combiner, are selected as base ensemble learners, and the NCL versions of these methods are compared with them in our experiments. Using NCL to diversify the base classifiers leads to significantly better results in all of the employed combining methods. Experimental results on five datasets from the UCI Machine Learning Repository indicate that, by employing NCL, the performance of the ensemble can be more favorable than that of an ensemble using independently trained base classifiers.

Keywords: Neural Networks, Combining Classifiers, Negative Correlation Learning.

I. Introduction

Combining multiple classifiers is a machine learning method that uses a set of base classifiers and combines their outputs to produce a final output. This method is known under different names in the literature, such as classifier ensembles, fusion of learners, mixtures of experts, and multiple classifier systems. From a general point of view, a classifier ensemble is a data classification system made up of a set of base classifiers whose outputs on an input sample are combined in some way to obtain a final classification [1, 2, 3]. A classifier ensemble may generate more accurate output than any of its base classifiers.
It has been proved that classifier ensembles are a suitable way to improve data classification performance, particularly for complex problems such as those involving a limited number of sample patterns, high-dimensional feature sets, and highly overlapped classes [4-8]. Combining the decisions of multiple classifiers is a promising solution for obtaining acceptable classification performance [9-12]. By using a proper strategy for constructing the ensemble network, it can be successfully applied to classification problems with imprecise and uncertain information. An artificial neural network (ANN), commonly called a neural network (NN), is one of the most common classifiers used in multiple classifier systems. An ANN is an information processing system inspired by biological nervous systems that approximates the operation of the human brain; i.e., an ANN is a mathematical and computational model that tries to simulate the structure, functional aspects, and behavior of biological neural networks. Neural networks have been a major technique in the decision-making and classification phases of problem solving in recent years [13-16]. An ANN is able to learn and generalize from samples, and tries to produce meaningful outputs for data it has never seen; it can solve problems even when the input data contain errors or are incomplete. In most cases, an ANN is an adaptive system that changes its topology based on external or internal information flowing through the network during the training phase. Neural networks belong to the class of non-linear statistical modeling machines. They can be used to model complex relationships between input and output data, or to find patterns in input data. A neural network may be trained to perform classification, estimation, approximation, simulation, and prediction; i.e., an NN can be configured for a specific application, such as pattern finding, data classification, or function approximation, through the learning process.
Neural network ensemble methods have two major components: a method to create base experts and a method for combining the outputs of the base experts [17]. Both theoretical and experimental studies [18, 19] have shown that the combining procedure is most effective when the experts'
estimates are negatively correlated; it is moderately effective when the experts are uncorrelated, and only mildly effective when the experts are positively correlated. Therefore, better generalization ability can be obtained by combining the outputs of experts that are accurate and whose errors are negatively correlated [20]. A large number of combining schemes for classifier ensembles exist. From a general viewpoint, there are two types of combination strategy: classifier selection and classifier fusion [21]. The presumption in classifier selection is that each classifier is an expert in some local area of the feature space, and one or more classifiers are selected to assign the label of an input sample falling in that area. Recall from [22] that classifier fusion assumes that all classifiers are trained over the whole feature space and are thereby considered competitive rather than complementary. Classifier selection has not attracted as much attention as classifier fusion; however, classifier selection is probably the better of the two strategies if trained well. There also exist schemes between the two pure strategies [22]. One such scheme, for example, takes the average of the outputs with coefficients that depend on the input x. Thus the local competence of the classifiers with respect to x is measured by the weights; more than one classifier is responsible for x, and the outputs of all responsible classifiers are fused. The mixture of experts is an example of a method between selection and fusion [11, 23]. From another point of view, some combining methods do not need training after the classifiers in the ensemble have been trained individually; the Averaging combiner [24] and Decision Templates (DT) [25] are two examples of this group. Other combiners need additional training, for example the weighted average combiner and Stacked Generalization (SG) [26]. The first group is called non-trainable and the second trainable [27, 21].
The Averaging method is a relatively simple combining method in the non-trainable group. Each classifier is trained, and the outputs for each class are combined by averaging; commonly, the final result is obtained by simply averaging the estimated posterior probabilities of the base classifiers. This simple method gives very good results for some problems [24], which is slightly surprising considering that averaging posterior probabilities has little theoretical support. DTs are another non-trainable combining technique in the fusion category. A DT is a robust classifier ensemble scheme in which the outputs of the classifiers are combined by comparing them to a characteristic template for each class. The DT method uses the outputs of all classifiers for all classes to calculate the final support for each class, in contrast to most other ensemble methods, which use only the support for a particular class to make their decision [27]. SG is a trainable combiner in the fusion category; it is a way of combining multiple classifiers that have been trained for a classification task. In this method, the output patterns of an ensemble of trained experts serve as input to a second-level expert. The first step is to collect the output of each first-level classifier into a new data set; for each instance in the original training set, this data set represents every classifier's prediction. The new data are treated as the input for another classifier, and in the second step a learning algorithm is employed to train it [26]. In ensemble research it is widely believed that the success of ensemble algorithms depends on both the accuracy of and the diversity among the individual learners in the ensemble, as demonstrated by theoretical [28, 29] and empirical studies [30]. In general, the component learners in an ensemble are designed to be accurate yet diverse. Empirical results show that the performance of an ensemble algorithm is related to the diversity among the base classifiers, and better performance may be achieved with more diversity [31-36].
Much related research on the analysis and application of diversity has been conducted [37-40]. There are various approaches to constructing diverse base classifiers: different learning algorithms, different representations of the patterns, and partitioning of the training set are some of them. Negative Correlation Learning (NCL) is a method based on a different learning approach to create diversity between base neural classifiers [30, 39, 41]. It adds a penalty term to the error function that helps make the base classifiers as different from each other as possible, while still encouraging their accuracy. In this paper, the effect of the diversity created by NCL on combination methods of neural networks is investigated. Several fusion methods, namely DTs, SG, and Averaging, are employed as combining methods. The experimental results show that diversifying the base classifiers with NCL enhances the performance of all of these combining methods.

II. Combining Methods

Learning is a process in which different procedures lead to different performance; feature extraction methods also influence performance. There is no single algorithm that achieves the best performance with a small amount of data, and each classification method, depending on its underlying assumptions, performs differently from the others. The combination of multiple classifiers has been shown to be a suitable way to improve prediction performance in difficult classification problems [42]. It has become clear that, for more complicated datasets, various types of combining rules can improve the final classification. Complexity in classification can arise from a limited number of samples, overlapping classes, and noise in the data. Results from [18] show that combining classifiers is useful when the classifiers have small error rates and are independent in their decision making. There are many ways to combine the results of a set of classifiers, depending on the type of the classifiers' output [25]. In this section, the well-known fusion methods used in this paper are described.

A.
Decision Templates

DT is a fusion approach derived from fuzzy templates [4]. The general schema of a DT ensemble is shown in Figure 1. Assume that {D_1, D_2, ..., D_L} is a set of L classifiers that classify the sample set S = {(x_1, t_1), (x_2, t_2), ..., (x_N, t_N)} into C classes. The outputs of these classifiers for an input x can be arranged as an L x C matrix called the decision profile (DP), whose structure is defined as follows:

             | d_{1,1}(x)  ...  d_{1,C}(x) |
    DP(x) =  |    ...      ...     ...     |        (1)
             | d_{L,1}(x)  ...  d_{L,C}(x) |
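As a concrete sketch of the decision-profile construction above and of the Decision Template method's class-wise template averaging and similarity scoring, the following minimal code can be used. It assumes soft classifier outputs in [0, 1]; all function names are our own illustration, not the paper's code.

```python
import numpy as np

def decision_profile(classifier_outputs):
    """Stack the soft outputs of L classifiers into an L x C decision profile."""
    return np.vstack(classifier_outputs)

def fit_decision_templates(profiles, labels, n_classes):
    """DT_j = mean decision profile over the training samples of class j.

    `profiles` is an array of shape (N, L, C): one decision profile per sample.
    """
    return np.array([profiles[labels == j].mean(axis=0) for j in range(n_classes)])

def dt_classify(profile, templates):
    """Support for class j = similarity between DP(x) and DT_j
    (1 minus the mean absolute entry-wise difference)."""
    L, C = profile.shape
    sims = 1.0 - np.abs(templates - profile).sum(axis=(1, 2)) / (L * C)
    return int(np.argmax(sims))
```

A test sample is then assigned to the class whose template is most similar to the sample's decision profile.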
Figure 1. Decision Template schema.

In this matrix, row i shows the output of classifier D_i, and column j shows the support of classifiers D_1, ..., D_L for class j; therefore entry d_{i,j} indicates the support of classifier i for class j. The decision template of a class is the average of the decision profiles obtained from the training samples belonging to that class. The (k, s)-th element of the DT matrix for class j is calculated by

    dt_j(k, s) = [ Sum_{q=1}^{N} Ind(x_q, j) d_{k,s}(x_q) ] / [ Sum_{q=1}^{N} Ind(x_q, j) ],        (2)

where x_q is the q-th element of the training dataset and Ind(x_q, j) is an indicator function whose value is 1 if x_q belongs to class j, and 0 otherwise. During the testing phase, the DT scheme compares the DP of the input sample with all of the DTs and assigns each class a support equal to the similarity between its DT and the DP. Various similarity measures can be applied; the following function is used in this paper:

    H(DT_j, DP(x)) = 1 - (1 / (L C)) Sum_{k=1}^{L} Sum_{s=1}^{C} | DT_j(k, s) - d_{k,s}(x) |.        (3)

B. Stacked Generalization

In the stacked generalization method, the output patterns of an ensemble of trained base experts serve as input to a second-level expert. The general framework of this method consists of two levels (see Figure 2). The training algorithm of this modular ensemble can be summarized as follows.

1) From the N samples of dataset S, leave out one test sample, and train each base classifier of level-0 on the remaining N - 1 samples.
2) Produce a prediction for the left-out sample. The output pattern y = [y_1, y_2, ..., y_K] of the level-0 base classifiers for this sample, together with its target t, becomes a training sample for the level-1 classifier.
3) Repeat Steps 1 and 2 in a leave-one-out methodology. This process produces a training set D with N samples, which is used to train the level-1 classifier.
4) To create the final learner system, all of the base classifiers of level-0 are trained once more using all N samples in S.
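The leave-one-out training procedure described above can be sketched as follows. The paper's level-0 and level-1 learners are neural networks; here a tiny nearest-centroid learner stands in for both, purely to keep the sketch self-contained, and all names are our own.

```python
import numpy as np

class NearestCentroid:
    """Tiny stand-in learner (the paper uses MLPs at both levels)."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict_soft(self, X):
        # Soft support per class from the distance to each class centroid.
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        s = np.exp(-d)
        return s / s.sum(axis=1, keepdims=True)

def train_stacked(X, y, n_base=2):
    """Steps 1-3: leave-one-out level-0 outputs form the level-1 training set.
    Step 4: retrain the level-0 classifiers on the full training set."""
    n = len(X)
    rows = []
    for i in range(n):
        mask = np.arange(n) != i
        bases = [NearestCentroid().fit(X[mask], y[mask]) for _ in range(n_base)]
        rows.append(np.hstack([b.predict_soft(X[i:i + 1]).ravel() for b in bases]))
    meta = NearestCentroid().fit(np.vstack(rows), y)   # level-1 combiner
    bases = [NearestCentroid().fit(X, y) for _ in range(n_base)]
    return bases, meta

def predict_stacked(bases, meta, X):
    Z = np.hstack([b.predict_soft(X) for b in bases])
    return meta.predict_soft(Z).argmax(axis=1)
```

The key point the sketch preserves is that the level-1 learner is trained only on out-of-sample level-0 predictions, so it learns how the ensemble behaves on unseen data rather than on memorized training outputs.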
The first level, level-0, is formed by the base classifiers, which are trained using the input data and the target outputs. The output of level-0 is then used as the input data to level-1. As shown in Figure 2, for a training set S = {(x_1, t_1), (x_2, t_2), ..., (x_N, t_N)}, a set of K level-0 base neural networks is arranged as the first layer and trained on S. Then the outputs of the first-layer networks are combined using a level-1 network. In other words, first the level-0 networks are trained using the input data and the target outputs; then the outputs of the first layer, with the corresponding target classes, are used to train the level-1 network.

Figure 2. Architecture of Stacked Generalization.

C. Averaging

The Averaging method is a relatively simple method of combining models. Each classifier is trained, and the outputs for each class are combined by averaging; once the classifiers in the ensemble are trained, they require no further training. The Averaging method is used when each classifier produces a confidence estimate (e.g., a posterior probability). In this case the winning class is the class with the highest average posterior probability across the ensemble. Let
S = {(x_1, t_1), (x_2, t_2), ..., (x_N, t_N)} be the training set. First, the set of classifiers is trained on this set. Then, in the Averaging combination, the class label of a test sample x is obtained by

    u(x) = f(y_1, ..., y_L),        (4)

where y_i is the output of classifier i for the test sample x, f is the averaging operation, and u(x) is the class label assigned to pattern x.

III. Negative Correlation Learning in Combining Methods

In this section a short explanation of Negative Correlation Learning (NCL) is given first; the ensemble scheme that uses NCL for designing neural network ensembles is then presented.

A. Negative Correlation Learning

Most methods for designing neural network ensembles train the individual neural networks independently of each other, or sequentially. One disadvantage of such methods is the loss of interaction among the individual networks during the learning process. It is desirable to encourage different individual networks to learn different parts of the training set, so that the ensemble learns the whole training set better. NCL, introduced in Section I, supports this aim and trains the ensemble simultaneously and interactively. The method adds a correlation penalty term to the error function of each individual network; during training, each network interacts with the other networks through the penalty terms in their error functions [30, 43]. Suppose that there is a training set S = {(x_1, t_1), (x_2, t_2), ..., (x_N, t_N)}, where x is an n-dimensional pattern in R^n, t is a scalar target output, and N is the size of the training set. The assumption that the output t is a scalar is made merely to simplify the exposition, without loss of generality.
NCL estimates the output by forming an ensemble whose final output is a simple average of the outputs of the individual networks,

    y(n) = (1/L) Sum_{i=1}^{L} y_i(n),        (5)

where L is the number of individual networks in the ensemble, y_i(n) is the output of network i on the n-th training pattern, and y(n) is the output of the ensemble on the same pattern. The general architecture of NCL is shown in Figure 3. The error function E_i for network i in negative correlation learning is defined by

    E_i = (1/N) Sum_{n=1}^{N} e_i(n) = (1/N) Sum_{n=1}^{N} [ (1/2)(y_i(n) - t(n))^2 + lambda p_i(n) ],        (6)

where t(n) is the desired output and p_i(n) is the correlation penalty function for the n-th pattern. The penalty term p_i(n) is added to the error function of each individual network, and lambda is a weighting parameter on the penalty function. The lambda parameter controls a trade-off between the objective and penalty functions; when lambda = 0, the penalty is omitted and we have an ensemble in which each network is trained independently of the others, using the backpropagation algorithm. All networks are trained simultaneously and interactively on the same training set, and the penalty term has the following form:

    p_i(n) = (y_i(n) - y(n)) Sum_{k != i} (y_k(n) - y(n)) = -(y_i(n) - y(n))^2.        (7)

The second equality follows from the fact that

    Sum_{i=1}^{L} (y_i(n) - y(n)) = 0,        (8)

i.e., the sum of deviations around a mean is zero. Minimization of (6) implies that each network has to minimize both the difference between the target output and its actual output and the penalty term. Minimization of the penalty term in (7) results in maximization of the distance between the individual network's output and the ensemble average; this is clear from the second form of (7), since there is a negative sign before the squared distance term. The penalty term therefore causes each network to act functionally differently, and useful diversity among the networks can be expected. Since NCL incorporates a diversity measurement directly into the error function, it is considered an explicit ensemble method [43, 44].
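The NCL update implied by the error function with the correlation penalty can be sketched with linear members in place of the paper's MLPs (a simplification of ours). Following Liu and Yao's derivation, the ensemble mean is treated as constant with respect to each member, giving the per-pattern gradient dE_i/dy_i = (y_i - t) - lambda (y_i - ybar).

```python
import numpy as np

def train_ncl(X, t, L=5, lam=0.5, lr=0.05, epochs=200, seed=0):
    """Train L linear members jointly on e_i = 0.5*(y_i - t)^2 + lam*p_i,
    with p_i = -(y_i - ybar)^2, via gradient descent on
    dE_i/dy_i = (y_i - t) - lam*(y_i - ybar)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(L, X.shape[1]))      # one weight vector per member
    for _ in range(epochs):
        Y = X @ W.T                           # member outputs, shape (n, L)
        ybar = Y.mean(axis=1, keepdims=True)  # ensemble output
        G = (Y - t[:, None]) - lam * (Y - ybar)
        W -= lr * (G.T @ X) / len(X)
    return W

def ensemble_diversity(X, W):
    """Mean squared deviation of member outputs from the ensemble mean."""
    Y = X @ W.T
    return float(((Y - Y.mean(axis=1, keepdims=True)) ** 2).mean())
```

With lam = 0 all members collapse toward the same least-squares solution; with lam > 0 the penalty preserves spread among the members while the ensemble mean still fits the target, which is exactly the accuracy-diversity trade-off described in the text.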
Brown et al. [44] showed that NCL can be viewed as a technique derived from the ambiguity decomposition. The ambiguity decomposition, given by Eq. (9), is widely recognized as one of the most important theoretical results obtained for ensemble learning:

    (ybar - t)^2 = Sum_i w_i (y_i - t)^2 - Sum_i w_i (y_i - ybar)^2.        (9)

This equation states that the mean squared error (MSE) of the ensemble is guaranteed to be less than or equal to the weighted average MSE of the ensemble members. Here t is the target value of an arbitrary data point, w_i >= 0 with Sum_i w_i = 1, and ybar is the convex combination of the L ensemble members:

    ybar = Sum_i w_i y_i.        (10)
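The ambiguity decomposition holds as an algebraic identity for any convex weighting; a quick numerical check (our own illustration):

```python
import numpy as np

def ambiguity_gap(y, w, t):
    """Difference between the two sides of the ambiguity decomposition:
    (ybar - t)^2  versus  sum_i w_i (y_i - t)^2 - sum_i w_i (y_i - ybar)^2."""
    ybar = w @ y
    lhs = (ybar - t) ** 2
    rhs = (w * (y - t) ** 2).sum() - (w * (y - ybar) ** 2).sum()
    return lhs - rhs

rng = np.random.default_rng(0)
y = rng.normal(size=5)   # member outputs at one data point
w = rng.random(5)
w /= w.sum()             # convex weights: w_i >= 0, sum w_i = 1
print(abs(ambiguity_gap(y, w, t=0.3)))   # zero up to floating-point rounding
```

Since the ambiguity term is always non-negative, the ensemble error never exceeds the weighted average member error, which is the guarantee stated in the text.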
Figure 3. Architecture of Negative Correlation Learning.

The ambiguity decomposition provides a simple expression for the effect of error correlations in the NCL ensemble method. The decomposition is composed of two terms. The first term, Sum_i w_i (y_i - t)^2, is the weighted average error of the individual networks. The second term, Sum_i w_i (y_i - ybar)^2, referred to as the ambiguity, measures the amount of variability among the ensemble members and can be considered a correlation measure of the individual networks. The coefficient lambda allows us to vary the emphasis on the correlation component to reach a near-optimum balance in the trade-off between these two terms; this can be regarded as an accuracy-diversity trade-off [39, 44]. Hansen [28, 45] showed that there is a relationship between the bias-variance-covariance decomposition and the ambiguity decomposition, in which portions of the terms of the first decomposition correspond to portions of the terms of the second. Therefore, any attempt to strike a balance between the two ambiguity decomposition terms also balances the three components of the other decomposition against each other, and the MSE tends to a near-minimum condition.

B. Using NCL to Diversify the Base Classifiers of a Combination

Combining classifiers is a common way of increasing classification performance. The technique is helpful on its own, but its effect can be increased by using base classifiers that have been diversified by a diversity method. The proposed method makes the classifiers diverse with NCL and then combines them with a combining method. It will be shown that this approach is successful when Decision Templates, Stacked Generalization, and Averaging are used as the combining methods.

IV. Experimental Results

To evaluate the proposed method, we experimentally compare Decision Templates, Stacked Generalization, and Averaging with the NCL versions of these ensembles. We test the combination rules using real data sets; for ease of comparison, five main benchmark classification datasets are used.
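The two-step method above (diversify with NCL, then combine) can be sketched end-to-end with linear members and the Averaging combiner on a toy two-class problem. The member type, the data, and all names are our own simplifications, not the paper's setup.

```python
import numpy as np

def ncl_then_average(Xtr, ttr, Xte, L=5, lam=0.5, lr=0.05, epochs=300, seed=0):
    """Step 1: train the members jointly with the NCL penalty.
    Step 2: combine by simple averaging of the member outputs."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(L, Xtr.shape[1]))
    for _ in range(epochs):
        Y = Xtr @ W.T
        ybar = Y.mean(axis=1, keepdims=True)
        W -= lr * ((((Y - ttr[:, None]) - lam * (Y - ybar)).T @ Xtr) / len(Xtr))
    scores = (Xte @ W.T).mean(axis=1)   # Averaging combiner
    return (scores > 0.5).astype(int)   # two-class decision on 0/1 targets

# Toy linearly separable data: one feature plus a bias column.
X = np.array([[-1.2, 1.0], [-1.0, 1.0], [-0.8, 1.0],
              [0.8, 1.0], [1.0, 1.0], [1.2, 1.0]])
t = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
print(ncl_then_average(X, t, X))   # -> [0 0 0 1 1 1]
```

Swapping the final two lines of the function for a Decision Template or Stacked Generalization combiner yields the other two variants compared in the experiments.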
These datasets are Sat-image, Vehicle, Pima, Breast, and Sonar from the UCI Machine Learning Repository [46]. Information on these datasets is shown in Table 1, and a brief review of each dataset is given below.

Dataset     Attributes  Classes  Training size  Test size
Sat-image   36          6        -              -
Vehicle     18          4        -              -
Pima        8           2        -              -
Breast      32          2        -              -
Sonar       60          2        -              -

Table 1. Dataset information.

Sat-image. This data was generated from Landsat Multi-Spectral Scanner image data. It consists of 6435 pixels with 36 attributes, crisply classified into 6 classes: red soil (23.8%), cotton crop (10.9%), grey soil (21.0%), damp grey soil (9.73%), soil with vegetation stubble (10.99%), and very damp grey soil (23.43%). We used only features #17 to #20, as recommended by the database designers.

Vehicle. This data was generated from 2D images of various types of vehicles by applying an ensemble of shape feature extractors to the 2D silhouettes of the objects. It consists of 846 eighteen-dimensional vectors, the extracted features for four types of vehicle: OPEL, SAAB, BUS, and VAN. The four classes of this dataset have almost the same size.

Pima. This data is from the National Institute of Diabetes and Digestive and Kidney Diseases. All considered patients are females at least 21 years old of Pima Indian heritage. The feature vector of each patient is eight-dimensional, including both integer- and real-valued numbers. The dataset consists of 768 cases of two types: healthy people versus people who have diabetes.

Breast Cancer Wisconsin. This data was computed from a digitized image of a fine needle aspirate of a breast mass and describes characteristics of the cell nuclei present in the image. The dataset consists of 569 thirty-two-dimensional vectors of real-valued features computed for each cell nucleus. The two classes of this dataset represent two types of diagnosis: malignant and benign tumors.

Sonar. This dataset was obtained by bouncing sonar signals off cylinders made of two different materials, at various angles and under various conditions. It contains 111 patterns obtained by bouncing sonar signals off a metal cylinder and 97 patterns obtained from rocks under similar conditions. The transmitted sonar signal is a frequency-modulated chirp, rising in frequency. The dataset contains signals obtained from a variety of aspect angles, spanning 90 degrees for the cylinder and 180 degrees for the rock. Each pattern is a set of 60 numbers in the range 0.0 to 1.0, each representing the energy within a particular frequency band, integrated over a certain period of time. The dataset therefore consists of 208 sixty-dimensional real-valued vectors for the two groups of signals, bounced off metal and off rocks.

In all experiments there are five MLPs in the ensemble. The strength parameter of NCL was tested in the interval [0 : 0.1 : 1], 0.8 was chosen as the best value, and 50 epochs were used to train the ensemble. Weak classifiers, obtained by using neural networks of low complexity, were chosen as the base classifiers. A small sub-sample of each dataset was selected randomly as the training set, in order to show the generalization ability of the proposed method.
A small part of each dataset (about 10%) was selected for training. For some datasets, including Vehicle and Pima, this was not practical: they have many classes that cannot be trained well with that amount of training data, so more data (around 20%) had to be chosen as the training set. Likewise, since the Sonar dataset has few instances, the classifiers cannot be trained on 10 percent of the dataset, and more training data (around 20%) had to be chosen. The proposed method is expected to give good results on small-sample-size problems, which are difficult to classify with simple neural networks such as MLPs. The related tables of results are shown below. The results of the proposed method with Decision Templates as the combining method are presented in Table 2.

Dataset     Average of 5 base classifiers   DT   NCL-DT
Sat-image   (±3.9)                          -    -
Vehicle     6.07(±.9)                       -    -
Pima        63.4(±.34)                      -    -
Breast      (±.)                            -    -
Sonar       44.70(±.46)                     -    -

Table 2. Results of combining based on Decision Templates.

The results of the proposed method with the Stacked Generalization method are presented in Table 3.

Dataset     Average of 5 base classifiers   SG   NCL-SG
Sat-image   (±3.9)                          -    -
Vehicle     6.07(±.9)                       -    -
Pima        63.4(±.34)                      -    -
Breast      (±.)                            -    -
Sonar       44.70(±.46)                     -    -

Table 3. Results of Stacked Generalization.

The results of the proposed method with the Averaging combining method are presented in Table 4.

Dataset     Average of 5 base classifiers   Averaging   NCL-Averaging
Sat-image   (±3.9)                          -           -
Vehicle     6.07(±.9)                       -           -
Pima        63.4(±.34)                      -           -
Breast      (±.)                            -           -
Sonar       44.70(±.46)                     -           -

Table 4. Results of combining based on Averaging.

From the results shown in the tables, it is evident that NCL improved the performance of all investigated methods. The examples in the tables show that the classifiers classify a large fraction of the data correctly when NCL is used for diversity.
The results also show that, in the NCL version of each combining method, the independent views of the different classifiers can contribute significantly to correcting the output for some of the more difficult data.

V. Conclusion

The approach introduced in this paper is to use diverse neural network classifiers for combining; i.e., the main goal of this work was to investigate the effect of NCL on classifier combination. The proposed method has two steps: first, the base classifiers are diversified by NCL, and then a combining method is applied. Two non-trainable combining methods, Averaging and DTs, and a trainable combiner, SG, were investigated. Experimental results show that diverse base classifiers improve the performance of both trainable and non-trainable combining methods.

References

[1] L. Rokach, 2010, Ensemble-based classifiers, Artificial Intelligence Review, 33(1).
[2] T. Wilk and M. Wozniak, 2012, Soft computing methods applied to combination of one-class classifiers, Neurocomputing, 75(1).
[3] S. Bhardwaj, S. Srivastava, J. Gupta, and A. Srivastava, 2012, Chaotic time series prediction using combination of hidden Markov model and neural nets, International Journal of Computer Information Systems and Industrial Management Applications, 4.
[4] T. P. Tran, T. T. S. Nguyen, P. Tsai et al., 2011, BSPNN: boosted subspace probabilistic neural network for email security, Artificial Intelligence Review, 35(4).
[5] S. Kotsiantis, 2011, An incremental ensemble of classifiers, Artificial Intelligence Review, 36(4).
[6] S. Kotsiantis, 2011, Combining bagging, boosting, rotation forest and random subspace methods, Artificial Intelligence Review, 35(3).
[7] L. Rokach, 2010, Pattern Classification Using Ensemble Methods, World Scientific Pub. Co. Inc., Hackensack, NJ.
[8] Z. Yu-Quan, O. Ji-Shun, C. Geng, and Y. Hai-Ping, 2011, Dynamic weighting ensemble classifiers based on cross-validation, Neural Computing & Applications, 20(3).
[9] M. Tabassian, R. Ghaderi, and R. Ebrahimpour, 2012, Combination of multiple diverse classifiers using belief functions for handling data with imperfect labels, Expert Systems with Applications, 39.
[10] M. Tabassian, R. Ghaderi, and R. Ebrahimpour, 2011, Knitted fabric defect classification for uncertain labels based on Dempster-Shafer theory of evidence, Expert Systems with Applications, 38.
[11] R. Ebrahimpour, E. Kabir, and M. Yousefi, 2011, Improving mixture of experts for view-independent face recognition using teacher-directed learning, Machine Vision & Applications, 22.
[12] F. Behjati-Ardakani, F. Khademian, A. Nowzari-Dalini, and R. Ebrahimpour, 2011, Low resolution face recognition using mixture of experts, World Academy of Science, Engineering and Technology, 80.
[13] R. Ghazali, N. M. Nawi, and M. Z. M. Salikon, 2009, Forecasting the UK/EU and JP/UK trading signals using polynomial neural networks, International Journal of Computer Information Systems and Industrial Management Applications.
[14] P. Klement and V.
Snasel, 2010, SOM neural network - a piece of intelligence in disaster management, International Journal of Computer Information Systems and Industrial Management Applications.
[15] P. M. Ciarelli, E. Oliveira, C. Badue, and A. F. De Souza, 2009, Multi-label text categorization using a probabilistic neural network, International Journal of Computer Information Systems and Industrial Management Applications.
[16] M. Barakat, F. Druaux, D. Lefebvre, M. Khalil, and O. Mustapha, 2011, Self adaptive growing neural network classifier for faults detection and diagnosis, Neurocomputing, 74(18).
[17] R. Polikar, 2007, Bootstrap-inspired techniques in computational intelligence, IEEE Signal Processing Magazine, 24(4).
[18] K. Tumer and J. Ghosh, 1996, Error correlation and error reduction in ensemble classifiers, Connection Science, 8(3).
[19] R. Polikar, 2006, Ensemble based systems in decision making, IEEE Circuits and Systems Magazine, 6(3).
[20] R. A. Jacobs, 1997, Bias/variance analyses of mixtures-of-experts architectures, Neural Computation, 9(2).
[21] L. I. Kuncheva, 2004, Combining Pattern Classifiers: Methods and Algorithms, John Wiley, Hoboken, NJ.
[22] L. I. Kuncheva, 2002, Switching between selection and fusion in combining classifiers: An experiment, IEEE Transactions on Systems, Man and Cybernetics, Part B, 32(2).
[23] R. Ebrahimpour, H. Nikoo, S. Masoudnia et al., 2011, Mixture of MLP-experts for trend forecasting of time series: A case study of the Tehran stock exchange, International Journal of Forecasting, 27(3).
[24] J. Kittler, 1998, Combining classifiers: A theoretical framework, Pattern Analysis and Applications, 1(1).
[25] L. I. Kuncheva, J. C. Bezdek, and R. P. W. Duin, 2001, Decision templates for multiple classifier fusion: an experimental comparison, Pattern Recognition, 34(2).
[26] D. H. Wolpert, 1992, Stacked generalization, Neural Networks, 5(2).
[27] L. I. Kuncheva, 2001, Using measures of similarity and inclusion for multiple classifier fusion by decision templates, Fuzzy Sets and Systems, 122(3).
[28] L. K. Hansen and P.
Salamon, 1990, Neural network ensembles, IEEE Transactions on Pattern Analysis and Machine Intelligence, 12(10).
[29] A. Krogh and J. Vedelsby, 1995, Neural network ensembles, cross validation and active learning, in Advances in Neural Information Processing Systems 7, G. Tesauro, D. S. Touretzky, and T. K. Leen (eds.), MIT Press, Cambridge, MA.
[30] Y. Liu and X. Yao, 1999, Ensemble learning via negative correlation, Neural Networks, 12(10).
[31] P. Cunningham and J. Carney, 2000, Diversity versus quality in classification ensembles based on feature selection, in Proceedings of the 11th European Conference on Machine Learning.
[32] L. I. Kuncheva and C. J. Whitaker, 2003, Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy, Machine Learning, 51(2).
[33] T. Windeatt, 2006, Accuracy/diversity and ensemble MLP classifier design, IEEE Transactions on Neural Networks, 17(5).
[34] G. I. Webb and Z. Zheng, 2004, Multistrategy ensemble learning: Reducing error by combining ensemble learning techniques, IEEE Transactions on Knowledge and Data Engineering, 16(8).
[35] Z. H. Zhou and Y. Yu, 2005, Ensembling local learners through multimodal perturbation, IEEE Transactions
on Systems, Man and Cybernetics, Part B, 35(4).
[36] L. I. Kuncheva, 2005, Using diversity measures for generating error-correcting output codes in classifier ensembles, Pattern Recognition Letters, 26(1).
[37] R. Kohavi and D. H. Wolpert, 1996, Bias plus variance decomposition for zero-one loss functions, in Proceedings of the 13th International Conference on Machine Learning.
[38] D. Partridge and W. Krzanowski, 1997, Software diversity: practical statistics for its measurement and exploitation, Information and Software Technology, 39(10).
[39] H. Chen and X. Yao, 2009, Regularized negative correlation learning for neural network ensembles, IEEE Transactions on Neural Networks, 20(12).
[40] J. J. Rodriguez and L. I. Kuncheva, 2006, Rotation forest: A new classifier ensemble method, IEEE Transactions on Pattern Analysis and Machine Intelligence, 28(10).
[41] Y. Liu and X. Yao, 1999, Simultaneous training of negatively correlated neural networks in an ensemble, IEEE Transactions on Systems, Man and Cybernetics, Part B, 29(6).
[42] K. Woods, W. P. Kegelmeyer, and K. Bowyer, 1997, Combination of multiple classifiers using local accuracy estimates, IEEE Transactions on Pattern Analysis and Machine Intelligence, 19(4).
[43] M. F. Amin, M. M. Islam, and K. Murase, 2009, Ensemble of single-layered complex-valued neural networks for classification tasks, Neurocomputing, 72(10).
[44] G. Brown and J. M. Wyatt, 2003, Negative correlation learning and the ambiguity family of ensemble methods, Lecture Notes in Computer Science, 2709.
[45] J. V. Hansen, 1999, Combining predictors: comparison of five meta machine learning methods, Information Sciences, 119(1).
[46] A. Asuncion and D. J. Newman, 2010, UCI machine learning repository [http://www.ics.uci.edu/~mlearn/MLRepository.html], University of California, School of Information and Computer Science, Irvine, CA.

Author Biographies

Arash Iranzad received the BS degree in computer science from the School of Mathematics, Statistics, and Computer Science, University of Tehran.
He s currently a MS student n Computer Scence Department, Brock Unversty, Ontaro, CA. Hs research nterests are neural networks, pattern recognton, multple classfer systems, and machne vson. Saeed Masoudna receved the MS degree n computer scence from School of Mathematcs, Statstcs, and Computer Scence, Unversty of Tehran n 0. Hs research nterests are neural networks, pattern recognton, machne learnng, computatonal neuro-scence and models of human vson. Fatemeh Cheraghch receved the MS degree n computer scence from School of Mathematcs, Statstcs, and Computer Scence, Unversty of Tehran n 00. Hs research nterests are neural networks, pattern recognton, multple classfer systems, quantum computng and analyss of algorthms. Author Bographes Abbas owzar-daln receved hs PhD degree from the School of Mathematcs, Statstcs, and Computer Scence n 003. He s an assocate professor of School of Mathematcs and Computer Scence, Unversty of Tehran, and s currently the drector of Graduate studes n ths school. Hs research nterests nclude neural networks, bonformatcs, combnatoral algorthm, parallel algorthms, DA computng, and computer networks. Reza Ebrahmpour receved the BS degree n electroncs engneerng from Mazandaran Unversty, Mazandaran and the MS degree n bomedcal engneerng from Tarbat Modarres Unversty, Tehran, Iran, n 999 and 00, respectvely. He receved hs PhD degree n July 007 from the School of Cogntve Scence, Insttute for Studes on Theoretcal Physcs and Mathematcs, where he worked on vew-ndependent face recognton wth Mxture of Experts. He s an assstant professor of Shahd Rajaee Unversty. Hs research nterests nclude human and machne vson, neural networks, and pattern recognton.