Benchmarking of Update Learning Strategies on Digit Classifier Systems


2012 International Conference on Frontiers in Handwriting Recognition

Benchmarking of Update Learning Strategies on Digit Classifier Systems

D. Barbuzzi, D. Impedovo, G. Pirlo
Dipartimento di Informatica, Università degli Studi di Bari "Aldo Moro", Bari, Italy
{d.impedovo, pirlo}@di.uniba.it

Abstract - Three different strategies to re-train classifiers, when new labeled data become available, are presented in a multi-expert scenario. The first method is the use of the entire new dataset. The second one is related to the consideration that each single classifier is able to select new samples, starting from those on which it performs a misclassification. Finally, by inspecting the multi-expert system behavior, a sample misclassified by an expert is used to update that classifier only if it also produces a misclassification by the ensemble of classifiers. This paper provides a comparison of the three approaches under different conditions on two state-of-the-art classifiers (SVM and Naive Bayes), taking into account four different combination techniques. Experiments have been performed by considering the CEDAR (handwritten digit) database. It is shown how results depend on the amount of new training samples, as well as on the specific combination decision scheme and on the classifiers in the ensemble.

Keywords - Feedback learning, Multi Expert, Training Sample Selection

I. INTRODUCTION

A pattern recognition system consists of two main processes: enrollment (or training) and matching (or recognition). In the first phase, samples of specific classes are acquired and processed, and features are extracted. These features are labeled with the ground truth and used to generate the model representing the class. The matching mode performs the recognition of the (unknown) input pattern by comparing it to the enrolled templates. Depending on the specific scenario, a single classifier is not always able to achieve acceptable or high performance, so that in many applications [1, 2, 4, 6] classifier combination is a suitable solution. On the other hand, as the specific scenario evolves, new labeled data can become available. In these situations, the typical issue is how the new data should be used. Recently, it has been shown that, in cases where a Multi-Expert (ME) system is adopted, the collective behavior of the classifiers can be used to select the most profitable samples in order to update the knowledge base of the classifiers [15, 18]. More specifically, the samples to be used for re-training are selected by considering those misclassified by a specific expert of the set which also produce a misclassification at the ME level. This approach stems from the consideration that the collective behavior of a set of classifiers can convey more information than that of each classifier of the set, and this information can be exploited for classification aims [4, 5, 17].

This paper reports a comparison of this approach with the situation in which the entire new dataset is used for learning, as well as with the case in which specific samples are selected by the individual classifier. Tests have been performed on the task of handwritten digit recognition, on the CEDAR database, by considering different types of features, two state-of-the-art classifiers (Support Vector Machine and a Naive Bayes classifier), and four different combination techniques (Majority Vote, Weighted Majority Vote, Sum Rule and Product Rule). It is shown how results depend on the specific classifier, on the combination decision scheme, as well as on the training/test data distribution.

The paper is organized as follows: Section II presents the background of re-training and the different strategies. The experimental setup and the results are given in Sections III and IV, respectively. Section V reports the conclusions of the work.

II. LEARNING UPDATING STRATEGIES
A. Background

Template update is an interesting and open issue both in the case of supervised and in the case of semi-supervised learning. Let us consider the scenario in which new labeled data become available. The question is: how should the new data be used? The simplest way to update the knowledge base of the classifier is probably to re-train the system on the entire new set, starting from the initial training condition or, depending on the classifier, on the set of new+old data. On the other hand, many interesting algorithms can be adopted in order to select specific samples. Among others, AdaBoost [7, 9] is able to improve the performance of a classifier on a given data set by focusing the learner's attention on difficult instances. Even if this approach is very powerful, it works well in the case of weak classifiers; moreover, not all learning algorithms accept weights for the incoming samples.

Another interesting approach is bagging: a number of weak classifiers trained on different subsets (random instances) of the entire dataset are combined by means of simple majority voting [8]. Unfortunately, bagging and AdaBoost are designed for, and work well with, weak classifiers and, on the other hand, if applied to a ME system they do not take into account the behavior of the different classifiers in the ensemble: in fact they are applied considering a single classifier in a stand-alone modality. From this point of view, the first intent of this work is to deal with well-performing, state-of-the-art classifiers (not weak ones) and to determine strategies which can be applied whatever classifier is considered.

Let us consider the case of new unlabeled data: two well-known approaches used to select specific samples are self-training and co-training. These are semi-supervised learning paradigms (the updating process is performed using both the initial set of labeled templates and a set of unlabeled data acquired during on-line operation). Self-training (or self-update) [13] is based on the concept that a classifier is re-trained on its own most confident output produced from unlabeled data. The classifier is at first trained on the set of labeled data and, subsequently, several self-training iterations are performed to incorporate the unlabeled data until some stop rule is satisfied. Co-training (or co-update) [12] is the situation in which two classifiers improve each other. More specifically, the first expert is updated with elements confidently recognized by the other one and vice versa; the assumption is that the classifiers involved in the co-training process have conditionally independent views of the data. Co-training and self-training have a very strong role, in the current state of the art, in the biometric template updating process [14]; moreover, they have recently been applied also in the field of handwriting recognition [10, 11]. The main result observed for self-training on the task of handwritten word recognition [10] is that the challenge of successful self-training lies in finding the optimal trade-off between data quality and data quantity for re-training. In particular, if re-training is done with only those elements whose correctness can nearly be guaranteed, the re-training set does not change significantly and the classifier may remain nearly the same, or in other cases it could discard genuine samples whose distribution is far from the one already embedded in the knowledge base, thus resulting in a performance degradation. Enlarging the re-training set, on the other hand, is only possible at the cost of increasing noise, i.e. adding mislabeled words to the training set. In this scenario, the co-training approach [11] appears to be much more interesting: in fact it does not suffer from the limitations of the self-update process, and performance improvements are more evident than those observed in the self-training case, even if the confidence threshold still plays a crucial role. Co-training can easily be extended from two to n classifiers, but the basic observation is that, once more, even if an ensemble of classifiers is available, there is no analysis and use of their common classification behavior given the input to be recognized and the specific combination scheme. Starting from these observations, specific strategies are depicted in the next paragraph, taking into account the task of supervised learning.

B. Learning Strategies

Let:
- C_j, for j = 1, 2, ..., M, be the set of pattern classes;
- P = {x_k : k = 1, 2, ..., K} be the set of patterns to be fed to the Multi-Expert (ME) system. P is considered to be partitioned into S subsets P_1, P_2, ..., P_s, ..., P_S, with P_s = {x_k ∈ P : k ∈ [N_s(s-1)+1, N_s·s]} and N_s = K/S (N_s integer), that are fed one after the other to the multi-expert system. In particular, P_1 is used for learning only, whereas P_2, P_3, ..., P_s, ..., P_S are used both for classification and for learning (when necessary);
- y_s ∈ Ω, the label of the pattern x_s, where Ω = {C_1, C_2, ..., C_M};
- A_i, the i-th classifier, for i = 1, 2, ..., N;
- F_i(k) = (F_{i,1}(k), F_{i,2}(k), ..., F_{i,r}(k), ..., F_{i,R}(k)), the numerical feature vector used by A_i to represent the pattern x_k ∈ P (for the sake of simplicity it is assumed here that each classifier uses R numerical features);
- KB_i(k), the knowledge base of A_i after the processing of P_k; in particular KB_i(k) = (KB_i^1(k), KB_i^2(k), ..., KB_i^M(k));
- E, the multi-expert system which combines the individual hypotheses in order to obtain the final one.
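As a concrete illustration of this partitioning, the short Python sketch below (our own, not from the paper; the function name make_partitions is hypothetical) splits an ordered set of K labeled patterns into S consecutive subsets P_1, ..., P_S of size N_s = K/S, to be fed one after the other to the multi-expert system.

```python
import numpy as np

def make_partitions(X, y, S):
    """Split K ordered patterns into S consecutive subsets P_1..P_S,
    each of size N_s = K // S, i.e. P_s = {x_k : k in [N_s*(s-1)+1, N_s*s]}."""
    K = len(X)
    N_s = K // S
    return [(X[s * N_s:(s + 1) * N_s], y[s * N_s:(s + 1) * N_s])
            for s in range(S)]

# P_1 is used for the initial training only; P_2 ... P_{S-1} arrive later as
# new labeled data; P_S is kept aside as the test set.
```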
Initially, at the first stage (s = 1), each classifier A_i is trained using the patterns x_k ∈ P* = P_1. Therefore, the knowledge base KB_i(s) of A_i is initially defined as:

KB_i(s) = (KB_i^1(s), KB_i^2(s), ..., KB_i^j(s), ..., KB_i^M(s))    (1a)

where, for j = 1, 2, ..., M:

KB_i^j(s) = (F_{j,1}(s), F_{j,2}(s), ..., F_{j,r}(s), ..., F_{j,R}(s))    (1b)

being F_{j,r}(s) the set of values of the r-th feature of the i-th classifier for the patterns of class C_j belonging to P*. Subsequently, the subsets P_2, P_3, ..., P_s, ..., P_{S-1} are provided one after the other to the multi-classifier system, both for classification and for learning. P_S is only considered as the testing set, in order to avoid biased or too optimistic results.

When considering new labeled data (the samples of P_2, P_3, ..., P_s, ..., P_{S-1}), two different strategies can be followed in order to select the patterns of P_s used to train A_i:

1. ∀ x_t ∈ P_s : update KB_i, i.e. all the available new patterns belonging to P_s are used to update the knowledge base of each individual classifier;

2. ∀ x_t ∈ P_s | A_i(x_t) ≠ y_t : update KB_i, i.e. the individual classifier is updated by considering only the samples belonging to P_s which have been misclassified by the classifier itself.

The second approach is derived from AdaBoost and bagging and, at the same time, it can be considered a supervised version of self-training. In order to inspect and take advantage of the common behavior of the ensemble of classifiers, the third strategy proposed in this work (and compared to the previous two) is the following:

3. ∀ x_t ∈ P_s | (A_i(x_t) ≠ y_t ∧ E(x_t) ≠ y_t) : update KB_i, i.e. the individual classifier is updated by considering all its misclassified samples if and only if they produce (or contribute to) a misclassification of the ME.

For the sake of simplicity, let us consider a ME adopting three base classifiers combined by means of Majority Vote. In the case depicted in Figure 1(a), in which two classifiers correctly recognize the sample x, both the first and the second approach would update the knowledge base of A_1 with x, thus increasing the similarity index [16] of A_1, A_2 and A_3, with the only advantage of increasing the performance of A_1 on the training set and on patterns similar to x. On the other hand, the ME system would exhibit the previous performance without any improvement. The third approach would not update the knowledge base of A_1. In the case depicted in Figure 1(b), updating the knowledge base of A_1 and/or A_3 would produce an improvement of the ME performance. The third strategy takes into account the performance of the individual classifier as well as the performance at the ME level. It is able to select not only the samples to be used for the updating process, but also the classifiers to which those samples must be fed. Of course, it is evident that many new samples will not be fed to a specific classifier and, in general, we could expect to observe a performance degradation compared to the other strategies. This can happen depending on the performance of the classifiers as well as on the ratio of new to old data. However, we have to consider and remark that we are dealing with already trained and working classifiers: initial performances are expected to be high (not weak). This leads to two considerations:

- given a specific classifier, the difference between the confidence value in the case of a misclassification and in the case of a correct one could be imputed to the fact that the specific classifier (features, matching technique, etc.) is unable to represent that sample, and no improvement would be obtained by introducing the new sample into the knowledge base. This is particularly true under the assumption that strong (not weak) classifiers are used;

- if each classifier in the ensemble were able to recognize exactly the same set of patterns, their combination would be useless. From this point of view, we are interested in not increasing the similarity index (SI) among classifiers.

Figure 1. Examples of updating requests.

III. EXPERIMENTAL SETUP

A. Data Set

A multi-expert system for handwritten digit recognition has been considered: the CEDAR database [9], P = {x_k : k = 1, 2, ..., 20351} (classes from '0' to '9'), has been used. The database has been initially partitioned into 6 subsets: P_1 = {x_1, ..., x_12750}, P_2 = {x_12751, ..., x_14119}, P_3 = {x_14120, ..., x_15488}, P_4 = {x_15489, ..., x_16857}, P_5 = {x_16858, ..., x_18223}, P_6 = {x_18224, ..., x_20351}. In particular, P_1 ∪ P_2 ∪ P_3 ∪ P_4 ∪ P_5 represents the set usually adopted for training when considering the CEDAR database [6]; P_6 is the testing dataset.
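The following Python sketch (ours, not the paper's implementation) makes the three selection rules concrete; it assumes classifiers exposing a scikit-learn-style predict method and a callable ensemble_predict implementing the chosen combination rule (both names are hypothetical), and returns, for each classifier A_i, the indices of the samples of P_s that would be used to update its knowledge base.

```python
import numpy as np

def select_for_update(classifiers, ensemble_predict, X_s, y_s, strategy):
    """Indices of P_s selected for re-training each classifier A_i.

    strategy 1: every new sample updates every classifier;
    strategy 2: A_i is updated with the samples it misclassifies;
    strategy 3: A_i is updated with the samples it misclassifies only when
                the multi-expert decision E(x_t) is also wrong.
    """
    ens_wrong = ensemble_predict(X_s) != y_s            # E(x_t) != y_t
    selected = []
    for clf in classifiers:
        own_wrong = clf.predict(X_s) != y_s              # A_i(x_t) != y_t
        if strategy == 1:
            mask = np.ones(len(y_s), dtype=bool)
        elif strategy == 2:
            mask = own_wrong
        else:
            mask = own_wrong & ens_wrong
        selected.append(np.flatnonzero(mask))
    return selected
```

With strategy 3, a classifier may receive no new samples at all for a given P_s, which is exactly the behavior discussed above for the case of Figure 1(a).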
Each digit is zoned into 16 uniform (regular) regions [5]; subsequently, for each region, the following sets of features have been considered [6]:

F_1: feature set 1: hole, up cavity, down cavity, left cavity, right cavity, up end point, down end point, left end point, right end point, crossing points, up extrema points, down extrema points, left extrema points, right extrema points;

F_2: feature set 2 (contour profiles): max/min peaks, max/min profiles, max/min width, max/min height;

F_3: feature set 3 (intersection with lines): 5 horizontal lines, 5 vertical lines, 5 slant -45° lines and 5 slant +45° lines.

B. Classifiers

Tests have been performed taking into account Support Vector Machines (SVMs) and a Naive Bayes classifier (NB). The SVM is a binary (two-class) classifier; multi-class recognition is performed here by combining multiple binary SVMs. The kernel function adopted is the RBF one; performance is influenced by the standard deviation value (σ) and by the tolerance of classification errors in learning. In this work σ² = 0.3·var, where var is the variance of the training data set [6]. The NB classifier fits, in the training phase, a multivariate normal density to each class C_j, considering a diagonal covariance matrix. Given an input to be classified, the Maximum A Posteriori (MAP) decision rule is adopted to select the most probable hypothesis among the different classes.

C. Combination techniques

Many approaches have been considered so far for classifier combination. These approaches differ in terms of the type of output they combine, the system topology and the degree of a-priori knowledge they use [1, 2, 3]. The combination technique plays a crucial role in the selection of the new patterns to be fed to the classifiers in the proposed approach. In this work the following decision combination strategies have been considered and compared: Majority Vote (MV), Weighted Majority Vote (WMV), Sum Rule (SR) and Product Rule (PR). MV just considers the labels provided by the individual classifiers; it is generally adopted when no knowledge is available about the performance of the classifiers, so that they are all considered equal. The second approach can be adopted by considering weights related to the performance of the individual classifiers on a specific dataset. Given the case depicted in this work, it seems to be more realistic, since the behavior of the classifiers can be evaluated, for instance, on the newly available dataset. In particular, let ε_i be the error rate of the i-th classifier evaluated on the last available training set; the weight assigned to A_i is w_i = log(1/β_i), being β_i = ε_i/(1-ε_i). The Sum Rule (SR) and the Product Rule (PR) take into account the confidence of each individual classifier given the input pattern and the different classes [1]. Before the combination, the confidence values provided by the different classifiers were normalized by means of the Z-score.

IV. RESULTS

Results are reported in terms of error rate percentage. The values of the similarity index (SI) are reported in the last row of each table; moreover, for the different learning strategies and for each classifier, the ratios R1 = N_S/N_F and R2 = N_S/N_I are evaluated and reported, being N_F the total number of available new samples, N_S the number of samples (selected among the previous ones) used for learning and N_I the number of samples used for the initial training. R1 represents the percentage of new patterns selected from those available, while R2 is a measure of their influence on the initial training set. The label X-feed refers to the use of the X modality for the feedback training process: All-feed is the feedback of the entire set (first strategy in par. II.B), A_i-feed is the feedback at the single-classifier level (second strategy in par. II.B). MV-feed, WMV-feed, SR-feed and PR-feed are feedback at the ME level adopting, respectively, the majority vote, the weighted majority vote, the sum rule and the product rule scheme. Table I reports results related to the use of SVMs. The three sets of features F_1, F_2 and F_3 (see par. III.A) lead, respectively, to SVM_1, SVM_2 and SVM_3. P_1 is used for training and P_6 for testing; P_2 ∪ P_3 ∪ P_4 ∪ P_5 is used for feedback learning.
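As an illustration of the label-level combination rules, the sketch below (our own; the helper names are hypothetical) computes the weighted-majority-vote weights w_i = log(1/β_i), with β_i = ε_i/(1-ε_i), and applies plain or weighted majority voting to the label outputs of the ensemble.

```python
import numpy as np

def wmv_weight(eps):
    """Weight of the i-th classifier: w_i = log(1/beta_i), beta_i = eps_i/(1-eps_i),
    where eps_i is its error rate on the last available training set."""
    beta = eps / (1.0 - eps)
    return np.log(1.0 / beta)

def vote(pred_lists, n_classes, error_rates=None):
    """Majority vote (error_rates=None) or weighted majority vote.
    pred_lists[i] holds the class indices predicted by A_i for every sample."""
    n_samples = len(pred_lists[0])
    votes = np.zeros((n_samples, n_classes))
    for i, preds in enumerate(pred_lists):
        w = 1.0 if error_rates is None else wmv_weight(error_rates[i])
        votes[np.arange(n_samples), preds] += w
    return votes.argmax(axis=1)

# Example weights: eps = 0.05 -> w ~= 2.94; eps = 0.20 -> w ~= 1.39; eps = 0.50 -> w = 0.
```

The sum and product rules would instead aggregate, class by class, the Z-score-normalized confidence values returned by each classifier (e.g. np.sum or np.prod over the per-classifier score matrices) and pick the class with the largest aggregate value.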
In this case, the total amount of new samples is 42.86% of the number of samples in the initial training set (P_1). The first column (No-feed) reports the results obtained using P_1 for training and P_6 for testing, without applying any feedback (R1 = R2 = 0%), while the All-feed approach uses all the samples belonging to the new set in order to update the knowledge base of each single classifier (R1 = 100%, R2 = 42.86%). Depending on the combination technique, a specific strategy can outperform the others. In two cases out of four (MV-feed and SR-feed), the multi-expert strategy outperforms the use of the entire new dataset, while in three cases out of four (MV-feed, WMV-feed and SR-feed) the multi-expert strategy outperforms the feedback at the single-expert level. In the case of the Majority Vote and of the Sum Rule, feedback at the ME level definitely outperforms the other two approaches. In these cases it is of interest that a very restricted subset of samples is selected for re-training. Table II reports results related to the use of the NB classifier under the same conditions as the previous experiment. In this case the performance improvements provided by feedback at the single-expert level are always better than those obtained by any ME technique. At the same time, it must be underlined that the worst re-training strategy is the one which considers the entire set of new available samples. A typical implementation of co- and self-training takes into account multiple training iterations. In this work experiments have been performed considering up to 3 iterations.
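A sketch of such an iterative feedback protocol is given below (our own simplification, not the paper's implementation): it re-uses the select_for_update routine sketched in Section II.B, assumes scikit-learn-style estimators with a fit method, and simply re-trains each classifier from scratch on its enlarged knowledge base at every iteration.

```python
import numpy as np

def feedback_iterations(classifiers, ensemble_predict, KB, X_new, y_new,
                        strategy, n_iter=3):
    """Repeat the feedback step up to n_iter times on the same new data.
    KB[i] = (X_i, y_i) is the current knowledge base of classifier A_i."""
    for _ in range(n_iter):
        picked = select_for_update(classifiers, ensemble_predict,
                                   X_new, y_new, strategy)
        for i, idx in enumerate(picked):
            if idx.size == 0:
                continue                      # nothing selected for A_i this round
            X_i = np.vstack([KB[i][0], X_new[idx]])
            y_i = np.concatenate([KB[i][1], y_new[idx]])
            KB[i] = (X_i, y_i)
            classifiers[i].fit(X_i, y_i)      # re-train on old + selected new data
    return classifiers, KB
```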

TABLE I. SVM, FEEDBACK - P_2 ∪ P_3 ∪ P_4 ∪ P_5 (rows: SVM_1, SVM_2, SVM_3, MV, WMV, SR, PR, SI; columns: No-feed, All-feed, A_i-feed, MV-feed, WMV-feed, SR-feed, PR-feed, with R1 and R2 reported for each feedback strategy)

TABLE II. BF, FEEDBACK - P_2 ∪ P_3 ∪ P_4 ∪ P_5

TABLE III. BF, 3 FEEDBACK LEARNING ITERATIONS - P_2 ∪ P_3 ∪ P_4 ∪ P_5

TABLE IV. FEEDBACK - P_2, P_3, P_4, P_5

In the case of the SVM classifiers, slight improvements have been observed in terms of error rate, while confirming the general trend among the different feedback strategies already reported in Table I. A much more interesting trend has been observed in the case of the NB classifiers (Table III), where 3 re-training iterations have been considered. The spread between the performance obtained with a single-expert strategy and with an ME one is appreciably lower than in the case of a single iteration (Table II); moreover, MV-feed is able to outperform feedback at the single-expert level. It is also of interest to observe that, due to over-fitting, the results obtained by giving the entire set of new samples for feedback learning show a decrease of performance as the number of iterations increases. Finally, Table IV reports results related to the use of a unique feature set F = F_1 ∪ F_2 ∪ F_3, the two different classifiers and a reduced set of samples provided for feedback learning. In particular, P_1 is used for the initial training and P_6 for testing.

P_2, P_3, P_4 and P_5 are independently used, one after the other, for feedback learning; performance was evaluated for each set and the average is finally reported. The first consideration is that the error rate obtained by the SVM is so low that it appears useless to combine it with BF (no complementary information is added). Of course, this represents an extreme working point. Feedback at the ME level is able to outperform All-feed only in the case of the Sum Rule (SR), by using a reduced subset of the available new patterns. In all other cases, the results provided by the ME feedback approach are equal to those obtained by feedback at the single-expert level. Statistical significance at the 0.03 level was achieved for all the tests performed and reported here.

V. CONCLUSIONS

This paper shows the possibility of improving the effectiveness of a multi-classifier system, when new labeled data are available, by a suitable use of the information extracted from the collective behavior of the classifiers. Experiments have been performed considering state-of-the-art classifiers, features and combination techniques. It has been shown that the performance of feedback training strictly depends on the classifier structure, on the combination strategy of the ME (which is responsible for the sample selection), but also on the data distribution and on the similarity between the samples in the feedback set and the samples of the testing set. It has also been shown that multiple training iterations on the same set of data are able to improve performance both in the case of feedback at the single-expert and at the multi-expert level. Finally, even in cases in which the cardinality of the newly selected training set is negligible compared to that of the initial training set, the feedback strategy is able to produce improvements. Future work will inspect more deeply the possibility of iterative re-training given a set of new labeled samples, as well as the possibility of evaluating the approaches in the task of semi-supervised learning.

REFERENCES

[1] J. Kittler, M. Hatef, R.P.W. Duin, J. Matas, "On combining classifiers", IEEE Trans. on PAMI, Vol. 20, No. 3, pp. 226-239, 1998.
[2] R. Plamondon, S.N. Srihari, "On-line and Off-line Handwriting Recognition: a comprehensive survey", IEEE Trans. on PAMI, Vol. 22, No. 1, pp. 63-84, 2000.
[3] C.Y. Suen, C. Nadal, R. Legault, T.A. Mai, L. Lam, "Computer Recognition of unconstrained handwritten numerals", Proc. IEEE, Vol. 80, Issue 7, pp. 1162-1180, 1992.
[4] C.Y. Suen, J. Tan, "Analysis of errors of handwritten digits made by a multitude of classifiers", Pattern Recognition Letters, Vol. 26, Issue 3, 2005.
[5] G. Pirlo, D. Impedovo, "Fuzzy-Zoning-Based Classification for Handwritten Characters", IEEE Trans. on Fuzzy Systems, Vol. 19, Issue 4, 2011.
[6] C.L. Liu, K. Nakashima, H. Sako, H. Fujisawa, "Handwritten digit recognition: benchmarking of state-of-the-art techniques", Pattern Recognition, Vol. 36, No. 10, pp. 2271-2285, 2003.
[7] R.E. Schapire, "The strength of weak learnability", Machine Learning, Vol. 5, Issue 2, pp. 197-227, 1990.
[8] R. Polikar, "Bootstrap-Inspired Techniques in Computational Intelligence", IEEE Signal Processing Magazine, Vol. 24, No. 4, 2007.
[9] Y. Freund and R.E. Schapire, "A decision-theoretic generalization of on-line learning and an application to boosting", J. of Computer and System Sciences, Vol. 55, No. 1, pp. 119-139, 1997.
[10] V. Frinken, H. Bunke, "Evaluating Retraining Rules for Semi-Supervised Learning in Neural Network Based Cursive Word Recognition", Proc. of ICDAR, 2009.
[11] V. Frinken, A. Fischer, H. Bunke, A. Fornes, "Co-Training for Handwritten Word Recognition", in Proc. of ICDAR, 2011.
[12] A. Blum and T. Mitchell, "Combining Labeled and Unlabeled Data with Co-Training", ACM Proc. of COLT, pp. 92-100, 1998.
[13] H.J. Scudder, "Probability of Error of Some Adaptive Pattern-Recognition Machines", IEEE Trans. on Information Theory, Vol. 11, No. 3, pp. 363-371, 1965.
[14] U. Uludag, A. Ross, A. Jain, "Biometric template selection and update: a case study in fingerprints", Pattern Recognition, Vol. 37, Issue 7, pp. 1533-1542, 2004.
[15] D. Impedovo, G. Pirlo, "Updating Knowledge in Feedback-based Multi-Classifier Systems", in IEEE Proc. of the International Conference on Document Analysis and Recognition (ICDAR 2011), 2011.
[16] D. Impedovo, G. Pirlo, L. Sarcinella, E. Stasolla, "Artificial Classifier Generation for Multi-Expert System Evaluation", in IEEE Proc. of the International Conference on Frontiers in Handwriting Recognition (ICFHR 2010), 2010.
[17] D. Impedovo, G. Pirlo, M. Petrone, "A multi-resolution multi-classifier system for speaker verification", Wiley Expert Systems, early view.
[18] G. Pirlo, C.A. Trullo, D. Impedovo, "A Feedback-Based Multi-Classifier System", IEEE Proc. of the International Conference on Document Analysis and Recognition (ICDAR 2009), 2009.
