Enclosing Machine Learning


Xunkai Wei* (Corresponding author)
Department of Aircraft and Power Engineering, University of Air Force Engineering, Xi'an, Shaanxi, China

Yinghong Li
Department of Aircraft and Power Engineering, University of Air Force Engineering, Xi'an, Shaanxi, China
email: yinghong_li@126.com

Yufei Li
Department of Aircraft and Power Engineering, University of Air Force Engineering, Xi'an, Shaanxi, China
email: horizon_lyf@hotmail.com

Enclosing Machine Learning

Xunkai Wei, University of Air Force Engineering, China
Yinghong Li, University of Air Force Engineering, China
Yufei Li, University of Air Force Engineering, China

INTRODUCTION

Cognition is the instinctive learning ability of human beings, and the cognition process is perhaps the most complex yet most efficient and intelligent information-processing process there is. When cognizing the natural world, a human transfers feature information to the brain through perception; the brain then processes the feature information and remembers it for the given objects. Since the invention of the computer, scientists have worked to improve its artificial intelligence, hoping that one day computers could have genuinely intelligent brains of their own, like the human brain. According to cognitive science, however, the human brain can be imitated but cannot be completely reproduced. Letting computers truly think for themselves therefore seems simple, yet there is still a long way to go before this objective is accomplished. Artificial intelligence remains an important and active direction for imitating the functions of the human brain. Traditionally, neural computing and the neural network families have formed the major part of this direction (Haykin, 1994). By imitating the working mechanism of the brain's neurons, scientists established neural network computing theory on the basis of experimental progress, such as perceptron neurons and spiking neurons (Gerstner & Kistler, 2002), in understanding how neurons work.

For a long time, the related research has mainly emphasized neuron models, neural network topologies, and learning algorithms, which has produced large and flourishing families of methods (Bishop, 1995) such as Back Propagation neural networks (BPNN), Radial Basis Function neural networks (RBFNN), Self-Organizing Maps (SOM), and various other variants. The neural computing and neural network (NN) families have achieved a great deal in many areas. More recently, statistical learning and support vector machines (SVM) (Vapnik, 1995; Schölkopf & Smola, 2001) have drawn extensive attention and shown attractive, excellent performance in various areas compared with NN (Li, Wei & Liu, 2004), which implies that artificial intelligence can also be built on advanced statistical computing theory. Nowadays these two approaches tend to merge under the statistical learning theory framework.

BACKGROUND

Note that for both NN and SVM, the function imitation works from the microcosmic view, using mathematical models of the neuron's working mechanism. However, the whole cognition process can also be summarized by two basic principles from the macroscopic view: first, a human always cognizes things of the same kind together; second, a human recognizes things of a new kind easily, without affecting existing knowledge. To make the idea clearer, we first analyze the function-imitation interpretation of NN and SVM. The function imitation of the human cognitive process for pattern classification via NN and SVM can be explained as follows (Li & Wei, 2005). Given training pairs (sample features, class indicator), we can train an NN or SVM learning machine. The training process of these learning machines imitates the learning ability of human beings.

For clarity, we call this process cognizing. The trained NN or SVM can then be used to test an unknown sample and determine the class indicator it belongs to. Testing an unknown sample imitates the recognizing process of human beings; we call this process recognizing. From the mathematical point of view, both learning machines are based on hyperplane adjustment and obtain an optimal or sub-optimal hyperplane combination after training. In an NN, each neuron acts as a hyperplane in the feature space. The feature space is divided into many partitions according to the selected training principle, and each partition is then linked with a corresponding class, which accomplishes the cognizing process. Given an unknown sample, the network only detects the partition in which the sample lies and then assigns the corresponding indicator to it, which accomplishes the recognizing process. Like NN, SVM is based on an optimal hyperplane. Unlike NN, a standard SVM determines the hyperplane by solving a convex quadratic programming (QP) problem. The two have the same cognizing and recognizing processes, only with different solving strategies. Now suppose we have a complete sample database and a totally unknown, novel sample arrives. Neither SVM nor NN will naturally recognize it correctly; instead, both prefer to assign the closest indicator among the learned classes (Li & Wei, 2005). This situation, however, is quite easy for a human to handle. If we have learned things of the same kind before, we easily recognize similar things when we meet them; if we have never encountered them, we can just as easily tell that they are new. Then, with supervised learning, we can remember their features in the brain without changing other learned things.

The root cause of this phenomenon is the learning principle of the NN or SVM cognizing algorithm, which is based on feature space partition. This kind of learning principle may amplify each class's distribution region, especially when the samples of the different kinds are few owing to incompleteness, and this makes it impossible to detect novel samples automatically. Here comes the concern: how can the learning machine automatically identify novel samples, as a human does?

MAIN FOCUS

Human beings generally cognize things of one kind and easily recognize completely unknown things of a novel kind. So the answer is: why not let the learning machine cognize and recognize like a human being (Li, Wei & Liu, 2004)? In other words, the learning machine should cognize the training samples of one class regardless of the other classes, so that attention is focused on each single class. This point is important to ensure that all the existing classes are learned precisely, without amplification. To learn each class, we simply let each class be cognized, or described, by a cognitive learner. The learner uses some kind of model to describe each class instead of using a feature space partition, so as to imitate the cognizing process. Consequently, no amplification occurs, in contrast to NN or SVM. The bounding boundaries of the cognitive learners are scattered in the feature space, and together they constitute the whole knowledge of the learned classes. For an unknown sample, the cognitive class recognizer then detects whether the sample is located inside a cognitive learner's boundary, imitating the recognizing process. If the sample is completely new (i.e., none of the trained cognitive learners contains it), it can in turn be described by a new cognitive learner, and the newly obtained learner can be added to the feature space without affecting the others. This concludes the basic process of our proposed enclosing machine learning paradigm (Wei, Li & Li, 2007A).
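
To make the paradigm concrete, the following minimal sketch (Python; the MinVolumeLearner class is a hypothetical stand-in for any concrete enclosing learner such as a sphere or ellipsoid, and all names are illustrative) shows how the cognizing, recognizing, and feedback self-learning steps fit together.

    import numpy as np

    class MinVolumeLearner:
        """Toy enclosing learner: a ball around one class of samples."""
        def fit(self, X):
            self.center = X.mean(axis=0)
            self.radius = np.max(np.linalg.norm(X - self.center, axis=1))
            return self

        def contains(self, x):
            return np.linalg.norm(x - self.center) <= self.radius

    def cognize(classes):
        """Learn one enclosing learner per class, independently of the others."""
        return {label: MinVolumeLearner().fit(X) for label, X in classes.items()}

    def recognize(learners, x):
        """Return the label of a learner enclosing x, or None if x is novel."""
        for label, learner in learners.items():
            if learner.contains(x):
                return label
        return None

    def self_learn(learners, new_label, X_new):
        """Feedback self-learning: add a learner for a novel class
        without touching the existing learners."""
        learners[new_label] = MinVolumeLearner().fit(X_new)
        return learners

A sample for which recognize returns None triggers the feedback self-learning branch of Fig. 1.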

Mathematical Modeling

In order to realize the above ideas for practical usage, we have to link them with concrete mathematical models (Wei, Li & Li, 2007A). The first principle can be modeled as a minimum volume enclosing problem, and the second principle can be ideally modeled as a point detection problem. In fact, the minimum volume enclosing problem is quite hard to solve for samples from an arbitrary distribution, and the actual distribution shape may be too complex to compute directly. A natural alternative, therefore, is to use regular shapes such as spheres (Fischer, Gärtner & Kutz, 2003), ellipsoids, and so on to enclose all samples of the same class with minimum volume and thereby approximate the true minimum volume enclosing boundary (Wei, Löfberg, Feng, Li & Li, 2007). Moreover, this approximation can easily be formulated as a convex optimization problem, so it can be solved efficiently in polynomial time with state-of-the-art open source solvers and modeling tools such as SDPT3 (Toh, Todd & Tutuncu, 1999), SeDuMi (Sturm, 1999), and YALMIP (Löfberg, 2004). The point detection algorithm then simply checks whether a sample is located inside the learned boundary or not.

Enclosing Machine Learning Concepts

Using the previous modeling methods, we can now introduce some important concepts. The new learning methodology has three aspects. The first is to learn each class respectively; we call this cognitive learning. The second is to detect an unknown sample's location and determine its indicator; we call this cognitive classification. The third is to conduct a new cognitive learning process; we call this feedback self-learning, and it imitates the ability to learn samples of a new kind without affecting the existing knowledge. The whole process is depicted in Fig. 1. We can now give the following two definitions.
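
As a concrete illustration of this modeling step, the sketch below computes a minimum enclosing ball of one class in its primal form. It assumes the open-source CVXPY package rather than the MATLAB toolboxes cited above; the ellipsoidal learners are treated in the following sections.

    import cvxpy as cp
    import numpy as np

    def min_enclosing_ball(X):
        """Minimum enclosing ball of the rows of X (one class of samples):
        minimize R subject to ||x_i - c|| <= R."""
        n, d = X.shape
        c = cp.Variable(d)             # center of the ball
        R = cp.Variable(nonneg=True)   # radius
        constraints = [cp.norm(X[i] - c) <= R for i in range(n)]
        cp.Problem(cp.Minimize(R), constraints).solve()
        return c.value, R.value

    # Enclose a small 2-D class and test a fresh point for membership.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(30, 2))
    c, R = min_enclosing_ball(X)
    print(np.linalg.norm(rng.normal(size=2) - c) <= R)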

Class Learner. A cognitive class learner is defined as the bounding boundary of a minimum volume set which encloses all the given samples. The cognitive learner can be a sphere, an ellipsoid, or a combination of them. Fig. 2, Fig. 3 and Fig. 4 depict examples of the sphere learner, the ellipsoid learner, and the combinational ellipsoid learner in 2D.

Fig. 1 The enclosing machine learning process. The solid line denotes the cognizing process, the dotted line denotes the recognizing process, and the dash-dotted line denotes the feedback self-learning process triggered when a sample is judged novel.

Fig. 2 Sphere learner

Fig. 3 Ellipsoid learner

Fig. 4 Combinational ellipsoid learner

Remarks: Among the three types of learner illustrated above, the sphere learner generally has the biggest volume, the single ellipsoid learner is next, and the combinational ellipsoid learner has the smallest volume.

Recognizer. A cognitive recognizer is defined as the point detection and assignment algorithm. The cognitive learner should have at least the following features to achieve commendable performance:
A. regular and convenient to calculate;
B. bounding with the minimum volume;
C. a convex body, to guarantee optimality;
D. fault tolerant, to assure generalization performance.

Basic geometric shapes are the best choices, because they are convex bodies and operations such as intersection, union, or complement of basic geometric shapes can easily be implemented with convex optimization methods. We therefore propose to use basic geometric shapes such as the sphere, box, or ellipsoid as base learners. The cognitive learner then uses these geometric shapes to enclose all the given samples with a minimum volume objective in the feature space. This is the most important reason why we call this learning paradigm enclosing machine learning.

Cognitive Learning & Classification Algorithms

We first investigate the difference between enclosing machine learning and feature space partition based methods. Fig. 5 gives a geometric illustration of the differences. In the cognizing (learning) process, each class is described by a cognitive class description learner. In the recognizing (classification) process, we only need to check which bounding learner the testing sample is located inside.

Fig. 5 A geometric illustration of learning three classes of samples via enclosing machine learning vs. the feature space partition learning paradigm. (a) The cognitive learner is the bounding minimum volume ellipsoid, while the cognitive recognizer is the point location detection algorithm for the testing sample. (b) All three classes are separated by three hyperplanes.
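
For an ellipsoidal learner with center c, shape matrix Sigma, and radius R, the recognizing step just described reduces to a Mahalanobis distance test. A minimal NumPy sketch (the data structures are illustrative, not part of the original formulation):

    import numpy as np

    def inside_ellipsoid(x, center, shape, radius):
        """Point location test: is x inside {x : (x-c)^T Sigma^{-1} (x-c) <= R^2}?"""
        diff = x - center
        mahal_sq = diff @ np.linalg.solve(shape, diff)
        return mahal_sq <= radius ** 2

    def recognize(x, learners):
        """Return the labels of all cognitive learners whose boundary encloses x;
        an empty list flags x as a novel sample."""
        return [label for label, (c, S, R) in learners.items()
                if inside_ellipsoid(x, c, S, R)]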

In the partition based learning paradigm, by contrast, each pair of classes is separated by a hyperplane (or another boundary form, such as a hypersphere) during the learning process, while during classification we need to check whether the sample lies on the left or right side of the hyperplane and then assign the corresponding class indicator. We can see that the feature space partition learning paradigm in fact amplifies the real distribution region of each class, whereas the enclosing machine learning paradigm obtains a more reasonable distribution region for each class.

In enclosing machine learning, the most important step is to obtain a proper description of each single class of samples. From the mathematical point of view, our cognitive class description methods are the so-called one-class classification (OCC) methods (Schölkopf, Platt, Shawe-Taylor, Smola & Williamson, 2001). OCC can recognize new samples that resemble the training set and detect uncharacteristic samples, or outliers, to avoid ungrounded classification. By far the best-known examples of OCC have been studied in the context of SVM. The One-Class Support Vector Machine (OCSVM) (Schölkopf et al., 2001) first maps the data from the original input space to a feature space via some map and then constructs a hyperplane that separates the mapped patterns from the origin with maximum margin. The one-class SVM proposed by Tax (Tax & Duin, 1999; Tax, 2001), named support vector domain description (SVDD), instead seeks the minimum hypersphere that encloses all the data of the target class in a feature space. In this way it finds a descriptive area that covers the data and excludes the superfluous space that results in false alarms. However, both OCSVM and SVDD depend on the Euclidean distance, which is often sub-optimal. An important issue in any Euclidean distance based learning algorithm is the scale of the input variables.

Tax et al. (Tax & Juszczak, 2003) therefore proposed a KPCA based technique that rescales the data in a kernel feature space to unit variance, in order to minimize the influence of the input variable scales. Others proposed to maximize the Mahalanobis distance of the hyperplane to the origin instead, which is the core idea of the One-Class Minimax Probability Machine (OCMPM) (Lanckriet, Ghaoui & Jordan, 2002) and the Mahalanobis One-Class Support Vector Machine (MOCSVM) (Tsang, Kwok & Li, 2006). Because the Mahalanobis distance is normalized by the covariance matrix, it is invariant to linear translation, so we need not worry about the scales of the input variables. What is more, to alleviate the undesirable effects of estimation error in the covariance matrix, we can easily incorporate prior knowledge through an uncertainty model and then address it as a robust optimization problem. Because the ellipsoid and the accompanying Mahalanobis distance have the many commendable virtues mentioned above, we proposed to incorporate them into class learning. Our main progress toward class learning, or cognizing, is a new minimum volume enclosing ellipsoid learner and several Mahalanobis distance based OCC methods. In our previous work we proposed a QP based Mahalanobis Ellipsoidal Learning Machine (QP-MELM) (Wei, Huang & Li, 2007A) and a QP based Mahalanobis Hyperplane Learning Machine (QP-MHLM) (Wei, Huang & Li, 2007B), both via solving the dual form, and applications to real-world datasets show promising performance. However, as suggested in (Boyd & Vandenberghe, 2004), if both the primal and dual forms of an optimization problem are feasible, the primal form is preferable. Therefore, we proposed a Second Order Cone Programming representable Mahalanobis Ellipsoidal Learning Machine (SOCP-MELM) (Wei, Li, Feng & Huang, 2007A).

Based on this new learner, we developed several useful learning algorithms.

Minimum Volume Enclosing Ellipsoid Learner

For this learner we summarize several solution approaches (see Kumar, Mitchell & Yildirim, 2003; Kumar & Yildirim, 2005; Sun & Freund, 2004). For the SDP solution, we can directly solve the primal form using the Schur complement theorem. For the log-det solution, we can solve the dual efficiently in polynomial time. For the SOCP solution (Wei, Li, Feng & Huang, 2007B), we can also efficiently solve the primal form in polynomial time. We suppose that all the samples are centered first, so we only give results for the minimum volume enclosing ellipsoid centered at the origin. It is straightforward to lift an ellipsoid with arbitrary center in a d-dimensional space to a generalized ellipsoid centered at the origin in a (d+1)-dimensional space; the reader may consult (Wei, Li, Feng & Huang, 2007A) for details.

Given samples X = [x_1, \dots, x_n] \in R^{d \times n}, suppose E(c, \Sigma) := \{x : (x - c)^T \Sigma^{-1} (x - c) \le 1\} is the demanded ellipsoid. Then the minimum volume problem can be formulated as

\min_{\Sigma, c} \ \ln\det\Sigma \quad \text{s.t.} \quad (x_i - c)^T \Sigma^{-1} (x_i - c) \le 1, \ i = 1, \dots, n.   (1)

However, this is not a convex optimization problem. Fortunately, it can be transformed into the following convex optimization problem

\min_{A, b} \ -\ln\det A \quad \text{s.t.} \quad (A x_i + b)^T (A x_i + b) \le 1, \ A \succ 0, \ i = 1, \dots, n,   (2)
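
Formulation (2) can be passed almost verbatim to a conic solver. The following CVXPY sketch is only an illustration of (2) under that tooling assumption, not the original YALMIP/SDPT3 implementation; the constraint is written with a norm, which is equivalent to the quadratic form in (2).

    import cvxpy as cp
    import numpy as np

    def mvee_primal(X):
        """Solve (2): min -log det A  s.t. ||A x_i + b|| <= 1, A positive definite.
        The learned ellipsoid is {x : ||A x + b|| <= 1} with center c = -A^{-1} b."""
        n, d = X.shape
        A = cp.Variable((d, d), PSD=True)
        b = cp.Variable(d)
        constraints = [cp.norm(A @ X[i] + b) <= 1 for i in range(n)]
        cp.Problem(cp.Minimize(-cp.log_det(A)), constraints).solve()
        center = -np.linalg.solve(A.value, b.value)
        return A.value, center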

where A = \Sigma^{-1/2} and b = -\Sigma^{-1/2} c by the matrix transform. In order to allow errors, and using the Schur complement theorem, (2) can be represented in the following SDP form:

\min_{A, b, \xi} \ -\ln\det A + \gamma \sum_{i=1}^{n} \xi_i \quad \text{s.t.} \quad \begin{bmatrix} I & A x_i + b \\ (A x_i + b)^T & 1 + \xi_i \end{bmatrix} \succeq 0, \ \xi_i \ge 0, \ i = 1, \dots, n.   (3)

Solving (3), we obtain the minimum volume enclosing ellipsoid. However, SDP is quite demanding, especially for large-scale or high-dimensional learning problems.

For the ellipsoid E(c, \Sigma) := \{x : (x - c)^T \Sigma^{-1} (x - c) \le R^2\}, we can reformulate the primal form of the minimum volume enclosing problem as the following SOCP:

\min_{R, \xi \ge 0, c} \ R + \frac{1}{\nu N} \sum_{i=1}^{N} \xi_i \quad \text{s.t.} \quad \sqrt{(x_i - c)^T \Sigma^{-1} (x_i - c)} \le R + \xi_i, \ R \ge 0, \ \xi_i \ge 0, \ i = 1, \dots, N.   (4)

Accordingly, it can be kernelized as

\min_{w, R, \xi \ge 0} \ R + \frac{1}{\nu n} \sum_{i=1}^{n} \xi_i \quad \text{s.t.} \quad \|\Omega^{-1/2} Q^T (k(x_i) - K w)\| \le R + \xi_i, \ R \ge 0, \ \xi_i \ge 0, \ i = 1, \dots, n,   (5)

where c (respectively K w in the kernelized case) is the center of the ellipsoid, R is the generalized radius, n is the number of samples, k(x_i) = [k(x_1, x_i), \dots, k(x_n, x_i)]^T, and K = Q \Omega Q^T is the eigendecomposition of the kernel matrix.
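
In the simplest reading of (4), the shape matrix Sigma is fixed in advance (for example, a regularized sample covariance) and only the center, radius, and slacks are optimized, which makes the problem an SOCP. The sketch below follows that assumption with CVXPY; nu, the regularizer reg, and the Cholesky-based whitening are illustrative choices, not the authors' exact implementation.

    import cvxpy as cp
    import numpy as np

    def socp_melm(X, nu=0.1, reg=1e-6):
        """SOCP sketch of (4) with Sigma fixed to a regularized sample covariance:
        min R + (1/(nu*N)) * sum_i xi_i
        s.t. ||L^{-1} (x_i - c)|| <= R + xi_i,  R >= 0,  xi_i >= 0,
        where Sigma = L L^T, so the left-hand side is the Mahalanobis distance."""
        N, d = X.shape
        Sigma = np.cov(X, rowvar=False) + reg * np.eye(d)
        Linv = np.linalg.inv(np.linalg.cholesky(Sigma))
        c = cp.Variable(d)
        R = cp.Variable(nonneg=True)
        xi = cp.Variable(N, nonneg=True)
        cons = [cp.norm(Linv @ (X[i] - c)) <= R + xi[i] for i in range(N)]
        cp.Problem(cp.Minimize(R + cp.sum(xi) / (nu * N)), cons).solve()
        return c.value, R.value, Sigma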

To obtain a more efficient solution, besides the above primal-form methods we can also reformulate the minimum volume enclosing ellipsoid centered at the origin as the following optimization problem:

\min_{U, \xi} \ -\ln\det U + \gamma \sum_{i=1}^{n} \xi_i \quad \text{s.t.} \quad x_i^T U x_i \le 1 + \xi_i, \ \xi_i \ge 0, \ i = 1, \dots, n,   (6)

where \gamma balances the volume and the errors and the \xi_i \ge 0 are slack variables. Via the optimality and KKT conditions, this problem can be solved efficiently using the following dual representation:

\max_{\alpha \ge 0} \ \ln\det\Big( \sum_{i=1}^{n} \alpha_i x_i x_i^T \Big) \quad \text{s.t.} \quad \sum_{i=1}^{n} \alpha_i = 1,   (7)

where \alpha is the dual variable. We see that (7) cannot be kernelized directly; we therefore need the eigenspectrum trick (Shawe-Taylor, Williams, Cristianini & Kandola, 2002) to kernelize its equivalent counterpart:

\max_{\alpha \ge 0} \ \ln\det\big( D_\alpha^{1/2} K D_\alpha^{1/2} \big) \quad \text{s.t.} \quad \sum_{i=1}^{n} \alpha_i = 1,   (8)

where \alpha is the dual variable, D_\alpha = \mathrm{diag}(\alpha_1, \dots, \alpha_n), and K = [k(x_i, x_j)]_{i,j=1}^{n} is the kernel matrix with entries k(x_1, x_1), \dots, k(x_n, x_n).
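
The dual (7) is a small D-optimal-design-type problem in the n weights alpha and is again directly expressible in CVXPY. This is a sketch under the same tooling assumption; an upper bound on alpha can be added to mimic the soft-margin primal (6).

    import cvxpy as cp
    import numpy as np

    def mvee_dual(X):
        """Sketch of dual (7): maximize log det of the alpha-weighted scatter
        matrix over the probability simplex."""
        n, d = X.shape
        alpha = cp.Variable(n, nonneg=True)
        scatter = sum(alpha[i] * np.outer(X[i], X[i]) for i in range(n))
        prob = cp.Problem(cp.Maximize(cp.log_det(scatter)),
                          [cp.sum(alpha) == 1])
        prob.solve()
        # Samples with alpha_i > 0 are the ones that determine the enclosing boundary.
        return alpha.value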

Multiple Class Classification Algorithms

As pointed out previously, cognitive learning uses minimum volume geometric shapes to enclose the samples of each class, imitating the learning process of the human brain. For a multiple class classification problem, a natural solution is therefore first to approximate each class's sample distribution with a minimum volume geometric shape and then, for a given unknown sample, simply to check whether it lies inside a learner or not. This is the ideal case, in which the single-class distributions do not overlap. When overlaps occur, we proposed two algorithms to handle the situation (Wei, Huang & Li, 2007C).

The first solution uses a distance based metric and assigns the sample to the closest class. The algorithm can be summarized as

f(x) = \arg\min_{k \in \{1, 2, \dots, m\}} \big( \|x - c_k\|_{M_k} - R_k \big),   (9)

where \|\cdot\|_{M_k} denotes the Mahalanobis norm of class k. Another way is to use optimal Bayesian decision theory and assign the indicator of the class with maximum posterior probability:

f(x) = \arg\max_{k \in \{1, 2, \dots, m\}} \frac{P_k}{(2\pi R_k^2)^{d/2}} \exp\Big( -\frac{\|x - c_k\|_{M_k}^2}{2 R_k^2} \Big),   (10)

where d is the dimension of the feature space and P_k = \frac{1}{N} \sum_{i=1}^{N} 1_{\{y_i = k\}} is the prior probability of class k. According to (10), the decision boundary between classes 1 and 2 is given by

\frac{P_1 (2\pi R_1^2)^{-d/2} \exp\big( -\|x - c_1\|_{M_1}^2 / (2 R_1^2) \big)}{P_2 (2\pi R_2^2)^{-d/2} \exp\big( -\|x - c_2\|_{M_2}^2 / (2 R_2^2) \big)} = 1.   (11)
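
A minimal NumPy sketch of the two assignment rules, with each learner stored as an illustrative (c_k, M_k, R_k) triple; the posterior rule (10) is evaluated in log form, which is exactly the simplification derived in (12) and (13) below.

    import numpy as np

    def mahalanobis(x, c, M):
        """Mahalanobis norm ||x - c||_M = sqrt((x - c)^T M^{-1} (x - c))."""
        d = x - c
        return np.sqrt(d @ np.linalg.solve(M, d))

    def assign_by_distance(x, learners):
        """Rule (9): argmin_k ||x - c_k||_{M_k} - R_k."""
        def score(k):
            c, M, R = learners[k]
            return mahalanobis(x, c, M) - R
        return min(learners, key=score)

    def assign_by_posterior(x, learners, priors):
        """Rule (10) in log form: argmax_k log P_k - d*log R_k - ||x - c_k||^2 / (2 R_k^2)."""
        d = len(x)
        def score(k):
            c, M, R = learners[k]
            return np.log(priors[k]) - d * np.log(R) - mahalanobis(x, c, M) ** 2 / (2 * R ** 2)
        return max(learners, key=score)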

This is equivalent to

T_1 - \frac{\|x - c_1\|_{M_1}^2}{2 R_1^2} = T_2 - \frac{\|x - c_2\|_{M_2}^2}{2 R_2^2}.   (12)

Therefore we can give a new decision rule

f(x) = \arg\max_{k} \Big( T_k - \frac{\|x - c_k\|_{M_k}^2}{2 R_k^2} \Big),   (13)

where T_k = \log P_k - d \log R_k can be estimated from the training samples.

Remarks. We also proposed a two-class classification algorithm based on a single MVEE learner (Wei, Li, Feng & Huang, 2007A), which combines MVEE description with SVM discrimination. Using a one-versus-one or one-against-all strategy, it also yields a multiple class classification algorithm. In addition, we are now working on a multiple class classification algorithm with the complexity of a single MVEE based two-class algorithm, which is expected to obtain promising performance.

Gap Tolerant SVM Design

Here we briefly review the new gap tolerant SVM design algorithm. It is based on the minimum volume enclosing ellipsoid learner, which assures a compact description of all the samples. We first find the MVEE around all the samples and thus obtain a Mahalanobis transform. We then use the Mahalanobis transform to whiten all the samples, mapping them to a spherical distribution, and finally construct a standard SVM in this whitened space. The MVEE gap tolerant classifier design algorithm can be summarized as:

Step 1. Solve the MVEE and obtain the shape matrix \Sigma and center c.
Step 2. Whiten the data using the Mahalanobis transform t_i = \Sigma^{-1/2} (x_i - c) and form the new sample pairs (t_i, y_i), i = 1, \dots, n.
Step 3. Solve a standard SVM on the whitened data and obtain the decision function y(x) = \mathrm{sgn}(w^T t + b).
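
A sketch of the three-step design, assuming scikit-learn's SVC for step 3 and a helper solve_mvee (any of the MVEE solvers above) that returns the shape matrix and center; both names are illustrative.

    import numpy as np
    from sklearn.svm import SVC

    def mvee_gap_tolerant_svm(X, y, solve_mvee):
        """Step 1: MVEE around all samples -> (Sigma, c).
        Step 2: whiten with the Mahalanobis transform t_i = Sigma^{-1/2} (x_i - c).
        Step 3: train a standard linear SVM in the whitened space."""
        Sigma, c = solve_mvee(X)                          # step 1
        eigval, eigvec = np.linalg.eigh(Sigma)
        W = eigvec @ np.diag(eigval ** -0.5) @ eigvec.T   # Sigma^{-1/2}
        T = (X - c) @ W                                   # step 2: whitened samples
        clf = SVC(kernel="linear").fit(T, y)              # step 3
        def decide(x_new):
            return clf.predict(((x_new - c) @ W).reshape(1, -1))[0]
        return decide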

Fig. 6 MVEE gap tolerant classifier illustration: a separation hyperplane with margin M inside the minimum volume enclosing ellipsoid, compared with one inside the minimum volume enclosing sphere.

Remarks. This algorithm is very concise and has several commendable features worth noting. The classifier designed with it has a lower VC dimension than traditional designs, and the algorithm is scale invariant. For more details, the reader should refer to (Wei, Li & Dong, 2007).

FUTURE TRENDS

In the future, more learner algorithms will be developed. Another important direction is set based combinational learner algorithms (Wei & Li, 2007; Wei, Li & Li, 2007B). More reasonable classification algorithms will also be a focus. Besides theoretical developments, we will also focus on applications such as novelty detection (Dolia, Page, White & Harris, 2004), face detection, industrial process condition monitoring, and many other possible applications.

CONCLUSION

In this article we introduced the enclosing machine learning paradigm. We focused on its concept definitions and on progress in modeling the cognizing process via the minimum volume enclosing ellipsoid. We then introduced several learning and classification algorithms based on MVEE.

We also reported a new gap tolerant SVM design method based on MVEE. Finally, we gave future development directions.

REFERENCES

Bishop, C. M. (1995). Neural Networks for Pattern Recognition, 1st edn, Oxford: Oxford University Press.

Boyd, S. & Vandenberghe, L. (2004). Convex Optimization, Cambridge: Cambridge University Press.

Dolia, A. N., Page, S. F., White, N. M. & Harris, C. J. (2004). D-optimality for Minimum Volume Ellipsoid with Outliers. In: Proceedings of the Seventh International Conference on Signal/Image Processing and Pattern Recognition (UkrOBRAZ'2004).

Fischer, K., Gärtner, B. & Kutz, M. (2003). Fast Smallest-Enclosing-Ball Computation in High Dimensions. In: Algorithms - ESA 2003, LNCS 2832.

Gerstner, W. & Kistler, W. M. (2002). Spiking Neuron Models: Single Neurons, Populations, Plasticity, 1st edn, Cambridge: Cambridge University Press.

Haykin, S. (1994). Neural Networks: A Comprehensive Foundation, 1st edn, NJ: Prentice Hall Press.

Löfberg, J. (2004). YALMIP: A Toolbox for Modeling and Optimization in MATLAB.

Kumar, P., Mitchell, J. S. B. & Yildirim, E. A. (2003). Approximate minimum enclosing balls in high dimensions using core-sets, The ACM Journal of Experimental Algorithmics, 8(1), 1-29.

Kumar, P. & Yildirim, E. A. (2005). Minimum volume enclosing ellipsoids and core sets, Journal of Optimization Theory and Applications, 126(1), 1-21.

Lanckriet, G. R. G., El Ghaoui, L. & Jordan, M. I. (2002). Robust Novelty Detection with Single-Class MPM. In: Becker, S., Thrun, S. & Obermayer, K. (Eds.): NIPS 15.

Li, Y. H., Wei, X. K. & Liu, J. X. (2004). Engineering Applications of Support Vector Machines, 1st edn, Beijing: Weapon Industry Press.

Li, Y. H. & Wei, X. K. (2005). Fusion Development of Support Vector Machines and Neural Networks, Journal of Air Force Engineering University, 4.

Schölkopf, B., Platt, J., Shawe-Taylor, J., Smola, A. & Williamson, R. (2001). Estimating the Support of a High Dimensional Distribution, Neural Computation, 13(7).

Schölkopf, B. & Smola, A. (2001). Learning with Kernels, 1st edn, Cambridge, MA: MIT Press.

Shawe-Taylor, J., Williams, C., Cristianini, N. & Kandola, J. S. (2002). On the Eigenspectrum of the Gram Matrix and Its Relationship to the Operator Eigenspectrum. In: Cesa-Bianchi, N. et al. (Eds.): Proceedings of the 13th International Conference on Algorithmic Learning Theory (ALT 2002), LNAI 2533.

Sturm, J. F. (1999). Using SeDuMi, a MATLAB Toolbox for Optimization over Symmetric Cones, Optimization Methods and Software, 11-12.

Sun, P. & Freund, R. M. (2004). Computation of Minimum-Volume Covering Ellipsoids, Operations Research, 52(5).

Tax, D. M. J. (2001). One-class Classification: Concept-learning in the Absence of Counter-examples, PhD Thesis, Delft University of Technology.

Tax, D. M. J. & Duin, R. P. W. (1999). Support Vector Domain Description, Pattern Recognition Letters, 20.

Tax, D. M. J. & Juszczak, P. (2003). Kernel Whitening for One-Class Classification, International Journal of Pattern Recognition and Artificial Intelligence, 17(3).

Toh, K. C., Todd, M. J. & Tutuncu, R. H. (1999). SDPT3 - a MATLAB Software Package for Semidefinite Programming, Optimization Methods and Software, 11.

Tsang, I. W., Kwok, J. T. & Li, S. (2006). Learning the Kernel in Mahalanobis One-class Support Vector Machines. In: Proceedings of the IJCNN 2006 Conference.

Vapnik, V. N. (1995). The Nature of Statistical Learning Theory, 1st edn, New York: Springer-Verlag.

Wei, X. K., Huang, G. B. & Li, Y. H. (2007A). Mahalanobis Ellipsoidal Learning Machine for One Class Classification. In: Proceedings of ICMLC 2007, accepted.

Wei, X. K., Huang, G. B. & Li, Y. H. (2007B). A New One Class Mahalanobis Hyperplane Learning Machine based on QP and SVD. In: Proceedings of LSMS 2007, DCDIS journal, accepted.

Wei, X. K., Huang, G. B. & Li, Y. H. (2007C). Bayes Cognitive Ellipsoidal Learning Machine for Recognizing Process Imitation. In: Proceedings of LSMS 2007, DCDIS journal, accepted.

Wei, X. K., Li, Y. H. & Feng, Y. (2006). Comparative Study of Extreme Learning Machine and Support Vector Machines. In: Wang, J. et al. (Eds.): Advances in Neural Networks - ISNN 2006, LNCS 3971.

Wei, X. K., Li, Y. H., Feng, Y. & Huang, G. B. (2007A). Minimum Mahalanobis Enclosing Ellipsoid Machine for Pattern Classification. In: Huang, D.-S., Heutte, L. & Loog, M. (Eds.): ICIC 2007, CCIS 2.

Wei, X. K., Li, Y. H., Feng, Y. & Huang, G. B. (2007B). Solving Mahalanobis Ellipsoidal Learning Machine via Second Order Cone Programming. In: Huang, D.-S., Heutte, L. & Loog, M. (Eds.): ICIC 2007, CCIS 2.

Wei, X. K. & Li, Y. H. (2007). Linear Programming Minimum Sphere Set Covering for Extreme Learning Machines, Neurocomputing (in press).

Wei, X. K., Li, Y. H. & Dong, Y. (2007). A New Gap Tolerant SVM Classifier Design based on Minimum Volume Enclosing Ellipsoid. In: China Conference of Pattern Recognition 2007, accepted.

Wei, X. K., Li, Y. H. & Li, Y. F. (2007A). Enclosing Machine Learning: Concepts and Algorithms, Neural Computing and Applications (in press).

Wei, X. K., Li, Y. H. & Li, Y. F. (2007B). Optimum Neural Network Construction via Linear Programming Minimum Sphere Set Covering. In: Alhajj, R. et al. (Eds.): ADMA 2007, LNAI 4632.

Wei, X. K., Löfberg, J., Feng, Y., Li, Y. H. & Li, Y. F. (2007). Enclosing Machine Learning for Class Description. In: Liu, D. et al. (Eds.): Advances in Neural Networks - ISNN 2007, LNCS 4491.

KEY TERMS AND THEIR DEFINITIONS

Enclosing Machine Learning: A new machine learning paradigm based on function imitation of the human cognizing and recognizing process.

Cognitive Learner: A cognitive learner is defined as the bounding boundary of a minimum volume set which encloses all the given samples, to imitate the learning process.

Cognitive Recognizer: A cognitive recognizer is defined as the point detection and assignment algorithm, to imitate the recognizing process.

MVEE Gap Tolerant Classifier: An MVEE gap tolerant classifier is specified by the shape matrix and location of an ellipsoid and by two hyperplanes with parallel normals. The set of points lying between (but not on) the hyperplanes is called the margin set. Points that lie inside the ellipsoid but not in the margin set are assigned a class, ±1, depending on which side of the margin set they lie on. All other points are defined to be correct: they are not assigned a class. An MVEE gap tolerant classifier is in fact a special kind of Support Vector Machine which does not count data falling outside the ellipsoid containing the training data, or inside the margin, as errors.


More information

A New Approach For the Ranking of Fuzzy Sets With Different Heights

A New Approach For the Ranking of Fuzzy Sets With Different Heights New pproach For the ankng of Fuzzy Sets Wth Dfferent Heghts Pushpnder Sngh School of Mathematcs Computer pplcatons Thapar Unversty, Patala-7 00 Inda pushpndersnl@gmalcom STCT ankng of fuzzy sets plays

More information

Load Balancing for Hex-Cell Interconnection Network

Load Balancing for Hex-Cell Interconnection Network Int. J. Communcatons, Network and System Scences,,, - Publshed Onlne Aprl n ScRes. http://www.scrp.org/journal/jcns http://dx.do.org/./jcns.. Load Balancng for Hex-Cell Interconnecton Network Saher Manaseer,

More information

Face Recognition Method Based on Within-class Clustering SVM

Face Recognition Method Based on Within-class Clustering SVM Face Recognton Method Based on Wthn-class Clusterng SVM Yan Wu, Xao Yao and Yng Xa Department of Computer Scence and Engneerng Tong Unversty Shangha, Chna Abstract - A face recognton method based on Wthn-class

More information

A New Feature of Uniformity of Image Texture Directions Coinciding with the Human Eyes Perception 1

A New Feature of Uniformity of Image Texture Directions Coinciding with the Human Eyes Perception 1 A New Feature of Unformty of Image Texture Drectons Concdng wth the Human Eyes Percepton Xng-Jan He, De-Shuang Huang, Yue Zhang, Tat-Mng Lo 2, and Mchael R. Lyu 3 Intellgent Computng Lab, Insttute of Intellgent

More information

A Modified Median Filter for the Removal of Impulse Noise Based on the Support Vector Machines

A Modified Median Filter for the Removal of Impulse Noise Based on the Support Vector Machines A Modfed Medan Flter for the Removal of Impulse Nose Based on the Support Vector Machnes H. GOMEZ-MORENO, S. MALDONADO-BASCON, F. LOPEZ-FERRERAS, M. UTRILLA- MANSO AND P. GIL-JIMENEZ Departamento de Teoría

More information

TN348: Openlab Module - Colocalization

TN348: Openlab Module - Colocalization TN348: Openlab Module - Colocalzaton Topc The Colocalzaton module provdes the faclty to vsualze and quantfy colocalzaton between pars of mages. The Colocalzaton wndow contans a prevew of the two mages

More information

EVALUATION OF THE PERFORMANCES OF ARTIFICIAL BEE COLONY AND INVASIVE WEED OPTIMIZATION ALGORITHMS ON THE MODIFIED BENCHMARK FUNCTIONS

EVALUATION OF THE PERFORMANCES OF ARTIFICIAL BEE COLONY AND INVASIVE WEED OPTIMIZATION ALGORITHMS ON THE MODIFIED BENCHMARK FUNCTIONS Academc Research Internatonal ISS-L: 3-9553, ISS: 3-9944 Vol., o. 3, May 0 EVALUATIO OF THE PERFORMACES OF ARTIFICIAL BEE COLOY AD IVASIVE WEED OPTIMIZATIO ALGORITHMS O THE MODIFIED BECHMARK FUCTIOS Dlay

More information

Local Quaternary Patterns and Feature Local Quaternary Patterns

Local Quaternary Patterns and Feature Local Quaternary Patterns Local Quaternary Patterns and Feature Local Quaternary Patterns Jayu Gu and Chengjun Lu The Department of Computer Scence, New Jersey Insttute of Technology, Newark, NJ 0102, USA Abstract - Ths paper presents

More information

Harvard University CS 101 Fall 2005, Shimon Schocken. Assembler. Elements of Computing Systems 1 Assembler (Ch. 6)

Harvard University CS 101 Fall 2005, Shimon Schocken. Assembler. Elements of Computing Systems 1 Assembler (Ch. 6) Harvard Unversty CS 101 Fall 2005, Shmon Schocken Assembler Elements of Computng Systems 1 Assembler (Ch. 6) Why care about assemblers? Because Assemblers employ some nfty trcks Assemblers are the frst

More information

The Codesign Challenge

The Codesign Challenge ECE 4530 Codesgn Challenge Fall 2007 Hardware/Software Codesgn The Codesgn Challenge Objectves In the codesgn challenge, your task s to accelerate a gven software reference mplementaton as fast as possble.

More information