Robust Dictionary Learning with Capped l1-Norm


Proceedings of the Twenty-Fourth International Joint Conference on Artificial Intelligence (IJCAI 2015)

Robust Dictionary Learning with Capped l1-Norm

Wenhao Jiang, Feiping Nie, Heng Huang
University of Texas at Arlington, Arlington, Texas 76019, USA
cswhjang@gmail.com, feipingnie@gmail.com, heng@uta.edu

Abstract

Expressing data vectors as sparse linear combinations of basis elements (a dictionary) is widely used in machine learning, signal processing, and statistics, and it has been found that dictionaries learned from data are more effective than off-the-shelf ones. Dictionary learning has become an important tool for computer vision. Traditional dictionary learning methods use a quadratic loss function, which is known to be sensitive to outliers, so they cannot learn good dictionaries when outliers exist. In this paper, aiming at learning dictionaries resistant to outliers, we propose capped l1-norm based dictionary learning and an efficient iterative re-weighted algorithm to solve the problem. We provide theoretical analysis and carry out extensive experiments on real-world and synthetic datasets to show the effectiveness of our method.

1 Introduction

Dictionary learning [Tosic and Frossard, 2011] and sparse representation [Olshausen and Field, 1997] are important tools for computer vision. Dictionary learning seeks to learn an adaptive set of basis elements (a dictionary) from data instead of using predefined ones [Mallat, 1999], so that each data sample is represented by a sparse linear combination of these basis vectors. It has achieved state-of-the-art performance for numerous image processing tasks such as classification [Raina et al., 2007; Mairal et al., 2009b], denoising [Elad and Aharon, 2006], self-taught learning [Wang et al., 2013b], and audio processing [Grosse et al., 2007]. Unlike principal component analysis, dictionary learning does not impose that the dictionary be orthogonal, hence it allows more flexibility in representing data. Usually, dictionary learning is formulated as:

   min_{D ∈ C, A} ||X − DA||_F^2,   s.t. ||A||_0 ≤ γ.   (1)
In this formulation, X ∈ R^{d×n} is the data matrix whose columns represent samples, A ∈ R^{K×n} contains the new representations of the data, and the matrix D ∈ R^{d×K} contains the K basis vectors to learn.

(Corresponding Author. This work was partially supported by NSF IIS-7965, IIS-, IIS-34452, DBI-.)

Figure 1: (a) Capped l1-norm loss (ε = 2.5); (b) l1-norm loss.

To prevent the l2 norms of the columns of D from being arbitrarily large, which would lead to arbitrarily small values in the columns of A, it is common to constrain the columns to have l2 norm less than or equal to 1. Hence, D lies in the following convex set of matrices:

   C = { D ∈ R^{d×K} : ||d_j||_2 ≤ 1, j = 1, ..., K }.   (2)

The learned dictionary is the foundation of the sparse representations, and a dictionary with good expressive ability is the key to good performance. It is well known that the quadratic loss function is not robust to outliers. Training a dictionary uses a large amount of data, and outliers are unavoidably included in the training data, so it is necessary to learn dictionaries that are resistant to outliers. To achieve this, the quadratic loss function can be replaced by a loss function that is not sensitive to outliers, e.g. the l1-norm loss or the Huber loss. In this paper, we use the capped l1-norm loss function l_ε(u) = min(|u|, ε), where ε is a parameter; it is illustrated in Fig. 1. This loss function treats all u with |u| bigger than ε equally, hence it is more robust to outliers than the l1-norm. Unfortunately, it is not convex. In this paper, we propose an efficient algorithm to find local optimal solutions, and we also provide a theoretical analysis of the method.

2 Related Work

In this section, we present a brief review of dictionary learning methods. Since the original dictionary learning model (1) is NP-hard, a straightforward way to find the dictionary is to adopt a greedy

strategy. K-SVD [Aharon et al., 2006] tries to solve the following model:

   min_{D, A} (1/2) ||X − DA||_F^2
   s.t. ||d_j||_2 = 1, for j = 1, 2, ..., K;   ||a_i||_0 ≤ T_0, for i = 1, 2, ..., n,   (3)

where d_j is the j-th column of matrix D. K-SVD is an iterative method that alternates between sparse coding of the examples based on the current dictionary and a process of updating the dictionary atoms. The columns of the dictionary are updated sequentially with the SVD, and the corresponding coefficients are updated with any pursuit algorithm. The K-SVD algorithm showed good performance on image denoising. As extensions of K-SVD, LC-KSVD [Jiang et al., 2013] and discriminative K-SVD [Zhang and Li, 2010] introduce label information into the procedure of learning dictionaries, so the learned dictionaries are more discriminative. Besides the greedy strategy for problems with l0 constraints, the original problem can be relaxed into the following traditional dictionary learning problem with l1 regularization:

   min_{D ∈ C, A} ||X − DA||_F^2 + λ ||A||_1.   (4)

It is convex separately with respect to D and A; an l1-regularized least squares problem and an l2-constrained least squares problem are solved iteratively to find the solutions. In [Lee et al., 2007], a feature-sign search algorithm for learning the coefficients and a Lagrange dual method for learning the bases were proposed to compute the dictionary and the corresponding representations. An online algorithm was proposed to solve it efficiently in [Mairal et al., 2009a]. The methods mentioned above all use quadratic loss functions to measure the reconstruction errors, hence they are not robust enough to outliers. In order to learn dictionaries robustly, Lu et al. proposed online robust dictionary learning (ORDL) [Lu et al., 2013], which uses an l1 loss function and an l1 regularization term and can be expressed as:

   min_{D ∈ C, A} Σ_i ( ||x_i − D a_i||_1 + λ ||a_i||_1 ).   (5)

It updates the dictionary and representations alternately in an online way. For visual tracking, Wang et al. proposed online robust non-negative dictionary learning [Wang et al., 2013c], which uses the Huber loss function.
In [Wang et al., 2013a], a semi-supervised robust dictionary learning model was proposed by solving an l_p-norm based objective.

3 Proposed New Algorithm

To facilitate the presentation of our method, we describe the notations in the following subsection.

3.1 Notations

Throughout this paper, the following definitions and notations are used. We use bold upper-case letters for matrices, bold lower-case letters for vectors, and regular lower-case letters for scalars. For a vector a = (a_1, a_2, ..., a_m)^T ∈ R^m, let ||a||_1 = Σ_{i=1}^m |a_i| be its l1-norm. Similarly, we define ||A||_0 as the number of nonzero elements in matrix A and ||A||_1 = Σ_{ij} |a_{ij}| as the l1-norm of matrix A.

3.2 Robust dictionary learning

The capped l1-norm has been used in [Zhang, 2010; 2013] as a better approximation of l0-norm regularization. An extension, capped-l1,l1 regularization, was proposed in the multi-task feature learning setting [Gong et al., 2013]. In this paper, we use the capped l1-norm as a robust loss function. For dictionary learning, our goal is to find a set of high quality atoms. A straightforward way is to use the l1 loss function; to provide better robustness, we go further and use the capped l1-norm loss function. As illustrated in Fig. 1, the objective value of the capped l1-norm loss does not increase any more once |u| is larger than ε. Therefore, the capped l1-norm loss is more robust than the l1-norm loss. We use the same constraints for the dictionary, and the objective function of our robust dictionary learning with capped l1-norm is expressed as:

   min_{D ∈ C, A} Σ_i min(||x_i − D a_i||_2, ε) + λ ||A||_1.   (6)

From this objective function, we can see that if ε is set as ∞, it becomes the l2,1-norm loss with the same constraints. We define f(A) = ||A||_1, which is a convex function, and define a concave function L(u) = min(√u, ε) for u > 0 and g(D, a_i) = ||x_i − D a_i||_2^2. We have:

   L′(u) = 1/(2√u),   if 0 < u < ε^2,
           0,          if u > ε^2.   (7)

Our objective function is thus the sum of a convex function and a concave function.
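As a concrete illustration, the capped l1-norm loss and the objective (6) can be evaluated in a few lines of NumPy. This is a sketch with our own function names (not code from the paper), assuming the columns of X are the samples:

```python
import numpy as np

def capped_l1(u, eps):
    # Capped l1 loss: min(|u|, eps). Residuals beyond eps all
    # contribute the same value, so extreme outliers cannot
    # dominate the objective the way a quadratic loss allows.
    return np.minimum(np.abs(u), eps)

def objective(X, D, A, lam, eps):
    # Objective (6): sum_i min(||x_i - D a_i||_2, eps) + lam * ||A||_1,
    # where x_i and a_i are the i-th columns of X and A.
    residuals = np.linalg.norm(X - D @ A, axis=0)
    return capped_l1(residuals, eps).sum() + lam * np.abs(A).sum()
```

With `eps = np.inf` the first term reduces to the l2,1-norm of X − DA, matching the remark above.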
According to the re-weighted method proposed in [Nie et al., 2010; 2014], it can be solved by updating D, A and the auxiliary variables s_i with the following updating rules:

   [D, A] = arg min_{D ∈ C, A} Σ_i s_i ||x_i − D a_i||_2^2 + λ ||A||_1,   (8)

   s_i = 1/(2 ||x_i − D a_i||_2),   if ||x_i − D a_i||_2 ≤ ε,
         0,                         otherwise.   (9)

The whole iterative re-weighted method is listed in Alg. 1. The subproblem (8) is similar to traditional dictionary learning; the only difference lies in the fact that the samples are not treated equally. It is weighted dictionary learning. From the updating rule for s_i, we can see that samples with lower reconstruction errors receive higher weights, which is very intuitive for learning dictionaries robustly. The subproblem (8) is convex separately with respect to D and A. We adopt the same strategy as for traditional dictionary learning and solve it by updating D and A alternately. With A fixed, the objective function becomes:

   min_{D ∈ C} Σ_i s_i ||x_i − D a_i||_2^2,   (10)

which is equivalent to:

   min_{D ∈ C} Σ_i ||z_i − D c_i||_2^2,   (11)

where z_i = √s_i x_i and c_i = √s_i a_i. To solve the above problem, we update D column by column, similar to traditional dictionary learning. The algorithm we use is adopted from [Mairal et al., 2009a] and is described in Alg. 3. With D fixed, the objective of (8) becomes:

   A = arg min_A Σ_i s_i ||x_i − D a_i||_2^2 + λ ||A||_1,   (12)

which can be decomposed into n independent problems as follows:

   a_i = arg min_a s_i ||x_i − D a||_2^2 + λ ||a||_1.   (13)

If s_i ≠ 0, this problem can be transformed into the following LASSO problem:

   a_i = arg min_a ||x_i − D a||_2^2 + (λ/s_i) ||a||_1,   (14)

which can be solved efficiently. The algorithm to solve weighted dictionary learning is summarized in Alg. 2. During the stage of training the dictionary D, if s_i is 0, the corresponding representation a_i will be a zero vector. Hence, when training D, our method does not find codings for the outliers. However, the outliers in the D-training stage might not be outliers for classification tasks, so we let the classification algorithm decide how to use the codings of the outliers. With the learned D, in order to get the codings for outliers, we simply run Alg. 1 with ε set as ∞ and omit the D-updating step (step 1 in Alg. 2) to learn the representations for both training and testing data with the same λ. In fact, we are then solving a dictionary learning problem with the l2,1-norm loss function.

Algorithm 1 Robust dictionary learning with capped l1-norm
input: Data matrix X, dictionary size K, λ and ε.
Initialize D, A and s_i = 1 for i = 1, 2, ..., n.
repeat
1. Solve subproblem (8) with Algorithm 2.
2. Update s_i for i = 1, ..., n as in (9).
until converge
output: D and A
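The LASSO subproblem (14) can be solved by any standard method; the paper does not mandate a specific solver. As one common choice, a minimal ISTA (proximal gradient) sketch looks like this, with `lam` playing the role of λ/s_i:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (elementwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(x, D, lam, n_iter=500):
    # ISTA for  min_a ||x - D a||_2^2 + lam * ||a||_1,
    # i.e. problem (14) with lam = lambda / s_i.
    # Step size 1/L, where L bounds the Lipschitz constant of the gradient.
    L = 2.0 * np.linalg.norm(D, 2) ** 2 + 1e-12
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = 2.0 * D.T @ (D @ a - x)
        a = soft_threshold(a - grad / L, lam / L)
    return a
```

For s_i = 0 no solve is needed: the coding is simply a_i = 0, as in Alg. 2.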
Algorithm 2 Weighted dictionary learning
input: Data matrix X, initial dictionary matrix D_0, initial representation matrix A_0, λ and weight vector s.
Initialize D = D_0 and A = A_0.
repeat
1. Update D with Algorithm 3.
2. Update a_i for i = 1, ..., n as
      a_i = arg min_a ||x_i − D a||_2^2 + (λ/s_i) ||a||_1   if s_i > 0,
      a_i = 0   otherwise.
until converge
output: D, A

Algorithm 3 Dictionary update
input: Data matrix X, dictionary matrix D, representation matrix A and weight vector s.
Compute the matrices G = [g_1, ..., g_K] ∈ R^{K×K} and H = [h_1, ..., h_K] ∈ R^{d×K} as
   G = Σ_i s_i a_i a_i^T,   H = Σ_i s_i x_i a_i^T.
repeat
for j = 1 to K do
   Update the j-th column of D:
      u_j = (h_j − D g_j)/G_jj + d_j,
      d_j = u_j / max(||u_j||_2, 1).
end for
until converge
output: D

3.3 Convergence analysis

Theorem 3.1. The algorithm described in Alg. 1 decreases the objective value of (6) in each iteration until it converges.

Proof. We define h(w) = w^2, and denote u = ||x_i − D a_i||_2^2 and L(u) = min(√u, ε). The capped l1-norm loss function min(||x_i − D a_i||_2, ε) can be re-written as:

   min(||x_i − D a_i||_2, ε)   (15)
   = inf_{s ≥ 0} [ s ||x_i − D a_i||_2^2 − L*(s) ],   (16)

where L*(s) is the concave dual of L(u), defined as:

   L*(s) = inf_u [ s u − L(u) ].   (17)

Plugging in L(u), it is easy to find that:

   L*(s) = −1/(4s),    if √u < ε,
           s ε^2 − ε,   if √u ≥ ε,   (18)

where u is the value attaining the infimum in (17). Therefore, the capped l1-norm loss can be expressed as:

   min(||x_i − D a_i||_2, ε) = inf_{s_i ≥ 0} L_{x_i}(s_i, D, a_i),   (19)

where

   L_{x_i}(s_i, D, a_i) = s_i ||x_i − D a_i||_2^2 + 1/(4 s_i),     if ||x_i − D a_i||_2 < ε,
                          s_i ||x_i − D a_i||_2^2 − s_i ε^2 + ε,   if ||x_i − D a_i||_2 ≥ ε.   (20)

Hence, the objective function (6) is equivalent to the following objective:

   min_{D ∈ C, A} inf_{s ≥ 0} Σ_i [ L_{x_i}(s_i, D, a_i) + λ ||a_i||_1 ],   (21)

which can be re-written as:

   min_{D ∈ C, A, s ≥ 0} o(D, A, s),   (22)

where we denote:

   o(D, A, s) = Σ_i [ L_{x_i}(s_i, D, a_i) + λ ||a_i||_1 ].   (23)

Therefore, the algorithm described in Alg. 1 can be seen as a two-stage optimization method.

Updating D and A stage: D and A are updated by solving:

   min_{D ∈ C, A} o(D, A, s),   (24)

which is equivalent to:

   min_{D ∈ C, A} Σ_i s_i ||x_i − D a_i||_2^2 + λ Σ_i ||a_i||_1.   (25)

The updated D and A decrease the objective value of (25); hence, they also decrease the objective value of (24).

Updating s stage: The auxiliary variables s_i are updated by solving:

   min_{s ≥ 0} o(D, A, s),   (26)

which is equivalent to n independent problems:

   min_{s_i ≥ 0} L_{x_i}(s_i, D, a_i).   (27)

Both stages decrease the value of the objective function (21); hence our algorithm also decreases the value of the original objective function (6) and is guaranteed to converge.

4 Experimental Results

In this section, we analyze and illustrate the performance of our method on real-world datasets.

4.1 Preliminary study on natural image patches

First, we present a preliminary analysis of our method on natural image patches. In our method, λ mainly controls the sparsity of the representations and ε mainly controls the number of outliers identified by the algorithm. ε is related to the residuals of the representations: if the residual of a sample is larger than ε, the sample is not used to train the dictionary, since the corresponding s_i is zero. We first set ε as ∞ to obtain the distribution of the residuals, and use this distribution as an estimate of the distribution of residuals under the optimal D and A. We then set ε such that a given fraction of samples are treated as outliers. In this experiment, we set the fraction of outliers as 0.05 and λ = 0.1 empirically. The values of the objective function during the iterations are presented in Fig. 2; our method converged in only 22 iterations. The bases learned on natural image patches by our method are shown in Fig. 3.

Figure 2: The values of the objective function during the iterations when learning dictionaries on 10,000 natural image patches.

Figure 3: The 1,024 learned bases (each 14×14 pixels) on 10,000 natural image patches.
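The training procedure described in Section 3 can be sketched as follows. This is our own minimal NumPy rendering (function names are ours) of the weight update (9) and the column-wise dictionary update of Alg. 3; the coding step (14) would be handled by any LASSO solver and is omitted here:

```python
import numpy as np

def update_weights(X, D, A, eps):
    # Eq. (9): s_i = 1/(2 ||x_i - D a_i||_2) when the residual is
    # below eps, and s_i = 0 otherwise (sample treated as outlier).
    r = np.linalg.norm(X - D @ A, axis=0)
    return np.where(r < eps, 1.0 / (2.0 * np.maximum(r, 1e-12)), 0.0)

def update_dictionary(X, A, s, D, n_sweeps=5):
    # Alg. 3: block-coordinate update with
    # G = sum_i s_i a_i a_i^T and H = sum_i s_i x_i a_i^T;
    # each column is updated and projected onto the unit l2 ball.
    G = (A * s) @ A.T            # K x K
    H = (X * s) @ A.T            # d x K
    D = D.copy()
    for _ in range(n_sweeps):
        for j in range(D.shape[1]):
            if G[j, j] > 1e-12:
                u = (H[:, j] - D @ G[:, j]) / G[j, j] + D[:, j]
                D[:, j] = u / max(np.linalg.norm(u), 1.0)
    return D
```

A full iteration of Alg. 1 alternates `update_dictionary`, a LASSO solve for each a_i with s_i > 0, and `update_weights`; each stage leaves the surrogate objective non-increasing, which is the content of Theorem 3.1.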
4.2 Face recognition without occlusion

In this subsection, we provide experimental results on face recognition tasks without synthetic occlusion on the extended Yale B dataset [Georghiades et al., 2001] and the AR face dataset [Martinez, 1998].

Extended Yale B dataset. The extended Yale B database [Georghiades et al., 2001] contains 2,414 images of 38 human frontal faces under 64 illumination conditions and expressions. There are about 64 images for each person. The original images were cropped to 192×168 pixels. Some samples are shown in Fig. 4(a). We do not perform any preprocessing on the images. Following [Wright et al., 2009], we project each face image into a 504-dimensional feature vector using a random matrix. We split the database randomly into two halves. One half, which contains about 32 images for each person, was used for training the dictionary; the other half was used for testing.
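The random-matrix feature projection borrowed from [Wright et al., 2009] can be sketched as below. The function name is ours, and row-normalizing the random matrix is an assumption of this sketch, not something the text specifies:

```python
import numpy as np

def random_projection(images, dim, seed=0):
    # Project flattened face images (one per row) down to `dim`
    # dimensions with a random Gaussian matrix, in the spirit of the
    # random features of [Wright et al., 2009]. Rows of R are
    # normalized to unit length (our choice).
    rng = np.random.default_rng(seed)
    R = rng.standard_normal((dim, images.shape[1]))
    R /= np.linalg.norm(R, axis=1, keepdims=True)
    return images @ R.T
```

For the Yale B setup above, `dim` would be 504 and each row of `images` a flattened cropped face.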

Figure 4: Sample images. (a) Sample images from the extended Yale B dataset; (b) sample images from the AR face dataset.

Table 1: Average classification accuracies (%) on the extended Yale B dataset.

Method          Accuracy (%)
Capped Norm     96.9
Traditional DL
KSVD
LC-KSVD1        93.6
LC-KSVD2
D-KSVD
ORDL            89.7

In the practical implementation of our method, we first set ε as ∞ and choose the best λ with cross validation; then we fix λ and choose ε. For face recognition, we compared our method with several state-of-the-art methods, including traditional dictionary learning [Mairal et al., 2009a], K-SVD [Aharon et al., 2006], LC-KSVD1 and LC-KSVD2 [Jiang et al., 2013], D-KSVD [Zhang and Li, 2010] and ORDL [Lu et al., 2013]. The dictionary size is 570 for all methods, i.e. 15 atoms per person on average. The parameters for these methods were selected by cross validation. We ran all methods on 10 different splits of training and testing sets. The results are summarized in Table 1; our method achieved the best performance. The reason that ORDL did not perform well might be that the dictionary and representations trained with the l1-norm loss are not suitable for classification tasks. We also show that our method is robust when training the dictionary. Recall that in our method s_i = 0 means that the i-th sample is recognized as an outlier and not used in training the dictionary. In a run on one random split, our method found 20 outliers, all shown in Fig. 5. Most of the outliers found by our method have extreme illumination, which would affect the quality of the bases.

Figure 5: The outliers found by our method when training the dictionary on a randomly selected training set from the extended Yale B dataset.

AR face dataset. The AR face dataset [Martinez, 1998] consists of over 4,000 color images from 126 individuals. For each individual, 26 pictures were taken in two separate sessions. Compared to the extended Yale B dataset, as shown
in Fig. 4(b), the AR face dataset includes more facial variations, including different illumination conditions, different expressions, and different facial disguises (sunglasses and scarves). Following the standard evaluation procedure from [Wright et al., 2009], we used a subset of the database consisting of 2,600 images from 50 male subjects and 50 female subjects. For each person, 20 images were randomly selected for training and the remaining images were used for testing. Each face image was cropped to 165×120 pixels and then projected into a 540-dimensional feature vector. Similar to the experiments on the Yale B dataset, we ran all methods on 10 different splits of training and testing sets. The results are presented in Table 2; our method achieved the best performance among all dictionary learning algorithms. The 7 outliers found by our algorithm are shown in Fig. 6; they are faces with glasses or scarves. Our algorithm identified these faces as outliers that would hurt the quality of the dictionary and did not use them in the training process.

Table 2: Average classification accuracies (%) on the AR face dataset.

Method          Accuracy (%)
Capped Norm
Traditional DL
KSVD
LC-KSVD1
LC-KSVD2
D-KSVD          88.8
ORDL            91.72

Figure 6: The 7 outliers found by our method when training the dictionary on a randomly selected training set from the AR face dataset.

4.3 Face recognition with occlusion

In order to study the robustness to outliers, we carried out experiments on the extended Yale B dataset with synthetic outliers. (We use grayscale images in these experiments.) If the samples for training the dictionary contain outliers, traditional dictionary learning will try to find a dictionary that can express these outliers. But it is of no use to express the outliers, so the quality of the dictionary is affected for traditional dictionary learning. In this experiment, we generate outliers by adding block occlusion; other ways of generating outliers lead to similar phenomena. We selected a fraction of images (the outlier ratio) from the training set and added block occlusion into these images at random positions. Some samples are shown in Fig. 7. We set the outlier ratio as 10%, 20%, 30% and 40% to obtain 4 different datasets with synthetic outliers in the training sets. No occlusion is added to the images in the testing sets. Under this setting, any effect on classification performance comes from the quality of the dictionaries.

Figure 7: Sample faces with occlusion from the extended Yale B dataset.

Similar to the experiments above, we ran experiments on 10 different datasets and the average accuracies are reported in Table 3. The classification accuracies of all methods decreased as the outlier ratio increased. Compared with the performances without block occlusion in Table 1, the performance of traditional dictionary learning dropped a lot with 10% outliers, but our method did not drop nearly as much from its 96.9% accuracy, and it also performed the best on all datasets. Hence we can conclude that our capped l1-norm loss does provide resistance to this kind of noise.

Table 3: Average classification accuracies (%) on the extended Yale B dataset with block occlusion of different levels.

Method          10%   20%   30%   40%
Capped Norm
Traditional DL
KSVD
LC-KSVD1
LC-KSVD2
D-KSVD
ORDL

The outliers found by our algorithm on a synthetic dataset with 10% corruption are shown in Fig. 8. Most of them are images that are too dark or corrupted and are difficult to recognize. In our method, the dictionaries were trained without these samples, hence they were better than dictionaries trained in other ways, as the performance comparisons in Table 3 confirm.

Figure 8: The 93 outliers found by our method when training the dictionary on a randomly selected training set with 10% of the samples corrupted by block occlusion, from the extended Yale B dataset.

5 Conclusion

In this paper, we presented a robust dictionary learning model based on the capped l1-norm and an efficient algorithm to find local solutions. One important advantage of our method is its robustness to outliers. By iteratively assigning weights to samples, our algorithm succeeds in finding outliers and reducing their effect on training the dictionaries. The proposed method was extensively evaluated on face recognition tasks on different real-world and synthetic datasets. The experimental results demonstrated that our method outperforms previous state-of-the-art dictionary learning methods.

References

[Aharon et al., 2006] Michal Aharon, Michael Elad, and Alfred Bruckstein. K-SVD: An algorithm for designing overcomplete dictionaries for sparse representation. IEEE Transactions on Signal Processing, 54(11):4311–4322, 2006.

[Elad and Aharon, 2006] Michael Elad and Michal Aharon. Image denoising via sparse and redundant representations over learned dictionaries. IEEE Transactions on Image Processing, 15(12):3736–3745, 2006.

[Georghiades et al., 2001] A.S. Georghiades, P.N. Belhumeur, and D.J. Kriegman. From few to many: Illumination cone models for face recognition under variable lighting and pose. IEEE Transactions on Pattern Analysis and Machine Intelligence, 23(6):643–660, 2001.

[Gong et al., 2013] Pinghua Gong, Jieping Ye, and Changshui Zhang. Multi-stage multi-task feature learning. The Journal of Machine Learning Research, 14(1):2979–3010, 2013.

[Grosse et al., 2007] Roger Grosse, Rajat Raina, Helen Kwong, and Andrew Y. Ng. Shift-invariant sparse coding for audio classification. In Uncertainty in Artificial Intelligence, 2007.

[Jiang et al., 2013] Zhuolin Jiang, Zhe Lin, and Larry S. Davis. Label consistent K-SVD: learning a discriminative dictionary for recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 35(11):2651–2664, 2013.

[Lee et al., 2007] Honglak Lee, Alexis Battle, Rajat Raina, and Andrew Y. Ng. Efficient sparse coding algorithms. In B. Schölkopf, J. Platt, and T. Hoffman, editors, Advances in Neural Information Processing Systems 19, pages 801–808. MIT Press, Cambridge, MA, 2007.

[Lu et al., 2013] Cewu Lu, Jiaping Shi, and Jiaya Jia. Online robust dictionary learning. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2013, pages 415–422, 2013.

[Mairal et al., 2009a] Julien Mairal, Francis Bach, Jean Ponce, and Guillermo Sapiro. Online dictionary learning for sparse coding. In Proceedings of the 26th International Conference on Machine Learning, pages 689–696, Montreal, June 2009. Omnipress.

[Mairal et al., 2009b] Julien Mairal, Francis Bach, Jean Ponce, Guillermo Sapiro, and Andrew Zisserman. Supervised dictionary learning. In D. Koller, D. Schuurmans, Y. Bengio, and L. Bottou, editors, Advances in Neural Information Processing Systems 21, pages 1033–1040, 2009.

[Mallat, 1999] Stéphane Mallat. A Wavelet Tour of Signal Processing. Academic Press, 1999.

[Martinez, 1998] Alex M. Martinez. The AR face database. CVC Technical Report, 24, 1998.

[Nie et al., 2010] Feiping Nie, Heng Huang, Xiao Cai, and Chris H. Ding. Efficient and robust feature selection via joint l2,1-norms minimization. In Advances in Neural Information Processing Systems, pages 1813–1821, 2010.

[Nie et al., 2014] Feiping Nie, Jianjun Yuan, and Heng Huang. Optimal mean robust principal component analysis. In Proceedings of the 31st International Conference on Machine Learning (ICML-14), pages 1062–1070, 2014.

[Olshausen and Field, 1997] Bruno A. Olshausen and David J. Field. Sparse coding with an overcomplete basis set: A strategy employed by V1? Vision Research, 37(23):3311–3325, 1997.

[Raina et al., 2007] Rajat Raina, Alexis Battle, Honglak Lee, Benjamin Packer, and Andrew Y. Ng. Self-taught learning: transfer learning from unlabeled data. In Proceedings of the 24th International Conference on Machine Learning, pages 759–766. ACM, 2007.

[Tosic and Frossard, 2011] I. Tosic and P. Frossard. Dictionary learning. IEEE Signal Processing Magazine, 28(2):27–38, March 2011.

[Wang et al., 2013a] Hua Wang, Feiping Nie, Weidong Cai, and Heng Huang. Semi-supervised robust dictionary learning via efficient l2,0+-norms minimization. International Conference on Computer Vision (ICCV 2013), pages 1145–1152, 2013.

[Wang et al., 2013b] Hua Wang, Feiping Nie, and Heng Huang. Robust and discriminative self-taught learning. The 30th International Conference on Machine Learning (ICML 2013), Journal of Machine Learning Research, W&CP, 28(3), 2013.

[Wang et al., 2013c] Naiyan Wang, Jingdong Wang, and Dit-Yan Yeung. Online robust non-negative dictionary learning for visual tracking. In IEEE International Conference on Computer Vision, ICCV 2013, Sydney, Australia, December 1-8, 2013, pages 657–664, 2013.

[Wright et al., 2009] J. Wright, A.Y. Yang, A. Ganesh, S.S. Sastry, and Y. Ma. Robust face recognition via sparse representation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 31(2):210–227, 2009.

[Zhang and Li, 2010] Qiang Zhang and Baoxin Li. Discriminative K-SVD for dictionary learning in face recognition. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2010, pages 2691–2698. IEEE, 2010.

[Zhang, 2010] Tong Zhang. Analysis of multi-stage convex relaxation for sparse regularization. The Journal of Machine Learning Research, 11:1081–1107, 2010.

[Zhang, 2013] Tong Zhang. Multi-stage convex relaxation for feature selection. Bernoulli, 19(5B):2277–2293, 2013.


More information

Efficient Sparsity Estimation via Marginal-Lasso Coding

Efficient Sparsity Estimation via Marginal-Lasso Coding Effcent Sparsty Estmaton va Margnal-Lasso Codng Tzu-Y Hung 1,JwenLu 2, Yap-Peng Tan 1, and Shenghua Gao 3 1 School of Electrcal and Electronc Engneerng, Nanyang Technologcal Unversty, Sngapore 2 Advanced

More information

Feature Reduction and Selection

Feature Reduction and Selection Feature Reducton and Selecton Dr. Shuang LIANG School of Software Engneerng TongJ Unversty Fall, 2012 Today s Topcs Introducton Problems of Dmensonalty Feature Reducton Statstc methods Prncpal Components

More information

New l 1 -Norm Relaxations and Optimizations for Graph Clustering

New l 1 -Norm Relaxations and Optimizations for Graph Clustering Proceedngs of the Thrteth AAAI Conference on Artfcal Intellgence (AAAI-6) New l -Norm Relaxatons and Optmzatons for Graph Clusterng Fepng Ne, Hua Wang, Cheng Deng 3, Xnbo Gao 3, Xuelong L 4, Heng Huang

More information

Determining the Optimal Bandwidth Based on Multi-criterion Fusion

Determining the Optimal Bandwidth Based on Multi-criterion Fusion Proceedngs of 01 4th Internatonal Conference on Machne Learnng and Computng IPCSIT vol. 5 (01) (01) IACSIT Press, Sngapore Determnng the Optmal Bandwdth Based on Mult-crteron Fuson Ha-L Lang 1+, Xan-Mn

More information

Weighted Sparse Image Classification Based on Low Rank Representation

Weighted Sparse Image Classification Based on Low Rank Representation Copyrght 08 Tech Scence Press CMC, vol.56, no., pp.9-05, 08 Weghted Sparse Image Classfcaton Based on Low Rank Representaton Qd Wu, Ybng L, Yun Ln, * and Ruoln Zhou Abstract: The conventonal sparse representaton-based

More information

Computer Aided Drafting, Design and Manufacturing Volume 25, Number 2, June 2015, Page 14

Computer Aided Drafting, Design and Manufacturing Volume 25, Number 2, June 2015, Page 14 Computer Aded Draftng, Desgn and Manufacturng Volume 5, Number, June 015, Page 14 CADDM Face Recognton Algorthm Fusng Monogenc Bnary Codng and Collaboratve Representaton FU Yu-xan, PENG Lang-yu College

More information

Neurocomputing 101 (2013) Contents lists available at SciVerse ScienceDirect. Neurocomputing

Neurocomputing 101 (2013) Contents lists available at SciVerse ScienceDirect. Neurocomputing Neurocomputng (23) 4 5 Contents lsts avalable at ScVerse ScenceDrect Neurocomputng journal homepage: www.elsever.com/locate/neucom Localty constraned representaton based classfcaton wth spatal pyramd patches

More information

Lecture 5: Multilayer Perceptrons

Lecture 5: Multilayer Perceptrons Lecture 5: Multlayer Perceptrons Roger Grosse 1 Introducton So far, we ve only talked about lnear models: lnear regresson and lnear bnary classfers. We noted that there are functons that can t be represented

More information

A Robust LS-SVM Regression

A Robust LS-SVM Regression PROCEEDIGS OF WORLD ACADEMY OF SCIECE, EGIEERIG AD ECHOLOGY VOLUME 7 AUGUS 5 ISS 37- A Robust LS-SVM Regresson József Valyon, and Gábor Horváth Abstract In comparson to the orgnal SVM, whch nvolves a quadratc

More information

Tsinghua University at TAC 2009: Summarizing Multi-documents by Information Distance

Tsinghua University at TAC 2009: Summarizing Multi-documents by Information Distance Tsnghua Unversty at TAC 2009: Summarzng Mult-documents by Informaton Dstance Chong Long, Mnle Huang, Xaoyan Zhu State Key Laboratory of Intellgent Technology and Systems, Tsnghua Natonal Laboratory for

More information

A Binarization Algorithm specialized on Document Images and Photos

A Binarization Algorithm specialized on Document Images and Photos A Bnarzaton Algorthm specalzed on Document mages and Photos Ergna Kavalleratou Dept. of nformaton and Communcaton Systems Engneerng Unversty of the Aegean kavalleratou@aegean.gr Abstract n ths paper, a

More information

Lecture 4: Principal components

Lecture 4: Principal components /3/6 Lecture 4: Prncpal components 3..6 Multvarate lnear regresson MLR s optmal for the estmaton data...but poor for handlng collnear data Covarance matrx s not nvertble (large condton number) Robustness

More information

Hybrid Non-Blind Color Image Watermarking

Hybrid Non-Blind Color Image Watermarking Hybrd Non-Blnd Color Image Watermarkng Ms C.N.Sujatha 1, Dr. P. Satyanarayana 2 1 Assocate Professor, Dept. of ECE, SNIST, Yamnampet, Ghatkesar Hyderabad-501301, Telangana 2 Professor, Dept. of ECE, AITS,

More information

SVM-based Learning for Multiple Model Estimation

SVM-based Learning for Multiple Model Estimation SVM-based Learnng for Multple Model Estmaton Vladmr Cherkassky and Yunqan Ma Department of Electrcal and Computer Engneerng Unversty of Mnnesota Mnneapols, MN 55455 {cherkass,myq}@ece.umn.edu Abstract:

More information

Improvement of Spatial Resolution Using BlockMatching Based Motion Estimation and Frame. Integration

Improvement of Spatial Resolution Using BlockMatching Based Motion Estimation and Frame. Integration Improvement of Spatal Resoluton Usng BlockMatchng Based Moton Estmaton and Frame Integraton Danya Suga and Takayuk Hamamoto Graduate School of Engneerng, Tokyo Unversty of Scence, 6-3-1, Nuku, Katsuska-ku,

More information

SLAM Summer School 2006 Practical 2: SLAM using Monocular Vision

SLAM Summer School 2006 Practical 2: SLAM using Monocular Vision SLAM Summer School 2006 Practcal 2: SLAM usng Monocular Vson Javer Cvera, Unversty of Zaragoza Andrew J. Davson, Imperal College London J.M.M Montel, Unversty of Zaragoza. josemar@unzar.es, jcvera@unzar.es,

More information

Face Detection with Deep Learning

Face Detection with Deep Learning Face Detecton wth Deep Learnng Yu Shen Yus122@ucsd.edu A13227146 Kuan-We Chen kuc010@ucsd.edu A99045121 Yzhou Hao y3hao@ucsd.edu A98017773 Mn Hsuan Wu mhwu@ucsd.edu A92424998 Abstract The project here

More information

12/2/2009. Announcements. Parametric / Non-parametric. Case-Based Reasoning. Nearest-Neighbor on Images. Nearest-Neighbor Classification

12/2/2009. Announcements. Parametric / Non-parametric. Case-Based Reasoning. Nearest-Neighbor on Images. Nearest-Neighbor Classification Introducton to Artfcal Intellgence V22.0472-001 Fall 2009 Lecture 24: Nearest-Neghbors & Support Vector Machnes Rob Fergus Dept of Computer Scence, Courant Insttute, NYU Sldes from Danel Yeung, John DeNero

More information

Classifier Selection Based on Data Complexity Measures *

Classifier Selection Based on Data Complexity Measures * Classfer Selecton Based on Data Complexty Measures * Edth Hernández-Reyes, J.A. Carrasco-Ochoa, and J.Fco. Martínez-Trndad Natonal Insttute for Astrophyscs, Optcs and Electroncs, Lus Enrque Erro No.1 Sta.

More information

Semi-supervised Classification Using Local and Global Regularization

Semi-supervised Classification Using Local and Global Regularization Proceedngs of the Twenty-Thrd AAAI Conference on Artfcal Intellgence (2008) Sem-supervsed Classfcaton Usng Local and Global Regularzaton Fe Wang 1, Tao L 2, Gang Wang 3, Changshu Zhang 1 1 Department of

More information

Efficient Text Classification by Weighted Proximal SVM *

Efficient Text Classification by Weighted Proximal SVM * Effcent ext Classfcaton by Weghted Proxmal SVM * Dong Zhuang 1, Benyu Zhang, Qang Yang 3, Jun Yan 4, Zheng Chen, Yng Chen 1 1 Computer Scence and Engneerng, Bejng Insttute of echnology, Bejng 100081, Chna

More information

Positive Semi-definite Programming Localization in Wireless Sensor Networks

Positive Semi-definite Programming Localization in Wireless Sensor Networks Postve Sem-defnte Programmng Localzaton n Wreless Sensor etworks Shengdong Xe 1,, Jn Wang, Aqun Hu 1, Yunl Gu, Jang Xu, 1 School of Informaton Scence and Engneerng, Southeast Unversty, 10096, anjng Computer

More information

Image Deblurring and Super-resolution by Adaptive Sparse Domain Selection and Adaptive Regularization

Image Deblurring and Super-resolution by Adaptive Sparse Domain Selection and Adaptive Regularization Image Deblurrng and Super-resoluton by Adaptve Sparse Doman Selecton and Adaptve Regularzaton Wesheng Dong a,b, Le Zhang b,1, Member, IEEE, Guangmng Sh a, Senor Member, IEEE, and Xaoln Wu c, Senor Member,

More information

Learning Semantic Visual Dictionaries: A new Method For Local Feature Encoding

Learning Semantic Visual Dictionaries: A new Method For Local Feature Encoding Learnng Semantc Vsual Dctonares: A new Method For Local Feature Encodng Bng Shua, Zhen Zuo, Gang Wang School of Electrcal and Electronc Engneerng, Nanyang Technologcal Unversty, Sngapore. Emal: {bshua001,

More information

GSLM Operations Research II Fall 13/14

GSLM Operations Research II Fall 13/14 GSLM 58 Operatons Research II Fall /4 6. Separable Programmng Consder a general NLP mn f(x) s.t. g j (x) b j j =. m. Defnton 6.. The NLP s a separable program f ts objectve functon and all constrants are

More information

High resolution 3D Tau-p transform by matching pursuit Weiping Cao* and Warren S. Ross, Shearwater GeoServices

High resolution 3D Tau-p transform by matching pursuit Weiping Cao* and Warren S. Ross, Shearwater GeoServices Hgh resoluton 3D Tau-p transform by matchng pursut Wepng Cao* and Warren S. Ross, Shearwater GeoServces Summary The 3D Tau-p transform s of vtal sgnfcance for processng sesmc data acqured wth modern wde

More information

Classifying Acoustic Transient Signals Using Artificial Intelligence

Classifying Acoustic Transient Signals Using Artificial Intelligence Classfyng Acoustc Transent Sgnals Usng Artfcal Intellgence Steve Sutton, Unversty of North Carolna At Wlmngton (suttons@charter.net) Greg Huff, Unversty of North Carolna At Wlmngton (jgh7476@uncwl.edu)

More information

CS246: Mining Massive Datasets Jure Leskovec, Stanford University

CS246: Mining Massive Datasets Jure Leskovec, Stanford University CS46: Mnng Massve Datasets Jure Leskovec, Stanford Unversty http://cs46.stanford.edu /19/013 Jure Leskovec, Stanford CS46: Mnng Massve Datasets, http://cs46.stanford.edu Perceptron: y = sgn( x Ho to fnd

More information

Feature Extraction Based on Maximum Nearest Subspace Margin Criterion

Feature Extraction Based on Maximum Nearest Subspace Margin Criterion Neural Process Lett DOI 10.7/s11063-012-9252-y Feature Extracton Based on Maxmum Nearest Subspace Margn Crteron Y Chen Zhenzhen L Zhong Jn Sprnger Scence+Busness Meda New York 2012 Abstract Based on the

More information

Edge Detection in Noisy Images Using the Support Vector Machines

Edge Detection in Noisy Images Using the Support Vector Machines Edge Detecton n Nosy Images Usng the Support Vector Machnes Hlaro Gómez-Moreno, Saturnno Maldonado-Bascón, Francsco López-Ferreras Sgnal Theory and Communcatons Department. Unversty of Alcalá Crta. Madrd-Barcelona

More information

A DCVS Reconstruction Algorithm for Mine Video Monitoring Image Based on Block Classification

A DCVS Reconstruction Algorithm for Mine Video Monitoring Image Based on Block Classification 1 3 4 5 6 7 8 9 10 11 1 13 14 15 16 17 18 19 0 1 Artcle A DCVS Reconstructon Algorthm for Mne Vdeo Montorng Image Based on Block Classfcaton Xaohu Zhao 1,, Xueru Shen 1,, *, Kuan Wang 1, and Wanme L 1,

More information

Fast Sparse Gaussian Processes Learning for Man-Made Structure Classification

Fast Sparse Gaussian Processes Learning for Man-Made Structure Classification Fast Sparse Gaussan Processes Learnng for Man-Made Structure Classfcaton Hang Zhou Insttute for Vson Systems Engneerng, Dept Elec. & Comp. Syst. Eng. PO Box 35, Monash Unversty, Clayton, VIC 3800, Australa

More information

Outline. Discriminative classifiers for image recognition. Where in the World? A nearest neighbor recognition example 4/14/2011. CS 376 Lecture 22 1

Outline. Discriminative classifiers for image recognition. Where in the World? A nearest neighbor recognition example 4/14/2011. CS 376 Lecture 22 1 4/14/011 Outlne Dscrmnatve classfers for mage recognton Wednesday, Aprl 13 Krsten Grauman UT-Austn Last tme: wndow-based generc obect detecton basc ppelne face detecton wth boostng as case study Today:

More information

An Image Compression Algorithm based on Wavelet Transform and LZW

An Image Compression Algorithm based on Wavelet Transform and LZW An Image Compresson Algorthm based on Wavelet Transform and LZW Png Luo a, Janyong Yu b School of Chongqng Unversty of Posts and Telecommuncatons, Chongqng, 400065, Chna Abstract a cylpng@63.com, b y27769864@sna.cn

More information

Adaptive Transfer Learning

Adaptive Transfer Learning Adaptve Transfer Learnng Bn Cao, Snno Jaln Pan, Yu Zhang, Dt-Yan Yeung, Qang Yang Hong Kong Unversty of Scence and Technology Clear Water Bay, Kowloon, Hong Kong {caobn,snnopan,zhangyu,dyyeung,qyang}@cse.ust.hk

More information

Sum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints

Sum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints Australan Journal of Basc and Appled Scences, 2(4): 1204-1208, 2008 ISSN 1991-8178 Sum of Lnear and Fractonal Multobjectve Programmng Problem under Fuzzy Rules Constrants 1 2 Sanjay Jan and Kalash Lachhwan

More information

Single Sample Face Recognition via Learning Deep Supervised Auto-Encoders Shenghua Gao, Yuting Zhang, Kui Jia, Jiwen Lu, Yingying Zhang

Single Sample Face Recognition via Learning Deep Supervised Auto-Encoders Shenghua Gao, Yuting Zhang, Kui Jia, Jiwen Lu, Yingying Zhang 1 Sngle Sample Face Recognton va Learnng Deep Supervsed Auto-Encoders Shenghua Gao, Yutng Zhang, Ku Ja, Jwen Lu, Yngyng Zhang Abstract Ths paper targets learnng robust mage representaton for sngle tranng

More information

Support Vector Machines. CS534 - Machine Learning

Support Vector Machines. CS534 - Machine Learning Support Vector Machnes CS534 - Machne Learnng Perceptron Revsted: Lnear Separators Bnar classfcaton can be veed as the task of separatng classes n feature space: b > 0 b 0 b < 0 f() sgn( b) Lnear Separators

More information

WIRELESS CAPSULE ENDOSCOPY IMAGE CLASSIFICATION BASED ON VECTOR SPARSE CODING.

WIRELESS CAPSULE ENDOSCOPY IMAGE CLASSIFICATION BASED ON VECTOR SPARSE CODING. WIRELESS CAPSULE ENDOSCOPY IMAGE CLASSIFICATION BASED ON VECTOR SPARSE CODING Tao Ma 1, Yuexan Zou 1 *, Zhqang Xang 1, Le L 1 and Y L 1 ADSPLAB/ELIP, School of ECE, Pekng Unversty, Shenzhen 518055, Chna

More information

An Image Fusion Approach Based on Segmentation Region

An Image Fusion Approach Based on Segmentation Region Rong Wang, L-Qun Gao, Shu Yang, Yu-Hua Cha, and Yan-Chun Lu An Image Fuson Approach Based On Segmentaton Regon An Image Fuson Approach Based on Segmentaton Regon Rong Wang, L-Qun Gao, Shu Yang 3, Yu-Hua

More information

Shape-adaptive DCT and Its Application in Region-based Image Coding

Shape-adaptive DCT and Its Application in Region-based Image Coding Internatonal Journal of Sgnal Processng, Image Processng and Pattern Recognton, pp.99-108 http://dx.do.org/10.14257/sp.2014.7.1.10 Shape-adaptve DCT and Its Applcaton n Regon-based Image Codng Yamn Zheng,

More information

FEATURE EXTRACTION. Dr. K.Vijayarekha. Associate Dean School of Electrical and Electronics Engineering SASTRA University, Thanjavur

FEATURE EXTRACTION. Dr. K.Vijayarekha. Associate Dean School of Electrical and Electronics Engineering SASTRA University, Thanjavur FEATURE EXTRACTION Dr. K.Vjayarekha Assocate Dean School of Electrcal and Electroncs Engneerng SASTRA Unversty, Thanjavur613 41 Jont Intatve of IITs and IISc Funded by MHRD Page 1 of 8 Table of Contents

More information

A New Feature of Uniformity of Image Texture Directions Coinciding with the Human Eyes Perception 1

A New Feature of Uniformity of Image Texture Directions Coinciding with the Human Eyes Perception 1 A New Feature of Unformty of Image Texture Drectons Concdng wth the Human Eyes Percepton Xng-Jan He, De-Shuang Huang, Yue Zhang, Tat-Mng Lo 2, and Mchael R. Lyu 3 Intellgent Computng Lab, Insttute of Intellgent

More information

Content Based Image Retrieval Using 2-D Discrete Wavelet with Texture Feature with Different Classifiers

Content Based Image Retrieval Using 2-D Discrete Wavelet with Texture Feature with Different Classifiers IOSR Journal of Electroncs and Communcaton Engneerng (IOSR-JECE) e-issn: 78-834,p- ISSN: 78-8735.Volume 9, Issue, Ver. IV (Mar - Apr. 04), PP 0-07 Content Based Image Retreval Usng -D Dscrete Wavelet wth

More information

Subspace clustering. Clustering. Fundamental to all clustering techniques is the choice of distance measure between data points;

Subspace clustering. Clustering. Fundamental to all clustering techniques is the choice of distance measure between data points; Subspace clusterng Clusterng Fundamental to all clusterng technques s the choce of dstance measure between data ponts; D q ( ) ( ) 2 x x = x x, j k = 1 k jk Squared Eucldean dstance Assumpton: All features

More information

Solving two-person zero-sum game by Matlab

Solving two-person zero-sum game by Matlab Appled Mechancs and Materals Onlne: 2011-02-02 ISSN: 1662-7482, Vols. 50-51, pp 262-265 do:10.4028/www.scentfc.net/amm.50-51.262 2011 Trans Tech Publcatons, Swtzerland Solvng two-person zero-sum game by

More information

U.C. Berkeley CS294: Beyond Worst-Case Analysis Handout 5 Luca Trevisan September 7, 2017

U.C. Berkeley CS294: Beyond Worst-Case Analysis Handout 5 Luca Trevisan September 7, 2017 U.C. Bereley CS294: Beyond Worst-Case Analyss Handout 5 Luca Trevsan September 7, 207 Scrbed by Haars Khan Last modfed 0/3/207 Lecture 5 In whch we study the SDP relaxaton of Max Cut n random graphs. Quc

More information

Recommended Items Rating Prediction based on RBF Neural Network Optimized by PSO Algorithm

Recommended Items Rating Prediction based on RBF Neural Network Optimized by PSO Algorithm Recommended Items Ratng Predcton based on RBF Neural Network Optmzed by PSO Algorthm Chengfang Tan, Cayn Wang, Yuln L and Xx Q Abstract In order to mtgate the data sparsty and cold-start problems of recommendaton

More information

A Modified Median Filter for the Removal of Impulse Noise Based on the Support Vector Machines

A Modified Median Filter for the Removal of Impulse Noise Based on the Support Vector Machines A Modfed Medan Flter for the Removal of Impulse Nose Based on the Support Vector Machnes H. GOMEZ-MORENO, S. MALDONADO-BASCON, F. LOPEZ-FERRERAS, M. UTRILLA- MANSO AND P. GIL-JIMENEZ Departamento de Teoría

More information

CHAPTER 3 SEQUENTIAL MINIMAL OPTIMIZATION TRAINED SUPPORT VECTOR CLASSIFIER FOR CANCER PREDICTION

CHAPTER 3 SEQUENTIAL MINIMAL OPTIMIZATION TRAINED SUPPORT VECTOR CLASSIFIER FOR CANCER PREDICTION 48 CHAPTER 3 SEQUENTIAL MINIMAL OPTIMIZATION TRAINED SUPPORT VECTOR CLASSIFIER FOR CANCER PREDICTION 3.1 INTRODUCTION The raw mcroarray data s bascally an mage wth dfferent colors ndcatng hybrdzaton (Xue

More information

Simulation: Solving Dynamic Models ABE 5646 Week 11 Chapter 2, Spring 2010

Simulation: Solving Dynamic Models ABE 5646 Week 11 Chapter 2, Spring 2010 Smulaton: Solvng Dynamc Models ABE 5646 Week Chapter 2, Sprng 200 Week Descrpton Readng Materal Mar 5- Mar 9 Evaluatng [Crop] Models Comparng a model wth data - Graphcal, errors - Measures of agreement

More information

SPARSE-CODED NET MODEL AND APPLICATIONS

SPARSE-CODED NET MODEL AND APPLICATIONS TO APPEAR IN 2016 IEEE INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING SPARSE-CODED NET MODEL AND APPLICATIONS Youngjune Gwon 1, Mram Cha 2, Wllam Campbell 1, H. T. Kung 2, Cagr Dagl 1

More information

Learning-Based Top-N Selection Query Evaluation over Relational Databases

Learning-Based Top-N Selection Query Evaluation over Relational Databases Learnng-Based Top-N Selecton Query Evaluaton over Relatonal Databases Lang Zhu *, Wey Meng ** * School of Mathematcs and Computer Scence, Hebe Unversty, Baodng, Hebe 071002, Chna, zhu@mal.hbu.edu.cn **

More information

Machine Learning. Topic 6: Clustering

Machine Learning. Topic 6: Clustering Machne Learnng Topc 6: lusterng lusterng Groupng data nto (hopefully useful) sets. Thngs on the left Thngs on the rght Applcatons of lusterng Hypothess Generaton lusters mght suggest natural groups. Hypothess

More information

On Incremental and Robust Subspace Learning

On Incremental and Robust Subspace Learning On Incremental and Robust Subspace Learnng Yongmn L, L-Qun Xu, Jason Morphett and Rchard Jacobs Content and Codng Lab, BT Exact pp1 MLB3/7, Oron Buldng, Adastral Park, Ipswch, IP5 3RE, UK Emal: Yongmn.L@bt.com

More information

Local Quaternary Patterns and Feature Local Quaternary Patterns

Local Quaternary Patterns and Feature Local Quaternary Patterns Local Quaternary Patterns and Feature Local Quaternary Patterns Jayu Gu and Chengjun Lu The Department of Computer Scence, New Jersey Insttute of Technology, Newark, NJ 0102, USA Abstract - Ths paper presents

More information

Scale and Orientation Adaptive Mean Shift Tracking

Scale and Orientation Adaptive Mean Shift Tracking Scale and Orentaton Adaptve Mean Shft Trackng Jfeng Nng, Le Zhang, Davd Zhang and Chengke Wu Abstract A scale and orentaton adaptve mean shft trackng (SOAMST) algorthm s proposed n ths paper to address

More information

Modular PCA Face Recognition Based on Weighted Average

Modular PCA Face Recognition Based on Weighted Average odern Appled Scence odular PCA Face Recognton Based on Weghted Average Chengmao Han (Correspondng author) Department of athematcs, Lny Normal Unversty Lny 76005, Chna E-mal: hanchengmao@163.com Abstract

More information

A Bilinear Model for Sparse Coding

A Bilinear Model for Sparse Coding A Blnear Model for Sparse Codng Davd B. Grmes and Rajesh P. N. Rao Department of Computer Scence and Engneerng Unversty of Washngton Seattle, WA 98195-2350, U.S.A. grmes,rao @cs.washngton.edu Abstract

More information

Meta-heuristics for Multidimensional Knapsack Problems

Meta-heuristics for Multidimensional Knapsack Problems 2012 4th Internatonal Conference on Computer Research and Development IPCSIT vol.39 (2012) (2012) IACSIT Press, Sngapore Meta-heurstcs for Multdmensonal Knapsack Problems Zhbao Man + Computer Scence Department,

More information

Wavefront Reconstructor

Wavefront Reconstructor A Dstrbuted Smplex B-Splne Based Wavefront Reconstructor Coen de Vsser and Mchel Verhaegen 14-12-201212 2012 Delft Unversty of Technology Contents Introducton Wavefront reconstructon usng Smplex B-Splnes

More information

CLASSIFICATION OF ULTRASONIC SIGNALS

CLASSIFICATION OF ULTRASONIC SIGNALS The 8 th Internatonal Conference of the Slovenan Socety for Non-Destructve Testng»Applcaton of Contemporary Non-Destructve Testng n Engneerng«September -3, 5, Portorož, Slovena, pp. 7-33 CLASSIFICATION

More information

Categories and Subject Descriptors B.7.2 [Integrated Circuits]: Design Aids Verification. General Terms Algorithms

Categories and Subject Descriptors B.7.2 [Integrated Circuits]: Design Aids Verification. General Terms Algorithms 3. Fndng Determnstc Soluton from Underdetermned Equaton: Large-Scale Performance Modelng by Least Angle Regresson Xn L ECE Department, Carnege Mellon Unversty Forbs Avenue, Pttsburgh, PA 3 xnl@ece.cmu.edu

More information

BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION

BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION SHI-LIANG SUN, HONG-LEI SHI Department of Computer Scence and Technology, East Chna Normal Unversty 500 Dongchuan Road, Shangha 200241, P. R. Chna E-MAIL: slsun@cs.ecnu.edu.cn,

More information

AS a classical problem in low level vision, image denoising. Group Sparsity Residual Constraint for Image Denoising

AS a classical problem in low level vision, image denoising. Group Sparsity Residual Constraint for Image Denoising 1 Group Sparsty Resdual Constrant for Image Denosng Zhyuan Zha, Xnggan Zhang, Qong Wang, Lan Tang and Xn Lu arxv:1703.00297v6 [cs.cv] 31 Jul 2017 Abstract Group-based sparse representaton has shown great

More information

Human Face Recognition Using Generalized. Kernel Fisher Discriminant

Human Face Recognition Using Generalized. Kernel Fisher Discriminant Human Face Recognton Usng Generalzed Kernel Fsher Dscrmnant ng-yu Sun,2 De-Shuang Huang Ln Guo. Insttute of Intellgent Machnes, Chnese Academy of Scences, P.O.ox 30, Hefe, Anhu, Chna. 2. Department of

More information

Smoothing Spline ANOVA for variable screening

Smoothing Spline ANOVA for variable screening Smoothng Splne ANOVA for varable screenng a useful tool for metamodels tranng and mult-objectve optmzaton L. Rcco, E. Rgon, A. Turco Outlne RSM Introducton Possble couplng Test case MOO MOO wth Game Theory

More information

Efficient Distributed Linear Classification Algorithms via the Alternating Direction Method of Multipliers

Efficient Distributed Linear Classification Algorithms via the Alternating Direction Method of Multipliers Effcent Dstrbuted Lnear Classfcaton Algorthms va the Alternatng Drecton Method of Multplers Caoxe Zhang Honglak Lee Kang G. Shn Department of EECS Unversty of Mchgan Ann Arbor, MI 48109, USA caoxezh@umch.edu

More information

Detection of an Object by using Principal Component Analysis

Detection of an Object by using Principal Component Analysis Detecton of an Object by usng Prncpal Component Analyss 1. G. Nagaven, 2. Dr. T. Sreenvasulu Reddy 1. M.Tech, Department of EEE, SVUCE, Trupath, Inda. 2. Assoc. Professor, Department of ECE, SVUCE, Trupath,

More information

International Conference on Applied Science and Engineering Innovation (ASEI 2015)

International Conference on Applied Science and Engineering Innovation (ASEI 2015) Internatonal Conference on Appled Scence and Engneerng Innovaton (ASEI 205) Desgn and Implementaton of Novel Agrcultural Remote Sensng Image Classfcaton Framework through Deep Neural Network and Mult-

More information

Robust and Effective Metric Learning Using Capped Trace Norm

Robust and Effective Metric Learning Using Capped Trace Norm Robust and Effectve Metrc Learnng Usng Capped Trace Norm Zhouyuan Huo Department of Computer Scence and Engneerng Unversty of Texas at Arlngton Texas, USA huozhouyuan@gmal.com Fepng Ne Department of Computer

More information

A Robust Method for Estimating the Fundamental Matrix

A Robust Method for Estimating the Fundamental Matrix Proc. VIIth Dgtal Image Computng: Technques and Applcatons, Sun C., Talbot H., Ourseln S. and Adraansen T. (Eds.), 0- Dec. 003, Sydney A Robust Method for Estmatng the Fundamental Matrx C.L. Feng and Y.S.

More information

Nonlocally Centralized Sparse Representation for Image Restoration

Nonlocally Centralized Sparse Representation for Image Restoration Nonlocally Centralzed Sparse Representaton for Image Restoraton Wesheng Dong a, Le Zhang b,1, Member, IEEE, Guangmng Sh a, Senor Member, IEEE, and Xn L c, Senor Member, IEEE a Key Laboratory of Intellgent

More information