Efficient Sparsity Estimation via Marginal-Lasso Coding


Tzu-Yi Hung1, Jiwen Lu2, Yap-Peng Tan1, and Shenghua Gao3
1 School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore
2 Advanced Digital Sciences Center, Singapore
3 ShanghaiTech University, Shanghai, China

Abstract. This paper presents a generic optimization framework for efficient feature quantization using sparse coding which can be applied to many computer vision tasks. While there are many works on sparse coding and dictionary learning, none of them exploits the advantages of marginal regression and the lasso simultaneously to provide more efficient and effective solutions. In this work, we provide such an approach with theoretical support. The computational complexity of the proposed method can be two orders of magnitude lower than that of the lasso, at the cost of some inevitable quantization error. On the other hand, the proposed method is more robust than conventional marginal regression based methods. We also provide an adaptive regularization parameter selection scheme and a dictionary learning method incorporating the proposed sparsity estimation algorithm. Experimental results and a detailed model analysis are presented to demonstrate the efficacy of the proposed methods.

Keywords: Sparsity estimation, marginal regression, sparse coding, lasso, dictionary learning, adaptive regularization parameter.

1 Introduction

Sparse coding has been successfully applied to many machine learning and computer vision tasks, including image classification [30,31,8], face recognition [28,5], activity-based person identification [22,15], etc. Sparse coding refers to a feature quantization mechanism which obtains a codebook to encode each input signal as a sparse histogram feature based on the linear combination of a few visual words. Due to its soft-assignment principle, it yields a much smaller quantization error [9,22]. Moreover, sparse coding is more robust to noise and is easily incorporated into the bag-of-features (BoF) framework.
The least absolute shrinkage and selection operator, known in short as the lasso [27], is a popular sparse coding model in statistics and signal processing which has been widely used in sparsity estimation. It simultaneously minimizes the quantization error and imposes an L1 penalty with a regularization parameter λ which controls the sparsity of the coefficients. While many efficient algorithms exist [20,7,29], finding the lasso solution remains a computationally demanding task with complexity O(p^3 + d p^2), where d refers to the input feature dimension and p refers to the number of visual
D. Fleet et al. (Eds.): ECCV 2014, Part IV, LNCS 8692. © Springer International Publishing Switzerland 2014

words in the dictionary. Thus, it can hardly support large-scale data analysis efficiently and effectively [6,23,11,1]. Recently, marginal regression has been revisited and has shown efficient performance for visual word selection and sparsity estimation [6,11,1]. For each feature, it calculates the correlation with each visual word and imposes a tuning parameter to achieve sparsity. While marginal regression is simple and fast, with complexity O(dp), it usually relies on a fixed tuning parameter which is not always a good cut-off point for each feature and can result in a large quantization error, since some coefficients may be shrunk too much while others may include too much noise. Hence, how to determine a suitable cut-off point becomes a typical problem in feature coding.

To address this, the sure independence screening (SIS) [6] method imposes an L1-norm penalty and an energy constraint E on each marginal regression coefficient vector. It sorts the coefficients by their absolute values and selects the top k coefficients whose L1 norm is bounded by E, so that each sparse code is represented at a similar sparsity level. However, this method may also cause a large quantization error: marginal regression works well when the visual words in the dictionary have low correlation [1,23], but in general they are highly correlated, and the method may thus fail to clarify the relationship between the visual words and the input feature.

While there are many works on sparse coding and dictionary learning, none of them exploits the advantages of marginal regression and the lasso simultaneously to provide more efficient and effective solutions. To this end, we provide in this paper such an approach with theoretical support. We propose an efficient sparsity estimation approach using Marginal-Lasso Coding (MLC), which quantizes each feature into a small set of visual words using marginal regression combined with the lasso framework.
Our model represents each sparse code at a similar sparsity level bounded by a global sparsity energy, and simultaneously shrinks individual coefficients to alleviate the bias of sparse codes caused by highly correlated visual words. Moreover, our approach automatically determines the shrinkage regularization parameter for each feature, and further designs a self-ratio energy constraint on the sparsity level, which differs from the traditional constraint with a case-dependent chosen value.

Contributions: 1) We propose an efficient Marginal-Lasso Coding (MLC) framework for feature quantization with sparsity estimation, regularization parameter selection and dictionary learning. 2) We exploit the advantages of marginal regression and the lasso simultaneously to provide more efficient and effective solutions. 3) We determine the regularization parameter automatically and adaptively for each individual feature and provide a self-ratio energy bound. 4) We provide theoretical support for the proposed model. 5) We successfully apply the proposed method to various recognition tasks.

2 Related Work

Our approach is related to the general sparse coding model which has been widely used for sparse representation [28,27,30,22]. Many algorithms exist to estimate sparse codes, such as least angle regression [4], gradient descent [7,29], and feature-sign

search [20]. The feature-sign search method [20,8,31] is a popular and efficient solution for sparsity estimation. It continuously selects and updates potential candidates in an active set to maintain the nonzero coefficients and their corresponding signs until the constraint is reached. However, with a large dictionary, the active set becomes large and the feature-sign search may be slow to iterate [1,23]. While these methods aim at a small quantization error, they incur a large computational cost [6,11]. Therefore, different from traditional sparsity estimation techniques which estimate sparse codes by continuously adding variables to the active set based on a search criterion, we incorporate marginal regression coefficients into the sparse coding model, which is more efficient for sparsity estimation.

Recently, Fan and Lv [6] proposed the sure independence screening (SIS) method using marginal regression to obtain sparse codes efficiently, which easily handles large-scale problems. Genovese et al. [11] provided a statistical comparison of the lasso and marginal regression. Due to its promising performance in terms of accuracy and speed, Balasubramanian et al. [1] applied marginal regression to learn sparse representations for visual tasks. While these solutions use hard-thresholding-type methods, we propose in this work a soft-thresholding-type method with theoretical support, combining marginal regression with the lasso model for sparsity estimation.

3 Efficient Sparsity Estimation via Marginal-Lasso Coding

3.1 Sparsity Estimation via Marginal Regression

Consider a regression model x = Us + z with an input feature x ∈ R^d, a coefficient vector s ∈ R^p, a column-wise normalized dictionary U = [u_1, ..., u_p] ∈ R^{d×p} (p > d) and a noise vector z ∈ R^d. The feature quantization task is to quantize an input feature x into a feature histogram s with a few non-zero elements so that the linear combination of U and s attains a minimized quantization error.
The mathematical expression is as follows:

min_s (1/2) ||x − U s||_2^2    (1)

To estimate the sparse code s, we first compute the component-wise marginal regression coefficients â:

â = U^T x    (2)

where â^(k) = u_k^T x is the kth element of â and u_k is the kth column of the dictionary U. Then, we threshold the coefficients by their absolute values using a tuning parameter t > 0 [3,11], so that the coefficients whose absolute values are larger than t are kept and the rest are set to zero:

ŝ^(k) = { â^(k)  if |â^(k)| > t;  0  otherwise }    (3)

According to Eq. 3, however, using a fixed tuning parameter may cause a large quantization error, since t is not always a good cut-off point for each feature. Instead of

cut-off point selection, we can select the top-k largest coefficients by absolute value, whose L1-norm is bounded by a constraint E [6,1]:

ŝ^(k) = { â^(k)  if k ∈ B;  0  otherwise }    (4)

where

B = { J_1, ..., J_r : r ≤ p : Σ_{k=1}^{r} |â^(J_k)| ≤ E }    (5)

and the coefficients are sorted by absolute value in descending order, denoted by indexes J_1, ..., J_p with |ŝ^(J_1)| > |ŝ^(J_2)| > ... > |ŝ^(J_p)|. Both thresholding approaches above are hard-thresholding-type methods.

3.2 Marginal-Lasso Coding

While marginal regression has shown its effectiveness, three limitations remain: (i) Marginal regression works well when the columns of the dictionary have low correlation [1,23]. However, the columns of over-complete dictionaries are usually highly correlated, and thus, in a linear regression model, their coefficients can become poorly determined and exhibit high variance. (ii) A fixed regularization parameter may cause a large quantization error: some coefficients may be shrunk too much while others may include too much noise. (iii) The existing marginal regression solutions are hard-thresholding-type methods without theoretical support.

To address this, we consider the following conditions: (1) The regularization parameter should be chosen adaptively and automatically for each feature to minimize its individual estimate of the expected quantization error. (2) The L1-norm in the lasso framework should be considered locally for visual word selection and coefficient shrinkage, to alleviate the bias of sparse codes caused by highly correlated visual words. (3) The L1-norm constraint should be bounded locally and globally for sparse representation. To this end, we introduce the Marginal-Lasso Coding (MLC) method to better characterize input features and provide a theoretically supported soft-thresholding-type solution. As in existing works, we first compute the marginal regression coefficients of each feature via Eq. 2.
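As a concrete illustration, the two hard-thresholding schemes of Section 3.1 (Eqs. 2–5) can be sketched in Python/NumPy as follows. This is an illustrative sketch, not the authors' Matlab implementation; function names are our own.

```python
import numpy as np

def marginal_coefficients(x, U):
    """Component-wise marginal regression coefficients (Eq. 2)
    for a column-normalized dictionary U of shape (d, p)."""
    return U.T @ x

def hard_threshold(a, t):
    """Eq. 3: keep coefficients with |a_k| > t, zero out the rest."""
    return np.where(np.abs(a) > t, a, 0.0)

def sis_threshold(a, E):
    """Eqs. 4-5 (SIS-style selection): keep the largest coefficients
    by absolute value whose cumulative L1 norm stays within E."""
    order = np.argsort(-np.abs(a))            # indexes J_1..J_p, descending |a|
    keep = np.cumsum(np.abs(a[order])) <= E   # longest prefix bounded by E
    s = np.zeros_like(a)
    s[order[keep]] = a[order[keep]]
    return s
```

Both functions zero out most coefficients but never shrink the survivors, which is exactly the hard-thresholding behavior the next subsection improves upon.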
Then, we consider the following optimization problem:

min_{s,λ} (1/2) ||U^T x − s||_2^2 + λ ||s||_1    subject to ||s||_1 ≤ E    (6)

The first term is the quantization error, and the second term is a local L1 penalty with an adaptive regularization parameter λ which controls the cut-off point selection and the shrinkage of the coefficients. The global sparsity constraint E bounds each sparse code to a similar sparsity level. In this model, we aim to estimate a sparse code and a

parameter λ so that the quantization error is minimized and the coefficients are shrunk simultaneously for sparse representation. For multiple features, the optimization problem becomes:

min_{s_1,...,s_N; λ_1,...,λ_N} (1/2) ||U^T X − S||_F^2 + Σ_{i=1}^{N} λ_i ||s_i||_1    subject to ||s_i||_1 ≤ E    (7)

where X = [x_1, ..., x_N] ∈ R^{d×N} is a set of features and S = [s_1, ..., s_N] ∈ R^{p×N} the set of corresponding sparse codes. Since the sparsity energy E is a case-dependent variable, we also provide a self-ratio energy e such that each feature obtains its sparse code by proportionally shrinking its marginal coefficients to the desired level. The constraint can be reformulated as:

min_{s_1,...,s_N; λ_1,...,λ_N} (1/2) ||U^T X − S||_F^2 + Σ_{i=1}^{N} λ_i ||s_i||_1    subject to ||f_{λ_i}(U^T x_i)||_1 / ||U^T x_i||_1 ≤ e    (8)

where f_λ(A) refers to the solution of Eq. 3 with â = A and t = λ. While Eq. 7 enforces a similar L1-norm across sparse codes, Eq. 8 enforces a similar self-ratio.

3.3 Sparsity Estimation

When λ is fixed, we can solve the optimization problem of Eq. 6 by rewriting the objective as:

min_s (1/2) ||â − s||_2^2 + λ ||s||_1    (9)

Minimizing Eq. 9 is equivalent to minimizing the individual element errors with their corresponding absolute values. Thus we can rewrite the objective as:

min_s (1/2) Σ_{k=1}^{p} (â^(k) − s^(k))^2 + λ |s^(k)|    (10)

Since the marginal regression coefficients are computed independently for each element, we can decompose the problem into p separate optimization tasks. For each task k, the optimization problem is:

min_{s^(k)} (â^(k) − s^(k))^2 + λ |s^(k)|    (11)

Solving this problem yields a soft-thresholding-type optimal solution for each element s^(k):

s^(k) = { â^(k) − λ  if â^(k) > λ;  â^(k) + λ  if â^(k) < −λ;  0  otherwise } = sign(â^(k)) (|â^(k)| − λ)_+    (12)
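The closed-form solution of Eq. 12, together with the adaptive choice of λ under the global bound ||s||_1 ≤ E, can be sketched as follows. Bisection is our own assumption for the search, exploiting that λ ↦ ||s(λ)||_1 is monotonically decreasing; the paper itself only states the argmax formulation.

```python
import numpy as np

def soft_threshold(a, lam):
    """Element-wise soft thresholding (Eq. 12):
    s_k = sign(a_k) * max(|a_k| - lam, 0)."""
    return np.sign(a) * np.maximum(np.abs(a) - lam, 0.0)

def adaptive_lambda(a, E, iters=50):
    """Per-feature lambda: the smallest lambda whose shrunk code
    satisfies ||s||_1 <= E, found by bisection on the monotonically
    decreasing map lambda -> ||s(lambda)||_1."""
    if np.sum(np.abs(a)) <= E:
        return 0.0                    # no shrinkage needed
    lo, hi = 0.0, float(np.max(np.abs(a)))
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if np.sum(np.abs(soft_threshold(a, mid))) > E:
            lo = mid                  # still above the bound: shrink more
        else:
            hi = mid
    return hi
```

For example, with marginal coefficients a = [2, 1] and E = 1.5, the search converges to λ ≈ 0.75, giving the code [1.25, 0.25] whose L1 norm meets the bound exactly.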

where sign(j) refers to the sign of j, and (l)_+ keeps the value when l is positive and sets it to zero when l is negative. The L1-norm, as the sum of the absolute elements, is then ||s||_1 = Σ_{k=1}^{p} (|â^(k)| − λ)_+. In Eq. 12, the regularization parameter λ controls the element selection and the shrinkage of the coefficients. If λ equals zero, then ||s||_1 = ||â||_1. If instead λ > 0, some coefficients are set directly to zero and the others are shrunk towards zero automatically, so that ||s||_1 < ||â||_1. By maximizing ||s||_1 under the global sparsity constraint E, the individual λ* is calculated as:

λ* = argmax_λ ||s||_1 = argmax_λ Σ_{k=1}^{p} (|â^(k)| − λ)_+    subject to ||s||_1 ≤ E    (13)

By doing so, we estimate each element of the sparse code as s^(k) = sign(â^(k)) (|â^(k)| − λ*)_+, which is a soft-thresholding-type solution. For a set of features, the optimization problem of Eq. 7 or 8 can be solved via Algorithm 1.

4 Marginal-Lasso Coding for Dictionary Learning

The dictionary U is a set of normalized visual words, one per column u_k. To learn the dictionary, we iteratively optimize U and S, obtaining S with a fixed U and updating U for a given S. Sparsity estimation of S via the marginal-lasso model was introduced in the previous subsection; here we introduce the dictionary learning method. Since marginal regression works well when the columns of the dictionary have low correlation [6,11,1,23], we formulate the dictionary learning problem as:

min_U Σ_{i=1}^{N} ||x_i − U s_i||_2^2 + γ ||U^T U − I||_F^2    subject to u_k^T u_k ≤ 1 ∀k    (18)

We simultaneously minimize the quantization errors in the first term and the correlations between columns in the second term, with a regularization parameter γ.
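A single projected-gradient update for this objective, following the scheme of Eqs. 19–21, might look as follows. The step size beta and the value of gamma here are illustrative assumptions; the paper selects its parameters by cross-validation.

```python
import numpy as np

def dictionary_step(U, X, S, beta=1e-3, gamma=0.1):
    """One projected gradient step for
    F(U) = sum_i ||x_i - U s_i||^2 + gamma * ||U^T U - I||_F^2,
    followed by projecting each column onto the L2 unit ball.
    U: (d, p) dictionary, X: (d, N) features, S: (p, N) codes."""
    # Gradient of the reconstruction term: 2 (U S S^T - X S^T);
    # gradient of the decorrelation term: 4 gamma (U U^T U - U).
    grad = 2.0 * (U @ S @ S.T - X @ S.T) + 4.0 * gamma * (U @ U.T @ U - U)
    U = U - beta * grad
    # Projection: rescale only the columns whose L2 norm exceeds one.
    norms = np.linalg.norm(U, axis=0)
    return U / np.maximum(norms, 1.0)
```

Iterating this step with a small beta until convergence realizes the update U_{q+1} = Π_U(U_q − β ∇F(U_q)) described next.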
This optimization problem can be solved via the first-order gradient descent method [23,1]:

U_{q+1} = Π_U ( U_q − β ∇F(U_q) )    (19)

where

F(U) = Σ_{i=1}^{N} ||x_i − U s_i||_2^2 + γ ||U^T U − I||_F^2    (20)

and

∇F(U_q) = 2 (U_q S S^T − X S^T) + 4γ (U_q U_q^T U_q − U_q)    (21)

∇F(U_q) is the gradient of F(U) with respect to U at iteration q, β is the step size, and Π_U is the projection function which maps each column u_k onto the

Algorithm 1. Efficient Sparsity Estimation via Marginal-Lasso Coding
Require: Data X = [x_1, ..., x_N] ∈ R^{d×N}, a learned dictionary U = [u_1, ..., u_p] ∈ R^{d×p}, and a value for the sparsity constraint E or the self-ratio constraint e
Ensure: Sparse coefficients S = [s_1, ..., s_N] ∈ R^{p×N}
1: For each x_i, compute the marginal regression coefficients
â_i^(k) = u_k^T x_i / ||u_k||_2^2, ∀k    (14)
2: Calculate the regularization parameter λ_i* for each x_i based on the sparsity constraint E of Eqs. 6 and 13:
λ_i* = argmax_λ ||s_i||_1 = argmax_λ Σ_{k=1}^{p} (|â_i^(k)| − λ)_+    subject to ||s_i||_1 ≤ E    (15)
or based on the self-ratio constraint e in Eq. 8:
λ_i* = argmax_λ ||s_i||_1 = argmax_λ Σ_{k=1}^{p} (|â_i^(k)| − λ)_+    subject to ||f_λ(U^T x_i)||_1 / ||U^T x_i||_1 ≤ e    (16)
3: Obtain the sparse codes
s_i^(k) = { â_i^(k) − λ_i*  if â_i^(k) > λ_i*;  â_i^(k) + λ_i*  if â_i^(k) < −λ_i*;  0  otherwise }    (17)

L2-norm unit ball. More specifically, the dictionary is learned iteratively by finding a local minimum along the gradient direction with a small step size and normalizing each column vector to length smaller than or equal to one, until convergence. The sparsity estimation and dictionary learning process is shown in Algorithm 2.

5 Experimental Results

In this section, we describe the experimental settings and analyze the proposed method on the image classification, action recognition and activity-based human identification tasks. We denote the proposed marginal-lasso coding approach by MLC, implemented in Matlab. We compare the proposed method with 5 algorithms: (1) The lasso model with feature-sign search (LASSO-FS): the code was implemented by its authors in Matlab [20]. (2) The lasso model with least angle regression (LASSO-LAR): we used the code from the SPAMS software implemented in C++ [23]. (3) The marginal regression model with the hard-thresholding method (MR-Hard): we implemented it in

Matlab according to Eq. 3. (4) The marginal regression model with the soft-thresholding method (MR-Soft): we implemented it in Matlab according to Eq. 3 with soft thresholding. (5) The marginal regression model with the sure independence screening method (MR-SIS): we implemented it in Matlab according to Eqs. 4 and 5.

Algorithm 2. Sparsity Estimation and Dictionary Learning
Require: Data X = [x_1, ..., x_N] ∈ R^{d×N}, an initial normalized dictionary U = [u_1, ..., u_p] ∈ R^{d×p}, and an energy bound E or self-ratio e
Ensure: Learned dictionary U ∈ R^{d×p}, sparse coefficients S = [s_1, ..., s_N] ∈ R^{p×N}
1: repeat
2: Obtain the sparse codes via Algorithm 1 using the latest dictionary:
min_{s_1,...,s_N; λ_1,...,λ_N} (1/2) ||U^T X − S||_F^2 + Σ_{i=1}^{N} λ_i ||s_i||_1    subject to ||s_i||_1 ≤ E    (22)
3: Update the dictionary based on the sparse codes obtained in the previous step:
min_U Σ_{i=1}^{N} ||x_i − U s_i||_2^2 + γ ||U^T U − I||_F^2    subject to u_k^T u_k ≤ 1 ∀k    (23)
4: until convergence

5.1 Image Classification

We evaluate the proposed method on the image classification task using the USPS and Scene 15 datasets. USPS is a handwritten digits dataset [14,31] consisting of 7291 training images and 2007 test images of the digits 0 to 9. We represent each sample by a 256-dimensional vector. The dictionary sizes are selected from 32, 64, 128, 256, 512 and 1024 under different training set sizes. For the penalty constraint, we adopt the self-ratio of Eq. 8, and the parameters are set empirically via cross-validation. To evaluate the performance, we train a linear SVM classifier and compare with the original regression (OR) and sparse coding (SC) methods. Table 1 shows the results under different numbers of training samples, from 100 to 5000. As shown, the proposed MLC method outperforms the other two methods in most cases and obtains the best accuracy of 95.3% with 5000 training samples. Scene 15 [24,21,19] contains 4485 images in 15 categories; each category consists of 200 to 400 images.
This dataset is very diverse, from indoor scenes such as living room, kitchen, office and store, to outdoor scenes including street, inside city, building, mountain, forest, etc. Following the same setting as previous works [19,30], we randomly select 100 images per category as training samples and use the rest as test samples, and we incorporate the proposed method into spatial pyramid matching, denoted by MLcSPM. Table 2 shows the results. As shown, our MLcSPM method outperforms the other methods, especially for the

popular lasso-based ScSPM scheme. Further, the proposed MLcSPM outperforms the other marginal regression based methods.

Table 1. Recognition accuracy (%) of the different methods (OR, SC and MLC) on the USPS dataset under training set sizes from 100 to 5000.

Table 2. Recognition accuracy (%) of the different methods on the Scene 15 dataset with 100 training images per category: KSPM [19], KC [10], ScSPM [30], HardSPM, SoftSPM, SISSPM and the proposed MLcSPM.

5.2 Action Recognition

We analyze the characteristics of the proposed method using the KTH action recognition dataset. The KTH dataset [26] contains 25 persons, each performing 6 different activities: boxing, handclapping, handwaving, jogging, running, and walking. Each activity is recorded under 4 scenarios: outdoors, outdoors with scale variation, outdoors with different clothes, and indoors, resulting in 599 video sequences in total. We follow the original experimental setup, which divides the clips into 2391 subclips, with 863 clips for the test set (9 subjects) and 1528 clips for the training set (16 subjects). We use three space-time local feature descriptors in the experiments, HOG, HOF and HOGHOF [18], obtained with a 3D Harris detector [17], and quantize each local feature into a sparse code via Algorithm 1. After that, we represent each video clip using max pooling and use a non-linear support vector machine [2] with a χ² kernel [18]. We select the parameters via cross-validation. To speed up the dictionary learning process, we first learn an initial dictionary using the online dictionary learning method [23] with 100 runs as a warm start.

Comparison between Hard-Thresholding and Soft-Thresholding: We encode each HOF local feature using a hard-thresholding technique, MR-Hard, and a soft-thresholding technique, MR-Soft, respectively, with dictionary size 4000 on the training set, as the tuning parameter t varies. Fig. 1(a) shows the average L1-norm energy of the sparse codes under different t. As shown, the soft-thresholding scheme is a more effective way to perform variable selection and coefficient shrinkage.
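The clip-level representation mentioned above (max pooling over the sparse codes of a clip's local features) can be sketched as follows; this is an illustrative helper of our own, not code from the paper.

```python
import numpy as np

def max_pool(codes):
    """Max pooling for a video clip.
    codes: (p, n) matrix whose columns are the sparse codes of the
    clip's n local features. Returns the (p,) clip descriptor taking
    the element-wise maximum of absolute responses."""
    return np.max(np.abs(codes), axis=1)
```

The pooled descriptor is what the χ²-kernel SVM consumes, one vector per clip regardless of how many local features the clip contains.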
Further, the soft-thresholding

scheme achieves a smaller standard deviation across sparse codes as t varies, as shown in Fig. 1(b). From the recognition point of view, Fig. 2 shows the recognition accuracy of MR-Soft and MR-Hard with the HOG, HOF and HOGHOF features under different t. The accuracy of MR-Soft is much better than that of MR-Hard. Therefore, we conclude that a soft-thresholding technique such as MR-Soft is much more stable and robust than a hard-thresholding technique such as MR-Hard.

Fig. 1. Comparison of the soft- and hard-thresholding schemes. (a) The average L1-norm energy of sparse codes under different t; the soft-thresholding scheme is a more effective way to perform variable selection and coefficient shrinkage. (b) The average standard deviation of the L1-norm energy across sparse codes as t changes; sparse codes obtained via the soft-thresholding scheme are much more stable than those obtained via the hard-thresholding scheme.

Energy Discussion: The sparsity constraint E in Eq. 6 needs to be assigned a value, and that value may vary across datasets. Instead, the self-ratio constraint e in Eq. 8 aims to keep a certain ratio of the marginal regression coefficients in terms of L1-norm energy. For example, a self-ratio e = 5% means that, for each feature, only 5% of its marginal regression coefficients' L1-norm energy is kept and the rest is set to zero. Thus we can obtain sparse codes more easily by assigning a desired ratio. Fig. 3 shows the recognition accuracy for different values of the self-ratio constraint e. The results show that we obtain around 90% recognition accuracy by setting the self-ratio e from 5% to 35%, and poor recognition accuracy (below 10%) by setting e from 75% to 100%. Therefore, we conclude that sparse representation helps signal interpretation.
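One way to realize the self-ratio constraint exactly is to exploit that λ ↦ Σ_k (|â^(k)| − λ)_+ is piecewise linear with breakpoints at the sorted |â^(k)|, so the λ retaining exactly a fraction e of the energy has a closed form on one segment. This exact-search strategy is our own assumption; the paper only states the constraint.

```python
import numpy as np

def lambda_for_self_ratio(a, e):
    """Exact lambda for the self-ratio bound of Eq. 8: after soft
    thresholding, the kept L1 energy equals e * ||a||_1.
    a: marginal coefficients; e: fraction in (0, 1], e.g. 0.05 for 5%."""
    b = np.sort(np.abs(a))[::-1]            # breakpoints, descending
    target = e * b.sum()
    prefix = np.cumsum(b)
    for m in range(1, len(b) + 1):
        lam = (prefix[m - 1] - target) / m  # candidate on segment m
        lo = b[m] if m < len(b) else 0.0
        if lo <= lam <= b[m - 1]:           # candidate lies on its segment
            return lam
    return 0.0
```

For instance, with a = [2, 1] and e = 0.5, the returned λ = 0.75 keeps exactly half of the original L1 energy (1.25 + 0.25 = 1.5 out of 3).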
Speed Comparison: We examine the feature quantization speed of the different sparsity estimation methods under the three feature descriptors HOG, HOF and HOGHOF. The dictionary sizes are set from 1000 to 4000. It is worth mentioning that all the methods except LASSO-LAR were implemented in Matlab, so comparing them directly with LASSO-LAR would be unfair, since the C++ implementation of LASSO-LAR has a built-in speed advantage. Nevertheless, as shown in Table 3, the marginal regression related methods still take significantly less time than the lasso solutions. In addition, MR-SIS and MLC have

comparable quantization speed, but are slower than MR-Hard and MR-Soft, since the former two consider the L1-norm energy constraint and the latter two do not.

Fig. 2. Accuracy of soft- and hard-thresholding with the HOG, HOF and HOGHOF features as t varies. The accuracy of MR-Soft is much better than that of MR-Hard under all three features.

Feature Quantization Error: Besides speed, Table 3 also shows the quantization errors. As shown, LASSO-FS and LASSO-LAR achieve the lowest quantization error. Among the marginal regression related methods, the energy-based methods (MR-SIS and MLC) achieve lower quantization error than those without the L1-norm energy constraint (MR-Hard and MR-Soft). In addition, the soft-thresholding-type methods achieve lower quantization error than the hard-thresholding-type methods pairwise, i.e. MR-Soft against MR-Hard, and MLC against MR-SIS.

Recognition Accuracy: Table 4 shows the recognition accuracy under the three features with dictionary size 4000. In this experiment, we select the sparsity constraint E via cross-validation, and compare the proposed MLC model with the other marginal regression based methods: MR-Hard, MR-Soft and MR-SIS. As shown, the proposed MLC approach obtains the best results, while MR-Hard obtains the worst and performs poorly under the HOF feature. In addition, the energy-based methods (MLC and MR-SIS) perform better than those without the energy constraint (MR-Hard and MR-Soft), and the soft-thresholding-type methods perform better than the hard-thresholding-type methods pairwise (MLC against MR-SIS, and MR-Soft against MR-Hard).

Fig. 3. Recognition accuracy under different self-ratios.

Table 3. Comparison with the existing methods on the KTH dataset, in terms of time (sec) and reconstruction error under different dictionary sizes, for the HOG, HOF and HOGHOF features with LASSO-FS [20], LASSO-LAR [4], MR-Hard, MR-Soft, MR-SIS [6] and MLC. Note that we randomly sampled 1,258 features from the training set; the actual numbers of local features in the KTH dataset are 261,946 for the training set and 136,219 for the test set.

Table 4. Comparison with the existing methods on the KTH dataset: recognition accuracy for the HOG, HOF and HOGHOF features with MR-Hard, MR-Soft, MR-SIS [6] and MLC.

Summary: While the lasso-like models (LASSO-FS and LASSO-LAR) achieve the lowest quantization error, they take more time to obtain sparse codes. In contrast, the marginal regression models without the energy constraint (MR-Hard and MR-Soft) generate sparse codes the fastest, but also incur the highest quantization errors. While MR-SIS can compete with our model in terms of quantization error, we reach better recognition accuracy than MR-SIS, as shown in Table 4, because the proposed approach is a soft-thresholding method which is more stable and robust. There are two main differences between the MR-SIS model and the proposed MLC model: 1) While the hard-thresholding idea behind MR-SIS is frequently used in applications, it has no theoretical support; we integrate marginal regression and the lasso model into a soft-thresholding scheme with theoretical support. 2) We estimate the regularization parameter for each individual input feature and attain a lower quantization error than the MR-SIS model.

5.3 Activity-Based Human Identification

The activity-based human identification task aims to recognize the identity of a person based on his/her activities. We performed the experiment on two publicly available databases, Weizmann [13] and MOBISERV-AIIA [16]. In each dataset, each clip contains one person performing one activity. For each image frame, we extract the human body silhouette in a way similar to [25,12,16,22,15]. An initial dictionary is learned via the online learning method proposed in [23] to speed up the learning process. The Weizmann dataset contains 215 clips of 9 persons performing 9 different activities; the bend activity is excluded due to lack of samples.
We randomly select one clip per action per class as training data, and use the remainder as test data. The MOBISERV-AIIA dataset contains 12 persons performing eating and drinking activities under 2 clothing scenarios on four different days. We adopt 2 kinds of activities, drinking with a cup and eating with a fork, with 776 clips in total. We randomly choose one day's sequences as test samples and use the rest as training samples.

The parameters are determined via cross-validation. Table 5 shows the recognition accuracy based on the 10% self-ratio sparsity constraint. In this application, the soft-thresholding-type methods (MLC and MR-Soft) work better than the hard-thresholding-type methods (MR-Hard and MR-SIS). In addition, MLC outperforms MR-Soft.

Table 5. Comparison with the existing methods on the Weizmann and MOBISERV-AIIA datasets: recognition accuracy for MR-Hard, MR-Soft, MR-SIS [6] and MLC.

6 Conclusion

In this paper, we have proposed a novel feature quantization approach for sparsity estimation that exploits the advantages of marginal regression and the lasso simultaneously to provide more efficient and effective solutions. The proposed approach has been evaluated on three visual applications: image classification, action recognition and activity-based human identification. Experimental results show that our method achieves excellent performance in terms of speed, quantization error and recognition accuracy, sufficiently demonstrating the efficacy of the proposed methods.

Acknowledgment. This work is supported by the research grant for the Human Cyber Security Systems (HCSS) Program at the Advanced Digital Sciences Center from the Agency for Science, Technology and Research of Singapore.

References
1. Balasubramanian, K., Yu, K., Lebanon, G.: Smooth sparse coding via marginal regression for learning sparse representations. In: ICML (2013)
2. Chang, C.C., Lin, C.J.: LIBSVM: A library for support vector machines. ACM Trans. Intell. Syst. Technol. 2(3), 27:1–27:27 (2011)
3. Donoho, D.L.: For most large underdetermined systems of linear equations the minimal L1-norm solution is also the sparsest solution. Comm. Pure Appl. Math. 59 (2004)
4. Efron, B., Hastie, T., Johnstone, I., Tibshirani, R.: Least angle regression. Annals of Statistics (2004)
5. Elhamifar, E., Vidal, R.: Robust classification using structured sparse representation. In: CVPR (2011)
6.
Fan, J., Lv, J.: Sure independence screening for ultrahigh dimensional feature space. Journal of the Royal Statistical Society: Series B (Statistical Methodology) 70(5) (2008)
7. Friedman, J., Hastie, T., Höfling, H., Tibshirani, R.: Pathwise coordinate optimization (2007)
8. Gao, S., Tsang, I., Chia, L.: Laplacian sparse coding, hypergraph Laplacian sparse coding, and applications. IEEE Transactions on Pattern Analysis and Machine Intelligence (2012)

9. Gao, S., Tsang, I.W.H., Chia, L.T., Zhao, P.: Local features are not lonely – Laplacian sparse coding for image classification. In: CVPR (2010)
10. van Gemert, J.C., Geusebroek, J.-M., Veenman, C.J., Smeulders, A.W.M.: Kernel codebooks for scene categorization. In: Forsyth, D., Torr, P., Zisserman, A. (eds.) ECCV 2008, Part III. LNCS, vol. 5304. Springer, Heidelberg (2008)
11. Genovese, C.R., Jin, J., Wasserman, L., Yao, Z.: A comparison of the lasso and marginal regression. J. Mach. Learn. Res. 13(1) (2012)
12. Gkalelis, N., Tefas, A., Pitas, I.: Human identification from human movements. In: ICIP (2009)
13. Gorelick, L., Blank, M., Shechtman, E., Irani, M., Basri, R.: Actions as space-time shapes. PAMI 29(12) (2007)
14. Hull, J.: A database for handwritten text recognition research. PAMI 16(5) (1994)
15. Hung, T.-Y., Lu, J., Tan, Y.-P.: Graph-based sparse coding and embedding for activity-based human identification. In: ICME (2013)
16. Iosifidis, A., Tefas, A., Pitas, I.: Activity based person identification using fuzzy representation and discriminant learning. TIFS 7(2) (2012)
17. Laptev, I.: On space-time interest points. International Journal of Computer Vision 64(2-3) (2005)
18. Laptev, I., Marszalek, M., Schmid, C., Rozenfeld, B.: Learning realistic human actions from movies. In: CVPR (2008)
19. Lazebnik, S., Schmid, C., Ponce, J.: Beyond bags of features: Spatial pyramid matching for recognizing natural scene categories. In: CVPR (2006)
20. Lee, H., Battle, A., Raina, R., Ng, A.Y.: Efficient sparse coding algorithms. In: Schölkopf, B., Platt, J., Hoffman, T. (eds.) Advances in Neural Information Processing Systems, vol. 19. MIT Press, Cambridge (2007)
21. Li, F.F., Perona, P.: A Bayesian hierarchical model for learning natural scene categories. In: CVPR (2005)
22. Lu, J., Hu, J., Zhou, X., Shang, Y.: Activity-based human identification using sparse coding and discriminative metric learning. In: ACM Multimedia (2012)
23.
Maral, J., Bach, F., Ponce, J., Sapro, G.: Onlne learnng for matrx factorzaton and sparse codng. J. Mach. Learn. Res. 11, (2010) 24. Olva, A., Torralba, A.: Modelng the shape of the scene: A holstc representaton of the spatal envelope. IJCV 42(3), (2001) 25. Sarkar, S., Phllps, P.J., Lu, Z., Vega, I.R., Grother, P., Bowyer, K.W.: The humanid gat challenge problem: data sets, performance, and analyss. PAMI 27(2), (2005) 26. Schuldt, C., Laptev, I., Caputo, B.: Recognzng human actons: A local svm approach. In: ICPR, pp (2004) 27. Tbshran, R.: Regresson shrnkage and selecton va the lasso. Journal of the Royal Statstcal Socety. Seres B (Methodologcal) 58(1), (1996) 28. Wrght, J., Yang, A., Ganesh, A., Sastry, S., Ma, Y.: Robust face recognton va sparse representaton. IEEE Transactons on Pattern Analyss and Machne Intellgence 31(2), (2009) 29. Wu, T., Lange, K.: Coordnate descent algorthms for lasso penalzed regresson. Annals of Appled Statstcs (2008) 30. Yang, J., Yu, K., Gong, Y., Huang, T.: Lnear spatal pyramd matchng usng sparse codng for mage classfcaton. In: IEEE Conference on Computer Vson and Pattern Recognton, CVPR 2009, pp (June 2009) 31. Zheng, M., Bu, J., Chen, C., Wang, C., Zhang, L., Qu, G., Ca, D.: Graph regularzed sparse codng for mage representaton. IEEE Transactons on Image Processng 20(5), (2011)


More information

Histogram of Template for Pedestrian Detection

Histogram of Template for Pedestrian Detection PAPER IEICE TRANS. FUNDAMENTALS/COMMUN./ELECTRON./INF. & SYST., VOL. E85-A/B/C/D, No. xx JANUARY 20xx Hstogram of Template for Pedestran Detecton Shaopeng Tang, Non Member, Satosh Goto Fellow Summary In

More information

Wavefront Reconstructor

Wavefront Reconstructor A Dstrbuted Smplex B-Splne Based Wavefront Reconstructor Coen de Vsser and Mchel Verhaegen 14-12-201212 2012 Delft Unversty of Technology Contents Introducton Wavefront reconstructon usng Smplex B-Splnes

More information

A Computer Vision System for Automated Container Code Recognition

A Computer Vision System for Automated Container Code Recognition A Computer Vson System for Automated Contaner Code Recognton Hsn-Chen Chen, Chh-Ka Chen, Fu-Yu Hsu, Yu-San Ln, Yu-Te Wu, Yung-Nen Sun * Abstract Contaner code examnaton s an essental step n the contaner

More information

Classification / Regression Support Vector Machines

Classification / Regression Support Vector Machines Classfcaton / Regresson Support Vector Machnes Jeff Howbert Introducton to Machne Learnng Wnter 04 Topcs SVM classfers for lnearly separable classes SVM classfers for non-lnearly separable classes SVM

More information

Related-Mode Attacks on CTR Encryption Mode

Related-Mode Attacks on CTR Encryption Mode Internatonal Journal of Network Securty, Vol.4, No.3, PP.282 287, May 2007 282 Related-Mode Attacks on CTR Encrypton Mode Dayn Wang, Dongda Ln, and Wenlng Wu (Correspondng author: Dayn Wang) Key Laboratory

More information

Relational Lasso An Improved Method Using the Relations among Features

Relational Lasso An Improved Method Using the Relations among Features Relatonal Lasso An Improved Method Usng the Relatons among Features Kotaro Ktagawa Kumko Tanaka-Ish Graduate School of Informaton Scence and Technology, The Unversty of Tokyo ktagawa@cl.c..u-tokyo.ac.jp

More information

X- Chart Using ANOM Approach

X- Chart Using ANOM Approach ISSN 1684-8403 Journal of Statstcs Volume 17, 010, pp. 3-3 Abstract X- Chart Usng ANOM Approach Gullapall Chakravarth 1 and Chaluvad Venkateswara Rao Control lmts for ndvdual measurements (X) chart are

More information

Machine Learning: Algorithms and Applications

Machine Learning: Algorithms and Applications 14/05/1 Machne Learnng: Algorthms and Applcatons Florano Zn Free Unversty of Bozen-Bolzano Faculty of Computer Scence Academc Year 011-01 Lecture 10: 14 May 01 Unsupervsed Learnng cont Sldes courtesy of

More information

Gender Classification using Interlaced Derivative Patterns

Gender Classification using Interlaced Derivative Patterns Gender Classfcaton usng Interlaced Dervatve Patterns Author Shobernejad, Ameneh, Gao, Yongsheng Publshed 2 Conference Ttle Proceedngs of the 2th Internatonal Conference on Pattern Recognton (ICPR 2) DOI

More information

Positive Semi-definite Programming Localization in Wireless Sensor Networks

Positive Semi-definite Programming Localization in Wireless Sensor Networks Postve Sem-defnte Programmng Localzaton n Wreless Sensor etworks Shengdong Xe 1,, Jn Wang, Aqun Hu 1, Yunl Gu, Jang Xu, 1 School of Informaton Scence and Engneerng, Southeast Unversty, 10096, anjng Computer

More information