Multi-view Clustering with Adaptively Learned Graph


Proceedings of Machine Learning Research 77:113-128, 2017  ACML 2017

Hong Tao, Chenping Hou, Jubo Zhu, Dongyun Yi (dongyun.yi@gmail.com)
College of Science, National University of Defense Technology, Changsha, Hunan, 410073, China

Editors: Yung-Kyun Noh and Min-Ling Zhang

Abstract

Multi-view clustering, which aims to improve the clustering performance by exploring the data's multiple representations, has become an important research direction. Graph based methods have been widely studied and achieve promising performance for multi-view clustering. However, most existing multi-view graph based methods perform clustering on fixed input graphs, so the results depend on the quality of the input graphs. In this paper, instead of fixing the input graphs, we propose Multi-view clustering with Adaptively Learned Graph (MALG), which learns a new common similarity matrix. In our model, we not only consider the importance of multiple graphs at the view level, but also focus on the quality of similarities within a view at the sample-pair level. Sample-pair-specific weights are introduced to exploit the connection across views in more depth. In addition, the obtained optimal graph can be partitioned into specific clusters directly, according to its connected components. Experimental results on toy and real-world datasets demonstrate the efficacy of the proposed algorithm.

Keywords: Multi-view Clustering, Graph-based Learning, Sample Pair Significance

1. Introduction

In many real-world applications, such as video surveillance and image retrieval, heterogeneous features can be obtained to represent the same instance. For example, an image can be characterized by various descriptors, such as SIFT (Lowe (2004)), histograms of oriented gradients (HOG) (Dalal and Triggs (2005)), GIST (Oliva and Torralba (2001)) and local binary patterns (LBP) (Ojala et al. (2002)); a webpage can be described by its content, the text of webpages linking to it, and the link structure of linked pages. Since these heterogeneous features summarize the objects' characteristics from distinct perspectives, they are regarded as multiple views of the data. Multi-view learning, which explores the information from different views to improve the learning performance, has become an important research direction (Xu et al. (2013); Hou et al. (2017)).

Clustering is the task of partitioning objects into meaningful groups without label information. To properly integrate the information from multiple views, many multi-view clustering approaches have been proposed (Chaudhuri et al. (2009); Kumar and Daumé (2011); Liu et al. (2013); Cai et al. (2013)). Among these methods, a number of graph based multi-view clustering algorithms have been presented with good performance. Kumar et al. (2011) extended spectral clustering to multi-view learning by co-regularizing the clustering hypotheses to make graphs from different views agree with each other.

© 2017 H. Tao, C. Hou, J. Zhu & D. Yi.

With a nonnegative constraint on the relaxed cluster assignment matrix, Cai et al. (2011) developed the multi-modal spectral clustering (MMSC) algorithm to learn a commonly shared graph Laplacian matrix. Based on the bipartite graph, Li et al. (2015) utilized local manifold fusion to integrate heterogeneous features and proposed a new large-scale multi-view spectral clustering approach (MVSC). Nie et al. (2017) proposed a multi-view clustering method with adaptive neighbors (MLAN), which performs clustering and local structure learning simultaneously.

Despite the efficacy of graph based multi-view clustering methods, there still exist some limits. On one hand, some methods conduct the subsequent procedures based on the given similarity matrices without modifying them. Thus, the clustering results are dependent on the quality of the input graphs. To solve this problem, Nie et al. (2016) proposed a constrained Laplacian rank (CLR) model to learn a new similarity matrix based on the given initial affinity matrix. By imposing a rank constraint on the corresponding Laplacian matrix, the resulting similarity matrix has exactly c connected components (c is the number of clusters) and each component corresponds to a cluster. However, CLR was developed in the single-view setting and is inapplicable to dealing with multiple graphs simultaneously. On the other hand, those methods combining different views only consider the diversity of view contributions, but ignore the quality of pairwise relationships within the same view. In practice, the noise is usually non-homogeneously distributed, so similarities of different sample pairs within the same view might also have varied capability for distinguishing the true cluster structure. Thus, sample pair significance analysis is very important for learning the detailed connection across views. Instead of treating similarities of different sample pairs equally (using the same weight for a certain view), a promising choice is to assume that different similarities usually have distinct weights according to their sample pair significance. Specifically, we introduce sample-pair-specific weights to measure the sample pair significance. How to adaptively learn these sample-pair-specific weights is crucial for finding the right clustering structure. Self-paced learning (SPL) (Kumar et al. (2010); Jiang et al. (2014)), which incorporates samples from easy to complex into the training process by adaptively assigning them weights, offers a feasible strategy for addressing this problem. There are two weighting manners in SPL, i.e., hard weighting and soft weighting. Soft weighting has been demonstrated to be more effective in previous studies (Xu et al. (2015); Zhao et al. (2015)) and also fits our setting better. Hence, we transplant the soft weighting manner of SPL into multi-view graph based clustering.

In this paper, based on CLR, we propose Multi-view clustering with Adaptively Learned Graph (MALG), considering both l1-norm and l2-norm distances. Specifically, given initial affinity matrices from multiple views, we attempt to learn a new common similarity matrix with explicit clustering structure by forcing a rank constraint on the corresponding Laplacian matrix. Meanwhile, both the quality diversity of multiple graphs and the sample pair significance within a certain graph are taken into consideration by introducing the sample-pair-specific weights. Based on the above two aspects, the resulting model is able to adaptively learn a new common similarity matrix and exploit the connection between different views in more depth.
There are several benefits of our approach: (1) the proposed approach modifies the similarity matrix during each iteration until reaching the optimal one, and the output similarity matrix explicitly gives the clustering result; (2) sample-pair-specific weights are first employed to measure the quality of similarities of different sample pairs for multi-view graph based clustering; (3) the proposed objective functions can be solved by a simple yet effective algorithm; (4) the clustering performance of the proposed method is examined on toy and real-world datasets, and our approach outperforms other related methods in most cases.

2. Related Works

We first introduce the notations used in this paper. Matrices and vectors are written as boldface uppercase letters and boldface lowercase letters, respectively. For a matrix M, the i-th row and the (i, j)-th element of M are denoted as m_i and m_{ij}, respectively. The trace of matrix M is denoted by Tr(M). The transpose of matrix M is denoted as M^T. The l1-norm and l2-norm of a vector v are denoted by \|v\|_1 and \|v\|_2, and the Frobenius norm and the l1-norm of a matrix M are represented as \|M\|_F and \|M\|_1, respectively. 1 denotes a column vector with all ones, and the identity matrix is denoted by I.

2.1. CLR Revisited

Given an initial similarity matrix A \in R^{n \times n}, CLR aims to learn a new similarity matrix S \in R^{n \times n} with exactly c connected components, where n is the number of data points and c is the number of clusters. The Laplacian matrix related to S is defined as L_S = D_S - (S^T + S)/2, where D_S is a diagonal matrix whose i-th diagonal element is \sum_j (s_{ij} + s_{ji})/2. There is an important property of the Laplacian matrix as follows (Chung (1997)).

Theorem 1 The multiplicity c of the eigenvalue 0 of the Laplacian matrix L_S is equal to the number of connected components in the graph associated with S.

Based on this observation, Nie et al. (2016) constrained the rank of L_S to be n - c, and proposed the following CLR models for graph based clustering with the l1-norm and l2-norm distances:

J_{CLR,\ell_1} = \min_{\sum_j s_{ij}=1,\ s_{ij}\ge 0,\ \mathrm{rank}(L_S)=n-c} \|S - A\|_1,   (1)

J_{CLR,\ell_2} = \min_{\sum_j s_{ij}=1,\ s_{ij}\ge 0,\ \mathrm{rank}(L_S)=n-c} \|S - A\|_F^2.   (2)

It has been shown that CLR achieves superior performance for clustering (Nie et al. (2016)). Note that the CLR method only works for single-view data. It cannot deal with graphs from multiple views simultaneously.

2.2. SPL

Motivated by the learning process of humans, Bengio et al. (2009) proposed curriculum learning to learn a model by gradually including samples into training from easy to complex. Instead of using a predetermined curriculum, Kumar et al. (2010) derived the SPL framework to learn the curriculum and the model simultaneously. The general SPL model includes a weighted loss term on all samples and a regularizer term imposed on the weights:

\min_{\Theta,\ w \in [0,1]^n} \sum_{i=1}^{n} w_i\, l_i(\Theta) + f(w; \lambda),   (3)

where l_i(\Theta) denotes the loss function of the i-th sample, \Theta represents the model parameters, w = [w_1, ..., w_n]^T denotes the weight variables reflecting the samples' importance, f(w; \lambda) is the self-paced regularizer, and \lambda is a parameter. There are two kinds of regularization in SPL: one assigns binary weights to samples (hard regularization), and the other allocates real-valued weights (soft regularization). In real-world applications, the noise contained in data is usually non-homogeneously distributed. Under this circumstance, it has been shown that soft weighting is more effective (Zhao et al. (2015); Xu et al. (2015)).
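Before moving on to the proposed model, it may help to see Theorem 1 in action. The following is a small numerical check (my own sketch, not code from the paper; the NumPy/SciPy usage and variable names are assumptions): it forms L_S from a similarity matrix S and verifies that the multiplicity of the eigenvalue 0 equals the number of connected components of the associated graph.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

def laplacian(S):
    """L_S = D_S - (S + S^T)/2, where D_S holds the row sums of (S + S^T)/2."""
    W = (S + S.T) / 2.0
    return np.diag(W.sum(axis=1)) - W

def zero_eigenvalue_multiplicity(S, tol=1e-10):
    """Multiplicity of the eigenvalue 0 of L_S (cf. Theorem 1)."""
    return int(np.sum(np.linalg.eigvalsh(laplacian(S)) < tol))

# Block-diagonal toy similarity matrix with two connected components.
S = np.array([[0.0, 0.9, 0.0, 0.0],
              [0.9, 0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, 0.8],
              [0.0, 0.0, 0.8, 0.0]])
k, labels = connected_components(csr_matrix(S), directed=False)
assert zero_eigenvalue_multiplicity(S) == k == 2
```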

3. Multi-view Clustering with Adaptively Learned Graph

3.1. The Proposed Formulation

Let A^{(v)} \in R^{n \times n} denote the similarity matrix from the v-th view (v = 1, ..., V). In multi-view clustering, the clustering results of different views should be consistent. We assume there is a commonly shared similarity matrix S which has exactly c connected components. Thus, in the multi-view setting, Eq. (1) and Eq. (2) can be extended to the following formulations:

\min_{\sum_j s_{ij}=1,\ s_{ij}\ge 0,\ \mathrm{rank}(L_S)=n-c} \sum_{v=1}^{V} \|S - A^{(v)}\|_1,   (4)

\min_{\sum_j s_{ij}=1,\ s_{ij}\ge 0,\ \mathrm{rank}(L_S)=n-c} \sum_{v=1}^{V} \|S - A^{(v)}\|_F^2.   (5)

As different views may have distinct physical meanings, treating multiple graphs equally often makes it difficult to find the optimal solution. What is more, the above two formulations fail to measure the differences in individual sample pairs within and across views. In order to bridge this gap, we introduce sample-pair-specific weights w_{ij}^{(v)} into the multi-view graph based clustering model. Considering both l1-norm and l2-norm distances, the objective functions of MALG are formulated as follows:

\min_{S,\ \{W^{(v)}\}_{v=1}^{V}} \sum_{v=1}^{V} \left( \|W^{(v)} \odot (S - A^{(v)})\|_1 + f(W^{(v)}; \lambda) \right)
s.t. \sum_j s_{ij} = 1,\ s_{ij} \ge 0,\ \mathrm{rank}(L_S) = n - c,\ W^{(v)} \in [0,1]^{n \times n},\ 1 \le v \le V,   (6)

\min_{S,\ \{W^{(v)}\}_{v=1}^{V}} \sum_{v=1}^{V} \left( \|\sqrt{W^{(v)}} \odot (S - A^{(v)})\|_F^2 + f(W^{(v)}; \lambda) \right)
s.t. \sum_j s_{ij} = 1,\ s_{ij} \ge 0,\ \mathrm{rank}(L_S) = n - c,\ W^{(v)} \in [0,1]^{n \times n},\ 1 \le v \le V,   (7)

where W^{(v)} = (w_{ij}^{(v)})_{1 \le i,j \le n} is composed of the weights of the n \times n elements in the v-th view, \sqrt{W^{(v)}} denotes the element-wise square root of W^{(v)} (i.e., the so-called Hadamard square root), \odot is the element-wise (Hadamard) product of matrices, f(W^{(v)}; \lambda) denotes the regularizer imposed on the weights, and \lambda is a parameter. The first term in Eq. (6) or (7) is the weighted loss between the learned similarities and their input values across different views. Weights of high-quality similarities are encouraged to be larger in order to obtain small losses.
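To fix ideas, the data term of Eqs. (6) and (7) can be evaluated as in the following sketch (my own illustration, not the authors' code; function and variable names are assumptions):

```python
import numpy as np

def malg_data_term(S, A_views, W_views, norm="l1"):
    """Weighted multi-view fitting loss of Eqs. (6)/(7).

    norm="l1": sum_v || W^(v) * (S - A^(v)) ||_1
    norm="l2": sum_v || sqrt(W^(v)) * (S - A^(v)) ||_F^2
    where * denotes the element-wise (Hadamard) product.
    """
    total = 0.0
    for A, W in zip(A_views, W_views):
        R = S - A
        if norm == "l1":
            total += np.abs(W * R).sum()
        else:
            total += (W * R ** 2).sum()   # equals ||sqrt(W) Hadamard R||_F^2
    return total
```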

In order to learn the weights w_{ij}^{(v)} automatically during the optimization process, following Xu et al. (2015), the regularization term f(W^{(v)}; \lambda) is set as the sum over all elements of

f(w_{ij}^{(v)}; \lambda) = \left(1 + e^{-\lambda} - w_{ij}^{(v)}\right) \ln\left(1 + e^{-\lambda} - w_{ij}^{(v)}\right) + w_{ij}^{(v)} \ln w_{ij}^{(v)} - \lambda\, w_{ij}^{(v)}.   (8)

The optimal weight of the (i, j)-th element of the similarity matrix in the v-th view can be obtained by solving

\min_{w_{ij}^{(v)} \in [0,1]} w_{ij}^{(v)} l_{ij}^{(v)} + f(w_{ij}^{(v)}; \lambda),   (9)

where l_{ij}^{(v)} represents |s_{ij} - a_{ij}^{(v)}| or (s_{ij} - a_{ij}^{(v)})^2 for the convenience of notation. Setting the derivative with respect to w_{ij}^{(v)} to zero, we have

w_{ij}^{(v)} = \frac{1 + e^{-\lambda}}{1 + e^{\,l_{ij}^{(v)} - \lambda}}.   (10)

It is worth noting that Eq. (10) can be regarded as an adapted logistic function, which inherits the merits of the logistic function and provides probabilistic weights. By combining Eqs. (6)-(7) with (8), we obtain the resulting objective functions. The learned common similarity matrix S can be used for clustering directly, according to Tarjan's strongly connected components algorithm (Tarjan (1972)).
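As a small illustration of Eq. (10) (my own sketch; names are assumptions), the weight is a shifted logistic function of the loss: pairs with loss well below \lambda receive weights close to 1, while pairs with loss well above \lambda are suppressed towards 0.

```python
import numpy as np

def adaptive_weights(S, A, lam, norm="l1"):
    """Sample-pair-specific weights of Eq. (10) for one view.

    S, A : (n, n) learned and initial similarity matrices.
    lam  : the self-paced parameter lambda.
    The loss l_ij is |s_ij - a_ij| (l1) or (s_ij - a_ij)^2 (l2), and
    w_ij = (1 + exp(-lam)) / (1 + exp(l_ij - lam)).
    """
    L = np.abs(S - A) if norm == "l1" else (S - A) ** 2
    return (1.0 + np.exp(-lam)) / (1.0 + np.exp(L - lam))
```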

3.2. Optimization

Denote \sigma_i(L_S) as the i-th smallest eigenvalue of L_S. Since L_S is positive semi-definite, we have \sigma_i(L_S) \ge 0. So the constraint rank(L_S) = n - c will be ensured if \sum_{i=1}^{c} \sigma_i(L_S) = 0. According to Ky Fan's Theorem (Fan (1949)), we have

\sum_{i=1}^{c} \sigma_i(L_S) = \min_{F \in R^{n \times c},\ F^T F = I} \mathrm{Tr}(F^T L_S F).   (11)

Hence, with a large enough \gamma, problems (6) and (7) are equivalent to the following problems:

\min_{S,\ F,\ \{W^{(v)}\}_{v=1}^{V}} \sum_{v=1}^{V} \left( \|W^{(v)} \odot (S - A^{(v)})\|_1 + f(W^{(v)}; \lambda) \right) + 2\gamma\, \mathrm{Tr}(F^T L_S F)
s.t. \sum_j s_{ij} = 1,\ s_{ij} \ge 0,\ F^T F = I,\ F \in R^{n \times c},\ W^{(v)} \in [0,1]^{n \times n},\ 1 \le v \le V,   (12)

\min_{S,\ F,\ \{W^{(v)}\}_{v=1}^{V}} \sum_{v=1}^{V} \left( \|\sqrt{W^{(v)}} \odot (S - A^{(v)})\|_F^2 + f(W^{(v)}; \lambda) \right) + 2\gamma\, \mathrm{Tr}(F^T L_S F)
s.t. \sum_j s_{ij} = 1,\ s_{ij} \ge 0,\ F^T F = I,\ F \in R^{n \times c},\ W^{(v)} \in [0,1]^{n \times n},\ 1 \le v \le V.   (13)

The optimal solution to these two problems makes \sum_{i=1}^{c} \sigma_i(L_S) = 0 hold. Problems (12) and (13) can be solved in an alternating fashion, i.e., by dividing the variables into disjoint blocks and alternately optimizing one block with the others fixed.

Fix S and F, update {W^{(v)}}_{v=1}^{V}. When S and F are fixed, W^{(v)} is optimized by solving the following problem,

\min_{W^{(v)}} \sum_{1 \le i,j \le n} \left\{ w_{ij}^{(v)} l_{ij}^{(v)} + f(w_{ij}^{(v)}; \lambda) \right\},   (14)

where l_{ij}^{(v)} = |s_{ij} - a_{ij}^{(v)}| for problem (12) or l_{ij}^{(v)} = (s_{ij} - a_{ij}^{(v)})^2 for problem (13), with the currently obtained S. This optimization is separable with respect to each w_{ij}^{(v)}, and thus can be easily solved by Eq. (10).

Fix S and {W^{(v)}}_{v=1}^{V}, update F. When S and {W^{(v)}}_{v=1}^{V} are fixed, problems (12) and (13) are transformed into

\min_{F \in R^{n \times c},\ F^T F = I} \mathrm{Tr}(F^T L_S F).   (15)

The optimal solution F is formed by the c eigenvectors that correspond to the c smallest eigenvalues of L_S.
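The F-update of Eq. (15) is a plain eigen-decomposition. A minimal sketch (my own, with NumPy assumed):

```python
import numpy as np

def update_F(S, c):
    """Solve Eq. (15): the c eigenvectors of L_S with the smallest eigenvalues."""
    W = (S + S.T) / 2.0
    L = np.diag(W.sum(axis=1)) - W        # L_S = D_S - (S + S^T)/2
    eigvals, eigvecs = np.linalg.eigh(L)  # eigenvalues in ascending order
    return eigvecs[:, :c]                 # orthonormal columns, so F^T F = I
```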

Fix {W^{(v)}}_{v=1}^{V} and F, update S. Respectively, problems (12) and (13) become

\min_{\sum_j s_{ij}=1,\ s_{ij}\ge 0} \sum_{v=1}^{V} \sum_{i,j} w_{ij}^{(v)} |s_{ij} - a_{ij}^{(v)}| + \gamma \sum_{i,j} \|f_i - f_j\|_2^2\, s_{ij},   (16)

\min_{\sum_j s_{ij}=1,\ s_{ij}\ge 0} \sum_{v=1}^{V} \sum_{i,j} w_{ij}^{(v)} (s_{ij} - a_{ij}^{(v)})^2 + \gamma \sum_{i,j} \|f_i - f_j\|_2^2\, s_{ij},   (17)

where f_i denotes the i-th row (i = 1, ..., n) of F. As there is no dependence between different rows s_i, the above two problems can be solved for each individual i:

\min_{\sum_j s_{ij}=1,\ s_{ij}\ge 0} \sum_{v=1}^{V} \sum_j w_{ij}^{(v)} |s_{ij} - a_{ij}^{(v)}| + \gamma \sum_j \|f_i - f_j\|_2^2\, s_{ij},   (18)

\min_{\sum_j s_{ij}=1,\ s_{ij}\ge 0} \sum_{v=1}^{V} \sum_j w_{ij}^{(v)} (s_{ij} - a_{ij}^{(v)})^2 + \gamma \sum_j \|f_i - f_j\|_2^2\, s_{ij}.   (19)

Denote b_{ij} = \|f_i - f_j\|_2^2, and denote b_i as the vector whose j-th element is b_{ij} (and similarly for s_i, a_i^{(v)} and w_i^{(v)}). Then problems (18) and (19) can be written in vector form as

\min_{s_i^T 1 = 1,\ s_i \ge 0} \sum_{v=1}^{V} \|\mathrm{diag}(w_i^{(v)})(s_i - a_i^{(v)})\|_1 + \gamma\, s_i^T b_i,   (20)

\min_{s_i^T 1 = 1,\ s_i \ge 0} \sum_{v=1}^{V} \|\mathrm{diag}(\sqrt{w_i^{(v)}})(s_i - a_i^{(v)})\|_F^2 + \gamma\, s_i^T b_i,   (21)

where diag(x) returns a diagonal matrix with the elements of x on the main diagonal. Using the iterative re-weighting method (Nie et al. (2016)), problem (20) can be addressed by solving the following problem iteratively:

\min_{s_i^T 1 = 1,\ s_i \ge 0} \sum_{v=1}^{V} (s_i - a_i^{(v)})^T U_i^{(v)} (s_i - a_i^{(v)}) + \gamma\, s_i^T b_i,   (22)

where U_i^{(v)} is a diagonal matrix whose j-th diagonal element is

u_{jj}^{(v)} = \frac{w_{ij}^{(v)}}{2\,|\tilde{s}_{ij} - a_{ij}^{(v)}|},   (23)

and \tilde{s}_i is the current solution. For problem (21) with the l2-norm distance, by denoting U_i^{(v)} = diag(w_i^{(v)}), it can also be unified in the form of problem (22).

Now we focus on solving problem (22). Let

p_i = \sum_{v=1}^{V} U_i^{(v)} a_i^{(v)} - \frac{\gamma}{2} b_i, \qquad U_i = \sum_{v=1}^{V} U_i^{(v)},   (24)-(25)

then problem (22) can be further simplified as

\min_{s_i^T 1 = 1,\ s_i \ge 0} \ \frac{1}{2} s_i^T U_i s_i - s_i^T p_i.   (26)

The Lagrangian function of problem (26) is

L(s_i, \eta, \beta_i) = \frac{1}{2} s_i^T U_i s_i - s_i^T p_i - \eta (s_i^T 1 - 1) - \beta_i^T s_i,   (27)

where \eta and \beta_i \ge 0 are the Lagrangian multipliers. According to the KKT conditions, the optimal solution of s_{ij} is

s_{ij} = \left( \frac{\eta + p_{ij}}{u_{jj}} \right)_+,   (28)

where (x)_+ = max(0, x). Define the following function

g_i(x) = \sum_j \left( \frac{x + p_{ij}}{u_{jj}} \right)_+ - 1.   (29)

Then, according to Eqs. (28) and (29) and the constraint s_i^T 1 = 1, it holds that

g_i(\eta) = 0.   (30)

That is to say, the value of \eta is the root of the function g_i(x). Note that g_i(x) is a piecewise linear and monotonically increasing function, so its root can easily be found with the Newton method. Once \eta is solved, the optimal solution to problem (26) can be obtained by Eq. (28). As shown in Eq. (23), the U_i^{(v)} defined for problem (20) depends on S and is thus also an unknown variable. With alternately updated U_i^{(v)} and s_i, problem (20) can finally be solved. It has been proved that this iterative method decreases the objective of problem (20) in each iteration and converges to the optimal solution (Tao et al. (2016)). For problem (21), it can be resolved directly, since there is no dependence between U_i^{(v)} and S.
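For concreteness, the row subproblem (26) can be solved as in the sketch below (my own illustration, not the authors' code; u holds the diagonal of U_i and is assumed strictly positive, and p is p_i from Eqs. (24)-(25)). It finds the root of the piecewise-linear g_i of Eqs. (29)-(30) with Newton steps and recovers s_i via Eq. (28).

```python
import numpy as np

def solve_row_subproblem(u, p, max_iter=100, tol=1e-12):
    """Solve Eq. (26): min_{s >= 0, s^T 1 = 1} 0.5 * s^T diag(u) s - s^T p.

    By Eq. (28), s_j = max(0, (eta + p_j) / u_j), where eta is the root of
    g(eta) = sum_j max(0, (eta + p_j) / u_j) - 1 (Eqs. (29)-(30)).
    Starting from a point where g(eta) >= 0, Newton steps on this convex,
    piecewise-linear, increasing function decrease eta monotonically to the root.
    """
    eta = np.max(u - p)                    # guarantees g(eta) >= 0 initially
    for _ in range(max_iter):
        s = np.maximum((eta + p) / u, 0.0)
        g = s.sum() - 1.0                  # g(eta)
        if abs(g) < tol:
            break
        slope = np.sum(1.0 / u[s > 0])     # derivative of g over the active set
        eta -= g / slope                   # Newton step
    return np.maximum((eta + p) / u, 0.0)
```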

The overall procedure is described in Algorithm 1.

Algorithm 1 Multi-view Clustering with Adaptively Learned Graph
Input: The similarity matrices for V views {A^{(1)}, ..., A^{(V)}}, A^{(v)} \in R^{n \times n}; the number of clusters c; a large enough \gamma; the regularization parameter \lambda; an initial similarity matrix S_0 \in R^{n \times n}.
Initialization: calculate {l_{ij}^{(v)}}_{1 \le i,j \le n}, v = 1, ..., V; set t = 0.
Repeat
1. Update W_t^{(v)} = (w_{ij,t}^{(v)}) according to Eq. (10).
2. Update each row of S_t by solving problem (20) or (21).
3. Update F_t by solving problem (15).
4. Compute the current {l_{ij}^{(v)}}_{1 \le i,j \le n}, v = 1, ..., V.
5. t = t + 1.
Until convergence.
Output: The similarity matrix S \in R^{n \times n} with exactly c connected components.
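To make Algorithm 1 concrete, here is a compact sketch of its main loop for the l2-norm variant (my own illustration under simplifying assumptions: a fixed number of iterations instead of a convergence test, a simple average initialization of S, and the gamma-adjustment heuristic described later in the experiment setup; adaptive_weights, update_F and solve_row_subproblem refer to the sketches given above, so this block is not the authors' implementation).

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

def malg_l2(A_views, c, lam=1.0, gamma=8.0, n_iter=30):
    """Sketch of the MALG alternating loop (Algorithm 1, l2-norm distance)."""
    n = A_views[0].shape[0]
    S = sum(A_views) / len(A_views)                  # simple initialization of S
    for _ in range(n_iter):
        # Step 1: sample-pair-specific weights, Eq. (10).
        W_views = [adaptive_weights(S, A, lam, norm="l2") for A in A_views]
        # Spectral embedding F, Eq. (15) (computed before the S-update here).
        F = update_F(S, c)
        B = ((F[:, None, :] - F[None, :, :]) ** 2).sum(axis=2)  # b_ij = ||f_i - f_j||^2
        # Step 2: row-wise update of S via problem (26); for the l2 case
        # U_i = diag(sum_v w_i^(v)) and p_i = sum_v w_i^(v)*a_i^(v) - (gamma/2) b_i.
        for i in range(n):
            u = sum(W[i] for W in W_views)
            p = sum(W[i] * A[i] for W, A in zip(W_views, A_views)) - 0.5 * gamma * B[i]
            S[i] = solve_row_subproblem(np.maximum(u, 1e-12), p)
        # Heuristic adjustment of gamma (see the experiment setup).
        k, _ = connected_components(csr_matrix(S), directed=False)
        if k > c:
            gamma /= 4.0
        elif k < c:
            gamma *= 4.0
    return S   # cluster labels follow from the connected components of S
```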

4. Experiment

In this section, we evaluate MALG on both synthetic and real-world datasets. For simplicity, we denote MALG with the l1-norm and l2-norm distances as MALG(l1) and MALG(l2), respectively.

4.1. Toy Example

We construct a synthetic dataset which contains three graphs sharing the same nodes (vertices) but with multiple types of interactions. This synthetic dataset has 3 clusters, and each cluster has 50 members. Fig. 1(a)-1(c) display the initial similarity matrices from the three graphs. On each view, two of the three clusters entangle with each other. The affinity data within each block are randomly generated in the range of 0 to 1, while the noise data are randomly generated in the range of 0 to 0.8. Moreover, to make this clustering task more challenging, we randomly choose m noise entries and set their values to 1, with m increasing from View 1 to View 3.

Note that when partitioning data points into groups according to the connected components of a graph, we merely care whether an element of the similarity matrix is zero or not and ignore its detailed value. Thus, we use the adjacency matrix, which only contains binary values (0 or 1), to represent the pairwise relationships learned by the related methods. We run all methods using the l1-norm distances. Fig. 1(d)-1(f) show the pairwise relationships learned by performing CLR on each view. It can be observed that single-view CLR fails to find the true clustering structure.

To validate the efficacy of the sample-pair-specific weights in MALG, we compare our method with the combination of multiple graphs without weights (denoted as MvCLR-naive), as shown in Eq. (4), as well as with view-weighting CLR (View-CLR for short). The objective of view-weighting CLR with the l1-norm distance is formulated as

\min_{S,\ \alpha} \sum_{v=1}^{V} \alpha^{(v)} \|S - A^{(v)}\|_1
s.t. \sum_j s_{ij} = 1,\ s_{ij} \ge 0,\ \mathrm{rank}(L_S) = n - c,\ \sum_{v=1}^{V} \alpha^{(v)} = 1,\ \alpha^{(v)} > 0,   (31)

where \alpha^{(v)} is the weight of the v-th view. The learned pairwise relationships of MvCLR-naive, View-CLR and MALG are presented in Fig. 1(g), 1(h) and 1(i), respectively. It can be seen that the performances of both MvCLR-naive and View-CLR are affected by the noise. The distortion of the pairwise relationships for MvCLR-naive and View-CLR is mainly brought by View 3, since it has the most noise data. In comparison, our proposed MALG eliminates the impact of the noise and obtains the ideal structure for clustering.

4.2. Multi-view Clustering Comparison on Real-world Datasets

In this subsection, we evaluate our approach on five real-world datasets. Since the proposed method is a kind of graph based learning model, we compare it with other related graph based multi-view clustering algorithms. In particular, we focus on comparing with the following algorithms:

Best single-view CLR (BestCLR) (Nie et al. (2016)). On each view, the single-view CLR algorithm is performed with both l1-norm and l2-norm distances. The best results are reported.

Co-regularized multi-view spectral clustering (CoRegSC) (Kumar et al. (2011)). This method co-regularizes the clustering hypotheses from multiple views to enforce each view to have the same cluster membership. We implement the centroid-based co-regularization approach.

Co-trained multi-view spectral clustering (CoTranSC) (Kumar and Daumé (2011)). CoTranSC utilizes the spectral embedding from one view to constrain the affinity graph used for the other views. By iteratively applying this approach, the clusterings of multiple views tend to become the same. As there is no specific stopping criterion, we run 20 iterations.

Multi-modal spectral clustering (MMSC) (Cai et al. (2011)). MMSC learns a commonly shared graph Laplacian matrix by minimizing the differences between the common Laplacian graph and that of each view.

Figure 1: Visualization of the toy example. (a)-(c) The generated synthetic input similarity matrices of the three views. (d)-(f) The pairwise relationships learned by CLR on each view. (g) The pairwise relationships learned by MvCLR-naive. (h) The pairwise relationships learned by View-CLR. (i) The pairwise relationships learned by our MALG. Note that we use the adjacency matrix, which contains binary values (0 or 1), to represent the pairwise relationships of the samples.

Multi-view spectral clustering (MVSC) (Li et al. (2015)). Local manifold fusion is adopted to integrate heterogeneous features based on the bipartite graph.

Multi-view clustering with adaptive neighbors (MLAN) (Nie et al. (2017)). MLAN performs clustering and local structure learning simultaneously, and is also based on the property of the Laplacian matrix illustrated in Theorem 1.

The implementations of CLR, CoRegSC, CoTranSC, MMSC and MLAN are downloaded from their authors' homepages.

4.2.1. Dataset Descriptions

The detailed descriptions of the datasets are presented as follows and their statistics are summarized in Table 1.

3sources is a multi-view text dataset, collected from three online news sources: BBC, Reuters, and The Guardian. We use the 169 stories that were reported in all three sources. Each source corresponds to a view. According to the primary section headings used across the three news sources, each story was manually annotated with one of six topical labels: business, entertainment, health, politics, sport, and technology. Since the original dimension of each view is high (all above 3,000), we first perform PCA on each view and reduce the dimension to 100 for each view.

MSRC-v1 contains 240 images in 8 classes. Following Cai et al. (2011), we select 7 classes composed of tree, building, airplane, cow, face, car, and bicycle, and each class has 30 images. We extract five visual features from each image: color moment (CM) with dimension 48, local binary pattern (LBP) with dimension 256, HOG with dimension 100, GIST with dimension 512, and the CENTRIST feature (Wu and Rehg (2011)) with dimension 1302.

Caltech101 is an object recognition dataset consisting of 101 categories of images. We follow previous work (Li et al. (2015)) and select the widely used 7 classes, i.e., Dollar-Bill, Face, Garfield, Motorbikes, Snoopy, Stop-Sign and Windsor-Chair, and obtain 441 images. The selected subset is referred to as Caltech101-7. The same five types of features as for MSRC-v1 are extracted.

Trecvid2003 is a video dataset which contains 1078 manually labeled video shots belonging to 5 categories. Each shot is represented as a 1894-dimensional vector of text features and a 165-dimensional vector of HSV color histogram, which is extracted from the associated keyframe.

GRAZ02 is a database for object categorization. It contains images with objects of high complexity and high intra-class variability. It consists of 365 images with bikes, 311 images with persons, 420 images with cars and 380 images not containing one of these objects. We extract the following 3 visual features: SIFT, GIST, and LBP.

Table 1: Statistics of the multi-view datasets used in our experiments.

Dataset        Instances   Views   Clusters
3sources       169         3       6
MSRC-v1        210         5       7
Caltech101-7   441         5       7
Trecvid2003    1078        2       5
GRAZ02         1476        3       4

Table 2: Clustering performance on 3sources.

Methods     NMI          ACC          F-score
BestCLR     .59 (0)      .651 (0)     .627 (0)
CoRegSC     .24 (.34)    .433 (.39)   .36 (.24)
CoTranSC    .476 (.37)   .573 (.3)    .493 (.39)
MMSC        .621 (.23)   .596 (.35)   .585 (.37)
MVSC        .314 (.27)   .55 (.55)    .444 (.43)
MLAN        .59 (0)      .586 (0)     .556 (0)
MALG(l1)    .733 (0)     .85 (0)      .795 (0)
MALG(l2)    .746 (0)     .811 (0)     .798 (0)

4.2.2. Experiment Setup

The single-view CLR approach is conducted on each view with both l1-norm and l2-norm distances and the best results are reported. In CLR, MLAN and our proposed MALG, there is a common parameter \gamma brought by the Laplacian matrix rank constraint. For simple implementation and to accelerate convergence, we initialize \gamma = 8 for these three methods, and in each iteration decrease it (\gamma = \gamma/4) if the number of connected components of S is greater than the number of clusters c, or increase it (\gamma = 4\gamma) if it is smaller than c. There is another parameter in our method, i.e., the regularization parameter \lambda. To obtain a better model, it is set view by view: \lambda^{(v)} = \pi^{(v)} + \log((\pi^{(v)})^2 + 1)\, t, where \pi^{(v)} denotes the median of the losses l_{ij}^{(v)} of all similarities on the v-th view, and t is the number of iterations. For CoRegSC, CoTranSC and MMSC, their trade-off parameters are all selected from {0.01, 0.1, 1, 10, 100}, and the best results are reported. For MVSC, the number of salient points is set to 10% of the total number of samples. On all datasets, each sample is assigned 10 nearest neighbors to construct the graph. We use the graph construction method described in Nie et al. (2016) to initialize the graphs for CLR and our method. Approaches based on spectral clustering need to perform post-processing, such as k-means, to get the final clustering results. We repeat the experiments 50 times for all methods and report the average results and the standard deviations. The clustering performance is measured using three evaluation metrics: clustering accuracy (ACC), normalized mutual information (NMI) and F-score. For all three metrics, a higher value means better clustering quality.
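For reference, ACC under the usual best-matching convention and NMI can be computed as in this sketch (standard definitions, not code from the paper; SciPy and scikit-learn are assumed to be available):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from sklearn.metrics import normalized_mutual_info_score

def clustering_accuracy(y_true, y_pred):
    """ACC: accuracy under the best one-to-one matching of clusters to classes."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    classes, clusters = np.unique(y_true), np.unique(y_pred)
    # Contingency table: rows are predicted clusters, columns are true classes.
    counts = np.array([[np.sum((y_pred == k) & (y_true == c)) for c in classes]
                       for k in clusters])
    row, col = linear_sum_assignment(-counts)   # maximize the matched counts
    return counts[row, col].sum() / len(y_true)

# nmi = normalized_mutual_info_score(y_true, y_pred)
```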

Table 3: Clustering performance on MSRC-v1.

Methods     NMI          ACC          F-score
BestCLR     .692 (0)     .79 (0)      .651 (0)
CoRegSC     .772 (.21)   .841 (.44)   .745 (.32)
CoTranSC    .797 (.29)   .86 (.53)    .764 (.39)
MMSC        .795 (.21)   .825 (.58)   .742 (.4)
MVSC        .531 (.2)    .613 (.25)   .485 (.17)
MLAN        .778 (0)     .81 (0)      .722 (0)
MALG(l1)    .924 (0)     .962 (0)     .924 (0)
MALG(l2)    .96 (0)      .952 (0)     .94 (0)

Table 4: Clustering performance on Caltech101-7.

Methods     NMI          ACC          F-score
BestCLR     .576 (0)     .58 (0)      .428 (0)
CoRegSC     .623 (.32)   .661 (.37)   .68 (.34)
CoTranSC    .6 (.23)     .657 (.49)   .595 (.38)
MMSC        .698 (.27)   .696 (.29)   .661 (.37)
MVSC        .583 (.29)   .64 (.47)    .568 (.46)
MLAN        .438 (0)     .488 (0)     .444 (0)
MALG(l1)    .739 (0)     .77 (0)      .78 (0)
MALG(l2)    .728 (0)     .664 (0)     .633 (0)

4.2.3. Performance Evaluation

The clustering results are presented in Tables 2-6. We have the following observations.

It can be seen that the proposed method achieves the best clustering results in most cases. As shown in Tables 2 and 3, on 3sources and MSRC-v1, our proposed MALG with both l1-norm and l2-norm distances outperforms the compared approaches significantly in terms of all three metrics. Compared with the second best baseline, MALG achieves over 10% improvement with respect to NMI, ACC and F-score. Table 4 shows that, on Caltech101-7, although the performance superiority is not as obvious, MALG still obtains the best clustering results when adopting the l1-norm distance. In terms of ACC and F-score, MALG(l1) obtains the highest scores on Trecvid2003, as displayed in Table 5. When measuring with NMI, MMSC performs the best, and MALG(l1) ranks second. On GRAZ02, MALG(l2) and MALG(l1) respectively obtain the best NMI and F-score, while MMSC is slightly better than MALG with respect to ACC (Table 6).

Except for Caltech101-7 and GRAZ02, the impact of using different distances (l1-norm or l2-norm) in MALG on the clustering results is not marked on the other three datasets. On Caltech101-7, MALG(l1) obtains higher ACC and F-score values with an obvious gap compared with MALG(l2), whereas on GRAZ02, MALG(l2) outperforms MALG(l1) in terms of ACC.

Table 5: Clustering performance on Trecvid2003.

Methods     NMI          ACC          F-score
BestCLR     .219 (0)     .433 (0)     .427 (0)
CoRegSC     .177 (.5)    .43 (.12)    .32 (.7)
CoTranSC    .215 (.4)    .42 (.13)    .343 (.11)
MMSC        .266 (.3)    .431 (.32)   .365 (.18)
MVSC        .189 (.36)   .43 (.37)    .344 (.15)
MLAN        .194 (0)     .434 (0)     .412 (0)
MALG(l1)    .252 (0)     .497 (0)     .434 (0)
MALG(l2)    .254 (0)     .497 (0)     .435 (0)

Table 6: Clustering performance on GRAZ02.

Methods     NMI          ACC          F-score
BestCLR     .14 (0)      .457 (0)     .363 (0)
CoRegSC     .079 (.1)    .48 (.6)     .36 (.1)
CoTranSC    .06 (.5)     .393 (.16)   .292 (.4)
MMSC        .142 (.2)    .477 (.2)    .356 (.1)
MVSC        .085 (.1)    .426 (.2)    .311 (.0)
MLAN        .045 (0)     .268 (0)     .396 (0)
MALG(l1)    .153 (0)     .387 (0)     .419 (0)
MALG(l2)    .169 (0)     .475 (0)     .4 (0)

MLAN is the method most closely related to ours, as it is also developed based on the Laplacian rank constraint. However, in comparison with our method, the clustering results obtained by MLAN are much worse. In particular, on Caltech101-7, the NMI (ACC, F-score) of MALG(l1) improves by about 30% (21%, 26%) over that of MLAN. The reason might be that MALG considers the significance of sample pairs both within a view and across views, while MLAN treats sample pairs within the same view equally. Note that the standard deviation values of CLR, MLAN and our method are all zeros, since they need no post-processing with k-means.

5. Conclusion

In this paper, we propose a multi-view graph based clustering method to adaptively learn a common graph with exactly c connected components, which is an ideal structure for clustering. Instead of treating sample pairs within each view equally, sample-pair-specific weights are introduced to evaluate the importance of similarities in a particular view. Two objective functions, with the l1-norm and l2-norm distances, are formulated, and a simple yet effective algorithm is derived to solve them. The effectiveness of our method is validated on five real-world datasets for news article clustering and image clustering. Experimental results show that the proposed method achieves robust performance and outperforms several related methods.

Acknowledgments

The authors would like to thank the anonymous reviewers. This work was supported by the China National Science Foundation.

References

Yoshua Bengio, Jérôme Louradour, Ronan Collobert, and Jason Weston. Curriculum learning. In ICML, pages 41-48, 2009.

Xiao Cai, Feiping Nie, Heng Huang, and Farhad Kamangar. Heterogeneous image feature integration via multi-modal spectral clustering. In CVPR. IEEE, 2011.

Xiao Cai, Feiping Nie, and Heng Huang. Multi-view k-means clustering on big data. In IJCAI, 2013.

Kamalika Chaudhuri, Sham M. Kakade, Karen Livescu, and Karthik Sridharan. Multi-view clustering via canonical correlation analysis. In ICML, 2009.

Fan R. K. Chung. Spectral Graph Theory. Number 92. American Mathematical Society, 1997.

Navneet Dalal and Bill Triggs. Histograms of oriented gradients for human detection. In CVPR, volume 1, 2005.

K. Fan. On a theorem of Weyl concerning eigenvalues of linear transformations I. Proceedings of the National Academy of Sciences, 35(11), 1949.

Chenping Hou, Feiping Nie, Hong Tao, and Dongyun Yi. Multi-view unsupervised feature selection with adaptive similarity and view weight. IEEE Transactions on Knowledge and Data Engineering, PP(99):1-1, 2017.

Lu Jiang, Deyu Meng, Shoou-I Yu, Zhenzhong Lan, Shiguang Shan, and Alexander Hauptmann. Self-paced learning with diversity. In NIPS, 2014.

Abhishek Kumar and Hal Daumé. A co-training approach for multi-view spectral clustering. In ICML, pages 393-400, 2011.

Abhishek Kumar, Piyush Rai, and Hal Daumé. Co-regularized multi-view spectral clustering. In NIPS, 2011.

M. Pawan Kumar, Benjamin Packer, and Daphne Koller. Self-paced learning for latent variable models. In NIPS, 2010.

Yeqing Li, Feiping Nie, Heng Huang, and Junzhou Huang. Large-scale multi-view spectral clustering via bipartite graph. In AAAI, 2015.

Jialu Liu, Chi Wang, Jing Gao, and Jiawei Han. Multi-view clustering via joint nonnegative matrix factorization. In ICDM, 2013.

David G. Lowe. Distinctive image features from scale-invariant keypoints. International Journal of Computer Vision, 60(2):91-110, 2004.

Feiping Nie, Xiaoqian Wang, Michael I. Jordan, and Heng Huang. The constrained Laplacian rank algorithm for graph-based clustering. In AAAI, 2016.

Feiping Nie, Guohao Cai, and Xuelong Li. Multi-view clustering and semi-supervised classification with adaptive neighbours. In AAAI, 2017.

Timo Ojala, Matti Pietikäinen, and Topi Mäenpää. Multiresolution gray-scale and rotation invariant texture classification with local binary patterns. IEEE Transactions on Pattern Analysis and Machine Intelligence, 24(7), July 2002.

Aude Oliva and Antonio Torralba. Modeling the shape of the scene: A holistic representation of the spatial envelope. International Journal of Computer Vision, 42(3), 2001.

H. Tao, C. Hou, F. Nie, Y. Jiao, and D. Yi. Effective discriminative feature selection with nontrivial solution. IEEE Transactions on Neural Networks and Learning Systems, 27(4):796-808, 2016.

Robert Tarjan. Depth-first search and linear graph algorithms. SIAM Journal on Computing, 1(2):146-160, 1972.

J. Wu and J. M. Rehg. CENTRIST: A visual descriptor for scene categorization. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33(8), 2011.

Chang Xu, Dacheng Tao, and Chao Xu. A survey on multi-view learning. Computer Science, 2013.

Chang Xu, Dacheng Tao, and Chao Xu. Multi-view self-paced learning for clustering. In IJCAI, 2015.

Qian Zhao, Deyu Meng, Lu Jiang, Qi Xie, Zongben Xu, and Alexander G. Hauptmann. Self-paced learning for matrix factorization. In AAAI, 2015.


APPLICATION OF MULTIVARIATE LOSS FUNCTION FOR ASSESSMENT OF THE QUALITY OF TECHNOLOGICAL PROCESS MANAGEMENT 3. - 5. 5., Brno, Czech Republc, EU APPLICATION OF MULTIVARIATE LOSS FUNCTION FOR ASSESSMENT OF THE QUALITY OF TECHNOLOGICAL PROCESS MANAGEMENT Abstract Josef TOŠENOVSKÝ ) Lenka MONSPORTOVÁ ) Flp TOŠENOVSKÝ

More information

The Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique

The Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique //00 :0 AM Outlne and Readng The Greedy Method The Greedy Method Technque (secton.) Fractonal Knapsack Problem (secton..) Task Schedulng (secton..) Mnmum Spannng Trees (secton.) Change Money Problem Greedy

More information

AS THE major component of big data, multi-modal data

AS THE major component of big data, multi-modal data 1220 IEEE TRANSACTIONS ON MULTIMEDIA, VOL. 19, NO. 6, JUNE 2017 Cross-Modal Retreval Usng Multordered Dscrmnatve Structured Subspace Learnng Lang Zhang, Bngpeng Ma, Guorong L, Qngmng Huang, and Q Tan Abstract

More information

A new segmentation algorithm for medical volume image based on K-means clustering

A new segmentation algorithm for medical volume image based on K-means clustering Avalable onlne www.jocpr.com Journal of Chemcal and harmaceutcal Research, 2013, 5(12):113-117 Research Artcle ISSN : 0975-7384 CODEN(USA) : JCRC5 A new segmentaton algorthm for medcal volume mage based

More information

Outline. Self-Organizing Maps (SOM) US Hebbian Learning, Cntd. The learning rule is Hebbian like:

Outline. Self-Organizing Maps (SOM) US Hebbian Learning, Cntd. The learning rule is Hebbian like: Self-Organzng Maps (SOM) Turgay İBRİKÇİ, PhD. Outlne Introducton Structures of SOM SOM Archtecture Neghborhoods SOM Algorthm Examples Summary 1 2 Unsupervsed Hebban Learnng US Hebban Learnng, Cntd 3 A

More information

PERFORMANCE EVALUATION FOR SCENE MATCHING ALGORITHMS BY SVM

PERFORMANCE EVALUATION FOR SCENE MATCHING ALGORITHMS BY SVM PERFORMACE EVALUAIO FOR SCEE MACHIG ALGORIHMS BY SVM Zhaohu Yang a, b, *, Yngyng Chen a, Shaomng Zhang a a he Research Center of Remote Sensng and Geomatc, ongj Unversty, Shangha 200092, Chna - yzhac@63.com

More information

Module Management Tool in Software Development Organizations

Module Management Tool in Software Development Organizations Journal of Computer Scence (5): 8-, 7 ISSN 59-66 7 Scence Publcatons Management Tool n Software Development Organzatons Ahmad A. Al-Rababah and Mohammad A. Al-Rababah Faculty of IT, Al-Ahlyyah Amman Unversty,

More information

BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION

BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION SHI-LIANG SUN, HONG-LEI SHI Department of Computer Scence and Technology, East Chna Normal Unversty 500 Dongchuan Road, Shangha 200241, P. R. Chna E-MAIL: slsun@cs.ecnu.edu.cn,

More information

Semi-Supervised Discriminant Analysis Based On Data Structure

Semi-Supervised Discriminant Analysis Based On Data Structure IOSR Journal of Computer Engneerng (IOSR-JCE) e-issn: 2278-0661,p-ISSN: 2278-8727, Volume 17, Issue 3, Ver. VII (May Jun. 2015), PP 39-46 www.osrournals.org Sem-Supervsed Dscrmnant Analyss Based On Data

More information

Relevance Assignment and Fusion of Multiple Learning Methods Applied to Remote Sensing Image Analysis

Relevance Assignment and Fusion of Multiple Learning Methods Applied to Remote Sensing Image Analysis Assgnment and Fuson of Multple Learnng Methods Appled to Remote Sensng Image Analyss Peter Bajcsy, We-Wen Feng and Praveen Kumar Natonal Center for Supercomputng Applcaton (NCSA), Unversty of Illnos at

More information

User Authentication Based On Behavioral Mouse Dynamics Biometrics

User Authentication Based On Behavioral Mouse Dynamics Biometrics User Authentcaton Based On Behavoral Mouse Dynamcs Bometrcs Chee-Hyung Yoon Danel Donghyun Km Department of Computer Scence Department of Computer Scence Stanford Unversty Stanford Unversty Stanford, CA

More information

Support Vector Machines

Support Vector Machines Support Vector Machnes Decson surface s a hyperplane (lne n 2D) n feature space (smlar to the Perceptron) Arguably, the most mportant recent dscovery n machne learnng In a nutshell: map the data to a predetermned

More information

Optimizing Document Scoring for Query Retrieval

Optimizing Document Scoring for Query Retrieval Optmzng Document Scorng for Query Retreval Brent Ellwen baellwe@cs.stanford.edu Abstract The goal of ths project was to automate the process of tunng a document query engne. Specfcally, I used machne learnng

More information

Efficient Segmentation and Classification of Remote Sensing Image Using Local Self Similarity

Efficient Segmentation and Classification of Remote Sensing Image Using Local Self Similarity ISSN(Onlne): 2320-9801 ISSN (Prnt): 2320-9798 Internatonal Journal of Innovatve Research n Computer and Communcaton Engneerng (An ISO 3297: 2007 Certfed Organzaton) Vol.2, Specal Issue 1, March 2014 Proceedngs

More information

An efficient method to build panoramic image mosaics

An efficient method to build panoramic image mosaics An effcent method to buld panoramc mage mosacs Pattern Recognton Letters vol. 4 003 Dae-Hyun Km Yong-In Yoon Jong-Soo Cho School of Electrcal Engneerng and Computer Scence Kyungpook Natonal Unv. Abstract

More information

Adaptive Transfer Learning

Adaptive Transfer Learning Adaptve Transfer Learnng Bn Cao, Snno Jaln Pan, Yu Zhang, Dt-Yan Yeung, Qang Yang Hong Kong Unversty of Scence and Technology Clear Water Bay, Kowloon, Hong Kong {caobn,snnopan,zhangyu,dyyeung,qyang}@cse.ust.hk

More information

Classification / Regression Support Vector Machines

Classification / Regression Support Vector Machines Classfcaton / Regresson Support Vector Machnes Jeff Howbert Introducton to Machne Learnng Wnter 04 Topcs SVM classfers for lnearly separable classes SVM classfers for non-lnearly separable classes SVM

More information

Unsupervised Learning and Clustering

Unsupervised Learning and Clustering Unsupervsed Learnng and Clusterng Why consder unlabeled samples?. Collectng and labelng large set of samples s costly Gettng recorded speech s free, labelng s tme consumng 2. Classfer could be desgned

More information

Helsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr)

Helsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr) Helsnk Unversty Of Technology, Systems Analyss Laboratory Mat-2.08 Independent research projects n appled mathematcs (3 cr) "! #$&% Antt Laukkanen 506 R ajlaukka@cc.hut.f 2 Introducton...3 2 Multattrbute

More information

Load Balancing for Hex-Cell Interconnection Network

Load Balancing for Hex-Cell Interconnection Network Int. J. Communcatons, Network and System Scences,,, - Publshed Onlne Aprl n ScRes. http://www.scrp.org/journal/jcns http://dx.do.org/./jcns.. Load Balancng for Hex-Cell Interconnecton Network Saher Manaseer,

More information

Robust visual tracking based on Informative random fern

Robust visual tracking based on Informative random fern 5th Internatonal Conference on Computer Scences and Automaton Engneerng (ICCSAE 205) Robust vsual trackng based on Informatve random fern Hao Dong, a, Ru Wang, b School of Instrumentaton Scence and Opto-electroncs

More information

Scale Selective Extended Local Binary Pattern For Texture Classification

Scale Selective Extended Local Binary Pattern For Texture Classification Scale Selectve Extended Local Bnary Pattern For Texture Classfcaton Yutng Hu, Zhlng Long, and Ghassan AlRegb Multmeda & Sensors Lab (MSL) Georga Insttute of Technology 03/09/017 Outlne Texture Representaton

More information

Flatten a Curved Space by Kernel: From Einstein to Euclid

Flatten a Curved Space by Kernel: From Einstein to Euclid Flatten a Curved Space by Kernel: From Ensten to Eucld Quyuan Huang, Dapeng Olver Wu Ensten s general theory of relatvty fundamentally changed our vew about the physcal world. Dfferent from Newton s theory,

More information

Modular PCA Face Recognition Based on Weighted Average

Modular PCA Face Recognition Based on Weighted Average odern Appled Scence odular PCA Face Recognton Based on Weghted Average Chengmao Han (Correspondng author) Department of athematcs, Lny Normal Unversty Lny 76005, Chna E-mal: hanchengmao@163.com Abstract

More information