MULTI-VIEW ANCHOR GRAPH HASHING

Saehoon Kim¹ and Seungjin Choi¹,²
¹Department of Computer Science and Engineering, POSTECH, Korea
²Division of IT Convergence Engineering, POSTECH, Korea
{kshkawa, seungjin}@postech.ac.kr

ABSTRACT

Multi-view hashing seeks compact integrated binary codes that preserve similarities averaged over multiple representations of objects. Most existing multi-view hashing methods resort to linear hash functions, in which the data manifold is not considered. In this paper we present multi-view anchor graph hashing (MVAGH), where nonlinear integrated binary codes are efficiently determined by a subset of eigenvectors of an averaged similarity matrix. The efficiency behind MVAGH is due to a low-rank form of the averaged similarity matrix induced by the multi-view anchor graph, where the similarity between two points is measured by the two-step transition probability through view-specific anchor (i.e., landmark) points. In addition, we observe that MVAGH suffers from performance degradation when high recall is required. To overcome this drawback, we propose a simple heuristic that combines MVAGH with locality-sensitive hashing (LSH). Numerical experiments on the CIFAR-10 dataset confirm that MVAGH(+LSH) outperforms existing multi- and single-view hashing methods.

Index Terms— Anchor graphs, hashing, multi-view learning

1. INTRODUCTION

Hashing seeks a hash function that embeds high-dimensional data into a similarity-preserving low-dimensional Hamming space, such that an approximate nearest neighbor of a given query can be found with sub-linear time complexity [2, 15]. The classic approach to hashing, locality-sensitive hashing [2], computes a hash function in a purely randomized manner: random projections followed by rounding generate binary codes such that two similar examples receive the same codes. Since the performance is not satisfactory when short binary codes are used [12], multiple hash tables or longer binary codes must be employed in practice.
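The random-projection construction just described can be sketched in a few lines of NumPy (an illustrative sketch, not code from any of the cited systems; the function name, seed, and Gaussian projection matrix are assumptions):

```python
import numpy as np

def lsh_hash(X, r, seed=0):
    """Locality-sensitive hashing: random projections followed by rounding.

    X: (n, d) data matrix; r: code length.
    Returns an (n, r) matrix of {-1, +1} binary codes.
    """
    rng = np.random.default_rng(seed)
    P = rng.standard_normal((X.shape[1], r))  # random Gaussian projections
    Y = np.sign(X @ P)                        # round each projection at zero
    Y[Y == 0] = 1                             # resolve the measure-zero boundary case
    return Y
```

Two points separated by a small angle agree on most bits in expectation, which is the similarity-preservation property relied on above; the need for many bits or many tables comes from the variance of this random construction.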
Learning to hash seeks compact similarity-preserving binary codes and can be categorized into unsupervised and (semi-)supervised paradigms. Spectral hashing (SH) [14, 15] is a representative unsupervised method, where the Laplace-Beltrami eigenfunctions of manifolds are used to determine binary codes. Iterative quantization (ITQ) [3] rotates PCA-embedded data by an orthogonal matrix in order to minimize the quantization error incurred when mapping the embedded data onto a binary hypercube. Semantic hashing [11] is the earliest supervised hashing method, exploiting deep networks to learn a nonlinear mapping between input data and binary codes; its practical usefulness is limited by time-consuming training. Supervised hashing with a reasonable training time has been proposed, where a hash function is determined sequentially, yielding very short discriminative codes [8]. Semi-supervised hashing [13] minimizes the empirical error induced by the violation of pairwise constraints (must-link and cannot-link), and prevents over-fitted hash functions by using unlabeled data.

Recently, learning to hash for multi-view (or multi-modal) data has been developed. The methods in [4, 6, 17] seek integrated binary codes that preserve an averaged similarity, extending spectral hashing to multi-view data. [6] exploits a pre-defined averaged similarity (or identity matrix) to compute view-specific binary codes, which are concatenated into an integrated code. [17] uses a linear sum of view-specific similarity matrices as the averaged similarity, from which an integrated binary code is directly computed. [4] seeks an integrated binary code sequentially in order to de-correlate the bits, where the averaged similarity is computed by an α-average of view-specific distance matrices. [18] proposes a generative model in which intra-view and inter-view similarities are generated given view-specific binary codes. Unlike [4, 6, 17], [18] can accept non-vectorial inputs, but the algorithm is not scalable and its performance degrades as the code length increases.
In this paper, we present multi-view anchor graph hashing (MVAGH), where a nonlinear integrated binary code is determined by a subset of eigenvectors of an averaged similarity induced by a multi-view anchor graph. The multi-view anchor graph keeps the averaged similarity in a low-rank form, so the eigenvectors can be computed efficiently. Note that hashing with an anchor graph [9] is already popular for unsupervised hashing, but its extension to multi-view data has not been studied. Since MVAGH computes nonlinear binary codes efficiently, it has a clear advantage over previous multi-view hashing: [4, 6, 17] compute linear hash functions, while [18] is not scalable and needs label information. More specifically, [4, 6, 17] consider a linear hash function that preserves pairwise similarity in Euclidean space. However, if the data lie on an embedded low-dimensional manifold, a linear hash function cannot capture the data similarity well, because examples that are far apart in Euclidean distance can be close on the manifold. [18] considers a nonlinear hash function, but it is not scalable because the similarity matrix is kept explicitly. We also observe that MVAGH suffers from performance degradation when high recall is required; to overcome this drawback, we propose a simple heuristic that combines MVAGH with locality-sensitive hashing (LSH).

2. MULTI-VIEW ANCHOR GRAPH HASHING

In this section, we describe multi-view anchor graph hashing (MVAGH), which exploits a multi-view anchor graph to approximate the averaged similarity. Before discussing MVAGH, we clarify our notation. Suppose that {x_i}_{i=1}^N is a set of N objects, where each object x_i is represented by K view-specific examples {x_i^(1), ..., x_i^(K)}. We define the integrated binary code matrix as

Y = [y_1, ..., y_N] ∈ {-1, +1}^{r×N}, where y_i is the binary code associated with x_i and r is the code length. MVAGH borrows the spectral hashing formulation in order to seek an integrated binary code matrix Y ∈ {-1, +1}^{r×N} that preserves the averaged similarity S:

    arg min_Y (1/2) Σ_{i=1}^N Σ_{j=1}^N S_ij ‖y_i - y_j‖²,
    subject to Y 1_N = 0, (1/N) Y Yᵀ = I_r,    (1)

where ‖y_i‖ is the Euclidean norm of y_i and 1_N ∈ R^N is the vector of all ones. Since ‖y_i‖² is always r, the objective function is equivalent to arg max_Y (1/2) tr(Y S Yᵀ). Ignoring the binary constraint, the solution under the constraints of (1) is given by the r largest eigenvectors of the similarity matrix. As in anchor graph hashing [9], if the similarity matrix is low-rank approximated, the eigenvectors can be computed efficiently and the generalized eigenfunctions are easily evaluated for a novel input. The natural question, therefore, is how to construct a low-rank approximation of the averaged similarity induced by multi-view data.

2.1. Multi-View Anchor Graph

Fig. 1. Random walk view of a similarity graph approximated by an anchor graph (a) and a multi-view anchor graph (b) when the number of views is two. With more than two views, the multi-view anchor graph is constructed analogously to (b).

An anchor graph [7] is a low-rank approximation of a neighborhood graph (such as a k-NN or ε-graph), in which the similarity between data points is measured through a small number of anchor points. Fig. 1(a) shows the bipartite graph whose left vertices are data points and whose right vertices are anchor points; each data point is connected to its two nearest anchor points. The similarity of two points is measured by the two-step transition probability through the anchor points. The similarity of two points is greater than zero only when they share an anchor point, so the similarity matrix approximated by the anchor graph is empirically sparse and preserves the data locality. The anchor points should be selected to cover the data distribution sufficiently; in practice, the cluster centers from a few iterations of k-means are adequate anchor points.
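The single-view anchor graph construction just described can be sketched as follows (a minimal NumPy sketch; the function name, Gaussian kernel, and bandwidth are illustrative assumptions, and the normalization here is per view, whereas the multi-view graph of the next section normalizes across views jointly):

```python
import numpy as np

def anchor_graph(X, anchors, l=3, sigma=1.0):
    """Build the data-to-anchor affinity matrix Z of a single-view anchor graph.

    Each point is connected to its l nearest anchors with Gaussian kernel
    weights, normalized so each row sums to one (a transition probability).
    X: (n, d) data; anchors: (m, d). Returns Z: (n, m), sparse in effect.
    """
    d2 = ((X[:, None, :] - anchors[None, :, :]) ** 2).sum(-1)  # squared distances
    K = np.exp(-d2 / sigma ** 2)                               # kernel to each anchor
    Z = np.zeros_like(K)
    idx = np.argsort(-K, axis=1)[:, :l]                        # l nearest anchors per point
    rows = np.arange(X.shape[0])[:, None]
    Z[rows, idx] = K[rows, idx]                                # keep only those l entries
    Z /= Z.sum(axis=1, keepdims=True)                          # rows sum to one
    return Z
```

The induced similarity Z Λ⁻¹ Zᵀ is nonzero only for point pairs sharing an anchor, which is the sparsity property noted above.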
Though the anchor graph has been successfully applied to unsupervised hashing [9], its extension to multi-view data has not been studied. We first define the multi-view anchor graph, in which the contributions of the different views are combined to approximate the averaged neighborhood graph. Since the data distribution might differ across views, we select anchor points for each view¹, with the k-th view-specific anchor points denoted {μ_j^(k)}_{j=1}^M. Define the k-th view-specific similarity between the example x_i^(k) and μ_j^(k) as

    Z_ij^(k) = k(x_i^(k), μ_j^(k)) / Σ_{k'=1}^K Σ_{m∈[i]} k(x_i^(k'), μ_m^(k')),  j ∈ [i],    (3)

where [i] contains the indices of the l nearest anchors of x_i^(k) (usually l = 3) and k(·,·) is any kernel function. The averaged similarity between objects x_i and x_j is defined as S_ij = p(x_j | x_i). When the number of views is two, we consider the tri-partite graph in Fig. 1(b) to approximate the averaged similarity, where the similarity between two objects x_i and x_j is the two-step transition probability through view-specific anchor points:

    p(x_j | x_i) = Σ_{k=1}^K Σ_{m=1}^M p(x_j | μ_m^(k)) p(μ_m^(k) | x_i),    (4)

where p(x_j | μ_m^(k)) = Z_jm^(k) / Σ_{j'=1}^N Z_j'm^(k) and p(μ_m^(k) | x_i) = Z_im^(k). Let Λ^(k) be the diagonal matrix whose m-th diagonal entry is Σ_{j=1}^N Z_jm^(k), and let [Z^(k)]_im = Z_im^(k). With this notation, p(x_j | x_i) = [Σ_{k=1}^K Z^(k) (Λ^(k))^{-1} (Z^(k))ᵀ]_ij. Letting Z = [Z^(1) ... Z^(K)] and Λ = diag(Λ^(1), ..., Λ^(K)), the averaged similarity is low-rank approximated by the multi-view anchor graph: S ≈ Z Λ^{-1} Zᵀ. The multi-view anchor graph can thus be interpreted as a linear sum of view-specific anchor graphs, where the normalization term in (3) links the view-specific graphs. The eigenvectors of S are obtained from the eigen-decomposition of the small matrix A = Λ^{-1/2} Zᵀ Z Λ^{-1/2}. Let Σ = diag(σ_1, ..., σ_r) hold the r largest eigenvalues, associated with the eigenvectors V = [v_1, ..., v_r] of A. The transpose of the eigenvectors of S, denoted Ỹ, is then computed as

    Ỹ = √N Σ^{-1/2} Vᵀ Λ^{-1/2} Zᵀ = √N Wᵀ Zᵀ,    (5)

where W = Λ^{-1/2} V Σ^{-1/2}.
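Under these definitions, the eigen-decomposition of the small matrix A and the embedding of Eq. (5) can be sketched as follows (illustrative NumPy, not the authors' code; skipping the top eigenvector, which is trivial for a row-stochastic Z as in anchor graph hashing [9], is an assumption on our part):

```python
import numpy as np

def mvagh_embedding(Z_list, r):
    """Spectral embedding from a multi-view anchor graph (sketch of Eq. (5)).

    Z_list: per-view (n, M) data-to-anchor matrices, jointly normalized so
    the rows of Z = [Z^(1) ... Z^(K)] sum to one.
    Returns (Y_tilde, W): the (r, n) embedding and the projection W used
    later for out-of-sample codes.
    """
    Z = np.hstack(Z_list)                        # n x KM concatenation over views
    n = Z.shape[0]
    lam = Z.sum(axis=0)                          # diagonal of Lambda
    Zs = Z / np.sqrt(lam)                        # Z Lambda^{-1/2}
    A = Zs.T @ Zs                                # small KM x KM matrix
    vals, vecs = np.linalg.eigh(A)               # ascending eigenvalues
    order = np.argsort(-vals)[1:r + 1]           # r largest, skipping the trivial one
    V, sig = vecs[:, order], vals[order]
    W = (1.0 / np.sqrt(lam))[:, None] * V / np.sqrt(sig)  # Lambda^{-1/2} V Sigma^{-1/2}
    Y_tilde = np.sqrt(n) * (W.T @ Z.T)           # r x n embedding, Eq. (5)
    return Y_tilde, W
```

The cost is dominated by the KM × KM eigen-problem rather than an N × N one, which is the low-rank efficiency argument made above.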
The binary codes can be computed by rounding at zero: Y = sgn(Ỹ), where the sgn function operates elementwise. However, the relaxed solution is not satisfactory and, since the original problem is NP-hard, we adopt reasonable heuristics to find a better solution. In spectral clustering, spectral rounding [16] is widely used to refine the solution: a rotation matrix is estimated to minimize the quantization error induced by discretizing the continuous relaxed solution. Recently, a similar idea was successfully applied to learning hash functions, known as iterative quantization (ITQ) [3]. The objective function of ITQ is

    arg min_{R,Y} ‖Y - R Ỹ‖_F²,  s.t. Y 1_N = 0_N, Rᵀ R = R Rᵀ = I_r,    (6)

¹For view-specific anchor points, we apply k-means (10 iterations) to each view.

where R and Y are iteratively estimated to minimize the quantization error as in [3, 16]. ITQ originally considers the PCA-embedded space, but the spectral embedded space is also well suited to ITQ: we empirically observe that the binary codes become more compact and better reflect the data similarity (Fig. 2). Fig. 2(a) plots tr(Y S Yᵀ) over iterations, showing that the similarity between binary codes increasingly reflects the original data similarity under ITQ. Fig. 2(b) plots (1/N)‖Y Yᵀ - I_r‖_F² over iterations, showing that the binary codes become more de-correlated.

Fig. 2. During the ITQ iterations, the binary codes better preserve the data similarity (a) and become more de-correlated (b). The CIFAR-10 dataset is used.

2.2. Out-of-Sample Extension

A subset of eigenvectors of the multi-view anchor graph determines the binary codes of the training data. The binary code of a novel point can be computed analytically via the Nyström method, as in [9]. Lemma 1 gives the analytic hash function for a novel point.

Lemma 1. Given M view-specific anchor points {μ_j^(k)}_{j=1}^M for k = 1, ..., K and any example x, define a mapping z(x) ∈ R^{KM} as

    z(x) = [z(x^(1)), ..., z(x^(K))] / Σ_{k=1}^K 1ᵀ z(x^(k)),

where z(x^(k)) = [δ_1 k(x^(k), μ_1^(k)), ..., δ_M k(x^(k), μ_M^(k))] and δ_j ∈ {0, 1}, with δ_j = 1 iff anchor μ_j^(k) is one of the s nearest anchors of x^(k) according to the kernel function k(·,·). The binary code of a novel point is then

    y = sgn(R Wᵀ z(x)),

where W is defined in (5). The proof of this lemma follows directly from Theorem 1 in [9]. Algorithm 1 gives pseudo-code for MVAGH.

2.3. Multi-View Anchor Graph Hashing + LSH

Multi-view anchor graph hashing consists of two steps: (1) project the data onto the largest eigenvectors of the multi-view anchor graph, and (2) round at zero to produce binary codes. Since the intrinsic dimension is usually low, the similarity between binary codes with a large code size might not reflect the data similarity well.
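Lemma 1 can be sketched as a small function (hypothetical names; the Gaussian kernel, a shared bandwidth sigma across views, and the joint normalization over views are illustrative assumptions):

```python
import numpy as np

def hash_query(x_views, anchors_list, W, R, l=3, sigma=1.0):
    """Out-of-sample extension (sketch of Lemma 1): hash a novel multi-view point.

    x_views: list of K per-view vectors; anchors_list: per-view (M, d_k) anchors.
    W: (KM, r) projection from the training eigen-problem; R: (r, r) rotation.
    """
    zs = []
    for x, A in zip(x_views, anchors_list):
        k = np.exp(-((A - x) ** 2).sum(1) / sigma ** 2)  # kernel to each anchor
        keep = np.argsort(-k)[:l]                        # l nearest anchors only
        z = np.zeros_like(k)
        z[keep] = k[keep]
        zs.append(z)
    z = np.concatenate(zs)
    z /= z.sum()                                         # joint normalization over views
    y = np.sign(R @ (W.T @ z))                           # y = sgn(R W^T z(x))
    y[y == 0] = 1
    return y
```

Since z(x) has at most Kl nonzero entries, a query costs only a few kernel evaluations and a small matrix-vector product, which is what makes the extension practical.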
We observe that the precision decreases with a large code size when high recall is required. As shown in Fig. 3, when high recall is required (the number of returned examples is large), the similarity induced by the binary codes turns out to be incorrect.

Algorithm 1 Multi-View Anchor Graph Hashing (MVAGH)
Input: For each k-th view, training data X^(k) = [x_1^(k), ..., x_N^(k)] ∈ R^{m×N}, anchor points {μ_j^(k)}_{j=1}^M, and a test point x^(k) ∈ R^m, for k = 1, ..., K. Binary code length r.
Output: Binary code y associated with the test data {x^(k)}_{k=1}^K.
1: Compute the similarity [Z^(k)]_im between examples and anchor points using Eq. (3).
2: Let Z = [Z^(1), ..., Z^(K)] and Λ = diag(Λ^(1), ..., Λ^(K)).
3: Apply the eigenvalue decomposition A = V Σ Vᵀ, where A = Λ^{-1/2} Zᵀ Z Λ^{-1/2}.
4: Ỹ = √N Wᵀ Zᵀ, where W = Λ^{-1/2} V Σ^{-1/2}.
5: Initialize R_0 as a random rotation matrix.
6: for i = 1, ..., 50 do
7:   Y = sgn(R_{i-1} Ỹ).
8:   R_i = M M̃ᵀ, where Y Ỹᵀ = M Ω M̃ᵀ by SVD.
9: end for
10: Return the r-bit binary code y = sgn(R_50 Wᵀ z(x)), where z(x) is defined in Lemma 1.

Fig. 3. Precision of MVAGH over the code size as the number of returned examples varies (precision at 50, 200, 500, and 2000).

This observation leads us to propose a simple heuristic to increase the retrieval performance when the code size is large: the binary codes from MVAGH and locality-sensitive hashing (LSH) are concatenated. We compute the meaningful binary code from MVAGH by choosing the r_a largest eigenvectors, where r_a should be small (in practice r_a = 32 is sufficient). If we want an r_1-bit binary code with r_1 > r_a, we first compute Y_{r_a} = sgn(R Ỹ), where Ỹ is the transpose of the r_a largest eigenvectors of the multi-view anchor graph. The remaining (r_1 - r_a) bits are generated by LSH: Y_remain = sgn(M R Ỹ), where M is drawn from N(0, I). The final binary code is obtained by concatenation, yielding Y = [Y_{r_a}; Y_remain]. Since the r_a largest eigenvectors reveal the intrinsic data structure, we expect the bits generated by LSH to be meaningful.
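Steps 5–9 of Algorithm 1 (the ITQ refinement) together with the bit-concatenation heuristic just described can be sketched as follows (illustrative NumPy, not the authors' code; the LSH bits are generated from the rotated r_a-dimensional embedding, following Y_remain = sgn(M R Ỹ) above):

```python
import numpy as np

def itq_rotation(Y_tilde, iters=50, seed=0):
    """Steps 5-9 of Algorithm 1: alternate quantization and a Procrustes rotation."""
    r = Y_tilde.shape[0]
    rng = np.random.default_rng(seed)
    R, _ = np.linalg.qr(rng.standard_normal((r, r)))  # random orthogonal init
    for _ in range(iters):
        Y = np.sign(R @ Y_tilde)                      # fix R, quantize
        Y[Y == 0] = 1
        U, _, Vt = np.linalg.svd(Y @ Y_tilde.T)       # fix Y, solve Procrustes
        R = U @ Vt
    return R

def mvagh_plus_lsh(Y_tilde, r_total, r_a=32, seed=0):
    """Concatenate r_a rotated spectral bits with (r_total - r_a) LSH bits."""
    rng = np.random.default_rng(seed)
    E = Y_tilde[:r_a]                                 # r_a largest eigenvector rows
    R = itq_rotation(E)
    Y_a = np.sign(R @ E)                              # spectral (MVAGH) bits
    M = rng.standard_normal((r_total - r_a, r_a))     # M ~ N(0, I)
    Y_rem = np.sign(M @ (R @ E))                      # LSH bits on the embedding
    Y = np.vstack([Y_a, Y_rem])
    Y[Y == 0] = 1
    return Y
```

The Procrustes update R = U Vᵀ from the SVD of Y Ỹᵀ is the standard closed-form minimizer of ‖Y − R Ỹ‖_F over orthogonal R, matching step 8.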
We denote this heuristic MVAGH + LSH; Section 3 shows that it outperforms state-of-the-art methods over all bit lengths.

3. NUMERICAL EXPERIMENTS

We use CIFAR-10 [5], which contains 60,000 images with 10 labels. We form a query set by randomly choosing 1,000 images and construct a training set from the remaining images. All experiments are repeated five times to compute the mean and standard deviation for error bars. We use GIST [10] and HOG [1] descriptors to produce two views of each image. For GIST, we use Gabor filters with 8 orientations and 3 scales, leading to a 384-dimensional vector. For HOG,

we compute the image gradients of non-overlapping windows, where the orientation of the gradients is quantized into 8 bins and normalized with 4 different metrics; 36 4-by-4 non-overlapping windows yield a 1152-dimensional vector. We compare our proposed method (MVAGH + LSH) with recent multi-view hashing methods: multi-view hashing [6], CHMIS [17], and SU-MVSH [4] (Fig. 4(b)). We also compare our method with state-of-the-art single-view hashing methods: anchor graph hashing (AGH), iterative quantization (ITQ), and random maximum margin hashing (RMMH) (Fig. 4(c)). For the single-view methods, the multi-view data are concatenated into a single representation. As a performance measure, we use Hamming ranking [9, 13], where the rankings between a query and data points are decided by Hamming distance; ties in the Hamming distance are broken randomly. We calculate the precision at the top 500 examples. For the parameters of MVAGH, we select 500 anchor points for each view, s = 3, and r_a = 32. We use the Gaussian kernel exp(-‖x - y‖²/σ²), where σ is set to the median of the pairwise distances of data points. We choose the best parameters for the compared methods. Fig. 4 summarizes the comparison of MVAGH + LSH with various state-of-the-art hashing methods: (a) shows that multi-view hashing clearly improves the precision over a single descriptor, while (b) and (c) show that MVAGH + LSH outperforms the existing multi- and single-view hashing methods.

Fig. 4. Precision of MVAGH+LSH when a single feature (GIST or HOG) is used, or the two features are combined by the multi-view anchor graph (a); comparison of MVAGH+LSH with the multi-view methods CHMIS, MVH-CS, and SU-MVSH (b) and the single-view methods AGH, ITQ, and RMMH (c).

Fig. 5. Precision of MVAGH+LSH over the number of returned images (left), and the effect of the parameter r_a (right).

Fig.
5 shows that the performance of MVAGH+LSH does not degrade when the code length is large, and the effect of the parameter r_a: choosing r_a larger than 32 decreases the performance. Fig. 6 shows an example retrieval result: the leftmost image is a query and its 20 nearest neighbors are displayed, with incorrectly retrieved images marked by a red rectangle. We can see that the precision increases when the two features are used together, and that MVAGH is superior to the single-view hashing methods.

Fig. 6. Retrieval results on CIFAR-10. The leftmost image is a query, its 20 nearest images are displayed, and incorrect ones are marked by a red rectangle.

4. CONCLUSIONS

We have presented multi-view anchor graph hashing (MVAGH), where nonlinear integrated binary codes are determined by a subset of eigenvectors of an averaged similarity matrix. The underlying idea is a multi-view anchor graph that keeps the averaged similarity in a low-rank form, where the similarity between two points is measured by the two-step transition probability through view-specific anchor points. We have also presented a heuristic that improves the performance of MVAGH by combining it with LSH, demonstrating its high performance over existing methods.

Acknowledgments: This work was supported by the NIPA-MSRA Creative IT/SW Research Project, the ITRC Program (NIPA-2012-H), the POSTECH Rising Star Program, and the NRF WCU Program (R).

5. REFERENCES

[1] N. Dalal and B. Triggs, Histograms of oriented gradients for human detection, in Proceedings of the IEEE International Conference on Computer Vision and Pattern Recognition (CVPR), San Diego, CA, 2005.
[2] A. Gionis, P. Indyk, and R. Motwani, Similarity search in high dimensions via hashing, in Proceedings of the International Conference on Very Large Data Bases (VLDB), 1999.
[3] Y. Gong and S. Lazebnik, Iterative quantization: A Procrustean approach to learning binary codes, in CVPR, Colorado Springs, CO, 2011.
[4] S. Kim, Y. Kang, and S. Choi, Sequential spectral learning to hash with multiple representations, in Proceedings of the European Conference on Computer Vision (ECCV), Firenze, Italy, 2012.
[5] A. Krizhevsky and G. E. Hinton, Learning multiple layers of features from tiny images, Computer Science Department, University of Toronto, Tech. Rep., 2009.
[6] S. Kumar and R. Udupa, Learning hash functions for cross-view similarity search, in Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI), Barcelona, Spain, 2011.
[7] W. Liu, J. He, and S. F. Chang, Large graph construction for scalable semi-supervised learning, in Proceedings of the International Conference on Machine Learning (ICML), Haifa, Israel, 2010.
[8] W. Liu, J. Wang, R. Ji, Y. G. Jiang, and S. F. Chang, Supervised hashing with kernels, in CVPR, Providence, RI, 2012.
[9] W. Liu, J. Wang, S. Kumar, and S. F. Chang, Hashing with graphs, in ICML, Bellevue, WA, 2011.
[10] A. Oliva and A. Torralba, Modeling the shape of the scene: A holistic representation of the spatial envelope, International Journal of Computer Vision, vol. 42, no. 3, pp. 145-175, 2001.
[11] R. Salakhutdinov and G. Hinton, Semantic hashing, in Proceedings of the SIGIR Workshop on Information Retrieval and Applications of Graphical Models, 2007.
[12] A. Torralba, R. Fergus, and Y. Weiss, Small codes and large image databases for recognition, in CVPR, Anchorage, AK, 2008.
[13] J. Wang, S. Kumar, and S. F. Chang, Semi-supervised hashing for scalable image retrieval, in CVPR, San Francisco, CA, 2010.
[14] Y. Weiss, R. Fergus, and A. Torralba, Multidimensional spectral hashing, in ECCV, Firenze, Italy, 2012.
[15] Y. Weiss, A. Torralba, and R. Fergus, Spectral hashing, in Advances in Neural Information Processing Systems (NIPS), 2008.
[16] S. X. Yu and J. Shi, Multiclass spectral clustering, in Proceedings of the International Conference on Computer Vision (ICCV), 2003.
[17] D. Zhang, F. Wang, and L. Si, Composite hashing with multiple information sources, in Proceedings of the ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR), Beijing, China, 2011.
[18] Y. Zhen and D. Y. Yeung, A probabilistic model for multimodal hash function learning, in Proceedings of the ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD), Beijing, China, 2012.


More information

SLAM Summer School 2006 Practical 2: SLAM using Monocular Vision

SLAM Summer School 2006 Practical 2: SLAM using Monocular Vision SLAM Summer School 2006 Practcal 2: SLAM usng Monocular Vson Javer Cvera, Unversty of Zaragoza Andrew J. Davson, Imperal College London J.M.M Montel, Unversty of Zaragoza. josemar@unzar.es, jcvera@unzar.es,

More information

Three supervised learning methods on pen digits character recognition dataset

Three supervised learning methods on pen digits character recognition dataset Three supervsed learnng methods on pen dgts character recognton dataset Chrs Flezach Department of Computer Scence and Engneerng Unversty of Calforna, San Dego San Dego, CA 92093 cflezac@cs.ucsd.edu Satoru

More information

An Application of the Dulmage-Mendelsohn Decomposition to Sparse Null Space Bases of Full Row Rank Matrices

An Application of the Dulmage-Mendelsohn Decomposition to Sparse Null Space Bases of Full Row Rank Matrices Internatonal Mathematcal Forum, Vol 7, 2012, no 52, 2549-2554 An Applcaton of the Dulmage-Mendelsohn Decomposton to Sparse Null Space Bases of Full Row Rank Matrces Mostafa Khorramzadeh Department of Mathematcal

More information

Determining the Optimal Bandwidth Based on Multi-criterion Fusion

Determining the Optimal Bandwidth Based on Multi-criterion Fusion Proceedngs of 01 4th Internatonal Conference on Machne Learnng and Computng IPCSIT vol. 5 (01) (01) IACSIT Press, Sngapore Determnng the Optmal Bandwdth Based on Mult-crteron Fuson Ha-L Lang 1+, Xan-Mn

More information

Local Quaternary Patterns and Feature Local Quaternary Patterns

Local Quaternary Patterns and Feature Local Quaternary Patterns Local Quaternary Patterns and Feature Local Quaternary Patterns Jayu Gu and Chengjun Lu The Department of Computer Scence, New Jersey Insttute of Technology, Newark, NJ 0102, USA Abstract - Ths paper presents

More information

An Iterative Solution Approach to Process Plant Layout using Mixed Integer Optimisation

An Iterative Solution Approach to Process Plant Layout using Mixed Integer Optimisation 17 th European Symposum on Computer Aded Process Engneerng ESCAPE17 V. Plesu and P.S. Agach (Edtors) 2007 Elsever B.V. All rghts reserved. 1 An Iteratve Soluton Approach to Process Plant Layout usng Mxed

More information

Kent State University CS 4/ Design and Analysis of Algorithms. Dept. of Math & Computer Science LECT-16. Dynamic Programming

Kent State University CS 4/ Design and Analysis of Algorithms. Dept. of Math & Computer Science LECT-16. Dynamic Programming CS 4/560 Desgn and Analyss of Algorthms Kent State Unversty Dept. of Math & Computer Scence LECT-6 Dynamc Programmng 2 Dynamc Programmng Dynamc Programmng, lke the dvde-and-conquer method, solves problems

More information

An Image Fusion Approach Based on Segmentation Region

An Image Fusion Approach Based on Segmentation Region Rong Wang, L-Qun Gao, Shu Yang, Yu-Hua Cha, and Yan-Chun Lu An Image Fuson Approach Based On Segmentaton Regon An Image Fuson Approach Based on Segmentaton Regon Rong Wang, L-Qun Gao, Shu Yang 3, Yu-Hua

More information

Semi-Supervised Kernel Mean Shift Clustering

Semi-Supervised Kernel Mean Shift Clustering IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, VOL. XX, NO. XX, JANUARY XXXX 1 Sem-Supervsed Kernel Mean Shft Clusterng Saket Anand, Student Member, IEEE, Sushl Mttal, Member, IEEE, Oncel

More information

Shape Representation Robust to the Sketching Order Using Distance Map and Direction Histogram

Shape Representation Robust to the Sketching Order Using Distance Map and Direction Histogram Shape Representaton Robust to the Sketchng Order Usng Dstance Map and Drecton Hstogram Department of Computer Scence Yonse Unversty Kwon Yun CONTENTS Revew Topc Proposed Method System Overvew Sketch Normalzaton

More information

Hermite Splines in Lie Groups as Products of Geodesics

Hermite Splines in Lie Groups as Products of Geodesics Hermte Splnes n Le Groups as Products of Geodescs Ethan Eade Updated May 28, 2017 1 Introducton 1.1 Goal Ths document defnes a curve n the Le group G parametrzed by tme and by structural parameters n the

More information

A Fast Content-Based Multimedia Retrieval Technique Using Compressed Data

A Fast Content-Based Multimedia Retrieval Technique Using Compressed Data A Fast Content-Based Multmeda Retreval Technque Usng Compressed Data Borko Furht and Pornvt Saksobhavvat NSF Multmeda Laboratory Florda Atlantc Unversty, Boca Raton, Florda 3343 ABSTRACT In ths paper,

More information

CS 534: Computer Vision Model Fitting

CS 534: Computer Vision Model Fitting CS 534: Computer Vson Model Fttng Sprng 004 Ahmed Elgammal Dept of Computer Scence CS 534 Model Fttng - 1 Outlnes Model fttng s mportant Least-squares fttng Maxmum lkelhood estmaton MAP estmaton Robust

More information

Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering

Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering Out-of-Sample Extensons for LLE, Isomap, MDS, Egenmaps, and Spectral Clusterng Yoshua Bengo, Jean-Franços Paement, Pascal Vncent Olver Delalleau, Ncolas Le Roux and Mare Oumet Département d Informatque

More information

Term Weighting Classification System Using the Chi-square Statistic for the Classification Subtask at NTCIR-6 Patent Retrieval Task

Term Weighting Classification System Using the Chi-square Statistic for the Classification Subtask at NTCIR-6 Patent Retrieval Task Proceedngs of NTCIR-6 Workshop Meetng, May 15-18, 2007, Tokyo, Japan Term Weghtng Classfcaton System Usng the Ch-square Statstc for the Classfcaton Subtask at NTCIR-6 Patent Retreval Task Kotaro Hashmoto

More information

NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS

NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS ARPN Journal of Engneerng and Appled Scences 006-017 Asan Research Publshng Network (ARPN). All rghts reserved. NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS Igor Grgoryev, Svetlana

More information

Structure from Motion

Structure from Motion Structure from Moton Structure from Moton For now, statc scene and movng camera Equvalentl, rgdl movng scene and statc camera Lmtng case of stereo wth man cameras Lmtng case of multvew camera calbraton

More information

The Research of Support Vector Machine in Agricultural Data Classification

The Research of Support Vector Machine in Agricultural Data Classification The Research of Support Vector Machne n Agrcultural Data Classfcaton Le Sh, Qguo Duan, Xnmng Ma, Me Weng College of Informaton and Management Scence, HeNan Agrcultural Unversty, Zhengzhou 45000 Chna Zhengzhou

More information

Unsupervised Learning and Clustering

Unsupervised Learning and Clustering Unsupervsed Learnng and Clusterng Supervsed vs. Unsupervsed Learnng Up to now we consdered supervsed learnng scenaro, where we are gven 1. samples 1,, n 2. class labels for all samples 1,, n Ths s also

More information

FEATURE EXTRACTION. Dr. K.Vijayarekha. Associate Dean School of Electrical and Electronics Engineering SASTRA University, Thanjavur

FEATURE EXTRACTION. Dr. K.Vijayarekha. Associate Dean School of Electrical and Electronics Engineering SASTRA University, Thanjavur FEATURE EXTRACTION Dr. K.Vjayarekha Assocate Dean School of Electrcal and Electroncs Engneerng SASTRA Unversty, Thanjavur613 41 Jont Intatve of IITs and IISc Funded by MHRD Page 1 of 8 Table of Contents

More information

Simplification of 3D Meshes

Simplification of 3D Meshes Smplfcaton of 3D Meshes Addy Ngan /4/00 Outlne Motvaton Taxonomy of smplfcaton methods Hoppe et al, Mesh optmzaton Hoppe, Progressve meshes Smplfcaton of 3D Meshes 1 Motvaton Hgh detaled meshes becomng

More information

Parallel Numerics. 1 Preconditioning & Iterative Solvers (From 2016)

Parallel Numerics. 1 Preconditioning & Iterative Solvers (From 2016) Technsche Unverstät München WSe 6/7 Insttut für Informatk Prof. Dr. Thomas Huckle Dpl.-Math. Benjamn Uekermann Parallel Numercs Exercse : Prevous Exam Questons Precondtonng & Iteratve Solvers (From 6)

More information

Classifier Selection Based on Data Complexity Measures *

Classifier Selection Based on Data Complexity Measures * Classfer Selecton Based on Data Complexty Measures * Edth Hernández-Reyes, J.A. Carrasco-Ochoa, and J.Fco. Martínez-Trndad Natonal Insttute for Astrophyscs, Optcs and Electroncs, Lus Enrque Erro No.1 Sta.

More information

Lecture 4: Principal components

Lecture 4: Principal components /3/6 Lecture 4: Prncpal components 3..6 Multvarate lnear regresson MLR s optmal for the estmaton data...but poor for handlng collnear data Covarance matrx s not nvertble (large condton number) Robustness

More information

Semi-Supervised Discriminant Analysis Based On Data Structure

Semi-Supervised Discriminant Analysis Based On Data Structure IOSR Journal of Computer Engneerng (IOSR-JCE) e-issn: 2278-0661,p-ISSN: 2278-8727, Volume 17, Issue 3, Ver. VII (May Jun. 2015), PP 39-46 www.osrournals.org Sem-Supervsed Dscrmnant Analyss Based On Data

More information

Kernel Collaborative Representation Classification Based on Adaptive Dictionary Learning

Kernel Collaborative Representation Classification Based on Adaptive Dictionary Learning Internatonal Journal of Intellgent Informaton Systems 2018; 7(2): 15-22 http://www.scencepublshnggroup.com/j/js do: 10.11648/j.js.20180702.11 ISSN: 2328-7675 (Prnt); ISSN: 2328-7683 (Onlne) Kernel Collaboratve

More information

A PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION

A PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION 1 THE PUBLISHING HOUSE PROCEEDINGS OF THE ROMANIAN ACADEMY, Seres A, OF THE ROMANIAN ACADEMY Volume 4, Number 2/2003, pp.000-000 A PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION Tudor BARBU Insttute

More information

Outline. Self-Organizing Maps (SOM) US Hebbian Learning, Cntd. The learning rule is Hebbian like:

Outline. Self-Organizing Maps (SOM) US Hebbian Learning, Cntd. The learning rule is Hebbian like: Self-Organzng Maps (SOM) Turgay İBRİKÇİ, PhD. Outlne Introducton Structures of SOM SOM Archtecture Neghborhoods SOM Algorthm Examples Summary 1 2 Unsupervsed Hebban Learnng US Hebban Learnng, Cntd 3 A

More information

A Unified Framework for Semantics and Feature Based Relevance Feedback in Image Retrieval Systems

A Unified Framework for Semantics and Feature Based Relevance Feedback in Image Retrieval Systems A Unfed Framework for Semantcs and Feature Based Relevance Feedback n Image Retreval Systems Ye Lu *, Chunhu Hu 2, Xngquan Zhu 3*, HongJang Zhang 2, Qang Yang * School of Computng Scence Smon Fraser Unversty

More information

Range images. Range image registration. Examples of sampling patterns. Range images and range surfaces

Range images. Range image registration. Examples of sampling patterns. Range images and range surfaces Range mages For many structured lght scanners, the range data forms a hghly regular pattern known as a range mage. he samplng pattern s determned by the specfc scanner. Range mage regstraton 1 Examples

More information

An Optimal Algorithm for Prufer Codes *

An Optimal Algorithm for Prufer Codes * J. Software Engneerng & Applcatons, 2009, 2: 111-115 do:10.4236/jsea.2009.22016 Publshed Onlne July 2009 (www.scrp.org/journal/jsea) An Optmal Algorthm for Prufer Codes * Xaodong Wang 1, 2, Le Wang 3,

More information

Human Face Recognition Using Generalized. Kernel Fisher Discriminant

Human Face Recognition Using Generalized. Kernel Fisher Discriminant Human Face Recognton Usng Generalzed Kernel Fsher Dscrmnant ng-yu Sun,2 De-Shuang Huang Ln Guo. Insttute of Intellgent Machnes, Chnese Academy of Scences, P.O.ox 30, Hefe, Anhu, Chna. 2. Department of

More information

Lobachevsky State University of Nizhni Novgorod. Polyhedron. Quick Start Guide

Lobachevsky State University of Nizhni Novgorod. Polyhedron. Quick Start Guide Lobachevsky State Unversty of Nzhn Novgorod Polyhedron Quck Start Gude Nzhn Novgorod 2016 Contents Specfcaton of Polyhedron software... 3 Theoretcal background... 4 1. Interface of Polyhedron... 6 1.1.

More information

EXTENDED BIC CRITERION FOR MODEL SELECTION

EXTENDED BIC CRITERION FOR MODEL SELECTION IDIAP RESEARCH REPORT EXTEDED BIC CRITERIO FOR ODEL SELECTIO Itshak Lapdot Andrew orrs IDIAP-RR-0-4 Dalle olle Insttute for Perceptual Artfcal Intellgence P.O.Box 59 artgny Valas Swtzerland phone +4 7

More information

Robust visual tracking based on Informative random fern

Robust visual tracking based on Informative random fern 5th Internatonal Conference on Computer Scences and Automaton Engneerng (ICCSAE 205) Robust vsual trackng based on Informatve random fern Hao Dong, a, Ru Wang, b School of Instrumentaton Scence and Opto-electroncs

More information

Recognizing Faces. Outline

Recognizing Faces. Outline Recognzng Faces Drk Colbry Outlne Introducton and Motvaton Defnng a feature vector Prncpal Component Analyss Lnear Dscrmnate Analyss !"" #$""% http://www.nfotech.oulu.f/annual/2004 + &'()*) '+)* 2 ! &

More information

What Is the Most Efficient Way to Select Nearest Neighbor Candidates for Fast Approximate Nearest Neighbor Search?

What Is the Most Efficient Way to Select Nearest Neighbor Candidates for Fast Approximate Nearest Neighbor Search? IEEE Internatonal Conference on Computer Vson What Is the Most Effcent Way to Select Nearest Neghbor Canddates for Fast Approxmate Nearest Neghbor Search? Masakazu Iwamura, Tomokazu Sato and Koch Kse Graduate

More information

U.C. Berkeley CS294: Beyond Worst-Case Analysis Handout 5 Luca Trevisan September 7, 2017

U.C. Berkeley CS294: Beyond Worst-Case Analysis Handout 5 Luca Trevisan September 7, 2017 U.C. Bereley CS294: Beyond Worst-Case Analyss Handout 5 Luca Trevsan September 7, 207 Scrbed by Haars Khan Last modfed 0/3/207 Lecture 5 In whch we study the SDP relaxaton of Max Cut n random graphs. Quc

More information

Problem Set 3 Solutions

Problem Set 3 Solutions Introducton to Algorthms October 4, 2002 Massachusetts Insttute of Technology 6046J/18410J Professors Erk Demane and Shaf Goldwasser Handout 14 Problem Set 3 Solutons (Exercses were not to be turned n,

More information

Parallel matrix-vector multiplication

Parallel matrix-vector multiplication Appendx A Parallel matrx-vector multplcaton The reduced transton matrx of the three-dmensonal cage model for gel electrophoress, descrbed n secton 3.2, becomes excessvely large for polymer lengths more

More information

Sum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints

Sum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints Australan Journal of Basc and Appled Scences, 2(4): 1204-1208, 2008 ISSN 1991-8178 Sum of Lnear and Fractonal Multobjectve Programmng Problem under Fuzzy Rules Constrants 1 2 Sanjay Jan and Kalash Lachhwan

More information

An Improved Spectral Clustering Algorithm Based on Local Neighbors in Kernel Space 1

An Improved Spectral Clustering Algorithm Based on Local Neighbors in Kernel Space 1 DOI: 10.98/CSIS110415064L An Improved Spectral Clusterng Algorthm Based on Local Neghbors n Kernel Space 1 Xnyue Lu 1,, Xng Yong and Hongfe Ln 1 1 School of Computer Scence and Technology, Dalan Unversty

More information

Heterogeneous Visual Features Fusion via Sparse Multimodal Machine

Heterogeneous Visual Features Fusion via Sparse Multimodal Machine 013 IEEE Conference on Computer Vson and Pattern Recognton Heterogeneous Vsual Features Fuson va Sparse Multmodal Machne Hua ang, Fepng Ne, Heng Huang, Chrs Dng Department of Electrcal Engneerng and Computer

More information

Announcements. Supervised Learning

Announcements. Supervised Learning Announcements See Chapter 5 of Duda, Hart, and Stork. Tutoral by Burge lnked to on web page. Supervsed Learnng Classfcaton wth labeled eamples. Images vectors n hgh-d space. Supervsed Learnng Labeled eamples

More information

Two-Dimensional Supervised Discriminant Projection Method For Feature Extraction

Two-Dimensional Supervised Discriminant Projection Method For Feature Extraction Appl. Math. Inf. c. 6 No. pp. 8-85 (0) Appled Mathematcs & Informaton cences An Internatonal Journal @ 0 NP Natural cences Publshng Cor. wo-dmensonal upervsed Dscrmnant Proecton Method For Feature Extracton

More information

Smoothing Spline ANOVA for variable screening

Smoothing Spline ANOVA for variable screening Smoothng Splne ANOVA for varable screenng a useful tool for metamodels tranng and mult-objectve optmzaton L. Rcco, E. Rgon, A. Turco Outlne RSM Introducton Possble couplng Test case MOO MOO wth Game Theory

More information

CS246: Mining Massive Datasets Jure Leskovec, Stanford University

CS246: Mining Massive Datasets Jure Leskovec, Stanford University CS46: Mnng Massve Datasets Jure Leskovec, Stanford Unversty http://cs46.stanford.edu /19/013 Jure Leskovec, Stanford CS46: Mnng Massve Datasets, http://cs46.stanford.edu Perceptron: y = sgn( x Ho to fnd

More information

Multi-view Clustering with Adaptively Learned Graph

Multi-view Clustering with Adaptively Learned Graph Proceedngs of Machne Learnng Research 77:113 128, 217 ACML 217 Mult-vew Clusterng wth Adaptvely Learned Graph Hong Tao taohong.nudt@hotmal.com Chenpng Hou hcpnudt@hotmal.com Jubo Zhu ju bo zhu@alyun.com

More information

Active Contours/Snakes

Active Contours/Snakes Actve Contours/Snakes Erkut Erdem Acknowledgement: The sldes are adapted from the sldes prepared by K. Grauman of Unversty of Texas at Austn Fttng: Edges vs. boundares Edges useful sgnal to ndcate occludng

More information

Hierarchical Image Retrieval by Multi-Feature Fusion

Hierarchical Image Retrieval by Multi-Feature Fusion Preprnts (www.preprnts.org) NOT PEER-REVIEWED Posted: 26 Aprl 207 do:0.20944/preprnts20704.074.v Artcle Herarchcal Image Retreval by Mult- Fuson Xaojun Lu, Jaojuan Wang,Yngq Hou, Me Yang, Q Wang* and Xangde

More information

Mercer Kernels for Object Recognition with Local Features

Mercer Kernels for Object Recognition with Local Features TR004-50, October 004, Department of Computer Scence, Dartmouth College Mercer Kernels for Object Recognton wth Local Features Swe Lyu Department of Computer Scence Dartmouth College Hanover NH 03755 A

More information

A Binarization Algorithm specialized on Document Images and Photos

A Binarization Algorithm specialized on Document Images and Photos A Bnarzaton Algorthm specalzed on Document mages and Photos Ergna Kavalleratou Dept. of nformaton and Communcaton Systems Engneerng Unversty of the Aegean kavalleratou@aegean.gr Abstract n ths paper, a

More information

Graph-based Clustering

Graph-based Clustering Graphbased Clusterng Transform the data nto a graph representaton ertces are the data ponts to be clustered Edges are eghted based on smlarty beteen data ponts Graph parttonng Þ Each connected component

More information

Positive Semi-definite Programming Localization in Wireless Sensor Networks

Positive Semi-definite Programming Localization in Wireless Sensor Networks Postve Sem-defnte Programmng Localzaton n Wreless Sensor etworks Shengdong Xe 1,, Jn Wang, Aqun Hu 1, Yunl Gu, Jang Xu, 1 School of Informaton Scence and Engneerng, Southeast Unversty, 10096, anjng Computer

More information

Machine Learning. K-means Algorithm

Machine Learning. K-means Algorithm Macne Learnng CS 6375 --- Sprng 2015 Gaussan Mture Model GMM pectaton Mamzaton M Acknowledgement: some sldes adopted from Crstoper Bsop Vncent Ng. 1 K-means Algortm Specal case of M Goal: represent a data

More information

PERFORMANCE EVALUATION FOR SCENE MATCHING ALGORITHMS BY SVM

PERFORMANCE EVALUATION FOR SCENE MATCHING ALGORITHMS BY SVM PERFORMACE EVALUAIO FOR SCEE MACHIG ALGORIHMS BY SVM Zhaohu Yang a, b, *, Yngyng Chen a, Shaomng Zhang a a he Research Center of Remote Sensng and Geomatc, ongj Unversty, Shangha 200092, Chna - yzhac@63.com

More information

A Bilinear Model for Sparse Coding

A Bilinear Model for Sparse Coding A Blnear Model for Sparse Codng Davd B. Grmes and Rajesh P. N. Rao Department of Computer Scence and Engneerng Unversty of Washngton Seattle, WA 98195-2350, U.S.A. grmes,rao @cs.washngton.edu Abstract

More information

Mathematics 256 a course in differential equations for engineering students

Mathematics 256 a course in differential equations for engineering students Mathematcs 56 a course n dfferental equatons for engneerng students Chapter 5. More effcent methods of numercal soluton Euler s method s qute neffcent. Because the error s essentally proportonal to the

More information

An Ensemble Learning algorithm for Blind Signal Separation Problem

An Ensemble Learning algorithm for Blind Signal Separation Problem An Ensemble Learnng algorthm for Blnd Sgnal Separaton Problem Yan L 1 and Peng Wen 1 Department of Mathematcs and Computng, Faculty of Engneerng and Surveyng The Unversty of Southern Queensland, Queensland,

More information

An Evaluation of Divide-and-Combine Strategies for Image Categorization by Multi-Class Support Vector Machines

An Evaluation of Divide-and-Combine Strategies for Image Categorization by Multi-Class Support Vector Machines An Evaluaton of Dvde-and-Combne Strateges for Image Categorzaton by Mult-Class Support Vector Machnes C. Demrkesen¹ and H. Cherf¹, ² 1: Insttue of Scence and Engneerng 2: Faculté des Scences Mrande Galatasaray

More information

Support Vector Machines. CS534 - Machine Learning

Support Vector Machines. CS534 - Machine Learning Support Vector Machnes CS534 - Machne Learnng Perceptron Revsted: Lnear Separators Bnar classfcaton can be veed as the task of separatng classes n feature space: b > 0 b 0 b < 0 f() sgn( b) Lnear Separators

More information

Fast Sparse Gaussian Processes Learning for Man-Made Structure Classification

Fast Sparse Gaussian Processes Learning for Man-Made Structure Classification Fast Sparse Gaussan Processes Learnng for Man-Made Structure Classfcaton Hang Zhou Insttute for Vson Systems Engneerng, Dept Elec. & Comp. Syst. Eng. PO Box 35, Monash Unversty, Clayton, VIC 3800, Australa

More information

Real-time Joint Tracking of a Hand Manipulating an Object from RGB-D Input

Real-time Joint Tracking of a Hand Manipulating an Object from RGB-D Input Real-tme Jont Tracng of a Hand Manpulatng an Object from RGB-D Input Srnath Srdhar 1 Franzsa Mueller 1 Mchael Zollhöfer 1 Dan Casas 1 Antt Oulasvrta 2 Chrstan Theobalt 1 1 Max Planc Insttute for Informatcs

More information