Mercer Kernels for Object Recognition with Local Features


TR2004-520, October 2004, Department of Computer Science, Dartmouth College

Mercer Kernels for Object Recognition with Local Features

Siwei Lyu
Department of Computer Science, Dartmouth College, Hanover, NH

A new class of kernels for object recognition based on local image feature representations is introduced in this paper. Formal proofs are given to show that these kernels satisfy the Mercer condition. In addition, multiple types of local features and semilocal constraints are incorporated. Experimental results of SVM classifiers coupled with the proposed kernels are reported on recognition tasks with the COIL-100 database and compared with existing methods. The proposed kernels achieved competitive performance and were robust to changes in object configurations and image degradations.

Correspondence should be addressed to S. Lyu, Sudikoff Laboratory, Department of Computer Science, Dartmouth College, Hanover, NH; email: lsw@cs.dartmouth.edu.

1. Introduction

Kernel methods originally received attention as a trick to introduce non-linearity into support vector machines (SVM) [21]. Evaluating a kernel function between two data points is equivalent to computing the scalar product of their images in a non-linearly mapped space (usually termed the feature space). It was later realized that kernel methods are more general. Similar to SVM, many linear algorithms (e.g., PCA and Fisher linear discriminant) depend on the data only through scalar products. By substituting the scalar products with kernel evaluations, these algorithms can discover non-linear patterns in data. At the same time, they remain computationally efficient, as the kernel function is evaluated in the input space [20]. Instead of using general-purpose kernels (e.g., Gaussians), recent effort has been put into designing kernels tailored to the requirements of a specific application. Such kernels better reflect the similarities between data and thus incorporate more domain knowledge into the algorithm.

One important application of kernel methods is appearance-based object recognition. Object recognition remains one of the most challenging problems in computer vision. Changes in illumination, pose, viewing angle, occlusion, clutter and non-rigid deformations are just a few of the complications a recognition system has to face. Many applications of kernel methods to object recognition are based on global image features (e.g., global grayvalue histograms) [4, 14, 15]. Though promising performance has been reported, these methods are plagued by the deficiencies of global features, such as sensitivity to image degradations (e.g., noise, occlusion and background clutter) and lack of robustness under changes in object configuration (e.g., translation and scaling).

Recent years have seen impressive developments in using local features computed at interest points for matching and recognition [9, 17, 16, 10, 2]. Such approaches lead to robust and compact image representations that lend themselves to powerful pattern analysis algorithms. However, local feature representations pose several challenges to kernel design. First, the kernel must work efficiently on inputs of variable lengths, as different images may have different numbers of local features. Second, the kernel should measure the similarity of two unordered sets of local features, where no explicit correspondence is available. Furthermore, several different types of local features are usually collected, and they need to be fused into the kernel. For better performance, semilocal spatial and geometrical constraints between interest points should also be incorporated. Finally, to guarantee a unique globally optimal solution for the SVM algorithm, the kernel must satisfy the Mercer condition. Unfortunately, existing methods (e.g., [1, 22, 23, 12, 8]) are not satisfactory in that they do not meet all of these requirements.

The major contribution of this paper is the definition of a new class of kernels for object recognition based on local feature representations. Formal proofs are given to show that this class of kernels satisfies the Mercer condition and reflects similarities between sets of local features. In addition, multiple local feature types and semilocal constraints are incorporated to reduce mismatches between local features and thus further improve classification performance. Results are reported for the proposed kernels, coupled with SVM classification, on recognition tasks with the COIL-100 database.
2. Methods

In this section, after a brief review of Mercer kernels and local features, the proposed kernel is described and compared with previous approaches. Then kernels using multiple types of local features and semilocal constraints are introduced, followed by an algorithm summarizing the overall process.

2.1. Mercer kernels

Admissible kernel functions satisfy the Mercer condition (and hence are usually termed Mercer kernels). For an input space X, if there is a mapping φ: X → H that maps any x, z ∈ X into a Hilbert space H, then a Mercer kernel K: X × X → R is constructed as K(x, z) = ⟨φ(x), φ(z)⟩_H, where ⟨·,·⟩_H is the scalar product operator in H. Such a function K satisfies the Mercer condition, an equivalent description of which is stated formally in the following proposition.

Proposition 1 ([20]) Let X be any input space and K: X × X → R a symmetric function. K is a Mercer kernel if and only if the kernel matrix formed by restricting K to any finite subset of X is positive semi-definite (i.e., has no negative eigenvalues).

The Mercer condition is essential to kernel design, as it is the key requirement for a unique globally optimal solution to the kernel-extended pattern analysis algorithms based on convex optimization (e.g., SVM) [20]. Instead of using this definition directly, a Mercer kernel is usually constructed in other, more convenient ways. For data in a vector space, one can choose from standard off-the-shelf kernels, a common choice being the Gaussian:

K_R(x, z) = exp( −‖x − z‖² / (2σ²) ).   (1)

There are also Mercer kernels designed specifically for structured data types, such as strings, trees or graphs [20].
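As a concrete illustration of Proposition 1 and the Gaussian kernel of Equation (1), the short sketch below (Python with NumPy; not part of the original paper, and the function names are ours) builds a kernel matrix on a random finite subset of R^d and checks that its eigenvalues are non-negative, which is exactly the finite-subset characterization of the Mercer condition.

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    """Gaussian (RBF) kernel of Equation (1)."""
    return np.exp(-np.sum((x - z) ** 2) / (2.0 * sigma ** 2))

def gram_matrix(kernel, xs):
    """Kernel matrix obtained by restricting `kernel` to the finite subset `xs`."""
    n = len(xs)
    K = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            K[i, j] = kernel(xs[i], xs[j])
    return K

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    xs = [rng.normal(size=5) for _ in range(20)]   # a finite subset of R^5
    K = gram_matrix(gaussian_kernel, xs)
    # Proposition 1: for a Mercer kernel the smallest eigenvalue is non-negative
    # (up to numerical round-off) on any finite subset.
    print("smallest eigenvalue:", np.linalg.eigvalsh(K).min())
```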

In addition, new Mercer kernels can be built from existing ones. Several properties of Mercer kernels relevant to this construction are summarized in the following proposition.

Proposition 2 For Mercer kernels, the following facts hold:
(i) ([20]) The product of two Mercer kernels is a Mercer kernel. Thus, a monomial of any degree of a Mercer kernel is a Mercer kernel.
(ii) ([7]) Let K be a Mercer kernel defined on X × X. For any finite A, B ⊆ X, define K̄(A, B) = Σ_{x∈A} Σ_{y∈B} K(x, y). Then K̄ is a Mercer kernel on 2^X × 2^X \ {∅}.
(iii) ([20]) For a data space X that can be decomposed as X = X_1 × ··· × X_N, with x = (x_1, ..., x_N) ∈ X where x_i ∈ X_i, let K_i be Mercer kernels on X_i × X_i, i = 1, ..., N. Then for x, z ∈ X, K(x, z) = Π_{i=1}^N K_i(x_i, z_i) is a Mercer kernel on X × X. Such a kernel is a special case of the R-convolution kernel [7].

Besides satisfying the Mercer condition, many applications also require the designed kernel to reflect similarities between the data being studied. As kernels are elicited from scalar products, they are expected to have larger values for data that are more similar to each other. (Strictly speaking, a normalized kernel evaluates the cosine similarity of the two mapped data points in the feature space.)
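To make Proposition 2 concrete, the sketch below (our own illustration, not from the paper) builds two derived kernels from the Gaussian base kernel — a product kernel over a decomposed input space (part iii) and a cross-sum kernel over finite sets (part ii) — and checks positive semi-definiteness empirically. It reuses the hypothetical helpers `gaussian_kernel` and `gram_matrix` from the previous sketch.

```python
import numpy as np
# assumes gaussian_kernel and gram_matrix from the previous sketch

def product_kernel(x, z, sigma=1.0):
    """Proposition 2(iii): product of Mercer kernels over components (here, the two halves of the vector)."""
    d = len(x) // 2
    return gaussian_kernel(x[:d], z[:d], sigma) * gaussian_kernel(x[d:], z[d:], sigma)

def cross_sum_kernel(A, B, sigma=1.0):
    """Proposition 2(ii): sum of a Mercer kernel over all pairs drawn from two finite sets."""
    return sum(gaussian_kernel(x, y, sigma) for x in A for y in B)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    vecs = [rng.normal(size=6) for _ in range(15)]
    sets = [[rng.normal(size=6) for _ in range(rng.integers(2, 6))] for _ in range(15)]
    for kern, data in [(product_kernel, vecs), (cross_sum_kernel, sets)]:
        K = gram_matrix(kern, data)
        # should be non-negative up to round-off for both derived kernels
        print(kern.__name__, "min eigenvalue:", np.linalg.eigvalsh(K).min())
```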
2.2. Local feature representation

Local features are localized descriptors that provide distinctive information about a specific location in an image. Many local features (e.g., [9, 17, 16, 10, 2]) are designed to be invariant under certain image transformations, such as rotation and scaling, so that they are relatively stable under changes in object configuration. Local features have proved very successful in appearance-based object matching and recognition, as they are distinctive, robust to image degradation and transformation, and require no segmentation. Local features are usually collected at, or in the neighboring region around, interest points: specific positions in an image that carry distinctive information about the object being studied. Interest points are found by an interest point detector; popular choices are the Harris detector [6] and multi-resolution based detectors [19]. In this paper, we denote by p_i = (x_i, y_i) the coordinates (in the image plane) of the i-th interest point detected in an image, and by the vector F_i the local feature computed at or around p_i. An image I_a is then represented by the set of local features corresponding to all detected interest points, denoted F_a = {F_1^(a), ..., F_{|F_a|}^(a)}.
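The local feature set representation can be made concrete with a few lines of code. The sketch below uses OpenCV's SIFT detector/descriptor purely as a convenient stand-in (the paper itself uses a Harris detector with local jets, local histograms and phase-based features); it only shows how an image becomes an unordered set of interest-point coordinates p_i and descriptor vectors F_i. The image file names are hypothetical.

```python
import cv2          # OpenCV, used here only as a stand-in detector/descriptor
import numpy as np

def local_feature_set(image_path):
    """Represent an image I_a as interest point coordinates p_i and local feature vectors F_i."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    sift = cv2.SIFT_create()                         # stand-in for Harris points + jets/histograms/phases
    keypoints, descriptors = sift.detectAndCompute(img, None)
    points = np.array([kp.pt for kp in keypoints])   # p_i = (x_i, y_i)
    return points, descriptors                       # F_a = {F_1, ..., F_|F_a|}, one descriptor per row

# Example: feature sets of two images; the number of features generally differs between images.
# points_a, F_a = local_feature_set("object_view_a.png")
# points_b, F_b = local_feature_set("object_view_b.png")
```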

2.3. Related work

With the local feature representation, an image is concisely represented by its set of local feature vectors. Accordingly, kernels that match images can be defined between two sets of local feature vectors. We start by enumerating some desirable properties of such a kernel:
- The kernel should satisfy the Mercer condition;
- The computation of the kernel should be efficient in both time and space;
- The kernel should be able to handle inputs of variable lengths, as the number of interest points may vary across images;
- The kernel should reflect the similarity between two sets of local feature vectors.

It should be noted that the local feature representation does not provide correspondences between the local features of two images, while only correctly matched local features carry meaningful discriminant information. Finding the optimal matching of local features is not always feasible in practice, and many matching algorithms are based on heuristics. One assumption common to most of them is that correctly matched local features are more similar to each other than mismatched ones.

These properties preclude the use of off-the-shelf kernels, such as a Gaussian, as the underlying data (sets of vectors) do not come from a vector space. One can normalize the length of the inputs by padding zeros; while the inputs can then be treated as vectors, the computed quantity is of little interest for recognition. Notice, however, that it is relatively easy to build a kernel K_F on the local features themselves, as they are vectors of identical dimension. A natural idea is therefore to construct, on the basis of such a kernel, composite kernels that work with sets of local features.

One simple example of such an approach is the summation kernel. On the two local feature sets F_a = {F_1^(a), ..., F_{|F_a|}^(a)} and F_b = {F_1^(b), ..., F_{|F_b|}^(b)} of two images I_a and I_b, the summation kernel is defined as

K_S(F_a, F_b) = (1 / (|F_a| |F_b|)) Σ_{i=1}^{|F_a|} Σ_{j=1}^{|F_b|} K_F(F_i^(a), F_j^(b)).   (3)

A simple application of Proposition 2, part (ii), shows that the summation kernel satisfies the Mercer condition. However, its discriminative ability is compromised by the fact that all possible matchings between local features are combined with equal weight; the good matchings, highly outnumbered, can easily be swamped by the bad ones.

In [22], a kernel function based on matching local features was proposed:

K_M(F_a, F_b) = (1/2) [ (1/|F_a|) Σ_{i=1}^{|F_a|} max_{j=1,...,|F_b|} K_F(F_i^(a), F_j^(b)) + (1/|F_b|) Σ_{j=1}^{|F_b|} max_{i=1,...,|F_a|} K_F(F_j^(b), F_i^(a)) ].   (4)

The function K_M has the desired property of reflecting the similarity of two sets of local feature vectors, as it only considers the similarities of the best matched local features. Unfortunately, despite the claim in [22], K_M is not a Mercer kernel; a detailed proof is given in Appendix A. In [1], a similar non-Mercer kernel based on a sub-optimal matching between local features is used, with measures provided so that the probability of the kernel not being positive semi-definite is bounded. However, since the Mercer condition is essential to reliable recognition, Mercer kernels are still preferable in practice. In [23], a Mercer kernel for sets of vectors is proposed based on the concept of principal angles between two linear subspaces; however, this kernel showed poor recognition performance, as reported in [5]. In [8], the Bhattacharyya kernel is introduced, where a set of vectors is represented by a multivariate Gaussian. Though provably satisfying the Mercer condition, evaluating this kernel is cubic in the number of local features; furthermore, good matchings do not necessarily distinguish themselves in such a setting. In [12], a kernel based on the Kullback-Leibler divergence is proposed; however, as the authors point out, it is not clear whether such a kernel satisfies the Mercer condition.
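For reference, here is a sketch (our own, with hypothetical function names) of the summation kernel of Equation (3) and the matching kernel of Equation (4), written on top of any base kernel K_F between individual descriptors; it assumes feature sets are given as NumPy arrays with one descriptor per row.

```python
import numpy as np

def base_kernel_matrix(K_F, Fa, Fb):
    """Evaluate the local-feature kernel K_F for every pair (F_i^(a), F_j^(b))."""
    return np.array([[K_F(fi, fj) for fj in Fb] for fi in Fa])

def summation_kernel(K_F, Fa, Fb):
    """Equation (3): average of K_F over all pairs of local features."""
    M = base_kernel_matrix(K_F, Fa, Fb)
    return M.mean()                      # equals the double sum divided by |F_a| |F_b|

def matching_kernel(K_F, Fa, Fb):
    """Equation (4): the matching kernel of [22]; note that it is NOT a Mercer kernel (Appendix A)."""
    M = base_kernel_matrix(K_F, Fa, Fb)
    return 0.5 * (M.max(axis=1).mean() + M.max(axis=0).mean())
```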
2.4. A Mercer kernel between local feature sets

As discussed earlier, only correctly matched local features with large similarity values provide meaningful discriminant information for recognition. This indicates that such matched pairs should dominate the kernel evaluation if we expect the kernel to measure the similarity between two sets of local feature vectors. However, directly summing the maximum similarities, as in K_M, results in inadmissible kernels that violate the Mercer condition. In this paper, a new class of kernels is proposed that measure the similarity between local feature sets and that provably satisfy the Mercer condition. The proposed kernel function, denoted K_ℱ, is defined as

K_ℱ(F_a, F_b) = (1 / (|F_a| |F_b|)) Σ_{i=1}^{|F_a|} Σ_{j=1}^{|F_b|} [K_F(F_i^(a), F_j^(b))]^p,   (5)

where the integer p is the kernel parameter. With p = 1, the proposed kernel includes the summation kernel as a special case. As in the summation kernel, all possible matchings between local features in the two sets are considered in K_ℱ, but with different weights; it is through the kernel parameter p that the correctly matched local features are given dominant weight. This is made clearer if K_ℱ is rewritten as

K_ℱ(F_a, F_b) = (1 / (2 |F_a| |F_b|)) [ Σ_{i=1}^{|F_a|} Σ_{j=1}^{|F_b|} [K_F(F_i^(a), F_j^(b))]^p + Σ_{j=1}^{|F_b|} Σ_{i=1}^{|F_a|} [K_F(F_j^(b), F_i^(a))]^p ].   (6)
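A direct implementation of the proposed kernel in Equation (5) changes only one line relative to the summation kernel: each pairwise evaluation is raised to the power p before averaging. The minimal sketch below uses our own hypothetical names and reuses `base_kernel_matrix` from the previous sketch.

```python
import numpy as np
# assumes base_kernel_matrix from the previous sketch

def set_kernel(K_F, Fa, Fb, p=9):
    """Equation (5): K_F-set kernel, (1/(|F_a||F_b|)) * sum_{i,j} [K_F(F_i^a, F_j^b)]^p."""
    M = base_kernel_matrix(K_F, Fa, Fb)
    return np.mean(M ** p)

# With p = 1 this reduces to the summation kernel of Equation (3).
```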

Now K_ℱ has a form similar to that of K_M: both are sums of similarity measures over the local feature vectors; only the max function in K_M is replaced here by a summation of monomials. Consider a local feature F_i^(a) of F_a (the following results also hold for members of F_b) and its kernel evaluations with each member of F_b, K_F(F_i^(a), F_1^(b)), ..., K_F(F_i^(a), F_{|F_b|}^(b)). The similarity between F_i^(a) and the local feature set F_b is measured in Equation (6) as

Σ_{j=1}^{|F_b|} [K_F(F_i^(a), F_j^(b))]^p.   (7)

Without loss of generality, assume K_F(F_i^(a), F_1^(b)) ≥ K_F(F_i^(a), F_2^(b)) ≥ ··· ≥ K_F(F_i^(a), F_{|F_b|}^(b)), so that F_1^(b) is the best matched local feature in F_b for F_i^(a). The contribution of the best matched pair to the sum in Equation (7) is

κ = [K_F(F_i^(a), F_1^(b))]^p / Σ_{j=1}^{|F_b|} [K_F(F_i^(a), F_j^(b))]^p.   (8)

The larger the value of p, the more dominant the best matched pair; as p approaches infinity, all but the maximal value contribute a negligible fraction of the sum. Furthermore, if we require the contribution of the best matched pair to exceed a given threshold ρ, a lower bound on p can be computed as

p ≥ log( (|F_b| − 1) ρ / (1 − ρ) ) / log( K_F(F_i^(a), F_1^(b)) / K_F(F_i^(a), F_2^(b)) ),   (9)

where F_2^(b) is the second best matching local feature in F_b for F_i^(a); a detailed proof is given in Appendix B. A proper p can then be chosen as the maximum of such lower bounds over all training data.

The proposed kernel satisfies the Mercer condition, which is stated formally in the following proposition.

Proposition 3 The function K_ℱ defined in Equation (5) is a Mercer kernel if the function K_F is a Mercer kernel defined on the local feature vectors.

Proof First, note that K_ℱ is symmetric by definition. Next, [K_F(·,·)]^p is a monomial of a Mercer kernel, and by Proposition 2, part (i), it is also a Mercer kernel. Finally, K_ℱ is constructed from [K_F]^p in the way of Proposition 2, part (ii), and the normalization by the positive factor 1/(|F_a||F_b|) does not affect positive semi-definiteness; therefore K_ℱ satisfies the Mercer condition.

2.5. Multiple local feature types

So far, in constructing kernels on local feature sets, only one type of local feature has been considered. However, it is usually possible to compute multiple types of local features at an interest point. As each individual type of local feature may carry distinctive information about the underlying object, it is desirable to fuse them into the designed kernel. Hereafter, we refer to each type of local feature as a base local feature. Assume L different base local features are employed, and denote by f_l^(a) ∈ R^{d_l} the d_l-dimensional vector of the l-th base local feature computed at an interest point p_a, for l = 1, ..., L. Also assume that the similarity of the l-th base local feature is properly measured by a Mercer kernel K_f^(l). (Such kernels are termed minor kernels in [5]; several minor kernels for state-of-the-art local feature representations are listed in [22].) The local feature at p_a is then a vector of dimension Σ_{l=1}^L d_l, formed by stacking all the f_l^(a): F_a = (f_1^(a)T, ..., f_L^(a)T)^T. A kernel between two such local features F_a and F_b is defined as

K_F(F_a, F_b) = Π_{l=1}^{L} K_f^(l)(f_l^(a), f_l^(b)).   (10)

The function K_F satisfies the Mercer condition by Proposition 2, part (iii). It can then be substituted into the definition of K_ℱ, Equation (5), which thereby incorporates multiple types of local features.
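The fusion of base local features in Equation (10) is a plain product of minor kernels evaluated on the stacked sub-vectors. Below is a minimal sketch in our own notation; the split sizes and the minor-kernel names in the usage comment are assumptions for illustration.

```python
import numpy as np

def fused_feature_kernel(minor_kernels, dims, Fa, Fb):
    """Equation (10): product of minor kernels K_f^(l) over the L base local features
    stacked inside the local feature vectors F_a and F_b."""
    value = 1.0
    start = 0
    for K_f, d in zip(minor_kernels, dims):
        value *= K_f(Fa[start:start + d], Fb[start:start + d])
        start += d
    return value

# Example with two hypothetical base features of dimensions 9 and 96:
# K_F = lambda Fa, Fb: fused_feature_kernel([jet_kernel, hist_kernel], [9, 96], Fa, Fb)
```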
2.6. Semilocal constraints

One problem with representing an image as an unordered set of local feature vectors is that such a representation is independent of the spatial locations of the interest points; different objects with similar local feature vectors laid out differently in the image plane are therefore indistinguishable. On the other hand, as supported by the experimental results in [17], there are strong spatial correlations between the interest points and their corresponding local features in an image. Such correlations are termed semilocal constraints in [17]. For better recognition performance, it is therefore desirable to enforce such semilocal constraints in the kernel design.

Following the method in [17], we use the local shape configuration to enforce semilocal constraints. (We are reluctant to use the positions of interest points directly in the kernel, as in [22]; such a setting makes the kernel vulnerable to changes in the spatial configuration of the object, e.g., translation.) Specifically, an image is represented as a set of semilocal groups, which bundle together image information around spatially close interest points. One semilocal group is formed around each interest point (its central interest point) detected in the image. Each semilocal group is a two-component tuple, denoted g = {F, Θ}. The first component, F = {F_0, F_1, ..., F_k}, is the set of local features collected at the central interest point and at its k nearest neighbors p_1, ..., p_k. The second component, Θ = (θ_1, ..., θ_k), is a vector containing the neighboring angles in the constellation spanned by the central interest point and its k nearest neighbors (Figure 1). These neighboring angles convey the local geometrical constraints within the semilocal group. As pointed out in [17], if we suppose that the transformations of objects can be locally approximated by a similarity transformation, then these angles have to be locally consistent.
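The semilocal groups of Section 2.6 can be built with a few lines of NumPy: for each interest point, find its k nearest neighbors and compute the neighboring angles spanned at the central point. The sketch below is our own reading of that construction (in particular, the exact angle convention is an assumption); it returns, for each group, the member features and the angle vector Θ.

```python
import numpy as np

def semilocal_groups(points, descriptors, k=5):
    """Build one semilocal group per interest point: local features of the point and its k nearest
    neighbors, plus the vector Theta of angles between consecutive rays from the central point
    (one possible reading of the 'neighboring angles' of Section 2.6).
    points: (n, 2) array of interest point coordinates; descriptors: (n, d) array of local features."""
    groups = []
    for i, p in enumerate(points):
        d = np.linalg.norm(points - p, axis=1)
        neighbors = np.argsort(d)[1:k + 1]                      # k nearest neighbors, skipping the point itself
        rays = points[neighbors] - p
        ang = np.sort(np.arctan2(rays[:, 1], rays[:, 0]))
        theta = np.diff(np.concatenate([ang, ang[:1] + 2 * np.pi]))   # k consecutive angles, summing to 2*pi
        members = np.concatenate(([i], neighbors))              # central point plus its k neighbors
        groups.append({"features": descriptors[members], "theta": theta})
    return groups
```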

Figure 1: An example of a semilocal group formed by an interest point (central filled dot) and its five nearest neighbors, p_1, ..., p_5. Hypothetical lines are added to show the neighboring angles θ_1, ..., θ_5.

An image I_a is now represented by a set of semilocal groups, G_a = {g_1^(a), ..., g_{|G_a|}^(a)}. Correspondingly, the kernel matching two images is now defined on two sets of semilocal groups. Similar to the approach taken in constructing the kernel K_ℱ, we define a kernel between two sets of semilocal groups as

K_G(G_a, G_b) = (1 / (|G_a| |G_b|)) Σ_{i=1}^{|G_a|} Σ_{j=1}^{|G_b|} [K_g(g_i^(a), g_j^(b))]^p,   (11)

where K_g is a Mercer kernel between two semilocal groups, to be specified below, and the integer p is the kernel parameter. A proof similar to that of Proposition 3 shows that K_G satisfies the Mercer condition. Correct correspondence is still an important issue: as in the case of local features, only correctly matched semilocal groups are meaningful for recognition. The kernel parameter p in K_G plays the same role as its counterpart in K_ℱ, giving preference to good matchings between semilocal groups.

2.7. A circular-shift invariant kernel

In constructing K_G, Equation (11), the kernel K_g between two semilocal groups was left unspecified. As a semilocal group consists of two parts, a natural way to design K_g is as the product of two kernels defined on the two components of g:

K_g(g_a, g_b) = K_ℱ(F_a, F_b) K_Θ(Θ_a, Θ_b),   (12)

where K_ℱ is defined as in Equation (5). The kernel K_Θ is defined between two vectors of neighboring angles in the semilocal constellation. Special care is required in designing such a kernel, as Θ is invariant under circular shifts. For instance, consider again the example shown in Figure 1: the vector of neighboring angles (θ_3, θ_4, θ_5, θ_1, θ_2) represents the same geometrical configuration as (θ_1, θ_2, θ_3, θ_4, θ_5). For this reason, the kernel K_Θ, which measures the similarity between two vectors of neighboring angles, should not treat two such vectors as different (i.e., it should also be invariant under circular shifts).

In the most general setting, for two n-dimensional vectors x = (x_0, ..., x_{n−1})^T ∈ R^n and y = (y_0, ..., y_{n−1})^T ∈ R^n, we formally define the circular-shift operator c: R^n × {0, ..., n−1} → R^n by (c(y, l))_i = (y)_{(i+l) mod n}, where (y)_i is the i-th component of y and 0 ≤ i, l ≤ n−1. Now consider the function

K_Θ(x, y) = Σ_{l=0}^{n−1} [K(x, c(y, l))]^p,   (13)

where K: R^n × R^n → R is a Mercer kernel satisfying K(x, y) = K(c(x, d), c(y, d)) for 0 ≤ d ≤ n−1. Many commonly used kernel functions (e.g., the Gaussian) are valid candidates for K; in our case, we simply choose it to be the vector scalar product in R^n, K(x, y) = x^T y.

Proposition 4 The function K_Θ defined in Equation (13) has the following properties: (i) it is a Mercer kernel on R^n × R^n; (ii) it is invariant under circular shifts, i.e., for x, y ∈ R^n, K_Θ(x, y) = K_Θ(c(x, d_1), c(y, d_2)) for any 0 ≤ d_1, d_2 ≤ n−1.

A full proof of these results is given in Appendix C. Notice that in constructing the kernel K_Θ we again employ a kernel parameter p to give dominant weight to good matchings. Finally, as both K_ℱ and K_Θ satisfy the Mercer condition, by Proposition 2, part (i), their product K_g in Equation (12) is also a Mercer kernel.
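Equation (13) sums a base kernel over all circular shifts of one argument, which is what makes the result shift invariant (Proposition 4). A minimal sketch follows, using the scalar product as the base kernel K as in the text, together with the group kernel K_g of Equation (12); the function names are ours, and `set_kernel` is the sketch of Equation (5) given earlier.

```python
import numpy as np
# assumes set_kernel from the earlier sketch of Equation (5)

def angle_kernel(theta_a, theta_b, p=9):
    """Equation (13): K_Theta(x, y) = sum_l [K(x, c(y, l))]^p with K(x, y) = x^T y."""
    n = len(theta_a)
    return sum(float(np.dot(theta_a, np.roll(theta_b, l))) ** p for l in range(n))

def group_kernel(K_F, ga, gb, p=9):
    """Equation (12): K_g(g_a, g_b) = K_setF(F_a, F_b) * K_Theta(Theta_a, Theta_b),
    for groups stored as dicts with 'features' and 'theta' entries as in the earlier sketch."""
    return set_kernel(K_F, ga["features"], gb["features"], p) * angle_kernel(ga["theta"], gb["theta"], p)
```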
2.8. Summary

The process of constructing a kernel for object recognition as proposed in this paper, built with multiple types of local features and semilocal constraints, is summarized in the following algorithm (a short sketch combining these steps is given after the list):
1. With minor kernels K_f^(l) defined on the base local features, construct the kernel K_F on local features with Equation (10);
2. Construct the kernel K_ℱ on local feature sets with Equation (5);
3. Obtain the vector of neighboring angles in each semilocal group, and construct the kernel K_Θ with Equation (13);
4. Combine the kernels K_ℱ and K_Θ into the kernel K_g with Equation (12);
5. Compute the kernel K_G between two sets of semilocal groups with Equation (11).
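Putting the steps together, a minimal end-to-end sketch of the kernel K_G of Equation (11) between two images, built from the hypothetical helpers sketched in the previous sections (`semilocal_groups`, `group_kernel`, and a feature-level Mercer kernel K_F assembled as in Equation (10)), could look as follows.

```python
import numpy as np
# assumes semilocal_groups and group_kernel from the previous sketches,
# and a feature-level Mercer kernel K_F built as in Equation (10)

def image_kernel(K_F, points_a, desc_a, points_b, desc_b, p=9, k=5):
    """Equation (11): average of [K_g]^p over all pairs of semilocal groups of two images."""
    Ga = semilocal_groups(points_a, desc_a, k)
    Gb = semilocal_groups(points_b, desc_b, k)
    total = 0.0
    for ga in Ga:
        for gb in Gb:
            total += group_kernel(K_F, ga, gb, p) ** p
    return total / (len(Ga) * len(Gb))
```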

3. Experiments

In this section, we present experimental results on recognition tasks using local features and SVM classification, bridged together by the proposed kernels. In principle, the proposed kernels can work with any pattern analysis algorithm that can be kernelized, i.e., that depends on the data only through scalar products; SVM was chosen for its performance and generalization ability.

3.1. Experimental setup

We performed our experiments on the COIL-100 database [13], a standard benchmark for object recognition. The COIL-100 database contains 7200 color images of 100 different objects. All images are 128 × 128 pixels in size. They were obtained by placing the objects on a turntable and taking a picture every 5° of viewing angle over a full 360° rotation. In our experiments, the training set of all SVM classifiers consisted of 3600 images, 36 for each of the 100 objects, corresponding to a 10° difference in viewing angle. Shown in the top row of Figure 2 are five images from the training set. From the remaining images, five different testing sets were formed:
- Set 1: 3600 images with viewing angles other than those used in the training set.
- Set 2: 3600 images generated by randomly scaling, rotating and translating the images in set 1.
- Set 3: 3600 images generated by adding Gaussian noise to the images in set 1.
- Set 4: 3600 images generated by embedding the images in set 1 into randomly chosen backgrounds.
- Set 5: 3600 images generated by artificially adding partial occlusions (stripes from randomly chosen images) to the images in set 1.
Images used for the backgrounds and partial occlusions in sets 4 and 5 were downloaded from the web. Sets 1 and 2 test the generalization ability of the kernels and classifiers to changes in viewing angle and object position. Sets 3-5 are devised to test their resilience to common image degradations, namely additive noise, background clutter and partial occlusion.

Figure 2: Examples of images used in our experiments. The first row contains images from the training set; the remaining rows show examples from each of the five testing sets.

On each image, three types of local features, along with their corresponding minor kernels, were computed:
1. Local jets [17] are differential grayvalue invariants computed around an interest point. Each local jet is a vector of dimension 9 containing derivatives up to third order. A Mercer kernel between two local jet features x and z is K(x, z) = exp( −(x − z)^T Λ^{−1} (x − z) / (2σ²) ), where Λ is a covariance matrix and (x − z)^T Λ^{−1} (x − z) is the Mahalanobis distance between x and z.
2. Local histograms [16] are local features consisting of histograms at different scales around an interest point. Using 32 bins in computing the histograms and considering up to 3 scales, each feature is a 96-dimensional vector. A kernel based on the χ²-similarity between two feature vectors x and z, K(x, z) = exp( −χ²(x, z) / (2σ²) ), was introduced in [22] and proved to satisfy the Mercer condition in [4].
3. Local phase-based features [2] are comprised of the local phases of a complex pyramid decomposition of the image. The features are 36-dimensional complex-valued vectors, and their similarity is measured by the real part of the complex scalar product, C(x, z) = (xᴴz + zᴴx)/2, from which a Mercer kernel is constructed as K(x, z) = (C(x, z) + 1)^q.

For each of these local features, interest points were found with a Harris corner detector, which has been shown to have high repeatability and robust performance [18]. Interest points too close to the image boundary were ignored to avoid border effects. The parameters of the interest point detector were set so that, on average, approximately 100 interest points were found per image. Semilocal groups, as described in Section 2.6, were formed around each interest point using its five nearest neighbors.
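The three minor kernels of Section 3.1 can be written down directly. The sketch below is our reading of the formulas in the text; σ, Λ and q are free parameters, and the exact scaling inside the exponentials is an assumption.

```python
import numpy as np

def jet_kernel(x, z, Lambda_inv, sigma=1.0):
    """Mahalanobis RBF kernel for local jets."""
    d = x - z
    return np.exp(-float(d @ Lambda_inv @ d) / (2.0 * sigma ** 2))

def chi2_kernel(x, z, sigma=1.0, eps=1e-10):
    """Chi-square RBF kernel for local histograms (entries assumed non-negative)."""
    chi2 = np.sum((x - z) ** 2 / (x + z + eps))
    return np.exp(-chi2 / (2.0 * sigma ** 2))

def phase_kernel(x, z, q=3):
    """Polynomial kernel on the real part of the complex scalar product, for phase-based features."""
    C = 0.5 * np.real(np.vdot(x, z) + np.vdot(z, x))
    return (C + 1.0) ** q
```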

To have a basis for comparison, we also collected a global feature from each image. The global feature used is the raw pixel representation [15], obtained by first converting each 128 × 128 color image to grayscale and resizing it to 32 × 32 pixels; a 1024-dimensional feature vector was then formed by stacking the grayvalues of the resized image.

For the local feature representations, composite kernels as described in Section 2.8 were formed from the local features and their kernels. The kernel parameter p was set to 9 in all cases. For the global features, a Gaussian kernel, Equation (1), was employed. The SVM classifiers were implemented with the package LIBSVM [3], which was enhanced to work with kernels on local feature representations. As a standard preprocessing step in the literature, we used the normalized kernel evaluation in building the SVM classifiers, K̃(x, y) = K(x, y) / √(K(x, x) K(y, y)). We employed a simple multi-class protocol for classification, namely a one-versus-the-rest scheme in training and a winner-takes-all strategy in testing. The regularization parameter of the SVM was set to 10³ in all classifiers.
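The setup above translates into a small amount of glue code. The sketch below (using scikit-learn's precomputed-kernel interface rather than the modified LIBSVM used in the paper; all names are ours) shows the kernel normalization and a one-versus-the-rest SVM trained on a precomputed kernel matrix.

```python
import numpy as np
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

def normalized_gram(K):
    """Cosine-style normalization K(x,y)/sqrt(K(x,x)K(y,y)) of a square Gram matrix."""
    d = np.sqrt(np.diag(K))
    return K / np.outer(d, d)

def train_svm(K_train, y_train, C=1e3):
    """One-versus-the-rest SVM on a precomputed (normalized) kernel matrix."""
    clf = OneVsRestClassifier(SVC(kernel="precomputed", C=C))
    clf.fit(normalized_gram(K_train), y_train)
    return clf

# Prediction needs the rectangular test-versus-train kernel matrix, normalized with the
# corresponding diagonal entries of the test and train Gram matrices; the winner-takes-all
# decision corresponds to the argmax over the one-versus-rest outputs.
```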
3.2. Results

Table 1 shows the performance of the different kernels with the local jets on all testing sets. For comparison, the performance of the global feature (raw pixel representation) with a Gaussian kernel is also included. Performance is evaluated as the error rate, i.e., the percentage of misclassified examples among all testing examples.

Table 1: Error rates (in percentage) on all testing sets of the Gaussian kernel K_R (Equation (1)) with the global feature (raw pixels), and of the kernels K_S (Equation (3)), K_M (Equation (4)) and K_ℱ (Equation (5)) with the local jet features.

Several points are worth noting about this set of results. First, the local jet features outperformed the global features on all testing sets; the difference is more significant in the presence of image transformations and degradations (sets 2-5). Furthermore, notice that the proposed kernel K_ℱ achieved performance competitive with that of the matching kernel K_M. As K_M is less sensitive to mismatches in local features, it had the lowest error rate in some cases (sets 3 and 4); its drawback, however, is that there is no guarantee of a unique globally optimal solution to the SVM training.

Figure 3 plots the contribution of the best matched local feature pairs to the evaluation of the kernel K_ℱ, Equation (8), as a function of the kernel parameter p. For stability, we report the average over the kernel evaluations of all pairs of training images. Notice that for p ≥ 9 this ratio plateaus above 99%, indicating that the best matched pairs of local features dominate the kernel evaluation. This is further supported by the corresponding classification error rates of K_ℱ on test set 1 (Figure 4): with p greater than 9, the performance does not improve significantly.

Figure 3: Contribution of the best matched local feature pairs, κ, Equation (8), in the kernel K_ℱ with local jet features, as a function of the kernel parameter p.

Figure 4: Classification performance of the kernel K_ℱ with local jet features on test set 1, as a function of the kernel parameter p.

In a second series of experiments, we tested the proposed kernel combined with different types of local features; the results are shown in Table 2. Note that the local jets work well under noise, but suffer from background clutter and occlusion. The local histograms, on the other hand, are more robust in the face of partial occlusions. The local phase-based features perform worst in all the experiments.

Table 2: Error rates (in percentage) of the different local features (local jets, local histograms, local phases) with the kernel K_ℱ, Equation (5), on all testing sets.

We further combined all types of local features as in Equation (10), and report the performance in the first row of Table 3. It seems that the fusion of local features does not necessarily improve the performance (set 1). However, in cases of image degradation this approach achieved better results, possibly because multiple types of local features provide complementary information that helps to reduce ambiguity in classification. Finally, we constructed an SVM classifier using the kernel defined in Equation (11), which further incorporates the semilocal constraints. Such a kernel, equipped with the most comprehensive domain knowledge, was expected to work best. The second row of Table 3 shows the performance of this kernel on all testing sets; compared to the other kernels, it indeed achieved the lowest error rates, which suggests the efficacy of the semilocal constraints.

Table 3: Error rates (in percentage) of the kernel using multiple types of local features (Equation (10)) and of the kernel with semilocal constraints, K_G, Equation (11), on all testing sets.

4. Discussion

In this paper, we have introduced a new class of kernels for appearance-based object recognition with local feature representations. The proposed kernels work on sets of local features of variable lengths, satisfy the Mercer condition, and reflect similarities between sets of local features. In addition, multiple types of local features and semilocal constraints were combined into the kernel design, which helps to further improve performance. We presented preliminary experimental results in which the proposed kernels, coupled with SVM classification, showed promising performance on recognition tasks and were robust to image transformations and degradations.

Acknowledgment

I thank Hany Farid for helpful and inspiring comments about the paper. This work was supported by Hany Farid under an Alfred P. Sloan Fellowship, an NSF CAREER Award (IIS-99-83806), an NSF Infrastructure Grant (EIA-98-02068), and under Award No. 2000-DT-CX-K001 from the Office for Domestic Preparedness, U.S. Department of Homeland Security (points of view in this document are those of the author and do not necessarily represent the official position of the U.S. Department of Homeland Security).

Appendix A: K_M is not a Mercer kernel

Proposition 5 The function K_M defined in Equation (4) is not a Mercer kernel, even when K_F is a Mercer kernel defined on the local features.

Proof To prove that K_M is not a Mercer kernel it is sufficient, by Proposition 1, to exhibit a finite subset of the input space on which the matrix formed by K_M is not positive semi-definite. To this end, consider the three local feature sets F_1 = {f_1, f_2}, F_2 = {f_3, f_4} and F_3 = {f_5, f_6}, and a 6 × 6 kernel matrix G_f on the set {f_1, ..., f_6}, constructed by some kernel function, that is positive semi-definite (as can be verified from its eigen-decomposition). From G_f, the 3 × 3 kernel matrix G_M of K_M on {F_1, F_2, F_3} is constructed using Equation (4); for example, the element of G_M at row 1 and column 3 is computed as

(G_M)_{1,3} = (1/4) [ max((G_f)_{1,5}, (G_f)_{1,6}) + max((G_f)_{2,5}, (G_f)_{2,6}) + max((G_f)_{1,5}, (G_f)_{2,5}) + max((G_f)_{1,6}, (G_f)_{2,6}) ].

A positive semi-definite G_f can be chosen for which the resulting matrix G_M has a negative eigenvalue and is therefore not positive semi-definite. Hence K_M does not always form positive semi-definite kernel matrices, which means that K_M is not a Mercer kernel.
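The proof above relies on a specific positive semi-definite matrix G_f whose induced matrix G_M has a negative eigenvalue. The sketch below (our own, and not the particular matrices used in the proof) searches random PSD Gram matrices for such a counterexample; whether one is found within the search budget depends on the random draw.

```python
import numpy as np

def matching_matrix(G_f):
    """3x3 matrix of K_M (Equation (4)) on the sets {f1,f2}, {f3,f4}, {f5,f6} from a 6x6 Gram matrix."""
    sets = [(0, 1), (2, 3), (4, 5)]
    G_M = np.zeros((3, 3))
    for a, A in enumerate(sets):
        for b, B in enumerate(sets):
            block = G_f[np.ix_(A, B)]
            G_M[a, b] = 0.5 * (block.max(axis=1).mean() + block.max(axis=0).mean())
    return G_M

rng = np.random.default_rng(0)
for _ in range(1000):
    X = rng.normal(size=(6, 3))
    G_f = X @ X.T                                  # positive semi-definite by construction
    eig = np.linalg.eigvalsh(matching_matrix(G_f))
    if eig.min() < -1e-8:
        print("found a PSD G_f whose G_M has eigenvalues", np.round(eig, 3))
        break
```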

Appendix B: Proof of Equation (9)

Without loss of generality, assume K_F(F_i^(a), F_1^(b)) ≥ K_F(F_i^(a), F_2^(b)) ≥ ··· ≥ K_F(F_i^(a), F_{|F_b|}^(b)), so that F_1^(b) is the best matched local feature in F_b for F_i^(a) and F_2^(b) is the second best match, and assume that the best match is unique. (This holds in most cases; when multiple best matches exist, a similar result can be obtained in the same way.) The contribution of the best matched pair to the sum in Equation (7) is

κ = [K_F(F_i^(a), F_1^(b))]^p / Σ_{j=1}^{|F_b|} [K_F(F_i^(a), F_j^(b))]^p.   (b.1)

We require that the contribution of the best matched pair exceed a given threshold ρ:

[K_F(F_i^(a), F_1^(b))]^p / Σ_{j=1}^{|F_b|} [K_F(F_i^(a), F_j^(b))]^p ≥ ρ.   (b.2)

Note that

Σ_{j=1}^{|F_b|} [K_F(F_i^(a), F_j^(b))]^p ≤ [K_F(F_i^(a), F_1^(b))]^p + (|F_b| − 1) [K_F(F_i^(a), F_2^(b))]^p.   (b.3)

Therefore, (b.2) is satisfied whenever

[K_F(F_i^(a), F_1^(b))]^p / ( [K_F(F_i^(a), F_1^(b))]^p + (|F_b| − 1) [K_F(F_i^(a), F_2^(b))]^p ) ≥ ρ.

Rearranging terms yields

p ≥ log( (|F_b| − 1) ρ / (1 − ρ) ) / log( K_F(F_i^(a), F_1^(b)) / K_F(F_i^(a), F_2^(b)) ).   (b.4)

Appendix C: Proof of Proposition 4

Proof Consider two n-dimensional vectors x = (x_0, ..., x_{n−1})^T and y = (y_0, ..., y_{n−1})^T, and define X and Y to be the subsets of R^n given by X = {c(x, 0), ..., c(x, n−1)} and Y = {c(y, 0), ..., c(y, n−1)}, where c: R^n × {0, ..., n−1} → R^n is the circular-shift operator in R^n. Now define

K̃(X, Y) = Σ_{i=0}^{n−1} Σ_{j=0}^{n−1} [K(c(x, i), c(y, j))]^p.   (c.1)

With a proof similar to that of Proposition 3, we conclude that K̃ is a Mercer kernel. By the definition of Mercer kernels, there therefore exists a mapping φ_1 into a Hilbert space H such that K̃(X, Y) = ⟨φ_1(X), φ_1(Y)⟩_H.

Next, expand the evaluation of K̃. By the assumption on the base kernel, K(c(x, i), c(y, j)) = K(x, c(y, (j − i) mod n)), so for every fixed i the inner sum over j runs over all n circular shifts of y (using also that c(x, n) = x). Therefore

K̃(X, Y) = Σ_{i=0}^{n−1} Σ_{j=0}^{n−1} [K(x, c(y, (j − i) mod n))]^p = Σ_{i=0}^{n−1} Σ_{l=0}^{n−1} [K(x, c(y, l))]^p = n Σ_{l=0}^{n−1} [K(x, c(y, l))]^p = n K_Θ(x, y),   (c.2)

and hence K_Θ(x, y) = (1/n) K̃(X, Y).

Now define another mapping φ_2: R^n → 2^{R^n} by φ_2(x) = {c(x, 0), ..., c(x, n−1)}. Substituting the definitions of the mappings φ_1 and φ_2 into the evaluation of K_Θ yields

K_Θ(x, y) = (1/n) K̃(X, Y) = (1/n) ⟨φ_1(φ_2(x)), φ_1(φ_2(y))⟩_H = ⟨ (1/√n) φ_1(φ_2(x)), (1/√n) φ_1(φ_2(y)) ⟩_H.

If a new mapping φ: R^n → H is defined as φ(x) = (1/√n) φ_1(φ_2(x)), we then have

K_Θ(x, y) = ⟨φ(x), φ(y)⟩_H,   (c.3)

which shows that K_Θ is a Mercer kernel and hence proves the first part of Proposition 4.

The second part of Proposition 4 follows by first noticing that φ_2(x) = {c(x, 0), ..., c(x, n−1)} = φ_2(c(x, d)) for any 0 ≤ d ≤ n−1. Therefore

K_Θ(x, y) = (1/n) K̃(φ_2(x), φ_2(y)) = (1/n) K̃(φ_2(c(x, d_1)), φ_2(c(y, d_2))) = K_Θ(c(x, d_1), c(y, d_2))

for any 0 ≤ d_1, d_2 ≤ n−1.
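The key identity in the proof, K̃(X, Y) = n K_Θ(x, y), and the circular-shift invariance of Proposition 4 are easy to confirm numerically. A short check (reusing the `angle_kernel` sketch from Section 2.7, with the scalar product as base kernel) could read:

```python
import numpy as np
# assumes angle_kernel from the earlier sketch (Equation (13) with K(x, y) = x^T y)

rng = np.random.default_rng(0)
n, p = 5, 3
x, y = rng.normal(size=n), rng.normal(size=n)

# K_tilde(X, Y): sum of [K(c(x,i), c(y,j))]^p over all pairs of circular shifts
K_tilde = sum(float(np.dot(np.roll(x, i), np.roll(y, j))) ** p
              for i in range(n) for j in range(n))
assert np.isclose(K_tilde, n * angle_kernel(x, y, p))

# Proposition 4(ii): invariance under independent circular shifts of the two arguments
assert np.isclose(angle_kernel(x, y, p),
                  angle_kernel(np.roll(x, 2), np.roll(y, 4), p))
```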

References

[1] S. Boughorbel, J.-P. Tarel, and F. Fleuret. Non-Mercer kernels for SVM object recognition. In British Machine Vision Conference (BMVC), 2004.
[2] G. Carneiro and A. Jepson. Phase-based local features. In European Conference on Computer Vision (ECCV), 2002.
[3] C.-C. Chang and C.-J. Lin. LIBSVM: a library for support vector machines, 2001. Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm.
[4] O. Chapelle, P. Haffner, and V. Vapnik. SVMs for histogram based image classification. IEEE Transactions on Neural Networks, 10(5), 1999.
[5] J. Eichhorn and O. Chapelle. Object categorization with SVM: Kernels for local features. In Advances in Neural Information Processing Systems (NIPS), to appear, 2004.
[6] C. Harris and M. Stephens. A combined corner and edge detector. In Alvey Vision Conference, 1988.
[7] D. Haussler. Convolution kernels on discrete structures. Technical Report UCSC-CRL-99-10, University of California at Santa Cruz, 1999.
[8] R. Kondor and T. Jebara. A kernel between sets of vectors. In International Conference on Machine Learning (ICML), 2003.
[9] D. Lowe. Distinctive image features from scale-invariant keypoints. International Journal of Computer Vision, 60(2):91-110, 2004.
[10] K. Mikolajczyk and C. Schmid. Indexing based on scale invariant interest points. In IEEE International Conference on Computer Vision (ICCV), 2001.
[11] K. Mikolajczyk and C. Schmid. A performance evaluation of local descriptors. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2003.
[12] P. Moreno, P. Ho, and N. Vasconcelos. A Kullback-Leibler divergence based kernel for SVM classification in multimedia applications. In Advances in Neural Information Processing Systems (NIPS), 2003.
[13] S. A. Nene, S. K. Nayar, and H. Murase. Columbia object image library (COIL-100). Technical Report CUCS-006-96, Columbia University, 1996.
[14] F. Odone, A. Barla, and A. Verri. Building kernels from binary strings for image matching. IEEE Transactions on Image Processing, to appear.
[15] M. Pontil and A. Verri. Support vector machines for 3D object recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 20(6):637-646, 1998.
[16] F. Schaffalitzky and A. Zisserman. Viewpoint invariant texture matching and wide baseline stereo. In IEEE International Conference on Computer Vision (ICCV), 2001.
[17] C. Schmid and R. Mohr. Local grayvalue invariants for image retrieval. IEEE Transactions on Pattern Analysis and Machine Intelligence, 19(5):530-535, 1997.
[18] C. Schmid, R. Mohr, and C. Bauckhage. Evaluation of interest point detectors. International Journal of Computer Vision, 37(2):151-172, 2000.
[19] N. Sebe, Q. Tian, E. Loupias, M. Lew, and T. Huang. Evaluation of salient point techniques. In International Conference on Image and Video Retrieval, 2003.
[20] J. Shawe-Taylor and N. Cristianini. Kernel Methods for Pattern Analysis. Cambridge University Press, 2004.
[21] V. Vapnik. Statistical Learning Theory. Wiley, 1998.
[22] C. Wallraven, B. Caputo, and A. Graf. Recognition with local features: the kernel recipe. In IEEE International Conference on Computer Vision (ICCV), pages 257-264, 2003.
[23] L. Wolf and A. Shashua. Kernel principal angles for classification machines with applications to image sequence interpretation. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2003.


More information

Shape Representation Robust to the Sketching Order Using Distance Map and Direction Histogram

Shape Representation Robust to the Sketching Order Using Distance Map and Direction Histogram Shape Representaton Robust to the Sketchng Order Usng Dstance Map and Drecton Hstogram Department of Computer Scence Yonse Unversty Kwon Yun CONTENTS Revew Topc Proposed Method System Overvew Sketch Normalzaton

More information

Complex Numbers. Now we also saw that if a and b were both positive then ab = a b. For a second let s forget that restriction and do the following.

Complex Numbers. Now we also saw that if a and b were both positive then ab = a b. For a second let s forget that restriction and do the following. Complex Numbers The last topc n ths secton s not really related to most of what we ve done n ths chapter, although t s somewhat related to the radcals secton as we wll see. We also won t need the materal

More information

Histogram of Template for Pedestrian Detection

Histogram of Template for Pedestrian Detection PAPER IEICE TRANS. FUNDAMENTALS/COMMUN./ELECTRON./INF. & SYST., VOL. E85-A/B/C/D, No. xx JANUARY 20xx Hstogram of Template for Pedestran Detecton Shaopeng Tang, Non Member, Satosh Goto Fellow Summary In

More information

Skew Angle Estimation and Correction of Hand Written, Textual and Large areas of Non-Textual Document Images: A Novel Approach

Skew Angle Estimation and Correction of Hand Written, Textual and Large areas of Non-Textual Document Images: A Novel Approach Angle Estmaton and Correcton of Hand Wrtten, Textual and Large areas of Non-Textual Document Images: A Novel Approach D.R.Ramesh Babu Pyush M Kumat Mahesh D Dhannawat PES Insttute of Technology Research

More information

Improved SIFT-Features Matching for Object Recognition

Improved SIFT-Features Matching for Object Recognition Improved SIFT-Features Matchng for Obect Recognton Fara Alhwarn, Chao Wang, Danela Rstć-Durrant, Axel Gräser Insttute of Automaton, Unversty of Bremen, FB / NW Otto-Hahn-Allee D-8359 Bremen Emals: {alhwarn,wang,rstc,ag}@at.un-bremen.de

More information

Machine Learning 9. week

Machine Learning 9. week Machne Learnng 9. week Mappng Concept Radal Bass Functons (RBF) RBF Networks 1 Mappng It s probably the best scenaro for the classfcaton of two dataset s to separate them lnearly. As you see n the below

More information

Analysis of Continuous Beams in General

Analysis of Continuous Beams in General Analyss of Contnuous Beams n General Contnuous beams consdered here are prsmatc, rgdly connected to each beam segment and supported at varous ponts along the beam. onts are selected at ponts of support,

More information

MOTION BLUR ESTIMATION AT CORNERS

MOTION BLUR ESTIMATION AT CORNERS Gacomo Boracch and Vncenzo Caglot Dpartmento d Elettronca e Informazone, Poltecnco d Mlano, Va Ponzo, 34/5-20133 MILANO boracch@elet.polm.t, caglot@elet.polm.t Keywords: Abstract: Pont Spread Functon Parameter

More information

Discriminative classifiers for object classification. Last time

Discriminative classifiers for object classification. Last time Dscrmnatve classfers for object classfcaton Thursday, Nov 12 Krsten Grauman UT Austn Last tme Supervsed classfcaton Loss and rsk, kbayes rule Skn color detecton example Sldng ndo detecton Classfers, boostng

More information

Scale Selective Extended Local Binary Pattern For Texture Classification

Scale Selective Extended Local Binary Pattern For Texture Classification Scale Selectve Extended Local Bnary Pattern For Texture Classfcaton Yutng Hu, Zhlng Long, and Ghassan AlRegb Multmeda & Sensors Lab (MSL) Georga Insttute of Technology 03/09/017 Outlne Texture Representaton

More information

A Novel Adaptive Descriptor Algorithm for Ternary Pattern Textures

A Novel Adaptive Descriptor Algorithm for Ternary Pattern Textures A Novel Adaptve Descrptor Algorthm for Ternary Pattern Textures Fahuan Hu 1,2, Guopng Lu 1 *, Zengwen Dong 1 1.School of Mechancal & Electrcal Engneerng, Nanchang Unversty, Nanchang, 330031, Chna; 2. School

More information

Mathematics 256 a course in differential equations for engineering students

Mathematics 256 a course in differential equations for engineering students Mathematcs 56 a course n dfferental equatons for engneerng students Chapter 5. More effcent methods of numercal soluton Euler s method s qute neffcent. Because the error s essentally proportonal to the

More information

Human Face Recognition Using Generalized. Kernel Fisher Discriminant

Human Face Recognition Using Generalized. Kernel Fisher Discriminant Human Face Recognton Usng Generalzed Kernel Fsher Dscrmnant ng-yu Sun,2 De-Shuang Huang Ln Guo. Insttute of Intellgent Machnes, Chnese Academy of Scences, P.O.ox 30, Hefe, Anhu, Chna. 2. Department of

More information

3D vector computer graphics

3D vector computer graphics 3D vector computer graphcs Paolo Varagnolo: freelance engneer Padova Aprl 2016 Prvate Practce ----------------------------------- 1. Introducton Vector 3D model representaton n computer graphcs requres

More information

Learning a Class-Specific Dictionary for Facial Expression Recognition

Learning a Class-Specific Dictionary for Facial Expression Recognition BULGARIAN ACADEMY OF SCIENCES CYBERNETICS AND INFORMATION TECHNOLOGIES Volume 16, No 4 Sofa 016 Prnt ISSN: 1311-970; Onlne ISSN: 1314-4081 DOI: 10.1515/cat-016-0067 Learnng a Class-Specfc Dctonary for

More information

Augmented Distinctive Features for Efficient Image Matching

Augmented Distinctive Features for Efficient Image Matching Augmented Dstnctve Features for Effcent Image Matchng Quan Wang, We Guan and Suya You CGIT/IMSC USC Los Angeles, CA 90089 quanwang@usc.edu, wguan@usc.edu and suyay@graphcs.usc.edu Abstract Fndng correspondng

More information

Palmprint Feature Extraction Using 2-D Gabor Filters

Palmprint Feature Extraction Using 2-D Gabor Filters Palmprnt Feature Extracton Usng 2-D Gabor Flters Wa Kn Kong Davd Zhang and Wenxn L Bometrcs Research Centre Department of Computng The Hong Kong Polytechnc Unversty Kowloon Hong Kong Correspondng author:

More information

Object-Based Techniques for Image Retrieval

Object-Based Techniques for Image Retrieval 54 Zhang, Gao, & Luo Chapter VII Object-Based Technques for Image Retreval Y. J. Zhang, Tsnghua Unversty, Chna Y. Y. Gao, Tsnghua Unversty, Chna Y. Luo, Tsnghua Unversty, Chna ABSTRACT To overcome the

More information

Laplacian Eigenmap for Image Retrieval

Laplacian Eigenmap for Image Retrieval Laplacan Egenmap for Image Retreval Xaofe He Partha Nyog Department of Computer Scence The Unversty of Chcago, 1100 E 58 th Street, Chcago, IL 60637 ABSTRACT Dmensonalty reducton has been receved much

More information

Large-scale Web Video Event Classification by use of Fisher Vectors

Large-scale Web Video Event Classification by use of Fisher Vectors Large-scale Web Vdeo Event Classfcaton by use of Fsher Vectors Chen Sun and Ram Nevata Unversty of Southern Calforna, Insttute for Robotcs and Intellgent Systems Los Angeles, CA 90089, USA {chensun nevata}@usc.org

More information

CMPS 10 Introduction to Computer Science Lecture Notes

CMPS 10 Introduction to Computer Science Lecture Notes CPS 0 Introducton to Computer Scence Lecture Notes Chapter : Algorthm Desgn How should we present algorthms? Natural languages lke Englsh, Spansh, or French whch are rch n nterpretaton and meanng are not

More information

SVM-based Learning for Multiple Model Estimation

SVM-based Learning for Multiple Model Estimation SVM-based Learnng for Multple Model Estmaton Vladmr Cherkassky and Yunqan Ma Department of Electrcal and Computer Engneerng Unversty of Mnnesota Mnneapols, MN 55455 {cherkass,myq}@ece.umn.edu Abstract:

More information

A Unified Framework for Semantics and Feature Based Relevance Feedback in Image Retrieval Systems

A Unified Framework for Semantics and Feature Based Relevance Feedback in Image Retrieval Systems A Unfed Framework for Semantcs and Feature Based Relevance Feedback n Image Retreval Systems Ye Lu *, Chunhu Hu 2, Xngquan Zhu 3*, HongJang Zhang 2, Qang Yang * School of Computng Scence Smon Fraser Unversty

More information

SHAPE RECOGNITION METHOD BASED ON THE k-nearest NEIGHBOR RULE

SHAPE RECOGNITION METHOD BASED ON THE k-nearest NEIGHBOR RULE SHAPE RECOGNITION METHOD BASED ON THE k-nearest NEIGHBOR RULE Dorna Purcaru Faculty of Automaton, Computers and Electroncs Unersty of Craoa 13 Al. I. Cuza Street, Craoa RO-1100 ROMANIA E-mal: dpurcaru@electroncs.uc.ro

More information

A Gradient Difference based Technique for Video Text Detection

A Gradient Difference based Technique for Video Text Detection A Gradent Dfference based Technque for Vdeo Text Detecton Palaahnakote Shvakumara, Trung Quy Phan and Chew Lm Tan School of Computng, Natonal Unversty of Sngapore {shva, phanquyt, tancl }@comp.nus.edu.sg

More information

Querying by sketch geographical databases. Yu Han 1, a *

Querying by sketch geographical databases. Yu Han 1, a * 4th Internatonal Conference on Sensors, Measurement and Intellgent Materals (ICSMIM 2015) Queryng by sketch geographcal databases Yu Han 1, a * 1 Department of Basc Courses, Shenyang Insttute of Artllery,

More information

Unsupervised Learning

Unsupervised Learning Pattern Recognton Lecture 8 Outlne Introducton Unsupervsed Learnng Parametrc VS Non-Parametrc Approach Mxture of Denstes Maxmum-Lkelhood Estmates Clusterng Prof. Danel Yeung School of Computer Scence and

More information

Face Detection with Deep Learning

Face Detection with Deep Learning Face Detecton wth Deep Learnng Yu Shen Yus122@ucsd.edu A13227146 Kuan-We Chen kuc010@ucsd.edu A99045121 Yzhou Hao y3hao@ucsd.edu A98017773 Mn Hsuan Wu mhwu@ucsd.edu A92424998 Abstract The project here

More information

A Gradient Difference based Technique for Video Text Detection

A Gradient Difference based Technique for Video Text Detection 2009 10th Internatonal Conference on Document Analyss and Recognton A Gradent Dfference based Technque for Vdeo Text Detecton Palaahnakote Shvakumara, Trung Quy Phan and Chew Lm Tan School of Computng,

More information

PERFORMANCE EVALUATION FOR SCENE MATCHING ALGORITHMS BY SVM

PERFORMANCE EVALUATION FOR SCENE MATCHING ALGORITHMS BY SVM PERFORMACE EVALUAIO FOR SCEE MACHIG ALGORIHMS BY SVM Zhaohu Yang a, b, *, Yngyng Chen a, Shaomng Zhang a a he Research Center of Remote Sensng and Geomatc, ongj Unversty, Shangha 200092, Chna - yzhac@63.com

More information

Multi-stable Perception. Necker Cube

Multi-stable Perception. Necker Cube Mult-stable Percepton Necker Cube Spnnng dancer lluson, Nobuuk Kaahara Fttng and Algnment Computer Vson Szelsk 6.1 James Has Acknowledgment: Man sldes from Derek Hoem, Lana Lazebnk, and Grauman&Lebe 2008

More information

Combination of Color and Local Patterns as a Feature Vector for CBIR

Combination of Color and Local Patterns as a Feature Vector for CBIR Internatonal Journal of Computer Applcatons (975 8887) Volume 99 No.1, August 214 Combnaton of Color and Local Patterns as a Feature Vector for CBIR L.Koteswara Rao Asst.Professor, Dept of ECE Faculty

More information

Semantic Scene Concept Learning by an Autonomous Agent

Semantic Scene Concept Learning by an Autonomous Agent Semantc Scene Concept Learnng by an Autonomous Agent Weyu Zhu Illnos Wesleyan Unversty PO Box 29, Bloomngton, IL 672 wzhu@wu.edu Abstract Scene understandng addresses the ssue of what a scene contans.

More information

Content-Based Bird Retrieval using Shape context, Color moments and Bag of Features

Content-Based Bird Retrieval using Shape context, Color moments and Bag of Features www.ijcsi.org 101 Content-Based Brd Retreval usng Shape context, Color moments and Features Bahr abdelkhalak 1 and hamd zouak 2 1 Faculty of Scences, Unversty Chouab Doukkal, Equpe: Modélsaton mathématque

More information

COMPLEX WAVELET TRANSFORM-BASED COLOR INDEXING FOR CONTENT-BASED IMAGE RETRIEVAL

COMPLEX WAVELET TRANSFORM-BASED COLOR INDEXING FOR CONTENT-BASED IMAGE RETRIEVAL COMPLEX WAVELET TRANSFORM-BASED COLOR INDEXING FOR CONTENT-BASED IMAGE RETRIEVAL Nader Safavan and Shohreh Kasae Department of Computer Engneerng Sharf Unversty of Technology Tehran, Iran skasae@sharf.edu

More information

Fast Feature Value Searching for Face Detection

Fast Feature Value Searching for Face Detection Vol., No. 2 Computer and Informaton Scence Fast Feature Value Searchng for Face Detecton Yunyang Yan Department of Computer Engneerng Huayn Insttute of Technology Hua an 22300, Chna E-mal: areyyyke@63.com

More information

UB at GeoCLEF Department of Geography Abstract

UB at GeoCLEF Department of Geography   Abstract UB at GeoCLEF 2006 Mguel E. Ruz (1), Stuart Shapro (2), June Abbas (1), Slva B. Southwck (1) and Davd Mark (3) State Unversty of New York at Buffalo (1) Department of Lbrary and Informaton Studes (2) Department

More information

Efficient Segmentation and Classification of Remote Sensing Image Using Local Self Similarity

Efficient Segmentation and Classification of Remote Sensing Image Using Local Self Similarity ISSN(Onlne): 2320-9801 ISSN (Prnt): 2320-9798 Internatonal Journal of Innovatve Research n Computer and Communcaton Engneerng (An ISO 3297: 2007 Certfed Organzaton) Vol.2, Specal Issue 1, March 2014 Proceedngs

More information

Positive Semi-definite Programming Localization in Wireless Sensor Networks

Positive Semi-definite Programming Localization in Wireless Sensor Networks Postve Sem-defnte Programmng Localzaton n Wreless Sensor etworks Shengdong Xe 1,, Jn Wang, Aqun Hu 1, Yunl Gu, Jang Xu, 1 School of Informaton Scence and Engneerng, Southeast Unversty, 10096, anjng Computer

More information

Adaptive Transfer Learning

Adaptive Transfer Learning Adaptve Transfer Learnng Bn Cao, Snno Jaln Pan, Yu Zhang, Dt-Yan Yeung, Qang Yang Hong Kong Unversty of Scence and Technology Clear Water Bay, Kowloon, Hong Kong {caobn,snnopan,zhangyu,dyyeung,qyang}@cse.ust.hk

More information

Orthogonal Complement Component Analysis for Positive Samples in SVM Based Relevance Feedback Image Retrieval

Orthogonal Complement Component Analysis for Positive Samples in SVM Based Relevance Feedback Image Retrieval Orthogonal Complement Component Analyss for ostve Samples n SVM Based Relevance Feedback Image Retreval Dacheng Tao and Xaoou Tang Department of Informaton Engneerng The Chnese Unversty of Hong Kong {dctao2,

More information

GSLM Operations Research II Fall 13/14

GSLM Operations Research II Fall 13/14 GSLM 58 Operatons Research II Fall /4 6. Separable Programmng Consder a general NLP mn f(x) s.t. g j (x) b j j =. m. Defnton 6.. The NLP s a separable program f ts objectve functon and all constrants are

More information