Local Minima Free Parameterized Appearance Models
Minh Hoai Nguyen    Fernando De la Torre
Robotics Institute, Carnegie Mellon University, Pittsburgh, PA 15213, USA.

Abstract

Parameterized Appearance Models (PAMs) (e.g. Eigentracking, Active Appearance Models, Morphable Models) are commonly used to model the appearance and shape variation of objects in images. While PAMs have numerous advantages relative to alternative approaches, they have at least two drawbacks. First, they are especially prone to local minima in the fitting process. Second, often few, if any, of the local minima of the cost function correspond to acceptable solutions. To solve these problems, this paper proposes a method to learn a cost function by explicitly optimizing that the local minima occur at, and only at, the places corresponding to the correct fitting parameters. To the best of our knowledge, this is the first paper to address the problem of learning a cost function to explicitly model local properties of the error surface to fit PAMs. Synthetic and real examples show improvement in alignment performance in comparison with traditional approaches.

1. Introduction

Since the early work of Sirovich and Kirby [21] parameterizing the human face using Principal Component Analysis (PCA) and the successful eigenfaces of Turk and Pentland [23], many computer vision researchers have used PCA techniques to construct linear models of optical flow, shape or graylevel [3, 10, 4, 6, 19, 14, 5, 11]. In particular, Parameterized Appearance Models (PAMs) (e.g. eigentracking [4], active appearance models [6, 10, 17, 9, 11], morphable models [5, 14]) have proven to be an appropriate statistical tool for modeling shape and appearance variation of objects in images. In PAMs, the appearance/shape models of objects are built by performing PCA on training data. Once the models have been constructed, finding the location/configuration of an object of interest in a test image is achieved by minimizing a cost function w.r.t. some transformation (motion) parameters; this is referred to as the fitting process.
Although widely used, PAMs suffer from two problems in the fitting process. First, they are especially prone to local minima. Second, often few, if any, of the local minima of the cost function correspond to acceptable solutions.

Figure 1. Learning a better model for image alignment. (d, f): surface and contour plot of the PCA model; it has many local minima. (e, g): the Local Minima Free PAM (LMF-PAM) method learns a better error surface to fit PAMs. This figure is best seen in color.

Figures 1a, d, f illustrate these problems. Fig. 1d plots the error surface constructed by translating the testing image (Fig. 1c) around the ground truth landmarks (Fig. 1c) and computing the values of the cost function. The cost function is based on a PCA model constructed from labeled training data (Fig. 1a). Fig. 1f shows the contour plot of this error surface. As can be observed, any gradient-based optimization method is likely to get stuck at local minima, and will
not converge to the global minimum. Moreover, the global minimum of this cost function is not at the desired position (the black dot of Fig. 1d), which corresponds to the correct landmark locations. These problems occur because the PCA model is constructed without considering the neighborhoods of the correct motion parameters (the parameters that correspond to the ground truth landmarks of the training data). The neighborhoods determine the local minima properties of the error surface, and should be taken into account while constructing the models.

In this paper, we propose to learn a cost function (i.e. appearance model) that has a local minimum at the expected location and no other local minima in its neighborhood. This is done by enforcing constraints on the gradients of the cost function at the desired location and its neighborhood. Figs. 1e, g plot the error surface and contours of the learned cost function. This cost function has a local minimum at the expected place (black dot of Fig. 1e), and no other local minima nearby.

2. Previous work

Over the last decade, appearance models have become increasingly important in computer vision and graphics. In particular, PAMs have been proven useful for alignment, detection, tracking, and face synthesis [5, 4, 10, 6, 17, 19, 14, 11, 24]. This section reviews PAMs and gradient-based methods for the efficient alignment of high-dimensional deformation models.

2.1. PAMs

PAMs [4, 10, 6, 19, 14, 5, 24] build the object's appearance/shape representation from the principal components of training data. Let d_i ∈ R^{m×1} (see notation¹) be the i-th sample of a training set D ∈ R^{m×n} and U ∈ R^{m×k} the first k principal components [13]. Once the model has been constructed (i.e. U is known), tracking/alignment is achieved by finding the motion parameter p that best aligns the data w.r.t. the subspace U, i.e.

    min_{c,p} ‖d(f(x,p)) − Uc‖₂²    (1)

Here x = [x₁, y₁, ..., x_l, y_l]^T is the vector containing the coordinates of the pixels to track. f(x, p) is the function for geometric transformation; denote f(x, p) by

¹ Bold uppercase letters denote matrices (e.g.
D), bold lowercase letters denote column vectors (e.g. d). d_j represents the j-th column of the matrix D. d_{ij} denotes the scalar in the i-th row and j-th column of the matrix D. Non-bold letters represent scalar variables. 1_k ∈ R^{k×1} is a column vector of ones. 0_k ∈ R^{k×1} is a column vector of zeros. I_k ∈ R^{k×k} is the identity matrix. tr(D) = Σᵢ d_{ii} is the trace of the square matrix D. ‖d‖₂ = √(d^T d) designates the Euclidean norm of d. ‖D‖_F = √tr(D^T D) is the Frobenius norm of D. diag(·) is the operator that extracts the diagonal of a square matrix or constructs a diagonal matrix from a vector.

[u₁, v₁, ..., u_l, v_l]^T. d is the image frame in consideration, and d(f(x,p)) is the appearance vector of which the i-th entry is the intensity of image d at pixel (uᵢ, vᵢ). For affine and non-rigid transformations, (uᵢ, vᵢ) relates to (xᵢ, yᵢ) by:

    [uᵢ; vᵢ] = [a₁ a₂; a₄ a₅] [xᵢˢ; yᵢˢ] + [a₃; a₆]    (2)

with [x₁ˢ, y₁ˢ, ..., x_lˢ, y_lˢ]^T = x + Uˢcˢ, where Uˢ is the non-rigid shape model learned by performing PCA on a set of registered shapes [7]. a and cˢ are the affine and non-rigid motion parameters respectively, and p = [a; cˢ].

2.2. Optimization for PAMs

Given an image d, PAM tracking/alignment algorithms optimize (1). Due to the high dimensionality of the motion space, a standard approach to efficiently search over the parameter space is to use gradient-based methods [1, 7, 17, 4, 8, 5]. To compute the gradient of the cost function given in (1), it is common to use a Taylor series expansion to approximate d(f(x, p + δp)) by d(f(x,p)) + J_d(p)δp, where J_d(p) = ∂d(f(x,p))/∂p is the Jacobian of the image d w.r.t. the motion parameter p [16]. Once linearized, a standard approach is to use the Gauss-Newton method for optimization [2, 4]. Other approaches learn an approximation of the Jacobian matrix with linear [7] or non-linear [20, 15] regression. Over the last few years, several strategies for improving the fitting performance have been proposed.
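The cost (1) together with the warp (2) can be sketched numerically. Below is a minimal NumPy illustration (NumPy, nearest-neighbor sampling, and the function names are my own choices; the paper does not prescribe an implementation). The non-rigid term Uˢcˢ of (2) is omitted for brevity, so p carries only the six affine parameters:

```python
import numpy as np

def warp_points(x, p):
    """Apply the affine part of Eq. (2) to 2D points.
    x: (l, 2) array of (x, y) coordinates; p = [a1, a2, a3, a4, a5, a6]."""
    A = np.array([[p[0], p[1]], [p[3], p[4]]])
    t = np.array([p[2], p[5]])
    return x @ A.T + t

def pam_cost(image, x, p, U):
    """Eigentracking cost of Eq. (1): min_c ||d(f(x,p)) - U c||^2.
    With orthonormal U the optimal c is U^T d, so the cost equals
    d^T (I - U U^T) d. Nearest-neighbor sampling for simplicity."""
    uv = np.rint(warp_points(x, p)).astype(int)
    d = image[uv[:, 1], uv[:, 0]].astype(float)  # intensities at warped pixels
    r = d - U @ (U.T @ d)                        # residual off the subspace
    return float(r @ r)
```

Minimizing this cost over p is the fitting process described above; the gradient-based optimization of it is what Sec. 2.2 discusses.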
For example, Black & Anandan [4] and Cootes & Taylor [7] proposed using multi-resolution schemes, Xiao et al. [26] proposed using 3D models to constrain 2D solutions, de la Torre et al. [9] proposed learning filters to achieve robustness to local minima, and de la Torre & Black [8] and Baker & Matthews [1] learned a PCA model invariant to rigid and non-rigid transformations. Although these methods show significant performance improvements, they do not directly address the problem of learning a cost function with no local minima. In this paper, we deliberately learn a cost function which has local minima at, and only at, the desired places.

3. Learning parameters of the cost functions

Gradient-based algorithms, such as the ones discussed in the previous section, might not converge to the correct location (i.e. the correct motion parameters) for several reasons. First, gradient-based methods are susceptible to being stuck at local minima. Second, even when the optimizer converges to a global minimum, the global minimum might not correspond to the correct motion parameters. These two problems occur primarily because PCA has limited generalization capabilities to model appearance variation. This section proposes a method to learn cost functions that do not exhibit these two problems on training data.
3.1. A generic cost function for alignment

This section proposes a generic quadratic error function in which many PAMs can be cast. The quadratic error function has the form:

    E(d, p) = d(f(x,p))^T A d(f(x,p)) + 2 b^T d(f(x,p))    (3)

Here A ∈ R^{m×m} and b ∈ R^{m×1} are the fixed parameters of the function, and A is symmetric. This function is the general form of many cost functions used in the literature, including Active Appearance Models [6], Eigentracking [4], and template tracking [16, 18]. For instance, consider the cost function given in (1). If p is fixed, the optimal c that minimizes (1) can be obtained as c = U^T d(f(x,p)). Substituting this back into (1) and performing some basic algebra, (1) is equivalent to: min_p d(f(x,p))^T (I_m − UU^T) d(f(x,p)). Thus (1) is a special case of (3), with A = I_m − UU^T and b = 0_m. For template tracking, the cost function is typically the sum of squared differences: ‖d(f(x,p)) − d_ref‖₂², where d_ref is the reference template. This cost function is equivalent (up to an additive constant) to: d(f(x,p))^T d(f(x,p)) − 2 d_ref^T d(f(x,p)). Thus the cost function used in template tracking is also a special case of (3), with A = I_m and b = −d_ref.

3.2. Desired properties of cost functions

As discussed previously, it is desirable that the cost function have minima at, and only at, the right places. In this section, we deliberately address this need as an optimization problem over A and b. Let {d_i}ⁿ₁ be a set of training images containing the objects of interest (e.g. faces), and assume the landmarks for the object shapes are available (e.g. manually labeled facial landmarks as in Fig. 5a). Let s_i be the vector containing the landmark coordinates of image d_i. Given {s_i}ⁿ₁, we perform Procrustes analysis [7] and build the shape model as follows. First, the mean shape s̄ = (1/n) Σᵢ s_i is calculated. Second, we compute a_i, the affine parameters that best transform s̄ to s_i, and let a_i⁻¹ be the inverse affine transformation of a_i. Third, ŝ_i is obtained by applying the inverse affine transformation a_i⁻¹ to s_i (warping toward the mean shape).
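Both reductions to the generic form (3) can be checked numerically. The following NumPy sketch (NumPy is an assumption; the 2b^T d convention follows Eq. (3) above) verifies the Eigentracking and template-tracking special cases:

```python
import numpy as np

rng = np.random.default_rng(1)
m, k = 8, 3
d = rng.normal(size=m)

# Eigentracking: min_c ||d - U c||^2 equals d^T (I - U U^T) d for
# orthonormal U, i.e. Eq. (3) with A = I - U U^T and b = 0.
U, _ = np.linalg.qr(rng.normal(size=(m, k)))
c_opt = U.T @ d                       # optimal subspace coefficients
lhs = np.sum((d - U @ c_opt) ** 2)
A = np.eye(m) - U @ U.T
assert abs(lhs - d @ A @ d) < 1e-9

# Template SSD: ||d - d_ref||^2 equals d^T A d + 2 b^T d + const with
# A = I, b = -d_ref (the constant d_ref^T d_ref does not affect argmin).
d_ref = rng.normal(size=m)
ssd = np.sum((d - d_ref) ** 2)
quad = d @ d - 2 * d_ref @ d + d_ref @ d_ref
assert abs(ssd - quad) < 1e-9
```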
Next, we perform PCA on {ŝ_i − s̄}ⁿ₁ to construct Uˢ, a basis for the non-rigid shape variation. We then compute cˢ_i, the coefficients of ŝ_i − s̄ w.r.t. the basis Uˢ. Finally, let p_i = [a_i; cˢ_i]; p_i is the parameter of image d_i w.r.t. our shape model. Notably, the shape model and {p_i}ⁿ₁ are derived independently of the appearance model. The appearance model (i.e. the cost function E(d, p)) is what needs to be learned.

For E(d, p) to have a local minimum at the right place, p_i must be a local minimum of E(d_i, p). Theoretically, this requires the gradient of E(d_i, p) to vanish, i.e.

    ∂E(d_i, p)/∂p |_{p_i} = 0    (4)

Figure 2. Neighborhoods around the ground truth motion parameter p_i (red dot). N_i⁻: region inside the orange circle; it is satisfactory for fitting algorithms to converge to this region. N_i⁺: region outside the blue circle; the alignment algorithm will not be initialized in this region. N_i: shaded region, the region in which constraints on the gradients are enforced.

To learn a cost function that has few local minima, it is necessary to consider p_i's neighborhoods. Let N_i = {p : lb ≤ ‖p − p_i‖ ≤ ub}, N_i⁻ = {p : ‖p − p_i‖ < lb}, and N_i⁺ = {p : ‖p − p_i‖ > ub}. Here lb is chosen such that N_i⁻ is a set of neighboring parameters that are very close to p_i; it is satisfactory for a fitting algorithm to converge to a point in N_i⁻. ub is chosen so that the fitting algorithm is guaranteed to be initialized at a point in N_i or N_i⁻. In most applications, such a ub exists. For example, for tracking problems, ub can be set to the maximum movement of the object being tracked between two consecutive frames. Fig. 2 depicts the relationship between N_i, N_i⁻, and N_i⁺.

Figure 3. p_i: desired convergence location. Blue arrows: gradient vectors; red arrows: walking directions of the gradient descent algorithm; orange arrows: optimal directions to the desired location. Performing gradient descent at p₁ advances closer to p_i, while performing gradient descent at p₂ moves away from p_i.

For a gradient descent algorithm to converge to p_i or a
point close enough to p_i, it is necessary that E(d_i, ·) have no local minima in N_i. This implies that ∂E(d_i, p)/∂p does not vanish for p ∈ N_i. Notably, it is not necessary to enforce similar constraints for p ∈ N_i⁻ ∪ N_i⁺ because of the way lb and ub are chosen. Another desirable property is that each iteration of gradient descent advances closer to the correct position. Because gradient descent walks against the gradient direction at every iteration, we would like the opposite direction of the gradient at a point p ∈ N_i to be similar to the optimal walking direction p_i − p. This quantity can be measured as the projection of the walking direction onto the optimal direction. Fig. 3 illustrates the rationale of this requirement. This requirement leads to the constraints:

    ⟨ (∂E(d_i,p)/∂p)^T , (p_i − p)/‖p_i − p‖ ⟩ > 0,  ∀p ∈ N_i    (5)

Equations (4) and (5) specify the constraints for the ideal cost function. However, these constraints might be too stringent. Therefore, we propose to relax the constraints to get the optimization problem:

    min_{A,b,ξ}  (1/2) Σ_i ‖∂E(d_i,p)/∂p |_{p_i}‖² + C Σ_{i,p} ξ_{ip}    (6)
    s.t.  ⟨ (∂E(d_i,p)/∂p)^T , (p_i − p)/‖p_i − p‖ ⟩ ≥ −ξ_{ip},  ∀p ∈ N_i
          ξ ≥ 0

Here ∂E(d_i,p)/∂p |_{p_i} is required to be small instead of strictly zero. The ξ_{ip} are slack variables for the constraints in (5), which allow for penalized constraint violation. C is the parameter controlling the trade-off between having few local minima and having local minima at the right places.

The gradient of the function E(d, p) plays a fundamental role in the above optimization problem. To compute the gradient ∂E(d,p)/∂p, it is common to use a first-order Taylor series expansion to approximate d(f(x, p + δp)) by d(f(x,p)) + J_d(p)δp, where J_d(p) = ∂d(f(x,p))/∂p is the spatial intensity gradient of the image d w.r.t. the motion parameter p [16]. This yields:

    (∂E(d,p)/∂p)^T ≈ 2 J_d(p)^T (A d(f(x,p)) + b)    (7)

Substituting (7) into (6), we obtain a quadratic optimization problem with linear constraints over A and b.

3.3. Practical issues and alternative fitting methods

In practice, there is an issue regarding the optimization of (6): the small components of ∂E(d,p)/∂p tend to be neglected when optimizing (6). This occurs due to the magnitude difference between some columns of J_d(p).
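Because the linearized gradient (7) is linear in A and b, every constraint in (6) is one linear inequality in (vec(A), b). A hedged NumPy sketch of how such a constraint row could be assembled (the function name and the explicit vec-representation are illustrative choices, not from the paper):

```python
import numpy as np

def constraint_coeffs(d, J, p, p_star):
    """Coefficients of one constraint of Eq. (5), written linearly as
    w_A . vec(A) + w_b . b, using the linearization of Eq. (7):
    grad E(d, p) ~= 2 J^T (A d + b).
    d: appearance vector at p; J: Jacobian J_d(p); p_star: ground truth."""
    u = (p_star - p) / np.linalg.norm(p_star - p)  # optimal walking direction
    Ju = J @ u
    w_A = 2.0 * np.outer(Ju, d).ravel()            # pairs with vec(A)
    w_b = 2.0 * Ju                                 # pairs with b
    return w_A, w_b
```

Stacking one such row per sampled p ∈ N_i gives the linear constraint matrix of the quadratic program over A and b.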
For example, in (2), the magnitudes of the Jacobians of d(f(x,p)) w.r.t. a₁, a₂, a₄, a₅ can be much larger than the magnitudes of the Jacobians of d(f(x,p)) w.r.t. a₃, a₆. To address this concern, we consider an alternative optimization strategy where the update rule at iteration k is:

    p_{k+1} = p_k + δ(p_k)    (8)
    with δ(p_k) = −(1/2) H_d(p_k)⁻¹ (∂E(d,p)/∂p |_{p_k})^T,
         H_d(p_k) = J_d(p_k)^T J_d(p_k)

The update rule of the above algorithm is a variant of the Newton iteration. Intuitively, H_d(p_k) is similar to the Hessian of E(d, p) at p_k, and it acts as a normalization matrix for the gradient. This algorithm is indeed a reasonable optimization scheme for cost functions in which A is symmetric positive semidefinite with all eigenvalues less than or equal to 1. See Theorem 1 in the Appendix for the proof.

Similar to the case of gradient descent, requiring the incremental updates to vanish at, and only at, the places corresponding to acceptable solutions yields the following optimization problem:

    min_{A,b,ξ}  (1/2) Σ_i ‖δ(p_i)‖² + C Σ_{i,p} ξ_{ip}    (9)
    s.t.  ⟨ δ(p), (p_i − p)/‖p_i − p‖ ⟩ ≥ −ξ_{ip},  ∀p ∈ N_i
          ξ ≥ 0.

A is also constrained to be a symmetric positive semidefinite matrix whose eigenvalues are less than or equal to one. By incorporating the ideas of maximal margin and regularization, we obtain:

    min_{A,b,ξ}  (1/2) Σ_i ‖δ(p_i)‖² + C Σ_{i,p} ξ_{ip} + C₂ Ω(A, b)    (10)
    s.t.  ⟨ δ(p), (p_i − p)/‖p_i − p‖ ⟩ ≥ C₃ − ξ_{ip},  ∀p ∈ N_i
          ξ ≥ 0  and  A ∈ H_m,

where H_m denotes the set of all m×m symmetric matrices of which all eigenvalues are non-negative and less than or equal to one. Ω(A, b) is the regularization term for A and b, C₂ is the weight for the regularization term, and C₃ is the user-defined margin size. Since δ(p_i) is linear in terms of A and b, this is a quadratic programming problem with linear constraints, provided the requirement A ∈ H_m can be described by linear constraints.
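One iteration of the normalized update (8) could look as follows in NumPy. This is a sketch under the assumption that image sampling and the Jacobian are supplied as callables; the names are hypothetical, not from the paper:

```python
import numpy as np

def lmf_fit_step(d_of_p, jac_of_p, A, b, p):
    """One iteration of the normalized update of Eq. (8):
    p <- p - (J^T J)^{-1} J^T (A d + b),
    which equals -1/2 H^{-1} grad E with grad E = 2 J^T (A d + b)."""
    d = d_of_p(p)                      # appearance vector d(f(x, p))
    J = jac_of_p(p)                    # Jacobian J_d(p)
    H = J.T @ J                        # normalization matrix of Eq. (8)
    delta = -np.linalg.solve(H, J.T @ (A @ d + b))
    return p + delta
```

On a synthetic linear model d(p) = Mp + d₀ with A = I and b = −d_ref, a single step reaches the least-squares solution, matching the intuition that H_d normalizes the gradient like a Gauss-Newton Hessian.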
Of course, one can derive a similar learning problem for A and b where the Newton method is the optimizer of choice. The incremental update in the Newton iteration is:

    −(1/2) [ J_d(p_k)^T A J_d(p_k) ]⁻¹ (∂E(d,p)/∂p |_{p_k})^T    (11)

However, each Newton iteration has to invert J_d(p_k)^T A J_d(p_k). As a result, learning A and b becomes much harder because the optimization problem is no longer quadratic with linear constraints.

4. Special cases and experiments

Sec. 3.3 proposes a method for learning generic A and b. However, in specific situations, A and b can be further parameterized. The benefits of further parameterization are threefold. First, the number of parameters to learn can be reduced. Second, the relationship between A and b can be established. Third, the constraint that A ∈ H_m can be replaced by a set of linear constraints. This section provides the formulation for two special cases, namely weighted template alignment and weighted-basis AAM alignment. Experimental results on synthetic and real data are included.

4.1. Weighted template alignment

As shown in Sec. 3.1, template alignment is a special case of (3) in which A = I_m and b = −d_ref. In template alignment, the pixels of the template are weighted equally; however, there is no reason why this is optimal. Here, we propose learning the weights of the template pixels to avoid local minima in template matching. Consider the weighted sum of squared differences: (d(f(x,p)) − d_ref)^T diag(w) (d(f(x,p)) − d_ref), where w is the weight vector for the template's pixels. This cost function is equivalent to (3) with A = diag(w) and b = −diag(w) d_ref. The constraint A ∈ H_m can be imposed by requiring 0 ≤ w ≤ 1. Furthermore, in this setting, δ(p_i) = 0. Thus (10) becomes a linear programming problem with linear constraints over w.

To demonstrate this idea, we create a synthetic template of an isotropic Gaussian (Fig. 4a). Suppose the task is to locate the template inside an image containing the template (Fig. 4c), starting at an arbitrary location. Fig. 4d plots the error surface of the naive cost function (sum of squared differences). The value of this error surface at a particular pixel (x, y) is calculated by computing the sum of squared differences between the template and the circular patch centered at (x, y). Similarly, the error surface of the learned cost function (weighted sum of squared differences) is calculated and displayed in Fig. 4e. The learned template weights are shown in Fig. 4b; brighter pixels mean higher weights.

Figure 4. Learning to weight a template's pixels. (a) synthetic template of an isotropic Gaussian. (b) the learned weights; brighter pixels mean higher weights. (c) an image containing the template. (d) error surface of the sum of squared differences. (e) error surface of the weighted sum of squared differences with the learned weights given in (b).

As can be seen, the naive cost function has a fence of local maxima surrounding the template location. This prevents alignment algorithms from converging to the desired location. The learned cost function is convex, and is therefore more suitable for this particular template. The template weights given in Fig. 4b are learned by optimizing (10) with the following parameter settings: Ω(A, b) = 0, C₂ = 0, C₃ = 10, C = 1. The linear constraints are reduced to a set of 5000 constraints obtained by random sampling. How to deal with infinitely many constraints is discussed in more detail in Sec. 4.2.

4.2. Weighted basis for AAM alignment

As shown in Sec. 3.1, AAM alignment is a special case of (3) in which A = I_m − UU^T = I_m − Σᵏᵢ₌₁ uᵢuᵢ^T, and b = 0. U is the set of the first k eigenvectors out of the total of K PCA basis vectors of the training data subspace. k (≤ K) is usually chosen experimentally. In this section, we propose to use all K eigenvectors, but weight them differently. Specifically, we learn A of the form: A = I_m − Σᴷᵢ₌₁ λᵢuᵢuᵢ^T. To ensure that A ∈ H_m, we require 0 ≤ λ ≤ 1. Let w = [λ^T b^T]^T. Substituting this into (10), we get a quadratic programming problem with linear constraints on w. To demonstrate this idea, we perform experiments on the Multi-PIE database [12].
This database consists of facial images of 337 subjects taken under different illuminations, expressions and poses. We only make use of the directly-illuminated frontal face images under five expressions (smile, disgust, squint, surprise and scream). Our dataset contains 1100 images; 400 are selected for training, 200 are used for validation (parameter tuning), and the rest
are reserved for testing. Each face is manually labeled with 68 landmarks, as shown in Fig. 5a. The images are down-sampled.

Figure 5. (a) example of the landmarks associated with each face (red dots), (b) example of shape distortion (yellow pluses), (c) example of the patches used for appearance modeling.

The shape model is built as described in Sec. 3.2. The final shape model requires 10 coefficients (6 affine + 4 non-rigid) to describe a shape. For the object appearance, we extract the intensity values of the pixels inside the patches located at the landmarks (Fig. 5c). The training data is further divided into two subsets; one contains 300 images and the other contains 100 images. U is obtained by performing PCA on the subset of 300 images. The second subset is used to set up the optimization problem (10). For better generalization, (10) is constructed without using images in the first training subset. To avoid N_i being of infinite size, we restrict our attention to a set of 200 random samples from N_i. The random samples are drawn by introducing random Gaussian perturbations to the correct shape parameter p_i.

Following the approach of Tsochantaridis et al. [22] for minimizing a quadratic function with an exponentially large number of linear constraints, we maintain a smaller subset of active constraints S and optimize (10) iteratively. We repeat the following steps for 10 iterations: (i) empty S; (ii) randomly choose 20 training images; (iii) for each chosen training image d_i, find the 100 most violated constraints from N_i and include them in S; (iv) run quadratic programming with the reduced set of constraints.

Testing data are generated by randomly perturbing the components of p_i, the correct shape parameters of the test image d_i. Perturbation amounts are generated from a zero-mean Gaussian distribution whose standard deviation is PerMag times a fixed vector. PerMag controls the overall difficulty of the testing data. The relative perturbation amounts of the shape coefficients are determined to simulate possible motion in tracking, and this is estimated visually. Fig. 5b shows an example of shape perturbation; the ground truth landmarks are marked in red (circles), while the perturbed shape is shown in yellow (pluses).

Table 1 describes the experimental results with four difficulty levels of testing data (controlled by PerMag).

Table 1. Alignment results of different methods for four different difficulty levels of testing data (PerMag). "Initial" is the initial amount of perturbation before running any alignment algorithm. "PCA e%" is the cost function constructed using PCA preserving e% of the energy. The table shows the means and standard deviations of misalignment (averaged over the 68 landmarks and over the testing data). The unit of measurement is pixels.
[Rows: Initial, PCA 100%, PCA 90%, PCA 80%, PCA 70%, Ours; columns: the four PerMag levels.]

The performance of the learned cost function is compared with four other cost functions constructed using PCA with popular energy settings (70%, 80%, 90%, and 100%). As can be observed, when the amount of perturbation is small, PCA models with higher energy levels perform better. However, as the amount of perturbation increases, PCA models with lower energy levels perform better. This suggests that cost functions using fewer basis vectors have fewer local minima, while cost functions using more basis vectors are more likely to have local minima at the right places. Thus it is unclear what the energy for the PCA model should be. On the other hand, the learned cost function performs significantly better than the PCA models at most difficulty levels. In this experiment, Ω(A, b) = ‖b‖₂² and C₂ = 0.1; the parameters are tuned using the validation set.

5. Conclusion

In this paper, we have proposed a method for learning cost functions for PAMs. We directly address the problem of learning cost functions that have local minima at, and only at, the desired places. The task of learning a cost function is formulated as optimizing a quadratic function under some linear constraints.
To the best of our knowledge, this is the first paper that addresses this problem. Encouraging results have been achieved in the context of template matching and AAM fitting. Further work needs to address how to select the most interesting points in the error surface so as to reduce the number of constraints in the optimization.

Acknowledgments: This material is based upon work supported by the U.S. Naval Research Laboratory under Contract No. N C-040 and National Institutes of Health Grant R01 MH. Any opinions, findings and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the U.S. Naval Research Laboratory.
Appendix

This section states and proves a theorem used to justify the optimization algorithm given in (8).

Theorem 1: Consider an m-dimensional function f(x) of a p-dimensional variable x, and suppose we have to minimize the function E(x) = f(x)^T A f(x) + 2 b^T f(x), where A ∈ H_m. Consider an iterative optimization method which has the following update rule:

    x_new = x_old + δx  with  δx = −H⁻¹ J^T (A f(x) + b)    (12)
    and  J = ∂f/∂x |_{x_old},  H = J^T J

The above optimization method, when started sufficiently close to a regular local minimum, will converge to that local minimum. Here, a point x₀ is said to be regular if H is not singular and the Taylor series of f(·) converges for every point in the neighborhood of x₀.

Proving Theorem 1 requires two lemmas. We now state and prove those two lemmas.

Lemma 1: A ∈ H_m if and only if I_m − A ∈ H_m.
Proof: This lemma can be proven easily, based on:

    0 ≤ u^T A u / u^T u ≤ 1  ⇔  0 ≤ u^T (I_m − A) u / u^T u ≤ 1,  ∀u    (13)

Lemma 2: A ∈ H_m if and only if there exist a positive integer k, scalars αᵢ, and matrices Bᵢ such that:
i. Bᵢ^T Bᵢ is invertible, ∀i = 1, ..., k;
ii. αᵢ ≥ 0, ∀i = 1, ..., k, and Σᵏᵢ₌₁ αᵢ ≤ 1;
iii. A = Σᵏᵢ₌₁ αᵢ Bᵢ (Bᵢ^T Bᵢ)⁻¹ Bᵢ^T.

Proof of sufficiency: Suppose there exist k, αᵢ's, and Bᵢ's that satisfy all three conditions above. Because A is a linear combination of symmetric matrices, A is also symmetric. We only need to prove that A is positive semidefinite with all eigenvalues less than or equal to 1. Consider v^T A v for an arbitrary vector v ∈ R^m:

    v^T A v = Σᵢ αᵢ v^T Bᵢ (Bᵢ^T Bᵢ)⁻¹ Bᵢ^T v
            = Σᵢ αᵢ v^T Bᵢ (Bᵢ^T Bᵢ)⁻¹ Bᵢ^T Bᵢ (Bᵢ^T Bᵢ)⁻¹ Bᵢ^T v
            = Σᵢ αᵢ ‖Bᵢ (Bᵢ^T Bᵢ)⁻¹ Bᵢ^T v‖²    (14)

We know that Bᵢ(Bᵢ^T Bᵢ)⁻¹Bᵢ^T is a projection matrix and Bᵢ(Bᵢ^T Bᵢ)⁻¹Bᵢ^T v is the projection of v onto the subspace spanned by Bᵢ. Thus we have ‖Bᵢ(Bᵢ^T Bᵢ)⁻¹Bᵢ^T v‖ ≤ ‖v‖. Therefore:

    v^T A v ≤ (Σᵢ αᵢ) ‖v‖² ≤ ‖v‖²    (15)

Furthermore, we have v^T A v ≥ 0 because ‖Bᵢ(Bᵢ^T Bᵢ)⁻¹Bᵢ^T v‖² ≥ 0 and αᵢ ≥ 0. Combining this with the inequality in (15), we have: 0 ≤ v^T A v ≤ v^T v. Since these inequalities hold for an arbitrary vector v ∈ R^m, A must be an element of H_m.

Proof of necessity: Suppose A ∈ H_m. Consider the singular value decomposition of A, A = UΛU^T. Here, the columns of U are orthonormal vectors, and Λ is a diagonal matrix, Λ = diag([λ₁, ..., λ_m]) with 0 ≤ λᵢ ≤ 1. Without loss of generality, suppose λ₁ ≥ λ₂ ≥ ... ≥ λ_m. We have:

    A = UΛU^T = Σᵐᵢ₌₁ λᵢ uᵢ uᵢ^T
      = Σᵐ⁻¹ᵢ₌₁ (λᵢ − λᵢ₊₁)(Σⁱⱼ₌₁ uⱼuⱼ^T) + λ_m (Σᵐⱼ₌₁ uⱼuⱼ^T)    (16)

Let αᵢ = λᵢ − λᵢ₊₁ for i = 1, ..., m−1, and α_m = λ_m.
Let Bᵢ = [u₁ ... uᵢ] for i = 1, ..., m. Since {uᵢ}ᵐ₁ is a set of orthonormal vectors, Bᵢ^T Bᵢ = Iᵢ, an identity matrix. Therefore, Bᵢ(Bᵢ^T Bᵢ)⁻¹Bᵢ^T = BᵢBᵢ^T = Σⁱⱼ₌₁ uⱼuⱼ^T. Hence:

    A = Σᵐᵢ₌₁ αᵢ Bᵢ (Bᵢ^T Bᵢ)⁻¹ Bᵢ^T    (17)

Finally, we have αᵢ ≥ 0 and Σᵐᵢ₌₁ αᵢ = λ₁ ≤ 1. This completes our proof of Lemma 2.

Proof of Theorem 1: From Lemmas 1 and 2 we know that there exist αᵢ ≥ 0 and Bᵢ such that I_m − A = Σᵏᵢ₌₁ αᵢ Bᵢ(Bᵢ^T Bᵢ)⁻¹Bᵢ^T and Σᵏᵢ₌₁ αᵢ ≤ 1. To prove Theorem 1, let us first consider the optimization of the following function:

    E'(x, {cᵢ}) = Σᵢ αᵢ ‖f(x) − Bᵢcᵢ‖² + α₀ ‖f(x)‖² + 2 b^T f(x)    (18)

with α₀ = 1 − Σᵏᵢ₌₁ αᵢ. One way to optimize this function is to use coordinate descent, alternating between:
i. minimizing E' w.r.t. x while fixing {cᵢ};
ii. minimizing E' w.r.t. {cᵢ} while fixing x.

To minimize E' w.r.t. x while fixing {cᵢ}, we can use the Newton method:

    x_new = x_old − (∂²E'/∂x²)⁻¹ (∂E'/∂x)^T

Using the first-order Taylor approximation, we have f(x + δx) ≈ f(x) + Jδx with J = ∂f/∂x. Thus:

    E'(x + δx, {cᵢ}) ≈ E'(x, {cᵢ}) + δx^T J^T J δx + 2 δx^T J^T (f(x) − Σᵢ αᵢBᵢcᵢ + b)    (19)

Hence:

    ∂E'/∂x ≈ 2 (f(x) − Σᵢ αᵢBᵢcᵢ + b)^T J    (20)
    ∂²E'/∂x² ≈ 2 J^T J    (21)
Therefore, we have the Newton update rule:

    x_new = x_old − (J^T J)⁻¹ J^T (f(x) − Σᵢ αᵢBᵢcᵢ + b)    (22)

When x is fixed, the {cᵢ(x)} that globally minimize E' are:

    cᵢ(x) = (Bᵢ^T Bᵢ)⁻¹ Bᵢ^T f(x)    (23)

Combining (22) and (23), we have the update rule for minimizing E':

    x_new = x_old − (J^T J)⁻¹ J^T [A f(x) + b]

This update rule is exactly the same as the update rule given in (12). As a result, (12) will always lead us to a local minimum of E'. We now prove that a local minimum of E' obtained by (12) is also a local minimum of E. Suppose (x₀, {cᵢ(x₀)}) is a local minimum of E'; then there exists ε₁ > 0 such that E'(x₀, {cᵢ(x₀)}) ≤ E'(x₀ + δx, {cᵢ(x₀) + δcᵢ}) for all δx, δcᵢ with ‖δx‖ + Σᵢ‖δcᵢ‖ < ε₁. Because cᵢ(x) is a continuous function of x, we can always find ε₂ > 0 small enough such that, for all δx, if ‖δx‖ < ε₂ then ‖δx‖ + Σᵢ‖cᵢ(x₀ + δx) − cᵢ(x₀)‖ < ε₁. Thus there exists ε₂ such that E'(x₀, {cᵢ(x₀)}) ≤ E'(x₀ + δx, {cᵢ(x₀ + δx)}) for all δx with ‖δx‖ < ε₂. On the other hand, one can easily verify that min_{cᵢ} E'(x, {cᵢ}) = E'(x, {cᵢ(x)}) = E(x) for all x. Therefore, we have ε > 0 such that E(x₀) ≤ E(x₀ + δx) for all δx with ‖δx‖ < ε. Hence, x₀ must be a local minimum of E.

To sum up, we have shown that (12) will converge to a local minimum of E'. Furthermore, a local minimum of E' found by (12) is also a local minimum of E. Thus the update rule given in (12) is guaranteed to converge to a local minimum of E. This concludes our proof of Theorem 1.

References

[1] S. Baker and I. Matthews. Lucas-Kanade 20 years on: a unifying framework. International Journal of Computer Vision, 56(3):221-255, March 2004.
[2] J. R. Bergen, P. Anandan, K. J. Hanna, and R. Hingorani. Hierarchical model-based motion estimation. In European Conference on Computer Vision, pages 237-252, 1992.
[3] M. J. Black, D. J. Fleet, and Y. Yacoob. Robustly estimating changes in image appearance. Computer Vision and Image Understanding, 78(1):8-31, 2000.
[4] M. J. Black and A. D. Jepson. Eigentracking: Robust matching and tracking of objects using view-based representation. International Journal of Computer Vision, 26(1):63-84, 1998.
[5] V. Blanz and T. Vetter. A morphable model for the synthesis of 3D faces.
In ACM SIGGRAPH, 1999.
[6] T. Cootes, G. Edwards, and C. Taylor. Active appearance models. IEEE Transactions on Pattern Analysis and Machine Intelligence, 23(6):681-685, 2001.
[7] T. F. Cootes and C. Taylor. Statistical models of appearance for computer vision. Technical report, University of Manchester, 2001.
[8] F. de la Torre and M. J. Black. Robust parameterized component analysis: theory and applications to 2D facial appearance models. Computer Vision and Image Understanding, 91:53-71, 2003.
[9] F. de la Torre, A. Collet, J. Cohn, and T. Kanade. Filtered component analysis to increase robustness to local minima in appearance models. In IEEE Conference on Computer Vision and Pattern Recognition, 2007.
[10] F. de la Torre, J. Vitrià, P. Radeva, and J. Melenchón. Eigenfiltering for flexible eigentracking. In International Conference on Pattern Recognition, 2000.
[11] S. Gong, S. McKenna, and A. Psarrou. Dynamic Vision: From Images to Face Recognition. Imperial College Press, 2000.
[12] R. Gross, I. Matthews, J. Cohn, T. Kanade, and S. Baker. The CMU multi-pose, illumination, and expression (Multi-PIE) face database. Technical report, Robotics Institute, Carnegie Mellon University, 2007.
[13] I. Jolliffe. Principal Component Analysis. Springer-Verlag, New York, 1986.
[14] M. J. Jones and T. Poggio. Multidimensional morphable models. In International Conference on Computer Vision, 1998.
[15] X. Liu. Generic face alignment using boosted appearance model. In IEEE Conference on Computer Vision and Pattern Recognition, 2007.
[16] B. Lucas and T. Kanade. An iterative image registration technique with an application to stereo vision. In Proceedings of the Imaging Understanding Workshop, 1981.
[17] I. Matthews and S. Baker. Active appearance models revisited. International Journal of Computer Vision, 60(2):135-164, November 2004.
[18] I. Matthews, T. Ishikawa, and S. Baker. The template update problem. IEEE Transactions on Pattern Analysis and Machine Intelligence, 26:810-815, 2004.
[19] S. K. Nayar and T. Poggio. Early Visual Learning. Oxford University Press, 1996.
[20] J. Saragih and R. Goecke. A nonlinear discriminative approach to AAM fitting.
In International Conference on Computer Vision, 2007.
[21] L. Sirovich and M. Kirby. Low-dimensional procedure for the characterization of human faces. Journal of the Optical Society of America A: Optics, Image Science, and Vision, 4(3):519-524, March 1987.
[22] I. Tsochantaridis, T. Joachims, T. Hofmann, and Y. Altun. Large margin methods for structured and interdependent output variables. Journal of Machine Learning Research, 6:1453-1484, 2005.
[23] M. Turk and A. Pentland. Eigenfaces for recognition. Journal of Cognitive Neuroscience, 3(1):71-86, 1991.
[24] T. Vetter. Learning novel views to a single face image. In International Conference on Automatic Face and Gesture Recognition, pages 22-27, 1996.
[25] M. Wimmer, F. Stulp, S. J. Tschechne, and B. Radig. Learning robust objective functions for model fitting in image understanding applications. In Proceedings of the British Machine Vision Conference, 2006.
[26] J. Xiao, S. Baker, I. Matthews, and T. Kanade. Real-time combined 2D+3D active appearance models. In IEEE Conference on Computer Vision and Pattern Recognition, volume II, pages 535-542, 2004.