Transductive Regression Piloted by Inter-Manifold Relations


Huan Wang, IE, The Chinese University of Hong Kong, Hong Kong
Shuicheng Yan, Thomas Huang, ECE, University of Illinois at Urbana-Champaign, USA
Jianzhuang Liu, Xiaoou Tang, IE, The Chinese University of Hong Kong, Hong Kong

Appearing in Proceedings of the 24th International Conference on Machine Learning, Corvallis, OR, 2007. Copyright 2007 by the author(s)/owner(s).

Abstract

In this paper, we present a novel semi-supervised regression algorithm working on multi-class data that may lie on multiple manifolds. Unlike conventional manifold regression algorithms that do not consider the class distinction of samples, our method introduces the class information into the regression process and tries to exploit the similar configurations shared by the label distributions of multi-class data. To utilize the correlations among data from different classes, we develop a cross-manifold label propagation process and employ labels from different classes to enhance the regression performance. The inter-class relations are coded by a set of inter-manifold graphs and a regularization term is introduced to impose inter-class smoothness on the possible solutions. In addition, the algorithm is further extended with the kernel trick for predicting labels of out-of-sample data even without class information. Experiments on both synthesized data and real-world problems validate the effectiveness of the proposed framework for semi-supervised regression.

1. Introduction

Large scale and high dimensional data are ubiquitous in real-world applications, yet the processing and analysis of these data are often difficult due to the curse of dimensionality as well as the high computational cost involved. Usually, these high dimensional data lie approximately on an underlying compact low dimensional manifold, which may make the problem tractable. Substantive works have been devoted to unveiling the intrinsic structure of manifold data, among which the popular ones include ISOMAP (Tenenbaum et al., 2000), LLE (Roweis & Saul, 2000), Laplacian Eigenmap (Belkin & Niyogi, 2003), and MVU (Weinberger et al., 2004). Besides these unsupervised algorithms that purely explore the manifold structure of the data, researchers have also utilized the latent manifold structure information from both labeled and unlabeled samples to enhance learning algorithms with a limited number of labeled samples. Great success has been achieved in various areas, such as classification (Krishnapuram et al., 2005) (Belkin et al., 2005), manifold alignment (Ham et al., 2005) and regression (Belkin et al., 2004) (Zhu & Goldberg, 2005).

Recently, increasing attention has been drawn to the semi-supervised regression problem by considering the manifold structure. (Chapelle et al., 1999) proposed a transductive algorithm minimizing the leave-one-out error of ridge regression on the joint set composed of both labeled and unlabeled data. To exploit the manifold structure, (Belkin et al., 2004) adds a graph Laplacian regularization term to the regression objective, which imposes an extra smoothness condition along the data manifold and has proved to be quite useful in applications. (Cortes & Mohri, 2007) first roughly transduces the function values from the labeled data to the unlabeled ones utilizing local neighborhood relations, and then optimizes a global objective that best fits the

labels of the training points as well as the estimated labels provided by the first step.

None of these state-of-the-art regression algorithms makes use of the class information to guide the regression. In real-world applications, however, multi-class samples are ubiquitous and samples from different classes can be regarded as lying on multiple manifolds. For example, in age estimation, the personal aging process varies differently for different genders, but there still exists consistency between the aging processes of male and female. In pose estimation from images of multiple persons, each person can be regarded as one class and the images vary similarly with pose across different persons. Moreover, we may also encounter multi-modality samples whose representations may not be consistent due to the lack of correspondence, while the inner manifold configuration of the label distribution across different modalities may still be similar. Besides, in the function learning stage of the semi-supervised regression framework, the class or modality information is usually easy to obtain, thus it is desirable to utilize this information to further boost the regression accuracy.

To fully exploit the relations among data manifolds, we develop in this paper a semi-supervised regression algorithm called TRIM (Transductive Regression piloted by Inter-Manifold relations). TRIM is based on the assumption that the data of different classes share similar configurations of label distributions. In our proposed algorithm, manifolds of different classes are aligned according to the landmark connections constructed from both the sample label distance and the manifold structural similarity, and then the labels are transduced across different manifolds based on the alignment output. Besides the intra-manifold smoothness condition within each class, our method introduces an inter-manifold regularization term and employs the transduced labels from various data manifolds to pilot the trend of the function values over all the samples. In addition, the function to be learnt can be approximated by the functionals lying in the Reproducing Kernel Hilbert Space (RKHS), and the regression on the induced RKHS directly leads to an efficient solution for label prediction on out-of-sample data even without corresponding class information.

2. Background

2.1. Intuitive Motivations

Consider the problem of adult height estimation based on the parents' heights. A dramatic difference exists between male and female adults, while the general configuration of the height distribution within each gender may still be similar: the taller the parents are, the taller the children are supposed to be. Once we mix up the data from both genders, however, the height distribution may become complicated.

Figure 1. Demonstration of the inter-class relations. Labeled samples are marked by '+' and the inter-manifold relations are indicated by the dashed lines. The manifolds are aligned and the sample labels are propagated across manifolds to pilot the regression.

Figure 1 displays a toy example of the obstacles encountered in the multi-class regression problem. We have two classes of samples and the function values within each class manifold share similar configurations, while the number of samples and labels may vary across classes. As shown in the upper part of Figure 1, it is difficult to get a satisfactory regression if we put all the samples together due to the complex structures produced by the manifold intersections. On the other hand, if the regression is carried out separately on different classes, the accuracy could still be low due to the lack of sufficient labeled samples for each class. Also, for the incoming out-of-sample data, the prediction cannot be done without class information.

Moreover, we may face regression problems with multi-modality samples, e.g., the estimation of human ages from both photos and sketch images. It is meaningless to compose data of different modalities into one manifold since the semantics of the features from different modalities are essentially different, while it is still possible that the intra-modality label configurations are similar. In this paper we focus on the multi-class regression problem, but our framework can also be easily extended to give predictions for multi-modality data.

2.2. Related Work

The multi-view learning (Blum & Mitchell, 1998) technique can also be applied to the regression problem with multi-class data. Denote the labeled and unlabeled instances as x_l \in X_l \subset X and x_u \in X_u \subset X respectively, where X_l, X_u, and X are the labeled, unlabeled and whole sample sets. State-of-the-art semi-supervised multi-view learning frameworks employ multiple regression functions from the Hilbert spaces H_v so that the estimation error on the training set and the disagreement among the functions on the unlabeled data

are minimized (Brefeld et al., 2006), i.e.,

\{f_v\}_{v=1}^{M_l} = \arg\min_{f_v \in H_v} \sum_{v=1}^{M_l} \Big( \sum_{x_l \in X_l} c\big(f_v(x_l), y(x_l)\big) + \gamma \|f_v\|^2 \Big) + \lambda \sum_{v_1, v_2 = 1}^{M_l} \sum_{x_u \in X_u} c\big(f_{v_1}(x_u) - f_{v_2}(x_u)\big),   (1)

where y(x) is the sample label, \{f_v\}_{v=1}^{M_l} are the M_l multiple learners and c(\cdot) is a cost function. We would like to highlight beforehand some properties of our framework compared to the multi-view learning algorithms:

1. There exists a clear correspondence between the multi-view data from the same instance in the multi-view learning framework, while our algorithm does not require the correspondence among different manifolds. The data of different modalities or classes may be obtained from different instances in our configuration, thus it is much more challenging to give an accurate regression.

2. The class information is utilized in two ways: a) Sample relations within each class are coded by intra-manifold graphs and a corresponding regularization term is introduced to ensure the within-class smoothness separately. b) A set of inter-manifold graphs are constructed from the cross-manifold label propagation on the aligned manifolds and an inter-manifold regularization term is proposed to fully exploit the information conveyed among different classes.

3. The class information is used in the function learning phase but no class attributes are required for the out-of-sample data in the prediction stage.

3. Problem Formulation

3.1. Notations

Assume that the whole sample set X consists of N samples from M classes, denoted as X^1, X^2, ..., X^M. For each class X^k, N^k = l^k + u^k samples are given, i.e., X^k = \{(x^k_1, y^k_1), ..., (x^k_{l^k}, y^k_{l^k}), (x^k_{l^k+1}, y^k_{l^k+1}), ..., (x^k_{l^k+u^k}, y^k_{l^k+u^k})\}, where Y^k_l = [y^k_1, y^k_2, ..., y^k_{l^k}]^T represents the function values of the given labeled samples and Y^k_u = [y^k_{l^k+1}, ..., y^k_{l^k+u^k}]^T corresponds to the function values of the remaining unlabeled data to be estimated in the k-th class. In this paper, the label refers to a real value to be regressed.

Let G^k = (V^k, E^k) denote the intra-class graph with vertex set V^k and edge set E^k constructed within the data of the k-th class. Here we focus on the undirected graph and it is easy to generalize our algorithm for directed graphs. The edges in E^k reflect the neighborhood relations along the manifold data, which can be defined in terms of k-nearest neighbors or an \epsilon-ball distance criterion in the sample feature space F. One choice for the non-negative weights on the corresponding edges is the heat kernel (Belkin & Niyogi, 2003) or the inverse of the feature distances (Cortes & Mohri, 2007), i.e.,

w_{ij} = e^{-\frac{\|\Phi^k(x_i) - \Phi^k(x_j)\|^2}{t}} \quad \text{or} \quad w_{ij} = \frac{1}{\|\Phi^k(x_i) - \Phi^k(x_j)\|},

where t \in R is the parameter of the heat kernel and \Phi^k(\cdot) is a feature mapping from X to the normed feature vector space F for the samples of the k-th class. For samples from different modalities/manifolds, the feature mappings may be different. The other one is to solve a least-squares problem to minimize the reconstruction error and get the weights w_{ij} (Roweis & Saul, 2000):

w_{ij} = \arg\min_{w_{ij}} \Big\| x_i - \sum_j w_{ij} x_j \Big\|^2, \quad \text{s.t.} \; \sum_j w_{ij} = 1, \; w_{ij} \geq 0.

To encode the mutual relations among different sample classes, we also introduce the inter-manifold graph, denoted as a triplet G^{k k_j} = (V^k, V^{k_j}, E^{k k_j}). The inter-manifold graph G^{k k_j} is a bipartite graph with one vertex set from the k-th class and the other from the k_j-th. The construction of G^{k k_j} will be discussed in the following subsections.

3.2. Regularization along Data Manifold

Now we are given the sample data of multiple classes, denoted as X, including both labeled and unlabeled samples X_l and X_u. The manifold structures within each class are encoded by the intra-class graphs.
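As a concrete illustration of the intra-class graph construction just described, the following is a minimal sketch (our own, not the authors' implementation) of how a k-nearest-neighbor graph with heat-kernel weights and its graph Laplacian might be built for one class; the function name, the NumPy dependency and the default parameter values are assumptions.

```python
import numpy as np

def intra_class_graph(X, n_neighbors=10, t=1.0):
    """Build a symmetric k-nearest-neighbor graph with heat-kernel weights
    w_ij = exp(-||x_i - x_j||^2 / t) for the samples of one class, and return
    (W, L), where L = D - W is the unnormalized graph Laplacian."""
    n = X.shape[0]
    sq = np.sum(X ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)  # pairwise squared distances
    np.fill_diagonal(d2, np.inf)                                     # exclude self-loops
    W = np.zeros((n, n))
    for i in range(n):
        nn = np.argsort(d2[i])[:n_neighbors]                         # k nearest neighbors of x_i
        W[i, nn] = np.exp(-d2[i, nn] / t)                            # heat-kernel weights
    W = np.maximum(W, W.T)                                           # symmetrize (undirected graph)
    L = np.diag(W.sum(axis=1)) - W
    return W, L

# e.g. W1, L1 = intra_class_graph(np.random.randn(200, 5))
```

The same construction would be applied to each class X^k separately to obtain the intra-manifold graphs G^k.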

(Belkin et al., 2004) introduced a manifold regularization term for the semi-supervised regression framework, i.e., the graph Laplacian, which is expected to impose smoothness conditions along the manifold on the possible solutions. The final formulation seeks a balance between the fitting term and the smoothness regularization, i.e.,

f^* = \arg\min_f \frac{1}{l} \sum_{i=1}^{l} (f_i - y_i)^2 + \gamma f^T L^p f,   (2)

where p \in N. When p = 1, the regularization term is the graph Laplacian, and for p = 2 the regularization turns out to be the 2-norm of the reconstruction error when the weights w_{ij} are normalized, i.e.,

f^T L^T L f = \sum_i \Big\| f_i - \sum_j w_{ij} f_j \Big\|^2, \quad \text{w.r.t.} \; \sum_j w_{ij} = 1, \; w_{ij} \geq 0.

3.3. Cross-Manifold Label Propagation

In our configuration, there does not exist a clear correspondence among the manifolds of different classes or modalities, and even the representations of different modalities can be distinct. Thus it is rather difficult to construct sample relations directly from the similarity in the sample space X. Alternatively, the function labels still convey some correspondence information, which may be utilized to guide the inter-manifold relations. Moreover, the manifold structure also contains some indications about the correspondence. Thus we first seek a point-to-point correspondence for the labeled data combining the indications provided by both the sample labels and the manifold structures. This is done under two assumptions:

1. Samples with similar labels generally lie in similar relative positions on the corresponding manifolds.

2. Corresponding sample sets tend to share similar graph structures on their respective manifolds.

3.3.1. Reinforced Landmark Correspondence

First we search for a set of stable landmarks to guide the manifold alignment. Specifically, we use the \epsilon-ball distance criterion on the sample labels to initialize the inter-manifold graphs. To give a robust correspondence, we reinforce the inter-manifold connections by iteratively implementing

W^{k k_j} \leftarrow W^{k} W^{k k_j} W^{k_j}.   (3)

Similar to the similarity propagation process on directed graphs (Blondel et al., 2004), (3) reinforces the similarity score of a sample pair with the similarities of its neighbor pairs, i.e., element-wise,

w^{k k_j}_{ij} = \sum_{m,n} w^{k}_{im} \, w^{k k_j}_{mn} \, w^{k_j}_{nj}.   (4)

Figure 2. Demonstration of the reinforced landmark correspondence. The similarity score of a_1 and b_2 is reinforced by the scores of the neighbor pairs. The assumption here is that two nodes tend to be similar if they have similar neighbors.

(4) utilizes the intra-manifold structure information to reinforce the inter-manifold similarity and thus can generate a more robust correspondence. The accuracy of the landmark correspondence is critical for our algorithm. To ensure a robust performance, only the correspondences with the top 20% largest similarity scores are selected as the landmarks. It is common that some classes may miss certain sample labels, so plenty of labeled samples remain unmatched.

3.3.2. Manifold Alignment

To propagate the sample labels to the unlabeled points across manifolds, we stretch all the manifolds with respect to the landmark points obtained from the previous step; this can be realized by semi-supervised manifold alignment (Ham et al., 2005). In the manifold alignment process, we seek an embedding that minimizes the correspondence error on the landmark points and at the same time keeps the intra-manifold structures, i.e.,

\{f^k\}_{k=1}^{M} = \arg\min_{\sum_k f^{kT} D^k f^k = 1} C(\{f^k\}_{k=1}^{M}),   (5)

C(\{f^k\}_{k=1}^{M}) = \sum_{k, k_j} \sum_{i,j} w^{k k_j}_{ij} \Big\| f^k_{x^k_i} - f^{k_j}_{x^{k_j}_j} \Big\|^2 + \gamma \sum_{k=1}^{M} (f^{kT} L^p_k f^k) + \beta f^T L_a f,   (6)

where D^k is the diagonal degree matrix, L_k is the graph Laplacian matrix of the k-th class and p \in N. To ensure the inter-class adjacency, we add a global compactness regularization term \beta f^T L_a f to the cost function C, where L_a is the Laplacian matrix of W_a with w^a_{ij} = 1 if x_i and x_j are of different classes, and w^a_{ij} = 0 otherwise.
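One way the alignment step could be realized is sketched below; this is our own illustration, which treats the quadratic cost (6) under the normalization of (5) as a generalized eigenvalue problem, the usual treatment of such Laplacian costs (Ham et al., 2005) but not necessarily the authors' exact implementation. The block-matrix inputs, the small diagonal regularizer, and the eigenvector selection are assumptions.

```python
import numpy as np
from scipy.linalg import eigh

def align_manifolds(L_intra, D_intra, W_cross, L_a, gamma=1.0, beta=1.0, dim=2):
    """Embed all classes into a common space by minimizing the alignment cost of Eqs. (5)-(6):
    correspondence error on the landmarks + intra-manifold smoothness + global compactness,
    under the degree normalization of (5).  Solved here as a generalized eigenproblem.
    L_intra : block-diagonal matrix of the per-class Laplacians L_k (or L_k^p)
    D_intra : block-diagonal matrix of the per-class degree matrices D^k
    W_cross : symmetrized block matrix of the inter-manifold landmark graphs W^{k k_j}
    L_a     : Laplacian of the different-class indicator graph W_a"""
    L_cross = np.diag(W_cross.sum(axis=1)) - W_cross       # Laplacian of the correspondence graph
    M = L_cross + gamma * L_intra + beta * L_a              # quadratic form of the cost C
    # smallest generalized eigenvectors of (M, D); the diagonal shift keeps D positive definite
    vals, vecs = eigh(M, D_intra + 1e-9 * np.eye(M.shape[0]))
    return vecs[:, 1:dim + 1]                               # skip the near-constant eigenvector (heuristic)
```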
The labels are propagated across manifolds on the derived aligned manifolds using the nearest neighbor approach, i.e., we connect the labeled samples with the nearest points from the other classes in the aligned space.
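A hedged sketch (ours) of how the landmark construction and this nearest-neighbor connection might look in code is given here, corresponding roughly to steps 1-3 and 5 of Algorithm 1 below; the rescaling inside the reinforcement loop and the reading of "top 20%" as a fraction of all candidate pairs are our assumptions, and the aligned coordinates are assumed to come from a routine such as the alignment sketch above.

```python
import numpy as np

def reinforced_landmarks(W_k, W_kj, y_k, y_kj, eps=1.0, n_iter=3, top_ratio=0.2):
    """Steps 1-3 of Algorithm 1: initialize the inter-manifold graph from label distances,
    reinforce it via W <- W^k W W^{k_j} (Eqs. (3)/(4)), and keep the highest-scoring pairs
    as landmark correspondences.  W_k / W_kj are the intra-class graphs restricted to the
    labeled samples of the two classes; y_k / y_kj are their labels."""
    W = ((y_k[:, None] - y_kj[None, :]) ** 2 < eps).astype(float)  # epsilon-ball initialization
    for _ in range(n_iter):
        W = W_k @ W @ W_kj                                         # correspondence reinforcement
        W /= np.linalg.norm(W) + 1e-12                             # rescale to keep scores bounded (our choice)
    n_keep = max(1, int(top_ratio * W.size))
    thresh = np.sort(W.ravel())[-n_keep]
    return (W >= thresh).astype(float)                             # binary landmark graph W^{k k_j}

def cross_manifold_neighbors(embed, class_of, labeled_mask):
    """Step 5 of Algorithm 1: in the aligned space `embed` (one row per sample), connect every
    labeled sample to its nearest point from each other class."""
    n = embed.shape[0]
    W = np.zeros((n, n))
    d2 = ((embed[:, None, :] - embed[None, :, :]) ** 2).sum(-1)    # pairwise squared distances
    for i in np.where(labeled_mask)[0]:
        for c in np.unique(class_of):
            if c == class_of[i]:
                continue
            cand = np.where(class_of == c)[0]
            j = cand[np.argmin(d2[i, cand])]                       # nearest point of class c
            W[i, j] = W[j, i] = 1.0
    return W
```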

Algorithm 1. Procedure to construct the inter-manifold connections
1: Inter-manifold graph initialization: w^{k k_j}_{ij} = 1 if \|y^k_i - y^{k_j}_j\|^2 < \epsilon, and w^{k k_j}_{ij} = 0 otherwise.
2: Correspondence reinforcement: for Iter = 1 : N_Iter, W^{k k_j} = W^{k} W^{k k_j} W^{k_j}; end.
3: Landmark selection: select the sample pairs with the top 20% largest similarity scores as correspondences; set the corresponding elements in W^{k k_j} to 1 and the others to 0.
4: Manifold alignment using the inter-manifold graphs W^{k k_j}.
5: Find the corresponding points from different classes for the unmatched labeled samples using the nearest neighbor approach in the aligned space and update the inter-manifold graphs W^{k k_j}.

The derived inter-manifold graphs are concatenated to form

W_r = \begin{pmatrix} O & W^{12} & \cdots & W^{1M} \\ W^{21} & O & \cdots & W^{2M} \\ \vdots & \vdots & \ddots & \vdots \\ W^{M1} & W^{M2} & \cdots & O \end{pmatrix},   (7)

which is then symmetrized and employed in the following inter-manifold regularization term.

3.4. Inter-Manifold Regularization

We rearrange the samples to place the data from the same class together and to put the labeled points first within each class, that is, X = \{x^1_1, x^1_2, ..., x^1_{l^1}, x^1_{l^1+1}, ..., x^1_{l^1+u^1}, x^2_1, x^2_2, ..., x^2_{l^2}, x^2_{l^2+1}, ..., x^2_{l^2+u^2}, ..., x^M_1, x^M_2, ..., x^M_{l^M}, x^M_{l^M+1}, ..., x^M_{l^M+u^M}\}. Denote the corresponding function values as f, which is a concatenation of the vectors f^k = [f^k_{x^k_1}, ..., f^k_{x^k_{l^k+u^k}}]^T from all the classes. Our regression objective is defined as

f^* = \arg\min_f \sum_k \frac{1}{l^k} \sum_{x^k_i \in X_l} (f^k_{x^k_i} - y^k_i)^2 + \beta \sum_k \frac{1}{(N^k)^2} (f^k)^T L^p_k f^k + \frac{\lambda}{N^2} f^T L_r f,   (8)

where L_r is the Laplacian matrix of the symmetrized W_r. The minimum of the objective is achieved at

f^* = R^{-1} \sum_k \frac{1}{l^k} (S^k_l S^k)^T Y^k_l,   (9)

where

S^k_l = \big( I_{l^k \times l^k} \;\; O_{l^k \times u^k} \big), \quad S^k = \big( O_{N^k \times \sum_{k'=1}^{k-1} N^{k'}} \;\; I_{N^k \times N^k} \;\; O_{N^k \times \sum_{k'=k+1}^{M} N^{k'}} \big)

are the label and class selection matrices respectively, and

R = \sum_k \frac{1}{l^k} (S^k_l S^k)^T (S^k_l S^k) + \beta \sum_k \frac{1}{(N^k)^2} S^{kT} L^p_k S^k + \frac{\lambda}{N^2} L_r.

4. Regression on Reproducing Kernel Hilbert Space (RKHS)

A series of algorithms such as SVM, ridge regression and LapRLS (Belkin et al., 2005) employ different regularization terms and empirical cost measures in the objective and solve the optimization problem in an appropriately chosen Reproducing Kernel Hilbert Space (RKHS). One merit of the regression on RKHS is the ability to predict out-of-sample labels. Let K denote a Mercer kernel K: X \times X \rightarrow R and H_K the induced RKHS of functions X \rightarrow R with norm \|\cdot\|_K. The regression with inter-manifold regularization on the RKHS is defined as

f^* = \arg\min_{f \in H_K} \sum_k \frac{1}{l^k} \sum_{x^k_i \in X_l} (f^k_{x^k_i} - y^k_i)^2 + \gamma \|f\|^2_K + \beta \sum_k \frac{1}{(N^k)^2} (f^k)^T L^p_k f^k + \frac{\lambda}{N^2} f^T L_r f.   (10)

Similar to the Tikhonov regularization, we add an RKHS norm penalization term to the TRIM objective as a smoothness condition.
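Before turning to the kernelized formulation, here is a minimal sketch (ours, not the paper's code) of how the transductive closed form (9) could be assembled; it assumes the per-class Laplacians, the symmetrized inter-manifold Laplacian L_r and the labeled values Y^k_l are already available, and that within each class the labeled samples are ordered first, as in the notation above.

```python
import numpy as np

def trim_closed_form(L_list, l_list, Y_l_list, L_r, beta=1e-3, lam=1e-1, p=1):
    """Solve the TRIM objective (8) through its closed form (9).
    L_list[k]   : intra-manifold Laplacian of class k (N_k x N_k), labeled samples ordered first
    l_list[k]   : number of labeled samples in class k
    Y_l_list[k] : labels of the labeled samples of class k (length l_k)
    L_r         : Laplacian of the symmetrized inter-manifold graph W_r (N x N)"""
    N_list = [L.shape[0] for L in L_list]
    N, offsets = sum(N_list), np.cumsum([0] + N_list)
    R = (lam / N ** 2) * L_r
    b = np.zeros(N)
    for k, (L_k, N_k, l_k, Y_l) in enumerate(zip(L_list, N_list, l_list, Y_l_list)):
        S_k = np.zeros((N_k, N))                                      # class selection matrix S^k
        S_k[:, offsets[k]:offsets[k] + N_k] = np.eye(N_k)
        S_lk = np.hstack([np.eye(l_k), np.zeros((l_k, N_k - l_k))])   # label selection matrix S^k_l
        Sel = S_lk @ S_k                                              # picks the labeled entries of class k out of f
        R += (1.0 / l_k) * Sel.T @ Sel \
             + (beta / N_k ** 2) * S_k.T @ np.linalg.matrix_power(L_k, p) @ S_k
        b += (1.0 / l_k) * Sel.T @ np.asarray(Y_l)
    return np.linalg.solve(R, b)                                      # f*: function values for all N samples
```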

Figure 3. Regression on the nonlinear Two Moons dataset. (a) Original function value distribution. (b) Traditional graph Laplacian regularized regression (separate regressors for different classes). (c) Two-class TRIM. (d) Two-class TRIM on RKHS. Note the difference in the area indicated by the rectangle.

For the multi-class data under the same modality, we have the following theorem:

Theorem 1. The solution to the minimization problem (10) admits an expansion

f^*(x) = \sum_{i=1}^{N} \alpha_i K(x_i, x), \quad N = \sum_k (l^k + u^k).   (11)

Theorem 1 is a special version of the Generalized Representer Theorem (Schölkopf et al., 2001) and the proof is omitted here. It says that the minimizer of (10) can be expressed in terms of a linear expansion of K(x_i, x) over both the labeled and unlabeled data of all the sample classes. Thus the minimization over the Hilbert space boils down to minimizing over R^N the coefficient vector \alpha = [\alpha^1_1, ..., \alpha^1_{l^1}, ..., \alpha^1_{l^1+u^1}, ..., \alpha^M_1, ..., \alpha^M_{l^M}, ..., \alpha^M_{l^M+u^M}]^T, and the minimizer is given by

\alpha^* = J^{-1} \sum_k \frac{1}{l^k} (S^k_l S^k)^T Y^k_l,   (12)

where

J = \sum_k \frac{1}{l^k} (S^k_l S^k)^T (S^k_l S^k) K + \gamma I + \beta \sum_k \frac{1}{(N^k)^2} S^{kT} L^p_k S^k K + \frac{\lambda}{N^2} L_r K,

and K is the N \times N Gram matrix over the labeled and unlabeled points of all the sample classes. For the out-of-sample data, the estimated labels can be obtained using

y_{new} = \sum_{i=1}^{N} \alpha_i K(x_i, x_{new}).   (13)

Note that in this framework the class information of the incoming sample is not required in the prediction stage.

5. Experiments

We performed experiments on two synthetic datasets and one real-world regression problem of human age estimation. Comparisons are made with the traditional graph Laplacian regularized semi-supervised regression (Belkin & Niyogi, 2003). We also evaluate the generalization performance of the multi-class regression on RKHS. In the experiments, the intra-manifold graphs are constructed using 10 nearest neighbors and the inter-manifold graphs are constructed by following the procedure described in Section 3. For all the configurations, the parameter \beta for the intra-manifold graphs is empirically set to 0.001 and the \lambda for the inter-manifold regularization is set to 0.1. For the kernelized algorithms, the coefficient for the RKHS norm is \gamma = 0.001 and the Gaussian kernel K(x, y) = \exp\{-\|x - y\|^2 / \delta_o^2\} with parameter \delta_o = 2^{1/2.5} \delta is applied, where \delta is the standard deviation of the sample data. We use the Mean Absolute Error (MAE) criterion to measure the regression accuracy; it is defined as the average of the absolute errors between the estimated labels and the ground truth labels.

5.1. Synthetic Data: Nonlinear Two Moons

The nonlinear two moons dataset is shown in Figure 3. The colors in the figure are associated with the function values and the variation of the sample labels along the manifold is not uniform. The labeled samples are marked by '+' and their distributions are quite different across classes. As we can see, the sample labels for the class lying in the upper part of the figure are not enough to give accurate guidance for the regression on the nonlinear label distribution. The traditional graph Laplacian regularized regression algorithm does not make use of the information conveyed by the inter-class similarity and its prediction result is not satisfactory, while in our algorithm the sample labels from different classes can be utilized to guide the regression and thus the estimation accuracy is much higher.
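Returning to the kernelized formulation of Section 4, the sketch below (ours, under the same assumed inputs as the earlier sketches) computes the expansion coefficients of (12) with a Gaussian kernel, predicts out-of-sample labels via (13), and includes the MAE criterion used in the experiments; the helper names and default parameter values are not from the paper.

```python
import numpy as np

def gaussian_gram(X, Z, delta_o):
    """K(x, z) = exp(-||x - z||^2 / delta_o^2), the kernel used in the experiments."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / delta_o ** 2)

def kernel_trim(X, Sel_list, S_list, L_list, Y_l_list, L_r,
                gamma=1e-3, beta=1e-3, lam=1e-1, p=1, delta_o=1.0):
    """Expansion coefficients alpha of the kernelized TRIM solution, Eq. (12).
    Sel_list[k] = S^k_l S^k picks the labeled rows of class k; S_list[k] and L_list[k] are the
    class selection matrix and the intra-manifold Laplacian; L_r is the inter-manifold Laplacian."""
    N = X.shape[0]
    K = gaussian_gram(X, X, delta_o)                               # N x N Gram matrix over all classes
    J = gamma * np.eye(N) + (lam / N ** 2) * L_r @ K
    b = np.zeros(N)
    for Sel, S_k, L_k, Y_l in zip(Sel_list, S_list, L_list, Y_l_list):
        l_k, N_k = Sel.shape[0], S_k.shape[0]
        J += (1.0 / l_k) * Sel.T @ Sel @ K \
             + (beta / N_k ** 2) * S_k.T @ np.linalg.matrix_power(L_k, p) @ S_k @ K
        b += (1.0 / l_k) * Sel.T @ np.asarray(Y_l)
    return np.linalg.solve(J, b), K

def predict(alpha, X_train, X_new, delta_o=1.0):
    """Out-of-sample prediction, Eq. (13): no class label of the new sample is needed."""
    return gaussian_gram(X_new, X_train, delta_o) @ alpha

def mae(y_pred, y_true):
    """Mean Absolute Error, the accuracy criterion used in Section 5."""
    return np.mean(np.abs(np.asarray(y_pred) - np.asarray(y_true)))
```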

Figure 4. Regression on the Cyclone dataset: (a) Original function values. (b) Traditional graph Laplacian regularized regression (separate regressors for different classes). (c) Three-class TRIM. (d) Three-class TRIM on RKHS.

5.2. Synthetic Data: Three-class Cyclones

One merit of our algorithm is that the regression accuracy may be boosted as the class number increases. This is because the cross-manifold information that can be utilized grows rapidly as the class number increases. The Cyclone dataset consists of three classes of samples and the class distribution is demonstrated in Figure 5. The label distributions among different classes are quite similar, while the labeled samples scatter differently from class to class. As we may observe from Figure 4, without the inter-class regularization the regression for a certain class may fail due to the lack of sufficient labeled samples, while our algorithm still gives a satisfying performance.

5.3. Human Age Estimation

In this experiment we consider age estimation from facial images as a regression problem. In real applications, we often cannot obtain enough age labels, though we may easily get plenty of facial images. Those unlabeled data can be used to guide our regression in the semi-supervised framework. One challenge for this estimation is caused by the gender difference. Although males and females may share some similar configurations in the age distribution, just as demonstrated in Figure 1, mixing the two may complicate the regression problem. On the other hand, the gender information is usually easy to obtain and it is thus desirable to use both the age labels and the gender information to guide the regression.

Figure 5. Class distribution of the Cyclone dataset.

The aging dataset used in this experiment is the Yamaha database, which contains 8000 Japanese facial images of 1600 persons with ages ranging from 0 to 93. Each person has 5 images and the Yamaha database is divided into two parts, with 4000 images from 800 males and another 4000 images from 800 females. The ages distribute evenly from 0 to 69, with 3500 male images and 3500 female images, and the rest are distributed in the age gap from 70 to 93. We randomly sampled 1000 photos from the male and female subsets respectively, and thus altogether 2000 images are used in our experiments. Before the regression, the input image data is preprocessed with Principal Component Analysis and the first 20 dimensions are used for data projection. To fully evaluate the regression performance for both close set samples and out-of-sample data, we design two experimental configurations for this dataset.

Configuration 1: Close Set Evaluation. In this configuration, we do not have out-of-sample data and the close set performance of TRIM is evaluated in comparison with the traditional graph Laplacian regularized regression (Belkin et al., 2004). The close set contains altogether 2000 samples, with 1000 images from males and 1000 images from females respectively. We vary the number of randomly selected labeled samples and examine the performance of the different regression algorithms. The comparison results between TRIM and the traditional single class Laplacian regularized regression are shown in Figure 6. We can observe that our algorithm generally gives a higher regression accuracy and the performance improvement is remarkable especially when the number of labeled samples is small. As the number of sample labels increases, the difference between the two algorithms becomes smaller. This may be caused by the fact that when the labels are sparse, the label guidance is far from enough and thus the class information and inter-manifold relations have a greater influence on the regression accuracy, while when the labels are abundant enough to guide the regression, the class information as well as the inter-manifold connections becomes less important.

Configuration 2: Open Set Evaluation. Now we examine the out-of-sample prediction performance for

the kernelized TRIM compared with the kernelized version of the traditional graph Laplacian regularized regression (Belkin et al., 2005). In this configuration, the sample set is divided into two subsets: one for the regression function training and the other for the evaluation of the age estimation performance on the out-of-sample data. The training set contains 800 randomly selected male and 800 female images, and the remaining ones are used for the out-of-sample evaluation. In the testing phase we do not input any gender information. As demonstrated in Figure 7, our algorithm achieves a lower MAE on both the training and testing sets. Another observation, similar to the close set configuration, is that the regression accuracy improvement grows as the sample labels become sparser.

Figure 6. TRIM vs. the traditional graph Laplacian regularized regression for the close set evaluation on the YAMAHA database.

Figure 7. Open set evaluation for the kernelized regression on the YAMAHA database. (a) Regression on the training set. (b) Regression on out-of-sample data.

6. Conclusions

In this paper, we have presented a novel algorithm dedicated to the regression problem on multi-class data. Manifolds constructed from different classes are regularized separately and, to utilize the inter-manifold relations, we developed an efficient cross-manifold label propagation method so that the labels from different classes can be employed to pilot the regression. Moreover, the regression function is further extended by the kernel trick to predict the labels of out-of-sample data without class information. Both synthesized experiments and real-world applications demonstrated the superiority of our proposed framework over the state-of-the-art semi-supervised regression algorithms. To the best of our knowledge, this is the first work to discuss the semi-supervised regression problem on multi-class data.

7. Acknowledgement

The work described in this paper was supported by the Research Grants Council of the Hong Kong SAR (Project No. CUHK 44306) and DTO Contract NBCHC06060 of USA.

References

Belkin, M., Matveeva, I., & Niyogi, P. (2004). Regularization and semi-supervised learning on large graphs. COLT.

Belkin, M., & Niyogi, P. (2003). Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation.

Belkin, M., Niyogi, P., & Sindhwani, V. (2005). On manifold regularization. AISTATS.

Blondel, V. D., Gajardo, A., Heymans, M., Senellart, P., & Van Dooren, P. (2004). A measure of similarity between graph vertices: Applications to synonym extraction and web searching. SIAM Review.

Blum, A., & Mitchell, T. (1998). Combining labeled and unlabeled data with co-training. COLT.

Brefeld, U., Gartner, T., Scheffer, T., & Wrobel, S. (2006). Efficient co-regularised least squares regression. ICML.

Chapelle, O., Vapnik, V., & Weston, J. (1999). Transductive inference for estimating values of functions. NIPS.

Cortes, C., & Mohri, M. (2007). On transductive regression. NIPS.

Ham, J., Lee, D., & Saul, L. (2005). Semisupervised alignment of manifolds. AISTATS.

Krishnapuram, B., Williams, D., Xue, Y., Hartemink, A., & Carin, L. (2005). On semi-supervised classification. NIPS.

Roweis, S. T., & Saul, L. K. (2000). Nonlinear dimensionality reduction by locally linear embedding. Science.

Schölkopf, B., Herbrich, R., & Smola, A. J. (2001). A generalized representer theorem. COLT/EuroCOLT.

Tenenbaum, J. B., de Silva, V., & Langford, J. C. (2000). A global geometric framework for nonlinear dimensionality reduction. Science.

Weinberger, K. Q., Sha, F., & Saul, L. K. (2004). Learning a kernel matrix for nonlinear dimensionality reduction. ICML.

Zhu, X., & Goldberg, A. B. (2005). Semi-supervised regression with order preferences. Computer Sciences TR.


More information

Corner-Based Image Alignment using Pyramid Structure with Gradient Vector Similarity

Corner-Based Image Alignment using Pyramid Structure with Gradient Vector Similarity Journal of Sgnal and Informaton Processng, 013, 4, 114-119 do:10.436/jsp.013.43b00 Publshed Onlne August 013 (http://www.scrp.org/journal/jsp) Corner-Based Image Algnment usng Pyramd Structure wth Gradent

More information

UB at GeoCLEF Department of Geography Abstract

UB at GeoCLEF Department of Geography   Abstract UB at GeoCLEF 2006 Mguel E. Ruz (1), Stuart Shapro (2), June Abbas (1), Slva B. Southwck (1) and Davd Mark (3) State Unversty of New York at Buffalo (1) Department of Lbrary and Informaton Studes (2) Department

More information

Determining the Optimal Bandwidth Based on Multi-criterion Fusion

Determining the Optimal Bandwidth Based on Multi-criterion Fusion Proceedngs of 01 4th Internatonal Conference on Machne Learnng and Computng IPCSIT vol. 5 (01) (01) IACSIT Press, Sngapore Determnng the Optmal Bandwdth Based on Mult-crteron Fuson Ha-L Lang 1+, Xan-Mn

More information

Integrated Expression-Invariant Face Recognition with Constrained Optical Flow

Integrated Expression-Invariant Face Recognition with Constrained Optical Flow Integrated Expresson-Invarant Face Recognton wth Constraned Optcal Flow Chao-Kue Hseh, Shang-Hong La 2, and Yung-Chang Chen Department of Electrcal Engneerng, Natonal Tsng Hua Unversty, Tawan 2 Department

More information

NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS

NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS ARPN Journal of Engneerng and Appled Scences 006-017 Asan Research Publshng Network (ARPN). All rghts reserved. NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS Igor Grgoryev, Svetlana

More information

IMAGE matting is an important but still challenging problem

IMAGE matting is an important but still challenging problem IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 1 Patch Algnment Manfold Mattng Xuelong L, Fellow, IEEE, Kang Lu, Member, IEEE, Yongsheng Dong, Member, IEEE, and Dacheng Tao, Fellow, IEEE arxv:1904.07588v1

More information

2x x l. Module 3: Element Properties Lecture 4: Lagrange and Serendipity Elements

2x x l. Module 3: Element Properties Lecture 4: Lagrange and Serendipity Elements Module 3: Element Propertes Lecture : Lagrange and Serendpty Elements 5 In last lecture note, the nterpolaton functons are derved on the bass of assumed polynomal from Pascal s trangle for the fled varable.

More information

An Entropy-Based Approach to Integrated Information Needs Assessment

An Entropy-Based Approach to Integrated Information Needs Assessment Dstrbuton Statement A: Approved for publc release; dstrbuton s unlmted. An Entropy-Based Approach to ntegrated nformaton Needs Assessment June 8, 2004 Wllam J. Farrell Lockheed Martn Advanced Technology

More information

Outline. Type of Machine Learning. Examples of Application. Unsupervised Learning

Outline. Type of Machine Learning. Examples of Application. Unsupervised Learning Outlne Artfcal Intellgence and ts applcatons Lecture 8 Unsupervsed Learnng Professor Danel Yeung danyeung@eee.org Dr. Patrck Chan patrckchan@eee.org South Chna Unversty of Technology, Chna Introducton

More information

TECHNIQUE OF FORMATION HOMOGENEOUS SAMPLE SAME OBJECTS. Muradaliyev A.Z.

TECHNIQUE OF FORMATION HOMOGENEOUS SAMPLE SAME OBJECTS. Muradaliyev A.Z. TECHNIQUE OF FORMATION HOMOGENEOUS SAMPLE SAME OBJECTS Muradalyev AZ Azerbajan Scentfc-Research and Desgn-Prospectng Insttute of Energetc AZ1012, Ave HZardab-94 E-mal:aydn_murad@yahoocom Importance of

More information

A New Approach For the Ranking of Fuzzy Sets With Different Heights

A New Approach For the Ranking of Fuzzy Sets With Different Heights New pproach For the ankng of Fuzzy Sets Wth Dfferent Heghts Pushpnder Sngh School of Mathematcs Computer pplcatons Thapar Unversty, Patala-7 00 Inda pushpndersnl@gmalcom STCT ankng of fuzzy sets plays

More information

Cordial and 3-Equitable Labeling for Some Star Related Graphs

Cordial and 3-Equitable Labeling for Some Star Related Graphs Internatonal Mathematcal Forum, 4, 009, no. 31, 1543-1553 Cordal and 3-Equtable Labelng for Some Star Related Graphs S. K. Vadya Department of Mathematcs, Saurashtra Unversty Rajkot - 360005, Gujarat,

More information