BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION

SHI-LIANG SUN, HONG-LEI SHI
Department of Computer Science and Technology, East China Normal University, 500 Dongchuan Road, Shanghai, P. R. China

Abstract: This paper presents a new multi-source domain adaptation framework based on the Bayesian learning principle (BayesMSDA), in which one target domain and more than one source domain are used. In this framework, the label of a target data point is determined according to its posterior, which is calculated using the Bayes formula. To realize this framework, a novel prior of the target domain based on the Laplacian matrix, and a new likelihood obtained dynamically from the k nearest neighbors of a data point, are defined. We focus on the situation in which no labeled data are available from the target domain while large numbers of labeled data are available from the source domains. Experiments on synthetic and real-world data illustrate that the framework performs well.

Keywords: Bayesian framework; multi-source domain adaptation; Laplacian matrix

1. Introduction

Most theoretical models in machine learning, such as probably approximately correct (PAC) models, assume that models are trained and tested on data drawn from certain fixed distributions. Uniform convergence theory guarantees that a model's empirical training error is close to its true error under such assumptions. In practice, however, the assumption that training and test data come from the same distribution often does not hold, because the two kinds of data usually come from different distributions, or domains. In such cases there is little hope of good generalization. We therefore wish to learn a model on one or more source domains (i.e., the domains from which the training data come) and then apply it to a different target domain (i.e., the domain from which the test data come). Models of this kind are called domain adaptation models [1], [2]. We confront this problem in many fields, such as sentiment analysis [3], [4], [5], natural language processing [6], and computer vision.
Often in these cases, the source domains offer large numbers of labeled data for learning, while the target domain may have no labeled data available at all. In this paper, we concentrate on the situation where there are no labeled data in the target domain, and there is only one target domain. The task is to combine the labeled source data and the unlabeled target data so as to classify the target data as accurately as possible.

The multi-source domain adaptation problem considered in this paper has been studied by many researchers [7], [8], [9], [10]. Crammer et al. [7] considered the problem of learning an accurate model from nearby data points drawn from more than one source domain. They gave a general algorithm: using samples from the different source domains to estimate the divergence among the sources, the algorithm determines which samples from each source should be selected to train the model, so that a subset best suited to the target task is chosen from each source. The algorithm was demonstrated to be effective on a binary classification task. Tu and Sun [8] gave an ensemble learning framework for domain adaptation: a novel ensemble-based method that dynamically assigns weights to different test examples by using so-called friendly classifiers, giving the most favorable weights to the different examples. Mansour et al. [9] presented a theoretical analysis of domain adaptation learning with multiple sources, combining the source hypotheses weighted according to the source distributions, and showed that for any fixed target function there exists a distribution-weighted combining rule with loss at most ε.

An interesting issue in multi-source domain adaptation is what to do when we do not know in advance which source domain performs best. On the one hand, we want to use the most suitable source to solve the target task; on the other hand, we do not know which one to choose. This paper gives a tradeoff for this problem. We present a multi-source domain adaptation framework based on the Bayesian learning principle (BayesMSDA). Under the Bayesian framework, classification decisions are based on posterior probabilities, which are proportional to the product of priors and likelihoods. Within BayesMSDA we define a novel prior using the Laplacian matrix [11], [12], and a novel likelihood based on the mean Euclidean distance of the K nearest points.

The remainder of this paper is organized as follows. The new framework and its implementation for multi-source domain adaptation are introduced in detail in Section 2. In Section 3, two experiments on synthetic and real-world data illustrate the effectiveness of the framework. Section 4 gives conclusions and future work.

2. The proposed framework

The proposed framework for multi-source domain adaptation (BayesMSDA) is based on the Bayesian learning principle: the probability that a target example belongs to a given class is proportional to the product of the prior and the likelihood assigned to this example. For multi-source problems, the core question is how to combine the sources effectively to solve the target task. The framework is as follows. Given M source domains S_i, i = 1, 2, ..., M, and one target domain T, the task is to label the data in T using the unlabeled data in T and the large numbers of labeled data in the S_i. Assume that we can obtain M classifiers c_i based on the M source domains. We then define a prior, which measures the fitness between each source domain and the target domain, and a likelihood for each target data point, which represents the probability of that point occurring in the source. Applying the Bayesian learning principle to obtain a posterior for classification, we use these posteriors to weight the M classifiers c_i and obtain a final label for the target data.
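As a concrete sketch of this combination step (the paper only states that the posteriors weight the classifiers; the exact voting rule below, the assumption of labels in {-1, +1}, and the normalization over sources are our own hypothetical reading):

```python
import numpy as np

def bayes_msda_label(x, classifiers, posteriors):
    """Posterior-weighted vote over M source classifiers.

    classifiers : list of M callables, each mapping a point x to a label in {-1, +1}
    posteriors  : length-M sequence of (unnormalized) posteriors post_i^m for x
    """
    posts = np.asarray(posteriors, dtype=float)
    posts = posts / posts.sum()                    # normalize over the M sources
    votes = np.array([c(x) for c in classifiers])  # each vote in {-1, +1}
    return 1 if float(posts @ votes) >= 0.0 else -1
```

A source with a large posterior for this particular point thus dominates the vote, while sources that fit the target poorly are down-weighted rather than discarded.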
The framework we present, with its self-defined prior and likelihood, is applicable to the situation where the target-domain data are unlabeled; we compare it against the majority voting algorithm.

2.1. Prior

Consider a weighted undirected graph G = (V, E) with vertex set V = (x_1, x_2, ..., x_n) and edge set E = (e_1, e_2, ..., e_l). Assume that G is connected (if not, the process below can be applied to each connected component). Let Y = (y_1, y_2, ..., y_n) be the image of the data set under some mapping rule. The problem is how to make the images y_i and y_j as close as possible whenever the data points x_i and x_j are close. A reasonable criterion is to minimize the objective function

G = Σ_{i,j} (y_i − y_j)² W_{ij}    (1)

under appropriate constraints, where W_{ij} = exp(−||x_i − x_j||² / T) when x_i and x_j are neighbors, and zero otherwise. Equation (1) imposes a heavy penalty when the images of neighboring points x_i and x_j are far apart; minimizing it thus tends to make y_i and y_j close whenever x_i and x_j are close. This property can be used effectively in binary classification problems.

The prior measures the fitness of a source classifier applied to the target task; for a sample x_i in the target domain, it should be independent of x_i. In this paper, we construct the prior from the Laplacian matrix [11], [12] of the target domain, so that the target-domain data, which are entirely unlabeled, are used to quantify the prior. For any Y, the objective function becomes

G = Σ_{i,j} (y_i − y_j)² W_{ij}
  = Σ_{i,j} (y_i² + y_j² − 2 y_i y_j) W_{ij}
  = Σ_i y_i² D_{ii} + Σ_j y_j² D_{jj} − 2 Σ_{i,j} y_i y_j W_{ij}
  = 2 Y^T L Y,

where L = D − W is the Laplacian matrix. Notice that W is symmetric and that D, with D_{ii} = Σ_j W_{ij}, is a diagonal matrix. D provides a natural measure on the vertices of the graph G: the bigger the value D_{ii} (corresponding to the i-th vertex), the more important that vertex.

Given n points x_i ∈ R^d, i = 1, 2, ..., n, we construct an undirected graph G = (V, E) in which the neighbors of x_i are its k nearest neighbors. The steps for generating the Laplacian matrix of the target data set are as follows.

Step 1 (calculating the adjacency matrix A): if x_i and x_j are k-nearest neighbors, let A_{ij} = A_{ji} = 1; otherwise A_{ij} = A_{ji} = 0.

Step 2 (calculating the weight matrix W): one variation is the heat kernel:

W_{ij} = exp(−||x_i − x_j||² / T)    (2)

where T ∈ R.

Step 3 (calculating the Laplacian matrix L): let D_{ii} = Σ_j W_{ij}; then L = D − W is the Laplacian matrix, a symmetric, positive semidefinite matrix.

Once we have the Laplacian matrix, the prior is defined as

prior^m = 1 / Σ_{i,j} (y_i^m − y_j^m)² W_{ij} = 1 / (2 (Y^m)^T L Y^m)    (3)

where Y^m is the output of the m-th source classifier on the target data. In the multi-source case, the different priors show the fitness between each source domain and the target domain: the bigger the prior, the better the corresponding source classifier.

2.2. Likelihood

The likelihood defined here represents the probability of an instance from the target domain occurring in a source domain, which can also be read as the similarity between the target domain and that source domain. The higher this probability, the better the source classifier. In this paper we use the mean Euclidean distance from an instance x_i (in the target domain) to its K nearest neighbors in the source domain to measure this likelihood. For each instance x_i in the target domain, the likelihoods under the different source domains differ, which gives a dynamic classification rule. The likelihood that the target data point x_i occurs in source domain S_m is defined as

Like_i^m = K / Σ_j ||x_i − x_j^m||    (4)

where x_j^m, which comes from the m-th source, ranges over the K nearest neighbors of x_i. According to the Bayesian learning principle, the posterior is proportional to the product of the prior and the likelihood:

post_i^m ∝ prior^m · Like_i^m    (5)

where post_i^m, prior^m and Like_i^m are, respectively, the posterior, prior and likelihood of x_i based on the m-th source domain. The posteriors obtained here are used to weight the source classifiers.

3. Experiments

In this section, we evaluate the proposed framework on both synthetic and real-world data sets. Each data set has four domains. In the experiments, every domain is treated as the target domain in turn, with the other three as source domains.
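The quantities defined in Section 2 can be sketched compactly in Python with NumPy. This is a minimal sketch, not the authors' code: the values of k, K and T, and the small eps guarding against a perfectly smooth labelling (for which the penalty in (3) is exactly zero), are our own assumptions.

```python
import numpy as np

def build_laplacian(X, k=5, T=1.0):
    """Steps 1-3: kNN adjacency, heat-kernel weights, L = D - W."""
    n = len(X)
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # squared distances
    A = np.zeros((n, n))
    nbrs = np.argsort(sq, axis=1)[:, 1:k + 1]            # skip the point itself
    for i in range(n):
        A[i, nbrs[i]] = A[nbrs[i], i] = 1                # Step 1: symmetric A
    W = A * np.exp(-sq / T)                              # Step 2: heat kernel
    return np.diag(W.sum(axis=1)) - W                    # Step 3: L = D - W

def prior(Y_m, L, eps=1e-12):
    """Eq. (3): reciprocal of the smoothness penalty 2 (Y^m)^T L Y^m.

    eps is our addition, guarding against division by zero when the
    classifier output Y_m is perfectly smooth on the target graph."""
    return 1.0 / max(2.0 * float(Y_m @ L @ Y_m), eps)

def likelihood(x, S_m, K=5):
    """Eq. (4): K over the summed distance to the K nearest source points."""
    d = np.sort(np.linalg.norm(S_m - x, axis=1))[:K]
    return K / d.sum()

def posterior(x, Y_m, L, S_m, K=5):
    """Eq. (5): posterior for source m at target point x, up to normalization."""
    return prior(Y_m, L) * likelihood(x, S_m, K)
```

Note how the two parts play complementary roles: the prior is one number per source (how smoothly its labels sit on the target graph), while the likelihood is recomputed per target point, which is what makes the final weighting dynamic.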
We use support vector machines (SVMs) [13], [14] for training and testing. For text classification, SVMs have been found to perform better than other classification methods [13], especially for sentiment analysis [3]. The kernel we use is the RBF kernel, and the parameters of the SVMs are selected by cross validation for each domain.

3.1. Synthetic data

Figure 1. Examples of synthetic data: four symbols stand for the domains a, b, c, d, respectively. The smaller points (i.e., the bottom-left portion of each domain in the figure) are positive, and the larger ones (i.e., the top-right portion of each domain) are negative. Each domain is treated as the target in turn, with the other three as sources.

The synthetic data set consists of four different domains, each sampled from Gaussian distributions with different covariances and means. Figure 1 shows 30 randomly selected data points from each domain, with a different symbol for each of the domains a, b, c and d. The smaller points (i.e., the bottom-left portion of each domain in Figure 1) are labeled positive, while the bigger ones (i.e., the top-right portion of each domain) are labeled negative.

TABLE 1. Accuracies of one-to-one classifiers (T for target domain, S for source domain; sources B(%), D(%), E(%), K(%); targets B, D, E, K).

Figure 2. Classification accuracies (%) on synthetic data.

The base classifiers are trained using SVMs with the RBF kernel. Figure 2 illustrates the classification accuracies on these domains for BayesMSDA and the majority voting algorithm; each domain is treated as the target in turn, with the other three as sources. Figure 2 shows that BayesMSDA outperforms majority voting on three domains (a, b, d), while on the remaining domain (c) the two are equal. Notably, the accuracy of BayesMSDA is far higher than that of the voting method on the fourth data set (98.77% vs. 69.69%). As the four data sets are randomly generated, these results give us confidence in the effectiveness of the proposed framework. In the next subsection, we apply BayesMSDA to real-world data. On the other hand, the two results are equal on the third data set; this is acceptable, since no single algorithm fits all situations.

3.2. Real data

Given a piece of text, sentiment classification is the task of determining whether the sentiment expressed by the text is positive or negative. The problem has been extended to many new domains, such as stock message boards, congressional floor debates, and blog reviews, and research results have been used to gauge market reaction and to summarize opinion from web pages, discussion boards, and blogs. We use the publicly available Amazon data sets 1 in our experiments [1], which contain reviews of several different types of products. We select four domains: books, DVDs, kitchen & housewares, and electronics (B, D, K, E for short, respectively). Each review consists of a rating (1-5 stars), a title, review text, and some other information, which we ignore.

1 mdredze/

Figure 3. Classification accuracies (%) on real-world data.
We make it a binary classification task by binning reviews with 4-5 stars as positive and 1-2 stars as negative; reviews with 3 stars are discarded. Because the vocabularies of reviews for different products vary greatly, classifiers trained on one domain may not fit a different domain, since important lexical information may be missing. This phenomenon motivates us to combine more than one source domain to avoid these shortcomings. Each domain contains 1000 positive reviews (P) and 1000 negative reviews (N); in our experiments we randomly choose 1000 of these 2000 instances in each domain for computational convenience, giving 4000 instances in all. Each instance is represented as a sparse feature vector. The feature set consists of the unigrams that occur 5 to 1000 times across all the reviews.

First, one-to-one classifiers are trained without adaptation; these classifiers are regarded as baselines. The baselines are trained using SVMs with the RBF kernel, and cross validation is again employed to select the parameters. Classification accuracies are reported in TABLE 1, where the first row represents the source domains.

In the adaptation stage, each of the four domains (B, D, K, E) is treated as the target domain in turn, with the others as source domains; the one-to-one baselines are used here for adaptation. Figure 3 shows that the BayesMSDA proposed in this paper gives encouraging results for binary classification: it beats the majority voting method on three domains (B, E, K). Comparing TABLE 1 and Figure 3, we can conclude that BayesMSDA is a tradeoff between the best and the worst one-to-one classifiers. Because there are no labeled data at our disposal in the target domain, we do not know in advance which one-to-one classifier is the best and which is the worst, and choosing classifiers at random is not acceptable. Our method is a better way to obtain a reasonable result: even on domain D, BayesMSDA outperforms all but the best baseline (74.1% vs. 69.6% and 69.3%).

4. Conclusions and future work

In this paper, a new Bayesian framework for multi-source domain adaptation (BayesMSDA) is proposed. We focus on the case where there are many labeled data in the source domains but no labeled data at our disposal in the target domain. Our experimental results show that BayesMSDA is a good choice when no labeled target data are available. It is also an interesting problem to consider the situation where some labeled instances are available in the target domain; we will study this problem in the future.

Acknowledgements

This work is supported by a project of the National Natural Science Foundation of China, and by the Shanghai Knowledge Service Platform Project (No. ZF1213).

References

[1] S. Ben-David, J. Blitzer, K. Crammer, A. Kulesza, F. Pereira, and J.W.
Vaughan, "A Theory of Learning from Different Domains," Machine Learning, Vol. 79.
[2] W. Tu and S. Sun, "Transferable Discriminative Dimensionality Reduction," Proceedings of the 23rd IEEE International Conference on Tools with Artificial Intelligence, Nov.
[3] B. Pang, L. Lee, and S. Vaithyanathan, "Thumbs Up? Sentiment Classification using Machine Learning Techniques," Proceedings of Empirical Methods in Natural Language Processing, Vol. 10.
[4] J. Blitzer, M. Dredze, and F. Pereira, "Biographies, Bollywood, Boom-boxes and Blenders: Domain Adaptation for Sentiment Classification," Proceedings of the 45th Annual Meeting of the Association of Computational Linguistics, Jun.
[5] A. Aue and M. Gamon, "Customizing Sentiment Classifiers to New Domains: A Case Study," Proceedings of Recent Advances in Natural Language Processing.
[6] J. Jiang and C. Zhai, "Instance Weighting for Domain Adaptation in NLP," Proceedings of the 45th Annual Meeting of the Association of Computational Linguistics, Jun.
[7] K. Crammer, M. Kearns, and J. Wortman, "Learning from Multiple Sources," Journal of Machine Learning Research, Vol. 9, Jun.
[8] W. Tu and S. Sun, "Dynamical Ensemble Learning with Model-Friendly Classifiers for Domain Adaptation," Proceedings of the 21st International Conference on Pattern Recognition 2012, Nov.
[9] Y. Mansour, M. Mohri, and A. Rostamizadeh, "Domain Adaptation with Multiple Sources," Advances in Neural Information Processing Systems, Vol. 21.
[10] Y. Mansour, M. Mohri, and A. Rostamizadeh, "Domain Adaptation: Learning Bounds and Algorithms," Proceedings of the Conference on Learning Theory, Jun.
[11] M. Belkin and P. Niyogi, "Laplacian Eigenmaps for Dimensionality Reduction and Data Representation," Neural Computation, Vol. 15, Jun.
[12] S. Sun, "Multi-view Laplacian Support Vector Machines," Lecture Notes in Artificial Intelligence, Vol. 7121, Dec.
[13] T. Joachims, "Text Categorization with Support Vector Machines: Learning with Many Relevant Features," European Conference on Machine Learning, Apr.
[14] J. Shawe-Taylor and S. Sun, "A Review of Optimization Methodologies in Support Vector Machines," Neurocomputing, Vol. 74, Oct.


More information

MULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION

MULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION MULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION Paulo Quntlano 1 & Antono Santa-Rosa 1 Federal Polce Department, Brasla, Brazl. E-mals: quntlano.pqs@dpf.gov.br and

More information

Outline. Self-Organizing Maps (SOM) US Hebbian Learning, Cntd. The learning rule is Hebbian like:

Outline. Self-Organizing Maps (SOM) US Hebbian Learning, Cntd. The learning rule is Hebbian like: Self-Organzng Maps (SOM) Turgay İBRİKÇİ, PhD. Outlne Introducton Structures of SOM SOM Archtecture Neghborhoods SOM Algorthm Examples Summary 1 2 Unsupervsed Hebban Learnng US Hebban Learnng, Cntd 3 A

More information

Hybridization of Expectation-Maximization and K-Means Algorithms for Better Clustering Performance

Hybridization of Expectation-Maximization and K-Means Algorithms for Better Clustering Performance BULGARIAN ACADEMY OF SCIENCES CYBERNETICS AND INFORMATION TECHNOLOGIES Volume 16, No 2 Sofa 2016 Prnt ISSN: 1311-9702; Onlne ISSN: 1314-4081 DOI: 10.1515/cat-2016-0017 Hybrdzaton of Expectaton-Maxmzaton

More information

Classification of Face Images Based on Gender using Dimensionality Reduction Techniques and SVM

Classification of Face Images Based on Gender using Dimensionality Reduction Techniques and SVM Classfcaton of Face Images Based on Gender usng Dmensonalty Reducton Technques and SVM Fahm Mannan 260 266 294 School of Computer Scence McGll Unversty Abstract Ths report presents gender classfcaton based

More information

Face Recognition Based on SVM and 2DPCA

Face Recognition Based on SVM and 2DPCA Vol. 4, o. 3, September, 2011 Face Recognton Based on SVM and 2DPCA Tha Hoang Le, Len Bu Faculty of Informaton Technology, HCMC Unversty of Scence Faculty of Informaton Scences and Engneerng, Unversty

More information

Investigating the Performance of Naïve- Bayes Classifiers and K- Nearest Neighbor Classifiers

Investigating the Performance of Naïve- Bayes Classifiers and K- Nearest Neighbor Classifiers Journal of Convergence Informaton Technology Volume 5, Number 2, Aprl 2010 Investgatng the Performance of Naïve- Bayes Classfers and K- Nearest Neghbor Classfers Mohammed J. Islam *, Q. M. Jonathan Wu,

More information

Semantic Scene Concept Learning by an Autonomous Agent

Semantic Scene Concept Learning by an Autonomous Agent Semantc Scene Concept Learnng by an Autonomous Agent Weyu Zhu Illnos Wesleyan Unversty PO Box 29, Bloomngton, IL 672 wzhu@wu.edu Abstract Scene understandng addresses the ssue of what a scene contans.

More information

Keywords - Wep page classification; bag of words model; topic model; hierarchical classification; Support Vector Machines

Keywords - Wep page classification; bag of words model; topic model; hierarchical classification; Support Vector Machines (IJCSIS) Internatonal Journal of Computer Scence and Informaton Securty, Herarchcal Web Page Classfcaton Based on a Topc Model and Neghborng Pages Integraton Wongkot Srura Phayung Meesad Choochart Haruechayasak

More information

Discriminative classifiers for object classification. Last time

Discriminative classifiers for object classification. Last time Dscrmnatve classfers for object classfcaton Thursday, Nov 12 Krsten Grauman UT Austn Last tme Supervsed classfcaton Loss and rsk, kbayes rule Skn color detecton example Sldng ndo detecton Classfers, boostng

More information

A Robust Method for Estimating the Fundamental Matrix

A Robust Method for Estimating the Fundamental Matrix Proc. VIIth Dgtal Image Computng: Technques and Applcatons, Sun C., Talbot H., Ourseln S. and Adraansen T. (Eds.), 0- Dec. 003, Sydney A Robust Method for Estmatng the Fundamental Matrx C.L. Feng and Y.S.

More information

Proper Choice of Data Used for the Estimation of Datum Transformation Parameters

Proper Choice of Data Used for the Estimation of Datum Transformation Parameters Proper Choce of Data Used for the Estmaton of Datum Transformaton Parameters Hakan S. KUTOGLU, Turkey Key words: Coordnate systems; transformaton; estmaton, relablty. SUMMARY Advances n technologes and

More information

Fast Sparse Gaussian Processes Learning for Man-Made Structure Classification

Fast Sparse Gaussian Processes Learning for Man-Made Structure Classification Fast Sparse Gaussan Processes Learnng for Man-Made Structure Classfcaton Hang Zhou Insttute for Vson Systems Engneerng, Dept Elec. & Comp. Syst. Eng. PO Box 35, Monash Unversty, Clayton, VIC 3800, Australa

More information

Detection of an Object by using Principal Component Analysis

Detection of an Object by using Principal Component Analysis Detecton of an Object by usng Prncpal Component Analyss 1. G. Nagaven, 2. Dr. T. Sreenvasulu Reddy 1. M.Tech, Department of EEE, SVUCE, Trupath, Inda. 2. Assoc. Professor, Department of ECE, SVUCE, Trupath,

More information

S1 Note. Basis functions.

S1 Note. Basis functions. S1 Note. Bass functons. Contents Types of bass functons...1 The Fourer bass...2 B-splne bass...3 Power and type I error rates wth dfferent numbers of bass functons...4 Table S1. Smulaton results of type

More information

Implementation Naïve Bayes Algorithm for Student Classification Based on Graduation Status

Implementation Naïve Bayes Algorithm for Student Classification Based on Graduation Status Internatonal Journal of Appled Busness and Informaton Systems ISSN: 2597-8993 Vol 1, No 2, September 2017, pp. 6-12 6 Implementaton Naïve Bayes Algorthm for Student Classfcaton Based on Graduaton Status

More information

Learning from Multiple Related Data Streams with Asynchronous Flowing Speeds

Learning from Multiple Related Data Streams with Asynchronous Flowing Speeds Learnng from Multple Related Data Streams wth Asynchronous Flowng Speeds Zh Qao, Peng Zhang, Jng He, Jnghua Yan, L Guo Insttute of Computng Technology, Chnese Academy of Scences, Bejng, 100190, Chna. School

More information

The Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique

The Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique //00 :0 AM Outlne and Readng The Greedy Method The Greedy Method Technque (secton.) Fractonal Knapsack Problem (secton..) Task Schedulng (secton..) Mnmum Spannng Trees (secton.) Change Money Problem Greedy

More information

Yan et al. / J Zhejiang Univ-Sci C (Comput & Electron) in press 1. Improving Naive Bayes classifier by dividing its decision regions *

Yan et al. / J Zhejiang Univ-Sci C (Comput & Electron) in press 1. Improving Naive Bayes classifier by dividing its decision regions * Yan et al. / J Zhejang Unv-Sc C (Comput & Electron) n press 1 Journal of Zhejang Unversty-SCIENCE C (Computers & Electroncs) ISSN 1869-1951 (Prnt); ISSN 1869-196X (Onlne) www.zju.edu.cn/jzus; www.sprngerlnk.com

More information

Selecting Shape Features Using Multi-class Relevance Vector Machine

Selecting Shape Features Using Multi-class Relevance Vector Machine Selectng Shape Features Usng Mult-class Relevance Vector Machne Hao Zhang Jtendra Malk Electrcal Engneerng and Computer Scences Unversty of Calforna at Berkeley Techncal Report No. UCB/EECS-5-6 http://www.eecs.berkeley.edu/pubs/techrpts/5/eecs-5-6.html

More information

A Novel Term_Class Relevance Measure for Text Categorization

A Novel Term_Class Relevance Measure for Text Categorization A Novel Term_Class Relevance Measure for Text Categorzaton D S Guru, Mahamad Suhl Department of Studes n Computer Scence, Unversty of Mysore, Mysore, Inda Abstract: In ths paper, we ntroduce a new measure

More information

FEATURE EXTRACTION. Dr. K.Vijayarekha. Associate Dean School of Electrical and Electronics Engineering SASTRA University, Thanjavur

FEATURE EXTRACTION. Dr. K.Vijayarekha. Associate Dean School of Electrical and Electronics Engineering SASTRA University, Thanjavur FEATURE EXTRACTION Dr. K.Vjayarekha Assocate Dean School of Electrcal and Electroncs Engneerng SASTRA Unversty, Thanjavur613 41 Jont Intatve of IITs and IISc Funded by MHRD Page 1 of 8 Table of Contents

More information

UB at GeoCLEF Department of Geography Abstract

UB at GeoCLEF Department of Geography   Abstract UB at GeoCLEF 2006 Mguel E. Ruz (1), Stuart Shapro (2), June Abbas (1), Slva B. Southwck (1) and Davd Mark (3) State Unversty of New York at Buffalo (1) Department of Lbrary and Informaton Studes (2) Department

More information

High-Boost Mesh Filtering for 3-D Shape Enhancement

High-Boost Mesh Filtering for 3-D Shape Enhancement Hgh-Boost Mesh Flterng for 3-D Shape Enhancement Hrokazu Yagou Λ Alexander Belyaev y Damng We z Λ y z ; ; Shape Modelng Laboratory, Unversty of Azu, Azu-Wakamatsu 965-8580 Japan y Computer Graphcs Group,

More information

Spam Filtering Based on Support Vector Machines with Taguchi Method for Parameter Selection

Spam Filtering Based on Support Vector Machines with Taguchi Method for Parameter Selection E-mal Spam Flterng Based on Support Vector Machnes wth Taguch Method for Parameter Selecton We-Chh Hsu, Tsan-Yng Yu E-mal Spam Flterng Based on Support Vector Machnes wth Taguch Method for Parameter Selecton

More information

An Anti-Noise Text Categorization Method based on Support Vector Machines *

An Anti-Noise Text Categorization Method based on Support Vector Machines * An Ant-Nose Text ategorzaton Method based on Support Vector Machnes * hen Ln, Huang Je and Gong Zheng-Hu School of omputer Scence, Natonal Unversty of Defense Technology, hangsha, 410073, hna chenln@nudt.edu.cn,

More information

Helsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr)

Helsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr) Helsnk Unversty Of Technology, Systems Analyss Laboratory Mat-2.08 Independent research projects n appled mathematcs (3 cr) "! #$&% Antt Laukkanen 506 R ajlaukka@cc.hut.f 2 Introducton...3 2 Multattrbute

More information

Classification / Regression Support Vector Machines

Classification / Regression Support Vector Machines Classfcaton / Regresson Support Vector Machnes Jeff Howbert Introducton to Machne Learnng Wnter 04 Topcs SVM classfers for lnearly separable classes SVM classfers for non-lnearly separable classes SVM

More information

Quality Improvement Algorithm for Tetrahedral Mesh Based on Optimal Delaunay Triangulation

Quality Improvement Algorithm for Tetrahedral Mesh Based on Optimal Delaunay Triangulation Intellgent Informaton Management, 013, 5, 191-195 Publshed Onlne November 013 (http://www.scrp.org/journal/m) http://dx.do.org/10.36/m.013.5601 Qualty Improvement Algorthm for Tetrahedral Mesh Based on

More information

Steps for Computing the Dissimilarity, Entropy, Herfindahl-Hirschman and. Accessibility (Gravity with Competition) Indices

Steps for Computing the Dissimilarity, Entropy, Herfindahl-Hirschman and. Accessibility (Gravity with Competition) Indices Steps for Computng the Dssmlarty, Entropy, Herfndahl-Hrschman and Accessblty (Gravty wth Competton) Indces I. Dssmlarty Index Measurement: The followng formula can be used to measure the evenness between

More information

Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering

Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering Out-of-Sample Extensons for LLE, Isomap, MDS, Egenmaps, and Spectral Clusterng Yoshua Bengo, Jean-Franços Paement, Pascal Vncent Olver Delalleau, Ncolas Le Roux and Mare Oumet Département d Informatque

More information

Edge Detection in Noisy Images Using the Support Vector Machines

Edge Detection in Noisy Images Using the Support Vector Machines Edge Detecton n Nosy Images Usng the Support Vector Machnes Hlaro Gómez-Moreno, Saturnno Maldonado-Bascón, Francsco López-Ferreras Sgnal Theory and Communcatons Department. Unversty of Alcalá Crta. Madrd-Barcelona

More information

A User Selection Method in Advertising System

A User Selection Method in Advertising System Int. J. Communcatons, etwork and System Scences, 2010, 3, 54-58 do:10.4236/jcns.2010.31007 Publshed Onlne January 2010 (http://www.scrp.org/journal/jcns/). A User Selecton Method n Advertsng System Shy

More information

Incremental Learning with Support Vector Machines and Fuzzy Set Theory

Incremental Learning with Support Vector Machines and Fuzzy Set Theory The 25th Workshop on Combnatoral Mathematcs and Computaton Theory Incremental Learnng wth Support Vector Machnes and Fuzzy Set Theory Yu-Mng Chuang 1 and Cha-Hwa Ln 2* 1 Department of Computer Scence and

More information

Pruning Training Corpus to Speedup Text Classification 1

Pruning Training Corpus to Speedup Text Classification 1 Prunng Tranng Corpus to Speedup Text Classfcaton Jhong Guan and Shugeng Zhou School of Computer Scence, Wuhan Unversty, Wuhan, 430079, Chna hguan@wtusm.edu.cn State Key Lab of Software Engneerng, Wuhan

More information

BioTechnology. An Indian Journal FULL PAPER. Trade Science Inc.

BioTechnology. An Indian Journal FULL PAPER. Trade Science Inc. [Type text] [Type text] [Type text] ISSN : 0974-74 Volume 0 Issue BoTechnology 04 An Indan Journal FULL PAPER BTAIJ 0() 04 [684-689] Revew on Chna s sports ndustry fnancng market based on market -orented

More information

A Novel Adaptive Descriptor Algorithm for Ternary Pattern Textures

A Novel Adaptive Descriptor Algorithm for Ternary Pattern Textures A Novel Adaptve Descrptor Algorthm for Ternary Pattern Textures Fahuan Hu 1,2, Guopng Lu 1 *, Zengwen Dong 1 1.School of Mechancal & Electrcal Engneerng, Nanchang Unversty, Nanchang, 330031, Chna; 2. School

More information

A New Approach For the Ranking of Fuzzy Sets With Different Heights

A New Approach For the Ranking of Fuzzy Sets With Different Heights New pproach For the ankng of Fuzzy Sets Wth Dfferent Heghts Pushpnder Sngh School of Mathematcs Computer pplcatons Thapar Unversty, Patala-7 00 Inda pushpndersnl@gmalcom STCT ankng of fuzzy sets plays

More information

A Multivariate Analysis of Static Code Attributes for Defect Prediction

A Multivariate Analysis of Static Code Attributes for Defect Prediction Research Paper) A Multvarate Analyss of Statc Code Attrbutes for Defect Predcton Burak Turhan, Ayşe Bener Department of Computer Engneerng, Bogazc Unversty 3434, Bebek, Istanbul, Turkey {turhanb, bener}@boun.edu.tr

More information

A Binarization Algorithm specialized on Document Images and Photos

A Binarization Algorithm specialized on Document Images and Photos A Bnarzaton Algorthm specalzed on Document mages and Photos Ergna Kavalleratou Dept. of nformaton and Communcaton Systems Engneerng Unversty of the Aegean kavalleratou@aegean.gr Abstract n ths paper, a

More information

Sum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints

Sum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints Australan Journal of Basc and Appled Scences, 2(4): 1204-1208, 2008 ISSN 1991-8178 Sum of Lnear and Fractonal Multobjectve Programmng Problem under Fuzzy Rules Constrants 1 2 Sanjay Jan and Kalash Lachhwan

More information

SLAM Summer School 2006 Practical 2: SLAM using Monocular Vision

SLAM Summer School 2006 Practical 2: SLAM using Monocular Vision SLAM Summer School 2006 Practcal 2: SLAM usng Monocular Vson Javer Cvera, Unversty of Zaragoza Andrew J. Davson, Imperal College London J.M.M Montel, Unversty of Zaragoza. josemar@unzar.es, jcvera@unzar.es,

More information

Computer Animation and Visualisation. Lecture 4. Rigging / Skinning

Computer Animation and Visualisation. Lecture 4. Rigging / Skinning Computer Anmaton and Vsualsaton Lecture 4. Rggng / Sknnng Taku Komura Overvew Sknnng / Rggng Background knowledge Lnear Blendng How to decde weghts? Example-based Method Anatomcal models Sknnng Assume

More information

From Comparing Clusterings to Combining Clusterings

From Comparing Clusterings to Combining Clusterings Proceedngs of the Twenty-Thrd AAAI Conference on Artfcal Intellgence (008 From Comparng Clusterngs to Combnng Clusterngs Zhwu Lu and Yuxn Peng and Janguo Xao Insttute of Computer Scence and Technology,

More information

Adaptive Transfer Learning

Adaptive Transfer Learning Adaptve Transfer Learnng Bn Cao, Snno Jaln Pan, Yu Zhang, Dt-Yan Yeung, Qang Yang Hong Kong Unversty of Scence and Technology Clear Water Bay, Kowloon, Hong Kong {caobn,snnopan,zhangyu,dyyeung,qyang}@cse.ust.hk

More information

Local Quaternary Patterns and Feature Local Quaternary Patterns

Local Quaternary Patterns and Feature Local Quaternary Patterns Local Quaternary Patterns and Feature Local Quaternary Patterns Jayu Gu and Chengjun Lu The Department of Computer Scence, New Jersey Insttute of Technology, Newark, NJ 0102, USA Abstract - Ths paper presents

More information

A Post Randomization Framework for Privacy-Preserving Bayesian. Network Parameter Learning

A Post Randomization Framework for Privacy-Preserving Bayesian. Network Parameter Learning A Post Randomzaton Framework for Prvacy-Preservng Bayesan Network Parameter Learnng JIANJIE MA K.SIVAKUMAR School Electrcal Engneerng and Computer Scence, Washngton State Unversty Pullman, WA. 9964-75

More information

EXTENDED BIC CRITERION FOR MODEL SELECTION

EXTENDED BIC CRITERION FOR MODEL SELECTION IDIAP RESEARCH REPORT EXTEDED BIC CRITERIO FOR ODEL SELECTIO Itshak Lapdot Andrew orrs IDIAP-RR-0-4 Dalle olle Insttute for Perceptual Artfcal Intellgence P.O.Box 59 artgny Valas Swtzerland phone +4 7

More information

Data-dependent Hashing Based on p-stable Distribution

Data-dependent Hashing Based on p-stable Distribution Data-depent Hashng Based on p-stable Dstrbuton Author Ba, Xao, Yang, Hachuan, Zhou, Jun, Ren, Peng, Cheng, Jan Publshed 24 Journal Ttle IEEE Transactons on Image Processng DOI https://do.org/.9/tip.24.2352458

More information

Integrated Expression-Invariant Face Recognition with Constrained Optical Flow

Integrated Expression-Invariant Face Recognition with Constrained Optical Flow Integrated Expresson-Invarant Face Recognton wth Constraned Optcal Flow Chao-Kue Hseh, Shang-Hong La 2, and Yung-Chang Chen Department of Electrcal Engneerng, Natonal Tsng Hua Unversty, Tawan 2 Department

More information

SHAPE RECOGNITION METHOD BASED ON THE k-nearest NEIGHBOR RULE

SHAPE RECOGNITION METHOD BASED ON THE k-nearest NEIGHBOR RULE SHAPE RECOGNITION METHOD BASED ON THE k-nearest NEIGHBOR RULE Dorna Purcaru Faculty of Automaton, Computers and Electroncs Unersty of Craoa 13 Al. I. Cuza Street, Craoa RO-1100 ROMANIA E-mal: dpurcaru@electroncs.uc.ro

More information

Dynamic Integration of Regression Models

Dynamic Integration of Regression Models Dynamc Integraton of Regresson Models Nall Rooney 1, Davd Patterson 1, Sarab Anand 1, Alexey Tsymbal 2 1 NIKEL, Faculty of Engneerng,16J27 Unversty Of Ulster at Jordanstown Newtonabbey, BT37 OQB, Unted

More information

Impact of a New Attribute Extraction Algorithm on Web Page Classification

Impact of a New Attribute Extraction Algorithm on Web Page Classification Impact of a New Attrbute Extracton Algorthm on Web Page Classfcaton Gösel Brc, Banu Dr, Yldz Techncal Unversty, Computer Engneerng Department Abstract Ths paper ntroduces a new algorthm for dmensonalty

More information

Parallelism for Nested Loops with Non-uniform and Flow Dependences

Parallelism for Nested Loops with Non-uniform and Flow Dependences Parallelsm for Nested Loops wth Non-unform and Flow Dependences Sam-Jn Jeong Dept. of Informaton & Communcaton Engneerng, Cheonan Unversty, 5, Anseo-dong, Cheonan, Chungnam, 330-80, Korea. seong@cheonan.ac.kr

More information

Machine Learning 9. week

Machine Learning 9. week Machne Learnng 9. week Mappng Concept Radal Bass Functons (RBF) RBF Networks 1 Mappng It s probably the best scenaro for the classfcaton of two dataset s to separate them lnearly. As you see n the below

More information