Feature-Based Matrix Factorization


arXiv v3 [cs.AI] 29 Dec 2011

Tianqi Chen, Zhao Zheng, Qiuxia Lu, Weinan Zhang, Yong Yu
Apex Data & Knowledge Management Lab
Shanghai Jiao Tong University
800 Dongchuan Road, Shanghai, China
Project page: wiki/svdfeature (version 1.1)

Abstract

Recommender systems have become more and more popular and are widely used in many applications. The increasing information available, not only in quantity but also in type, poses a big challenge for recommender systems: how to leverage this rich information to achieve better performance. Most traditional approaches design a specific model for each scenario, which demands great effort in developing and modifying models. In this technical report, we describe our implementation of feature-based matrix factorization. This model abstracts many variants of matrix factorization models, and new types of information can be utilized simply by defining new features, without modifying a line of code. Using the toolkit, we built the best single model reported on track 1 of KDDCup 11.

1 Introduction

Recommender systems that recommend items based on users' interests have become more and more popular among many web sites. The Collaborative Filtering (CF) techniques behind recommender systems have been developed for many years and remain a hot area in both academia and industry. Currently CF faces two major challenges: how to handle large-scale datasets, and how to leverage the rich information collected. The traditional approach is to design a specific model for each problem, i.e. to write code for each model, which demands great effort in engineering. Matrix factorization (MF) is one of the most popular CF methods, and different variants of the matrix factorization model have been studied extensively, such as [3][4] and [5]. However, we find that the majority of matrix factorization models share common patterns, which motivates us to put them together into one model. We call this model feature-based matrix factorization. Moreover, we wrote a toolkit for solving the general feature-based matrix factorization problem, saving the engineering effort for particular kinds of model. Using the toolkit, we obtained the best single model on track 1 of KDDCup 11 [2]. This article serves as a technical report for our toolkit of feature-based matrix factorization (project page: wiki/svdfeature). We elaborate on three questions in this report: what the model is, how such a model can be used, and additional issues in engineering and efficient computation.

2 What is feature-based MF

In this section, we describe the model of feature-based matrix factorization, starting from the example of linear regression and then going on to the full definition of our model.

2.1 Starting from linear regression

Let us start from the basic collaborative filtering models. The simplest baselines consider only the mean effects of the user and the item:

    r̂_ui = μ + b_u                                                      (1)
    r̂_ui = μ + b_u + b_i                                                (2)

Here μ is a constant giving the global mean rating. Equation 1 describes a model with a user mean effect, while Equation 2 adds an item mean effect. A more complex model that considers neighborhood information [3] is

    r̂_ui = μ + b_i + b_u + |R(u)|^(-1/2) Σ_{j∈R(u)} s_ij (r_uj − b̄_uj)   (3)

Here R(u) is the set of items user u rated, b̄_uj is a pre-calculated user average rating, and s_ij is the similarity parameter between items i and j. s_ij is a parameter that we train from data instead of computing directly as in memory-based methods; note that b̄_uj differs from b_u, since it is pre-calculated. This is a neighborhood model that takes the neighborhood effect of items into consideration.

Assuming we want to implement all three models, it seems wasteful to write code for each of them. If we compare the models, it is obvious that all three are special cases of the linear regression problem described by Equation 4:

    y = Σ_i w_i x_i                                                     (4)

Suppose we have n users, m items, and h possible similarity parameters s_ij in Equation 3. We can define the feature vector x = [x_0, x_1, ..., x_{n+m+h−1}] for the user–item pair <u, i> as follows:

    x_k = Indicator(u == k)                for k < n
    x_k = Indicator(i == k − n)            for n ≤ k < n + m
    x_k = 0                                for k ≥ n + m, w_k means s_ij with j ∉ R(u)
    x_k = |R(u)|^(-1/2) (r_uj − b̄_uj)      for k ≥ n + m, w_k means s_ij with j ∈ R(u)      (5)

The corresponding layout of the weight vector w is shown in Equation 6. Note that the choice of pairs for s_ij can be flexible: we can keep only likely neighbors instead of enumerating all pairs.

    w = [b_u(0), b_u(1), ..., b_u(n−1), b_i(0), ..., b_i(m−1), s_...]   (6)

In other words, Equation 3 can be rewritten in the following form:

    r̂_ui = μ + b_i · 1 + b_u · 1 + Σ_{j∈R(u)} s_ij [ |R(u)|^(-1/2) (r_uj − b̄_uj) ]   (7)

where b_i, b_u, s_ij are the weights of the linear regression, and the coefficients to the right of each weight are the input features. In summary, under this framework the only thing we need to do is lay out the parameters in a feature vector. In our case we assign the first n features to b_u, then b_i, then s_ij, transform the input data into the format of a linear regression input, and finally use a linear regression solver to work the problem out.

2.2 Feature-based matrix factorization

The previous section shows that some baseline CF algorithms are linear regression problems. In this section, we discuss a feature-based generalization of matrix factorization. A basic matrix factorization model is stated in Equation 8:

    r̂_ui = μ + b_u + b_i + p_u^T q_i                                    (8)
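The parameter-layout trick of Section 2.1 can be made concrete in a few lines: one-hot user and item indicators turn the bias model of Equation 2 into plain linear regression. A minimal sketch, where the toy sizes and weight values are illustrative and the similarity block of Equation 5 is omitted for brevity:

```python
import numpy as np

n, m = 3, 4                     # n users, m items (toy sizes)

def features(u, i):
    """x_k = Indicator(u == k) for k < n, Indicator(i == k - n) after."""
    x = np.zeros(n + m)
    x[u] = 1.0
    x[n + i] = 1.0
    return x

# w lays out [b_u(0..n-1), b_i(0..m-1)] as in Equation 6 (toy values).
w = np.arange(n + m, dtype=float)
mu = 3.0                        # global mean rating

def predict(u, i):
    """r_hat = mu + w . x, i.e. mu + b_u + b_i as in Equation 2."""
    return mu + w @ features(u, i)
```

With this encoding, any off-the-shelf linear regression solver recovers the b_u and b_i biases: predict(1, 2) simply picks out w[1] and w[n + 2] on top of μ.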

[Figure 1: Feature-based matrix factorization. User features and item features are merged into a user factor and an item factor; together with the user, item, and global feature biases they produce the answer r_{U,I} for a question ⟨U, I⟩.]

The bias terms have the same meaning as in the previous section. We also get two factor terms, p_u and q_i: p_u models the latent preference of user u, and q_i models the latent property of item i. Inspired by the idea of the previous section, we can obtain a direct generalization of matrix factorization:

    r̂_ui = μ + Σ_j w_j x_j + b_u + b_i + p_u^T q_i                      (9)

Equation 9 adds a linear regression term to the traditional matrix factorization model. This allows us to add more bias information, such as neighborhood information, time bias information, etc. However, we may also need a more flexible factor part. For example, we may want a time-dependent user factor p_u(t) or a hierarchy-dependent item factor q_i(h). As the previous section suggests, a direct way to include such flexibility is to use features in the factor part as well. We therefore define feature-based matrix factorization as follows:

    y = μ + ( Σ_j b_j^(g) γ_j + Σ_j b_j^(u) α_j + Σ_j b_j^(i) β_j ) + ( Σ_j p_j α_j )^T ( Σ_j q_j β_j )   (10)

The input consists of three kinds of features ⟨α, β, γ⟩: we call α the user feature, β the item feature, and γ the global feature. The first part of Equation 10 gives the bias terms and the second part gives the factor term. The names of these features explain their meanings: α describes the user aspects, β describes the item aspects, while γ describes global bias effects. Figure 1 illustrates the procedure. Basic matrix factorization is a special case of Equation 10: for predicting the user–item pair <u, i>, define

    γ = ∅,   α_k = Indicator(k == u),   β_k = Indicator(k == i)         (11)

We are not limited to simple matrix factorization. The model lets us incorporate neighborhood information through γ, and time-dependent user factors by modifying α. Section 3 presents a detailed description of this.

2.3 Activation function and loss function

We still need to choose an activation function f(·) applied to the output of the feature-based matrix factorization, and similarly we can try various loss functions for loss estimation. The final version of the model is

    r̂ = f(y)                                                            (12)
    Loss = L(r̂, r) + regularization                                     (13)

Common choices of activation function and loss are listed as follows:

- Identity function with L2 loss: the original matrix factorization.

      r̂ = f(y) = y                                                      (14)
      Loss = (r − r̂)² + regularization                                  (15)

- Sigmoid function with log-likelihood loss: a logistic regression version of matrix factorization.

      r̂ = f(y) = 1 / (1 + e^(−y))                                       (16)
      Loss = −[ r ln r̂ + (1 − r) ln(1 − r̂) ] + regularization           (17)

- Identity function with smoothed hinge loss [7]: maximum-margin matrix factorization [8][7], for binary classification problems with r ∈ {0, 1}.

      Loss = h( (2r − 1) y ) + regularization                           (18)

      h(z) = 1/2 − z        for z ≤ 0
      h(z) = (1 − z)²/2     for 0 < z < 1
      h(z) = 0              for z ≥ 1                                   (19)
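The activation/loss pairings above can be written as plain functions. A minimal sketch, where the function names are illustrative rather than the toolkit's API:

```python
import numpy as np

def l2_loss(r, y):
    """Identity activation with squared error (Equations 14-15)."""
    return (r - y) ** 2

def logistic_loss(r, y):
    """Sigmoid activation with negative log likelihood (Equations 16-17)."""
    r_hat = 1.0 / (1.0 + np.exp(-y))
    return -(r * np.log(r_hat) + (1.0 - r) * np.log(1.0 - r_hat))

def smoothed_hinge(z):
    """h(z) of Equation 19: linear, quadratic, then zero."""
    if z <= 0.0:
        return 0.5 - z
    if z < 1.0:
        return 0.5 * (1.0 - z) ** 2
    return 0.0

def hinge_loss(r, y):
    """Smoothed hinge loss of Equation 18, for binary r in {0, 1}."""
    return smoothed_hinge((2 * r - 1) * y)
```

Note that the piecewise h(z) is continuously differentiable at both breakpoints, which is what makes it convenient for gradient training.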

2.4 Model Learning

To train the model, we use the following stochastic gradient update rules:

    p_j ← p_j + η ( ê α_j Σ_k β_k q_k − λ_1 p_j )                       (20)
    q_j ← q_j + η ( ê β_j Σ_k α_k p_k − λ_2 q_j )                       (21)
    b_j^(g) ← b_j^(g) + η ( ê γ_j − λ_3 b_j^(g) )                       (22)
    b_j^(u) ← b_j^(u) + η ( ê α_j − λ_4 b_j^(u) )                       (23)
    b_j^(i) ← b_j^(i) + η ( ê β_j − λ_5 b_j^(i) )                       (24)

Here ê = r − r̂ is the difference between the true rating and the predicted rating. This rule is valid for both the logistic-likelihood loss and the L2 loss; for other losses, ê must be replaced by the corresponding gradient. η is the learning rate, and the λs are regularization parameters that define the strength of regularization.

3 What information can be included

In this section, we present some examples to illustrate the usage of our feature-based matrix factorization model.

3.1 Basic matrix factorization

The basic matrix factorization model is defined by the following equation:

    y = μ + b_u + b_i + p_u^T q_i                                       (25)

The corresponding feature representation is

    γ = ∅,   α_k = Indicator(k == u),   β_k = Indicator(k == i)         (26)

3.2 Pairwise rank model

For a ranking model, we are interested in the order of two items i and j, given a user u. A pairwise ranking model is described as follows:

    P(r_ui > r_uj) = sigmoid( μ + b_i − b_j + p_u^T (q_i − q_j) )       (27)
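The update rules of Equations 20–24 fit in one short routine. A minimal sketch assuming the L2 loss (so ê = r − r̂); the dense parameter tables and dict-encoded sparse features are illustrative conventions, not the toolkit's actual API:

```python
import numpy as np

def sgd_step(r, alpha, beta, gamma, P, Q, b_u, b_i, b_g, mu,
             eta=0.01, lam=0.004):
    """One stochastic gradient step of Equations 20-24."""
    k = P.shape[1]
    # Merged factors: p = sum_j alpha_j p_j, q = sum_j beta_j q_j.
    p = sum((v * P[j] for j, v in alpha.items()), np.zeros(k))
    q = sum((v * Q[j] for j, v in beta.items()), np.zeros(k))
    y = (mu
         + sum(v * b_g[j] for j, v in gamma.items())
         + sum(v * b_u[j] for j, v in alpha.items())
         + sum(v * b_i[j] for j, v in beta.items())
         + p @ q)
    e = r - y                                  # e_hat = r - r_hat
    for j, v in alpha.items():                 # Equations 20 and 23
        P[j] += eta * (e * v * q - lam * P[j])
        b_u[j] += eta * (e * v - lam * b_u[j])
    for j, v in beta.items():                  # Equations 21 and 24
        Q[j] += eta * (e * v * p - lam * Q[j])
        b_i[j] += eta * (e * v - lam * b_i[j])
    for j, v in gamma.items():                 # Equation 22
        b_g[j] += eta * (e * v - lam * b_g[j])
    return e

# Repeated steps on one sample should shrink the error e_hat.
P, Q = np.zeros((3, 4)), np.zeros((3, 4))
b_u, b_i, b_g = np.zeros(3), np.zeros(3), np.zeros(2)
errs = [sgd_step(5.0, {0: 1.0}, {1: 1.0}, {}, P, Q, b_u, b_i, b_g,
                 3.0, eta=0.1, lam=0.0) for _ in range(3)]
```

Note that the merged factors p and q are computed once from the old parameters before any update, so Equations 20 and 21 both use the pre-update values.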

The corresponding feature representation, using the sigmoid activation and log-likelihood loss, is

    γ = ∅,   α_k = Indicator(k == u),   β_k = { 1 if k == i;  −1 if k == j;  0 otherwise }   (28)

Note that this feature representation yields one extra term b_u, which is not desirable. We can remove it by giving b_u a high regularization weight that penalizes it toward 0.

3.3 Temporal information

A model that includes temporal information [4] can be described as follows:

    y = μ + b_u(t) + b_i(t) + b_u + b_i + (p_u + p_u(t))^T q_i          (29)

We can include b_i(t) using the global feature, and b_u(t), p_u(t) using the user feature. For example, we can define a time interpolation model as follows:

    y = μ + b_i + b_u^s (e − t)/(e − s) + b_u^e (t − s)/(e − s)
          + ( p_u^s (e − t)/(e − s) + p_u^e (t − s)/(e − s) )^T q_i     (30)

Here s and e denote the start and end times over all the ratings. A rating given later is affected more by p_u^e and b_u^e, while earlier ratings are affected more by p_u^s and b_u^s. For this model, we can define

    γ = ∅,   α_k = { (e − t)/(e − s) if k == u;  (t − s)/(e − s) if k == u + n;  0 otherwise },   β_k = Indicator(k == i)   (31)

Note that we first arrange the p^s parameters in the first n features, then p^e in the next n features.

3.4 Neighborhood information

A model that includes neighborhood information [3] can be described as below:

    y = μ + Σ_{j∈R(u)} s_ij [ |R(u)|^(-1/2) (r_uj − b̄_uj) ] + b_u + b_i + p_u^T q_i   (32)

We only need to map the neighborhood information to global features, as described in Section 2.1.
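The time-interpolation user features of Equation 31 are simple to construct. A minimal sketch, where the user count n and the rating-time span [s, e] are illustrative values:

```python
n = 100                          # number of users (illustrative)
s, e = 0.0, 10.0                 # start and end times of all ratings

def temporal_alpha(u, t):
    """Sparse user feature: weight on p^s at index u, on p^e at u + n."""
    w_start = (e - t) / (e - s)  # large for early ratings
    w_end = (t - s) / (e - s)    # large for late ratings
    return {u: w_start, u + n: w_end}

# An early rating leans toward the start-of-time parameters p^s, b^s.
alpha = temporal_alpha(7, 2.5)
```

The two weights always sum to one, so the merged user factor interpolates linearly between p^s and p^e as t moves from s to e.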

3.5 Hierarchical information

In the Yahoo! Music dataset [2], some tracks belong to the same artist. We can include such hierarchical information by adding it to the item feature. The model is described as follows:

    y = μ + b_u + b_t + b_a + p_u^T (q_t + q_a)                         (33)

Here t denotes the track and a the corresponding artist. This model can be formalized as feature-based matrix factorization by redefining the item feature.

4 Efficient training for SVD++

Feature-based matrix factorization can naturally incorporate implicit and explicit feedback information: we simply add it to the user feature α. The model configuration is shown as follows:

    y = bias + ( Σ_j ξ_j p_j + Σ_j α_j d_j )^T ( Σ_j β_j q_j )          (34)

Here we omit the details of the bias term. The implicit and explicit feedback information is given by Σ_j α_j d_j, where α is the feature vector of the feedback information, with α_j = |R(u)|^(-1/2) for implicit feedback and α_j = |R(u)|^(-1/2) (r_uj − b̄_uj) for explicit feedback, and the d_j are the implicit and explicit feedback factors. We state the implicit and explicit information explicitly in Equation 34.

Although Equation 34 shows that we can easily incorporate implicit and explicit information into the model, it is actually very costly to run stochastic gradient training, since the update cost is linear in the number of nonzero entries of α, and α can be very large if a user has rated many items. This greatly slows down training, so we need an optimized training method. To show the idea of the optimized method, let us first define a derived user implicit and explicit factor p_m as follows:

    p_m = Σ_j α_j d_j                                                   (35)

The update of d_j after one step is given by the following equation:

    Δd_j = η ê α_j Σ_k β_k q_k                                          (36)
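Because every d_j in a user's feedback set moves along the same direction Σ_k β_k q_k, with magnitude proportional to α_j, the per-sample updates can be accumulated on the merged factor p_m of Equation 35 and distributed back to the individual d_j once per user. A numerical sketch of this equivalence with illustrative values, regularization omitted (where the correspondence is exact):

```python
import numpy as np

rng = np.random.default_rng(1)
k = 4
alpha = {0: 0.5, 3: 0.25, 7: 0.25}           # one user's feedback features
d = {j: rng.normal(0.0, 0.1, k) for j in alpha}
sq = sum(a * a for a in alpha.values())      # sum_k alpha_k^2
eta = 0.05
# Two samples of the same user, as (e_hat, sum_k beta_k q_k) pairs.
samples = [(1.2, rng.normal(0.0, 1.0, k)),
           (-0.4, rng.normal(0.0, 1.0, k))]

# Naive scheme: apply Equation 36 to every d_j for every sample.
d_naive = {j: v.copy() for j, v in d.items()}
for e, q in samples:
    for j, a in alpha.items():
        d_naive[j] += eta * e * a * q

# Fast scheme: move only p_m per sample, then recover each d_j
# via d_j += alpha_j / (sum_k alpha_k^2) * (p_m - p_old).
p_m = sum(a * d[j] for j, a in alpha.items())
p_old = p_m.copy()
for e, q in samples:
    p_m = p_m + eta * e * sq * q
d_fast = {j: d[j] + (a / sq) * (p_m - p_old)
          for j, a in alpha.items()}
```

Both schemes produce identical d_j, but the fast one touches only the k-dimensional p_m per sample instead of every nonzero of α.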

The resulting difference in p_m is given by

    Δp_m = η ê ( Σ_j α_j² ) Σ_k β_k q_k                                 (37)

Given a group of samples from the same user, we need to do gradient descent on each training sample. The simplest way is, for each sample: (1) calculate p_m to get the prediction; (2) update all d_j associated with the implicit and explicit feedback. In this way, p_m has to be recalculated from the updated d every time. However, to get the new p_m, we do not need to update each d_j; we only need to update p_m using Equation 37. What is more, there is a relation between Δd_j and Δp_m, as follows:

    Δd_j = ( α_j / Σ_k α_k² ) Δp_m                                      (38)

We emphasize that Equation 38 holds even over multiple updates, on the condition that the user is the same in all the samples. We should mention that the above analysis does not consider the regularization term. If L2 regularization of d_j is used during the update, as follows:

    Δd_j = η ( ê α_j Σ_k β_k q_k − λ d_j )                              (39)

then the corresponding change in p_m also looks very similar:

    Δp_m = η ( ê ( Σ_j α_j² ) Σ_k β_k q_k − λ p_m )                     (40)

However, the relation in Equation 38 no longer holds strictly. We can still use it, since it holds approximately when the regularization term is small. Using these results, we can develop a fast algorithm for feature-based matrix factorization with implicit and explicit feedback information, shown in Algorithm 1. The basic idea is to group the data of the same user together, since the same user shares the same implicit and explicit feedback information. Algorithm 1 allows us to calculate the implicit feedback factor only once per user, greatly saving computation time.

Algorithm 1 Efficient Training for Implicit and Explicit Feedback

    for all users u do
        p_m ← Σ_j α_j d_j          {calculate the implicit feedback factor}
        p_old ← p_m
        for all training samples of user u do
            update the other parameters, using p_m in place of Σ_j α_j d_j
            update p_m directly; do not update d
        end for
        for all j with α_j ≠ 0 do
            d_j ← d_j + ( α_j / Σ_k α_k² ) ( p_m − p_old )   {add the accumulated changes back to d}
        end for
    end for

5 How large-scale data is handled

Recommender systems confront large-scale data in practice, and handling it is a must when dealing with real problems. For example, the Yahoo! Music dataset [2] consists of more than 200M ratings. A toolkit that is robust to the input data size is desirable for real applications.

5.1 Input data buffering

Because the input training data is extremely large in real applications, we do not try to load all of it into memory. Instead, we buffer the training data in a binary format on the hard disk. Since we use stochastic gradient descent to train our model, we only need to iterate linearly over the data, provided it was shuffled before buffering. Our solution therefore requires the input features to be shuffled in advance; a buffering program then creates a binary buffer from the input features, and the training procedure reads the data back from the hard disk and trains the model with stochastic gradient descent. This buffering approach makes the memory cost invariant to the input data size, and allows us to train models over large-scale input data so long as the parameters fit into memory.

5.2 Execution pipeline

Although input data buffering solves the problem of large-scale data, it still suffers from the cost of reading the data from the hard disk. To minimize the cost of I/O, we use a pre-fetching strategy: an independent thread fetches the buffered data into a memory queue, and the training thread reads the data from the memory queue and does the training. The procedure is shown in Figure 2.

[Figure 2: Execution pipeline. Thread 1 fetches data from disk into a buffer in memory; Thread 2 runs matrix factorization with stochastic gradient descent on the buffered input.]

This pipelined style of execution removes the burden of I/O from the training thread. As long as the I/O speed is similar to or faster than the training speed, the cost of I/O is negligible, and our experience in KDDCup 11 proves the success of this strategy. With input buffering and pipelined execution, we can train a model with test RMSE = 22.16 for track 1 of KDDCup 11 (kddcup.yahoo.com) using less than 2 GB of memory, without significantly increasing the training time.

6 Related work and discussion

The work most closely related to feature-based matrix factorization is the Factorization Machine [6]; the reader can refer to libFM for a factorization machine toolkit. Strictly speaking, our toolkit implements a restricted case of the factorization machine, but it is more useful in some aspects. We support global features that need not be taken into the factorization part, which is important for bias features such as user-day bias, neighborhood-based features, etc. The division of features also gives hints for model design: for global features, we should consider what aspects may influence the overall rating; for user and item features, we should consider how to better describe the user preference and item property. Our model is also related to [1] and [9]; the difference is that in feature-based matrix factorization, the user/item features can be associated with temporal information and other context information to better describe the preference or property in the current context.

Our current model also has shortcomings. It does not at present support multiple distinct factorizations; for example, we may sometimes want to introduce a user-vs-time tensor factorization together with the user-vs-item factorization. We will try our best to overcome these drawbacks in future work.

References

[1] Deepak Agarwal and Bee-Chung Chen. Regression-based latent factor models. In Proceedings of the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD '09, pages 19-28, New York, NY, USA. ACM.

[2] Gideon Dror, Noam Koenigstein, Yehuda Koren, and Markus Weimer. The Yahoo! Music dataset and KDD-Cup 11. In KDD-Cup Workshop.

[3] Yehuda Koren. Factorization meets the neighborhood: a multifaceted collaborative filtering model. In Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD '08, New York, NY, USA. ACM.

[4] Yehuda Koren. Collaborative filtering with temporal dynamics. In Proceedings of the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD '09, New York, NY, USA. ACM.

[5] A. Paterek. Improving regularized singular value decomposition for collaborative filtering. In Proceedings of KDD Cup and Workshop, volume 2007.

[6] Steffen Rendle. Factorization machines. In Proceedings of the 10th IEEE International Conference on Data Mining. IEEE Computer Society.

[7] Jasson D. M. Rennie and Nathan Srebro. Fast maximum margin matrix factorization for collaborative prediction. In Proceedings of the 22nd International Conference on Machine Learning, ICML '05, New York, NY, USA. ACM.

[8] Nathan Srebro, Jason D. M. Rennie, and Tommi S. Jaakkola. Maximum-margin matrix factorization. In Advances in Neural Information Processing Systems 17, volume 17.

[9] David H. Stern, Ralf Herbrich, and Thore Graepel. Matchbox: large scale online Bayesian recommendations. In Proceedings of the 18th International Conference on World Wide Web, WWW '09, New York, NY, USA. ACM.


Lecture 4: Principal components /3/6 Lecture 4: Prncpal components 3..6 Multvarate lnear regresson MLR s optmal for the estmaton data...but poor for handlng collnear data Covarance matrx s not nvertble (large condton number) Robustness

More information

NAG Fortran Library Chapter Introduction. G10 Smoothing in Statistics

NAG Fortran Library Chapter Introduction. G10 Smoothing in Statistics Introducton G10 NAG Fortran Lbrary Chapter Introducton G10 Smoothng n Statstcs Contents 1 Scope of the Chapter... 2 2 Background to the Problems... 2 2.1 Smoothng Methods... 2 2.2 Smoothng Splnes and Regresson

More information

Load Balancing for Hex-Cell Interconnection Network

Load Balancing for Hex-Cell Interconnection Network Int. J. Communcatons, Network and System Scences,,, - Publshed Onlne Aprl n ScRes. http://www.scrp.org/journal/jcns http://dx.do.org/./jcns.. Load Balancng for Hex-Cell Interconnecton Network Saher Manaseer,

More information

Random Kernel Perceptron on ATTiny2313 Microcontroller

Random Kernel Perceptron on ATTiny2313 Microcontroller Random Kernel Perceptron on ATTny233 Mcrocontroller Nemanja Djurc Department of Computer and Informaton Scences, Temple Unversty Phladelpha, PA 922, USA nemanja.djurc@temple.edu Slobodan Vucetc Department

More information

Parallelism for Nested Loops with Non-uniform and Flow Dependences

Parallelism for Nested Loops with Non-uniform and Flow Dependences Parallelsm for Nested Loops wth Non-unform and Flow Dependences Sam-Jn Jeong Dept. of Informaton & Communcaton Engneerng, Cheonan Unversty, 5, Anseo-dong, Cheonan, Chungnam, 330-80, Korea. seong@cheonan.ac.kr

More information

Outline. Type of Machine Learning. Examples of Application. Unsupervised Learning

Outline. Type of Machine Learning. Examples of Application. Unsupervised Learning Outlne Artfcal Intellgence and ts applcatons Lecture 8 Unsupervsed Learnng Professor Danel Yeung danyeung@eee.org Dr. Patrck Chan patrckchan@eee.org South Chna Unversty of Technology, Chna Introducton

More information

Today s Outline. Sorting: The Big Picture. Why Sort? Selection Sort: Idea. Insertion Sort: Idea. Sorting Chapter 7 in Weiss.

Today s Outline. Sorting: The Big Picture. Why Sort? Selection Sort: Idea. Insertion Sort: Idea. Sorting Chapter 7 in Weiss. Today s Outlne Sortng Chapter 7 n Wess CSE 26 Data Structures Ruth Anderson Announcements Wrtten Homework #6 due Frday 2/26 at the begnnng of lecture Proect Code due Mon March 1 by 11pm Today s Topcs:

More information

Solving two-person zero-sum game by Matlab

Solving two-person zero-sum game by Matlab Appled Mechancs and Materals Onlne: 2011-02-02 ISSN: 1662-7482, Vols. 50-51, pp 262-265 do:10.4028/www.scentfc.net/amm.50-51.262 2011 Trans Tech Publcatons, Swtzerland Solvng two-person zero-sum game by

More information

Active Contours/Snakes

Active Contours/Snakes Actve Contours/Snakes Erkut Erdem Acknowledgement: The sldes are adapted from the sldes prepared by K. Grauman of Unversty of Texas at Austn Fttng: Edges vs. boundares Edges useful sgnal to ndcate occludng

More information

Recommended Items Rating Prediction based on RBF Neural Network Optimized by PSO Algorithm

Recommended Items Rating Prediction based on RBF Neural Network Optimized by PSO Algorithm Recommended Items Ratng Predcton based on RBF Neural Network Optmzed by PSO Algorthm Chengfang Tan, Cayn Wang, Yuln L and Xx Q Abstract In order to mtgate the data sparsty and cold-start problems of recommendaton

More information

CS 534: Computer Vision Model Fitting

CS 534: Computer Vision Model Fitting CS 534: Computer Vson Model Fttng Sprng 004 Ahmed Elgammal Dept of Computer Scence CS 534 Model Fttng - 1 Outlnes Model fttng s mportant Least-squares fttng Maxmum lkelhood estmaton MAP estmaton Robust

More information

Exercises (Part 4) Introduction to R UCLA/CCPR. John Fox, February 2005

Exercises (Part 4) Introduction to R UCLA/CCPR. John Fox, February 2005 Exercses (Part 4) Introducton to R UCLA/CCPR John Fox, February 2005 1. A challengng problem: Iterated weghted least squares (IWLS) s a standard method of fttng generalzed lnear models to data. As descrbed

More information

Adapting Ratings in Memory-Based Collaborative Filtering using Linear Regression

Adapting Ratings in Memory-Based Collaborative Filtering using Linear Regression Adaptng Ratngs n Memory-Based Collaboratve Flterng usng Lnear Regresson Jérôme Kunegs, Şahn Albayrak Technsche Unverstät Berln DAI-Labor Franklnstraße 8/9 10587 Berln, Germany {kunegs,sahn.albayrak}@da-labor.de

More information

Your Neighbors Affect Your Ratings: On Geographical Neighborhood Influence to Rating Prediction

Your Neighbors Affect Your Ratings: On Geographical Neighborhood Influence to Rating Prediction Your Neghbors Affect Your Ratngs: On Geographcal Neghborhood Influence to Ratng Predcton Longke Hu lhu003@e.ntu.edu.sg Axn Sun axsun@ntu.edu.sg Yong Lu luy0054@e.ntu.edu.sg School of Computer Engneerng,

More information

Collaborative Topic Regression with Multiple Graphs Factorization for Recommendation in Social Media

Collaborative Topic Regression with Multiple Graphs Factorization for Recommendation in Social Media Collaboratve Topc Regresson wth Multple Graphs Factorzaton for Recommendaton n Socal Meda Qng Zhang Key Laboratory of Computatonal Lngustcs (Pekng Unversty) Mnstry of Educaton, Chna zqcl@pku.edu.cn Houfeng

More information

Collaboratively Regularized Nearest Points for Set Based Recognition

Collaboratively Regularized Nearest Points for Set Based Recognition Academc Center for Computng and Meda Studes, Kyoto Unversty Collaboratvely Regularzed Nearest Ponts for Set Based Recognton Yang Wu, Mchhko Mnoh, Masayuk Mukunok Kyoto Unversty 9/1/013 BMVC 013 @ Brstol,

More information

Efficient Distributed Linear Classification Algorithms via the Alternating Direction Method of Multipliers

Efficient Distributed Linear Classification Algorithms via the Alternating Direction Method of Multipliers Effcent Dstrbuted Lnear Classfcaton Algorthms va the Alternatng Drecton Method of Multplers Caoxe Zhang Honglak Lee Kang G. Shn Department of EECS Unversty of Mchgan Ann Arbor, MI 48109, USA caoxezh@umch.edu

More information

BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION

BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION SHI-LIANG SUN, HONG-LEI SHI Department of Computer Scence and Technology, East Chna Normal Unversty 500 Dongchuan Road, Shangha 200241, P. R. Chna E-MAIL: slsun@cs.ecnu.edu.cn,

More information

An Item-Targeted User Similarity Method for Data Service Recommendation

An Item-Targeted User Similarity Method for Data Service Recommendation 01 IEEE 16th Internatonal Enterprse Dstrbuted Object Computng Conference Workshops An Item-Targeted User Smlarty Method for Data Serce Recommendaton Cheng Zhang, Xaofang Zhao Insttute of Computng Technology

More information

Edge Detection in Noisy Images Using the Support Vector Machines

Edge Detection in Noisy Images Using the Support Vector Machines Edge Detecton n Nosy Images Usng the Support Vector Machnes Hlaro Gómez-Moreno, Saturnno Maldonado-Bascón, Francsco López-Ferreras Sgnal Theory and Communcatons Department. Unversty of Alcalá Crta. Madrd-Barcelona

More information

A Hybrid Collaborative Filtering Model with Deep Structure for Recommender Systems

A Hybrid Collaborative Filtering Model with Deep Structure for Recommender Systems Proceedngs of the Thrty-Frst AAAI Conference on Artfcal Intellgence (AAAI-17) A Hybrd Collaboratve Flterng Model wth Deep Structure for Recommender Systems Xn Dong, Le Yu, Zhonghuo Wu, Yuxa Sun, Lngfeng

More information

Problem Set 3 Solutions

Problem Set 3 Solutions Introducton to Algorthms October 4, 2002 Massachusetts Insttute of Technology 6046J/18410J Professors Erk Demane and Shaf Goldwasser Handout 14 Problem Set 3 Solutons (Exercses were not to be turned n,

More information

Simulation: Solving Dynamic Models ABE 5646 Week 11 Chapter 2, Spring 2010

Simulation: Solving Dynamic Models ABE 5646 Week 11 Chapter 2, Spring 2010 Smulaton: Solvng Dynamc Models ABE 5646 Week Chapter 2, Sprng 200 Week Descrpton Readng Materal Mar 5- Mar 9 Evaluatng [Crop] Models Comparng a model wth data - Graphcal, errors - Measures of agreement

More information

Concurrent Apriori Data Mining Algorithms

Concurrent Apriori Data Mining Algorithms Concurrent Apror Data Mnng Algorthms Vassl Halatchev Department of Electrcal Engneerng and Computer Scence York Unversty, Toronto October 8, 2015 Outlne Why t s mportant Introducton to Assocaton Rule Mnng

More information

Adaptive Transfer Learning

Adaptive Transfer Learning Adaptve Transfer Learnng Bn Cao, Snno Jaln Pan, Yu Zhang, Dt-Yan Yeung, Qang Yang Hong Kong Unversty of Scence and Technology Clear Water Bay, Kowloon, Hong Kong {caobn,snnopan,zhangyu,dyyeung,qyang}@cse.ust.hk

More information

Unsupervised Learning

Unsupervised Learning Pattern Recognton Lecture 8 Outlne Introducton Unsupervsed Learnng Parametrc VS Non-Parametrc Approach Mxture of Denstes Maxmum-Lkelhood Estmates Clusterng Prof. Danel Yeung School of Computer Scence and

More information

A Webpage Similarity Measure for Web Sessions Clustering Using Sequence Alignment

A Webpage Similarity Measure for Web Sessions Clustering Using Sequence Alignment A Webpage Smlarty Measure for Web Sessons Clusterng Usng Sequence Algnment Mozhgan Azmpour-Kv School of Engneerng and Scence Sharf Unversty of Technology, Internatonal Campus Ksh Island, Iran mogan_az@ksh.sharf.edu

More information

CS434a/541a: Pattern Recognition Prof. Olga Veksler. Lecture 15

CS434a/541a: Pattern Recognition Prof. Olga Veksler. Lecture 15 CS434a/541a: Pattern Recognton Prof. Olga Veksler Lecture 15 Today New Topc: Unsupervsed Learnng Supervsed vs. unsupervsed learnng Unsupervsed learnng Net Tme: parametrc unsupervsed learnng Today: nonparametrc

More information

Announcements. Supervised Learning

Announcements. Supervised Learning Announcements See Chapter 5 of Duda, Hart, and Stork. Tutoral by Burge lnked to on web page. Supervsed Learnng Classfcaton wth labeled eamples. Images vectors n hgh-d space. Supervsed Learnng Labeled eamples

More information

2x x l. Module 3: Element Properties Lecture 4: Lagrange and Serendipity Elements

2x x l. Module 3: Element Properties Lecture 4: Lagrange and Serendipity Elements Module 3: Element Propertes Lecture : Lagrange and Serendpty Elements 5 In last lecture note, the nterpolaton functons are derved on the bass of assumed polynomal from Pascal s trangle for the fled varable.

More information

Signed Distance-based Deep Memory Recommender

Signed Distance-based Deep Memory Recommender Sgned Dstance-based Deep Memory Recommender ABSTRACT Personalzed recommendaton algorthms learn a user s preference for an tem, by measurng a dstance/smlarty between them. However, some of exstng recommendaton

More information

3D vector computer graphics

3D vector computer graphics 3D vector computer graphcs Paolo Varagnolo: freelance engneer Padova Aprl 2016 Prvate Practce ----------------------------------- 1. Introducton Vector 3D model representaton n computer graphcs requres

More information

Performance Evaluation of Information Retrieval Systems

Performance Evaluation of Information Retrieval Systems Why System Evaluaton? Performance Evaluaton of Informaton Retreval Systems Many sldes n ths secton are adapted from Prof. Joydeep Ghosh (UT ECE) who n turn adapted them from Prof. Dk Lee (Unv. of Scence

More information

High-Boost Mesh Filtering for 3-D Shape Enhancement

High-Boost Mesh Filtering for 3-D Shape Enhancement Hgh-Boost Mesh Flterng for 3-D Shape Enhancement Hrokazu Yagou Λ Alexander Belyaev y Damng We z Λ y z ; ; Shape Modelng Laboratory, Unversty of Azu, Azu-Wakamatsu 965-8580 Japan y Computer Graphcs Group,

More information

Lecture #15 Lecture Notes

Lecture #15 Lecture Notes Lecture #15 Lecture Notes The ocean water column s very much a 3-D spatal entt and we need to represent that structure n an economcal way to deal wth t n calculatons. We wll dscuss one way to do so, emprcal

More information

SLAM Summer School 2006 Practical 2: SLAM using Monocular Vision

SLAM Summer School 2006 Practical 2: SLAM using Monocular Vision SLAM Summer School 2006 Practcal 2: SLAM usng Monocular Vson Javer Cvera, Unversty of Zaragoza Andrew J. Davson, Imperal College London J.M.M Montel, Unversty of Zaragoza. josemar@unzar.es, jcvera@unzar.es,

More information

Feature Reduction and Selection

Feature Reduction and Selection Feature Reducton and Selecton Dr. Shuang LIANG School of Software Engneerng TongJ Unversty Fall, 2012 Today s Topcs Introducton Problems of Dmensonalty Feature Reducton Statstc methods Prncpal Components

More information

Biostatistics 615/815

Biostatistics 615/815 The E-M Algorthm Bostatstcs 615/815 Lecture 17 Last Lecture: The Smplex Method General method for optmzaton Makes few assumptons about functon Crawls towards mnmum Some recommendatons Multple startng ponts

More information

Available online at Available online at Advanced in Control Engineering and Information Science

Available online at   Available online at   Advanced in Control Engineering and Information Science Avalable onlne at wwwscencedrectcom Avalable onlne at wwwscencedrectcom Proceda Proceda Engneerng Engneerng 00 (2011) 15000 000 (2011) 1642 1646 Proceda Engneerng wwwelsevercom/locate/proceda Advanced

More information

12/2/2009. Announcements. Parametric / Non-parametric. Case-Based Reasoning. Nearest-Neighbor on Images. Nearest-Neighbor Classification

12/2/2009. Announcements. Parametric / Non-parametric. Case-Based Reasoning. Nearest-Neighbor on Images. Nearest-Neighbor Classification Introducton to Artfcal Intellgence V22.0472-001 Fall 2009 Lecture 24: Nearest-Neghbors & Support Vector Machnes Rob Fergus Dept of Computer Scence, Courant Insttute, NYU Sldes from Danel Yeung, John DeNero

More information

A Simple and Efficient Goal Programming Model for Computing of Fuzzy Linear Regression Parameters with Considering Outliers

A Simple and Efficient Goal Programming Model for Computing of Fuzzy Linear Regression Parameters with Considering Outliers 62626262621 Journal of Uncertan Systems Vol.5, No.1, pp.62-71, 211 Onlne at: www.us.org.u A Smple and Effcent Goal Programmng Model for Computng of Fuzzy Lnear Regresson Parameters wth Consderng Outlers

More information

Problem Definitions and Evaluation Criteria for Computational Expensive Optimization

Problem Definitions and Evaluation Criteria for Computational Expensive Optimization Problem efntons and Evaluaton Crtera for Computatonal Expensve Optmzaton B. Lu 1, Q. Chen and Q. Zhang 3, J. J. Lang 4, P. N. Suganthan, B. Y. Qu 6 1 epartment of Computng, Glyndwr Unversty, UK Faclty

More information

A Robust LS-SVM Regression

A Robust LS-SVM Regression PROCEEDIGS OF WORLD ACADEMY OF SCIECE, EGIEERIG AD ECHOLOGY VOLUME 7 AUGUS 5 ISS 37- A Robust LS-SVM Regresson József Valyon, and Gábor Horváth Abstract In comparson to the orgnal SVM, whch nvolves a quadratc

More information

Parallel matrix-vector multiplication

Parallel matrix-vector multiplication Appendx A Parallel matrx-vector multplcaton The reduced transton matrx of the three-dmensonal cage model for gel electrophoress, descrbed n secton 3.2, becomes excessvely large for polymer lengths more

More information

Face Detection with Deep Learning

Face Detection with Deep Learning Face Detecton wth Deep Learnng Yu Shen Yus122@ucsd.edu A13227146 Kuan-We Chen kuc010@ucsd.edu A99045121 Yzhou Hao y3hao@ucsd.edu A98017773 Mn Hsuan Wu mhwu@ucsd.edu A92424998 Abstract The project here

More information

Speeding Up the Xbox Recommender System Using a Euclidean Transformation for Inner-Product Spaces

Speeding Up the Xbox Recommender System Using a Euclidean Transformation for Inner-Product Spaces Speedng Up the Xbox Recommender System Usng a Eucldean Transformaton for Inner-Product Spaces Ran Glad-Bachrach Mcrosoft Research Yoram Bachrach Mcrosoft Research Nr Nce Mcrosoft R&D Lran Katzr Computer

More information

S1 Note. Basis functions.

S1 Note. Basis functions. S1 Note. Bass functons. Contents Types of bass functons...1 The Fourer bass...2 B-splne bass...3 Power and type I error rates wth dfferent numbers of bass functons...4 Table S1. Smulaton results of type

More information

Optimization Methods: Integer Programming Integer Linear Programming 1. Module 7 Lecture Notes 1. Integer Linear Programming

Optimization Methods: Integer Programming Integer Linear Programming 1. Module 7 Lecture Notes 1. Integer Linear Programming Optzaton Methods: Integer Prograng Integer Lnear Prograng Module Lecture Notes Integer Lnear Prograng Introducton In all the prevous lectures n lnear prograng dscussed so far, the desgn varables consdered

More information

Content Based Image Retrieval Using 2-D Discrete Wavelet with Texture Feature with Different Classifiers

Content Based Image Retrieval Using 2-D Discrete Wavelet with Texture Feature with Different Classifiers IOSR Journal of Electroncs and Communcaton Engneerng (IOSR-JECE) e-issn: 78-834,p- ISSN: 78-8735.Volume 9, Issue, Ver. IV (Mar - Apr. 04), PP 0-07 Content Based Image Retreval Usng -D Dscrete Wavelet wth

More information

Classifier Selection Based on Data Complexity Measures *

Classifier Selection Based on Data Complexity Measures * Classfer Selecton Based on Data Complexty Measures * Edth Hernández-Reyes, J.A. Carrasco-Ochoa, and J.Fco. Martínez-Trndad Natonal Insttute for Astrophyscs, Optcs and Electroncs, Lus Enrque Erro No.1 Sta.

More information

CAN COMPUTERS LEARN FASTER? Seyda Ertekin Computer Science & Engineering The Pennsylvania State University

CAN COMPUTERS LEARN FASTER? Seyda Ertekin Computer Science & Engineering The Pennsylvania State University CAN COMPUTERS LEARN FASTER? Seyda Ertekn Computer Scence & Engneerng The Pennsylvana State Unversty sertekn@cse.psu.edu ABSTRACT Ever snce computers were nvented, manknd wondered whether they mght be made

More information

A Fast Content-Based Multimedia Retrieval Technique Using Compressed Data

A Fast Content-Based Multimedia Retrieval Technique Using Compressed Data A Fast Content-Based Multmeda Retreval Technque Usng Compressed Data Borko Furht and Pornvt Saksobhavvat NSF Multmeda Laboratory Florda Atlantc Unversty, Boca Raton, Florda 3343 ABSTRACT In ths paper,

More information

Report on On-line Graph Coloring

Report on On-line Graph Coloring 2003 Fall Semester Comp 670K Onlne Algorthm Report on LO Yuet Me (00086365) cndylo@ust.hk Abstract Onlne algorthm deals wth data that has no future nformaton. Lots of examples demonstrate that onlne algorthm

More information

Virtual Machine Migration based on Trust Measurement of Computer Node

Virtual Machine Migration based on Trust Measurement of Computer Node Appled Mechancs and Materals Onlne: 2014-04-04 ISSN: 1662-7482, Vols. 536-537, pp 678-682 do:10.4028/www.scentfc.net/amm.536-537.678 2014 Trans Tech Publcatons, Swtzerland Vrtual Machne Mgraton based on

More information

Array transposition in CUDA shared memory

Array transposition in CUDA shared memory Array transposton n CUDA shared memory Mke Gles February 19, 2014 Abstract Ths short note s nspred by some code wrtten by Jeremy Appleyard for the transposton of data through shared memory. I had some

More information

Ontology Generator from Relational Database Based on Jena

Ontology Generator from Relational Database Based on Jena Computer and Informaton Scence Vol. 3, No. 2; May 2010 Ontology Generator from Relatonal Database Based on Jena Shufeng Zhou (Correspondng author) College of Mathematcs Scence, Laocheng Unversty No.34

More information

Type-2 Fuzzy Non-uniform Rational B-spline Model with Type-2 Fuzzy Data

Type-2 Fuzzy Non-uniform Rational B-spline Model with Type-2 Fuzzy Data Malaysan Journal of Mathematcal Scences 11(S) Aprl : 35 46 (2017) Specal Issue: The 2nd Internatonal Conference and Workshop on Mathematcal Analyss (ICWOMA 2016) MALAYSIAN JOURNAL OF MATHEMATICAL SCIENCES

More information

Signature and Lexicon Pruning Techniques

Signature and Lexicon Pruning Techniques Sgnature and Lexcon Prunng Technques Srnvas Palla, Hansheng Le, Venu Govndaraju Centre for Unfed Bometrcs and Sensors Unversty at Buffalo {spalla2, hle, govnd}@cedar.buffalo.edu Abstract Handwrtten word

More information

A Parallel and Efficient Algorithm for Learning to Match

A Parallel and Efficient Algorithm for Learning to Match A Parallel and Effcent Algorthm for Learnng to Match Jngbo Shang 1,, Tanq Chen, Hang L 3, Zhengdong Lu 3, Yong Yu 1 Unversty of Illnos at Urbana Champagn, IL, USA shang7@llnos.edu Unversty of Washngton,

More information