LOCALIZING USERS AND ITEMS FROM PAIRED COMPARISONS
Matthew R. O'Shaughnessy and Mark A. Davenport

2016 IEEE INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING, SEPT. 2016, SALERNO, ITALY

Georgia Institute of Technology, School of Electrical and Computer Engineering
{moshaughnessy6, mdav}@gatech.edu

ABSTRACT

Suppose that we wish to determine an embedding of points given only paired comparisons of the form "user x prefers item q_i to item q_j." Such observations arise in a variety of contexts, including applications such as recommendation systems, targeted advertisement, and psychological studies. In this paper we first present an optimization-based framework for localizing new users and items when an existing embedding is known. We demonstrate that user localization can be formulated as a simple constrained quadratic program, and that although item localization produces a set of non-convex constraints, we can make the problem convex by strategically combining comparisons to produce a set of convex linear constraints. Finally, we show that by iteratively applying this method to every user and item, we can recover an accurate embedding, allowing us to iteratively improve a given embedding or even generate an embedding from scratch.

Index Terms: localization, paired comparisons, nonmetric multidimensional scaling, ideal point models, recommendation systems

(M. O'Shaughnessy was supported by the Georgia Tech President's Undergraduate Research Award program. M. Davenport was supported by grants NRL N C001, AFOSR FA, NSF CCF, CCF, and CMMI.)

1. INTRODUCTION

In this paper we consider several problems related to learning an embedding of points from paired comparisons of the form "x is closer to q_i than q_j," where x, q_i, q_j \in R^n correspond to points whose locations we would like to estimate. Observations of this form arise in a variety of contexts, but a particularly important class of applications involves recommendation systems, targeted advertisement, and psychological studies, where x represents an "ideal point" that models a particular user's preferences and q_i and q_j represent items that the user compares. In this model, which dates back at least to [1], items which are close to x are those most preferred by the user. Paired comparisons arise naturally in this context since precise numerical scores quantifying a user's preference are generally much more difficult to assign than comparative judgments [2]. Moreover, data consisting of paired comparisons is often generated implicitly in contexts where the user has the option to act on two (or more) alternatives; for instance, they may choose to watch a particular movie or click a particular advertisement out of those displayed to them [3]. In such contexts, the true distances in the ideal point model are generally inaccessible in any direct way, but it is nevertheless still possible to learn useful information from such data.

[Fig. 1: Set of linear constraints induced by the binary paired comparisons. Shading represents the feasible region resulting from the set of comparisons.]

Some of the simplest questions in this model arise in the setting where we are given an existing embedding of users/items (as in a mature recommender system) and consider the on-line problem of localizing a single new user or item based on paired comparisons. (We will return later to the problem of how this initial embedding might be generated.) In this context, there are two main problems of interest: localizing users and localizing items.
In the case of localizing a user, [4] proposes a relatively simple approach (which we review in greater detail in Section 2) which builds on the observation that each comparison provided by a user with ideal point x divides the space R^n in half and tells us which side of a hyperplane the point x lies in. As we will see in Section 3, however, localizing a new item from the same kind of measurements is a bit more complicated; in particular, the comparisons now give rise to a set of non-convex constraints. However, we will demonstrate how to relax this problem to a
convex one that is extremely similar to the approach we take for localizing users, and which we demonstrate in Section 4 to be highly effective in practice. In Section 5 we return to the problem of generating an embedding of users/items in the first place. One approach would be to generate an item embedding independently using a variety of methods, such as multidimensional scaling applied to a set of item features, and then localize each user using the approach described above. A more robust approach that uses only paired comparisons is described in [5], which proposes a general-purpose algorithm for nonmetric multidimensional scaling based on semidefinite programming. Unfortunately, this algorithm scales poorly with the total number of users/items. In this paper we consider an alternative and highly scalable scheme based on the iterative application of the algorithms described in Sections 2 and 3, which we demonstrate to be surprisingly effective.

2. LOCALIZING A USER

We first review the method of localizing a new user (given an existing embedding of items) presented in [4]. Each binary paired comparison of the form "user x prefers item q_i^{(k)} to item q_j^{(k)}," which we denote \|x - q_i^{(k)}\|_2^2 \le \|x - q_j^{(k)}\|_2^2, is represented in the optimization problem as a constraint. Here, q_i refers to the item that is preferred (over q_j), and the superscript (k) indicates that the two items are involved in the k-th comparison. In the objective function of our optimization problem, we minimize the total magnitude of comparison violations (expressed in the vector of slack variables \xi) plus a regularization term:

    minimize_{x, \xi}   (1/2) \|x - x_0\|_2^2 + \sum_k c_k \xi_k
    subject to          \|x - q_i^{(k)}\|_2^2 \le \|x - q_j^{(k)}\|_2^2 + \xi_k,   \xi_k \ge 0.                (1)

The values {c_k} represent how heavily violations of each comparison are penalized relative to each other and to the regularization term; adjusting {c_k} is a natural way to represent the relative confidence we have in the accuracy of each comparison. In the objective function, x_0 represents an initial estimate of the ideal point for the user to be localized (in the case where we do not have a previous estimate of x, we initialize x_0 to the origin or the center of the embedding of items). The regularization term \|x - x_0\|_2^2 serves three purposes: (i) it constrains x when the set of comparisons does not result in a completely bounded feasible region; (ii) it regularizes the solution to avoid sensitivity to outliers; and (iii) it allows us to bias the solution towards a pre-existing estimate, which can be very helpful when such an estimate exists and our comparisons are noisy or small in number.

While perhaps not initially obvious, the optimization problem in (1) is a standard quadratic program with linear constraints. To see this, one needs to simplify the constraint, eliminating the \|x\|_2^2 term from both sides and rearranging to obtain the linear constraint

    2 (q_j^{(k)} - q_i^{(k)})^T x + \|q_i^{(k)}\|_2^2 - \|q_j^{(k)}\|_2^2 - \xi_k \le 0.                (2)

From this form, we can see that each linear constraint resulting from a binary paired comparison defines a half-space in R^n where the localized user x lives. Figure 1 shows a geometric interpretation of the feasible region created by several comparisons. See [4] for further details. From here we can simplify the optimization problem by assigning the vector 2(q_j^{(k)} - q_i^{(k)})^T from each comparison to a row of the matrix A and the scalar \|q_i^{(k)}\|_2^2 - \|q_j^{(k)}\|_2^2 to an entry of the column vector b. With this notation, the optimization problem takes its final form:

    minimize_{x, \xi}   (1/2) \|x - x_0\|_2^2 + c^T \xi
    subject to          A x + b - \xi \le 0,   \xi \ge 0,                (3)

where the inequality sign in both constraints is applied element-wise. This is a standard quadratic program, and is solvable using a convex optimization software package such as CVX [6, 7].
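Since (3) is a small linearly constrained QP, any off-the-shelf convex solver handles it. As a concrete illustration, the following is a minimal sketch of the user-localization step written in Python with cvxpy (the paper itself uses CVX in MATLAB); the function and argument names (localize_user, items, prefs, x0, c) are illustrative and not from the paper.

# A minimal sketch of the user-localization QP in (3), assuming cvxpy is
# available; all names here are illustrative, not from the paper.
import numpy as np
import cvxpy as cp

def localize_user(items, prefs, x0, c):
    """Estimate a user's ideal point from binary paired comparisons.

    items : (m, n) array of item locations q_1, ..., q_m
    prefs : list of (i, j) pairs meaning "item i preferred to item j"
    x0    : (n,) initial estimate of the ideal point
    c     : (K,) penalty weights, one per comparison
    """
    n = items.shape[1]
    # Build A and b so comparison k gives the half-space constraint A_k x + b_k <= xi_k.
    A = np.array([2.0 * (items[j] - items[i]) for i, j in prefs])
    b = np.array([items[i] @ items[i] - items[j] @ items[j] for i, j in prefs])

    x = cp.Variable(n)
    xi = cp.Variable(len(prefs), nonneg=True)
    objective = cp.Minimize(0.5 * cp.sum_squares(x - x0) + c @ xi)
    constraints = [A @ x + b - xi <= 0]
    cp.Problem(objective, constraints).solve()
    return x.value

For a single new user this problem has only n variables for the ideal point plus one slack per comparison, which is what makes the on-line setting practical.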
3. LOCALIZING AN ITEM

To localize a new item given an existing embedding of users/items, we begin by constructing an optimization problem similar to (1). However, when localizing a new item, it is useful to divide the constraints into two groups: (i) those deriving from comparisons in which the item to be localized, q, is the item preferred to some alternative q_j^{(k)}, and (ii) those for which q represents the item preferred less than some q_i^{(k)}. We will let T^+ and T^- index these two sets of constraints in our optimization problem as follows:

    minimize_{q, \xi}   (1/2) \|q - q_0\|_2^2 + \sum_k c_k \xi_k
    subject to          \|x^{(k)} - q\|_2^2 \le \|x^{(k)} - q_j^{(k)}\|_2^2 + \xi_k,   k \in T^+                (4)
                        \|x^{(k)} - q_i^{(k)}\|_2^2 \le \|x^{(k)} - q\|_2^2 + \xi_k,   k \in T^-
                        \xi_k \ge 0.

As illustrated geometrically in Figure 2, this set of constraints makes the problem nonconvex. On the one hand, each constraint in T^+ (where q is the preferred item) defines a hypersphere within which the localized item must lie. Constraints of this type are convex (but quadratic, in contrast to the linear constraints that arise when localizing a user). On the other hand, each constraint in T^- (where q is the less preferred item) defines a hypersphere within which the localized item cannot lie. The inclusion of the constraints in T^- makes the optimization problem in (4) nonconvex.
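To see concretely why only the constraints in T^- break convexity, it helps to expand both constraint types and cancel the \|x^{(k)}\|_2^2 terms; the short derivation below is a sketch consistent with (4) rather than a display taken from the paper.

\begin{align*}
k \in T^{+}:\quad & \|q\|_2^2 - 2\,(x^{(k)})^{T} q \le \|x^{(k)} - q_j^{(k)}\|_2^2 - \|x^{(k)}\|_2^2 + \xi_k
  && \text{($q$ inside a ball: convex)}\\[2pt]
k \in T^{-}:\quad & \|q\|_2^2 - 2\,(x^{(k)})^{T} q \ge \|x^{(k)} - q_i^{(k)}\|_2^2 - \|x^{(k)}\|_2^2 - \xi_k
  && \text{($q$ outside a ball: nonconvex)}
\end{align*}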

[Fig. 2: Feasible region in R^2 for (a) comparisons where the item to be localized is the preferred item (convex feasible region, shaded), and (b) comparisons where the item to be localized is the less preferred item (non-convex feasible region, shaded).]

To make the optimization problem convex, we consider one possible convex relaxation: we combine pairs of comparisons by simply adding them, creating a single linear constraint from two quadratic constraints. While other convex relaxations are possible, we will see that this simple method results in linear constraints, leads to a computationally efficient algorithm, and produces good results in practice. To describe our procedure, we will suppose that q denotes the item that we wish to update. We can combine constraints by adding them to produce useful linear constraints when the pair is of the form

    \|x^{(k)} - q\|_2^2 \le \|x^{(k)} - q_j^{(k)}\|_2^2,   \|x^{(l)} - q_i^{(l)}\|_2^2 \le \|x^{(l)} - q\|_2^2,                (5)

and x^{(k)} \ne x^{(l)}. The requirement that q appear on opposite sides of the two inequalities ensures the quadratic terms involving q cancel when adding, and x^{(k)} \ne x^{(l)} is necessary to avoid making the linear term in the resulting constraint zero and the constraint vacuous. For each inequality, the side not containing the item that we wish to localize (q) is a constant, which we denote c = \|x^{(k)} - q_j^{(k)}\|_2^2 and d = \|x^{(l)} - q_i^{(l)}\|_2^2. We then combine the inequalities as

    \|x^{(k)} - q\|_2^2 - \|x^{(l)} - q\|_2^2 \le c - d.                (6)

Expanding the norm in each inequality, we note that combining the comparisons results in the cancellation of the quadratic term \|q\|_2^2, simplifying the pair of more computationally expensive quadratic constraints into a single linear one. Adding and simplifying yields

    2 (x^{(l)} - x^{(k)})^T q \le c - d + \|x^{(l)}\|_2^2 - \|x^{(k)}\|_2^2.                (7)

[Fig. 3: Linear constraint generated by the combination of two quadratic constraints. Gray shading represents the feasible regions resulting from each of the two paired comparisons; blue hatched shading represents the feasible region defined by the hyperplane resulting from the combination of two comparisons. Note that the linear relaxation results in a much larger feasible region.]

We can substitute these constraints into (4) to obtain a linearly-constrained quadratic problem that has the same form as the user localization optimization problem in (3), with each 2(x^{(k)} - x^{(l)})^T assigned to a row of A and each c - d + \|x^{(l)}\|_2^2 - \|x^{(k)}\|_2^2 assigned to an entry of b. Figure 3 shows a geometric interpretation of the linear constraint induced by a combination of the two quadratic constraints. It is important to note that while the linear relaxation obtained by this procedure results in a much larger feasible region (in particular, one that is unbounded), the combination of many such pairs of constraints may still be an effective simplification of the original set of nonconvex constraints.

Note that if we have a large number of comparisons, the number of valid combinations |T^+| |T^-| could be extremely large. In this case, we simply choose a subset of m combinations at random, although more intelligent choices are likely possible (particularly in real-world applications, where the distribution of users and items present in comparisons will be highly uneven). It is also important to note that only pairs of comparisons that produce a hyperplane like the one shown in Figure 3, where the hyperplane splits x^{(k)} and x^{(l)}, are combined in this way. To select only combinations with this behavior, we also require that either a^T x^{(k)} < b < a^T x^{(l)} or a^T x^{(l)} < b < a^T x^{(k)} for each combination, where a^T and b denote the row of A and entry of b corresponding to that combination.
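The pairing step above is mostly bookkeeping. The following minimal sketch shows one way it could be implemented, continuing the Python setup assumed earlier; the names (combine_constraints, users, pos, neg, max_combos) are illustrative rather than from the paper, and the resulting (A, b) can be fed to the same kind of QP used for user localization.

# A minimal sketch of assembling the relaxed (linear) item-localization
# constraints from randomly chosen (T+, T-) pairs, as described above.
# All names are illustrative, not from the paper.
import numpy as np

def combine_constraints(users, pos, neg, max_combos, seed=0):
    """Return (A, b) encoding A q + b <= 0 for the new item location q.

    users : (N, n) array of user ideal points x^(1), ..., x^(N)
    pos   : list of (k, c) -- user k preferred the new item; c = ||x^(k) - q_j^(k)||_2^2
    neg   : list of (l, d) -- user l preferred another item;  d = ||x^(l) - q_i^(l)||_2^2
    """
    rng = np.random.default_rng(seed)
    rows, offsets = [], []
    for _ in range(50 * max_combos):          # bounded number of random draws
        if len(rows) == max_combos:
            break
        k, c = pos[rng.integers(len(pos))]
        l, d = neg[rng.integers(len(neg))]
        if k == l:                            # need distinct users, x^(k) != x^(l)
            continue
        xk, xl = users[k], users[l]
        a = 2.0 * (xl - xk)                   # linear term left after the quadratic terms cancel
        rhs = c - d + xl @ xl - xk @ xk       # constant term, as in (7)
        # keep only pairs whose hyperplane a^T q = rhs separates x^(k) from x^(l)
        if (a @ xk - rhs) * (a @ xl - rhs) >= 0:
            continue
        rows.append(a)
        offsets.append(-rhs)                  # so each constraint reads a^T q + b <= 0
    return np.array(rows), np.array(offsets)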
4. SIMULATIONS

In this section, we demonstrate the performance of these approaches to localizing a new user or item. For now we assume a priori knowledge of an existing embedding of users/items, and use our list of paired comparisons to localize a new user or item in the embedding. In each simulation, a ground truth embedding of many users and items is generated uniformly on the unit square in R^n. Comparisons are generated by randomly selecting a user x^{(k)} and two items q_1 and q_2, with q_1 \ne q_2. For each comparison, q_1 and q_2 are assigned to q_i^{(k)} and q_j^{(k)} so that the comparison is consistent with \|x^{(k)} - q_i^{(k)}\|_2 \le \|x^{(k)} - q_j^{(k)}\|_2.
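For concreteness, the synthetic setup just described can be written in a few lines; the sketch below uses illustrative names and arguments (make_comparisons, n_users, n_items, and so on) and is not code from the paper.

# A minimal sketch of the synthetic-data setup described above; all names
# and sizes are illustrative rather than taken from the paper.
import numpy as np

def make_comparisons(n_users, n_items, n_comparisons, dim, seed=0):
    rng = np.random.default_rng(seed)
    users = rng.uniform(0.0, 1.0, size=(n_users, dim))   # ground-truth ideal points
    items = rng.uniform(0.0, 1.0, size=(n_items, dim))   # ground-truth item locations

    comparisons = []   # each entry: (user index k, preferred item i, less-preferred item j)
    for _ in range(n_comparisons):
        k = rng.integers(n_users)
        i, j = rng.choice(n_items, size=2, replace=False)
        # order the pair so that ||x^(k) - q_i|| <= ||x^(k) - q_j||
        if np.linalg.norm(users[k] - items[i]) > np.linalg.norm(users[k] - items[j]):
            i, j = j, i
        comparisons.append((k, i, j))
    return users, items, comparisons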

[Fig. 4: Localization error measured in terms of l_2 distance from ground truth as a function of the total number of comparisons available to the system, which contains 100 users and 100 items. (a) User localization error. (b) Item localization error.]

In each of the following simulations, our ground truth embedding consists of 100 users and 100 items. Comparisons are generated uniformly at random from the complete set of users and items, and performance is measured by computing the l_2 distance between the estimated point and the ground truth. We repeat each simulation 30 times and report the mean and standard deviation.

In the first simulation, we demonstrate the accuracy with which we can localize a new user based on the number of comparisons we have available. As shown in Figure 4(a), a very small number of comparisons involving the localized user results in a large error, but this error drops off rapidly with an increasing number of comparisons. Note that in this paper we consider comparisons chosen randomly; similar performance could be obtained with a much smaller number of comparisons if the user and items to compare were adaptively selected (see, for example, the approaches in [8, 9, 10]). Figure 4(a) also shows the large impact of dimensionality on the l_2 recovery error. This is expected; on the unit square in R^n, the maximum possible error grows as \sqrt{n}. These results are consistent with recent theoretical results for this problem established in [10, 11].

Next, we apply the item localization procedure described in Section 3 to localize a new item given an existing embedding of users/items. As noted in the previous section, we select a subset of comparison combinations at random from the (possibly large) set of valid combinations; here, we pick m such combinations. In Figure 4(b), we show that our convex relaxation of adding comparisons allows us to localize a new item with accuracy comparable to that of the user localization problem. This may be somewhat surprising since our procedure for generating a convex set of constraints seems to discard a significant amount of information.

5. GENERATING A COMPLETE EMBEDDING

Using the procedures described in Sections 2 and 3, we can localize a new user or item into an embedding of users and items given only a list of paired comparisons. However, our localization procedures can also be used to improve an existing embedding using a list of paired comparisons. When presented with an existing embedding, we can iterate through each point and use our localization procedure to improve its accuracy. Because the accuracy with which we can localize a new point depends on the accuracy of every other point in the embedding, each update can improve the current point as well as the results of future iterations. The iterative localization procedure is as follows: (i) create an initial embedding (if no initial embedding is available, initialize each point at random); (ii) iterate through each user and item in the embedding, updating the user/item's location at each step by solving the optimization problem from Section 2 or 3 as appropriate, and repeat this iteration until convergence. Using this procedure, we can use a list of paired comparisons to improve the accuracy of a noisy embedding of users and items. Further, even if we do not have an initial estimate of the embedding, we can use this procedure to generate one from scratch with surprising accuracy.
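A compact way to express this procedure is as a coordinate-descent-style loop over all points. The sketch below assumes a localize_user routine like the earlier sketch and an analogous localize_item routine (combine_constraints followed by the same kind of QP solve); the names, pass count, and convergence test are illustrative, not from the paper.

# A minimal sketch of the iterative refinement loop of Section 5, assuming
# localize_user(...) and a hypothetical localize_item(...) wrapper exist.
import numpy as np

def refine_embedding(users, items, comparisons, n_passes=10, tol=1e-4):
    """comparisons : list of (k, i, j) -- user k prefers item i to item j."""
    users, items = users.copy(), items.copy()
    for _ in range(n_passes):
        prev = np.vstack([users, items])
        # Re-localize every user from the comparisons it participates in.
        for k in range(len(users)):
            prefs = [(i, j) for (u, i, j) in comparisons if u == k]
            if prefs:
                users[k] = localize_user(items, prefs, x0=users[k],
                                         c=np.ones(len(prefs)))
        # Re-localize every item from the comparisons in which it appears.
        for m in range(len(items)):
            pos = [(u, np.sum((users[u] - items[j]) ** 2))
                   for (u, i, j) in comparisons if i == m]   # item m preferred
            neg = [(u, np.sum((users[u] - items[i]) ** 2))
                   for (u, i, j) in comparisons if j == m]   # item m less preferred
            if pos and neg:
                items[m] = localize_item(users, pos, neg, q0=items[m])
        if np.linalg.norm(np.vstack([users, items]) - prev) < tol:
            break
    return users, items

Because each update touches only one point, the user sweep and the item sweep can each be parallelized across points, which is the property the paper highlights for large systems.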
Improving an existing embedding. To evaluate the potential of this approach, we begin by generating comparisons between users and items as in the experiments in Section 4, but after generating the comparisons we perturb every user and item in the embedding with Gaussian noise. Figure 5 shows the reduction in error our procedure can achieve for three different perturbation noise levels (\sigma_{emb}). We show performance both in terms of the l_2 distance to the ground truth (across all points in the embedding, after applying an affine transformation, also known as the Procrustes distance) and in terms of the percentage of observed comparisons which are violated by the learned embedding.
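For reference, one common way to compute an alignment-invariant recovery error of this kind is a Procrustes fit (translation, rotation, and isotropic scaling) followed by the mean l_2 distance; the sketch below is one reasonable reading of the metric described above, not the paper's exact evaluation code, and all names are illustrative.

# A minimal sketch of a Procrustes-style recovery-error metric.
import numpy as np

def procrustes_error(estimate, truth):
    """estimate, truth : (N, n) arrays of corresponding points."""
    est = estimate - estimate.mean(axis=0)
    tru = truth - truth.mean(axis=0)
    # Optimal rotation (and reflection) from the SVD of the cross-covariance.
    u, s, vt = np.linalg.svd(est.T @ tru)
    rotation = u @ vt
    # Optimal isotropic scaling for the rotated point set.
    scale = s.sum() / np.sum(est ** 2)
    aligned = scale * est @ rotation
    return np.mean(np.linalg.norm(aligned - tru, axis=1))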

[Fig. 5: Reduction in noise achieved for a noisy embedding in R^2, measured in terms of percent of all comparisons violated and mean l_2 recovery error calculated using the Procrustes distance.]

Given a suitable number of iterations, we observe that we can reduce the error in the embedding to approximately the same level, regardless of the perturbation noise level.

Generating an embedding from scratch. Our iterative procedure is surprisingly effective at denoising an embedding, even when it is corrupted with enough noise to violate up to 35% of the given comparisons. This suggests that it may be possible to obtain similar results when we have no a priori knowledge of the embedding at all, generating an embedding from scratch. In this simulation, we again create a ground truth embedding of users and items distributed uniformly on the unit square in R^n. We use this ground truth embedding to generate a list of binary paired comparisons, then initialize the reconstructed embedding to a random set of points and use only the list of comparisons to recreate the embedding. As in the previous experiments, we measure the accuracy of our reconstruction using the Procrustes distance between the reconstruction and the original embedding. Figure 6 shows that even with a random initial embedding (approximately 50% of comparisons violated), we can generate an embedding with only a modest level of error in only a few passes over the entire dataset. While the mean l_2 recovery error in Figure 6 shows error increasing with dimensionality, the error does not grow as quickly as the scaling of distances with dimension (which increases as \sqrt{n}).

6. DISCUSSION

In this work, we have extended the optimization-based method in [4] for localizing a new user into an embedding of users/items using a set of binary paired comparisons. This paper has two main contributions: first, we derive a convex relaxation which allows us to use a similar procedure for localizing items into an embedding. Next, we develop a method for iteratively improving a noisy embedding or generating an embedding from scratch using only a set of paired comparisons. Simulations on synthetically generated data show that given a suitable number of comparisons, we can localize new users and items extremely accurately and can generate an embedding with good accuracy, even in high dimensions. The algorithm is computationally efficient and easily parallelizable: localizing a single point only requires solving a simple linearly-constrained quadratic program, and generating an approximation of the entire embedding only requires a few iterations through each user/item in the embedding (which can be performed in parallel). This efficiency is key, because in real-world recommendation systems the number of users and items may be extremely large. This algorithm scales far better in terms of both computation and memory requirements than existing procedures based on semidefinite programming such as generalized non-metric multidimensional scaling [5]. However, the performance of our algorithm might be significantly improved by accurate knowledge (or an estimate) of several landmark points, which could be generated by such a procedure. Finally, we briefly note that the ideal point model described in this paper, while closely related, is distinct from the low-rank model used in matrix completion approaches which have recently gained much attention, e.g., [12, 13]. Although both models suppose preferences are guided by a small number of factors, the ideal point model leads to preferences that are non-monotonic functions of those attributes.
There is empirical evidence that the ideal point model captures user behavior more accurately than factorization-based approaches do [14]. Nevertheless, the results in [15, 16, 17, 18], which consider binary observations, paired comparisons, and other more general ordinal measurements in the low-rank context, share a similar inspiration to this work.

[Fig. 6: Recovery error for a generated embedding measured in terms of percent of all comparisons violated and mean l_2 recovery error calculated using the Procrustes distance.]

7. REFERENCES

[1] C. Coombs, "Psychological scaling without a unit of measurement," Psych. Rev., vol. 57, no. 3.
[2] H. David, The Method of Paired Comparisons, London, UK: Charles Griffin & Company Limited.
[3] F. Radlinski and T. Joachims, "Active exploration for learning rankings from clickthrough data," in Proc. ACM SIGKDD Int. Conf. on Knowledge, Discovery, and Data Mining (KDD), San Jose, CA, Aug. 2007.
[4] M. Davenport, "Lost without a compass: Nonmetric triangulation and landmark multidimensional scaling," in Proc. IEEE Int. Work. Comput. Adv. Multi-Sensor Adaptive Processing (CAMSAP), Saint Martin, Dec.
[5] S. Agarwal, J. Wills, L. Cayton, G. Lanckriet, D. Kriegman, and S. Belongie, "Generalized non-metric multidimensional scaling," in Proc. Int. Conf. Art. Intell. Stat. (AISTATS), San Juan, Puerto Rico, Mar.
[6] M. Grant and S. Boyd, "CVX: Matlab software for disciplined convex programming, version 2.1," Mar.
[7] M. Grant and S. Boyd, "Graph implementations for nonsmooth convex programs," in Recent Advances in Learning and Control, V. Blondel, S. Boyd, and H. Kimura, Eds., Lecture Notes in Control and Information Sciences, Springer-Verlag Limited, 2008, boyd/graph_dcp.html.
[8] K. Jamieson and R. Nowak, "Active ranking using pairwise comparisons," in Proc. Adv. in Neural Processing Systems (NIPS), Granada, Spain, Dec.
[9] K. Jamieson and R. Nowak, "Low-dimensional embedding using adaptively selected ordinal data," in Proc. Allerton Conf. Communication, Control, and Computing, Monticello, IL.
[10] A. Massimino and M. Davenport, "As you like it: Localization via paired comparisons," Preprint.
[11] A. Massimino and M. Davenport, "Binary stable embedding via paired comparisons," in Proc. IEEE Work. Stat. Signal Processing, Palma de Mallorca, Spain, June.
[12] J. Rennie and N. Srebro, "Fast maximum margin matrix factorization for collaborative prediction," in Proc. Int. Conf. Machine Learning, Bonn, Germany, Aug.
[13] E. Candès and B. Recht, "Exact matrix completion via convex optimization," Found. Comput. Math., vol. 9, no. 6.
[14] A. Maydeu-Olivares and U. Böckenholt, "Modeling preference data," The SAGE Handbook of Quantitative Methods in Psychology.
[15] M. Davenport, Y. Plan, E. van den Berg, and M. Wootters, "1-bit matrix completion," Inf. Inference, vol. 3, no. 3.
[16] Y. Lu and S. Negahban, "Individualized rank aggregation using nuclear norm regularization," arXiv preprint.
[17] D. Park, J. Neeman, J. Zhang, S. Sanghavi, and I. Dhillon, "Preference completion: Large-scale collaborative ranking from pairwise comparisons," in Proc. Int. Conf. Machine Learning, Lille, France, July.
[18] S. Oh, K. Thekumparampil, and J. Xu, "Collaboratively learning preferences from ordinal data," in Proc. Adv. in Neural Processing Systems (NIPS), Montréal, Québec, Dec.
