Fast and Efficient Incremental Learning for High-dimensional Movement Systems


Vijayakumar, S., Schaal, S. (2000). Fast and efficient incremental learning for high-dimensional movement systems, International Conference on Robotics and Automation (ICRA2000). San Francisco, April 2000.

Fast and Efficient Incremental Learning for High-dimensional Movement Systems

Sethu Vijayakumar* sethu@brain.riken.go.jp
Stefan Schaal sschaal@usc.edu

*Laboratory for Information Synthesis, Riken Brain Science Research Institute, Wako, Saitama, Japan
Computer Science and Neuroscience, HNB-103, Univ. of Southern California, Los Angeles, CA
Kawato Dynamic Brain Project (ERATO/JST), 2-2 Hikaridai, Seika-cho, Soraku-gun, 619-02 Kyoto, Japan

Abstract: We introduce a new algorithm, Locally Weighted Projection Regression (LWPR), for incremental real-time learning of nonlinear functions, particularly useful for problems of autonomous real-time robot control that require internal models of dynamics, kinematics, or other functions. At its core, LWPR uses locally linear models, spanned by a small number of univariate regressions in selected directions in input space, to achieve piecewise linear function approximation. The most outstanding properties of LWPR are that it i) learns rapidly with second order learning methods based on incremental training, ii) uses statistically sound stochastic cross validation to learn, iii) adjusts its local weighting kernels based on only local information to avoid interference problems, iv) has a computational complexity that is linear in the number of inputs, and v) can deal with a large number of possibly redundant and/or irrelevant inputs, as shown in evaluations with up to 50-dimensional data sets for learning the inverse dynamics of an anthropomorphic robot arm. To our knowledge, this is the first incremental neural network learning method to combine all these properties, and it is well suited for complex on-line learning problems in robotics.

1 Introduction

Motor control of complex movement systems requires knowledge of a variety of continuous valued functions, for instance coordinate transformations of the manipulator kinematics and models of the forward or inverse dynamics.
Whenever analytical methods are not available to derive these functions, e.g., as is frequently the case in lightweight and complex (humanoid) dexterous robots (e.g., Figure 1), learning approaches need to be employed to find approximate solutions. However, function approximation for high dimensional nonlinear motor systems remains a nontrivial problem. An ideal algorithm for such tasks needs to eliminate redundancy in the input data, detect irrelevant input dimensions, keep the computational complexity less than quadratic in the number of input dimensions, and, of course, achieve accurate function approximation and generalization. In this paper, we suggest accomplishing these goals with techniques of projection regression. The key idea of projection regression is to cope with the complexities of high dimensional function approximation by decomposing the regression into a sequence of one-dimensional localized regressions along particular directions in input space. The major difficulty of projection regression is how to select efficient projections, i.e., how to achieve the best fitting result with as few one-dimensional regressions as possible.

Figure 1: Humanoid robot in our laboratory

Previous work in the learning literature has focused on finding good global projections for fitting nonlinear one-dimensional functions. Among the best known algorithms is projection pursuit regression ([1]), and its generalization in the form of Generalized Additive Models ([2]). Sigmoidal neural networks can equally be conceived of as a method

of projection regression, in particular when new projections are added sequentially, e.g., as in Cascade Correlation ([3]). Here we suggest an alternative method of projection regression, focusing on finding efficient local projections. Local projections can be used to accomplish local function approximation in the neighborhood of a query point. Such methods allow fitting locally simple functions, e.g., low order polynomials, along the projection, which greatly simplifies the function approximation problem. Local projection regression can thus borrow most of its statistical properties from the well-established methods of locally weighted learning and nonparametric regression ([4], [5]). Counterintuitive to the curse of dimensionality ([6]), local regression methods can work successfully in high dimensional spaces ([7]), as we will empirically demonstrate below.

In the next section, we will first motivate why function approximation in high dimensions is complicated, and why there is hope that the curse of dimensionality is not really a problem for high-dimensional movement systems. Second, we will introduce our new learning algorithm, Locally Weighted Projection Regression (LWPR), which can efficiently deal with high-dimensional learning problems. In the last section, we will show learning results on synthetic data and on data from a Sarcos Dexterous Robot, an anthropomorphic robot arm, whose inverse dynamics model was learned by our algorithm, even when contaminated with irrelevant and redundant inputs.

2 The Curse of Dimensionality

Both in research on biological and robotic motor control, the need for internal models has been emphasized in order to achieve accurate control of fast movements ([8]; [9]). For models with only few input dimensions, learning approaches have been quite successful (e.g., [10]; [11]; [12]). However, for higher dimensional learning tasks, it has been unclear whether learning approaches can succeed. For instance, we are interested in learning the inverse dynamics model of a humanoid robot (Figure 1).
The robot has 30 degrees-of-freedom (DOFs), amounting to a 90-dimensional input space to the inverse dynamics function (30 position, 30 velocity, and 30 acceleration states). A rigid body dynamics model performs poorly for this hydraulically actuated lightweight system. Depending on the nonlinear activation function employed in the network units, two categories of neural networks are available for such a task. The traditional sigmoidal activation function, or any other function with unbounded support, tries to find global projections in input space that lead to a good approximation of the data. It is well known that these networks learn rather slowly in high-dimensional spaces, and that the network structure, i.e., the number of hidden units, needs to be chosen carefully to achieve good learning results. Moreover, neural networks with unbounded activation functions are very vulnerable to interference problems, i.e., the unlearning of relevant knowledge when trained on new data points ([13]). Since an autonomous robot in a dynamic environment will encounter new data all the time, complex off-line re-training procedures would need to be devised to cope with the interference problem, and a substantial amount of data would have to be stored for the re-training. From a practical point of view, this approach is hardly useful. Alternatively, network types can be selected that have bounded support in their activation functions. Radial basis function networks with Gaussian kernels are among the best known in this category ([14]; [15]). The selection of the number of hidden units is usually easier with this network type, and training proceeds rather fast with reduced interference problems. However, in a 90-dimensional input space, it is possible that a huge number of radial basis functions would be needed to cover the input space: even if each input dimension were only covered by 2 Gaussians, an astronomical number of 2^90 Gaussians would be required for our humanoid. Therefore, the only hope for learning approaches is that the data generated by a movement system actually lie on low dimensional distributions.
We tested this hypothesis empirically by collecting data from human and robot (Sarcos Dexterous Arm, Figure 4) seven degree-of-freedom unconstrained arm movements, recording about 100 Mb of data of the joint angular trajectories during various movement tasks. For the human data, we assumed a biomechanically reasonable mass distribution and computed the torque trajectories of every DOF; for the robot data, load sensors measured the torques directly. Thus, we obtained appropriate data for training a supervised neural network. The network we chose employed Gaussian kernels as nonlinear activation functions and included a principal component analysis (PCA) dimensionality reduction in each kernel ([7]). Importantly, the PCA automatically determined the number of local dimensions needed in each kernel to accomplish good function approximation. Results of incremental learning with this system are shown in Figure 2. Besides the fact that the neural network achieved very good approximation results, characterized by a low normalized mean squared error (cf. [16]), it is most noteworthy that the network only required on average 4 to 6 dimensions locally for good function fitting (cf. the dashed lines in Figure 2). This example only highlights the problem of function fitting in high-dimensional spaces: it will never be possible to collect enough data with a real robot to fill such big spaces, even when running the robot for hundreds of years.
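The order of magnitude behind this claim can be checked directly: the 2^90 kernels from the coverage argument above dwarf any realistic data collection budget. A quick back-of-the-envelope computation (the 100 Hz sampling rate below is our illustrative assumption):

```python
kernels = 2 ** 90                          # RBF units needed at 2 Gaussians per dimension
samples = 100 * 60 * 60 * 24 * 365 * 100   # data points at 100 Hz, nonstop, for 100 years
print(f"kernels ~ {kernels:.2e}, samples ~ {samples:.2e}")
```

Even a century of continuous sampling covers only a vanishing fraction of the kernels.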

These analyses confirmed our assumption that movement data in high-dimensional spaces actually lie on locally low dimensional distributions. Appropriate learning algorithms that can discover the local distributions should thus be able to approximate really complex internal models.

Figure 2: Learning inverse dynamics models with a nonparametric neural network that uses Gaussian activation functions with principal component-based dimensionality reduction in each Gaussian (axes: nMSE on test set and #dimensions in regression vs. #training iterations). The Local Learning curves show the reduction of the mean squared error as a function of the training iterations. The Linear Regression straight line is the result of a global linear regression fit of the training data, and the Parametric Model line demonstrates the results from using a rigid-body based parameter estimation method to fit the inverse dynamics model ([10]); the latter method was only applicable to the robot data. a) Learning from human behavioral data, b) learning from robot data.

A drawback of the PCA-based local dimensionality reduction of ([7]), however, is that PCA reduces the dimensionality by looking only at the input data. In this way, PCA can mistake noise for relevant input signals, and irrelevant dimensions, as they may occur in some learning problems, will influence the results of learning. These considerations lead to a new learning network, as described in the next section.

3 Locally Weighted Projection Regression

In the following, we assume that the data generating model for our regression problem has the standard form y = f(x) + ε, where x ∈ R^n is an n-dimensional input vector, the noise term has mean zero, E{ε} = 0, and the output is one-dimensional. The key concept of our regression network is to approximate nonlinear functions by means of piecewise linear models.
The region of validity, called a receptive field, of each linear model is computed from a Gaussian function:

  w_k = exp(-1/2 (x - c_k)^T D_k (x - c_k))    (1)

where c_k is the center of the k-th linear model, and D_k corresponds to a distance metric that determines the size and shape of the region of validity of the linear model. Given an input vector x, each linear model calculates a prediction ŷ_k. The total output of the network is the weighted mean of all linear models:

  ŷ = Σ_{k=1}^K w_k ŷ_k / Σ_{k=1}^K w_k    (2)

Previous work ([13]) computed the outputs of each linear model ŷ_k by traditional recursive least squares regression over all input variables. Learning in such a system, however, required more than O(n^2) computations, which became infeasible for more than about 10-dimensional input spaces. Here we suggest reducing the computational burden in each local linear model by applying a sequence of one-dimensional regressions along selected projections u_i (i = 1, ..., r) in input space (note that we drop the index k from now on unless it is necessary to distinguish explicitly between different linear models):

  Initialize: y = β_0, z = x - x̄
  For i = 1:r
    s = u_i^T z
    y = y + β_i s
    z = z - p_i s    (3)

The projections u_i, the univariate regression parameters β_i, the mean x̄, and the number of projections r are determined by the learning algorithm. Additionally, the learning algorithm also finds a projection vector p_i that reduces the input space for the next univariate regression. As will be explained below, this step allows finding more efficient projections u_i at subsequent univariate regression steps. In order to determine the open parameters in Equation (3), the technique of partial least squares (PLS) regression can be adapted from the statistics literature ([17]). The important ingredient of PLS is to choose projections according to the correlation of the input data with the output data. The following algorithm, Locally Weighted Projection Regression (LWPR), uses an incremental locally weighted version of PLS to determine the linear model parameters:

Given: a training point (x, y)

Update the means of inputs and output:
  x̄^n = (λ W^{n-1} x̄^{n-1} + w x) / W^n
  β_0^n = (λ W^{n-1} β_0^{n-1} + w y) / W^n
  where W^n = λ W^{n-1} + w

Update the local model:
  Initialize: z = x - x̄^n, res_1 = y - β_0^n
  For i = 1:r
    a) u_i^n = λ u_i^{n-1} + w z res_i
    b) s = z^T u_i^n
    c) SS_i^n = λ SS_i^{n-1} + w s^2
    d) SR_i^n = λ SR_i^{n-1} + w s res_i
    e) SZ_i^n = λ SZ_i^{n-1} + w z s
    f) β_i^n = SR_i^n / SS_i^n
    g) p_i^n = SZ_i^n / SS_i^n
    h) z ← z - s p_i^n
    i) res_{i+1} = res_i - s β_i^n
    j) MSE_i^n = λ MSE_i^{n-1} + w res_{i+1}^2    (4)

In the above equations, λ ∈ [0, 1] is a forgetting factor that determines how much older data in the regression parameters will be forgotten, similar to recursive system identification techniques ([18]). The variables SS_i, SR_i, and SZ_i are memory terms that enable us to do the univariate regression in step f) in a recursive least squares fashion, i.e., a fast Newton-like method. Step g) regresses the projection p_i from the current projected data s and the current input data z. This step guarantees that the next projection of the input data for the next univariate regression will result in a u_{i+1} that is orthogonal to u_i. Thus, for r = n, the entire input space would be spanned by the projections u_i and the regression results would be identical to those of a traditional linear regression. Step j) will be discussed below.

There are several important properties of PLS. First, if all the input variables are statistically independent, PLS will find the optimal projection direction u_i in a single iteration; the optimal projection direction corresponds to the gradient of the assumed locally linear function to be approximated. Second, choosing the projection direction by correlating the input and the output data in step a) automatically excludes irrelevant input dimensions, i.e., inputs that do not contribute to the output. And third, there is no danger of numerical problems in PLS due to redundant input dimensions, as the univariate regressions will never be singular.
The above update rule can be embedded in an incremental learning system that automatically allocates new locally linear models as needed ([13]):

  Initialize the LWPR with no receptive field (RF);
  For every new training sample (x, y):
    For k = 1 to #RF:
      calculate the activation from (1)
      update according to (4)
    end;
    If no linear model was activated by more than w_gen:
      create a new RF with r = 2, c = x, D = D_def
    end;
  end;    (5)

In this pseudo-code algorithm, w_gen is a threshold that determines when to create a new receptive field, and D_def is the initial (usually diagonal) distance metric in (1). The initial number of projections is set to r = 2. The algorithm has a simple mechanism for determining whether r should be increased, by recursively keeping track of the mean-squared error (MSE) as a function of the number of projections included in a local model, i.e., step j) in (4). If the MSE at the next projection does not decrease by more than a certain percentage of the previous MSE, i.e.,

  MSE_{i+1} / MSE_i > φ, where φ ∈ [0, 1],    (6)

the algorithm will stop adding new projections to the local model. It is even possible to learn the correct parameters for the distance metric D in each local model. The algorithm for this update was derived in ([13]) for normal locally linear regression based on an incremental cross validation technique. This algorithm is directly applicable to LWPR, and is strongly simplified, as it only needs to be applied in the context of univariate regressions. Due to space limitations, we will not provide the update rules in this paper, as they can be derived from ([13]).

4 Empirical Evaluations

In order to provide a graphical illustration of the learning algorithm, as a first test, we ran LWPR on 500 noisy training data drawn from the synthetic two dimensional function

  z = max{exp(-10 x^2), exp(-50 y^2), 1.25 exp(-5 (x^2 + y^2))} + N(0, 0.01)    (7)

shown in Figure 3a. A second test added 8 constant dimensions to the inputs and rotated this new input space by a random 10-dimensional rotation matrix. A third test added another 10 input dimensions to the inputs of the second

test, each having N(0, 0.05^2) Gaussian noise, thus obtaining a 20-dimensional input space. The learning results with these data sets are illustrated in Figure 3. In all three cases, LWPR rapidly reduced the normalized mean squared error (thick lines) on a noiseless test set in 10-20 epochs of training to less than nMSE=0.05, and it converged to the excellent function approximation result of nMSE=0.01 after 100,000 data presentations. Figure 3b illustrates the reconstruction of the original function from the 20-dimensional test: an almost perfect approximation. The rising thin lines in Figure 3c show the number of local models that LWPR allocated during learning. The very thin lines at the bottom of the graph indicate the average number of projections that the local models allocated: the average remained at the initialization value of two projections, as appropriate for this originally two dimensional data set.

In the second evaluation, we approximated the inverse dynamics model of a 7-degree-of-freedom anthropomorphic robot arm (Figure 4a) from a data set consisting of 45,000 data points, collected at 100 Hz from the actual robot performing various rhythmic and discrete movement tasks (this corresponds to 7.5 minutes of data collection). The inverse dynamics model of the robot is strongly nonlinear due to a vast number of superpositions of sine and cosine functions in the robot dynamics. The data consisted of 21 input dimensions: 7 joint positions, velocities, and accelerations. The goal of learning was to approximate the appropriate torque command of the robot's shoulder motor in response to the input vector. To increase the difficulty of learning, we added 29 irrelevant dimensions to the inputs with N(0, 0.05^2) Gaussian noise. 5,000 data points were excluded from the training data as a test set. Figure 4b shows the learning results in comparison to a global linear regression of the data. From the very beginning, LWPR outperformed the global linear regression. Within about 500,000 training points, LWPR converged to the excellent result of nMSE=0.042.
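For reproducibility, the three synthetic test sets of the first evaluation can be regenerated along the following lines. This is a sketch under assumptions the paper leaves open: the input range [-1, 1]^2 and the use of a QR factorization for the random rotation are ours.

```python
import numpy as np

def cross_2d(x, y, rng):
    """Noisy samples of the two-dimensional cross function, Eq. (7)."""
    z = np.maximum.reduce([np.exp(-10 * x ** 2),
                           np.exp(-50 * y ** 2),
                           1.25 * np.exp(-5 * (x ** 2 + y ** 2))])
    return z + rng.normal(0.0, 0.01, size=np.shape(z))

def make_training_set(n=500, seed=0, n_constant=0, n_noise=0):
    """First test: n_constant=n_noise=0; second test: n_constant=8 (the padded
    inputs are then rotated by a random 10-D rotation); third test: additionally
    n_noise=10 dimensions of N(0, 0.05^2) noise."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-1.0, 1.0, size=(n, 2))
    z = cross_2d(X[:, 0], X[:, 1], rng)
    if n_constant:
        X = np.hstack([X, np.ones((n, n_constant))])
        # random rotation: orthonormal factor of a QR decomposition
        Q, _ = np.linalg.qr(rng.normal(size=(X.shape[1], X.shape[1])))
        X = X @ Q
    if n_noise:
        X = np.hstack([X, rng.normal(0.0, 0.05, size=(n, n_noise))])
    return X, z
```

For example, `make_training_set(500, n_constant=8, n_noise=10)` yields the 20-dimensional inputs of the third test.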
It employed an average of only 3.8 projections per local model, despite the fact that the input dimensionality was 50. During learning, the number of local models increased by a factor of 6, from about 50 initial models to about 325 models. This increase is due to the adjustment of the distance metric D in Equation (1), which was initialized to form a rather large kernel. Since this large kernel oversmoothes the data, LWPR reduced the kernel size, and in response more kernels needed to be allocated. In comparison to the robot learning results in Figure 2, it is noteworthy that LWPR required many fewer local dimensions than the PCA-based algorithm in ([7]). This is due to the fact that LWPR chooses projections much more efficiently than PCA, e.g., for independent input data distributions, only one projection would suffice, as mentioned before.

Figure 3: a) Target and b) learned nonlinear cross function. c) Learning curves (nMSE on test set and #receptive fields / average #projections vs. #training data points) for 2-D, 10-D, and 20-D data.

5 Conclusions

This paper presented a new learning algorithm, Locally Weighted Projection Regression (LWPR), a nonlinear function approximation network that is particularly suited for problems of on-line incremental motor learning. The essence of LWPR is to achieve function approximation

with piecewise linear models by finding efficient local projections to reduce the dimensionality of the input space. High-dimensional learning problems can thus be dealt with efficiently: updating one projection direction has linear computational cost in the number of inputs, and since the algorithm accomplishes good approximation results with only 3-4 projections irrespective of the number of input dimensions, the overall computational complexity remains linear in the inputs. Moreover, the mechanisms of LWPR to select low dimensional projections are capable of excluding irrelevant and redundant dimensions from the input data. As an example, we demonstrated how LWPR leads to excellent function approximation results on up to 50-dimensional data sets, extracted from a 7 degree-of-freedom anthropomorphic robot arm. To our knowledge, this is the first incremental learning system that can work efficiently in high dimensional spaces.

Figure 4: a) Sarcos Dexterous Robot Arm; b) Learning curve (nMSE on test set, LWPR vs. linear regression, as a function of #training data points) for learning the inverse dynamics model of the robot from a 50-dimensional data set that included 29 irrelevant dimensions.

6 Acknowledgments

This work was made possible by Award #9710312 of the National Science Foundation, the ERATO Kawato Dynamic Brain Project funded by the Japanese Science and Technology Cooperation, and the ATR Human Information Processing Research Laboratories.

7 References

[1] J. H. Friedman and W. Stuetzle, "Projection pursuit regression," Journal of the American Statistical Association, vol. 76, pp. 817-823, 1981.
[2] T. J. Hastie and R. J. Tibshirani, Generalized Additive Models. London: Chapman and Hall, 1990.
[3] S. E. Fahlman and C. Lebiere, "The cascade-correlation learning architecture," in Advances in Neural Information Processing Systems II, D. S. Touretzky, Ed. San Mateo, CA: Morgan Kaufmann, 1990.
[4] T. Hastie and C. Loader, "Local regression: Automatic kernel carpentry," Statistical Science, vol. 8, pp. 120-143, 1993.
[5] C. G. Atkeson, A. W. Moore, and S.
Schaal, "Locally weighted learning," Artificial Intelligence Review, vol. 11, pp. 11-73, 1997.
[6] D. W. Scott, Multivariate Density Estimation. New York: Wiley, 1992.
[7] S. Vijayakumar and S. Schaal, "Local adaptive subspace regression," Neural Processing Letters, vol. 7, pp. 139-149, 1998.
[8] M. Kawato, "Computational schemes and neural network models for formation and control of multijoint arm trajectory," in Neural Networks for Control, W. T. Miller III, R. S. Sutton, and P. J. Werbos, Eds. Cambridge, MA: MIT Press, 1990.
[9] M. I. Jordan, "Computational aspects of motor control and motor learning," in Handbook of Perception and Action, H. Heuer and S. W. Keele, Eds. New York: Academic Press, 1996.
[10] C. H. An, C. G. Atkeson, and J. M. Hollerbach, Model-Based Control of a Robot Manipulator. Cambridge, MA: MIT Press, 1988.
[11] A. H. Fagg, N. Sitkoff, A. G. Barto, and J. C. Houk, "Cerebellar learning for control of a two-link arm in muscle space," presented at the IEEE International Conference on Robotics and Automation (ICRA'97), Albuquerque, NM, 1997.
[12] M. Cannon and J. E. Slotine, "Space-frequency localized basis function networks for nonlinear system estimation and control," Neurocomputing, vol. 9, 1995.
[13] S. Schaal and C. G. Atkeson, "Constructive incremental learning from only local information," Neural Computation, vol. 10, pp. 2047-2084, 1998.
[14] J. Moody and C. Darken, "Learning with localized receptive fields," in Proceedings of the 1988 Connectionist Summer School, D. Touretzky, G. Hinton, and T. Sejnowski, Eds. San Mateo, CA: Morgan Kaufmann, 1988.
[15] T. Poggio and F. Girosi, "Regularization algorithms for learning that are equivalent to multilayer networks," Science, vol. 247, pp. 978-982, 1990.
[16] C. M. Bishop, Neural Networks for Pattern Recognition. New York: Oxford University Press, 1995.
[17] H. Wold, "Soft modelling by latent variables: the nonlinear iterative partial least squares approach," in Perspectives in Probability and Statistics, Papers in Honour of M. S. Bartlett, J. Gani, Ed. London: Academic Press, 1975.
[18] L. Ljung and T.
Söderström, Theory and Practice of Recursive Identification. Cambridge, MA: MIT Press, 1983.


More information

Using Neural Networks and Support Vector Machines in Data Mining

Using Neural Networks and Support Vector Machines in Data Mining Usng eural etworks and Support Vector Machnes n Data Mnng RICHARD A. WASIOWSKI Computer Scence Department Calforna State Unversty Domnguez Hlls Carson, CA 90747 USA Abstract: - Multvarate data analyss

More information

MULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION

MULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION MULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION Paulo Quntlano 1 & Antono Santa-Rosa 1 Federal Polce Department, Brasla, Brazl. E-mals: quntlano.pqs@dpf.gov.br and

More information

Three supervised learning methods on pen digits character recognition dataset

Three supervised learning methods on pen digits character recognition dataset Three supervsed learnng methods on pen dgts character recognton dataset Chrs Flezach Department of Computer Scence and Engneerng Unversty of Calforna, San Dego San Dego, CA 92093 cflezac@cs.ucsd.edu Satoru

More information

12/2/2009. Announcements. Parametric / Non-parametric. Case-Based Reasoning. Nearest-Neighbor on Images. Nearest-Neighbor Classification

12/2/2009. Announcements. Parametric / Non-parametric. Case-Based Reasoning. Nearest-Neighbor on Images. Nearest-Neighbor Classification Introducton to Artfcal Intellgence V22.0472-001 Fall 2009 Lecture 24: Nearest-Neghbors & Support Vector Machnes Rob Fergus Dept of Computer Scence, Courant Insttute, NYU Sldes from Danel Yeung, John DeNero

More information

Smoothing Spline ANOVA for variable screening

Smoothing Spline ANOVA for variable screening Smoothng Splne ANOVA for varable screenng a useful tool for metamodels tranng and mult-objectve optmzaton L. Rcco, E. Rgon, A. Turco Outlne RSM Introducton Possble couplng Test case MOO MOO wth Game Theory

More information

Hermite Splines in Lie Groups as Products of Geodesics

Hermite Splines in Lie Groups as Products of Geodesics Hermte Splnes n Le Groups as Products of Geodescs Ethan Eade Updated May 28, 2017 1 Introducton 1.1 Goal Ths document defnes a curve n the Le group G parametrzed by tme and by structural parameters n the

More information

Term Weighting Classification System Using the Chi-square Statistic for the Classification Subtask at NTCIR-6 Patent Retrieval Task

Term Weighting Classification System Using the Chi-square Statistic for the Classification Subtask at NTCIR-6 Patent Retrieval Task Proceedngs of NTCIR-6 Workshop Meetng, May 15-18, 2007, Tokyo, Japan Term Weghtng Classfcaton System Usng the Ch-square Statstc for the Classfcaton Subtask at NTCIR-6 Patent Retreval Task Kotaro Hashmoto

More information

The Research of Support Vector Machine in Agricultural Data Classification

The Research of Support Vector Machine in Agricultural Data Classification The Research of Support Vector Machne n Agrcultural Data Classfcaton Le Sh, Qguo Duan, Xnmng Ma, Me Weng College of Informaton and Management Scence, HeNan Agrcultural Unversty, Zhengzhou 45000 Chna Zhengzhou

More information

Improvement of Spatial Resolution Using BlockMatching Based Motion Estimation and Frame. Integration

Improvement of Spatial Resolution Using BlockMatching Based Motion Estimation and Frame. Integration Improvement of Spatal Resoluton Usng BlockMatchng Based Moton Estmaton and Frame Integraton Danya Suga and Takayuk Hamamoto Graduate School of Engneerng, Tokyo Unversty of Scence, 6-3-1, Nuku, Katsuska-ku,

More information

Outline. Discriminative classifiers for image recognition. Where in the World? A nearest neighbor recognition example 4/14/2011. CS 376 Lecture 22 1

Outline. Discriminative classifiers for image recognition. Where in the World? A nearest neighbor recognition example 4/14/2011. CS 376 Lecture 22 1 4/14/011 Outlne Dscrmnatve classfers for mage recognton Wednesday, Aprl 13 Krsten Grauman UT-Austn Last tme: wndow-based generc obect detecton basc ppelne face detecton wth boostng as case study Today:

More information

Wavefront Reconstructor

Wavefront Reconstructor A Dstrbuted Smplex B-Splne Based Wavefront Reconstructor Coen de Vsser and Mchel Verhaegen 14-12-201212 2012 Delft Unversty of Technology Contents Introducton Wavefront reconstructon usng Smplex B-Splnes

More information

Learning-Based Top-N Selection Query Evaluation over Relational Databases

Learning-Based Top-N Selection Query Evaluation over Relational Databases Learnng-Based Top-N Selecton Query Evaluaton over Relatonal Databases Lang Zhu *, Wey Meng ** * School of Mathematcs and Computer Scence, Hebe Unversty, Baodng, Hebe 071002, Chna, zhu@mal.hbu.edu.cn **

More information

A Modified Median Filter for the Removal of Impulse Noise Based on the Support Vector Machines

A Modified Median Filter for the Removal of Impulse Noise Based on the Support Vector Machines A Modfed Medan Flter for the Removal of Impulse Nose Based on the Support Vector Machnes H. GOMEZ-MORENO, S. MALDONADO-BASCON, F. LOPEZ-FERRERAS, M. UTRILLA- MANSO AND P. GIL-JIMENEZ Departamento de Teoría

More information

Face Recognition Based on SVM and 2DPCA

Face Recognition Based on SVM and 2DPCA Vol. 4, o. 3, September, 2011 Face Recognton Based on SVM and 2DPCA Tha Hoang Le, Len Bu Faculty of Informaton Technology, HCMC Unversty of Scence Faculty of Informaton Scences and Engneerng, Unversty

More information

Correlative features for the classification of textural images

Correlative features for the classification of textural images Correlatve features for the classfcaton of textural mages M A Turkova 1 and A V Gadel 1, 1 Samara Natonal Research Unversty, Moskovskoe Shosse 34, Samara, Russa, 443086 Image Processng Systems Insttute

More information

Detection of an Object by using Principal Component Analysis

Detection of an Object by using Principal Component Analysis Detecton of an Object by usng Prncpal Component Analyss 1. G. Nagaven, 2. Dr. T. Sreenvasulu Reddy 1. M.Tech, Department of EEE, SVUCE, Trupath, Inda. 2. Assoc. Professor, Department of ECE, SVUCE, Trupath,

More information

Problem Definitions and Evaluation Criteria for Computational Expensive Optimization

Problem Definitions and Evaluation Criteria for Computational Expensive Optimization Problem efntons and Evaluaton Crtera for Computatonal Expensve Optmzaton B. Lu 1, Q. Chen and Q. Zhang 3, J. J. Lang 4, P. N. Suganthan, B. Y. Qu 6 1 epartment of Computng, Glyndwr Unversty, UK Faclty

More information

LECTURE : MANIFOLD LEARNING

LECTURE : MANIFOLD LEARNING LECTURE : MANIFOLD LEARNING Rta Osadchy Some sldes are due to L.Saul, V. C. Raykar, N. Verma Topcs PCA MDS IsoMap LLE EgenMaps Done! Dmensonalty Reducton Data representaton Inputs are real-valued vectors

More information

Announcements. Supervised Learning

Announcements. Supervised Learning Announcements See Chapter 5 of Duda, Hart, and Stork. Tutoral by Burge lnked to on web page. Supervsed Learnng Classfcaton wth labeled eamples. Images vectors n hgh-d space. Supervsed Learnng Labeled eamples

More information

Classifying Acoustic Transient Signals Using Artificial Intelligence

Classifying Acoustic Transient Signals Using Artificial Intelligence Classfyng Acoustc Transent Sgnals Usng Artfcal Intellgence Steve Sutton, Unversty of North Carolna At Wlmngton (suttons@charter.net) Greg Huff, Unversty of North Carolna At Wlmngton (jgh7476@uncwl.edu)

More information

Optimal Design of Nonlinear Fuzzy Model by Means of Independent Fuzzy Scatter Partition

Optimal Design of Nonlinear Fuzzy Model by Means of Independent Fuzzy Scatter Partition Optmal Desgn of onlnear Fuzzy Model by Means of Independent Fuzzy Scatter Partton Keon-Jun Park, Hyung-Kl Kang and Yong-Kab Km *, Department of Informaton and Communcaton Engneerng, Wonkwang Unversty,

More information

Support Vector Machines

Support Vector Machines Support Vector Machnes Decson surface s a hyperplane (lne n 2D) n feature space (smlar to the Perceptron) Arguably, the most mportant recent dscovery n machne learnng In a nutshell: map the data to a predetermned

More information

Performance Evaluation of Information Retrieval Systems

Performance Evaluation of Information Retrieval Systems Why System Evaluaton? Performance Evaluaton of Informaton Retreval Systems Many sldes n ths secton are adapted from Prof. Joydeep Ghosh (UT ECE) who n turn adapted them from Prof. Dk Lee (Unv. of Scence

More information

Backpropagation: In Search of Performance Parameters

Backpropagation: In Search of Performance Parameters Bacpropagaton: In Search of Performance Parameters ANIL KUMAR ENUMULAPALLY, LINGGUO BU, and KHOSROW KAIKHAH, Ph.D. Computer Scence Department Texas State Unversty-San Marcos San Marcos, TX-78666 USA ae049@txstate.edu,

More information

Reducing Frame Rate for Object Tracking

Reducing Frame Rate for Object Tracking Reducng Frame Rate for Object Trackng Pavel Korshunov 1 and We Tsang Oo 2 1 Natonal Unversty of Sngapore, Sngapore 11977, pavelkor@comp.nus.edu.sg 2 Natonal Unversty of Sngapore, Sngapore 11977, oowt@comp.nus.edu.sg

More information

Fitting: Deformable contours April 26 th, 2018

Fitting: Deformable contours April 26 th, 2018 4/6/08 Fttng: Deformable contours Aprl 6 th, 08 Yong Jae Lee UC Davs Recap so far: Groupng and Fttng Goal: move from array of pxel values (or flter outputs) to a collecton of regons, objects, and shapes.

More information

A Fast Content-Based Multimedia Retrieval Technique Using Compressed Data

A Fast Content-Based Multimedia Retrieval Technique Using Compressed Data A Fast Content-Based Multmeda Retreval Technque Usng Compressed Data Borko Furht and Pornvt Saksobhavvat NSF Multmeda Laboratory Florda Atlantc Unversty, Boca Raton, Florda 3343 ABSTRACT In ths paper,

More information

BOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET

BOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET 1 BOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET TZU-CHENG CHUANG School of Electrcal and Computer Engneerng, Purdue Unversty, West Lafayette, Indana 47907 SAUL B. GELFAND School

More information

Training ANFIS Structure with Modified PSO Algorithm

Training ANFIS Structure with Modified PSO Algorithm Proceedngs of the 5th Medterranean Conference on Control & Automaton, July 7-9, 007, Athens - Greece T4-003 Tranng ANFIS Structure wth Modfed PSO Algorthm V.Seyd Ghomsheh *, M. Alyar Shoorehdel **, M.

More information

BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION

BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION SHI-LIANG SUN, HONG-LEI SHI Department of Computer Scence and Technology, East Chna Normal Unversty 500 Dongchuan Road, Shangha 200241, P. R. Chna E-MAIL: slsun@cs.ecnu.edu.cn,

More information

3D vector computer graphics

3D vector computer graphics 3D vector computer graphcs Paolo Varagnolo: freelance engneer Padova Aprl 2016 Prvate Practce ----------------------------------- 1. Introducton Vector 3D model representaton n computer graphcs requres

More information

TECHNIQUE OF FORMATION HOMOGENEOUS SAMPLE SAME OBJECTS. Muradaliyev A.Z.

TECHNIQUE OF FORMATION HOMOGENEOUS SAMPLE SAME OBJECTS. Muradaliyev A.Z. TECHNIQUE OF FORMATION HOMOGENEOUS SAMPLE SAME OBJECTS Muradalyev AZ Azerbajan Scentfc-Research and Desgn-Prospectng Insttute of Energetc AZ1012, Ave HZardab-94 E-mal:aydn_murad@yahoocom Importance of

More information

CS246: Mining Massive Datasets Jure Leskovec, Stanford University

CS246: Mining Massive Datasets Jure Leskovec, Stanford University CS46: Mnng Massve Datasets Jure Leskovec, Stanford Unversty http://cs46.stanford.edu /19/013 Jure Leskovec, Stanford CS46: Mnng Massve Datasets, http://cs46.stanford.edu Perceptron: y = sgn( x Ho to fnd

More information

Adaptive Transfer Learning

Adaptive Transfer Learning Adaptve Transfer Learnng Bn Cao, Snno Jaln Pan, Yu Zhang, Dt-Yan Yeung, Qang Yang Hong Kong Unversty of Scence and Technology Clear Water Bay, Kowloon, Hong Kong {caobn,snnopan,zhangyu,dyyeung,qyang}@cse.ust.hk

More information

Categories and Subject Descriptors B.7.2 [Integrated Circuits]: Design Aids Verification. General Terms Algorithms

Categories and Subject Descriptors B.7.2 [Integrated Circuits]: Design Aids Verification. General Terms Algorithms 3. Fndng Determnstc Soluton from Underdetermned Equaton: Large-Scale Performance Modelng by Least Angle Regresson Xn L ECE Department, Carnege Mellon Unversty Forbs Avenue, Pttsburgh, PA 3 xnl@ece.cmu.edu

More information

An Optimal Algorithm for Prufer Codes *

An Optimal Algorithm for Prufer Codes * J. Software Engneerng & Applcatons, 2009, 2: 111-115 do:10.4236/jsea.2009.22016 Publshed Onlne July 2009 (www.scrp.org/journal/jsea) An Optmal Algorthm for Prufer Codes * Xaodong Wang 1, 2, Le Wang 3,

More information

Kinematics of pantograph masts

Kinematics of pantograph masts Abstract Spacecraft Mechansms Group, ISRO Satellte Centre, Arport Road, Bangalore 560 07, Emal:bpn@sac.ernet.n Flght Dynamcs Dvson, ISRO Satellte Centre, Arport Road, Bangalore 560 07 Emal:pandyan@sac.ernet.n

More information

Comparison Study of Textural Descriptors for Training Neural Network Classifiers

Comparison Study of Textural Descriptors for Training Neural Network Classifiers Comparson Study of Textural Descrptors for Tranng Neural Network Classfers G.D. MAGOULAS (1) S.A. KARKANIS (1) D.A. KARRAS () and M.N. VRAHATIS (3) (1) Department of Informatcs Unversty of Athens GR-157.84

More information

Artificial Intelligence (AI) methods are concerned with. Artificial Intelligence Techniques for Steam Generator Modelling

Artificial Intelligence (AI) methods are concerned with. Artificial Intelligence Techniques for Steam Generator Modelling Artfcal Intellgence Technques for Steam Generator Modellng Sarah Wrght and Tshldz Marwala Abstract Ths paper nvestgates the use of dfferent Artfcal Intellgence methods to predct the values of several contnuous

More information

Course Introduction. Algorithm 8/31/2017. COSC 320 Advanced Data Structures and Algorithms. COSC 320 Advanced Data Structures and Algorithms

Course Introduction. Algorithm 8/31/2017. COSC 320 Advanced Data Structures and Algorithms. COSC 320 Advanced Data Structures and Algorithms Course Introducton Course Topcs Exams, abs, Proects A quc loo at a few algorthms 1 Advanced Data Structures and Algorthms Descrpton: We are gong to dscuss algorthm complexty analyss, algorthm desgn technques

More information

Outline. Type of Machine Learning. Examples of Application. Unsupervised Learning

Outline. Type of Machine Learning. Examples of Application. Unsupervised Learning Outlne Artfcal Intellgence and ts applcatons Lecture 8 Unsupervsed Learnng Professor Danel Yeung danyeung@eee.org Dr. Patrck Chan patrckchan@eee.org South Chna Unversty of Technology, Chna Introducton

More information

SVM-based Learning for Multiple Model Estimation

SVM-based Learning for Multiple Model Estimation SVM-based Learnng for Multple Model Estmaton Vladmr Cherkassky and Yunqan Ma Department of Electrcal and Computer Engneerng Unversty of Mnnesota Mnneapols, MN 55455 {cherkass,myq}@ece.umn.edu Abstract:

More information

Radial Basis Functions

Radial Basis Functions Radal Bass Functons Mesh Reconstructon Input: pont cloud Output: water-tght manfold mesh Explct Connectvty estmaton Implct Sgned dstance functon estmaton Image from: Reconstructon and Representaton of

More information

Why visualisation? IRDS: Visualization. Univariate data. Visualisations that we won t be interested in. Graphics provide little additional information

Why visualisation? IRDS: Visualization. Univariate data. Visualisations that we won t be interested in. Graphics provide little additional information Why vsualsaton? IRDS: Vsualzaton Charles Sutton Unversty of Ednburgh Goal : Have a data set that I want to understand. Ths s called exploratory data analyss. Today s lecture. Goal II: Want to dsplay data

More information

The Comparison of Calibration Method of Binocular Stereo Vision System Ke Zhang a *, Zhao Gao b

The Comparison of Calibration Method of Binocular Stereo Vision System Ke Zhang a *, Zhao Gao b 3rd Internatonal Conference on Materal, Mechancal and Manufacturng Engneerng (IC3ME 2015) The Comparson of Calbraton Method of Bnocular Stereo Vson System Ke Zhang a *, Zhao Gao b College of Engneerng,

More information

APPLICATION OF PREDICTION-BASED PARTICLE FILTERS FOR TELEOPERATIONS OVER THE INTERNET

APPLICATION OF PREDICTION-BASED PARTICLE FILTERS FOR TELEOPERATIONS OVER THE INTERNET APPLICATION OF PREDICTION-BASED PARTICLE FILTERS FOR TELEOPERATIONS OVER THE INTERNET Jae-young Lee, Shahram Payandeh, and Ljljana Trajovć School of Engneerng Scence Smon Fraser Unversty 8888 Unversty

More information

Specialized Weighted Majority Statistical Techniques in Robotics (Fall 2009)

Specialized Weighted Majority Statistical Techniques in Robotics (Fall 2009) Statstcal Technques n Robotcs (Fall 09) Keywords: classfer ensemblng, onlne learnng, expert combnaton, machne learnng Javer Hernandez Alberto Rodrguez Tomas Smon javerhe@andrew.cmu.edu albertor@andrew.cmu.edu

More information

A Bilinear Model for Sparse Coding

A Bilinear Model for Sparse Coding A Blnear Model for Sparse Codng Davd B. Grmes and Rajesh P. N. Rao Department of Computer Scence and Engneerng Unversty of Washngton Seattle, WA 98195-2350, U.S.A. grmes,rao @cs.washngton.edu Abstract

More information

Computer Animation and Visualisation. Lecture 4. Rigging / Skinning

Computer Animation and Visualisation. Lecture 4. Rigging / Skinning Computer Anmaton and Vsualsaton Lecture 4. Rggng / Sknnng Taku Komura Overvew Sknnng / Rggng Background knowledge Lnear Blendng How to decde weghts? Example-based Method Anatomcal models Sknnng Assume

More information

Model Selection with Cross-Validations and Bootstraps Application to Time Series Prediction with RBFN Models

Model Selection with Cross-Validations and Bootstraps Application to Time Series Prediction with RBFN Models Model Selecton wth Cross-Valdatons and Bootstraps Applcaton to Tme Seres Predcton wth RBF Models Amaury Lendasse Vncent Wertz and Mchel Verleysen Unversté catholque de Louvan CESAME av. G. Lemaître 3 B-348

More information

A Binarization Algorithm specialized on Document Images and Photos

A Binarization Algorithm specialized on Document Images and Photos A Bnarzaton Algorthm specalzed on Document mages and Photos Ergna Kavalleratou Dept. of nformaton and Communcaton Systems Engneerng Unversty of the Aegean kavalleratou@aegean.gr Abstract n ths paper, a

More information

A Machine Learning Approach to Developing Rigid-body Dynamics Simulators for Quadruped Trot Gaits

A Machine Learning Approach to Developing Rigid-body Dynamics Simulators for Quadruped Trot Gaits A Machne Learnng Approach to Developng Rgd-body Dynamcs Smulators for Quadruped Trot Gats Jn-Wook Lee leepc@stanford.edu Abstract I present a machne learnng based rgd-body dynamcs smulator for trot gats

More information

APPLICATION OF PREDICTION-BASED PARTICLE FILTERS FOR TELEOPERATIONS OVER THE INTERNET

APPLICATION OF PREDICTION-BASED PARTICLE FILTERS FOR TELEOPERATIONS OVER THE INTERNET APPLICATION OF PREDICTION-BASED PARTICLE FILTERS FOR TELEOPERATIONS OVER THE INTERNET Jae-young Lee, Shahram Payandeh, and Ljljana Trajovć School of Engneerng Scence Smon Fraser Unversty 8888 Unversty

More information

Optimal Scheduling of Capture Times in a Multiple Capture Imaging System

Optimal Scheduling of Capture Times in a Multiple Capture Imaging System Optmal Schedulng of Capture Tmes n a Multple Capture Imagng System Tng Chen and Abbas El Gamal Informaton Systems Laboratory Department of Electrcal Engneerng Stanford Unversty Stanford, Calforna 9435,

More information

Learning Non-Linearly Separable Boolean Functions With Linear Threshold Unit Trees and Madaline-Style Networks

Learning Non-Linearly Separable Boolean Functions With Linear Threshold Unit Trees and Madaline-Style Networks In AAAI-93: Proceedngs of the 11th Natonal Conference on Artfcal Intellgence, 33-1. Menlo Park, CA: AAAI Press. Learnng Non-Lnearly Separable Boolean Functons Wth Lnear Threshold Unt Trees and Madalne-Style

More information

FEATURE EXTRACTION. Dr. K.Vijayarekha. Associate Dean School of Electrical and Electronics Engineering SASTRA University, Thanjavur

FEATURE EXTRACTION. Dr. K.Vijayarekha. Associate Dean School of Electrical and Electronics Engineering SASTRA University, Thanjavur FEATURE EXTRACTION Dr. K.Vjayarekha Assocate Dean School of Electrcal and Electroncs Engneerng SASTRA Unversty, Thanjavur613 41 Jont Intatve of IITs and IISc Funded by MHRD Page 1 of 8 Table of Contents

More information

Recognizing Faces. Outline

Recognizing Faces. Outline Recognzng Faces Drk Colbry Outlne Introducton and Motvaton Defnng a feature vector Prncpal Component Analyss Lnear Dscrmnate Analyss !"" #$""% http://www.nfotech.oulu.f/annual/2004 + &'()*) '+)* 2 ! &

More information

Private Information Retrieval (PIR)

Private Information Retrieval (PIR) 2 Levente Buttyán Problem formulaton Alce wants to obtan nformaton from a database, but she does not want the database to learn whch nformaton she wanted e.g., Alce s an nvestor queryng a stock-market

More information

Human Face Recognition Using Generalized. Kernel Fisher Discriminant

Human Face Recognition Using Generalized. Kernel Fisher Discriminant Human Face Recognton Usng Generalzed Kernel Fsher Dscrmnant ng-yu Sun,2 De-Shuang Huang Ln Guo. Insttute of Intellgent Machnes, Chnese Academy of Scences, P.O.ox 30, Hefe, Anhu, Chna. 2. Department of

More information

Classifier Swarms for Human Detection in Infrared Imagery

Classifier Swarms for Human Detection in Infrared Imagery Classfer Swarms for Human Detecton n Infrared Imagery Yur Owechko, Swarup Medasan, and Narayan Srnvasa HRL Laboratores, LLC 3011 Malbu Canyon Road, Malbu, CA 90265 {owechko, smedasan, nsrnvasa}@hrl.com

More information

High resolution 3D Tau-p transform by matching pursuit Weiping Cao* and Warren S. Ross, Shearwater GeoServices

High resolution 3D Tau-p transform by matching pursuit Weiping Cao* and Warren S. Ross, Shearwater GeoServices Hgh resoluton 3D Tau-p transform by matchng pursut Wepng Cao* and Warren S. Ross, Shearwater GeoServces Summary The 3D Tau-p transform s of vtal sgnfcance for processng sesmc data acqured wth modern wde

More information

Optimal Workload-based Weighted Wavelet Synopses

Optimal Workload-based Weighted Wavelet Synopses Optmal Workload-based Weghted Wavelet Synopses Yoss Matas School of Computer Scence Tel Avv Unversty Tel Avv 69978, Israel matas@tau.ac.l Danel Urel School of Computer Scence Tel Avv Unversty Tel Avv 69978,

More information

High-Boost Mesh Filtering for 3-D Shape Enhancement

High-Boost Mesh Filtering for 3-D Shape Enhancement Hgh-Boost Mesh Flterng for 3-D Shape Enhancement Hrokazu Yagou Λ Alexander Belyaev y Damng We z Λ y z ; ; Shape Modelng Laboratory, Unversty of Azu, Azu-Wakamatsu 965-8580 Japan y Computer Graphcs Group,

More information

A PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION

A PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION 1 THE PUBLISHING HOUSE PROCEEDINGS OF THE ROMANIAN ACADEMY, Seres A, OF THE ROMANIAN ACADEMY Volume 4, Number 2/2003, pp.000-000 A PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION Tudor BARBU Insttute

More information

CS434a/541a: Pattern Recognition Prof. Olga Veksler. Lecture 15

CS434a/541a: Pattern Recognition Prof. Olga Veksler. Lecture 15 CS434a/541a: Pattern Recognton Prof. Olga Veksler Lecture 15 Today New Topc: Unsupervsed Learnng Supervsed vs. unsupervsed learnng Unsupervsed learnng Net Tme: parametrc unsupervsed learnng Today: nonparametrc

More information

An Accurate Evaluation of Integrals in Convex and Non convex Polygonal Domain by Twelve Node Quadrilateral Finite Element Method

An Accurate Evaluation of Integrals in Convex and Non convex Polygonal Domain by Twelve Node Quadrilateral Finite Element Method Internatonal Journal of Computatonal and Appled Mathematcs. ISSN 89-4966 Volume, Number (07), pp. 33-4 Research Inda Publcatons http://www.rpublcaton.com An Accurate Evaluaton of Integrals n Convex and

More information

Fitting & Matching. Lecture 4 Prof. Bregler. Slides from: S. Lazebnik, S. Seitz, M. Pollefeys, A. Effros.

Fitting & Matching. Lecture 4 Prof. Bregler. Slides from: S. Lazebnik, S. Seitz, M. Pollefeys, A. Effros. Fttng & Matchng Lecture 4 Prof. Bregler Sldes from: S. Lazebnk, S. Setz, M. Pollefeys, A. Effros. How do we buld panorama? We need to match (algn) mages Matchng wth Features Detect feature ponts n both

More information

Unsupervised Learning

Unsupervised Learning Pattern Recognton Lecture 8 Outlne Introducton Unsupervsed Learnng Parametrc VS Non-Parametrc Approach Mxture of Denstes Maxmum-Lkelhood Estmates Clusterng Prof. Danel Yeung School of Computer Scence and

More information