Learning Physical Models of Robots


Jochen Mück, Technische Universität Darmstadt

Abstract

In robotics, good physical models are needed to provide appropriate motion control for different types of robots. Classical robotics falls short when models are hand-crafted and unmodeled features as well as noise cause problems and inaccuracy. Based on the paper "Scalable Techniques from Nonparametric Statistics for Real Time Robot Learning" [4] by Stefan Schaal, Christopher G. Atkeson and Sethu Vijayakumar, this review paper discusses the model learning problem, describes algorithms for locally weighted learning and presents some real-world applications.

1 Introduction

Motion control is one of the most important basic tasks in robotics. The classical robotics approach, where a physical model of the robot is hand-crafted by an engineer before the robot is actually used, quickly comes to its limits because of the lack of accuracy in the model and noise caused by unmodeled features and changes in characteristics due to environmental influence. In classical robotics, simple control laws (e.g. PID controllers) are used to compensate these errors. In contrast, the robot learning approach is to constantly learn and improve a physical model of the robot using recorded data from the robot's joints. The concept of self-improvement has the flexibility to handle noise and unmodeled features. Hence, using learned models can help to generate better control laws, so that motion-control tasks can be performed with higher accuracy.

Challenges in model learning for robots are the high dimensionality of the problem (e.g. up to 90 dimensions for a humanoid robot) and the need to calculate and improve the model in real time given a continuous stream of data. Furthermore, using on-line robot learning on autonomous systems requires fast algorithms, since high-performance hardware is usually not available. There are many different learning methods to solve a learning problem, but since large amounts of data have to be handled and inexpensive algorithms are preferred for on-line model learning, not all of them can be applied to model learning for robots. In this paper, locally weighted learning (LWL) methods are described, which are suitable for robot learning problems. Before the algorithms are presented in section 3, some foundations of model learning are discussed in the next section, which also gives a short overview of statistics, different types of models and learning architectures. Finally, real-world model learning applications are described in section 4.

2 Foundations of Model Learning

This section gives an overview of model learning. First, the model learning problem and a solution using regression methods and gradient descent search are shown. Afterwards, different types of models are explained. Finally, learning architectures which make use of models are described.

2.1 The Model Learning Problem

In robot control, several different model learning problems are of interest:

Forward Kinematics:  x = f(q),  \dot{x} = J(q)\dot{q},  \ddot{x} = J(q)\ddot{q} + \dot{J}(q)\dot{q}   (1)
Inverse Kinematics:  q = f^{-1}(x)   (2)
Forward Dynamics:    \ddot{q} = M^{-1}(q)(u - c(q, \dot{q}) - g(q))   (3)
Inverse Dynamics:    u = M(q)\ddot{q}_d + c(q, \dot{q}) + g(q)   (4)

All of these model learning problems except the inverse kinematics can be solved using regression methods. The forward dynamics equation (3) describes the joint accelerations \ddot{q} given the applied torques u, using the physical parameters M^{-1}, c(q, \dot{q}) and g(q). In contrast, the inverse dynamics equation (4) describes which torques have to be applied to the actuators to achieve the desired joint accelerations \ddot{q}_d. Hence, the task of learning a physical model of a robot is to determine the parameters M, c(q, \dot{q}) and g(q). More generally, the model can be written as:

y = f_\theta(x) + \epsilon   (5)

The function f_\theta(x) can also be written as \phi(x)^T \theta with parameters \theta and features \phi; \epsilon denotes Gaussian distributed noise. The goal of model learning is to find parameters \theta such that a cost function J is minimal. J is usually defined as a least-squares cost function:

J = (1/2) \sum_{i=1}^N (y_i - f_\theta(x_i))^2   (6)

which can also be written as:

J = (1/2) (Y - \phi\theta)^T (Y - \phi\theta)   (7)

Finding the parameters \theta that minimize the cost function J can be done with gradient descent search (see section 2.1.1). A closed-form solution can also be derived by setting the gradient of J to zero:

\theta = (\phi^T \phi)^{-1} \phi^T Y   (8)

Equations (5) to (8) show what a model learning problem looks like and how it can be solved using basic linear regression methods.
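As a concrete illustration of equations (5) to (8), the following minimal Python sketch (the toy feature map and all variable names are my own assumptions, not taken from the reviewed paper) generates noisy data according to equation (5) and recovers the parameters with the closed-form solution of equation (8):

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=200)
phi = np.column_stack([np.ones_like(x), x, x**2])             # features phi(x) = [1, x, x^2]
theta_true = np.array([0.5, -1.0, 2.0])
y = phi @ theta_true + 0.05 * rng.standard_normal(len(x))     # y = phi(x)^T theta + eps, eq. (5)

# Closed-form least-squares solution theta = (phi^T phi)^{-1} phi^T Y, eq. (8)
theta_hat = np.linalg.solve(phi.T @ phi, phi.T @ y)
print(theta_hat)                                              # close to theta_true

In practice, np.linalg.lstsq(phi, y, rcond=None) evaluates the same closed form in a numerically safer way.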

2.1.1 Gradient Descent Search

Gradient descent search is an iterative method to find a local minimum of a multi-variable function F(x). The idea is to iteratively take small steps in the direction of the negative gradient of the function at the current point x_k, starting from a random initial point. The step size can be modified with a parameter \gamma. The iteration step can be written as:

x_{k+1} = x_k - \gamma \nabla F(x_k)   (9)

Figure 1 illustrates how gradient descent search proceeds on a quadratic function. A short code sketch of this iteration is given below, after the discussion of model types.

Figure 1: Gradient Descent Search

2.2 Types of Models

The goal of model learning is to learn the behavior of the system given some observed quantities; missing information therefore has to be predicted. Depending on what kind of information is missing, different types of models can be defined (see [1], section 2.1).

Forward Models predict the future state of the system s_{k+1} given the current state s_k and action a_k. Since the forward model directly describes the physical properties of the system, which represent a causal relationship between states and actions, learning a forward model is a well-defined problem.

Inverse Models, in contrast, are used to compute the action a_k which is needed to get from state s_k to s_{k+1}. Since this mapping is in general not unique, learning an inverse model is not always a well-defined problem. For a robot's inverse dynamics, however, it is well-defined, so the robot's inverse dynamics model can be learned using regression methods.

Forward and inverse models are the most important types of models, although a combination of both can be useful: the forward model can help to create a unique mapping for an inverse model learning problem. These combinations are called Mixed Models. Furthermore, in some applications it is important to know not just the next state of the system but several states in the future. Models that provide this information are called Multi-Step Prediction Models.
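To make the iteration in equation (9) concrete, here is a minimal Python sketch of gradient descent on a quadratic function; the particular function, step size and stopping rule are illustrative assumptions, not taken from the paper:

import numpy as np

def gradient_descent(grad, x0, gamma=0.1, tol=1e-8, max_iter=10000):
    # Iterate x_{k+1} = x_k - gamma * grad(x_k) until the step becomes tiny.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        step = gamma * grad(x)
        x = x - step
        if np.linalg.norm(step) < tol:
            break
    return x

# Example: F(x) = 1/2 x^T A x - b^T x, so grad F(x) = A x - b (minimum at A^{-1} b).
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])
print(gradient_descent(lambda x: A @ x - b, x0=[5.0, 5.0]))
print(np.linalg.solve(A, b))                                  # reference solution

The step size \gamma trades off speed against stability: if it is too large the iteration diverges on strongly curved functions, if it is too small convergence becomes very slow.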

2.3 Learning Architectures

Depending on the type of model and the learning problem, different learning architectures can be defined. The learning architecture describes which quantities are observed to learn the model (see [1], section 2.2).

Using Direct Modeling, a model is learned by observing the system's inputs and outputs. The idea is to obtain the system's behavior, and therefore the model, by observing an input action and the resulting output state of the system. This learning architecture can basically be used to learn all types of models if the learning problem is well-defined.

An Indirect Modeling approach is feedback error learning, where a feedforward controller is learned from a feedback controller's error signals. If the learned feedforward controller is perfect, the feedback controller will not influence the control system any more; otherwise its error signal is used to update the model. Thus this learning architecture always tries to minimize the error, and it can deal with ill-posed problems such as inverse kinematics learning. One drawback of this architecture is that it has to be applied on-line in order to get the actual error signals of the feedback controller. Indirect modeling can be used to learn inverse models and mixed models.

In the Distal Teacher Approach, a unique forward model acts as a teacher to obtain the error of a learned inverse model and therefore helps to update the inverse model. The goal is to minimize this error: the inverse model yields a correct solution for a desired trajectory when the error between the output of the forward model and the input of the inverse model is minimized. The distal teacher approach can only be applied for learning mixed models.

3 Locally Weighted Learning

Locally Weighted Learning (LWL) is one approach to learning models from training data. As described before, the aim is to find the function f_\theta(x) in the model equation (5). The idea of LWL methods is to approximate this non-linear function by means of piecewise linear models. The main challenge is to find the region in which each local model is valid (its receptive field). Here, for each linear model the following receptive field is used:

w_k = \exp(-(1/2)(x - c_k)^T D_k (x - c_k))   (10)

The advantage of LWL methods is that no hand-crafted feature vector \phi is needed. The remainder of this section describes four different LWL methods.

3.1 Locally Weighted Regression

Locally Weighted Regression (LWR) extends the standard regression method of equation (8) by a weight matrix which determines how much influence the data close to the query point has on the local linear model. This only requires the learning system to keep sufficient training data in memory, so the model can easily be updated by adding new training data. An algorithm for computing a prediction \hat{y}_q for a query point x_q is shown below (Algorithm 1). The algorithm seems quite complex at first view, but since the weight matrix W ensures that data points which are not close to the query point receive weights close to zero, the matrix multiplication and pseudo-inverse simplify a lot. Nevertheless, the complexity rises with the dimensionality of the system.

The not yet defined variable in the algorithm is the parameter D, the distance matrix of the receptive field, which describes how big the region of validity around a query point is. D can be optimized using Leave-One-Out Cross Validation (Algorithm 2) once sufficient training data is recorded. In order to reduce the number of parameters, D is assumed to be a global diagonal matrix multiplied by a scaling factor h; this scaling factor is then the only parameter to be optimized. Leave-One-Out Cross Validation predicts a value for a query point which is left out of the training data and afterwards compares the prediction to the actual sample value, resulting in an error. This is repeated for all training points, and the factor h is chosen so as to achieve a minimal error.
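The following Python sketch shows the core of the locally weighted prediction described above and formalized in Algorithms 1 and 2 below; the data layout, the simple diagonal distance metric D = h * I and the function names are my own illustrative assumptions:

import numpy as np

def lwr_predict(x_q, X, y, D):
    # Weights of all training points relative to the query point, cf. eq. (10).
    diffs = X - x_q
    w = np.exp(-0.5 * np.einsum('ij,jk,ik->i', diffs, D, diffs))
    # Local linear model around x_q with inputs [(x_i - x_q)^T, 1].
    X_t = np.column_stack([diffs, np.ones(len(X))])
    WX = w[:, None] * X_t
    beta = np.linalg.solve(X_t.T @ WX, WX.T @ y)
    return beta[-1]                     # prediction is the constant term at x_q

def loo_choose_h(X, y, h_candidates):
    # Leave-one-out cross validation over the bandwidth h, with D = h * I.
    n, d = X.shape
    sse = []
    for h in h_candidates:
        D = h * np.eye(d)
        err = 0.0
        for i in range(n):
            mask = np.arange(n) != i
            err += (y[i] - lwr_predict(X[i], X[mask], y[mask], D)) ** 2
        sse.append(err)
    return h_candidates[int(np.argmin(sse))]

With a reasonable set of candidate bandwidths, loo_choose_h follows the idea of Algorithm 2: pick the h whose leave-one-out squared error is smallest.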

Algorithm 1: Locally Weighted Regression
Given: x_q (query point), p training points {x_i, y_i}
Compute the weight matrix W:
  w_i = \exp(-(1/2)(x_i - x_q)^T D (x_i - x_q))
Build the matrix X and the vector y such that:
  X = (\tilde{x}_1, \tilde{x}_2, ..., \tilde{x}_p)^T where \tilde{x}_i = [(x_i - x_q)^T 1]^T
  y = (y_1, y_2, ..., y_p)^T
Compute the locally linear model:
  \beta = (X^T W X)^{-1} X^T W y
Compute the prediction for x_q:
  \hat{y}_q = \beta_{n+1}

Algorithm 2: Leave-One-Out Cross Validation
Given: a set H of reasonable values h_r
for all h_r in H do
  sse_r = 0
  for i = 1 : p do
    x_q = x_i
    Temporarily exclude {x_i, y_i} from the training data
    Compute the LWR prediction \hat{y}_q with the reduced data
    sse_r = sse_r + (y_i - \hat{y}_q)^2
  end for
end for
Choose the optimal h_r, i.e. the one with minimal sse_r

3.2 Locally Weighted Partial Least Squares

Since the complexity of the LWR algorithm (Algorithm 1) rises with the input dimensionality, LWR can be slow for higher dimensions. Furthermore, the matrix inversion step can become numerically unstable if there are redundant input dimensions. Locally Weighted Partial Least Squares (LWPLS) takes care of these problems. In this approach, Partial Least Squares (PLS) is used to reduce the complexity of the problem. PLS is based on a linear transformation of the high-dimensional input into a new variable space spanned by lower-dimensional orthogonal factors, i.e. the orthogonal factors are independent linear combinations of the original input. These projections are used to calculate a prediction for a query point (see Algorithm 3). The only undefined parameter is the number of projections r. Since each new projection should further reduce the squared error, adding new projections can be stopped as soon as the error reduction res_i^2 / res_{i-1}^2 < \phi is no longer achieved, i.e. when the reduction is not high enough any more. In [4], \phi = 0.5 was used for all learning tasks.
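As a rough intuition for the projection step used in Algorithm 3 below, the following Python fragment computes a single weighted PLS projection direction and regresses the current residual onto it; the function name and the reduction to a single projection are my own simplifications:

import numpy as np

def pls_step(Z, res, w):
    # One projection: direction u from the weighted input-residual correlation,
    # score s = Z u, then a univariate weighted regression of res on s.
    u = Z.T @ (w * res)
    s = Z @ u
    beta = (s * w) @ res / ((s * w) @ s)
    p = (s * w) @ Z / ((s * w) @ s)
    res_new = res - s * beta            # deflate the residual
    Z_new = Z - np.outer(s, p)          # deflate the inputs
    return u, beta, p, Z_new, res_new

Repeating this step r times and accumulating the per-projection contributions for a query point reproduces the prediction loop of Algorithm 3; stopping as soon as res_i^2 / res_{i-1}^2 is no longer below \phi gives the criterion mentioned above.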

Algorithm 3: Locally Weighted Partial Least Squares
Given: x_q (query point), p training points {x_i, y_i}
Compute the weight matrix W:
  w_i = \exp(-(1/2)(x_i - x_q)^T D (x_i - x_q))
Build the matrix X and the vector y such that:
  \bar{x} = \sum_{i=1}^p w_i x_i / \sum_{i=1}^p w_i,   \beta_0 = \sum_{i=1}^p w_i y_i / \sum_{i=1}^p w_i
  X = (\tilde{x}_1, \tilde{x}_2, ..., \tilde{x}_p)^T where \tilde{x}_i = x_i - \bar{x}
  y = (\tilde{y}_1, \tilde{y}_2, ..., \tilde{y}_p)^T where \tilde{y}_i = y_i - \beta_0
Compute the locally linear model:
  Initialize: Z_0 = X, res_0 = y
  for i = 1 : r do
    u_i = Z_{i-1}^T W res_{i-1}
    s_i = Z_{i-1} u_i
    \beta_i = s_i^T W res_{i-1} / (s_i^T W s_i)
    p_i = s_i^T W Z_{i-1} / (s_i^T W s_i)
    res_i = res_{i-1} - s_i \beta_i
    Z_i = Z_{i-1} - s_i p_i
  end for
Compute the prediction for x_q:
  Initialize: z_0 = x_q - \bar{x}, \hat{y}_q = \beta_0
  for i = 1 : r do
    s_i = z_{i-1}^T u_i
    \hat{y}_q = \hat{y}_q + s_i \beta_i
    z_i = z_{i-1} - s_i p_i^T
  end for

3.3 Receptive Field Weighted Regression

When training data is received constantly by the learning system, as in on-line learning scenarios, the data set becomes very large. In this case LWR and LWPLS fall short because of their high computational cost. Instead of computing a local model whenever a prediction has to be made, Receptive Field Weighted Regression (RFWR) incrementally builds and updates local models as training data arrives. The prediction for a query point is then computed as the weighted average over the predictions of all K local models:

\hat{y}_q = \sum_{k=1}^K w_k \hat{y}_{q,k} / \sum_{k=1}^K w_k   (11)

Algorithm 4 shows how the local models are updated.

Algorithm 4: Receptive Field Weighted Regression
Given: a training point (x, y)
Update the K local models:
  w_k = \exp(-(1/2)(x - c_k)^T D_k (x - c_k))
  \beta_k^{n+1} = \beta_k^n + w_k P_k^{n+1} \tilde{x} e_{cv,k}
  where \tilde{x} = [(x - c_k)^T 1]^T,  e_{cv,k} = y - \beta_k^{nT} \tilde{x}  and
  P_k^{n+1} = (1/\lambda) (P_k^n - P_k^n \tilde{x} \tilde{x}^T P_k^n / (\lambda / w_k + \tilde{x}^T P_k^n \tilde{x}))
Compute the prediction of model k for a query point x_q:
  \hat{y}_k = \beta_k^T \tilde{x}

Like in LWR and LWPLS, the only open parameter is the distance matrix D. Since RFWR uses several local models, a different distance matrix D_k can be used for each of them.
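Equation (11) is simply a weight-normalized average of the local model predictions; a minimal Python sketch (the list-of-models representation is an assumption for illustration, not the paper's data structure) looks like this:

import numpy as np

def rfwr_predict(x_q, models):
    # models: list of (c_k, D_k, beta_k); each local model predicts with its own
    # linear parameters and is weighted by its receptive field activation.
    num, den = 0.0, 0.0
    for c, D, beta in models:
        diff = x_q - c
        w = np.exp(-0.5 * diff @ D @ diff)      # receptive field, eq. (10)
        x_t = np.append(diff, 1.0)              # [(x_q - c_k)^T, 1]
        num += w * (beta @ x_t)                 # w_k * y_hat_{q,k}
        den += w
    return num / den if den > 0 else 0.0        # weighted average, eq. (11)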

In [3], the following cost function for a gradient descent update of D was used:

J = (1 / \sum_{i=1}^k w_i) \sum_{i=1}^k w_i \|y_i - \hat{y}_i\|^2 / (1 - w_i \tilde{x}_i^T P \tilde{x}_i) + \gamma \sum_{i,j=1}^n D_{ij}^2   (12)

Note that RFWR is a kind of incremental version of LWR, so it also cannot deal with very high-dimensional problems.

3.4 Locally Weighted Projection Regression

For higher-dimensional problems, LWPLS was introduced to reduce the dimensionality and therefore the complexity of the problem. Using LWPLS on-line with huge data sets leads to the same problem as for LWR, which was solved there by incremental model updates, leading to RFWR. The idea of Locally Weighted Projection Regression (LWPR) is to formulate an incremental version of LWPLS which can deal with both high dimensionality and huge data sets. Algorithm 5 shows how one local model is updated.

Algorithm 5: Locally Weighted Projection Regression
Given: a training point (x, y)
Update the means of inputs and outputs:
  \bar{x}_0^{n+1} = (\lambda W^n \bar{x}_0^n + w x) / W^{n+1}
  \beta_0^{n+1} = (\lambda W^n \beta_0^n + w y) / W^{n+1}
  where W^{n+1} = \lambda W^n + w
Update the local model:
  Initialize: z_0 = x - \bar{x}_0^{n+1}, res_0 = y - \beta_0^{n+1}
  for i = 1 : r do
    u_i^{n+1} = \lambda u_i^n + w z_{i-1} res_{i-1}
    s_i = z_{i-1}^T u_i^{n+1}
    SS_i^{n+1} = \lambda SS_i^n + w s_i^2
    SR_i^{n+1} = \lambda SR_i^n + w s_i res_{i-1}
    SZ_i^{n+1} = \lambda SZ_i^n + w z_{i-1} s_i
    \beta_i^{n+1} = SR_i^{n+1} / SS_i^{n+1}
    p_i^{n+1} = SZ_i^{n+1} / SS_i^{n+1}
    z_i = z_{i-1} - s_i p_i^{n+1}
    res_i = res_{i-1} - s_i \beta_i^{n+1}
    SSE_i^{n+1} = \lambda SSE_i^n + w res_i^2
  end for

Like in RFWR, the distance matrix D is updated using gradient descent for each local model (see [4]).
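The incremental character of Algorithms 4 and 5 rests on sufficient statistics that are decayed with the forgetting factor \lambda and refreshed with each weighted sample. A stripped-down Python sketch of this pattern for a single one-dimensional projection (my own simplification, not the full LWPR update) is:

class IncrementalUnivariateFit:
    # Keeps running weighted sums SS = sum of w*s^2 and SR = sum of w*s*res,
    # both decayed by lambda, so that beta = SR / SS tracks the recent data.
    def __init__(self, lam=0.999):
        self.lam = lam
        self.SS = 1e-10           # small initial value avoids division by zero
        self.SR = 0.0

    def update(self, s, res, w):
        self.SS = self.lam * self.SS + w * s * s
        self.SR = self.lam * self.SR + w * s * res
        return self.SR / self.SS  # current regression coefficient beta

Because old contributions are multiplied by \lambda < 1 at every step, a local model can track slow changes in the robot's dynamics without storing the training data, which is exactly what makes RFWR and LWPR suitable for the on-line scenarios discussed here.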

4 Model Learning Applications

In this section, some very different model learning applications are presented. For every application, the task to be learned, the type of model learning method used and the results are described. The first three example applications use different LWL methods, while the last example uses Gaussian process regression (GPR). This last example intends to show that other model learning methods can be applied to specific problems, but it will not give a detailed look into GPR.

4.1 Learning Devil Sticking

Figure 2: Devil Sticking Task a) and Robot Layout b) (see [4])

Task description: The task is juggling a center stick between two control sticks; the center stick is lifted alternately by the two control sticks to juggle it, and the robot has to learn a continuous left-right-left pattern. The task is modeled as a discrete function that maps impact states from one hand to the other. A state is described as a vector x = (p, \theta, \dot{x}, \dot{y}, \dot{\theta})^T with the impact position, the angle, the velocities of the center of the center stick and its angular velocity. The task command u = (x_h, y_h, \dot{\theta}, v_x, v_y)^T consists of a catch position, an angular trigger velocity and the two-dimensional throw direction.

Type of model learning: The robot learns a forward model that, given the current state and action, predicts the next state. This problem has a 10-dimensional input and a five-dimensional output and is therefore ideally suited for LWR methods. Furthermore, training data is only generated at 1-2 Hz, every time the center stick hits one of the control sticks.

Results: More than 1000 consecutive hits were counted as a successful trial, which was achieved after the number of trials shown in Figure 3. This is a remarkable result, since even for humans devil sticking is a difficult task and a lot of trials are needed for an untrained human.

Figure 3: Devil Sticking Results (see [4])

4.2 Learning Pole Balancing

Figure 4: Pole Balancing a) and Results b) (see [4])

Task description: In this application, the learning task is to balance a pole upright on a robot's finger (see Figure 4a). The robot arm has 7 degrees of freedom. Given the inverse dynamics model of the robot, the goal of the learning problem was to learn task-level commands, i.e. Cartesian accelerations of the robot's finger.

Type of model learning: On-line learning using RFWR was applied, with input data from a stereo camera system observing the pole. The input is thus 12-dimensional: 3 positions of the lower pole end, 2 angular positions, the 5 corresponding velocities and the two horizontal accelerations of the robot's finger. The output that predicts the next state of the pole is therefore 10-dimensional. So RFWR was used to learn a forward model of the pole that predicts its next state.

Results: While it took the system a number of trials to keep the pole upright for longer than a minute when learning the model from scratch, observing a human teacher who demonstrated pole balancing for 30 seconds helped to extract the forward model so well that the system fulfilled the task on a single trial (see Figure 4b). This could also be shown using different poles and demonstrations from different people.

4.3 Inverse Dynamics Learning

Task description: Computing an inverse dynamics model (equation 4) by hand from rigid-body dynamics does not lead to a satisfying solution, since many of the system's properties cannot be modeled accurately. In this example application, the inverse dynamics model of a 7-degree-of-freedom robot arm was learned (= 21 input dimensions). In a second example, the authors learned the inverse dynamics of a humanoid robot's shoulder motor with 30 degrees of freedom (= 90 input dimensions).

Type of model learning: In both inverse dynamics learning problems LWPR was applied, which fits the need of handling high-dimensional input and coping with big data sets.

Results: For the 7-degree-of-freedom robot arm, Figure 5 shows the results compared to a parametric inverse dynamics model. The normalized mean squared error (nMSE) converges within the number of training points shown in Figure 5. During learning, LWPR created about 300 local models.

Figure 5: Inverse dynamics learning results with the 7-DOF robot (see [4])

The results of the second example are shown in Figure 6. Here a lot more training points were needed, but LWPR quickly outperformed the parametric model. On this high-dimensional learning problem, LWPR created more than 2000 local models.

Figure 6: Inverse dynamics learning results with the 30-DOF robot (see [4])

4.4 Bio-Inspired Motion Control Based on a Learned Inverse Dynamics Model

Task description: The previous example shows that learning an inverse dynamics model leads to better results than manually computed models. This was shown on a rigid-body robot with stiff joints. The problem of accurately modeling all system properties gets even worse when elastic (bio-inspired) joints are used. In this example, a different model learning approach was used to learn an inverse dynamics model of the bio-inspired robot BioBiped1 (see Figure 7).

Type of model learning: The authors decided to use Gaussian process regression (GPR) to learn the inverse model off-line using recorded training data. As shown in [2], GPR is well suited for off-line model learning.

Figure 7: Bio-inspired robot BioBiped1 (see [5])

Results: Using the learned inverse dynamics model, a model-based feed-forward controller was implemented and compared to a standard PD controller. Furthermore, a combination of the feed-forward controller and the feedback controller was implemented. All three approaches were evaluated in an experiment in which the robot stands on the ground with both feet and performs a periodic up-and-down swinging motion using both legs. Figure 8 shows the joint error for all three approaches.

Figure 8: Bio-Inspired Motion Control Results (see [5])

5 Conclusion

Model learning can help to find the properties of a system, which in this case is a robot. In section 2, the model learning problem was discussed and a regression solution was shown by minimizing a cost function. Different types of models were then categorized in more detail, together with learning architectures which can be utilized to learn a specific model. Based on the regression solution to the model learning problem, locally weighted learning (LWL) methods were discussed in section 3, beginning with a very straightforward approach (Algorithm 1), which was then modified in Algorithm 3 to reduce the dimensionality of the system and in Algorithms 4 and 5 to deal with big data sets. The first three application examples show that applying LWL methods to different learning scenarios is successful for learning forward models as well as inverse dynamics models. The last application example (see section 4.4) shows that other learning methods (here GPR) can also be applied successfully to a learning problem. However, the challenge in model learning remains to deal with high dimensionality, complex algorithms and big data sets. On-line learning is preferred for self-improvement while performing a certain task, but it needs very efficient algorithms (e.g. GPR cannot be applied on-line).

References

[1] Nguyen-Tuong, D., and Peters, J. Model learning in robotics: a survey. Cognitive Processing, 12(4), 2011.
[2] Nguyen-Tuong, D., Peters, J., Seeger, M., and Schölkopf, B. Learning inverse dynamics: a comparison. In Proceedings of the European Symposium on Artificial Neural Networks (ESANN), 2008.
[3] Schaal, S., and Atkeson, C. G. Constructive incremental learning from only local information. Neural Computation, 10, 1997.
[4] Schaal, S., Atkeson, C. G., and Vijayakumar, S. Scalable techniques from nonparametric statistics for real time robot learning. Applied Intelligence, 17, 2002.
[5] Scholz, D., Kurowski, S., Radkhah, K., and von Stryk, O. Bio-inspired motion control of the musculoskeletal BioBiped1 robot based on a learned inverse dynamics model. In Proc. 11th IEEE-RAS Intl. Conf. on Humanoid Robots (Bled, Slovenia, Oct.).
