Unsupervised Learning


Pattern Recognition, Lecture 8: Unsupervised Learning
Prof. Daniel Yeung, School of Computer Science and Engineering, South China University of Technology

Outline: Introduction. Unsupervised Learning. Parametric vs. Non-Parametric Approach: Mixture of Densities, Maximum-Likelihood Estimates. Clustering: Similarity (Dissimilarity) Measure, Criterion Function, Clustering Algorithm.

Supervised vs. Unsupervised. Recall supervised learning: labels are given; someone (a supervisor) provides the true answer. In unsupervised learning no labels are given, which makes it much harder than supervised learning; it is also called "learning without a teacher". You never know the true, correct answer, so how do you evaluate the result?

Examples of unsupervised learning problems: new medical cases, new stars, new species, chess game playing, performance evaluation (a result may be satisfactory today but unsatisfactory by next year's standard).

Unsupervised Learning: How to evaluate the result? External evaluation: expert comments (but the expert may be wrong). Internal evaluation: objective functions, e.g. the distance between samples and cluster centers; very intuitive. Unlike supervised learning, the evaluation method is subjective.

Why unsupervised learning? Labels are expensive, especially for huge datasets (e.g. medical applications). There may be no idea of the number of classes (data mining). It can give some insight into the data structure before designing classifiers (e.g. for feature selection).

Unsupervised Learning: Data Structure. Parametric approach: assume the structure of the distribution is known and only estimate the parameters of the distribution, e.g. by maximum-likelihood estimation. Non-parametric approach: no assumption on the distribution; group the data into clusters so that samples in the same group share something in common, e.g. by a clustering method.

Parametric Approach: Mixture of Densities. Assume: samples come from known classes $\omega_1, \dots, \omega_c$; the prior probabilities $P(\omega_j)$ are known; the forms of the class-conditional probability densities $p(x \mid \omega_j, \theta_j)$ are known, but their parameters $\theta_j$ are unknown; the category labels are unknown. Each class $\omega_j$ has a different density with parameters $\theta_j$, and the samples are drawn from the mixture density $p(x \mid \theta) = \sum_{j=1}^{c} p(x \mid \omega_j, \theta_j)\, P(\omega_j)$.
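
As a small illustration of the mixture-of-densities setup, the sketch below (a minimal Python example, not taken from the lecture; the two Gaussian components, their priors and their parameters are assumptions) evaluates $p(x \mid \theta) = \sum_j P(\omega_j)\, p(x \mid \omega_j, \theta_j)$ at a few points.

```python
import numpy as np

def gauss_pdf(x, mu, sigma):
    """1-D normal density p(x | w_j, theta_j) with theta_j = (mu, sigma)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (np.sqrt(2.0 * np.pi) * sigma)

def mixture_density(x, priors, means, sigmas):
    """p(x | theta) = sum_j P(w_j) * p(x | w_j, theta_j)."""
    return sum(P * gauss_pdf(x, mu, s) for P, mu, s in zip(priors, means, sigmas))

# Hypothetical two-component mixture: priors and component form known, parameters to be estimated.
x = np.array([-2.0, 0.0, 2.0])
print(mixture_density(x, priors=[1/3, 2/3], means=[-2.0, 2.0], sigmas=[1.0, 1.0]))
```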

Parametric Approach: Maximum-Likelihood Estimates. A method to find a solution for the unknown parameter vector $\theta$. Given a set $D = \{x_1, \dots, x_n\}$ of unlabeled samples drawn independently from the mixture density, the maximum-likelihood estimate $\hat\theta$ is the value of $\theta$ that maximizes $p(D \mid \theta) = \prod_{k=1}^{n} p(x_k \mid \theta)$ (Equ #; refer to the earlier lecture on maximum-likelihood estimation).

Let $l$ be the logarithm of the likelihood. Taking $\ln$ of Equ #, we have $l = \ln p(D \mid \theta) = \sum_{k=1}^{n} \ln p(x_k \mid \theta)$; the maximum of $l$ yields the maximum of $p(D \mid \theta)$. Taking the gradient of $l$ with respect to $\theta_i$ gives $\nabla_{\theta_i} l = \sum_{k=1}^{n} P(\omega_i \mid x_k, \theta)\, \nabla_{\theta_i} \ln p(x_k \mid \omega_i, \theta_i)$, where $P(\omega_i \mid x_k, \theta) = p(x_k \mid \omega_i, \theta_i) P(\omega_i) / p(x_k \mid \theta)$. The maximum-likelihood estimate $\hat\theta_i$ must therefore satisfy the conditions $\sum_{k=1}^{n} P(\omega_i \mid x_k, \hat\theta)\, \nabla_{\theta_i} \ln p(x_k \mid \omega_i, \hat\theta_i) = 0$ for $i = 1, \dots, c$.

Application to Normal Mixtures. Assume the component density is multivariate normal, $p(x \mid \omega_i, \theta_i) \sim N(\mu_i, \Sigma_i)$. Different cases arise depending upon which parameters are known and which are unknown: Case 1 (only the means $\mu_i$ unknown; $\Sigma_i$, $P(\omega_i)$ and $c$ known) is the simplest and will be considered in detail. Case 2 ($\mu_i$, $\Sigma_i$ and $P(\omega_i)$ unknown; $c$ known) is more realistic, though somewhat more involved. Case 3 (everything unknown, including the number of classes $c$) represents the problem we face on encountering a completely unknown set of data.

Application to Normal Mixtures: Unknown Mean (Variance known). The likelihood function is $p(x \mid \omega_i, \mu_i) = \frac{1}{(2\pi)^{d/2} |\Sigma_i|^{1/2}} \exp\!\big({-\tfrac{1}{2}} (x - \mu_i)^t \Sigma_i^{-1} (x - \mu_i)\big)$. The log of the likelihood is $\ln p(x \mid \omega_i, \mu_i) = -\tfrac{1}{2} \ln\!\big((2\pi)^d |\Sigma_i|\big) - \tfrac{1}{2} (x - \mu_i)^t \Sigma_i^{-1} (x - \mu_i)$, and its derivative is $\nabla_{\mu_i} \ln p(x \mid \omega_i, \mu_i) = \Sigma_i^{-1} (x - \mu_i)$.
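
The quantity driving these conditions is the posterior $P(\omega_i \mid x_k, \theta)$. The sketch below (an illustrative Python example; the 1-D Gaussian components and the sample values are assumptions, not the lecture's data) computes it for a two-component mixture.

```python
import numpy as np

def gauss_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (np.sqrt(2.0 * np.pi) * sigma)

def posteriors(x, priors, means, sigmas):
    """P(w_i | x_k, theta): one row per sample x_k, one column per component w_i."""
    x = np.asarray(x, dtype=float)[:, None]
    joint = np.array(priors) * gauss_pdf(x, np.array(means), np.array(sigmas))  # p(x|w_i,theta_i) P(w_i)
    return joint / joint.sum(axis=1, keepdims=True)                             # divide by p(x|theta)

# At the ML solution, sum_k P(w_i|x_k) * grad ln p(x_k|w_i, theta_i) must vanish for every i.
print(posteriors([-2.5, 0.3, 1.9], priors=[1/3, 2/3], means=[-2.0, 2.0], sigmas=[1.0, 1.0]))
```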

Application to Normal Mixtures: Unknown Mean Vectors (Variance known). According to the ML conditions above, the maximum-likelihood estimate $\hat\mu_i$ must satisfy $\sum_{k=1}^{n} P(\omega_i \mid x_k, \hat\mu)\, \Sigma_i^{-1} (x_k - \hat\mu_i) = 0$, i.e. $\hat\mu_i = \dfrac{\sum_{k=1}^{n} P(\omega_i \mid x_k, \hat\mu)\, x_k}{\sum_{k=1}^{n} P(\omega_i \mid x_k, \hat\mu)}$.

An iterative method is applied. After the terms are rearranged (Equ **): $\hat\mu_i(j{+}1) = \dfrac{\sum_{k=1}^{n} P(\omega_i \mid x_k, \hat\mu(j))\, x_k}{\sum_{k=1}^{n} P(\omega_i \mid x_k, \hat\mu(j))}$, where $\hat\mu_i$ cannot be calculated explicitly and $\hat\mu_i(0)$ is set to a starting point in the initialization.

Example: 25 one-dimensional samples drawn from a two-component mixture with true means $\mu_1 = -2$ and $\mu_2 = 2$, $P(\omega_1) = 0.33$ and $P(\omega_2) = 0.67$ (we don't know which sample belongs to which class, but we do know that 1/3 belong to $\omega_1$ and 2/3 to $\omega_2$), and known variance. The true means are exactly the information that will not be given.

By using the iterative scheme (a hill-climbing procedure on the likelihood), different starting points yield different solutions. Two locally optimal solutions were obtained by using the iterative update (Equ **): $\hat\mu_a$: $\hat\mu_1 = -2.130$, $\hat\mu_2 = 1.668$; $\hat\mu_b$: $\hat\mu_1 = 2.085$, $\hat\mu_2 = -1.257$. Both $\hat\mu_a$ and $\hat\mu_b$ are locally optimal estimates of $\mu_1 = -2$, $\mu_2 = 2$.
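
A runnable sketch of the update Equ ** for the 1-D, known-variance case (the random samples below merely imitate the kind of mixture described in the example; they are not the lecture's 25 data points).

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative data only: 25 samples from a mixture like the one described above.
x = np.concatenate([rng.normal(-2.0, 1.0, 8), rng.normal(2.0, 1.0, 17)])
priors = np.array([1/3, 2/3])

def iterate_means(x, mu0, priors, sigma=1.0, n_iter=50):
    mu = np.asarray(mu0, dtype=float)
    for _ in range(n_iter):
        lik = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2)   # p(x_k | w_i, mu_i), up to a constant
        post = lik * priors
        post /= post.sum(axis=1, keepdims=True)                  # P(w_i | x_k, mu)
        mu = (post * x[:, None]).sum(axis=0) / post.sum(axis=0)  # Equ ** rearranged
    return mu

# Different starting points can converge to different local optima of the likelihood.
print(iterate_means(x, mu0=[-1.0, 1.0], priors=priors))
print(iterate_means(x, mu0=[1.0, -1.0], priors=priors))
```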

Application to Normal Mixtures: Unknown Mean Vectors. The solution is not unique, and because the prior probabilities differ ($P(\omega_1) \neq P(\omega_2)$) the two solutions are not simply relabelings of one another. Thus, when the mixture density is not identifiable, the maximum-likelihood solution is not unique. (Note: $p(x \mid \hat\mu_a)$ is a better estimate of the source density than $p(x \mid \hat\mu_b)$.)

Drawback of the Parametric Approach. It is very misleading if the sample distribution is different from the assumption, e.g. when a normal distribution is assumed. Datasets with the same mean and variance are considered to be the same by the parametric approach even if their shapes are completely different; the non-parametric approach can solve this problem.

Non-parametric Approach: Clustering. Objective: seek the natural clusters in the data. What is a good clustering result? Internal (within a cluster): distances should be small. External (between clusters): distances should be large.

Non-parametric Clustering: Three Important Factors. Similarity (dissimilarity) measure: how similar are two samples? Criterion function: what kind of clustering result is expected? Clustering algorithm: e.g. how to optimize the criterion function.

Non-parametric Clustering: Similarity Measure. There is no best measure for all cases; the choice is application dependent. Examples: for face recognition the measure should have rotation invariance (a rotated face should still be similar to the original); for character recognition there should be NO rotation invariance (a rotated character should be treated as different).

The scale of features may be very different: different ranges (e.g. weight and waist width span very different numeric intervals) and different units (km vs. mile, cm vs. meter). Should features be normalized? Not always: if the spread is due to the presence of clusters, normalization reduces the separation between the clusters.

Common measures: Euclidean distance $d(x_1, x_2) = \sqrt{\sum_{k=1}^{n} (x_{1k} - x_{2k})^2}$, e.g. $d([1,3],[5,1]) = \sqrt{(1-5)^2 + (3-1)^2} = \sqrt{20}$; Manhattan distance $d(x_1, x_2) = \sum_{k=1}^{n} |x_{1k} - x_{2k}|$; cosine similarity $s(x_1, x_2) = \dfrac{x_1^t x_2}{\lVert x_1 \rVert\, \lVert x_2 \rVert}$.

Other examples: Mahalanobis distance (diagonal, per-feature-variance form) $d(x_1, x_2) = \sqrt{\sum_{k=1}^{n} \dfrac{(x_{1k} - x_{2k})^2}{\sigma_k^2}}$; Chebyshev distance $d(x_1, x_2) = \max_{1 \le k \le n} |x_{1k} - x_{2k}|$.
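
The measures above can be written in a few lines; the sketch below is a straightforward NumPy rendering (the sample vectors and per-feature variances are arbitrary, chosen only for illustration).

```python
import numpy as np

def euclidean(a, b):  return np.sqrt(np.sum((a - b) ** 2))
def manhattan(a, b):  return np.sum(np.abs(a - b))
def cosine_sim(a, b): return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
def chebyshev(a, b):  return np.max(np.abs(a - b))
def per_feature_mahalanobis(a, b, sigma):
    # Diagonal (per-feature variance) form used on the slide.
    return np.sqrt(np.sum((a - b) ** 2 / sigma ** 2))

x1, x2 = np.array([1.0, 3.0]), np.array([5.0, 1.0])
print(euclidean(x1, x2), manhattan(x1, x2), cosine_sim(x1, x2),
      chebyshev(x1, x2), per_feature_mahalanobis(x1, x2, np.array([2.0, 1.0])))
```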

Non-parametric Clustering: Simplest Clustering Algorithm. A simple clustering algorithm can be developed once a similarity measure is defined: for each pair of samples, group the samples into the same cluster if the distance between them is less than a particular threshold $d_0$. Advantage: easy to understand and simple to implement. Disadvantage: highly dependent on the threshold $d_0$. Examples: a large $d_0$ gives one cluster; a medium $d_0$ gives a reasonable result; a small $d_0$ makes every sample its own cluster.

Non-parametric Clustering: Criterion Function. Sum-of-Squared-Error (SSE) criterion. The mean of the samples in cluster $D_i$ is $m_i = \dfrac{1}{n_i} \sum_{x \in D_i} x$, and the SSE criterion is $J_e = \sum_{i=1}^{c} \sum_{x \in D_i} \lVert x - m_i \rVert^2$, where $n_i$ is the number of samples in $D_i$ and $c$ is the number of clusters.

Example: with a first cluster $\{[5,4], [5,5], [6,4], [6,5]\}$ whose mean is $m_1 = [5.5, 4.5]$ and a second cluster of five samples whose mean is $m_2 = [1.8, 1.6]$, $J_e$ is obtained by adding up the squared distance of every sample to its own cluster mean, e.g. the sample $[5,4]$ contributes $(5-5.5)^2 + (4-4.5)^2 = 0.5$.
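
A minimal sketch of the threshold-based grouping described above (assuming Euclidean distance and treating a cluster as a connected group of samples linked by distances below $d_0$; the small dataset and the thresholds are arbitrary).

```python
import numpy as np

def threshold_clustering(X, d0):
    """Group samples whose pairwise distance is below d0 (transitively)."""
    n = len(X)
    labels = -np.ones(n, dtype=int)
    current = 0
    for i in range(n):
        if labels[i] != -1:
            continue
        stack, labels[i] = [i], current
        while stack:                      # flood-fill the "distance < d0" graph
            j = stack.pop()
            near = np.linalg.norm(X - X[j], axis=1) < d0
            for k in np.where(near & (labels == -1))[0]:
                labels[k] = current
                stack.append(k)
        current += 1
    return labels

X = np.array([[5, 4], [5, 5], [6, 4], [6, 5], [1, 1], [2, 1], [1, 2]], dtype=float)
for d0 in (0.5, 2.0, 10.0):               # small, medium, large threshold
    print(d0, threshold_clustering(X, d0))
```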

Non-parametric Clustering: Criterion Function. A smaller SSE is preferred: of two candidate partitions of the same data, the one with the lower $J_e$ is chosen.

Is SSE a good criterion for all situations? No. It is appropriate when the clusters form compact groups of roughly equal size. It is not appropriate when the natural groupings have very different sizes: the partition that looks more reasonable may not be the one with the smaller SSE. For example, a result that separates a small natural group from a large one can be more reasonable and yet have a larger value of SSE because of the large cluster, so minimizing $J_e$ selects the other partition.

SSE can be rewritten as $J_e = \dfrac{1}{2} \sum_{i=1}^{c} n_i \bar{s}_i$ with $\bar{s}_i = \dfrac{1}{n_i^2} \sum_{x \in D_i} \sum_{x' \in D_i} \lVert x - x' \rVert^2$, where $\bar{s}_i$ is the average squared distance between points in the $i$-th cluster and $m_i = \dfrac{1}{n_i} \sum_{x \in D_i} x$. $J_e$ can also be replaced by other criterion functions, e.g. ones built from the within-cluster scatter matrix, the between-cluster scatter matrix, and the total scatter matrix.
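
A quick numerical check (synthetic data and random labels, purely illustrative) that the two expressions for $J_e$ above agree: the direct sum of squared distances to the cluster means and $\tfrac{1}{2}\sum_i n_i \bar{s}_i$ with $\bar{s}_i$ the average squared pairwise distance inside cluster $i$.

```python
import numpy as np

def sse_direct(X, labels):
    return sum(np.sum((X[labels == c] - X[labels == c].mean(axis=0)) ** 2)
               for c in np.unique(labels))

def sse_pairwise(X, labels):
    total = 0.0
    for c in np.unique(labels):
        D = X[labels == c]
        n = len(D)
        diffs = D[:, None, :] - D[None, :, :]
        s_bar = np.sum(diffs ** 2) / n ** 2    # average squared pairwise distance
        total += 0.5 * n * s_bar
    return total

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 2))
labels = rng.integers(0, 3, size=20)
print(sse_direct(X, labels), sse_pairwise(X, labels))  # identical up to rounding
```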

Non-parametric Clustering: Clustering Algorithm. The goal is to find the optimal clustering result, but exhaustive search is impossible: there are roughly $c^n / c!$ possible ways to partition $n$ samples into $c$ clusters. Practical methods: the iterative optimization algorithm (K-means) and hierarchical clustering (bottom-up approach, top-down approach).

Iterative Optimization Algorithm (similar to gradient descent): 1. Find a reasonable initial partition. 2. Move sample(s) from one cluster to another such that the objective function is improved. 3. Repeat step 2 until stable. Example (figure): moving one sample from cluster "x" to cluster "o" decreases $J_e$.

K-means, a popular iterative optimization technique. Criterion function: $J_e = \sum_{i=1}^{c} \sum_{x \in D_i} \lVert x - m_i \rVert^2$. Assume there are $c$ ($= k$) classes; $k = 3$ is used in the following example. 1. Initialization: randomly assign the center of each cluster. 2. Assign samples: assign each sample to the closest center. 3. Re-calculate the means: compute the new means from the newly assigned samples. Repeat steps 2 and 3 until stable (no sample moves again).
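
A compact K-means sketch following the three steps above (centers initialized from randomly chosen data points; the synthetic data and $k = 3$ only mirror the flavour of the slide's example).

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]    # 1. initialization
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)                              # 2. assign to the closest center
        new_centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):                  # stable: no center moves
            break
        centers = new_centers                                  # 3. recompute the means
    return labels, centers

rng = np.random.default_rng(42)
X = np.vstack([rng.normal(m, 0.5, size=(30, 2)) for m in ([0, 0], [4, 0], [2, 3])])
labels, centers = kmeans(X, k=3)
print(centers)
```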

(Figure: 1st step, 2nd step, 3rd step of K-means on a toy dataset. The centers change in the first two steps; once the centers are not changed, K-means stops.)

Clustering: Iterative Optimization Algorithm, K-means. It is a smart scheme that decreases the objective function: fast, efficient and popular, and the algorithm converges after a finite number of iterations of steps 2 and 3. However, similar to gradient descent, the K-means method may be trapped at a local minimum rather than the global minimum.

Non-parametric Clustering: Hierarchical Clustering. So far only disjoint clusters have been discussed. In some situations clusters may have subclusters, and so on; a hierarchical classification (taxonomy) is an example.

Hierarchical Clustering: Dendrogram. A suitable way to represent a hierarchical clustering is a dendrogram. Example: use a binary tree built bottom-up with a pairwise-distance similarity measure. (Figure: sample points and the corresponding dendrogram.)
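
If SciPy and Matplotlib are available, a dendrogram for a small toy dataset can be drawn directly; this is only a sketch of the idea, not material from the lecture (`linkage` builds the bottom-up merge tree, `dendrogram` plots it).

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (5, 2)), rng.normal(3, 0.3, (5, 2))])  # toy sample points

Z = linkage(X, method="average")   # bottom-up pairwise-distance merges
dendrogram(Z)                      # binary-tree view of the merge order
plt.show()
```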

Hierarchical Clustering: Venn Diagram. A Venn diagram can also show a hierarchical clustering (refer to the last slide on the dendrogram). However, no quantitative information (pairwise distance) can be conveyed in a Venn diagram.

Hierarchical Clustering: Two Types. Divisive (top-down) approach: start with 1 cluster containing all samples and form the hierarchy by splitting the most dissimilar clusters. Agglomerative (bottom-up) approach: start with $n$ clusters, each containing one sample, and form the hierarchy by merging the most similar clusters; it is not efficient when the number of samples is large but only a few clusters are needed.

Top-Down Approach. Any iterative optimization algorithm can be applied by setting $c = 2$ and repeatedly splitting a cluster into two.

Bottom-Up Approach. Algorithm: do, while more than one cluster remains: 1. calculate the distance between clusters for all cluster pairs; 2. merge the nearest two clusters. Four common cluster-distance measures: minimum distance $d_{\min}(D_i, D_j) = \min_{x \in D_i,\, x' \in D_j} \lVert x - x' \rVert$; maximum distance $d_{\max}(D_i, D_j) = \max_{x \in D_i,\, x' \in D_j} \lVert x - x' \rVert$; average distance $d_{\mathrm{avg}}(D_i, D_j) = \dfrac{1}{n_i n_j} \sum_{x \in D_i} \sum_{x' \in D_j} \lVert x - x' \rVert$; mean distance $d_{\mathrm{mean}}(D_i, D_j) = \lVert m_i - m_j \rVert$.
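
A sketch of the bottom-up loop with the four cluster-distance options above (plain NumPy; stopping once a requested number of clusters remains is an assumption made for illustration, whereas the slides keep merging until a single cluster is left).

```python
import numpy as np
from itertools import combinations

def cluster_distance(A, B, mode):
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)   # all pairwise distances
    if mode == "min":     return d.min()                         # single linkage
    if mode == "max":     return d.max()                         # complete linkage
    if mode == "average": return d.mean()
    if mode == "mean":    return np.linalg.norm(A.mean(0) - B.mean(0))

def agglomerative(X, n_clusters, mode="min"):
    clusters = [[i] for i in range(len(X))]                      # start: one sample per cluster
    while len(clusters) > n_clusters:
        i, j = min(combinations(range(len(clusters)), 2),
                   key=lambda p: cluster_distance(X[clusters[p[0]]], X[clusters[p[1]]], mode))
        clusters[i] += clusters.pop(j)                           # merge the nearest pair
    return clusters

X = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]], dtype=float)
print(agglomerative(X, n_clusters=2, mode="min"))
```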

Bottom-Up Approach: Single Linkage (Nearest Neighbor). Initially every point forms a cluster; the minimum distance between clusters is used when merging the two nearest clusters. It encourages the growth of elongated clusters. Disadvantage: sensitive to noise. (Figure: the minimum is the shortest distance between two clusters; ideal case vs. noisy data.)

Bottom-Up Approach: Complete Linkage (Farthest Neighbor). Initially every point forms a cluster; the maximum distance between clusters is used when merging the two nearest clusters. It encourages compact clusters but does not work well if elongated clusters are present. (Figure: ideally D1 and D3 should be merged, but since d2 < d1 under complete linkage, D1 and D2 are merged instead.)

The minimum and maximum distances are extremely sensitive to noise samples, since their measurement involves minima or maxima. The result is more robust to outliers when the average or the mean distance is used; the mean distance is also less time-consuming to compute than the average distance.

Clustering: Number of Clusters. How to decide the number of clusters? A possible solution: try a range of values of $c$ and see which one has the lowest criterion value.
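
A sketch of scanning $c$ with the SSE criterion (a self-contained basic K-means is re-implemented here just to obtain $J_e$; the data are synthetic). Note that the raw $J_e$ keeps decreasing as $c$ grows, so in practice one looks for the value of $c$ beyond which the improvement levels off.

```python
import numpy as np

def kmeans_sse(X, c, seed=0, n_iter=50):
    """Run a basic K-means with c clusters and return the final J_e."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), c, replace=False)]
    for _ in range(n_iter):
        labels = np.linalg.norm(X[:, None] - centers[None], axis=2).argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
                            for j in range(c)])
    labels = np.linalg.norm(X[:, None] - centers[None], axis=2).argmin(axis=1)
    return np.sum((X - centers[labels]) ** 2)

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(m, 0.4, (25, 2)) for m in ([0, 0], [3, 0], [0, 3])])
for c in range(1, 7):
    print(c, round(kmeans_sse(X, c), 2))   # J_e drops sharply up to the natural number of clusters
```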
