Region Segmentation Readings: Chapter 10: 10.1 Additional Materials Provided

Transcription:

Region Segmentation. Readings: Chapter 10: 10.1. Additional Materials Provided: K-means clustering (text), EM clustering (paper), Graph Partitioning (text), Mean-Shift clustering (paper).

Image Segmentation. Image segmentation is the operation of partitioning an image into a collection of connected sets of pixels: 1. into regions, which usually cover the image; 2. into linear structures, such as line segments and curve segments; 3. into 2D shapes, such as circles, ellipses, and ribbons (long, symmetric regions).

Example: Regions

Main Methods of Region Segmentation: 1. Region Growing, 2. Split and Merge, 3. Clustering.

Clustering. There are K clusters C_1, ..., C_K with means m_1, ..., m_K. The least-squares error is defined as
D = Σ_{k=1}^{K} Σ_{x_i ∈ C_k} ||x_i − m_k||².
Out of all possible partitions into K clusters, choose the one that minimizes D. Why don't we just do this? If we could, would we get meaningful objects?
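As a concrete reading of that formula, here is a minimal NumPy sketch (function and variable names are illustrative, not from the slides) that evaluates D for a given assignment of points to clusters:

```python
import numpy as np

def least_squares_error(points, labels, means):
    """D = sum over clusters k of sum over x_i in C_k of ||x_i - m_k||^2."""
    # points: (N, d) array; labels: (N,) cluster index per point; means: (K, d) array
    diffs = points - means[labels]      # x_i - m_k for the cluster k that x_i belongs to
    return float(np.sum(diffs ** 2))    # squared distances summed over all points
```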

K-Means Clustering. Form K-means clusters from a set of n-dimensional vectors. 1. Set the iteration count c to 1. 2. Choose randomly a set of K means m_1(1), ..., m_K(1). 3. For each vector x_i, compute D(x_i, m_k(c)) for k = 1, ..., K and assign x_i to the cluster with the nearest mean. 4. Increment c by 1 and update the means to get m_1(c), ..., m_K(c). 5. Repeat steps 3 and 4 until C_k(c) = C_k(c+1) for all k.
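A minimal sketch of these steps in NumPy (initialization from random data points and the stopping test are as described on the slide; the function name and RNG choice are my own):

```python
import numpy as np

def kmeans(X, K, seed=0):
    """K-means following the slide's steps: random means, assign, update, repeat."""
    X = np.asarray(X, dtype=float)
    rng = np.random.default_rng(seed)
    # Step 2: choose K data points at random as the initial means m_1(1), ..., m_K(1)
    means = X[rng.choice(len(X), size=K, replace=False)].copy()
    labels = np.full(len(X), -1)
    while True:
        # Step 3: assign each vector x_i to the cluster with the nearest mean
        dists = np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2)  # (N, K)
        new_labels = dists.argmin(axis=1)
        # Step 5: stop when no assignment changes between iterations
        if np.array_equal(new_labels, labels):
            return labels, means
        labels = new_labels
        # Step 4: update each mean to the centroid of its current cluster
        for k in range(K):
            if np.any(labels == k):
                means[k] = X[labels == k].mean(axis=0)
```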

K-Means Example 1

K-Means Example 2

K-Means Example 3

K-means Variants: different ways to initialize the means; different stopping criteria; dynamic methods for determining the right number of clusters K for a given image; the EM Algorithm: a probabilistic formulation of K-means.

K-Means. Boot Step: Initialize K clusters C_1, ..., C_K; each cluster C_j is represented by its mean m_j. Iteration Step: Estimate the cluster for each data point; re-estimate the cluster parameters.

K-Means Example

K-Means Example: Where do the red points belong?

K-means vs. EM:
- Cluster representation: K-means uses the mean; EM uses mean, variance, and weight.
- Cluster initialization: K-means randomly selects K means; EM initializes K Gaussian distributions.
- Expectation: K-means assigns each point to the closest mean; EM soft-assigns each point to each distribution.
- Maximization: K-means computes the means of the current clusters; EM computes new parameters of each distribution.

Notation: N(µ, σ) is a 1D normal (Gaussian) distribution with mean µ and standard deviation σ (so the variance is σ²).

N(µ, Σ) is a multivariate Gaussian distribution with mean µ and covariance matrix Σ. What is a covariance matrix? For (R, G, B) data:
Σ = [ σ_R²  σ_RG  σ_RB ;  σ_GR  σ_G²  σ_GB ;  σ_BR  σ_BG  σ_B² ]
variance(X): σ_X² = (1/N) Σ_i (x_i − µ_X)²
cov(X, Y): σ_XY = (1/N) Σ_i (x_i − µ_X)(y_i − µ_Y)
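A small illustrative computation of such a covariance matrix in NumPy (the random pixel data is a stand-in; note the slide's definition uses a 1/N factor, so the np.cov equivalent needs bias=True):

```python
import numpy as np

# pixels: N rows of (R, G, B) values; random data here just to make the snippet runnable
pixels = np.random.default_rng(0).random((1000, 3))

mu = pixels.mean(axis=0)                     # mean vector (µ_R, µ_G, µ_B)
centered = pixels - mu
Sigma = centered.T @ centered / len(pixels)  # 3x3 covariance matrix with the 1/N factor
# Equivalent: np.cov(pixels, rowvar=False, bias=True)
```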

1. Suppose we have a set of clusters C_1, C_2, ..., C_K over a set of data points X = {x_1, x_2, ..., x_N}. P(C_j) is the probability (or weight) of cluster C_j. P(C_j | x_i) is the probability of cluster C_j given point x_i. P(x_i | C_j) is the probability of point x_i belonging to cluster C_j. 2. Suppose that a cluster C_j is represented by a Gaussian distribution N(µ_j, σ_j). Then for any point x_i, P(x_i | C_j) is given by the Gaussian density with parameters µ_j and σ_j.

EM: Expectation-Maximization. Boot Step: Initialize K clusters C_1, ..., C_K, with (µ_j, Σ_j) and P(C_j) for each cluster j. Iteration Step: Expectation: estimate the cluster of each data point; Maximization: re-estimate the cluster parameters (µ_j, Σ_j), P(C_j) for each cluster.

1-D EM with Gaussian Distributions. Each cluster j is represented by a Gaussian distribution N(µ_j, σ_j). Initialization: for each cluster j, initialize its mean µ_j, variance σ_j², and weight α_j: N(µ_1, σ_1) with α_1 = P(C_1), N(µ_2, σ_2) with α_2 = P(C_2), N(µ_3, σ_3) with α_3 = P(C_3).

Expectation. For each point x_i and each cluster C_j, compute P(C_j | x_i):
P(C_j | x_i) = p(x_i | C_j) P(C_j) / p(x_i), where p(x_i) = Σ_j p(x_i | C_j) P(C_j).
Where do we get p(x_i | C_j) and P(C_j)?

1. Use the pdf for a normal distribution: p(x_i | C_j) = (1 / (σ_j √(2π))) exp(−(x_i − µ_j)² / (2σ_j²)). 2. Use α_j = P(C_j) from the current parameters of cluster C_j.
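A minimal 1-D sketch of this Expectation step (the Gaussian pdf plus Bayes' rule); the function names and argument layout are illustrative:

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    """p(x | C_j) for a 1-D Gaussian N(mu, sigma)."""
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

def e_step(x, mus, sigmas, alphas):
    """Return P(C_j | x_i) for every point i (rows) and cluster j (columns)."""
    # Numerators p(x_i | C_j) * P(C_j), one column per cluster
    joint = np.stack([a * normal_pdf(x, m, s)
                      for m, s, a in zip(mus, sigmas, alphas)], axis=1)
    return joint / joint.sum(axis=1, keepdims=True)   # divide by p(x_i)
```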

Maximization. Having computed P(C_j | x_i) for each point x_i and each cluster C_j, use them to compute a new mean, variance, and weight for each cluster:
µ_j = Σ_i P(C_j | x_i) x_i / Σ_i P(C_j | x_i)
σ_j² = Σ_i P(C_j | x_i) (x_i − µ_j)² / Σ_i P(C_j | x_i)
P(C_j) = Σ_i P(C_j | x_i) / N

Multi-Dimensional Expectation Step for Color Image Segmentation. Input (known): data points x_1 = {r_1, g_1, b_1}, x_2 = {r_2, g_2, b_2}, ..., x_i = {r_i, g_i, b_i}, ... Input (estimation): cluster parameters (µ_1, Σ_1), p(C_1) for C_1; (µ_2, Σ_2), p(C_2) for C_2; ...; (µ_k, Σ_k), p(C_k) for C_k. Output (classification results): p(C_j | x_i) = p(x_i | C_j) p(C_j) / Σ_j p(x_i | C_j) p(C_j).

Multi-Dimensional Maximization Step for Color Image Segmentation. Input (known): data points x_i = {r_i, g_i, b_i}. Input (estimation): p(C_j | x_i) for each point and cluster. Output (new cluster parameters):
µ_j = Σ_i p(C_j | x_i) x_i / Σ_i p(C_j | x_i)
Σ_j = Σ_i p(C_j | x_i) (x_i − µ_j)(x_i − µ_j)^T / Σ_i p(C_j | x_i)
p(C_j) = Σ_i p(C_j | x_i) / N

Full EM Algorithm (Multi-Dimensional). Boot Step: Initialize K clusters C_1, ..., C_K, with (µ_j, Σ_j) and P(C_j) for each cluster j. Iteration Step: alternate the Expectation Step and the Maximization Step, using the equations above, until the parameters converge.
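A compact sketch of the full multi-dimensional loop under the same notation, using scipy.stats.multivariate_normal for p(x_i | C_j); the fixed iteration count and the small diagonal regularization term are my additions, not part of the slides:

```python
import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(X, K, n_iter=50, seed=0):
    """EM for a mixture of K Gaussians: E-step posteriors, M-step parameter updates."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    # Boot step: means from random points, identity covariances, uniform weights
    mus = X[rng.choice(N, K, replace=False)].astype(float)
    Sigmas = np.array([np.eye(d) for _ in range(K)])
    weights = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # Expectation: P(C_j | x_i) proportional to p(x_i | C_j) P(C_j)
        resp = np.stack([w * multivariate_normal.pdf(X, mean=m, cov=S)
                         for w, m, S in zip(weights, mus, Sigmas)], axis=1)
        resp /= resp.sum(axis=1, keepdims=True)
        # Maximization: new mean, covariance, and weight for each cluster
        Nk = resp.sum(axis=0)                      # effective number of points per cluster
        mus = (resp.T @ X) / Nk[:, None]
        for j in range(K):
            diff = X - mus[j]
            Sigmas[j] = (resp[:, j, None] * diff).T @ diff / Nk[j] + 1e-6 * np.eye(d)
        weights = Nk / N
    return mus, Sigmas, weights
```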

Visualizing EM Clusters: ellipses show one, two, and three standard deviations from the mean.

EM Demo. Demo: http://www.neurosci.aist.go.jp/~akaho/mixtureem.html. Example: http://www-2.cs.cmu.edu/~awm/tutorials/gmm13.pdf

EM Applications. Blobworld: image segmentation using Expectation-Maximization and its application to image querying. Yi's generative/discriminative learning of object classes in color images.

Blobworld: Sample Results

Jianbo Shi's Graph Partitioning. An image is represented by a graph whose nodes are pixels or small groups of pixels. The goal is to partition the vertices into disjoint sets so that the similarity within each set is high and across different sets is low.

Minimal Cuts. Let G = (V, E) be a graph. Each edge (u, v) has a weight w(u, v) that represents the similarity between u and v. Graph G can be broken into two disjoint graphs with node sets A and B by removing edges that connect these sets. Let cut(A, B) = Σ_{u∈A, v∈B} w(u, v). One way to segment G is to find the minimal cut.

cut(A, B) = Σ_{u∈A, v∈B} w(u, v). [Figure: node sets A and B joined by crossing edges with weights w1 and w2.]

Normalized Cut. Minimal cut favors cutting off small node groups, so Shi proposed the normalized cut:
Ncut(A, B) = cut(A, B) / assoc(A, V) + cut(A, B) / assoc(B, V)
where assoc(A, V) = Σ_{u∈A, t∈V} w(u, t) measures how much A is connected to the graph as a whole.
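A small sketch of evaluating that criterion for a given partition, assuming a symmetric weight matrix W and a boolean membership vector (names are illustrative):

```python
import numpy as np

def ncut_value(W, in_A):
    """Ncut(A,B) = cut(A,B)/assoc(A,V) + cut(A,B)/assoc(B,V) for symmetric weights W."""
    in_A = np.asarray(in_A, dtype=bool)
    in_B = ~in_A
    cut = W[np.ix_(in_A, in_B)].sum()   # total weight of edges crossing the partition
    assoc_A = W[in_A, :].sum()          # how much A is connected to the whole graph
    assoc_B = W[in_B, :].sum()
    return cut / assoc_A + cut / assoc_B
```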

Example Normalized Cut. [Figure: a weighted graph partitioned into node sets A and B.] Ncut(A, B) = cut(A, B)/21 + cut(A, B)/16.

Shi turned graph cuts into an eigenvector/eigenvalue problem. Set up a weighted graph G = (V, E): V is the set of N pixels; E is a set of weighted edges (weight w_ij gives the similarity between nodes i and j); a length-N vector d, where d_i is the sum of the weights from node i to all other nodes; an N x N matrix D, a diagonal matrix with d on its diagonal; and an N x N symmetric matrix W with W(i, j) = w_ij.

Let x be a characteristic vector of a set A of nodes: x_i = 1 if node i is in set A, x_i = -1 otherwise. Let y be a continuous approximation to x. Solve the system of equations (D − W) y = λ D y for the eigenvectors y and eigenvalues λ. Use the eigenvector y with the second smallest eigenvalue to bipartition the graph: the sign of each component y_i indicates whether node i goes to A or to B. If further subdivision is merited, repeat recursively.
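A minimal sketch of that eigenvector step using scipy's generalized symmetric eigensolver; splitting the second eigenvector at zero is a simplification of the full procedure (which searches over splitting points):

```python
import numpy as np
from scipy.linalg import eigh

def ncut_bipartition(W):
    """Solve (D - W) y = lambda * D y and split on the sign of the second-smallest eigenvector."""
    d = W.sum(axis=1)                 # d_i: sum of weights from node i to all other nodes
    D = np.diag(d)
    # Generalized eigenproblem; eigh returns eigenvalues in ascending order
    eigvals, eigvecs = eigh(D - W, D)
    y = eigvecs[:, 1]                 # eigenvector with the second smallest eigenvalue
    return y > 0                      # True -> node goes to set A, False -> set B
```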

How Shi used the procedure. Shi defined the edge weights w(i, j) by
w(i, j) = exp(−||F(i) − F(j)||² / σ_I) × { exp(−||X(i) − X(j)||² / σ_X) if ||X(i) − X(j)|| < r; 0 otherwise }
where X(i) is the spatial location of node i and F(i) is the feature vector for node i, which can be intensity, color, texture, or motion. The formula is set up so that w(i, j) is 0 for nodes that are too far apart.
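A sketch of building that weight matrix for small inputs (feature matrix F and spatial coordinates X as arrays; the function name and dense pairwise computation are my own choices and would not scale to full images):

```python
import numpy as np

def shi_weights(F, X, sigma_I, sigma_X, r):
    """w(i,j) = exp(-||F_i-F_j||^2/sigma_I) * exp(-||X_i-X_j||^2/sigma_X) if ||X_i-X_j|| < r, else 0."""
    dF2 = np.sum((F[:, None, :] - F[None, :, :]) ** 2, axis=2)  # squared feature distances
    dX2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)  # squared spatial distances
    W = np.exp(-dF2 / sigma_I) * np.exp(-dX2 / sigma_X)
    W[np.sqrt(dX2) >= r] = 0.0                                  # zero weight for nodes too far apart
    return W
```

This W can then be passed to the ncut_bipartition sketch above.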

Examples of Shi's clustering: see Shi's web page, http://www.cis.upenn.edu/~jshi/

Problems with Graph Cuts: need to know when to stop; very Slooooow. Problems with EM: local minima; need to know the number of segments; need to choose a generative model.

Mean-Shift Clustering. Simple, like K-means, but you don't have to select K. A statistical method, guaranteed to converge to a fixed number of clusters.

Finding Modes in a Histogram. How many modes are there? Easy to see, hard to compute.

Mean Shift [Comaniciu & Meer], Iterative Mode Search: 1. Initialize a random seed and window W. 2. Calculate the center of gravity (the mean) of W. 3. Translate the search window to the mean. 4. Repeat from Step 2 until convergence.

Numeric Example (must use the normalized histogram!). Window W is centered at 12. Bins: 10, 11, 12, 13, 14; counts H: 5, 4, 3, 2, 1; normalized N: 5/15, 4/15, 3/15, 2/15, 1/15. Mean = 10(5/15) + 11(4/15) + 12(3/15) + 13(2/15) + 14(1/15) = 11.33, so the window mean-shifts toward 11.33.
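The same arithmetic as a runnable check:

```python
import numpy as np

bins    = np.array([10, 11, 12, 13, 14])
counts  = np.array([5, 4, 3, 2, 1])
weights = counts / counts.sum()      # normalized histogram: 5/15, 4/15, 3/15, 2/15, 1/15
mean = np.sum(bins * weights)        # weighted mean of the window centered at 12
print(round(mean, 2))                # 11.33 -> the window shifts toward 11.33
```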

Mean Shift Approach: initialize a window around each point; see where it shifts, which determines which segment it is in; multiple points will shift to the same segment.

Segmentation Algorithm: First run the mean shift procedure for each data point x and store its convergence point z. Link together all the z's that are closer than 0.5 from each other to form clusters. Assign each point to its cluster. Eliminate small regions.
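A minimal sketch of that procedure (flat window of a chosen bandwidth, simple pairwise linking of convergence points; the small-region elimination step is omitted, and the names and parameters are illustrative):

```python
import numpy as np

def mean_shift_mode(x, X, bandwidth, n_iter=100):
    """Shift x to the mean of the points inside its window until it stops moving."""
    for _ in range(n_iter):
        in_window = np.linalg.norm(X - x, axis=1) < bandwidth
        new_x = X[in_window].mean(axis=0)
        if np.allclose(new_x, x):
            break
        x = new_x
    return x

def mean_shift_segment(X, bandwidth, link_tol=0.5):
    """Run mean shift from every point, then link convergence points closer than link_tol."""
    Z = np.array([mean_shift_mode(x, X, bandwidth) for x in X])  # convergence point z per point
    labels = np.full(len(X), -1)
    next_label = 0
    for i, z in enumerate(Z):
        for j in range(i):
            if np.linalg.norm(z - Z[j]) < link_tol:              # nearby modes -> same cluster
                labels[i] = labels[j]
                break
        if labels[i] == -1:
            labels[i] = next_label
            next_label += 1
    return labels
```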

Mean-shift for image segmentation: it is useful to take spatial information into account, so instead of (R, G, B), run in (R, G, B, x, y) space.

References:
Shi and Malik, "Normalized Cuts and Image Segmentation," Proc. CVPR 1997.
Carson, Belongie, Greenspan and Malik, "Blobworld: Image Segmentation Using Expectation-Maximization and its Application to Image Querying," IEEE PAMI, Vol. 24, No. 8, Aug. 2002.
Comaniciu and Meer, "Mean shift analysis and applications," Proc. ICCV 1999.