Clustering Color/Intensity Group together pixels of similar color/intensity.

Agglomerative Clustering. Cluster = connected pixels with similar color. Finding an optimal decomposition may be hard: for example, finding the k connected components of the image with the least color variation. A greedy algorithm makes this fast.

Clustering Algorithm.
Initialize: each pixel is a region, with the color of that pixel; its neighbors are the neighboring pixels.
Loop: find the two adjacent regions with the most similar color, and merge them to form a new region with:
- all pixels of these regions,
- the average color of these regions,
- all neighbors of either region.
Stopping condition: no two adjacent regions are similar, or k regions remain.
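The merging loop above can be sketched in a few lines. This is a minimal sketch that assumes a 1-D "image" (a list of intensities), so that a region's neighbors are simply the adjacent regions in the list; a 2-D image would additionally need a neighbor set per region.

```python
def agglomerate(pixels, k):
    """Greedily merge adjacent regions until k regions remain."""
    # Each region is (sum of intensities, pixel count); adjacency is
    # implicit in list order for a 1-D image.
    regions = [(p, 1) for p in pixels]
    while len(regions) > k:
        # Find the adjacent pair whose average colors are most similar.
        best_i, best_diff = 0, float("inf")
        for i in range(len(regions) - 1):
            (s1, n1), (s2, n2) = regions[i], regions[i + 1]
            diff = abs(s1 / n1 - s2 / n2)
            if diff < best_diff:
                best_i, best_diff = i, diff
        # Merge: the new region holds all pixels of both, and its color
        # is their average (sum and count are simply added).
        (s1, n1), (s2, n2) = regions[best_i], regions[best_i + 1]
        regions[best_i:best_i + 2] = [(s1 + s2, n1 + n2)]
    return [s / n for s, n in regions]   # average color of each region

print(agglomerate([20, 21, 2, 3, 2], 2))   # two regions: bright and dark
```

Storing each region as a running (sum, count) pair makes the "average color of these regions" update O(1) per merge.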

Example. [Slide figures: a small grid of pixel intensities (values around 20 and 2, later 9s) is merged step by step, with dots marking pixels merged so far, until the greedy algorithm reduces the grid to a few regions.]

Clustering complexity. n pixels. Initializing: O(n) time to compute the regions. Loop: O(n) time to find the closest pair of adjacent regions (this could be sped up, e.g. with a priority queue), and O(n) time to update distances to all neighbors. At most n passes through the loop, so O(n^2) time total.

Agglomerative Clustering: Discussion. Start with a definition of good clusters. Simple initialization. Greedy: take the step that seems to most improve the clustering. This is a very general, reasonable strategy that can be applied to almost any problem, but it is not guaranteed to produce a good-quality answer.

Parametric Clustering. Each cluster has a mean color/intensity and a radius of possible colors. For intensity, this is just dividing the histogram into regions. For color, it is like grouping 3-D points into spheres.

K-means clustering. Brute force is difficult because there are many possible spheres and many pixels. Assume all spheres have the same radius; then we just need the sphere centers. Iterative method: if we knew the centers, it would be easy to assign pixels to clusters; if we knew which pixels were in each cluster, it would be easy to find the centers. So: guess centers, assign pixels to clusters, pick new centers for the clusters, reassign pixels, and so on. (Matlab demo.)

Why is this better? With a greedy algorithm, once we make a decision we cannot undo it. With an iterative algorithm, we can make changes.

K-means Algorithm.
1. Initialize: pick k random cluster centers. Pick centers near the data; heuristics: uniform distribution in the range of the data, or randomly select data points.
2. Assign each point to the nearest center.
3. Make each center the average of the points assigned to it.
4. Go to step 2.
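The steps above can be sketched directly in Python for 1-D intensities. The data and the starting centers used in the demo call are illustrative choices, not prescribed by the algorithm.

```python
def kmeans(points, centers):
    """1-D k-means: iterate assign/update until the centers stop moving."""
    while True:
        # Step 2: assign each point to its nearest center.
        clusters = [[] for _ in centers]
        for p in points:
            nearest = min(range(len(centers)),
                          key=lambda j: (p - centers[j]) ** 2)
            clusters[nearest].append(p)
        # Step 3: move each center to the mean of its assigned points
        # (an empty cluster keeps its old center).
        new_centers = [sum(c) / len(c) if c else centers[j]
                       for j, c in enumerate(clusters)]
        # Converged when no center moves; otherwise go to step 2.
        if new_centers == centers:
            return centers
        centers = new_centers

print(kmeans([1, 3, 8, 11], [7, 10]))   # converges to [2.0, 9.5]
```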

Let's consider a simple example. Suppose we want to cluster black and white intensities, and we have the intensities 1, 3, 8, 11. Suppose we start with centers c1 = 7 and c2 = 10. We assign 1, 3, 8 to c1 and 11 to c2. Then we update c1 = (1+3+8)/3 = 4, c2 = 11. Then we assign 1, 3 to c1 and 8 and 11 to c2. Then we update c1 = 2, c2 = 9½. Then the algorithm has converged: no assignments change, so the centers don't change.

K-means Properties. We can think of this as trying to find the optimal solution to: given points p1, ..., pn, find centers c1, ..., ck and a mapping f: {p1, ..., pn} -> {c1, ..., ck} that minimizes C = (p1 - f(p1))^2 + ... + (pn - f(pn))^2. Every step reduces C: the mean is the point that minimizes the sum of squared distances to a set of points, so moving a center to the mean of its cluster reduces this sum; and when we reassign a point to a closer center, we reduce its distance to its cluster center. Convergence: C never increases and there are only a finite number of possible assignments, so the algorithm must eventually stop changing.
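The claim that every step reduces C can be checked numerically. A small sketch on illustrative 1-D data and starting centers:

```python
def cost(points, centers):
    # C = (p1 - f(p1))^2 + ... + (pn - f(pn))^2, with f(p) the nearest center.
    return sum(min((p - c) ** 2 for c in centers) for p in points)

points = [1, 3, 8, 11]
centers = [7.0, 10.0]
history = [cost(points, centers)]
for _ in range(5):
    # One k-means iteration: reassign points, then move centers to means.
    groups = {c: [] for c in centers}
    for p in points:
        groups[min(centers, key=lambda c: (p - c) ** 2)].append(p)
    centers = [sum(g) / len(g) if g else c for c, g in groups.items()]
    history.append(cost(points, centers))

print(history)                                        # C per iteration
assert all(a >= b for a, b in zip(history, history[1:]))  # never increases
```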

Local Minima. However, the algorithm might not find the best possible assignments and centers. Consider the points 0, 20, 32. Depending on initialization, k-means can converge to centers at 10, 32, or to centers at 0, 26. Heuristic solution: start with many random starting points and pick the best resulting solution.
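This sensitivity to initialization is easy to demonstrate. In this sketch the three points and the two sets of starting centers are illustrative picks; the two runs reach different fixed points, only one of which is globally optimal.

```python
def kmeans(points, centers):
    """1-D k-means, iterating until the centers stop moving."""
    while True:
        groups = [[] for _ in centers]
        for p in points:
            j = min(range(len(centers)), key=lambda j: (p - centers[j]) ** 2)
            groups[j].append(p)
        new = [sum(g) / len(g) if g else centers[j]
               for j, g in enumerate(groups)]
        if new == centers:
            return centers
        centers = new

pts = [0, 20, 32]
print(kmeans(pts, [5, 30]))   # one initialization
print(kmeans(pts, [9, 40]))   # another initialization, a different answer
```

Both answers are fixed points of the update, but the clustering {0}, {20, 32} has a much lower C than {0, 20}, {32}; restarting from several random initializations and keeping the lowest-cost result is the usual remedy.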

E-M. Like k-means with soft assignment: assign each point partly to all clusters, based on the probability that it belongs to each. Compute weighted averages (c_j) and a variance (σ). The soft assignment of point p_i to cluster j is

f_j(p_i) = exp(-(p_i - c_j)^2 / (2σ^2)) / Σ_k exp(-(p_i - c_k)^2 / (2σ^2)),

where the c_j are the cluster centers.
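One soft-assignment pass can be sketched as follows: an E-step computing the weights f_j(p_i) from the formula above, then an M-step taking weighted averages for the new centers. The data, the starting centers, and the fixed sigma are illustrative assumptions (full EM would also re-estimate the variance).

```python
from math import exp

def soft_assign(points, centers, sigma):
    # E-step: f_j(p_i) proportional to exp(-(p_i - c_j)^2 / (2 sigma^2)),
    # normalized over j so each point's weights sum to 1.
    weights = []
    for p in points:
        w = [exp(-((p - c) ** 2) / (2 * sigma ** 2)) for c in centers]
        total = sum(w)
        weights.append([x / total for x in w])
    return weights

def update_centers(points, weights, k):
    # M-step: each center is the weighted average of *all* points.
    return [sum(w[j] * p for p, w in zip(points, weights)) /
            sum(w[j] for w in weights) for j in range(k)]

points, centers, sigma = [1, 3, 8, 11], [2.0, 9.0], 2.0
w = soft_assign(points, centers, sigma)
centers = update_centers(points, w, len(centers))
print(centers)   # centers shift slightly, pulled by distant points
```

Because every point carries some weight in every cluster, a cluster can creep toward nearby points even when they are currently closer to another center.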

Example. Matlab: tutorial2. Fuzzy assignment allows a cluster to creep toward nearby points and capture them.

E-M/K-Means domains. We used color/intensity as an example, but the same methods can be applied whenever a group is described by parameters and distances: lines (circles, ellipses), independent motions, textures (a little harder).