Introduction

The Earth Mover's distance (EMD) was first introduced by Rubner et al. [11, 12] for color and texture images. This distance can be calculated between two collections of points whenever there is a measure of similarity between any two points. It was shown to work well for image retrieval [12], and for texture classification and segmentation [9].

Properties: The EMD allows for partial matches, so it is convenient for image retrieval, and to some extent it mimics the human perception of texture similarities. Efficient algorithms exist for computing it, especially in one dimension. However, it is a purely heuristic distance.

History: Although implementations and applications vary, the idea of the EMD has been in use for many years. The idea of matching up the closest values (a special case of the EMD) appeared in vision in 1983 [13]. In statistics, there is an equivalent metric on probability distributions known as the Mallows, or Wasserstein, distance, with a clear probabilistic interpretation, introduced by Mallows in 1972 [7]. The use of this metric in the physics and probability literatures dates back to the 1940s [10].

Here we give the probabilistic interpretation of the EMD and discuss various statistical issues arising from this interpretation.
The Earth Mover's Distance

The EMD is defined as a distance between two "signatures" of the form P = {(x_1, p_1), ..., (x_m, p_m)} and Q = {(y_1, q_1), ..., (y_n, q_n)}. Signatures represent data by centers of data clusters and the number of points in these clusters. They are not necessarily normalized to have the same mass.

The EMD is defined by an optimal flow F = (f_ij), which minimizes the work required to move "earth" from one signature to the other:

    W(P, Q, F) = \sum_{i=1}^{m} \sum_{j=1}^{n} f_{ij} d_{ij},

where d_{ij} = d(x_i, y_j) is a measure of dissimilarity between points x_i and y_j, for example, Euclidean distance in R^d.

The flow must satisfy the following constraints:

Move non-negative amounts of earth:

    f_{ij} \ge 0  for all i, j.                                       (1)
Take no more than you have, and move no more than fits in:

    \sum_{j=1}^{n} f_{ij} \le p_i,   \sum_{i=1}^{m} f_{ij} \le q_j   for all i, j.   (2)

Make sure all the earth is moved:

    \sum_{i=1}^{m} \sum_{j=1}^{n} f_{ij} = \min\left( \sum_{i=1}^{m} p_i, \sum_{j=1}^{n} q_j \right).   (3)

Once the optimal flow f_{ij} is found, the EMD is defined as

    EMD(P, Q) = \frac{\sum_{i=1}^{m} \sum_{j=1}^{n} f_{ij} d_{ij}}{\sum_{i=1}^{m} \sum_{j=1}^{n} f_{ij}}.

Normalization is done to make sure smaller signatures are not favored over larger ones.
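The minimization above is a linear program (a transportation problem). As an illustrative sketch only, not the implementation used in the experiments, the EMD for small signatures can be computed with SciPy's general-purpose LP solver (assuming SciPy is available; specialized transportation-simplex codes are much faster in practice):

```python
import numpy as np
from scipy.optimize import linprog

def emd(p, q, d):
    """EMD between two signatures with weights p (length m) and q (length n)
    and ground-distance matrix d (m x n), posed as the transportation LP
    with constraints (1)-(3)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m, n = len(p), len(q)
    c = np.asarray(d, float).ravel()              # minimize sum_ij f_ij d_ij
    A_ub = np.zeros((m + n, m * n))
    for i in range(m):                            # row sums    <= p_i  (2)
        A_ub[i, i * n:(i + 1) * n] = 1.0
    for j in range(n):                            # column sums <= q_j  (2)
        A_ub[m + j, j::n] = 1.0
    total = min(p.sum(), q.sum())                 # move all the earth  (3)
    res = linprog(c, A_ub=A_ub, b_ub=np.concatenate([p, q]),
                  A_eq=np.ones((1, m * n)), b_eq=[total],
                  bounds=(0, None))               # f_ij >= 0           (1)
    return res.fun / total                        # normalized work

# Equal-mass case, where the EMD coincides with the Mallows distance M_1:
x, y = np.array([1.0, 4.0]), np.array([1.0, 2.0, 3.0, 4.0])
d = np.abs(x[:, None] - y[None, :])
print(round(emd([0.5, 0.5], [0.25] * 4, d), 6))  # 0.5
```

Because the total-flow constraint fixes the denominator, the normalization in the definition is just a division by min(sum p_i, sum q_j).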
The Mallows Distance

Now let X and Y be random variables with probability distributions P and Q in R^d. The Mallows distance M_p(P, Q) between P and Q is the minimum expected difference between X and Y, taken over all joint probability distributions F of (X, Y) with marginals P and Q:

    M_p(P, Q) = \min_F \{ (E_F \|X - Y\|^p)^{1/p} : (X, Y) \sim F, \; X \sim P, \; Y \sim Q \}.

Here p can be any number greater than or equal to 1, but the most interesting cases are p = 1 and p = 2. For the definition to make sense, the distributions P and Q must have finite p-th moments.

To apply this definition to discrete distributions P = {(x_1, p_1), ..., (x_m, p_m)} and Q = {(y_1, q_1), ..., (y_n, q_n)}, we need to minimize the expectation under F = (f_ij), the joint distribution of X and Y:

    E_F \|X - Y\|^p = \sum_{i=1}^{m} \sum_{j=1}^{n} f_{ij} \|x_i - y_j\|^p = \sum_{i=1}^{m} \sum_{j=1}^{n} f_{ij} d_{ij}.

The distribution F is subject to the following constraints:
Probabilities must be non-negative:

    f_{ij} \ge 0  for all i, j.                                       (4)

Marginals of F must be P and Q:

    \sum_{j=1}^{n} f_{ij} = p_i,   \sum_{i=1}^{m} f_{ij} = q_j   for all i, j.   (5)

Probabilities must add up to 1:

    \sum_{i=1}^{m} \sum_{j=1}^{n} f_{ij} = \sum_{i=1}^{m} p_i = \sum_{j=1}^{n} q_j = 1.   (6)

Compare to the EMD constraints: (1) and (4) are the same. As long as P and Q have the same total mass, the EMD constraints (2) are forced to become equalities and are the same as (5). (3) and (6) are the same because both P and Q are proper probability distributions.

Note that normalizing two signatures with the same total mass does not change their EMD. As pointed out by Rubner et al., for two signatures with equal masses the EMD is a true metric on distributions, and it is exactly the same as the Mallows distance with p set to 1.
The case of unequal total masses

Example: two sets of data, X = {1, 4} and Y = {1, 2, 3, 4}.

Mallows on distributions: normalize both sets to have weight 1 (each point in X has weight 1/2, each point in Y has weight 1/4). The Mallows distance is then M_1(X, Y) = 1/2.

EMD on signatures: if we give every point weight 1 (so that the total mass of X is 2 and the total mass of Y is 4), then X is completely contained in Y, no earth needs to be moved, and EMD(X, Y) = 0.
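The Mallows value in this example is easy to check in code. The sketch below (an illustration, not part of the original notes) replicates each uniformly-weighted sample to a common size, which leaves both distributions unchanged, so the one-dimensional sorted-vector formula applies:

```python
from math import lcm  # Python 3.9+

def mallows_1d(xs, ys, p=1):
    """Mallows M_p between two uniformly-weighted 1-D samples of possibly
    different sizes: replicate points to a common size n (this does not
    change either distribution), then compare the sorted vectors."""
    n = lcm(len(xs), len(ys))
    x = sorted(v for v in xs for _ in range(n // len(xs)))
    y = sorted(v for v in ys for _ in range(n // len(ys)))
    return (sum(abs(a - b) ** p for a, b in zip(x, y)) / n) ** (1 / p)

print(mallows_1d([1, 4], [1, 2, 3, 4]))  # 0.5, matching M_1(X, Y) = 1/2
```

Here X expands to (1, 1, 4, 4), which is matched against (1, 2, 3, 4): two of the four quarter-weight points must each move a distance of 1, giving 2/4 = 1/2.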
Key difference: EMD allows for partial matching

The EMD allows for matching any part of the distribution, no matter how small. Even if Y contained a thousand other points with very different values, the distance between X and Y would still be 0. So partial matches may be spurious, especially with textures.

The EMD on signatures is not invariant to weight scaling, unless both signatures are scaled by the same factor. So if, e.g., one of the two texture patches is duplicated to produce a larger image, the distance will change.

Partial matching may be appropriate in other, non-texture contexts such as image retrieval. It is a computationally efficient and convenient way to search a large image for a small match, but it should be used cautiously.
Computing the Mallows distance from data

In practice, it is important to distinguish between two different issues:

1. Choosing the right distance for the problem, e.g., Mallows or \chi^2 or L_2. Different problems may require different distances.

2. Estimating the distributions well from the available data, e.g., by a fixed-bin histogram, adaptive-bin histogram, signature, or some other method. The choice depends on the amount of available data, the dimensionality of the data, and of course on the problem.

There are no a priori reasons for these two issues to be connected, other than computational convenience. For any discretized distribution estimate, the Mallows distance can be computed via optimization algorithms for the transportation problem [5].
The optimization problem can be stated especially compactly if we have two samples of the same size, X = {x_1, ..., x_n} and Y = {y_1, ..., y_n}, and use the empirical distribution as our estimate, i.e., give every sample point weight 1/n:

    M_p(\hat{F}_X, \hat{F}_Y) = \min_{(j_1, ..., j_n)} \left( \frac{1}{n} \sum_{i=1}^{n} \|x_i - y_{j_i}\|^p \right)^{1/p},

where the minimum is taken over all possible permutations of {1, ..., n}. This problem can be solved by the Hungarian algorithm for the optimal assignment problem [4], a special case of the transportation problem.

If the observations are one-dimensional, the optimization problem can be solved explicitly: in this case the Mallows distance is just the L_p vector distance between the sorted vectors x_(1) <= ... <= x_(n) and y_(1) <= ... <= y_(n):

    M_p(X, Y) = \left( \frac{1}{n} \sum_{i=1}^{n} |x_{(i)} - y_{(i)}|^p \right)^{1/p}.   (7)
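A quick numerical consistency check (a sketch assuming SciPy is available): for one-dimensional samples, the Hungarian algorithm on the full cost matrix and the sorted-vector shortcut (7) produce the same M_2.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = rng.normal(loc=1.0, size=50)

# General route: optimal assignment on the n x n cost matrix |x_i - y_j|^2.
cost = np.abs(x[:, None] - y[None, :]) ** 2
rows, cols = linear_sum_assignment(cost)
m2_assign = np.sqrt(cost[rows, cols].sum() / len(x))

# 1-D shortcut: sort both samples and take the (normalized) L2 distance.
m2_sorted = np.sqrt(np.mean((np.sort(x) - np.sort(y)) ** 2))

assert np.isclose(m2_assign, m2_sorted)
```

This is why, for empirical marginals in one dimension, no optimization algorithm is needed at all: sorting already yields the optimal matching.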
Applying the Mallows distance to textures

From now on, we concentrate on applying the Mallows distance (or the EMD) to texture features, in this case to vectors of filter responses. One has to decide whether to use the joint or the marginal distributions, and how to estimate them from data.

Marginals versus joint: the joint distribution of filter responses in theory contains much more information than the individual filter marginals. However, a high-dimensional distribution can be much harder to estimate accurately. Also, the transportation-problem algorithm becomes slow for large problems, and the estimates must be coarsened to a feasible number of points (not more than a few hundred).

Marginal distributions, on the other hand, can be estimated very well even from a relatively small image. And if one uses the empirical distribution, all one needs to do is sort the vectors, which can be done very fast. There is no need to bin or to use an optimization algorithm.

The results below show that good estimates of the marginals outperform the joint. This agrees with the texture classification results in [9], where the joint distribution of filter responses did not do any better than the marginals.
Marginal distribution estimates:

Fixed-bin histograms are easy to compute and store, but they are not necessarily accurate and are sensitive to the choice of bin width.

Adaptive-bin histograms, e.g., signatures, concentrate on the relevant part of the space only, and result in more compact estimates. They also correspond to the concept of textons in the sense of [6], where texton distributions are constructed by clustering filter responses and computing the frequency of each cluster. However, the choice of clustering algorithm and its parameters makes a difference to the results.

The empirical distribution, where one keeps every point with mass 1/n, does not involve any additional algorithms or parameters, and is in general a good estimate. Using it with the Mallows distance has an extra advantage: when all points have the same weight, computing the distance requires only sorting.

Joint distribution estimates:

The fixed-bin histogram does not work very well in high-dimensional spaces. The empirical distribution cannot be used with the Mallows distance, since for large samples the optimization problem becomes computationally infeasible. An adaptive-bin histogram seems to be the best choice in this case.
Experimental results

Extensive experiments showing the usefulness of the EMD for texture analysis have been published [9, 11, 12]. Here we test the empirical distribution as an estimate of the filter marginals, as opposed to fixed- and adaptive-bin histograms, and compare marginals to the joint.

The images were filtered with a filter bank of 40 filters, all first and second derivatives of Gaussian at different scales and orientations, as in [6]. When marginal distributions were used, the distance between two textures was computed as the sum of the Mallows distances M_2 between the filter marginals.

We used the relatively small MeasTex texture database [8], which consists of Brodatz textures. The benchmarking strategy of [9] was followed: sets of 16 random non-overlapping square patches were extracted from each texture, with sides of 16, 32, 64, and 128 pixels.

For each image size, the classification error is estimated by the "leave-one-out" method, i.e., leaving out each image in turn, computing its distance to all the other images, and assigning it to the class of its nearest neighbor. The error rate is the percentage of incorrectly assigned images.
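The evaluation pipeline can be sketched in a few lines. This is a toy illustration only: the synthetic data, shapes, and function names below are invented, not the MeasTex setup or the original code.

```python
import numpy as np

def texture_distance(a, b):
    """Sum over filters of the 1-D Mallows M_2 between the empirical
    marginals of two filter-response stacks a, b, each of shape
    (n_filters, n_pixels). Sorting each row applies the sorted-vector
    formula filter by filter."""
    a, b = np.sort(a, axis=1), np.sort(b, axis=1)
    return np.sqrt(np.mean((a - b) ** 2, axis=1)).sum()

def leave_one_out_error(responses, labels):
    """Leave-one-out nearest-neighbor error rate over a list of
    response stacks with class labels."""
    errors = 0
    for i in range(len(responses)):
        others = [j for j in range(len(responses)) if j != i]
        nearest = min(others,
                      key=lambda j: texture_distance(responses[i], responses[j]))
        errors += labels[nearest] != labels[i]
    return errors / len(responses)

# Synthetic sanity check: two well-separated "texture classes".
rng = np.random.default_rng(0)
responses = ([rng.normal(0.0, 1.0, (2, 100)) for _ in range(4)]
             + [rng.normal(5.0, 1.0, (2, 100)) for _ in range(4)])
labels = [0] * 4 + [1] * 4
print(leave_one_out_error(responses, labels))
```

For responses with equal pixel counts, the whole distance computation reduces to sorting each filter channel once per image.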
The table compares four methods of estimating the marginal: empirical distribution (no binning), coarse fixed-bin histogram (16 bins), fine fixed-bin histogram (256 bins), and adaptive-bin histogram where responses are clustered into 16 bins by a k-means type algorithm. For the joint, only the adaptive-bin histogram was used.

[Table: texture classification results (percent misclassified) by image size, for the marginal estimates (empirical, adaptive-bin histogram, coarse histogram, fine histogram) and for the joint (adaptive-bin histogram); the numeric entries did not survive transcription.]

The empirical distribution function contains the most information and consistently does better than the other estimates. The adaptive-bin histogram is nearly as good and requires less memory, but takes longer to compute. The fixed-bin histograms perform substantially worse. The joint performs worse than the empirical distribution of the marginals.
References and Acknowledgments

[1] P. J. Bickel and D. A. Freedman. Some asymptotic theory for the bootstrap. Annals of Statistics, 9:1196–1217, 1981.
[2] P. Brodatz. Textures. Dover, New York, 1966.
[3] A. Efros and T. Leung. Texture synthesis by non-parametric sampling. In Proceedings of the IEEE International Conference on Computer Vision, pages 1033–1038, Corfu, Greece, Sept. 1999.
[4] H. W. Kuhn. The Hungarian method for the assignment problem. Naval Research Logistics Quarterly, 2:83–97, 1955.
[5] D. G. Luenberger. Linear and Nonlinear Programming. Addison-Wesley, 1984.
[6] J. Malik, S. Belongie, J. Shi, and T. Leung. Textons, contours, and regions: cue combination in image segmentation. In Proceedings of the IEEE International Conference on Computer Vision, pages 918–925, Corfu, Greece, Sept. 1999.
[7] C. L. Mallows. A note on asymptotic joint normality. Annals of Mathematical Statistics, 43(2):508–515, 1972.
[8] MeasTex image texture database and test suite. Website: meastex_v1.1/meastex.html.
[9] J. Puzicha, Y. Rubner, C. Tomasi, and J. M. Buhmann. Empirical evaluation of dissimilarity measures for color and texture. In Proceedings of the IEEE International Conference on Computer Vision, pages 1165–1173, Corfu, Greece, Sept. 1999.
[10] S. T. Rachev. The Monge-Kantorovich mass transference problem and its stochastic applications. Theory of Probability and its Applications, 29:647–676, 1984.
[11] Y. Rubner, C. Tomasi, and L. Guibas. A metric for distributions with applications to image databases. In Proceedings of the IEEE International Conference on Computer Vision, pages 59–66, Bombay, India, Jan. 1998.
[12] Y. Rubner, C. Tomasi, and L. J. Guibas. The Earth Mover's distance as a metric for image retrieval. Technical Report STAN-CS-TN-98-86, Department of Computer Science, Stanford University, Sept. 1998.
[13] H. C. Shen and A. K. C. Wong. Generalized texture representation and metric. Computer Vision, Graphics, and Image Processing, 187–206, 1983.
[14] M. Werman, S. Peleg, and A. Rosenfeld. A distance metric for multidimensional histograms. Computer Vision, Graphics, and Image Processing, 32:328–336, 1985.

We are grateful to Jitendra Malik, Alex Berg, Alexei Efros, Jan Puzicha, and Jianbo Shi for helpful discussions and comments. We also thank Serge Belongie for the filtering code and Yossi Rubner for the EMD code.
Appendix: Some properties of the Mallows distance

1. M_p is a metric.

2. Convolution property for p = 2: if \int x \, dF_i(x) = \int x \, dG_i(x) for i = 1, ..., n, then

    M_2^2(F_1 * ... * F_n, G_1 * ... * G_n) \le \sum_{i=1}^{n} M_2^2(F_i, G_i).

This is stronger than the triangle inequality.

3. M_p(F_n, F) -> 0 if and only if (i) F_n -> F weakly and (ii) \int \|x\|^p \, dF_n(x) -> \int \|x\|^p \, dF(x).

4. If X_1, ..., X_n are independent observations from a distribution F, and F_n is their empirical distribution, i.e., F_n(t) = (1/n) \sum_{i=1}^{n} 1(X_i \le t), then M_p(F_n, F) -> 0.

5. If F and G are distributions on the real line, then

    M_p(F, G) = \left( \int_0^1 |F^{-1}(t) - G^{-1}(t)|^p \, dt \right)^{1/p}.

The case p = 1 is especially simple, because

    \int_0^1 |F^{-1}(t) - G^{-1}(t)| \, dt = \int_{-\infty}^{\infty} |F(t) - G(t)| \, dt.
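Property 5 is easy to verify numerically for empirical distributions. In the sketch below (an illustration assuming NumPy), the quantile form and the CDF form of M_1 are each computed exactly for two equal-size samples, and they agree:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.normal(size=200))
y = np.sort(rng.normal(loc=0.5, size=200))

# Quantile form: for equal-size sorted samples, the integral of
# |F^{-1}(t) - G^{-1}(t)| over (0, 1) is the mean absolute sorted difference.
m1_quantile = np.mean(np.abs(x - y))

# CDF form: integrate |F(t) - G(t)| exactly; both empirical CDFs are step
# functions, constant between the combined breakpoints.
t = np.sort(np.concatenate([x, y]))
mid = (t[:-1] + t[1:]) / 2                  # evaluate CDFs between breakpoints
F = np.searchsorted(x, mid, side='right') / len(x)
G = np.searchsorted(y, mid, side='right') / len(y)
m1_cdf = np.sum(np.abs(F - G) * np.diff(t))

assert np.isclose(m1_quantile, m1_cdf)
```

Both integrals are piecewise constant between the breakpoints of the empirical step functions, so no numerical quadrature error is involved.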
Machine Learning and Data Mining Clustering (1): Basics Kalev Kask Unsupervised learning Supervised learning Predict target value ( y ) given features ( x ) Unsupervised learning Understand patterns of
More informationTracking. Establish where an object is, other aspects of state, using time sequence Biggest problem -- Data Association
Tracking Establish where an object is, other aspects of state, using time sequence Biggest problem -- Data Association Key ideas Tracking by detection Tracking through flow Track by detection (simple form)
More informationAnnouncements. Recognition. Recognition. Recognition. Recognition. Homework 3 is due May 18, 11:59 PM Reading: Computer Vision I CSE 152 Lecture 14
Announcements Computer Vision I CSE 152 Lecture 14 Homework 3 is due May 18, 11:59 PM Reading: Chapter 15: Learning to Classify Chapter 16: Classifying Images Chapter 17: Detecting Objects in Images Given
More informationThe goals of segmentation
Image segmentation The goals of segmentation Group together similar-looking pixels for efficiency of further processing Bottom-up process Unsupervised superpixels X. Ren and J. Malik. Learning a classification
More informationContrained K-Means Clustering 1 1 Introduction The K-Means clustering algorithm [5] has become a workhorse for the data analyst in many diverse elds.
Constrained K-Means Clustering P. S. Bradley K. P. Bennett A. Demiriz Microsoft Research Dept. of Mathematical Sciences One Microsoft Way Dept. of Decision Sciences and Eng. Sys. Redmond, WA 98052 Renselaer
More informationObject Recognition Using Pictorial Structures. Daniel Huttenlocher Computer Science Department. In This Talk. Object recognition in computer vision
Object Recognition Using Pictorial Structures Daniel Huttenlocher Computer Science Department Joint work with Pedro Felzenszwalb, MIT AI Lab In This Talk Object recognition in computer vision Brief definition
More informationSTUDYING THE FEASIBILITY AND IMPORTANCE OF GRAPH-BASED IMAGE SEGMENTATION TECHNIQUES
25-29 JATIT. All rights reserved. STUDYING THE FEASIBILITY AND IMPORTANCE OF GRAPH-BASED IMAGE SEGMENTATION TECHNIQUES DR.S.V.KASMIR RAJA, 2 A.SHAIK ABDUL KHADIR, 3 DR.S.S.RIAZ AHAMED. Dean (Research),
More informationShape Context Matching For Efficient OCR. Sudeep Pillai
Shape Context Matching For Efficient OCR Sudeep Pillai May 18, 2012 Contents 1 Introduction 2 1.1 Motivation................................... 2 1.2 Background................................... 2 1.2.1
More informationThe Earth Mover s Distance under Transformation Sets
Proceedings of the 7th IEEE International Conference on Computer Vision, September 1999 1 The Earth Mover s Distance under Transformation Sets Scott Cohen Leonidas Guibas Computer Science Department Stanford
More informationGlobal Metric Learning by Gradient Descent
Global Metric Learning by Gradient Descent Jens Hocke and Thomas Martinetz University of Lübeck - Institute for Neuro- and Bioinformatics Ratzeburger Allee 160, 23538 Lübeck, Germany hocke@inb.uni-luebeck.de
More informationCS 343: Artificial Intelligence
CS 343: Artificial Intelligence Kernels and Clustering Prof. Scott Niekum The University of Texas at Austin [These slides based on those of Dan Klein and Pieter Abbeel for CS188 Intro to AI at UC Berkeley.
More informationPrevious Lecture - Coded aperture photography
Previous Lecture - Coded aperture photography Depth from a single image based on the amount of blur Estimate the amount of blur using and recover a sharp image by deconvolution with a sparse gradient prior.
More informationFrom Pixels to Blobs
From Pixels to Blobs 15-463: Rendering and Image Processing Alexei Efros Today Blobs Need for blobs Extracting blobs Image Segmentation Working with binary images Mathematical Morphology Blob properties
More informationApplied Bayesian Nonparametrics 5. Spatial Models via Gaussian Processes, not MRFs Tutorial at CVPR 2012 Erik Sudderth Brown University
Applied Bayesian Nonparametrics 5. Spatial Models via Gaussian Processes, not MRFs Tutorial at CVPR 2012 Erik Sudderth Brown University NIPS 2008: E. Sudderth & M. Jordan, Shared Segmentation of Natural
More information9.1. K-means Clustering
424 9. MIXTURE MODELS AND EM Section 9.2 Section 9.3 Section 9.4 view of mixture distributions in which the discrete latent variables can be interpreted as defining assignments of data points to specific
More informationTexture Classification: Are Filter Banks Necessary?
Texture Classification: Are Filter Banks Necessary? Manik Varma Robotics Research Group Dept. of Engineering Science University of Oxford Oxford, UK OX1 3PJ manik@robots.ox.ac.uk Andrew Zisserman Robotics
More informationGenerative and discriminative classification techniques
Generative and discriminative classification techniques Machine Learning and Category Representation 013-014 Jakob Verbeek, December 13+0, 013 Course website: http://lear.inrialpes.fr/~verbeek/mlcr.13.14
More informationCoarse-to-fine image registration
Today we will look at a few important topics in scale space in computer vision, in particular, coarseto-fine approaches, and the SIFT feature descriptor. I will present only the main ideas here to give
More informationFinding 2D Shapes and the Hough Transform
CS 4495 Computer Vision Finding 2D Shapes and the Aaron Bobick School of Interactive Computing Administrivia Today: Modeling Lines and Finding them CS4495: Problem set 1 is still posted. Please read the
More informationComputer Vision. Exercise Session 10 Image Categorization
Computer Vision Exercise Session 10 Image Categorization Object Categorization Task Description Given a small number of training images of a category, recognize a-priori unknown instances of that category
More informationBag of Words Models. CS4670 / 5670: Computer Vision Noah Snavely. Bag-of-words models 11/26/2013
CS4670 / 5670: Computer Vision Noah Snavely Bag-of-words models Object Bag of words Bag of Words Models Adapted from slides by Rob Fergus and Svetlana Lazebnik 1 Object Bag of words Origin 1: Texture Recognition
More informationSegmentation and Grouping
02/23/10 Segmentation and Grouping Computer Vision CS 543 / ECE 549 University of Illinois Derek Hoiem Last week Clustering EM Today s class More on EM Segmentation and grouping Gestalt cues By boundaries
More informationSegmentation and Grouping April 21 st, 2015
Segmentation and Grouping April 21 st, 2015 Yong Jae Lee UC Davis Announcements PS0 grades are up on SmartSite Please put name on answer sheet 2 Features and filters Transforming and describing images;
More informationComputer vision: models, learning and inference. Chapter 13 Image preprocessing and feature extraction
Computer vision: models, learning and inference Chapter 13 Image preprocessing and feature extraction Preprocessing The goal of pre-processing is to try to reduce unwanted variation in image due to lighting,
More informationLocal Features and Bag of Words Models
10/14/11 Local Features and Bag of Words Models Computer Vision CS 143, Brown James Hays Slides from Svetlana Lazebnik, Derek Hoiem, Antonio Torralba, David Lowe, Fei Fei Li and others Computer Engineering
More informationDepartment of Electrical Engineering, Keio University Hiyoshi Kouhoku-ku Yokohama 223, Japan
Shape Modeling from Multiple View Images Using GAs Satoshi KIRIHARA and Hideo SAITO Department of Electrical Engineering, Keio University 3-14-1 Hiyoshi Kouhoku-ku Yokohama 223, Japan TEL +81-45-563-1141
More informationComputer Vision for HCI. Topics of This Lecture
Computer Vision for HCI Interest Points Topics of This Lecture Local Invariant Features Motivation Requirements, Invariances Keypoint Localization Features from Accelerated Segment Test (FAST) Harris Shi-Tomasi
More informationSimilarity Estimation Techniques from Rounding Algorithms. Moses Charikar Princeton University
Similarity Estimation Techniques from Rounding Algorithms Moses Charikar Princeton University 1 Compact sketches for estimating similarity Collection of objects, e.g. mathematical representation of documents,
More informationNearest neighbor classification DSE 220
Nearest neighbor classification DSE 220 Decision Trees Target variable Label Dependent variable Output space Person ID Age Gender Income Balance Mortgag e payment 123213 32 F 25000 32000 Y 17824 49 M 12000-3000
More informationFinal Exam Schedule. Final exam has been scheduled. 12:30 pm 3:00 pm, May 7. Location: INNOVA It will cover all the topics discussed in class
Final Exam Schedule Final exam has been scheduled 12:30 pm 3:00 pm, May 7 Location: INNOVA 1400 It will cover all the topics discussed in class One page double-sided cheat sheet is allowed A calculator
More informationMedian filter. Non-linear filtering example. Degraded image. Radius 1 median filter. Today
Today Non-linear filtering example Median filter Replace each pixel by the median over N pixels (5 pixels, for these examples). Generalizes to rank order filters. In: In: 5-pixel neighborhood Out: Out:
More informationNon-linear filtering example
Today Non-linear filtering example Median filter Replace each pixel by the median over N pixels (5 pixels, for these examples). Generalizes to rank order filters. In: In: 5-pixel neighborhood Out: Out:
More information