Discriminative classifiers for object classification


Discriminative classifiers for object classification
Thursday, Nov 12. Kristen Grauman, UT Austin.

Last time
- Supervised classification
- Loss and risk, Bayes rule
- Skin color detection example
- Sliding window detection
- Classifiers, boosting algorithm, cascades
- Face detection example
- Limitations of a global appearance description
- Limitations of sliding window detectors

Example: learning skin colors
We can represent a class-conditional density using a histogram (a non-parametric distribution): P(x | skin) and P(x | not skin), with feature x = hue.
Now we get a new image and want to label each pixel as skin or non-skin.
Bayes rule: posterior = likelihood × prior / evidence,
P(skin | x) = P(x | skin) P(skin) / P(x),
so P(skin | x) ∝ P(x | skin) P(skin).
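As a concrete illustration of the histogram likelihood and Bayes rule above, here is a minimal Python/NumPy sketch; the bin count, hue range, and prior value are illustrative assumptions, not numbers from the lecture.

```python
import numpy as np

def fit_hue_histogram(hue_values, n_bins=32):
    # Non-parametric class-conditional density P(x | class): a normalized histogram over hue in [0, 1].
    counts, edges = np.histogram(hue_values, bins=n_bins, range=(0.0, 1.0))
    return counts / max(counts.sum(), 1), edges

def skin_posterior(hue_pixels, p_x_skin, p_x_notskin, edges, prior_skin=0.3):
    # Bayes rule per pixel: P(skin | x) = P(x | skin) P(skin) / P(x).
    bins = np.clip(np.digitize(hue_pixels, edges[1:-1]), 0, len(p_x_skin) - 1)
    num = p_x_skin[bins] * prior_skin
    evidence = num + p_x_notskin[bins] * (1.0 - prior_skin) + 1e-12
    return num / evidence

# Usage: label a pixel as skin if the posterior exceeds 0.5, e.g.
# mask = skin_posterior(hue_image.ravel(), p_skin, p_not, edges) > 0.5
```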

Detection via classification: main idea
Fleshing out this pipeline a bit more, we need to:
1. Obtain training data
2. Define features
3. Define classifier
[Pipeline figure: training examples (car / non-car) → feature extraction → classifier.]

AdaBoost: intuition
The final classifier is a combination of the weak classifiers, as in the sketch below.
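A minimal sketch of that combination step (the weak learners and their weights alpha are assumed to come from a boosting run like the one described last time):

```python
def adaboost_classify(x, weak_learners, alphas):
    # Final strong classifier: sign of the alpha-weighted vote of the weak classifiers,
    # each of which maps a sample to +1 or -1.
    score = sum(alpha * h(x) for h, alpha in zip(weak_learners, alphas))
    return 1 if score >= 0 else -1
```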

Feature extraction: rectangular filters
The feature output is the difference between adjacent regions.
Efficiently computable with the integral image: any sum can be computed in constant time.
Avoid scaling images; scale the features directly for the same cost. [Viola & Jones, CVPR 2001]

Integral image
The value at (x, y) is the sum of the pixels above and to the left of (x, y).

Feature extraction: filter library
Use AdaBoost both to select the informative features and to form the classifier.
Considering all possible filter parameters (position, scale, and type), there are 180,000+ possible features associated with each 24 x 24 window.
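A small NumPy sketch of the integral image and the constant-time box sum it enables (the window and rectangle coordinates in the final comment are illustrative):

```python
import numpy as np

def integral_image(img):
    # ii[y, x] = sum of img over all pixels above and to the left of (x, y), inclusive.
    return img.cumsum(axis=0).cumsum(axis=1)

def box_sum(ii, y0, x0, y1, x1):
    # Sum of the original image over the rectangle with corners (y0, x0) and (y1, x1), inclusive,
    # using at most four lookups regardless of the rectangle's size.
    total = ii[y1, x1]
    if y0 > 0:
        total -= ii[y0 - 1, x1]
    if x0 > 0:
        total -= ii[y1, x0 - 1]
    if y0 > 0 and x0 > 0:
        total += ii[y0 - 1, x0 - 1]
    return total

# A two-rectangle filter on a 24 x 24 window is then just a difference of box sums, e.g.
# box_sum(ii, 0, 0, 11, 23) - box_sum(ii, 12, 0, 23, 23)
```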

AdaBoost for feature + classifier selection
Want to select the single rectangle feature and threshold that best separates the positive (faces) and negative (non-faces) training examples, in terms of weighted error.
[Figure: outputs of a possible rectangle feature on faces and non-faces, and the resulting weak classifier that thresholds this output.]
For the next round, reweight the examples according to their errors and choose another filter/threshold combination.

Viola-Jones face detector: results
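A sketch of that selection step for a single feature, assuming its outputs on the weighted training examples are already computed (labels are +1/-1 and the weights sum to 1):

```python
import numpy as np

def best_stump(feature_values, labels, weights):
    # Find the threshold and polarity on this feature's outputs that minimize the weighted error.
    best_err, best_thresh, best_polarity = np.inf, None, None
    for thresh in np.unique(feature_values):
        for polarity in (+1, -1):
            pred = np.where(polarity * (feature_values - thresh) >= 0, 1, -1)
            err = weights[pred != labels].sum()
            if err < best_err:
                best_err, best_thresh, best_polarity = err, thresh, polarity
    return best_err, best_thresh, best_polarity

# AdaBoost keeps the feature/stump pair with the lowest weighted error, gives it vote
# alpha = 0.5 * log((1 - err) / err), and then reweights the examples for the next round.
```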

Outline
Discriminative classifiers:
- Boosting (last time)
- Nearest neighbors
- Support vector machines
- Application to pedestrian detection
- Application to gender classification

Nearest neighbor classification
Assign the label of the nearest training data point to each test data point.
[Figure from Duda et al.: black = negative, red = positive. A novel test example is closest to a positive example from the training set, so we classify it as positive. Voronoi partitioning of the feature space for 2-category 2D data.]

K-nearest neighbors classification
For a new point, find the k closest points from the training data; the labels of those k points vote to classify it.
[Figure, source: D. Lowe. Black = negative, red = positive, k = 5. If the query lands here, its 5 nearest neighbors consist of 3 negatives and 2 positives, so we classify it as negative.]

Example: nearest neighbor classification
We could identify the penguin in the new view based on the distance between its chest-spot pattern and all of the stored penguins' patterns, given a labeled database of known penguin examples.
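A minimal k-NN sketch in NumPy matching the description above (Euclidean distance; the choice k = 5 mirrors the slide's example):

```python
import numpy as np

def knn_classify(x_query, X_train, y_train, k=5):
    # Majority vote among the k training points closest to the query.
    dists = np.linalg.norm(X_train - x_query, axis=1)
    nearest = np.argsort(dists)[:k]
    return np.bincount(y_train[nearest]).argmax()

# e.g. if the 5 nearest labels are [0, 0, 0, 1, 1] (3 negatives, 2 positives),
# the query is classified as 0 (negative), exactly as in the slide's figure.
```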

Nearest neighbors: pros and cons
Pros:
- Simple to implement
- Flexible to feature / distance choices
- Naturally handles multi-class cases
- Can do well in practice with enough representative data
Cons:
- Large search problem to find the nearest neighbors
- Storage of the data
- Must know we have a meaningful distance function

Outline
Discriminative classifiers:
- Boosting (last time)
- Nearest neighbors
- Support vector machines
- Application to pedestrian detection
- Application to gender classification

Linear classifiers

Lines in R²
Let w = [a, c]ᵀ and x = [x, y]ᵀ. A line is the set of points satisfying ax + cy + b = 0, i.e. w·x + b = 0.
The distance from a point (x₀, y₀) to this line is
D = (a x₀ + c y₀ + b) / sqrt(a² + c²) = (wᵀ x₀ + b) / ||w||.
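A quick numeric check of the distance formula (the line and point here are made up for illustration):

```python
import numpy as np

# Line 3x + 4y - 5 = 0, i.e. w = [3, 4], b = -5, and the point (x0, y0) = (1, 1).
w, b = np.array([3.0, 4.0]), -5.0
x0 = np.array([1.0, 1.0])
D = (w @ x0 + b) / np.linalg.norm(w)
print(D)   # (3 + 4 - 5) / 5 = 0.4
```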

Linear classifiers
Find a linear function to separate the positive and negative examples:
x_i positive:  w·x_i + b ≥ 0
x_i negative:  w·x_i + b < 0
Which line is best?

Support Vector Machines (SVMs)
A discriminative classifier based on the optimal separating line (for the 2-D case): maximize the margin between the positive and negative training examples.

Support vector machines
Want the line that maximizes the margin:
x_i positive (y_i = 1):  w·x_i + b ≥ 1
x_i negative (y_i = -1): w·x_i + b ≤ -1
For the support vectors, w·x_i + b = ±1.
The distance between a point x_i and the line is |w·x_i + b| / ||w||; for the support vectors this is 1 / ||w|| on each side, so the margin is M = 2 / ||w||.
[Figure: support vectors and margin. C. Burges, A Tutorial on Support Vector Machines for Pattern Recognition, Data Mining and Knowledge Discovery, 1998]

Support vector machines
Therefore, the margin is 2 / ||w||.

Finding the maximum-margin line
1. Maximize the margin 2 / ||w||.
2. Correctly classify all training data points:
   x_i positive (y_i = 1):  w·x_i + b ≥ 1
   x_i negative (y_i = -1): w·x_i + b ≤ -1
This is a quadratic optimization problem: minimize (1/2) wᵀw subject to y_i (w·x_i + b) ≥ 1, with one constraint for each training point. Note the sign trick that folds both cases into a single constraint.
[C. Burges, A Tutorial on Support Vector Machines for Pattern Recognition, Data Mining and Knowledge Discovery, 1998]
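The quadratic program is normally handed to an off-the-shelf solver. A minimal sketch using scikit-learn's SVC with a linear kernel (the toy points and the large C, which approximates the hard-margin problem, are illustrative assumptions):

```python
import numpy as np
from sklearn.svm import SVC

X = np.array([[2.0, 2.0], [3.0, 3.0], [2.5, 3.5],
              [-1.0, -1.0], [-2.0, -1.5], [-1.5, -2.5]])
y = np.array([1, 1, 1, -1, -1, -1])

clf = SVC(kernel="linear", C=1e6).fit(X, y)   # large C approximates the hard-margin problem
w, b = clf.coef_[0], clf.intercept_[0]
print("support vectors:", clf.support_vectors_)
print("margin 2/||w|| =", 2.0 / np.linalg.norm(w))
```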

Finding the maximum-margin line
Solution: w = Σ_i α_i y_i x_i, where the α_i are the learned weights and the x_i with non-zero α_i are the support vectors; b = y_i - w·x_i for any support vector.

Classification function:
f(x) = sign(w·x + b) = sign(Σ_i α_i y_i x_i·x + b)
Notice that it relies on an inner product between the test point x and the support vectors x_i. (Solving the optimization problem also involves computing the inner products x_i·x_j between all pairs of training points.)
If f(x) < 0, classify as negative; if f(x) > 0, classify as positive.
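Continuing the sketch above, the learned classifier can be rebuilt from the support vectors alone; scikit-learn happens to store the products α_i y_i in dual_coef_ (an implementation detail of that library, not something stated on the slide):

```python
alpha_y = clf.dual_coef_[0]        # alpha_i * y_i for each support vector
sv = clf.support_vectors_

def f(x):
    # f(x) = sign( sum_i alpha_i y_i (x_i . x) + b )
    return np.sign(alpha_y @ (sv @ x) + b)

assert all(f(x_i) == y_i for x_i, y_i in zip(X, y))
```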

Questions
- How is the SVM objective different from the boosting objective?
- What if the features are not 2-D? This generalizes to d dimensions: replace the line with a hyperplane.
- What if the data is not linearly separable?
- What if we have more than just two categories?

Planes in R³
Let w = [a, b, c]ᵀ and x = [x, y, z]ᵀ. A plane is the set of points satisfying ax + by + cz + d = 0, i.e. w·x + d = 0.
The distance from a point (x₀, y₀, z₀) to the plane is
D = (a x₀ + b y₀ + c z₀ + d) / sqrt(a² + b² + c²) = (wᵀ x₀ + d) / ||w||.

Hyperplanes in Rⁿ
A hyperplane H is the set of all vectors x ∈ Rⁿ which satisfy
w₁x₁ + w₂x₂ + … + wₙxₙ + b = 0, i.e. wᵀx + b = 0,
and the distance from a point x to the hyperplane is D(H, x) = (wᵀx + b) / ||w||.

Questions
- What if the features are not 2-D?
- What if the data is not linearly separable?
- What if we have more than just two categories?

Non-linear SVMs
Datasets that are linearly separable with some noise work out great. But what are we going to do if the dataset is just too hard? How about mapping the data to a higher-dimensional space, for example mapping each 1-D point x to (x, x²)?

Non-linear SVMs: feature spaces
General idea: the original input space can be mapped to some higher-dimensional feature space where the training set is separable: Φ: x → φ(x).
[Slide from Andrew Moore's tutorial: http://www.autonlab.org/tutorials/svm.html]

Non-linear SVMs
The kernel trick: instead of explicitly computing the lifting transformation φ(x), define a kernel function K such that K(x_i, x_j) = φ(x_i)·φ(x_j).
This gives a non-linear decision boundary in the original feature space:
f(x) = sign(Σ_i α_i y_i K(x_i, x) + b).
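A tiny check of the kernel-trick idea with the 1-D example above: lift x to φ(x) = (x, x²), and note that the corresponding kernel can be evaluated without ever forming φ (the specific map and points are illustrative):

```python
import numpy as np

def phi(x):
    # Explicit lifting of a 1-D point to 2-D, where the earlier example becomes linearly separable.
    return np.array([x, x ** 2])

def k(x, z):
    # Kernel matching the map above: K(x, z) = x*z + x^2 * z^2 = phi(x) . phi(z).
    return x * z + (x ** 2) * (z ** 2)

x, z = 1.5, -2.0
assert np.isclose(phi(x) @ phi(z), k(x, z))
```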

Examples of kernel functions
- Linear: K(x_i, x_j) = x_iᵀ x_j
- Gaussian RBF: K(x_i, x_j) = exp(-||x_i - x_j||² / (2σ²))
- Histogram intersection: K(x_i, x_j) = Σ_k min(x_i(k), x_j(k))

Questions
- What if the features are not 2-D?
- What if the data is not linearly separable?
- What if we have more than just two categories?
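The three kernels listed above written out in NumPy (vector inputs assumed; for the histogram intersection kernel the inputs are histograms):

```python
import numpy as np

def k_linear(xi, xj):
    return xi @ xj

def k_rbf(xi, xj, sigma=1.0):
    # Gaussian RBF: exp(-||xi - xj||^2 / (2 sigma^2)); sigma here is a placeholder value.
    return np.exp(-np.sum((xi - xj) ** 2) / (2.0 * sigma ** 2))

def k_hist_intersection(xi, xj):
    # Sum over bins of the minimum of the two histograms.
    return np.minimum(xi, xj).sum()
```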

Multi-class SVMs
Achieve a multi-class classifier by combining a number of binary classifiers:
- One vs. all. Training: learn an SVM for each class vs. the rest. Testing: apply each SVM to the test example and assign it the class of the SVM that returns the highest decision value.
- One vs. one. Training: learn an SVM for each pair of classes. Testing: each learned SVM votes for a class to assign to the test example.

SVMs for recognition
1. Define your representation for each example.
2. Select a kernel function.
3. Compute pairwise kernel values between labeled examples.
4. Give this kernel matrix to SVM optimization software to identify the support vectors and weights.
5. To classify a new example: compute kernel values between the new input and the support vectors, apply the weights, and check the sign of the output.
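A sketch of this recipe using a precomputed kernel matrix in scikit-learn (the toy data, feature dimension, and choice of the histogram intersection kernel are illustrative assumptions; SVC itself handles the multi-class case by combining binary one-vs-one classifiers):

```python
import numpy as np
from sklearn.svm import SVC

def kernel_matrix(A, B):
    # Step 3: pairwise histogram-intersection kernel values between rows of A and rows of B.
    return np.array([[np.minimum(a, b).sum() for b in B] for a in A])

X_train = np.random.rand(20, 16)        # step 1: toy representations (20 examples, 16-bin histograms)
y_train = np.random.randint(0, 3, 20)   # toy 3-class labels
X_test = np.random.rand(5, 16)

K_train = kernel_matrix(X_train, X_train)
clf = SVC(kernel="precomputed").fit(K_train, y_train)   # step 4: the solver finds support vectors & weights
K_test = kernel_matrix(X_test, X_train)                  # step 5: kernels between new inputs and training examples
print(clf.predict(K_test))
```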

Pedestrian detection
Detecting upright, walking humans is also possible using a sliding window's appearance/texture, e.g.:
- SVM with Haar wavelets [Papageorgiou & Poggio, IJCV 2000]
- Space-time rectangle features [Viola, Jones & Snow, ICCV 2003]
- SVM with HoGs [Dalal & Triggs, CVPR 2005]

Example: pedestrian detection with HoGs and SVMs [Dalal & Triggs, CVPR 2005]
Map each grid cell in the input window to a histogram counting the gradients per orientation. Train a linear SVM using a training set of pedestrian vs. non-pedestrian windows.
Code available: http://pascal.inrialpes.fr/soft/olt/
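A minimal sketch of that pipeline with scikit-image's HOG and a linear SVM from scikit-learn; the window size, HOG parameters, and C value are illustrative, not the exact Dalal-Triggs settings:

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

def hog_descriptor(window):
    # window: a 128 x 64 grayscale crop; 8x8-pixel cells, 9 orientation bins, 2x2-cell blocks.
    return hog(window, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

# Assuming pedestrian_windows and background_windows are lists of such crops:
# X = np.array([hog_descriptor(w) for w in pedestrian_windows + background_windows])
# y = np.array([1] * len(pedestrian_windows) + [0] * len(background_windows))
# clf = LinearSVC(C=0.01).fit(X, y)
# score = clf.decision_function(hog_descriptor(new_window).reshape(1, -1))
```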

Pedestrian detection with HoGs and SVMs
[Navneet Dalal and Bill Triggs, Histograms of Oriented Gradients for Human Detection, International Conference on Computer Vision & Pattern Recognition, June 2005. http://lear.inrialpes.fr/pubs/2005/dt05/]

Example: learning gender with SVMs
Moghaddam and Yang, Learning Gender with Support Faces, TPAMI 2002. Moghaddam and Yang, Face & Gesture 2000.

Face alignment processing
[Figure: alignment pipeline and the resulting processed faces. Moghaddam and Yang, Learning Gender with Support Faces, TPAMI 2002.]

Learning gender with SVMs
Training examples: 1044 males, 713 females.
Experiment with various kernels; select the Gaussian RBF:
K(x_i, x_j) = exp(-||x_i - x_j||² / (2σ²))
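A thin sketch of the corresponding classifier setup in scikit-learn; the face data here is a random stand-in, and the gamma (which plays the role of 1/(2σ²) in the kernel above) and C values are illustrative, not the ones used in the paper:

```python
import numpy as np
from sklearn.svm import SVC

X_faces = np.random.rand(100, 576)         # stand-in for 100 aligned face crops, flattened to vectors
y_gender = np.random.randint(0, 2, 100)    # toy labels: 0 = female, 1 = male

clf = SVC(kernel="rbf", gamma=0.01, C=10.0).fit(X_faces, y_gender)
support_faces = clf.support_vectors_       # the "support faces": training faces closest to the boundary
```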

Support faces
[Figures from Moghaddam and Yang, Learning Gender with Support Faces, TPAMI 2002.]

Gender perception experiment: how well can humans do?
Subjects: 30 people (22 male, 8 female), ages mid-20s to mid-40s.
Test data: 254 face images (6 males, 4 females), in low-res and high-res versions.
Task: classify as male or female, forced choice, no time limit.
[Figure: human error rates. Moghaddam and Yang, Face & Gesture 2000.]

Human vs. machine
SVMs performed better than any single human test subject, at either resolution.

Hardest examples for humans
[Figure. Moghaddam and Yang, Face & Gesture 2000.]

SVMs: pros and cons
Pros:
- Many publicly available SVM packages: http://www.kernel-machines.org/software and http://www.csie.ntu.edu.tw/~cjlin/libsvm/
- The kernel-based framework is very powerful and flexible
- Often a sparse set of support vectors, so compact at test time
- Work very well in practice, even with very small training sample sizes
Cons:
- No direct multi-class SVM; must combine two-class SVMs
- Can be tricky to select the best kernel function for a problem
- Computation and memory: during training, must compute the matrix of kernel values for every pair of examples, and learning can take a very long time for large-scale problems
[Adapted from Lana Lazebnik]

Summary
Discriminative classifiers applied to object detection / categorization problems:
- Boosting (last time)
- Nearest neighbors
- Support vector machines
- Application to pedestrian detection
- Application to gender classification