CS201: Computer Vision Introduction to Tracking

John Magee
18 November 2014
Slides courtesy of: Diane H. Theriault

Question of the Day: How can we represent and use motion in images?

What is Motion? Change over time in position, orientation, pose, or shape.

Motion in Images (no such thing, really, since an image is just a picture captured at one moment in time):
- Example: Video. A camera captures a series of images as the scene changes. Video: an ordered sequence of images captured in rapid succession, whether from a fixed camera or a moving camera.
- Example: Stereo. Cameras capture the same scene from different viewpoints: a set of images where the cameras are separated in space (stereoscopic pairs).

Motion in Images:
- What happens to the color / brightness values captured in successive images as the scene or camera moves?
- What does it mean for an image feature to move?
- How do we use the movement of image features to infer things about the scene or the camera?
(We usually need to assume that the difference between images is reasonably small.)

Types of Tracking:
- Tracking by Detection
- Feature Tracking
- Optical Flow / dense scene motion (after Spring Break)
- Contour Tracking
- Multi-target Tracking

Discussion Questions: What different types of tracking could we do in video of sports? Surveillance? Videos of daily life? What other types of videos are you interested in? What types of information might we want to obtain by understanding motion in images?

Feature Tracking. What is an image feature? Something distinctive, repeatable, and uniquely localizable. What does it mean for an image feature to move?

Template Tracking. Simple feature: a small image patch. Motion: the same pattern of brightness values appears in a (slightly) different place in the next image.

Template Tracking. Given: a small image patch of something we're looking for. Goal: find the best-match location in the new image. How: search in a small window around its previous location.

How do we compute a matching score? With the Normalized Correlation Coefficient (NCC).
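The slides do not spell the score out; the usual definition compares the template F against a candidate patch G of the same size after removing each one's mean and normalizing:

$$\mathrm{NCC}(F, G) = \frac{\sum_x \big(F(x) - \bar F\big)\big(G(x) - \bar G\big)}{\sqrt{\sum_x \big(F(x) - \bar F\big)^2}\,\sqrt{\sum_x \big(G(x) - \bar G\big)^2}}$$

Scores close to 1 indicate a strong match, and the normalization makes the score insensitive to overall brightness and contrast changes.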

Template Tracking. Goal: find the best-match location in the new image. How to find the best location? Exhaustive search (convolution style).

Template Tracking algorithm. Given: a template and an initial location x_0. For each image t = 1..n: search in a small window around x_{t-1}; x_t is the location with the highest NCC score.
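A minimal sketch of this brute-force tracker, assuming OpenCV and NumPy are available; cv2.matchTemplate with TM_CCOEFF_NORMED evaluates the NCC score at every offset inside the search window, and the best-scoring offset becomes the new location.

```python
import cv2
import numpy as np

def track_template(frames, template, x0, search_radius=20):
    """Brute-force NCC template tracker (sketch).

    frames:   iterable of grayscale frames (uint8 numpy arrays)
    template: small grayscale patch to look for
    x0:       (x, y) top-left corner of the template in the first frame
    """
    th, tw = template.shape
    x, y = x0
    track = [(x, y)]
    for frame in frames:
        H, W = frame.shape
        # Clip the search window around the previous location to the image.
        wx0, wy0 = max(0, x - search_radius), max(0, y - search_radius)
        wx1, wy1 = min(W, x + tw + search_radius), min(H, y + th + search_radius)
        window = frame[wy0:wy1, wx0:wx1]
        # NCC score for every candidate offset inside the window.
        scores = cv2.matchTemplate(window, template, cv2.TM_CCOEFF_NORMED)
        _, _, _, best = cv2.minMaxLoc(scores)   # best = (dx, dy) of the max score
        x, y = wx0 + best[0], wy0 + best[1]
        track.append((x, y))
    return track
```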

Template Tracking challenges:
- Computational cost
- Getting lost / drift
- Non-translational motion (e.g., rotation)
- Non-rigid motion (e.g., articulation of a hand)
- 3D motion
- Changing appearance of the real object

Discussion Questions: What are the benefits / downsides of using larger templates or search windows? Why are rotation and scaling problematic for a template tracker? If we update the template as we track, what problems do we solve, and what problems do we create? How could we benefit from using a constellation of smaller templates instead of one big one?

Discussion notes:
- Other challenges: motion blur, image brightness / contrast changes, computational cost. What counts as "the location" of an object?
- What different types of tracking could you do on XXX video?
- Assumptions for template tracking: small changes between frames; translation only (no rotation, no scaling); what about changes in brightness / contrast?
- How do we choose the search window and the size of the template? Could a collection of smaller templates replace one big template?
- Exhaustive search is expensive: a coarse-to-fine strategy helps.
- How does change accumulate over time? We can update the template as we track, but what might happen if we do that?

Question of the Day: How can we track features and do better than brute-force search?

Lucas-Kanade. Goal: find the location in the new image with the best match. What if we could do better than exhaustive search? How could we direct our search for the best match using the difference between the two images and the image gradient?

Background: Taylor series. Any (sufficiently smooth) function can be approximated with a polynomial. Truncated Taylor series: a first-order approximation. Taylor series example: see below.
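The slides illustrate the example graphically; the truncated, first-order expansion used throughout the derivation is

$$F(x + h) \approx F(x) + h\,F'(x).$$

For instance, with F(x) = x^2 at x = 3 and a small step h = 0.1, the true value is F(3.1) = 9.61, while the first-order approximation gives F(3) + 0.1 * F'(3) = 9 + 0.6 = 9.6.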

Background: the Newton-Raphson method for finding roots (zeros) of a function. Algorithm: starting from a guess, repeatedly update x to x - f(x)/f'(x) until f(x) is close enough to zero.

Finding the Best Match in 1D. Two curves: F(x) and G(x). Displacement: h. Goal: find the displacement. (Derivation on board.)

Lucas-Kanade assumptions:
- G is a translated version of F: G(x) = F(x + h)
- A first-order approximation is sufficient: F(x + h) ≈ F(x) + h F'(x)
- The displacement is small
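The board derivation is not transcribed; under the assumptions above, the standard least-squares step (in the spirit of Lucas and Kanade's original formulation) comes from minimizing the squared error over the patch:

$$E(h) = \sum_x \big(F(x) + h\,F'(x) - G(x)\big)^2, \qquad \frac{\partial E}{\partial h} = 0 \;\Rightarrow\; h \approx \frac{\sum_x F'(x)\,\big(G(x) - F(x)\big)}{\sum_x F'(x)^2}.$$

Because the approximation is only first order, this h is one step of an iteration, Newton-Raphson style: move by h, re-evaluate, and repeat until the error is small.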

Background: the multivariate first-order approximation.

Finding the Best Match in 2D. Two surfaces: F(x) and G(x), where x is now a 2D image location. Displacement: h. Goal: find the displacement.

Lucas-Kanade assumptions (as in 1D):
- G is a translated version of F: G(x) = F(x + h)
- A first-order approximation is sufficient
- The displacement is small
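In 2D the same derivation, with the image gradient in place of F', leads to a 2x2 linear system; this is the "tracking equation" referred to below:

$$F(\mathbf{x} + \mathbf{h}) \approx F(\mathbf{x}) + \nabla F(\mathbf{x})^{\top}\mathbf{h}, \qquad \Big[\sum_{\mathbf{x}} \nabla F(\mathbf{x})\,\nabla F(\mathbf{x})^{\top}\Big]\mathbf{h} = \sum_{\mathbf{x}} \nabla F(\mathbf{x})\,\big(G(\mathbf{x}) - F(\mathbf{x})\big).$$

The 2x2 matrix on the left collects sums of products of image derivatives over the patch; it must be invertible for the displacement to be recoverable.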

Lucas-Kanade algorithm. For each patch, use the previous location as the initial guess. Until the error (F(x+h) - G(x)) is sufficiently small:
- Compute the summations over the gradient
- Compute the summation over the image values
- Find the displacement h by solving the resulting linear system
Use these equations to guide the search over several iterations.

Lucas-Kanade details: compute the sums with weights, for example down-weighting pixels far from the patch center, or pixels where the image values match but the gradients do not.
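A minimal NumPy sketch of this iteration for a single patch, assuming a well-textured patch so that the 2x2 matrix is invertible (the Kanade-Tomasi condition discussed below); the convention here is that the returned (hx, hy) is where the patch has moved to in the new frame, and weighting of the sums is omitted for brevity.

```python
import numpy as np

def lucas_kanade_patch(F, G, x0, y0, size=15, iters=10):
    """Translational Lucas-Kanade for one patch (sketch).

    F, G:     consecutive grayscale frames as float numpy arrays
    (x0, y0): integer top-left corner of the patch in F
    Returns the estimated displacement (hx, hy) of the patch from F to G.
    """
    patch = F[y0:y0 + size, x0:x0 + size]
    gy, gx = np.gradient(patch)                 # gradients of the template patch
    # 2x2 "tracking equation" matrix: sums of gradient products over the patch.
    A = np.array([[np.sum(gx * gx), np.sum(gx * gy)],
                  [np.sum(gx * gy), np.sum(gy * gy)]])
    hx, hy = 0.0, 0.0
    for _ in range(iters):
        # Sample G at the current displaced location (nearest pixel, for simplicity).
        xs, ys = int(round(x0 + hx)), int(round(y0 + hy))
        moved = G[ys:ys + size, xs:xs + size]
        if moved.shape != patch.shape:
            break                               # drifted outside the image
        diff = patch - moved                    # residual error over the patch
        b = np.array([np.sum(gx * diff), np.sum(gy * diff)])
        step = np.linalg.solve(A, b)            # solve A * step = b
        hx, hy = hx + step[0], hy + step[1]
        if np.hypot(step[0], step[1]) < 1e-3:   # converged
            break
    return hx, hy
```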

Lucas-Kanade details: when the misregistration might be large with respect to the image patch, smooth the image, or use a coarse-to-fine strategy (search in a low-resolution image for an approximate match, then refine in the high-resolution image).

Kanade-Tomasi. How to choose features to track? Manual annotation? Large gradients? Zero-crossings of the Laplacian? Corners? No!

Kanade-Tomasi. How to choose features to track? Look to the tracking equation. What properties of its 2x2 matrix can we use? If the matrix is not invertible, we can't track this patch. How do we know if it is invertible? Its determinant; slightly more information: its eigenvalues. Find regions of the image where:
- the eigenvalues are sufficiently large (larger than in a patch that is just noise), and
- the ratio of the eigenvalues is reasonable (the matrix is well-conditioned).
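A hedged NumPy sketch of this selection rule; the thresholds min_eig and max_ratio are illustrative values, not from the slides, and patches are scored on a simple non-overlapping grid.

```python
import numpy as np

def good_patches(image, size=15, min_eig=1.0, max_ratio=10.0):
    """Kanade-Tomasi style patch selection (sketch).

    Scores each size x size patch by the eigenvalues of the 2x2 matrix of
    summed gradient products and keeps patches whose smaller eigenvalue is
    large enough and whose eigenvalue ratio is not too extreme.
    """
    gy, gx = np.gradient(image.astype(float))
    keep = []
    H, W = image.shape
    for y in range(0, H - size, size):
        for x in range(0, W - size, size):
            wx = gx[y:y + size, x:x + size]
            wy = gy[y:y + size, x:x + size]
            Z = np.array([[np.sum(wx * wx), np.sum(wx * wy)],
                          [np.sum(wx * wy), np.sum(wy * wy)]])
            lam_small, lam_big = np.linalg.eigvalsh(Z)   # ascending order
            if lam_small > min_eig and lam_big < max_ratio * lam_small:
                keep.append((x, y))
    return keep
```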

Shi-Tomasi. Includes the work from Kanade-Tomasi. Additionally: an extension from G(x) = F(x + h) to G(x) = F(Ax + h), an affine model of motion. When should we give up on patches we are tracking? When the dissimilarity score is bad (a large difference between G(x) and F(Ax + h) after solving for A and h). Use the translation model for tracking and the affine model for deciding when to give up.

Discussion Questions: Why would we want to bother with this approach? Why is it important for the displacement to be small with respect to the window size? What can we do if our assumption about the displacement is not true?
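For experimenting with these ideas in practice, OpenCV packages the pieces discussed above: cv2.goodFeaturesToTrack selects corners by the minimum-eigenvalue (Shi-Tomasi) criterion, and cv2.calcOpticalFlowPyrLK runs pyramidal (coarse-to-fine) Lucas-Kanade tracking. A minimal sketch, assuming two consecutive color frames:

```python
import cv2

def klt_track(prev_frame, next_frame, max_corners=200):
    """Select Shi-Tomasi corners in one frame and track them into the next (sketch)."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(next_frame, cv2.COLOR_BGR2GRAY)
    # Corner selection via the minimum-eigenvalue criterion.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=max_corners,
                                  qualityLevel=0.01, minDistance=7)
    # Pyramidal Lucas-Kanade: coarse-to-fine, small window at each level.
    new_pts, status, err = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, pts, None, winSize=(15, 15), maxLevel=3)
    ok = status.ravel() == 1                    # keep only successfully tracked points
    return pts[ok], new_pts[ok]
```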