Motion estimation. Lihi Zelnik-Manor


Optical Flow: Where did each pixel in image 1 go to in image 2?

Optical Flow Pierre Kornprobst's Demo

Introduction: Given a video sequence with the camera/objects moving, we can better understand the scene if we find the motions of the camera/objects.

Scene Interpretation: How is the camera moving? How many moving objects are there? Which directions are they moving in? How fast are they moving? Can we recognize their type of motion (e.g., walking, running, etc.)?

Applications: Recover camera ego-motion. Result by Mobileye (www.mobileye.com)

Applications: Motion segmentation. Result by: L. Zelnik-Manor, M. Machline, M. Irani, Multi-body Segmentation: Revisiting Motion Consistency, to appear, IJCV

Applications: Structure from Motion. Input and reconstructed shape. Result by: L. Zhang, B. Curless, A. Hertzmann, S.M. Seitz, Shape and motion under varying illumination: Unifying structure from motion, photometric stereo, and multi-view stereo, ICCV '03

Examples of motion fields: forward motion, rotation, horizontal translation. Closer objects appear to move faster!

Motion Field & Optical Flow Field. Motion field = real-world 3D motion. Optical flow field = projection of the motion field onto the 2D image (the CCD plane): a 3D motion vector projects to a 2D optical flow vector $\mathbf{u} = (u, v)$.

When does it break? The screen is stationary yet displays motion. Homogeneous objects generate zero optical flow. A fixed sphere under a changing light source. Non-rigid texture motion.

The Optical Flow Field. Still, in many cases it does work. Goal: find for each pixel a velocity vector $\mathbf{u} = (u, v)$ which says how quickly the pixel is moving across the image and in which direction it is moving.

How do we actually do that?

Estimating Optical Flow. Assume the image intensity $I$ is constant. A point at $(x, y)$ at time $t$ moves by $(dx, dy)$ by time $t + dt$:
$$I(x, y, t) = I(x + dx, y + dy, t + dt)$$

Brightness Constancy Equation:
$$I(x, y, t) = I(x + dx, y + dy, t + dt)$$
First-order Taylor expansion:
$$I(x + dx, y + dy, t + dt) \approx I(x, y, t) + I_x\,dx + I_y\,dy + I_t\,dt$$
Divide by $dt$ and denote $u = dx/dt$, $v = dy/dt$:
$$I_x u + I_y v + I_t = 0$$
The gradient and the flow are orthogonal. Problem: one equation, two unknowns.

Problem: The Aperture Problem. For points on a line of fixed intensity we can only recover the normal flow. Between time $t$ and time $t + dt$, where did the blue point move to? We need additional constraints.
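To make the "normal flow" concrete: the brightness constancy equation $I_x u + I_y v + I_t = 0$ only determines the component of the flow along the image gradient; the component along the line (edge) is unconstrained, which is exactly the aperture problem:
$$\mathbf{u}_n = -\frac{I_t}{\|\nabla I\|}\,\frac{\nabla I}{\|\nabla I\|}$$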

Use Local Information. Sometimes enlarging the aperture can help.

Local smoothness: Lucas & Kanade (1981). Assume constant $(u, v)$ in a small neighborhood. Each pixel $i$ in the window contributes one constraint $I_{x_i} u + I_{y_i} v = -I_{t_i}$, and stacking all of them gives
$$\begin{bmatrix} I_{x_1} & I_{y_1} \\ I_{x_2} & I_{y_2} \\ \vdots & \vdots \end{bmatrix}\begin{bmatrix} u \\ v \end{bmatrix} = -\begin{bmatrix} I_{t_1} \\ I_{t_2} \\ \vdots \end{bmatrix}, \qquad A\,\mathbf{u} = \mathbf{b}$$

Lucas & Kanade (1981). Goal: minimize $\|A\mathbf{u} - \mathbf{b}\|^2$. Method: least squares. From $A\mathbf{u} = \mathbf{b}$ we get the normal equations $(A^T A)_{2\times 2}\,\mathbf{u} = (A^T \mathbf{b})_{2\times 1}$, so
$$\mathbf{u} = (A^T A)^{-1} A^T \mathbf{b}$$
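A minimal NumPy sketch of this least-squares step for a single window, assuming grayscale float patches taken from two consecutive frames; the function name and the eigenvalue threshold are illustrative choices, not part of the lecture.

```python
import numpy as np

def lucas_kanade_window(I1, I2):
    """Estimate a single (u, v) for a small window, assuming constant flow
    inside it (one iteration, no warping). I1, I2 are float grayscale patches
    of the same shape at times t and t+dt."""
    # Spatial gradients from the first patch, temporal derivative between frames.
    Iy, Ix = np.gradient(I1)
    It = I2 - I1

    # Stack one brightness-constancy equation per pixel: A u = b.
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)   # N x 2
    b = -It.ravel()                                   # length N

    # Normal equations: (A^T A) u = A^T b. Guard against a near-singular A^T A
    # (homogeneous region or pure edge) by checking the smallest eigenvalue.
    ATA = A.T @ A
    if np.linalg.eigvalsh(ATA).min() < 1e-6:
        return None  # flow not recoverable from this window
    u, v = np.linalg.solve(ATA, A.T @ b)
    return u, v
```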

How does Lucas-Kanade behave?
$$A^T A = \begin{bmatrix} \sum I_x^2 & \sum I_x I_y \\ \sum I_x I_y & \sum I_y^2 \end{bmatrix}$$
We want this matrix to be invertible, i.e., to have no zero eigenvalues, so that $\mathbf{u} = (A^T A)^{-1} A^T \mathbf{b}$ is well defined.

How does Lucas-Kanade behave? At an edge, $A^T A$ becomes singular: all gradients $(I_x, I_y)$ are parallel, so the direction along the edge is an eigenvector with eigenvalue 0.

How does Lucas-Kanade behave? In homogeneous regions the gradients $(I_x, I_y) \approx (0, 0)$, so $A^T A \approx 0$ and both eigenvalues are close to 0.

How does Lucas-Kanade behave? In textured regions the gradients $(I_x, I_y)$ point in many directions, so $A^T A$ has two large eigenvalues.

How does Lucas-Kanade behave? Summary: at an edge, $A^T A$ becomes singular; in homogeneous regions the gradients are low and $A^T A \approx 0$; in highly textured regions $A^T A$ is well conditioned.

Iterative Refinement. Estimate the velocity at each pixel using one iteration of Lucas-Kanade estimation, $\mathbf{u} = (A^T A)^{-1} A^T \mathbf{b}$. Warp one image toward the other using the estimated flow field (easier said than done). Refine the estimate by repeating the process.

Optical Flow: Iterative Estimation. Start from the initial guess $u_0 = 0$; at each step, estimate an update $\hat{u}$ and set $u_{k+1} = u_k + \hat{u}$, so the successive estimates $u_1, u_2, u_3, \dots$ approach the true shift (figures: each iteration shown on the translated signal $f$).

Optical Flow: Iterative Estimation. Some implementation issues: warping is not easy (ensure that errors in warping are smaller than the estimate refinement); warp one image and take derivatives of the other, so you don't need to re-compute the gradient after each iteration; it is often useful to low-pass filter the images before motion estimation (for better derivative estimation and better linear approximations to image intensity).
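A sketch of the warp-and-refine loop just described, under simple assumptions: a dense flow field of shape (H, W, 2), bilinear warping via scipy.ndimage.map_coordinates, and a hypothetical one_step_flow(I1, I2_warped) helper that returns a flow increment (e.g., one Lucas-Kanade pass over local windows).

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp_image(I, flow):
    """Warp image I by the current flow estimate using bilinear sampling."""
    h, w = I.shape
    y, x = np.mgrid[0:h, 0:w].astype(float)
    coords = np.stack([y + flow[..., 1], x + flow[..., 0]])
    return map_coordinates(I, coords, order=1, mode='nearest')

def iterative_flow(I1, I2, one_step_flow, n_iters=5):
    """Iterative refinement: warp I2 toward I1 with the current estimate,
    re-estimate a flow increment, and accumulate it."""
    flow = np.zeros(I1.shape + (2,))
    for _ in range(n_iters):
        I2w = warp_image(I2, flow)            # warp one image toward the other
        flow = flow + one_step_flow(I1, I2w)  # refine with a single LK-style pass
    return flow
```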

Other break-downs. Brightness constancy is not satisfied: use correlation-based methods. A point does not move like its neighbors (what is the ideal window size?): use regularization-based methods. The motion is not small (the Taylor expansion doesn't hold), causing aliasing: use multi-scale estimation.

Optical Flow: Aliasing. Temporal aliasing causes ambiguities in optical flow because images can have many pixels with the same intensity, i.e., how do we know which correspondence is correct? When the actual shift is small, the nearest match is correct (no aliasing); when it is large, the nearest match can be incorrect (aliasing). To overcome aliasing: coarse-to-fine estimation.

Multi-Scale Flow Estimation (figure): a displacement of u = 10 pixels between the two full-resolution images corresponds to u = 5, 2.5, and 1.25 pixels at the successively coarser levels of the Gaussian pyramids of the two images.

Multi-Scale Flow Estimation (coarse-to-fine): run Lucas-Kanade at the coarsest level, warp & upsample, run Lucas-Kanade again, and repeat down the Gaussian pyramids of the two images.
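A compact sketch of this coarse-to-fine loop under simple assumptions: a dense flow field of shape (H, W, 2) and a hypothetical refine(I1, I2, flow) helper that performs one single-level warp-and-refine pass (for example, the iterative Lucas-Kanade loop sketched earlier); the pyramid depth and blur scale are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def coarse_to_fine_flow(I1, I2, refine, levels=4):
    """Coarse-to-fine flow: estimate at the coarsest pyramid level, then
    upsample the flow and refine it at each finer level."""
    # Build Gaussian pyramids (blur, then downsample by 2).
    pyr1, pyr2 = [I1], [I2]
    for _ in range(levels - 1):
        pyr1.append(gaussian_filter(pyr1[-1], 1.0)[::2, ::2])
        pyr2.append(gaussian_filter(pyr2[-1], 1.0)[::2, ::2])

    # Start with zero flow at the coarsest level.
    flow = np.zeros(pyr1[-1].shape + (2,))
    for lvl in reversed(range(levels)):
        flow = refine(pyr1[lvl], pyr2[lvl], flow)
        if lvl > 0:
            # Upsample to the next finer level and double the displacements.
            h, w = pyr1[lvl - 1].shape
            flow = 2.0 * zoom(flow, (h / flow.shape[0], w / flow.shape[1], 1), order=1)
    return flow
```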

Examples: Motion-Based Segmentation. Input and segmentation result. Result by: L. Zelnik-Manor, M. Machline, M. Irani, Multi-body Segmentation: Revisiting Motion Consistency, to appear, IJCV

Examples: Motion-Based Segmentation. Input and segmentation result. Result by: L. Zelnik-Manor, M. Machline, M. Irani, Multi-body Segmentation: Revisiting Motion Consistency, to appear, IJCV

Other break-downs. Brightness constancy is not satisfied: use correlation-based methods. A point does not move like its neighbors (what is the ideal window size?): use regularization-based methods. The motion is not small (the Taylor expansion doesn't hold), causing aliasing: use multi-scale estimation.

Robust Estimation. Sources of outliers (multiple motions): specularities / highlights; JPEG artifacts / interlacing / motion blur; multiple motions (occlusion boundaries, transparency). Noise distributions are often non-Gaussian, having much heavier tails. Noise samples from the tails are called outliers.

Regularization: Horn and Schunck (1981). Add a global smoothness term.
Smoothness error:
$$E_s = \iint_D \left( u_x^2 + u_y^2 + v_x^2 + v_y^2 \right) dx\, dy$$
Error in the brightness constancy equation:
$$E_c = \iint_D \left( I_x u + I_y v + I_t \right)^2 dx\, dy$$
Minimize $E_c + \lambda E_s$; solve by calculus of variations.
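A minimal NumPy sketch of the classic iterative scheme that the calculus-of-variations solution leads to, using a neighborhood-average approximation for the smoothness term; the derivative estimates, alpha, and the fixed iteration count are illustrative choices, not the lecture's exact implementation.

```python
import numpy as np
from scipy.ndimage import convolve

def horn_schunck(I1, I2, alpha=10.0, n_iters=100):
    """Dense flow minimizing brightness-constancy error plus alpha^2 * smoothness,
    iterated with the standard Horn-Schunck update."""
    I1 = I1.astype(float)
    I2 = I2.astype(float)
    Iy, Ix = np.gradient(I1)   # spatial derivatives
    It = I2 - I1               # temporal derivative

    # Kernel approximating the neighborhood averages u_bar, v_bar.
    avg = np.array([[1/12, 1/6, 1/12],
                    [1/6,  0.0, 1/6 ],
                    [1/12, 1/6, 1/12]])

    u = np.zeros_like(I1)
    v = np.zeros_like(I1)
    for _ in range(n_iters):
        u_bar = convolve(u, avg)
        v_bar = convolve(v, avg)
        # Update derived from the Euler-Lagrange equations of E_c + alpha^2 * E_s.
        num = Ix * u_bar + Iy * v_bar + It
        den = alpha**2 + Ix**2 + Iy**2
        u = u_bar - Ix * num / den
        v = v_bar - Iy * num / den
    return u, v
```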

Robust Estimation: Black & Anandan (1993). Regularization can over-smooth across edges; use smarter regularization.

Least Squares Estimation. Standard least-squares estimation allows too much influence for outlying points: with
$$E(m) = \sum_i \rho(x_i - m), \qquad \rho(x) = x^2, \qquad \psi(x) = \frac{\partial \rho}{\partial x} = 2x$$
the influence $\psi$ grows without bound, so a single outlier can pull the estimate $m$ arbitrarily far.

Robust Estimation. Problem: least-squares estimators penalize deviations between data & model with a quadratic error function (extremely sensitive to outliers). Redescending error functions (e.g., Geman-McClure) help to reduce the influence of outlying measurements. (Figures: error penalty function and influence function for the quadratic and the robust cases.)
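For concreteness, the Geman-McClure penalty named above and its redescending influence function (the scale $\sigma$ is a user-chosen constant); note how $\psi$ falls back toward zero for large errors, so outliers lose influence:
$$\rho(x, \sigma) = \frac{x^2}{x^2 + \sigma^2}, \qquad \psi(x, \sigma) = \frac{\partial \rho}{\partial x} = \frac{2 x \sigma^2}{(x^2 + \sigma^2)^2}$$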

Robust Estimation: Black & Anandan (1993). Regularization can over-smooth across edges; use smarter regularization. Minimize
$$\iint_D \rho_1\!\left( I_x u + I_y v + I_t \right) + \lambda \left[ \rho_2(u_x, u_y) + \rho_2(v_x, v_y) \right] dx\, dy$$
where the first term is the brightness constancy error and the second is the smoothness term, with robust penalties $\rho_1, \rho_2$.

Examples: Motion-Based Segmentation. Input and segmentation result. Optical flow estimation by: M. J. Black and P. Anandan, A framework for the robust estimation of optical flow, ICCV '93. Segmentation by: L. Zelnik-Manor, M. Machline, M. Irani, Multi-body Segmentation: Revisiting Motion Consistency, IJCV '06

Parametric motion estimation

Global (parametric) motion models. 2D models: affine, quadratic, planar projective transform (homography). 3D models: instantaneous camera motion models, homography + epipole, plane + parallax.

Motion models: translation (2 unknowns), affine (6 unknowns), perspective (8 unknowns), 3D rotation (3 unknowns).

Affine Motion. For a panning camera or planar surfaces:
$$u(x, y) = p_1 + p_2 x + p_3 y, \qquad v(x, y) = p_4 + p_5 x + p_6 y$$
Substituting into the brightness constancy equation:
$$I_x (p_1 + p_2 x + p_3 y) + I_y (p_4 + p_5 x + p_6 y) + I_t \approx 0$$
Only 6 parameters to solve for, hence better results. Least-squares minimization (over all pixels):
$$Err(\mathbf{p}) = \sum \big[ I_x (p_1 + p_2 x + p_3 y) + I_y (p_4 + p_5 x + p_6 y) + I_t \big]^2$$
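A small NumPy sketch of this global affine least-squares fit, assuming dense derivative images Ix, Iy, It have already been computed; the function name and the parameter ordering (p1..p6, matching the equations above) are illustrative.

```python
import numpy as np

def affine_flow_params(Ix, Iy, It):
    """Fit the 6 affine flow parameters p1..p6 by least squares over all pixels,
    using the linearized brightness constancy Ix*u + Iy*v + It = 0 with
    u = p1 + p2*x + p3*y and v = p4 + p5*x + p6*y."""
    h, w = Ix.shape
    y, x = np.mgrid[0:h, 0:w].astype(float)
    ix, iy, it = Ix.ravel(), Iy.ravel(), It.ravel()
    xf, yf = x.ravel(), y.ravel()

    # One row per pixel: [Ix, Ix*x, Ix*y, Iy, Iy*x, Iy*y] . p = -It
    A = np.stack([ix, ix * xf, ix * yf, iy, iy * xf, iy * yf], axis=1)
    p, *_ = np.linalg.lstsq(A, -it, rcond=None)
    return p  # p1..p6

# The dense flow implied by the parameters:
# u = p[0] + p[1]*x + p[2]*y ;  v = p[3] + p[4]*x + p[5]*y
```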

Other 2D Motion Models.
Quadratic (instantaneous approximation to planar motion):
$$u = q_1 + q_2 x + q_3 y + q_7 x^2 + q_8 x y, \qquad v = q_4 + q_5 x + q_6 y + q_7 x y + q_8 y^2$$
Projective (exact planar motion):
$$x' = \frac{h_1 x + h_2 y + h_3}{h_7 x + h_8 y + h_9}, \qquad y' = \frac{h_4 x + h_5 y + h_6}{h_7 x + h_8 y + h_9}, \qquad u = x' - x, \; v = y' - y$$

3D Motion Models.
Instantaneous camera motion:
$$u = \frac{T_X - T_Z x}{Z} + x y\,\Omega_X - (1 + x^2)\,\Omega_Y + y\,\Omega_Z, \qquad v = \frac{T_Y - T_Z y}{Z} + (1 + y^2)\,\Omega_X - x y\,\Omega_Y - x\,\Omega_Z$$
Global parameters: $\Omega_X, \Omega_Y, \Omega_Z, T_X, T_Y, T_Z$; local parameter: $Z(x, y)$.
Homography + epipole:
$$x' = \frac{h_1 x + h_2 y + h_3 + \gamma t_1}{h_7 x + h_8 y + h_9 + \gamma t_3}, \qquad y' = \frac{h_4 x + h_5 y + h_6 + \gamma t_2}{h_7 x + h_8 y + h_9 + \gamma t_3}, \qquad u = x' - x, \; v = y' - y$$
Global parameters: $h_1, \dots, h_9, t_1, t_2, t_3$; local parameter: $\gamma(x, y)$.
Residual planar parallax motion:
$$u = \frac{\gamma}{1 + \gamma t_3}\,(t_3 x_w - t_1), \qquad v = \frac{\gamma}{1 + \gamma t_3}\,(t_3 y_w - t_2)$$
Global parameters: $t_1, t_2, t_3$; local parameter: $\gamma(x, y)$.

Segmentation of Affine Motion. Input and segmentation result. Result by: L. Zelnik-Manor, M. Irani, Multi-frame estimation of planar motion, PAMI 2000

Panoramas. Input; motion estimation by Andrew Zisserman's group.

Stabilization. Result by: L. Zelnik-Manor, M. Irani, Multi-frame estimation of planar motion, PAMI 2000

Sparse matching

Patch matching (revisited). How do we determine correspondences? Block matching or SSD (sum of squared differences).

Correlation and SSD. For larger displacements, do template matching: define a small area around a pixel as the template; match the template against each pixel within a search area in the next image; use a match measure such as correlation, normalized correlation, or sum-of-squared differences; choose the maximum (or minimum) as the match; refine to a sub-pixel estimate (Lucas-Kanade).
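A minimal sketch of SSD template matching for a single pixel, assuming grayscale float images and an exhaustive integer search; the window half-size and search radius are illustrative parameters.

```python
import numpy as np

def ssd_match(I1, I2, x, y, half=7, radius=15):
    """Find the integer displacement (dx, dy) of the patch centered at (x, y)
    in I1 that minimizes the sum of squared differences against I2."""
    tpl = I1[y - half:y + half + 1, x - half:x + half + 1]
    best, best_d = np.inf, (0, 0)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            cand = I2[y + dy - half:y + dy + half + 1,
                      x + dx - half:x + dx + half + 1]
            if cand.shape != tpl.shape:      # skip displacements outside the image
                continue
            ssd = np.sum((tpl - cand) ** 2)  # smaller SSD = better match
            if ssd < best:
                best, best_d = ssd, (dx, dy)
    return best_d
```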

Discrete Search vs. Gradient-Based. Consider an image $I_0$ translated by $(u_0, v_0)$ with noise $\eta_1$:
$$I_1(x, y) = I_0(x + u_0, y + v_0) + \eta_1(x, y)$$
$$E(u, v) = \sum_{x, y} \big[ I_0(x + u, y + v) - I_1(x, y) \big]^2 = \sum_{x, y} \big[ I_0(x + u, y + v) - I_0(x + u_0, y + v_0) - \eta_1(x, y) \big]^2$$
The discrete search method simply searches for the best estimate. The gradient method linearizes the intensity function and solves for the estimate.

Tracking result

Layered Scene Representations

Motion representations How can we describe this scene?

Block-based motion prediction. Break the image up into square blocks; estimate a translation for each block; use this to predict the next frame and code the difference (MPEG-2).

Layered motion. Break the image sequence up into layers; describe each layer's motion.

Layered Representation. For scenes with multiple affine motions: estimate the dominant motion parameters; reject pixels which do not fit; iterate to convergence; restart on the remaining pixels.
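A sketch of this estimate/reject/restart loop, assuming a hypothetical estimate_affine(Ix, Iy, It, mask) helper that fits affine parameters over the masked pixels (for example, the least-squares fit sketched earlier) and also returns the per-pixel residual of the brightness constancy equation; the residual threshold and layer count are illustrative.

```python
import numpy as np

def dominant_motion_layers(Ix, Iy, It, estimate_affine, n_layers=3, thresh=1.0):
    """Peel off layers: fit the dominant affine motion, reject pixels that do
    not fit, and restart on the remaining pixels."""
    remaining = np.ones_like(Ix, dtype=bool)
    layers = []
    for _ in range(n_layers):
        if not remaining.any():
            break
        params, residual = estimate_affine(Ix, Iy, It, remaining)
        inliers = remaining & (np.abs(residual) < thresh)  # pixels that fit this motion
        layers.append((params, inliers))
        remaining = remaining & ~inliers                   # restart on the rest
    return layers
```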

Some Results. Nebojsa Jojic and Brendan Frey, "Learning Flexible Sprites in Video Layers", CVPR 2001.

A bit more fun: Action Recognition

Recognizing Actions at a Distance. A.A. Efros, A.C. Berg, G. Mori, J. Malik. Image frame and its optical flow $F_{x,y}$. Use optical flow as a template for frame classification.

Recognizing Actions at a Distance. A.A. Efros, A.C. Berg, G. Mori, J. Malik. The optical flow $F_{x,y}$ is split into channels $F_x, F_y$, half-wave rectified into $F_x^+, F_x^-, F_y^+, F_y^-$, and blurred into the final descriptor channels.

Recognizing Actions at a Distance. A.A. Efros, A.C. Berg, G. Mori, J. Malik. Database and test sequence: for each frame in the test sequence, find the closest frame in the database.

Recognizing Actions at a Distance A.A. Efros, A.C. Berg, G. Mori, J. Malik Red bars show classification results

Recognizing Actions at a Distance. A.A. Efros, A.C. Berg, G. Mori, J. Malik. View Greg in the World Cup video.

References on Optical Flow.
Lucas-Kanade method:
B.D. Lucas and T. Kanade, An Iterative Image Registration Technique with an Application to Stereo Vision, IJCAI '81, pp. 674-679.
S. Baker and I. Matthews, Lucas-Kanade 20 Years On: A Unifying Framework, IJCV, Vol. 56, No. 3, March 2004, pp. 221-255. http://www.ri.cmu.edu/projects/project_515.html (papers + code)
Regularization-based methods:
B.K.P. Horn and B.G. Schunck, Determining Optical Flow, Artificial Intelligence, 17 (1981), pp. 185-203.
M.J. Black and P. Anandan, A framework for the robust estimation of optical flow, ICCV '93, May 1993, pp. 231-236. (papers + code)
Comparison of various optical flow techniques:
J.L. Barron, D.J. Fleet, and S. Beauchemin, Performance of optical flow techniques, IJCV, 1994, 12(1):43-77.
Layered representation (affine):
J.R. Bergen, P. Anandan, K.J. Hanna, and R. Hingorani, Hierarchical Model-Based Motion Estimation, ECCV '92, pp. 237-252.

That's all for today.

Interlace vs. progressive scan. http://www.axis.com/products/video/camera/progressive_scan.htm

Progressive scan. http://www.axis.com/products/video/camera/progressive_scan.htm

Interlace. http://www.axis.com/products/video/camera/progressive_scan.htm

Progressive scan vs. interlaced sensors. Most camcorders are interlaced, with several exceptions (check the specs before you buy!); some can be switched between progressive and interlaced. This used to be true also for video cameras (interlaced), but now it's becoming the opposite: many/most digital video cameras are progressive scan.