COMPUTER VISION 2017-2018 > OPTICAL FLOW UTRECHT UNIVERSITY RONALD POPPE


OUTLINE Optical flow Lucas-Kanade Horn-Schunck Applications of optical flow Optical flow tracking Histograms of oriented flow Assignment

OPTICAL FLOW

OPTICAL FLOW The process of estimating the motion field (the trajectory of points) from time-varying image intensity (video) Or: mapping pixels from one frame to another

OPTICAL FLOW 2 For each pixel, we want to know its location in the next frame Can be described as a vector: Length (amount of movement) Direction of movement Can be color-coded
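The vector description above (a length and a direction per pixel) can be computed directly; a minimal sketch with a hypothetical flow vector (3, 4):

```python
import math

# Hypothetical flow vector: the pixel moves 3 pixels right and 4 pixels down.
u, v = 3.0, 4.0

length = math.hypot(u, v)                    # amount of movement: 5 pixels
direction = math.degrees(math.atan2(v, u))   # direction of movement, in degrees

print(length, direction)
```

The (length, direction) pair is also what color-coded flow visualisations encode: direction as hue, length as saturation or brightness.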

OPTICAL FLOW 3 Example: motion in video (Brox et al., PAMI 2011) http://lmb.informatik.uni-freiburg.de/people/brox/demos.html


OPTICAL FLOW 6 Optical flow is due to: Object motion Camera motion Variations in intensity

APPLICATIONS Tracking Estimate depth from flow Background subtraction Object detection Image stabilization Activity recognition

BASIC IDEA For now, we assume two subsequent frames in a video Basic idea: For each pixel, determine where it is in the next frame We call this dense optical flow Two well-known methods: Lucas-Kanade Horn-Schunck Both are based on the same principle

ASSUMPTION Formally: (x, y, t) is a pixel (x, y) at time t f(x, y, t) the pixel value (intensity) When analyzing image motion, we use an important assumption based on object appearance: The intensity of the pixel under motion does not change Brightness constancy assumption (BCA) f(x + dx, y + dy, t + dt) = f(x, y, t)

ASSUMPTION 2 An image is a 2D projection of a 3D scene! The intensity-constancy assumption is typically true for: Short-time movement Movement parallel to the image plane (in-plane or 2D motion) But less so for: Out-of-plane movement (out-of-plane rotation) At depth borders Occlusions Discontinuities at the boundaries!

ASSUMPTION 3 Example video: Garden sequence http://persci.mit.edu/demos/jwang/garden-layer/orig-seq.html


FORMULATION Consider a pixel (x,y) at time t with intensity f(x, y, t) At t+dt, the pixel has moved with (dx,dy): (x+dx, y+dy, t+dt) We use u and v to denote dx/dt and dy/dt We are interested in finding the (u,v) for each pixel Vector (u,v) is the apparent 2D motion of the pixel (optical flow) To find the optical flow for an image, we need to find the most likely values of (u,v) for each pixel

FORMULATION 2 Question: what are the (u,v) vectors to go from: Frame 1 to frame 2 Answer: (2, 0) Frame 2 to frame 3 Answer: (1, 1)

FORMULATION 3 Example: J(x,y) = I(x + u(x, y), y + v(x, y)) I(1, 3) = 9 u(1, 3) = 3, v(1, 3) = 0 J(1, 3) = I(1 + u(1, 3), 3 + v(1, 3)) = I(4, 3) = 12

FORMULATION 4 Intensity assumption: f(x, y, t) = f(x + dx, y + dy, t + dt) For a constant intensity (pixel does not change value): df(x, y, t)/dt = 0 We will use a Taylor series expansion: f(x + dx) = f(x) + (∂f/∂x)dx + higher-order terms Same for y and t

FORMULATION 5 With Taylor expansion (only 1st derivative), this becomes: f(x + dx, y + dy, t + dt) ≈ f(x, y, t) + (∂f/∂x)dx + (∂f/∂y)dy + (∂f/∂t)dt And its derivative to t: (∂f/∂x)(dx/dt) + (∂f/∂y)(dy/dt) + ∂f/∂t = 0 Substituting with u and v: u = dx/dt, v = dy/dt, so (∂f/∂x)u + (∂f/∂y)v + ∂f/∂t = 0

FORMULATION 6 Again: (∂f/∂x)u + (∂f/∂y)v + ∂f/∂t = 0 For convenience, we write: f_x = ∂f/∂x, f_y = ∂f/∂y and f_t = ∂f/∂t So: f_x u + f_y v = -f_t f_x and f_y can be measured from the image: they are the gradients in x- and y-direction! Apply derivative masks to first and second frame: x: [-1 1; -1 1], y: [-1 -1; 1 1], t (1st): [-1 -1; -1 -1], (2nd): [1 1; 1 1]
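The gradients f_x, f_y and f_t can be estimated from two frames with simple difference masks; a minimal numpy sketch (the forward-difference kernels here are one common choice, an assumption, not necessarily the exact masks from the slide):

```python
import numpy as np

def gradients(f1, f2):
    """Estimate f_x, f_y (spatial) and f_t (temporal) from two frames."""
    f1, f2 = f1.astype(float), f2.astype(float)
    # Spatial gradients: forward differences, averaged over both frames.
    fx = 0.5 * ((np.roll(f1, -1, axis=1) - f1) + (np.roll(f2, -1, axis=1) - f2))
    fy = 0.5 * ((np.roll(f1, -1, axis=0) - f1) + (np.roll(f2, -1, axis=0) - f2))
    # Temporal gradient: difference between the two frames.
    ft = f2 - f1
    return fx, fy, ft
```

For a horizontal intensity ramp that brightens by 1 between frames, this yields f_x = 1, f_y = 0 and f_t = 1 away from the (wrapping) borders.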

ERROR FUNCTION We have to estimate (u,v) for each pixel How to know what a good estimate is? Remember the intensity-constancy assumption: Pixels are not supposed to change in intensity The cost of moving a pixel to a new location is the (squared) difference in intensity: e(u, v) = (f_x u + f_y v + f_t)^2

ISSUE For each pixel, we have the equation f_x u + f_y v = -f_t Optical flow: u and v need to be estimated Two unknowns, one equation: problem!

ISSUE 2-10 [animation frames illustrating the aperture problem]

ISSUE 11 Within the view, we cannot determine the movement


ISSUE 13 Animations! http://web.mit.edu/persci/demos/motion&form/demos/two-squares/two-squares.html

ISSUE 14 So we need more equations to estimate the optical flow (u,v) for each pixel Aperture problem when f_x and/or f_y are zero (edges and uniform areas), because of f_x u + f_y v = -f_t Ideally, we include a corner to be sure about the estimate Two options: Look at motion in a small region: Lucas-Kanade Introduce additional constraints: Horn-Schunck

LUCAS-KANADE

LUCAS-KANADE Basic idea: Assume that neighboring pixels have the same optical flow Consider a small window around a pixel More equations with the same two unknowns: solvable! Formally: For each pixel i in window W, we have: f_xi u + f_yi v = -f_ti Written in matrix form: A (u, v)^T = b, with A the n x 2 matrix with rows (f_xi, f_yi) and b the vector of -f_ti n: the number of pixels in the window (W = 5x5 gives n = 25)

LUCAS-KANADE 2 So we have a series of equations: A (u, v)^T = b Over-determined system! Recall our error term: e(u, v) = (f_x u + f_y v + f_t)^2 Adapted for windows: E(u, v) = Σ_i∈W (f_xi u + f_yi v + f_ti)^2 We want to find the optical flow (u, v) for which the sum of squared pixel intensity differences is smallest: minimize the error

LUCAS-KANADE 3 We want to minimize this error: E(u, v) = Σ_i∈W (f_xi u + f_yi v + f_ti)^2 Typical solution: solve for u and v separately by setting their partial derivatives to zero Use the chain rule: ∂E(u, v)/∂u = Σ_i 2 f_xi (f_xi u + f_yi v + f_ti) = 0 and ∂E(u, v)/∂v = Σ_i 2 f_yi (f_xi u + f_yi v + f_ti) = 0 The factor 2 can be dropped (it is a constant)

LUCAS-KANADE 4 We can write: Σ_i f_xi (f_xi u + f_yi v + f_ti) = 0 Σ_i f_yi (f_xi u + f_yi v + f_ti) = 0 As: (Σ f_xi^2) u + (Σ f_xi f_yi) v = -Σ f_xi f_ti (Σ f_xi f_yi) u + (Σ f_yi^2) v = -Σ f_yi f_ti And in matrix form: [Σ f_xi^2, Σ f_xi f_yi; Σ f_xi f_yi, Σ f_yi^2] (u, v)^T = -(Σ f_xi f_ti, Σ f_yi f_ti)^T

LUCAS-KANADE 5 With the same analogy as: a x = b so x = a^-1 b We can pre-multiply both sides of: M (u, v)^T = c (writing the 2x2 matrix above as M and the right-hand side as c) with the inverse M^-1, which gives us: (u, v)^T = M^-1 c

LUCAS-KANADE 6 From: (u, v)^T = M^-1 c Eventually, we can calculate the optical flow in both the u and v direction: u = (Σ f_xi f_yi Σ f_yi f_ti - Σ f_yi^2 Σ f_xi f_ti) / (Σ f_xi^2 Σ f_yi^2 - (Σ f_xi f_yi)^2) v = (Σ f_xi f_yi Σ f_xi f_ti - Σ f_xi^2 Σ f_yi f_ti) / (Σ f_xi^2 Σ f_yi^2 - (Σ f_xi f_yi)^2)

LUCAS-KANADE 7 We could also calculate this directly: (u, v)^T = A^+ b, with A the n x 2 matrix of gradients and b = -(f_t1, ..., f_tn)^T A^+ = (A^T A)^-1 A^T is the pseudo-inverse Reworking the right-hand side will result in the same equations
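The windowed least-squares solution described on these slides can be sketched in a few lines of numpy; a minimal version that builds the 2x2 normal equations from the gradients in one window (the function name is mine):

```python
import numpy as np

def lucas_kanade_window(fx, fy, ft):
    """Solve f_x u + f_y v = -f_t in the least-squares sense for one window.

    fx, fy, ft: arrays with the gradients of all pixels in the window.
    """
    fx, fy, ft = (a.ravel().astype(float) for a in (fx, fy, ft))
    # Normal equations: M (u, v)^T = c
    M = np.array([[np.sum(fx * fx), np.sum(fx * fy)],
                  [np.sum(fx * fy), np.sum(fy * fy)]])
    c = -np.array([np.sum(fx * ft), np.sum(fy * ft)])
    u, v = np.linalg.solve(M, c)  # fails when M is singular (aperture problem)
    return u, v
```

Calling `np.linalg.lstsq` on the full n x 2 system A (u, v)^T = b gives the same answer via the pseudo-inverse; the 2x2 route is just cheaper.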

LUCAS-KANADE 8 Lucas-Kanade in practice uses small windows: 3x3 or 5x5 Cannot cope with larger movements (outside the window) For example: With a 5x5 window, u and v cannot be larger than 2 in any direction When the movement is larger, the results are unstable the best optical flow estimate is out of reach Solution: Use pyramids (recall SIFT lecture)


LUCAS-KANADE 10 Without pyramids: unstable results with larger motion

LUCAS-KANADE 11 Application of pyramids: resolution is much lower

LUCAS-KANADE 12 With pyramids: stable results with larger motion

LUCAS-KANADE 13 Some of the properties of Lucas-Kanade: Depends on image gradients (f x and f y ) For uniform colors f x =f y =0, optical flow is not defined Fast! Usually applied for each pixel: dense optical flow estimation Can also be applied only around corners (similar to SIFT) sparse result Noise has large effect on gradient, so also on optical flow estimation: Can be reduced by applying a Gaussian filter
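The "flow is not defined for uniform colors" property can be made concrete: the 2x2 matrix of summed gradient products is only invertible when the window has gradients in two directions. A minimal sketch (the threshold and the test patches are illustrative assumptions):

```python
import numpy as np

def flow_is_well_defined(fx, fy, threshold=1e-6):
    """Lucas-Kanade flow is well defined in a window when the 2x2
    gradient matrix has two sufficiently large eigenvalues."""
    M = np.array([[np.sum(fx * fx), np.sum(fx * fy)],
                  [np.sum(fx * fy), np.sum(fy * fy)]])
    return np.min(np.linalg.eigvalsh(M)) > threshold

# Uniform patch: fx = fy = 0 everywhere -> M is zero, flow undefined.
# Edge patch: gradient in one direction only -> one zero eigenvalue,
#             aperture problem.
# Corner-like patch: gradients in both directions -> well defined.
```

This is the same two-eigenvalue criterion that Harris-style corner detectors use, which is why corners are good points to track.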

QUESTIONS?

HORN-SCHUNCK

HORN-SCHUNCK Instead of using a window of pixels, Horn-Schunck optical flow estimation uses a smoothness constraint: Smoothly varying values of u and v are favored over abrupt changes

HORN-SCHUNCK 2 Instead of using a window of pixels, Horn-Schunck optical flow estimation uses a smoothness constraint: Smoothly varying values of u and v are favored over abrupt changes Original error function: e(u, v) = (f_x u + f_y v + f_t)^2 Error function in Horn-Schunck: E(u, v) = ∫∫ (f_x u + f_y v + f_t)^2 + α^2 (|∇u|^2 + |∇v|^2) dx dy With the del operator: ∇ = (∂/∂x, ∂/∂y) (vector of partial derivatives to each variable)

HORN-SCHUNCK 3 Horn-Schunck error function: E(u, v) = ∫∫ (f_x u + f_y v + f_t)^2 + α^2 (|∇u|^2 + |∇v|^2) dx dy Intensity assumption (first term), smoothness term (second term), weighting factor α^2 Again, we want to find the (u, v) with the lowest error Intuitively, we are searching for (u, v) that: Follows the intensity assumption, and Varies smoothly across the image Global estimation (hence the integrals)

HORN-SCHUNCK 4 Smoothness term: |∇u|^2 + |∇v|^2 = (∂u/∂x)^2 + (∂u/∂y)^2 + (∂v/∂x)^2 + (∂v/∂y)^2 It is zero when ∂u/∂x = ∂u/∂y = 0 and ∂v/∂x = ∂v/∂y = 0 The motion of neighboring pixels should be the same

HORN-SCHUNCK 5 E(u, v) = ∫∫ (f_x u + f_y v + f_t)^2 + α^2 (|∇u|^2 + |∇v|^2) dx dy Using the Euler-Lagrange equations, we obtain the following two equations to minimize: f_x (f_x u + f_y v + f_t) - α^2 Δu = 0 f_y (f_x u + f_y v + f_t) - α^2 Δv = 0 Discrete case: Δu = ū - u and Δv = v̄ - v ū and v̄ are the average values of u and v

HORN-SCHUNCK 6 For the average flow values ū and v̄, we look at the local neighborhood of a pixel: Apply an averaging filter (Δu = ū - u then approximates a Laplacian) Typically of limited size (3x3): [0 1/4 0; 1/4 0 1/4; 0 1/4 0] or the weighted version [1/12 1/6 1/12; 1/6 0 1/6; 1/12 1/6 1/12]

HORN-SCHUNCK 7 Eventually, u and v become: u = ū - f_x P / D v = v̄ - f_y P / D With: P = f_x ū + f_y v̄ + f_t and D = α^2 + f_x^2 + f_y^2

HORN-SCHUNCK 8 The two equations: f_x (f_x u + f_y v + f_t) - α^2 (ū - u) = 0 f_y (f_x u + f_y v + f_t) - α^2 (v̄ - v) = 0 Depend on the initial values of u and v Estimated iteratively, with u = u_k First set u_0 = v_0 = 0 Then, set u_k+1 = ū_k - f_x P_k / D and v_k+1 = v̄_k - f_y P_k / D Repeat until convergence (local minimum)
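The iterative scheme above can be sketched in numpy; a minimal version (the 4-neighbour averaging kernel and the wrap-around borders are simplifying assumptions):

```python
import numpy as np

def horn_schunck(fx, fy, ft, alpha=1.0, n_iter=100):
    """Iterative Horn-Schunck update:
    u_{k+1} = ubar_k - fx * P / D, v_{k+1} = vbar_k - fy * P / D,
    with P = fx*ubar + fy*vbar + ft and D = alpha^2 + fx^2 + fy^2."""
    u = np.zeros_like(fx, dtype=float)
    v = np.zeros_like(fx, dtype=float)

    def avg(a):
        # Average of the four direct neighbours (borders wrap around).
        return 0.25 * (np.roll(a, 1, 0) + np.roll(a, -1, 0)
                       + np.roll(a, 1, 1) + np.roll(a, -1, 1))

    D = alpha ** 2 + fx ** 2 + fy ** 2
    for _ in range(n_iter):
        ubar, vbar = avg(u), avg(v)
        P = fx * ubar + fy * vbar + ft
        u = ubar - fx * P / D
        v = vbar - fy * P / D
    return u, v
```

For a uniform gradient field consistent with a single translation, the iteration converges to that translation; on real images the smoothness term propagates flow into low-gradient regions.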

HORN-SCHUNCK 9 Properties of Horn-Schunck optical flow: Smooth flow (no jumps) Global information, so larger motion is possible Slow because of its iterative nature Still: problems at the boundaries Also: might get stuck in a local minimum

APPLICATIONS OF OPTICAL FLOW

OPTICAL FLOW TRACKING Given that we have an estimate of the movement of a pixel, or a small window (Lucas-Kanade), we can track this pixel Kanade-Lucas-Tomasi is a well-known tracker

KLT TRACKING KLT (Kanade-Lucas-Tomasi) tracking works by estimating the optical flow at certain points, from frame to frame: Typically, Harris corners are used Found on corners with strong gradient Comparable to interest points (remember from SIFT) KLT tracking is straightforward for translations of single points, more difficult for: More complex transformations (rotations, affine) Multiple points See links on website for further reading

KLT TRACKING Circles are tracked points http://www.youtube.com/watch?v=e86nlznbul8

HISTOGRAMS OF ORIENTED FLOW We can calculate optical flow within a bounding box Often, specific movement in parts of the region is meaningful We can use the histogram of oriented gradients (HOG) concept Optical flow has a magnitude and orientation Bin flow instead of gradients
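Binning flow instead of gradients can be sketched as follows; a minimal magnitude-weighted orientation histogram (the bin layout and L1 normalisation are assumptions):

```python
import numpy as np

def hof(u, v, n_bins=8):
    """Histogram of oriented flow: bin flow orientation over a region,
    weighting each pixel by its flow magnitude."""
    mag = np.hypot(u, v)                          # flow magnitude per pixel
    ang = np.arctan2(v, u) % (2 * np.pi)          # orientation in [0, 2*pi)
    bins = np.floor(ang / (2 * np.pi / n_bins)).astype(int) % n_bins
    hist = np.bincount(bins.ravel(), weights=mag.ravel(), minlength=n_bins)
    s = hist.sum()
    return hist / s if s > 0 else hist            # L1-normalised histogram
```

Computed per cell of a bounding box and concatenated, such histograms give a fixed-length motion descriptor, directly analogous to HOG.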

QUESTIONS?

ASSIGNMENT

ASSIGNMENT Assignment 3: Deadline is Tuesday March 6, 23:00 Assignment help session on Tuesday March 6, 13:15-15:00

NEXT LECTURE Next lecture: Training, classification, detection Tuesday March 8, 11:00-12:45, UNNIK-211