
Optic Flow and Basics Towards Horn-Schunck

Lecture 7

See Section 4.1 and the beginning of Section 4.2 in
Reinhard Klette: Concise Computer Vision. Springer-Verlag, London, 2014.

See the last slide for copyright information.

Frames in a Video Sequence

- Sequence of images or video frames
- Time difference δt between two subsequent frames
- Example: δt = 1/30 s means 30 Hz (read: hertz) or 30 fps (read: frames per second)
- I(·,·,t) is the frame at time t, with values I(x, y, t)
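As an illustration of this indexing, the following sketch loads a video into a stack of grayscale frames so that frames[t][y, x] plays the role of I(x, y, t). The file name video.avi and the frame rate are placeholders, not values from the slides.

```python
import cv2
import numpy as np

def load_frames(path):
    """Read a video file into a list of grayscale frames I(., ., t)."""
    cap = cv2.VideoCapture(path)
    frames = []
    while True:
        ok, bgr = cap.read()
        if not ok:
            break
        frames.append(cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY))
    cap.release()
    return frames

frames = load_frames("video.avi")   # placeholder file name
fps = 30                            # assumed frame rate
delta_t = 1.0 / fps                 # time difference between subsequent frames
# A value I(x, y, t) is accessed as frames[t][y, x] (NumPy indexes the row y first).
```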

Agenda

1. Local Displacement vs Optic Flow
2. Aperture Problem and Gradient Flow
3. The Horn-Schunck Algorithm
4. Optic Flow Constraint
5. Optimization Problem

Projected Motion

- P = (X, Y, Z) is projected at time t·δt into p = (x, y) in I(·,·,t)
- Camera: focal length f, projection centre O, looking along the optic axis
- The ideal model defines a central projection into the xy image plane
- The 3D motion v·δt is projected into a displacement d in the image plane

2D Motion

Assumptions: the motion of P between t·δt and (t + 1)·δt is
1. linear
2. of constant speed

- Local displacement: the projection d = (ξ, ψ) of this 3D motion
- Visible displacement: the optic flow u = [u, v] from p = (x, y) to p = (x + u, y + v)
- Often the optic flow is not identical to the local displacement

2D Motion versus Optic Flow

- Rotating barber's pole: sketch of the 2D motion (without scaling the vectors) and sketch of the optic flow, an optical illusion
- Lambertian sphere: rotation, but not visible
- Moving light source: a textured static object and a moving light source (e.g., the sun) generate optic flow

Vector Fields

- Rotating rectangle: rotation around a fixpoint, parallel to the image plane
- Motion maps: vectors start at time t and end at time t + 1
- To be visualized by using a color key
- Dense if there are vectors at (nearly) all pixel locations; otherwise sparse
- It is difficult to infer the shape of a polyhedron from a motion map

Color Key

- Colors represent the direction of a vector (starting at the center of the disk)
- Saturation represents the magnitude of the vector, with white for no motion
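A color key of this kind can be approximated with OpenCV's HSV color space. The following sketch (my own illustration, not code from the book) maps flow direction to hue and magnitude to saturation, so that zero motion appears white, as described above.

```python
import cv2
import numpy as np

def flow_to_color(u, v, max_magnitude=None):
    """Visualize a flow field (u, v): direction -> hue, magnitude -> saturation.

    Pixels with no motion stay white, matching the color key on the slide.
    """
    magnitude, angle = cv2.cartToPolar(u.astype(np.float32),
                                       v.astype(np.float32),
                                       angleInDegrees=True)
    if max_magnitude is None:
        max_magnitude = magnitude.max() + 1e-9
    hsv = np.zeros((*u.shape, 3), dtype=np.uint8)
    hsv[..., 0] = (angle / 2).astype(np.uint8) % 180                      # OpenCV hue range is 0..179
    hsv[..., 1] = np.clip(magnitude / max_magnitude * 255, 0, 255).astype(np.uint8)
    hsv[..., 2] = 255                                                     # full brightness -> white at zero saturation
    return cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)

# Example: a synthetic flow field rotating around the image center
h, w = 200, 200
y, x = np.mgrid[0:h, 0:w].astype(np.float32)
u = -(y - h / 2)
v = x - w / 2
cv2.imwrite("color_key_demo.png", flow_to_color(u, v))
```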

Example 1 for the Horn-Schunck Algorithm

- Subsequent frames taken at 25 fps
- Color-coded motion field calculated with the basic Horn-Schunck algorithm
- Sparse (magnified) vectors provide redundant information

Example 2 for the Horn-Schunck Algorithm

- Two frames of a video sequence
- Color-coded motion field calculated with the basic Horn-Schunck algorithm

Agenda

1. Local Displacement vs Optic Flow
2. Aperture Problem and Gradient Flow
3. The Horn-Schunck Algorithm
4. Optic Flow Constraint
5. Optimization Problem

Aperture Defined by the Available Window

- Analogy: sitting in a waiting train, you believe you are moving because the train on the next track has started to move
- A program only sees the two circular windows at time t (left) and time t + 1 (right); it concludes an upward shift and misses the true shift diagonally towards the upper right corner

Camera Aperture

- Visible motion is defined by the aperture of the camera
- Images taken at times t, t + 1, and t + 2
- Inner rectangles: we conclude an upward translation with a minor rotation
- Three images: they indicate a motion of this car to the left
- Ground truth: the car is actually driving around a roundabout

Gradient Flow

- Due to the aperture problem, local optic flow estimation detects the gradient flow
- 2D gradient: ∇_{x,y} I = (I_x(x, y, t), I_y(x, y, t))
- I_x and I_y are the partial derivatives of I(·,·,t) with respect to x and y
- True 2D motion d: diagonally up
- Identified motion: the projection of d onto the gradient vector
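The projection of a true displacement d onto the gradient direction can be computed directly. The following sketch, with a made-up gradient and displacement at a single pixel, illustrates what a purely local method can recover.

```python
import numpy as np

# Assumed (made-up) values at one pixel: image gradient and true 2D motion d
grad = np.array([1.0, 0.0])      # gradient points in the x direction (e.g. a vertical edge)
d = np.array([2.0, 2.0])         # true displacement: diagonally up

# Gradient flow: projection of d onto the unit gradient vector
n = grad / np.linalg.norm(grad)
gradient_flow = (d @ n) * n

print("true motion  :", d)              # [2. 2.]
print("gradient flow:", gradient_flow)  # [2. 0.] -- only the component across the edge is seen
```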

Agenda

1. Local Displacement vs Optic Flow
2. Aperture Problem and Gradient Flow
3. The Horn-Schunck Algorithm
4. Optic Flow Constraint
5. Optimization Problem

Origin of the Algorithm

The algorithm was published in

B.K.P. Horn and B.G. Schunck. Determining optical flow. Artificial Intelligence, vol. 17, pp. 185-203, 1981

as a pioneering work on estimating optic flow. We discuss this algorithm in detail.

Agenda

1. Local Displacement vs Optic Flow
2. Aperture Problem and Gradient Flow
3. The Horn-Schunck Algorithm
4. Optic Flow Constraint
5. Optimization Problem

Taylor Expansion

The difference quotient

    [φ(x) − φ(x_0)] / (x − x_0) = [φ(x_0 + δx) − φ(x_0)] / δx

of a function φ converges to the differential quotient dφ/dx (x_0) as δx → 0.

First-order Taylor expansion:

    φ(x_0 + δx) = φ(x_0) + δx · dφ/dx (x_0) + e

where the error e equals zero if φ is linear on [x_0, x_0 + δx].

Taylor Expansion for the 1D Case

General 1D Taylor expansion:

    φ(x_0 + δx) = Σ_{i=0,1,2,...} (1/i!) · δx^i · d^iφ/dx^i (x_0)

with 0! = 1, i! = 1·2·3·…·i for i ≥ 1, and d^iφ/dx^i the i-th derivative.
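A quick numerical check (my own example, not from the slides) shows how the truncated first-order expansion behaves for a non-linear function such as φ(x) = x²: the error e shrinks quadratically with δx.

```python
import numpy as np

phi = lambda x: x ** 2          # a simple non-linear test function
dphi = lambda x: 2 * x          # its first derivative

x0 = 1.0
for dx in (0.1, 0.01, 0.001):
    exact = phi(x0 + dx)
    first_order = phi(x0) + dx * dphi(x0)
    print(f"dx = {dx}: error e = {exact - first_order:.6f}")
    # errors: 0.010000, 0.000100, 0.000001 -- shrinking like dx^2
```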

3D Taylor Expansion

In the case of our frame sequence:

    I(x + δx, y + δy, t + δt) = I(x, y, t) + δx · ∂I/∂x (x, y, t) + δy · ∂I/∂y (x, y, t) + δt · ∂I/∂t (x, y, t) + e

Assumption 1. Let e = 0, i.e. the function I(·,·,·) behaves approximately linearly for small values of δx, δy, and δt.

Simplifications

Assumption 2. δx and δy model the motion of a pixel between t and t + 1.

Assumption 3. Intensity constancy assumption (ICA):

    I(x + δx, y + δy, t + δt) = I(x, y, t)

This results in

    0 = δx · ∂I/∂x (x, y, t) + δy · ∂I/∂y (x, y, t) + δt · ∂I/∂t (x, y, t)

and, after division by δt, in the optic flow equation:

    0 = (δx/δt) · ∂I/∂x (x, y, t) + (δy/δt) · ∂I/∂y (x, y, t) + ∂I/∂t (x, y, t)
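To make the constraint concrete, the sketch below builds two synthetic frames that differ by a known shift, estimates I_x, I_y, I_t by finite differences, and checks that u·I_x + v·I_y + I_t is close to zero. The Gaussian test pattern and all names are my own choices, not from the lecture.

```python
import numpy as np

# Two synthetic frames: a smooth blob shifted by (u, v) = (1, 0) pixels per frame
h, w = 64, 64
y, x = np.mgrid[0:h, 0:w].astype(np.float64)
blob = lambda cx, cy: np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / 50.0)
I0 = blob(30, 32)          # frame at time t
I1 = blob(31, 32)          # frame at time t + 1, shifted by one pixel in x

# Finite-difference estimates of the partial derivatives
Ix = np.gradient(I0, axis=1)     # dI/dx
Iy = np.gradient(I0, axis=0)     # dI/dy
It = I1 - I0                     # dI/dt with delta t = 1

u_true, v_true = 1.0, 0.0
residual = u_true * Ix + v_true * Iy + It
print("mean |u*Ix + v*Iy + It| =", np.abs(residual).mean())   # small, but not exactly zero
```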

Horn-Schunck Constraint

Take the changes in the x- and y-coordinates during δt as the optic flow

    u(x, y, t) = (u(x, y, t), v(x, y, t))

Horn-Schunck constraint or optic flow equation:

    0 = u(x, y, t) · ∂I/∂x (x, y, t) + v(x, y, t) · ∂I/∂y (x, y, t) + ∂I/∂t (x, y, t)

Short form:

    0 = u·I_x + v·I_y + I_t

The uv Velocity Space

The constraint defines a straight line

    −I_t = u·I_x + v·I_y = u · ∇_{x,y} I

in uv velocity space, with optic flow vector u = [u, v].
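Given estimates of I_x, I_y, I_t at a pixel, every point on this line satisfies the constraint. The sketch below (own example values) parameterizes the line and also computes its point closest to the origin, which is exactly the gradient (normal) flow from the aperture-problem discussion.

```python
import numpy as np

# Assumed derivative estimates at one pixel (made-up numbers)
Ix, Iy, It = 0.8, 0.6, -1.0

# All (u, v) with u*Ix + v*Iy + It = 0 form a line in uv velocity space.
# Point on that line closest to the origin (the gradient / normal flow):
g = np.array([Ix, Iy])
u_normal = -It * g / (g @ g)
print("normal flow:", u_normal)            # [0.8 0.6]
print("check      :", u_normal @ g + It)   # ~0

# Any other point on the line: add a multiple of the direction (-Iy, Ix)
direction = np.array([-Iy, Ix])
u_other = u_normal + 2.0 * direction
print("also valid :", u_other @ g + It)    # ~0
```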

Inner or Dot Vector Product

For vectors a = (a_1, a_2, ..., a_n) and b = (b_1, b_2, ..., b_n), the dot product or inner product is

    a · b = a_1·b_1 + a_2·b_2 + ... + a_n·b_n

    a · b = ‖a‖_2 · ‖b‖_2 · cos α,   with   ‖a‖_2 = √(a_1² + a_2² + ... + a_n²)

where α is the angle between both vectors, 0 ≤ α ≤ π.

What We Have So Far

- I_x, I_y, and I_t are estimated in the given frames
- (u, v) for a pixel (x, y, t) is a point on the given straight line
- But: where on this straight line?

Agenda

1. Local Displacement vs Optic Flow
2. Aperture Problem and Gradient Flow
3. The Horn-Schunck Algorithm
4. Optic Flow Constraint
5. Optimization Problem

Labeling Model and First Labeling Constraint

Labeling model: a labeling function f assigns a label (u, v) to every p ∈ Ω in I(·,·,t). The possible set of vectors (u, v) ∈ R² defines the set of labels.

First constraint for the labeling f: the data error or data energy

    E_data(f) = Σ_Ω [ u·I_x + v·I_y + I_t ]²

needs to be minimized, with u·I_x + v·I_y + I_t = 0 in the ideal case.
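As a sketch of how this data energy could be evaluated for a candidate flow field, assuming the aggregation over Ω is a sum over all pixels and reusing finite-difference derivative estimates as above:

```python
import numpy as np

def data_energy(u, v, Ix, Iy, It):
    """E_data(f) = sum over Omega of (u*Ix + v*Iy + It)^2 for a candidate flow field (u, v)."""
    residual = u * Ix + v * Iy + It
    return np.sum(residual ** 2)

# Example with a constant candidate flow on random derivative maps
rng = np.random.default_rng(0)
Ix, Iy, It = rng.normal(size=(3, 32, 32))
u = np.full((32, 32), 0.5)
v = np.zeros((32, 32))
print("E_data =", data_energy(u, v, Ix, Iy, It))
```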

Second Labeling Constraint

One option: motion constancy within pixel neighborhoods at time t.

Smoothness constraint, defined by the smoothness error or smoothness energy

    E_smooth(f) = Σ_Ω ( u_x² + u_y² + v_x² + v_y² )

where u_x is the first-order derivative of u with respect to x, and so forth.
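A corresponding sketch for the smoothness energy, approximating the derivatives of u and v with forward differences (the same discretization that appears two slides further on):

```python
import numpy as np

def smoothness_energy(u, v):
    """E_smooth(f) approximated with forward differences such as u_{x+1,y} - u_{x,y}."""
    ux = np.diff(u, axis=1)   # u_{x+1,y} - u_{x,y}
    uy = np.diff(u, axis=0)   # u_{x,y+1} - u_{x,y}
    vx = np.diff(v, axis=1)
    vy = np.diff(v, axis=0)
    return (ux ** 2).sum() + (uy ** 2).sum() + (vx ** 2).sum() + (vy ** 2).sum()

u = np.outer(np.arange(4), np.ones(4))   # u changes linearly in the y direction
v = np.zeros((4, 4))
print("E_smooth =", smoothness_energy(u, v))   # 12: twelve unit jumps of size 1, squared
```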

The Optimization Problem

Calculate the labeling function f which minimizes

    E_total(f) = E_data(f) + λ · E_smooth(f)

where λ > 0 is a weight. Example: λ = 0.1.

Total variation (TV): search for an optimum f in the set of all possible labelings. The error terms apply L2 penalties, thus we have a TV-L2 optimization problem.

Least-Squares Error Optimization

Least-squares error (LSE) optimization follows a standard scheme:

1. Define an error or energy function.
2. Calculate the derivatives of this function with respect to all the unknown parameters.
3. Set the derivatives equal to zero and solve the resulting system of equations for the unknowns.

The result defines a minimum of the error function.
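As a small worked example of this scheme (my own, not from the slides): fitting a line y ≈ a·x + b by LSE. Differentiating Σ(a·x_i + b − y_i)² with respect to a and b and setting the derivatives to zero gives the normal equations, solved below with NumPy.

```python
import numpy as np

# Sample data (made up): y roughly 2x + 1 with noise
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Normal equations from dE/da = 0 and dE/db = 0:
#   [ sum(x^2)  sum(x) ] [a]   [ sum(x*y) ]
#   [ sum(x)    n      ] [b] = [ sum(y)   ]
A = np.array([[np.sum(x ** 2), np.sum(x)],
              [np.sum(x),      len(x)   ]])
rhs = np.array([np.sum(x * y), np.sum(y)])
a, b = np.linalg.solve(A, rhs)
print(f"a = {a:.3f}, b = {b:.3f}")   # close to 2 and 1
```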

Task: LSE for the Error Function Defined by Both Constraints

We assume that all pixels have all four of their 4-adjacent pixels also in the image.

    E_data(f) = Σ_Ω [ u·I_x + v·I_y + I_t ]²

    E_smooth(f) = Σ_Ω ( u_{x+1,y} − u_{x,y} )² + ( u_{x,y+1} − u_{x,y} )² + ( v_{x+1,y} − v_{x,y} )² + ( v_{x,y+1} − v_{x,y} )²

Now we have everything prepared for solving the LSE problem.
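For reference, a minimal sketch of where this LSE problem leads: the classical Horn-Schunck iteration, which alternates between local averaging and a correction along the gradient. This anticipates the derivation that follows the lecture, and it assumes that the weight λ on E_smooth plays the role of α² in Horn and Schunck's original formulation.

```python
import numpy as np
from scipy.ndimage import convolve

def horn_schunck(I0, I1, lam=0.1, num_iter=100):
    """Minimal Horn-Schunck sketch: iteratively reduces E_data + lam * E_smooth."""
    Ix = np.gradient(I0, axis=1)
    Iy = np.gradient(I0, axis=0)
    It = I1 - I0
    u = np.zeros_like(I0)
    v = np.zeros_like(I0)
    # Averaging kernel for the mean over the four 4-adjacent pixels
    kernel = np.array([[0.0, 0.25, 0.0],
                       [0.25, 0.0, 0.25],
                       [0.0, 0.25, 0.0]])
    for _ in range(num_iter):
        u_bar = convolve(u, kernel, mode="nearest")
        v_bar = convolve(v, kernel, mode="nearest")
        # Correction along the gradient; lam is assumed to correspond to alpha^2 (1981 paper)
        common = (Ix * u_bar + Iy * v_bar + It) / (lam + Ix ** 2 + Iy ** 2)
        u = u_bar - Ix * common
        v = v_bar - Iy * common
    return u, v

# Example with the two synthetic frames from the sketch after the optic flow equation
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w].astype(np.float64)
blob = lambda cx, cy: np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / 50.0)
u, v = horn_schunck(blob(30, 32), blob(31, 32))
print("flow near the blob center:", u[32, 30], v[32, 30])   # u > 0 (motion to the right), v close to 0
```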

Copyright Information

This slide show was prepared by Reinhard Klette with kind permission from Springer Science+Business Media B.V. The slide show can be used freely for presentations. However, all the material is copyrighted.

R. Klette. Concise Computer Vision. © Springer-Verlag, London, 2014.

In case of citation: just cite the book, that's fine.