Lecture 16: Computer Vision


CS442/542b: Artificial Intelligence. Prof. Olga Veksler. Lecture 16: Computer Vision, Motion. Slides are from Steve Seitz (UW), David Jacobs (UMD)

Outline: Motion Estimation. Motion Field. Optical Flow Field. Methods for Optical Flow estimation: 1. Discrete Search 2. Lucas-Kanade Approach to Optical Flow. Optical Flow Constraint Equation. Aperture Problem. Pyramid Approach

Why estimate motion? Lots of uses Track object(s) Correct for camera jitter (stabilization) Align images (mosaics) 3D shape reconstruction Special effects

Optical Flow and Motion Field. Optical flow is the apparent motion of brightness patterns between 2 (or several) frames in an image sequence. We usually represent optical flow by a 2-dimensional vector (u, v). Example: a Rubik's cube rotating to the right on a turntable.

Optical Flow and Motion Field. Optical flow is the apparent motion of brightness patterns between 2 (or several) frames in an image sequence. Why does brightness change between frames? Assuming that illumination does not change, changes are due to the RELATIVE MOTION between the scene and the camera. There are 3 possibilities: camera still, moving scene; moving camera, still scene; moving camera, moving scene. Optical Flow is what we can estimate from image sequences.

Motion Field (MF). The actual relative motion between the 3D scene and the camera is 3-dimensional: in general, the motion will have horizontal (x), vertical (y), and depth (z) components. We can project these 3D motions onto the image plane, and what we get is a 2-dimensional motion field. The motion field is the projection of the actual 3D motion in the scene onto the image plane. The Motion Field is what we actually need to estimate for applications.

Examples of Motion Fields: (a) Translation perpendicular to a surface. (b) Rotation about an axis perpendicular to the image plane. (c) Translation parallel to a surface at a constant distance. (d) Translation parallel to an obstacle in front of a more distant background.

Optical Flow vs. Motion Field. Optical Flow is the apparent motion of brightness patterns. We equate the Optical Flow Field with the Motion Field. This frequently works, but not always: (a) A smooth sphere is rotating under constant illumination; the optical flow field is zero, but the motion field is not. (b) A fixed sphere is illuminated by a moving source, so the shading of the image changes; the motion field is zero, but the optical flow field is not.

Optical Flow vs. Motion Field. Famous Illusions: optical flow and motion fields do not coincide. http://www.sandlotscience.com/distortions/breathing_square.htm http://www.sandlotscience.com/ambiguous/barberpole_llusion.htm

Optical Flow vs. Motion Field Motion field and Optical Flow are very different

Human Motion System. Illusory Snakes, from Gary Bradski and Sebastian Thrun.

Discrete Search for Optical Flow. Given a window W in frame H, find the best matching window in frame I. Minimize the SSD (sum of squared differences) or SAD (sum of absolute differences) of the pixels in the window, just like window matching for stereo, except that the set of locations to search over in the second image is different: search over a specified range of (u, v) values; this (u, v) range defines the search range. Can use the integral image technique for fast search.
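A minimal sketch of this discrete search in Python (NumPy assumed; the frame names H and I, the function name, and the parameter choices are illustrative, and plain SSD is computed directly rather than via integral images):

```python
import numpy as np

def ssd_search(H, I, x, y, half_w, search_range):
    """Brute-force optical-flow estimate for one window.

    Slides H's window centred at (x, y) over I for every displacement
    (u, v) in [-search_range, search_range]^2 and returns the (u, v)
    minimising the sum of squared differences (SSD).
    """
    W = H[y - half_w:y + half_w + 1, x - half_w:x + half_w + 1].astype(float)
    best, best_uv = None, (0, 0)
    for v in range(-search_range, search_range + 1):
        for u in range(-search_range, search_range + 1):
            ys, xs = y + v, x + u
            cand = I[ys - half_w:ys + half_w + 1,
                     xs - half_w:xs + half_w + 1].astype(float)
            if cand.shape != W.shape:      # candidate window fell off the image
                continue
            ssd = np.sum((W - cand) ** 2)
            if best is None or ssd < best:
                best, best_uv = ssd, (u, v)
    return best_uv

# toy example: frame I is frame H shifted 2 px right and 1 px down
H = np.zeros((20, 20)); H[8:12, 8:12] = 255
I = np.zeros((20, 20)); I[9:13, 10:14] = 255
print(ssd_search(H, I, 10, 10, 3, 4))   # -> (2, 1)
```

The shape check simply skips displacements whose candidate window falls outside the second image.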

Computing Optical Flow: Brightness Constancy Equation. Can we estimate optical flow without the search over all possible locations? Yes! If the motion is small. Let P be a moving point in 3D. At time t, P has coordinates (X(t), Y(t), Z(t)). Let p = (x(t), y(t)) be the coordinates of its image at time t, and let I(x(t), y(t), t) be the brightness at p at time t. Brightness Constancy Assumption: as P moves over time, I(x(t), y(t), t) remains constant.

Computing Optical Flow: Brightness Constancy Equation. I[x(t), y(t), t] = constant. Taking the derivative with respect to time:

d I[x(t), y(t), t] / dt = (∂I/∂x)(dx/dt) + (∂I/∂y)(dy/dt) + ∂I/∂t = 0

Computing Optical Flow: Brightness Constancy Equation. Let u = dx/dt and v = dy/dt. Then the chain rule above becomes

I_x u + I_y v + I_t = 0

1 equation with 2 unknowns, where (I_x, I_y) is the frame's spatial gradient, (u, v) is the optical flow, and I_t is the derivative across frames.

Computing Optical Flow: Brightness Constancy Equation. I_x u + I_y v + I_t = 0. Written using dot product notation:

∇I · (u, v) + I_t = 0

where we have used the more compact notation I_x = ∂I/∂x and I_y = ∂I/∂y.

Computing Optical Flow: Brightness Constancy Equation. 1 equation with 2 unknowns: ∇I · (u, v) + I_t = 0. Intuitively, what does this constraint mean? The component of the flow in the gradient direction (I_x, I_y) is determined. Recall that the gradient points in the direction perpendicular to the edge; the component of the flow parallel to an edge is unknown. In the (u, v) plane, any point on the red constraint line is a solution to the equation above.

Aperture problem true motion is in the direction of the red arrow

Aperture problem?

Computing Optical Flow: Brightness Constancy Equation. How to get more equations for a pixel? Basic idea: impose additional constraints. The most common is to assume that the flow field is smooth locally. One method: pretend the pixel's neighbors have the same (u, v). If we use a 5x5 window, that gives us 25 equations per pixel:

I_x(p_i) u + I_y(p_i) v = -I_t(p_i)   for each window pixel p_i

[ I_x(p_1)   I_y(p_1)  ]           [ I_t(p_1)  ]
[ I_x(p_2)   I_y(p_2)  ] [ u ] = - [ I_t(p_2)  ]
[   ...        ...     ] [ v ]     [   ...     ]
[ I_x(p_25)  I_y(p_25) ]           [ I_t(p_25) ]

  matrix A (25x2)     vector d (2x1)     vector b (25x1)

Computing Optical Flow: Brightness Constancy Equation. I_x and I_y are computed just as before (recall the lectures on filtering). For example, we can use the Sobel operator:

        [ -1  0  1 ]            [  1  2  1 ]
(1/8) * [ -2  0  2 ]    (1/8) * [  0  0  0 ]
        [ -1  0  1 ]            [ -1 -2 -1 ]

Note that the 1/8 factor is now mandatory, unlike in edge detection, since we want the actual gradient value.
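As a sketch, here is the 1/8-normalised Sobel estimate of I_x on a toy ramp image (NumPy assumed; `convolve2d` is a small helper written out here for self-containment, not the SciPy function of the same name):

```python
import numpy as np

# Sobel kernels with the mandatory 1/8 normalisation, so the outputs
# approximate actual gradient values rather than just edge strength.
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]) / 8.0
SOBEL_Y = np.array([[ 1, 2, 1], [ 0, 0, 0], [-1, -2, -1]]) / 8.0

def convolve2d(img, k):
    """Plain valid-mode 2-D correlation (enough for these symmetric-column
    kernels; true convolution would flip the kernel first)."""
    h, w = img.shape
    kh, kw = k.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(img[y:y + kh, x:x + kw] * k)
    return out

# on a ramp image I(x, y) = 3x, the estimated I_x is exactly the slope 3
ramp = np.tile(3.0 * np.arange(6), (6, 1))
Ix = convolve2d(ramp, SOBEL_X)
print(Ix[0, 0])   # -> 3.0
```

On the ramp the estimate recovers the exact slope, which is precisely why the 1/8 factor matters here.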

Computing Optical Flow: Brightness Constancy Equation. I_t is the derivative between the frames. For example:

I_5 (frame at time t = 5):    I_6 (frame at time t = 6):
121 121 122 123 122 123       121 121 122 123  20  20
121 121 122 123 122 123       121 121 122 123  22  22
122 123 124 123 124 123       122 123 124 123  24  21
120 122 122 123 122 123       120 122 122 123  22  22
121 121 124 123 124 123       121 121 124 123  24  23
125 120 124 123 124 123       125 120 124 123  24  24

Simplest approximation: I_t(p) = I_{t+1}(p) - I_t(p). For example, for the pixel with (0-indexed) coordinates (4,3) above: I_t(4,3) = 22 - 122 = -100.

Lucas-Kanade flow.

[ I_x(p_1)   I_y(p_1)  ]           [ I_t(p_1)  ]
[ I_x(p_2)   I_y(p_2)  ] [ u ] = - [ I_t(p_2)  ]
[   ...        ...     ] [ v ]     [   ...     ]
[ I_x(p_25)  I_y(p_25) ]           [ I_t(p_25) ]

  matrix A (25x2)     vector d (2x1)     vector b (25x1)

Problem: now we have more equations than unknowns. Where have we seen this before? We can't find an exact solution d, but we can solve the Least Squares Problem.

Lucas-Kanade flow. Solution: solve the least squares problem. The minimum least squares solution d = (u, v) is given by the solution of:

[ Σ I_x I_x   Σ I_x I_y ] [ u ]     [ Σ I_x I_t ]
[ Σ I_x I_y   Σ I_y I_y ] [ v ] = - [ Σ I_y I_t ]

The summations are over all pixels in the K x K window. This technique was first proposed by Lucas & Kanade (1981). Note: the solution is at sub-pixel precision, that is, you can get an answer like u = 0.7 and v = -0.33. Contrast this with discrete search: to find the answer at sub-pixel precision, you (usually) have to search at sub-pixel precision.
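A minimal sketch of solving these normal equations for one window (NumPy assumed; the function and variable names are illustrative, and the derivative arrays are synthetic rather than measured from real frames):

```python
import numpy as np

def lucas_kanade_window(Ix, Iy, It):
    """Least-squares flow (u, v) for one window.

    Ix, Iy, It are the spatial and temporal derivatives sampled at the
    K x K window pixels (any shape; they are flattened). Solves the
    2x2 normal equations  (A^T A) d = -A^T (I_t terms)  from the slides.
    """
    Ix, Iy, It = (a.ravel().astype(float) for a in (Ix, Iy, It))
    ATA = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                    [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    rhs = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    return np.linalg.solve(ATA, rhs)   # sub-pixel (u, v)

# synthetic 5x5 window obeying Ix*u + Iy*v + It = 0 with (u, v) = (0.7, -0.33)
rng = np.random.default_rng(0)
Ix = rng.standard_normal((5, 5))
Iy = rng.standard_normal((5, 5))
It = -(Ix * 0.7 + Iy * (-0.33))
u, v = lucas_kanade_window(Ix, Iy, It)
print(round(u, 3), round(v, 3))   # -> 0.7 -0.33
```

The recovered flow is sub-pixel, exactly as the slide notes.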

Conditions for solvability. The optimal (u, v) satisfies the Lucas-Kanade equation:

[ Σ I_x I_x   Σ I_x I_y ] [ u ]     [ Σ I_x I_t ]
[ Σ I_x I_y   Σ I_y I_y ] [ v ] = - [ Σ I_y I_t ]

When is this solvable? A^T A should be invertible. The A^T A entries should not be too small (noise). A^T A should be well-conditioned: λ_1 / λ_2 should not be too large (λ_1 = the larger eigenvalue). The eigenvectors of A^T A relate to edge direction and magnitude.

Edge: gradients very large or very small; large λ_1, small λ_2.

Low texture region: gradients have small magnitude; small λ_1, small λ_2.

High texture region: gradients are different, with large magnitudes; large λ_1, large λ_2.

Observation. This is a two-image problem, BUT we can measure sensitivity by just looking at one of the images! This tells us which pixels are easy to track and which are hard, which is very useful for feature tracking.
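A sketch of that one-image check (NumPy assumed; names are illustrative): the eigenvalues of A^T A come from the spatial gradients of a single frame alone.

```python
import numpy as np

def ata_eigenvalues(Ix, Iy):
    """Eigenvalues (largest first) of A^T A for one window, computed
    from the spatial gradients of a single frame only."""
    Ix, Iy = Ix.ravel().astype(float), Iy.ravel().astype(float)
    ATA = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                    [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    lam = np.linalg.eigvalsh(ATA)          # ascending order
    return float(lam[1]), float(lam[0])

# flat window: both eigenvalues are zero -> untrackable
flat = np.zeros((5, 5))
print(ata_eigenvalues(flat, flat) == (0.0, 0.0))   # -> True

# textured window: both eigenvalues positive -> a good window to track
rng = np.random.default_rng(1)
tex_x = rng.standard_normal((5, 5))
tex_y = rng.standard_normal((5, 5))
l1, l2 = ata_eigenvalues(tex_x, tex_y)
print(l1 >= l2 > 0)                                # -> True
```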

Errors in Lucas-Kanade. What are the potential causes of errors in this procedure? Suppose A^T A is easily invertible and there is not much noise in the image. Errors arise when our assumptions are violated: Brightness constancy is not satisfied. The motion is not small. A point does not move like its neighbors (the window size is too large; what is the ideal window size?).

Iterative Refinement. Iterative Lucas-Kanade Algorithm: 1. Estimate the velocity at each pixel by solving the Lucas-Kanade equations. 2. Warp H towards I using the estimated flow field (use image warping techniques). 3. Repeat until convergence.
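A 1-D, translation-only sketch of this estimate-warp-repeat loop (NumPy assumed; all names are illustrative, linear interpolation stands in for the image warping step, and the real 2-D algorithm warps with a full flow field rather than a single shift):

```python
import numpy as np

def iterative_lk_1d(H, I, iters=5):
    """Iteratively estimate the shift u between 1-D signals H and I:
    solve a 1-D least-squares step, warp H towards I, and repeat."""
    x = np.arange(len(H), dtype=float)
    u = 0.0
    for _ in range(iters):
        Hw = np.interp(x - u, x, H)                # step 2: warp H towards I
        Hx = np.gradient(Hw)                       # spatial derivative
        It = I - Hw                                # remaining frame difference
        u -= np.sum(Hx * It) / np.sum(Hx * Hx)     # step 1: 1-D least squares
    return u

# H is a smooth bump; I is the same bump shifted by 0.4 samples
x = np.arange(50, dtype=float)
H = np.exp(-0.05 * (x - 25.0) ** 2)
I = np.exp(-0.05 * (x - 25.4) ** 2)
u = iterative_lk_1d(H, I)     # converges to ~0.4
```

Each pass removes most of the remaining shift, so a handful of iterations suffices for sub-pixel motion.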

Optical Flow Results * From Khurram Hassan-Shafique CAP5415 Computer Vision 2003

Revisiting the small motion assumption. Is this motion small enough? Probably not: it's much larger than one pixel. How might we solve this problem?

Reduce the resolution!

Coarse-to-fine optical flow estimation. A motion of u = 10 pixels between image H and image I shrinks to u = 5, u = 2.5, and u = 1.25 pixels at successive levels of the Gaussian pyramid of image H and the Gaussian pyramid of image I.

Coarse-to-fine optical flow estimation. Run iterative L-K at the coarsest level of the Gaussian pyramids of image H and image I, then warp & upsample, run iterative L-K again, and so on down to the original resolution.
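A 1-D sketch of the coarse-to-fine mechanics (NumPy assumed; names are illustrative, and plain subsampling stands in for a proper blur-then-subsample Gaussian pyramid):

```python
import numpy as np

def lk_shift_1d(H, I, u0=0.0, iters=3):
    """Single-level 1-D iterative Lucas-Kanade translation estimate."""
    x = np.arange(len(H), dtype=float)
    u = u0
    for _ in range(iters):
        Hw = np.interp(x - u, x, H)                # warp H by current shift
        Hx = np.gradient(Hw)
        u -= np.sum(Hx * (I - Hw)) / np.sum(Hx * Hx)
    return u

def coarse_to_fine_1d(H, I, levels=3):
    """Halve the resolution `levels` times, solve at the coarsest level,
    then repeatedly upsample the estimate (u *= 2) and refine."""
    if levels == 0:
        return lk_shift_1d(H, I)
    u_coarse = 2.0 * coarse_to_fine_1d(H[::2], I[::2], levels - 1)
    return lk_shift_1d(H, I, u0=u_coarse)

# a 10-pixel shift, handled a level at a time: 1.25 px at the coarsest level
x = np.arange(200, dtype=float)
H = np.exp(-0.002 * (x - 100.0) ** 2)
I = np.exp(-0.002 * (x - 110.0) ** 2)
u = coarse_to_fine_1d(H, I)       # close to the true shift of 10
```

The coarsest level sees only a 1.25-pixel shift, each finer level starts from the doubled coarse estimate, and the final refinement runs at full resolution.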

Image warping. Given a coordinate transform (x', y') = T(x, y) and a source image f(x, y), how do we compute a transformed image g(x', y') = f(T(x, y))?

Forward warping. Send each pixel f(x, y) to its corresponding location (x', y') = T(x, y) in the second image. Q: what if a pixel lands between two pixels?

Forward warping. Send each pixel f(x, y) to its corresponding location (x', y') = T(x, y) in the second image. Q: what if a pixel lands between two pixels? A: distribute its color among the neighboring pixels of (x', y'). Known as splatting.
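A sketch of forward warping with bilinear splatting (NumPy assumed; names are illustrative, and normalising by the accumulated splat weight is one of several possible design choices):

```python
import numpy as np

def forward_warp(f, T, out_shape):
    """Forward warping with bilinear splatting: each source pixel's colour
    is distributed among the 4 destination pixels surrounding its
    (possibly fractional) target location T(x, y) = (x', y')."""
    g = np.zeros(out_shape)
    wsum = np.zeros(out_shape)
    H, W = f.shape
    for y in range(H):
        for x in range(W):
            xp, yp = T(x, y)
            x0, y0 = int(np.floor(xp)), int(np.floor(yp))
            ax, ay = xp - x0, yp - y0
            for dy, dx, w in [(0, 0, (1 - ax) * (1 - ay)),
                              (0, 1, ax * (1 - ay)),
                              (1, 0, (1 - ax) * ay),
                              (1, 1, ax * ay)]:
                yy, xx = y0 + dy, x0 + dx
                if 0 <= yy < out_shape[0] and 0 <= xx < out_shape[1] and w > 0:
                    g[yy, xx] += w * f[y, x]       # splat weighted colour
                    wsum[yy, xx] += w
    return np.where(wsum > 0, g / np.maximum(wsum, 1e-12), 0.0)

# translate by half a pixel: the bright pixel's colour is shared equally
# between its two nearest destination pixels
f = np.zeros((3, 3)); f[1, 1] = 100.0
g = forward_warp(f, lambda x, y: (x + 0.5, y), (3, 3))
print(g[1, 1], g[1, 2])   # -> 50.0 50.0
```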

Optical Flow Results * From Khurram Hassan-Shafique CAP5415 Computer Vision 2003

Motion tracking. Suppose we have more than two images. How do we track a point through all of the images? In principle, we could estimate the motion between each pair of consecutive frames: given a point in the first frame, follow the arrows to trace out its path. Problem: DRIFT. Small errors will tend to grow and grow over time, and the point will drift way off course. Feature Tracking: choose only the points ("features") that are easily tracked. How to find these features? Windows where A^T A has two large eigenvalues. Called the Harris Corner Detector.

Feature Detection

Tracking features. Feature tracking: compute the optical flow for that feature for each consecutive pair of frames H, I. When will this go wrong? Occlusions: the feature may disappear (we need a mechanism for deleting and adding features). Changes in shape or orientation: allow the feature to deform. Changes in color. Large motions: will pyramid techniques work for feature tracking?

Tracking Over Many Frames. Feature tracking with m frames: 1. Select features in the first frame. 2. Given a feature in frame i, compute its position in frame i+1. 3. Select more features if needed. 4. i = i + 1. 5. If i < m, go to step 2. Issues: Discrete search vs. Lucas-Kanade? Depends on the expected magnitude of motion; discrete search is more flexible. Compare the feature in frame i to frame i+1, or frame 1 to frame i+1? This affects the tendency to drift. How big should the search window be? Too small: lost features. Too large: slow.
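The five steps above can be sketched as a generic loop (plain Python; `select_features` and `estimate_flow` are hypothetical stand-ins for a corner detector and a per-pair tracker, and the toy "frames" below are just integers):

```python
def track_over_frames(frames, select_features, estimate_flow, min_features=8):
    """Multi-frame tracking loop: select features in the first frame,
    propagate each one to the next frame, and top up when too few survive.

    select_features(frame) -> list of (x, y) points (step 1 / step 3);
    estimate_flow(frame_i, frame_j, pt) -> new (x, y), or None if lost.
    """
    tracks = [[p] for p in select_features(frames[0])]
    for i in range(len(frames) - 1):            # steps 4-5: advance i
        alive = []
        for tr in tracks:
            p = estimate_flow(frames[i], frames[i + 1], tr[-1])  # step 2
            if p is not None:
                tr.append(p)
                alive.append(tr)
        tracks = alive
        if len(tracks) < min_features:          # step 3: select more features
            tracks += [[p] for p in select_features(frames[i + 1])]
    return tracks

# toy demo: features drift +1 px per frame and none are lost
frames = [0, 1, 2, 3]
tracks = track_over_frames(
    frames,
    select_features=lambda f: [(f, 0), (f + 5, 0)],
    estimate_flow=lambda a, b, p: (p[0] + (b - a), p[1]))
print(tracks[0])   # -> [(0, 0), (1, 0), (2, 0), (3, 0)]
```

Swapping `estimate_flow` between a discrete-search matcher and Lucas-Kanade is exactly the trade-off the slide raises.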