Peripheral drift illusion

Does it work on other animals?

Computer Vision Motion and Optical Flow Many slides adapted from J. Hays, S. Seitz, R. Szeliski, M. Pollefeys, K. Grauman and others

Video. A video is a sequence of frames captured over time. Now our image data is a function of space (x, y) and time (t).
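
As a rough illustration (added here, not from the slides), a short grayscale clip can be stored as a single NumPy array so that intensity really is a function I[t, y, x] of time and space; the file name below is a placeholder.

```python
import numpy as np
import cv2  # used only to decode the video file

# Hypothetical example: load a clip into one array of shape (T, H, W).
cap = cv2.VideoCapture("clip.mp4")  # placeholder file name
frames = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
cap.release()

I = np.stack(frames).astype(np.float32)  # I[t, y, x]: intensity at pixel (x, y), time t
print(I.shape)
```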

Motion Applications: Segmentation of video: background subtraction, shot boundary detection, motion segmentation (segment the video into multiple coherently moving objects).

Motion and perceptual organization Sometimes, motion is the only cue

Motion and perceptual organization Sometimes, motion is the only cue

Motion and perceptual organization Even impoverished motion data can evoke a strong percept

Motion and perceptual organization Even impoverished motion data can evoke a strong percept

Motion estimation techniques. Feature-based methods: extract visual features (corners, textured areas) and track them over multiple frames. Sparse motion fields, but more robust tracking. Suitable when image motion is large (tens of pixels). Direct, dense methods: directly recover image motion at each pixel from spatio-temporal image brightness variations. Dense motion fields, but sensitive to appearance variations. Suitable for video and when image motion is small.

Motion estimation: optical flow. Optic flow is the apparent motion of objects or surfaces. We will start by estimating the motion of each pixel separately, then consider the motion of the entire image.

Problem definition: optical flow. How to estimate pixel motion from image I(x, y, t) to I(x, y, t+1)? Solve the pixel correspondence problem: given a pixel in I(x, y, t), look for nearby pixels of the same color in I(x, y, t+1). Key assumptions: Small motion: points do not move very far. Color constancy: a point in I(x, y, t) looks the same in I(x, y, t+1); for grayscale images, this is brightness constancy.

Optical flow constraints (grayscale images). Let's look at these constraints more closely. Brightness constancy constraint (equation): $I(x, y, t) = I(x+u, y+v, t+1)$. Small motion: u and v are less than 1 pixel, or the image is smooth. Taylor series expansion of I: $I(x+u, y+v) \approx I(x, y) + \frac{\partial I}{\partial x}u + \frac{\partial I}{\partial y}v + [\text{higher order terms}] \approx I(x, y) + I_x u + I_y v$.

Optical flow equation. Combining these two equations: $0 = I(x+u, y+v, t+1) - I(x, y, t) \approx I(x, y, t+1) + I_x u + I_y v - I(x, y, t)$. (Shorthand: $I_x = \frac{\partial I}{\partial x}$, evaluated at t or t+1.)

Optical flow equation. Combining these two equations: $0 = I(x+u, y+v, t+1) - I(x, y, t) \approx I(x, y, t+1) + I_x u + I_y v - I(x, y, t) = [I(x, y, t+1) - I(x, y, t)] + I_x u + I_y v \approx I_t + I_x u + I_y v = I_t + \nabla I \cdot [u\ v]^T$. (Shorthand: $I_x = \frac{\partial I}{\partial x}$, evaluated at t or t+1.)

Optical flow equation. Combining these two equations: $0 = I(x+u, y+v, t+1) - I(x, y, t) \approx I(x, y, t+1) + I_x u + I_y v - I(x, y, t) = [I(x, y, t+1) - I(x, y, t)] + I_x u + I_y v \approx I_t + I_x u + I_y v = I_t + \nabla I \cdot [u\ v]^T$. (Shorthand: $I_x = \frac{\partial I}{\partial x}$, evaluated at t or t+1.) In the limit as u and v go to zero, this becomes exact: $0 = I_t + \nabla I \cdot [u\ v]^T$. Brightness constancy constraint equation: $I_x u + I_y v + I_t = 0$.

How does this make sense? Brightness constancy constraint equation: $I_x u + I_y v + I_t = 0$. What do the static image gradients have to do with motion estimation?

The brightness constancy constraint. Can we use this equation to recover image motion (u, v) at each pixel? $\nabla I \cdot [u\ v]^T + I_t = 0$, or $I_x u + I_y v = -I_t$. How many equations and unknowns per pixel? One equation (this is a scalar equation!), two unknowns (u, v). The component of the motion perpendicular to the gradient (i.e., parallel to the edge) cannot be measured: if (u, v) satisfies the equation, so does (u+u', v+v') whenever $\nabla I \cdot [u'\ v']^T = 0$. [Figure: gradient and edge directions, with the family of solutions (u+u', v+v') along the edge.]
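
To make the last point concrete (a standard result, not spelled out on the slide): only the "normal flow", the component of (u, v) along the image gradient, is determined by the constraint; the component along the edge is free.

$$
(u, v)_{\text{normal}} \;=\; -\,\frac{I_t}{\|\nabla I\|}\,\frac{\nabla I}{\|\nabla I\|},
\qquad \nabla I = (I_x, I_y).
$$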

Aperture problem

Aperture problem

Aperture problem

The barber pole illusion http://en.wikipedia.org/wiki/barberpole_illusion

The barber pole illusion http://en.wikipedia.org/wiki/barberpole_illusion

Solving the ambiguity. B. Lucas and T. Kanade. An iterative image registration technique with an application to stereo vision. In Proceedings of the International Joint Conference on Artificial Intelligence, pp. 674-679, 1981. How to get more equations for a pixel? Spatial coherence constraint: assume the pixel's neighbors have the same (u, v). If we use a 5x5 window, that gives us 25 equations per pixel, as written out below.
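
Written out in the standard Lucas-Kanade form (reconstructed here; the slide shows it as a graphic), the 25 constraints from the 5x5 window around pixel p stack into one overconstrained linear system A d = b:

$$
\begin{bmatrix}
I_x(\mathbf{p}_1) & I_y(\mathbf{p}_1)\\
\vdots & \vdots\\
I_x(\mathbf{p}_{25}) & I_y(\mathbf{p}_{25})
\end{bmatrix}
\begin{bmatrix} u\\ v \end{bmatrix}
=
-\begin{bmatrix}
I_t(\mathbf{p}_1)\\
\vdots\\
I_t(\mathbf{p}_{25})
\end{bmatrix},
\qquad A\,d = b .
$$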

Solving the ambiguity. Least squares problem: minimize $\|A d - b\|^2$ over $d = [u\ v]^T$.

Matching patches across images. Overconstrained linear system. The least squares solution for d is given by the normal equations below; the summations are over all pixels in the K x K window.
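
The equations omitted on this slide are the usual normal equations; reconstructed here from the standard derivation, with all sums taken over the pixels in the K x K window:

$$
A^T A\, d = A^T b
\quad\Longleftrightarrow\quad
\begin{bmatrix}
\sum I_x I_x & \sum I_x I_y\\
\sum I_x I_y & \sum I_y I_y
\end{bmatrix}
\begin{bmatrix} u\\ v \end{bmatrix}
=
-\begin{bmatrix}
\sum I_x I_t\\
\sum I_y I_t
\end{bmatrix}.
$$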

Conditions for solvability. The optimal (u, v) satisfies the Lucas-Kanade equation. When is this solvable? I.e., what are good points to track? $A^T A$ should be invertible. $A^T A$ should not be too small due to noise: eigenvalues $\lambda_1$ and $\lambda_2$ of $A^T A$ should not be too small. $A^T A$ should be well-conditioned: $\lambda_1 / \lambda_2$ should not be too large ($\lambda_1$ = larger eigenvalue). Does this remind you of anything? These are the criteria for the Harris corner detector.
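
A minimal NumPy sketch of these checks followed by the least-squares solve, for a single window; the function name, window size, and thresholds are illustrative choices, not taken from the slides.

```python
import numpy as np

def lucas_kanade_window(I0, I1, x, y, half=2, min_eig=1e-2, max_cond=1e2):
    """Estimate (u, v) for the (2*half+1)^2 window centred at (x, y)."""
    I0 = I0.astype(np.float64)
    I1 = I1.astype(np.float64)
    Iy, Ix = np.gradient(I0)   # spatial gradients (rows = y, columns = x)
    It = I1 - I0               # temporal derivative

    win = (slice(y - half, y + half + 1), slice(x - half, x + half + 1))
    A = np.stack([Ix[win].ravel(), Iy[win].ravel()], axis=1)  # K*K x 2
    b = -It[win].ravel()

    M = A.T @ A                                    # the 2x2 matrix A^T A
    lam1, lam2 = sorted(np.linalg.eigvalsh(M), reverse=True)
    if lam2 < min_eig or lam1 / max(lam2, 1e-12) > max_cond:
        return None  # ill-conditioned: low texture, or an edge (aperture problem)

    u, v = np.linalg.solve(M, A.T @ b)             # solve the normal equations
    return u, v
```

Calling it as lucas_kanade_window(frame0, frame1, x=120, y=80) on two consecutive grayscale frames returns that window's (u, v), or None when the point is not worth tracking.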

Low texture region: gradients have small magnitude. Small $\lambda_1$, small $\lambda_2$.

Edge: large gradients, all pointing in the same direction. Large $\lambda_1$, small $\lambda_2$.

High texture region: gradients are different, with large magnitudes. Large $\lambda_1$, large $\lambda_2$.

The aperture problem resolved Actual motion

The aperture problem resolved Perceived motion

Errors in Lucas-Kanade. A point does not move like its neighbors: motion segmentation. Brightness constancy does not hold: do an exhaustive neighborhood search with normalized correlation (tracking features, maybe SIFT; more later). The motion is large (larger than a pixel): 1. Non-linear: iterative refinement. 2. Local minima: coarse-to-fine estimation.
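
For the large-motion case, the iterative refinement in item 1 can be summarized (standard formulation, added for clarity) as re-linearizing about the current estimate: warp the second image by the current flow, solve the Lucas-Kanade system for a correction, and repeat.

$$
d^{(k+1)} = d^{(k)} + \Delta d, \qquad
\Delta d = \arg\min_{\Delta}\ \sum_{\mathbf{p}\in W}
\big( I(\mathbf{p} + d^{(k)} + \Delta,\, t+1) - I(\mathbf{p}, t) \big)^2 .
$$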

Revisiting the small motion assumption. Is this motion small enough? Probably not; it's much larger than one pixel. How might we solve this problem?

Optical Flow: Aliasing. Temporal aliasing causes ambiguities in optical flow because images can have many pixels with the same intensity; i.e., how do we know which correspondence is correct? [Figure: actual vs. estimated shift; the nearest match may be correct (no aliasing) or incorrect (aliasing).] To overcome aliasing: coarse-to-fine estimation.

Reduce the resolution!

Coarse-to-fine optical flow estimation. [Figure: Gaussian pyramids of image 1 and image 2; a displacement of u=10 pixels at full resolution shrinks to u=5, u=2.5, and u=1.25 pixels at successively coarser levels.]

Coarse-to-fine optical flow estimation. [Figure: run iterative L-K at the coarsest level of the Gaussian pyramids of image 1 and image 2, then warp & upsample and run iterative L-K again at each finer level.]
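
As a practical illustration (not part of the original slides), OpenCV implements this warp-and-upsample scheme as pyramidal Lucas-Kanade; a minimal sketch, assuming two consecutive grayscale frames saved as frame0.png and frame1.png (placeholder names):

```python
import cv2

frame0 = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)  # placeholder file names
frame1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)

# Pick well-conditioned (corner-like) points to track.
pts0 = cv2.goodFeaturesToTrack(frame0, maxCorners=200, qualityLevel=0.01, minDistance=7)

# Pyramidal, iterative Lucas-Kanade: maxLevel sets the number of coarse-to-fine
# pyramid levels, winSize the local integration window.
pts1, status, err = cv2.calcOpticalFlowPyrLK(
    frame0, frame1, pts0, None, winSize=(21, 21), maxLevel=3)

flow = (pts1 - pts0)[status.ravel() == 1]   # per-feature displacement (u, v)
print(flow.reshape(-1, 2))
```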

Optical Flow Results * From Khurram Hassan-Shafique CAP5415 Computer Vision 2003

Optical Flow Results * From Khurram Hassan-Shafique CAP5415 Computer Vision 2003

State-of-the-art optical flow, 2009: Large displacement optical flow, Brox et al., CVPR 2009. Start with something similar to Lucas-Kanade, + gradient constancy, + energy minimization with a smoothing term, + region matching, + keypoint matching (long-range). Region-based + pixel-based + keypoint-based.
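
For background (added here; the slide only names the ingredients), the "energy minimization with a smoothing term" component is, in its classic Horn-Schunck form, the functional below; Brox et al. build on a robust variant of it with additional gradient-constancy and matching terms.

$$
E(u, v) \;=\; \iint \Big( (I_x u + I_y v + I_t)^2
\;+\; \alpha \big( \|\nabla u\|^2 + \|\nabla v\|^2 \big) \Big)\, dx\, dy ,
$$

where the first term is the brightness-constancy data term and the second penalizes non-smooth flow fields, weighted by $\alpha$.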

State-of-the-art optical flow, 2015. A deep convolutional network that accepts a pair of input frames and upsamples the estimated flow back to input resolution. Very fast because of the deep network; near the state of the art in terms of end-point error. Fischer et al. 2015. https://arxiv.org/abs/1504.06852

State-of-the-art optical flow, 2015. Synthetic training data. Fischer et al. 2015. https://arxiv.org/abs/1504.06852

State-of-the-art optical flow, 2015 Results on Sintel Fischer et al. 2015. https://arxiv.org/abs/1504.06852

Optical flow. Definition: optical flow is the apparent motion of brightness patterns in the image. Ideally, optical flow would be the same as the motion field. Have to be careful: apparent motion can be caused by lighting changes without any actual motion. Think of a uniform rotating sphere under fixed lighting vs. a stationary sphere under moving illumination.