Announcements. Motion. Structure-from-Motion (SFM) Motion. Discrete Motion: Some Counting

Announcements
HW 4 due Friday. Final Exam: Tuesday, 6/7 at 8:00-11:00. Fill out your CAPEs.
Introduction to Computer Vision, CSE 152, Lecture 20

Motion
Some problems of motion:
1. Correspondence: Where have elements of the image moved between image frames?
2. Reconstruction: Given correspondence, what is the 3-D geometry of the scene?
3. Ego motion: How has the camera moved?
4. Segmentation: What are the regions of the image corresponding to different moving objects?
5. Tracking: Where have objects moved in the image? Related to correspondence and segmentation.
Variations: small motion (video), wide baseline (multi-view).

Structure-from-Motion (SFM)
Goal: Take as input two or more images or video, without any information on camera position/motion, and estimate the camera positions and the 3-D structure of the scene.
Two approaches:
1. Discrete motion (wide baseline)
   - Orthographic (affine) vs. perspective
   - Two view vs. multi-view
   - Calibrated vs. uncalibrated
2. Continuous (infinitesimal) motion

Discrete Motion: Some Counting
Consider M images of N points. How many unknowns?
1. Camera locations: affix the coordinate system to the first camera location: (M-1)*6 unknowns.
2. 3-D structure: 3*N unknowns.
3. Structure and motion can only be recovered up to scale (why?), which removes one unknown.
Total number of unknowns: (M-1)*6 + 3*N - 1
Total number of measurements: 2*M*N
A solution is possible when (M-1)*6 + 3*N - 1 <= 2*M*N: M=2 requires N >= 5, M=3 requires N >= 4.

Epipolar Constraint: Calibrated Case
Essential Matrix (Longuet-Higgins, 1981): matching calibrated image points satisfy p_r^T E p_l = 0, where E = R S and S is the skew-symmetric matrix of the translation T.
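As a concrete check of the calibrated epipolar constraint, here is a minimal numpy sketch (not from the slides; the rotation, translation, and 3-D point are made up for illustration). It builds E = R S from a hypothetical relative pose and verifies that the projected points satisfy p_r^T E p_l = 0.

import numpy as np

def skew(t):
    """Skew-symmetric matrix S such that S @ p equals np.cross(t, p)."""
    return np.array([[0, -t[2], t[1]],
                     [t[2], 0, -t[0]],
                     [-t[1], t[0], 0.0]])

# Hypothetical relative pose: rotation about the y-axis plus a translation.
theta = np.deg2rad(10.0)
R = np.array([[np.cos(theta), 0, np.sin(theta)],
              [0, 1, 0],
              [-np.sin(theta), 0, np.cos(theta)]])
T = np.array([1.0, 0.2, 0.1])

E = R @ skew(T)          # Essential matrix, E = R S

# A 3-D point in the left camera frame, and in the right frame via P_r = R (P_l - T).
P_l = np.array([0.5, -0.3, 4.0])
P_r = R @ (P_l - T)

# Calibrated (normalized) image coordinates, f = 1.
p_l = P_l / P_l[2]
p_r = P_r / P_r[2]

print(p_r @ E @ p_l)     # ~0: the epipolar constraint p_r^T E p_l = 0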

The Eight-Point Algorithm (Longuet-Higgins, 1981)
Let F denote the Essential Matrix here. Set F_33 to 1, solve for F, then solve for R and t.

Sketch of Two-View SFM Algorithm
Input: two images.
1. Detect feature points.
2. Find 8 matching feature points (easier said than done, but usually done with the RANSAC algorithm).
3. Compute the Essential Matrix E using the normalized 8-point algorithm (see the sketch below).
4. Compute R and T (recall that E = R S, where S is a skew-symmetric matrix).
5. Perform stereo matching using the recovered epipolar geometry expressed via E.
6. Reconstruct the 3-D geometry of the corresponding points.

Continuous Motion
Consider a video camera moving continuously along a trajectory (rotating and translating). How do points in the image move, and what does that tell us about the 3-D motion and scene structure?

Motion
"When objects move at equal speed, those more remote seem to move more slowly." - Euclid, 300 BC

Simplest Idea for Video Processing: Image Differences
Given images I(u,v,t) and I(u,v,t+δt), compute I(u,v,t+δt) - I(u,v,t). This approximates the partial derivative ∂I/∂t (times δt).

Background Subtraction
Gather an image I(x,y,t_0) of the background without objects of interest (perhaps computed as an average over many images). At time t, pixels where |I(x,y,t) - I(x,y,t_0)| > τ are labeled as coming from foreground objects.
At object boundaries the difference is large, which is a cue for segmentation, but it doesn't tell which way stuff is moving.
(Figure: raw image and extracted foreground region.)
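The normalized 8-point algorithm of step 3 can be sketched as follows. This is a generic sketch rather than the exact variant on the slide: the slide fixes F_33 = 1 and solves a linear system, whereas this version solves the homogeneous system A f = 0 by SVD and then enforces the rank-2 constraint.

import numpy as np

def eight_point(pts_l, pts_r):
    """Linear eight-point estimate of the essential/fundamental matrix F.

    pts_l, pts_r: (N, 2) arrays of matched image points, N >= 8.
    Normalize the points, solve A f = 0 by SVD, enforce rank 2, denormalize.
    """
    def normalize(pts):
        mean = pts.mean(axis=0)
        scale = np.sqrt(2.0) / np.mean(np.linalg.norm(pts - mean, axis=1))
        T = np.array([[scale, 0, -scale * mean[0]],
                      [0, scale, -scale * mean[1]],
                      [0, 0, 1.0]])
        pts_h = np.column_stack([pts, np.ones(len(pts))])
        return (T @ pts_h.T).T, T

    pl, Tl = normalize(pts_l)
    pr, Tr = normalize(pts_r)

    # Each correspondence gives one row of A, from p_r^T F p_l = 0
    # written out in the 9 entries of F.
    A = np.column_stack([
        pr[:, 0] * pl[:, 0], pr[:, 0] * pl[:, 1], pr[:, 0],
        pr[:, 1] * pl[:, 0], pr[:, 1] * pl[:, 1], pr[:, 1],
        pl[:, 0], pl[:, 1], np.ones(len(pl))
    ])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)          # null vector of A, reshaped to 3x3

    # Enforce the rank-2 constraint by zeroing the smallest singular value.
    U, S, Vt = np.linalg.svd(F)
    F = U @ np.diag([S[0], S[1], 0.0]) @ Vt

    return Tr.T @ F @ Tl              # undo the normalization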

The Motion Field
Where in the image did a point move? (Example on the slide: down and left.)
What causes a motion field?
1. The camera moves (translates, rotates).
2. Objects in the scene move rigidly.
3. Objects articulate (pliers, humans, quadrupeds).
4. Objects bend and deform (fish).
5. Blowing smoke, clouds.

Motion Field Yields 3-D Motion Information
The motion field is the instantaneous velocity of points in an image.

Looming and the Focus of Expansion (FOE)
The FOE is the intersection of the velocity vector with the image plane. With just this information it is possible to calculate:
1. The direction of motion.
2. The time to collision (see the sketch below).
Is motion estimation inherent in humans? (Demo.)
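To make the time-to-collision claim concrete, here is a tiny sketch for the special case of pure forward translation (an assumption for illustration, not stated on the slide): the flow is radial about the FOE, and the ratio of a point's distance from the FOE to its flow speed equals Z/T_z, the time to collision, with no need to know Z or T_z individually.

import numpy as np

# Pure forward translation toward a frontal point: depth Z shrinks at rate Tz,
# and the flow at image point p (FOE at the image center here) is radial,
# p_dot = p * (Tz / Z).  Time to collision is then Z / Tz = |p| / |p_dot|.
Z, Tz = 8.0, 2.0                      # hypothetical depth (m) and speed (m/s)
p = np.array([0.15, -0.05])           # image point (FOE assumed at the origin)
p_dot = p * (Tz / Z)                  # simulated radial flow
tau = np.linalg.norm(p) / np.linalg.norm(p_dot)
print(tau)                            # 4.0 seconds = Z / Tz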

Rigid Motion and the Motion Field

Rigid Motion: General Case
Position and orientation of a rigid body: rotation matrix and translation vector. General motion combines rotational and linear velocity.
Under perspective projection of a scene point P = (x, y, z)^T:
u = f x / z,  v = f y / z
and differentiating with respect to time (quotient rule):
du/dt = f (ẋ z − x ż) / z²,  dv/dt = f (ẏ z − y ż) / z².

Rigid Motion: Velocity
Translational velocity vector: T. Angular velocity vector: ω (or Ω). Substituting the rigid-body velocity of p = (x, y, z)^T into the derivatives above yields the Motion Field Equation (sketched in code below), where
T: components of the 3-D linear motion
ω: angular velocity vector
(u,v): image point coordinates
Z: depth
f: focal length

Pure Translation (ω = 0): Forward Translation and the Focus of Expansion [Gibson, 1950]
The field is radial about the FOE; if T_z = 0 (motion parallel to the image plane), the field is parallel and the FOE is a point at infinity.
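The Motion Field Equation itself appeared as a figure on the slide and is not in the transcription. The sketch below encodes its standard form (as in Trucco & Verri) using the variable names listed above, and exercises the two special cases discussed next: pure translation (radial about the FOE, scales with 1/Z) and pure rotation (independent of Z).

import numpy as np

def motion_field(u, v, Z, T, w, f=1.0):
    """Image velocity (du/dt, dv/dt) at image point (u, v) with depth Z,
    for camera translation T = (Tx, Ty, Tz), rotation w = (wx, wy, wz),
    and focal length f.  The translational part scales with 1/Z; the
    rotational part does not depend on Z at all."""
    Tx, Ty, Tz = T
    wx, wy, wz = w
    du = (Tz * u - Tx * f) / Z - wy * f + wz * v + wx * u * v / f - wy * u**2 / f
    dv = (Tz * v - Ty * f) / Z + wx * f - wz * u + wx * v**2 / f - wy * u * v / f
    return du, dv

# Pure translation (w = 0): radial about the FOE at (f*Tx/Tz, f*Ty/Tz).
print(motion_field(0.1, -0.2, Z=5.0, T=(0.0, 0.0, 1.0), w=(0.0, 0.0, 0.0)))
# Pure rotation (T = 0): independent of Z, a function of (u, v), f and w only.
print(motion_field(0.1, -0.2, Z=5.0, T=(0.0, 0.0, 0.0), w=(0.0, 0.0, 0.3)))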

Pure Rotation (T = 0): Rotational Motion Field
The instantaneous velocity of points in an image under pure rotation, e.g. ω = (0, 0, 1)^T, is:
- independent of T_x, T_y, T_z
- independent of Z
- a function only of (u, v), f and ω.
Pure rotation can also be viewed as a motion field on the sphere.

Motion Field Equation: Estimate Depth
If T, ω, and f are known or measured, then for each image point (u,v) one can solve for the depth Z given the measured motion (du/dt, dv/dt) at (u,v) (see the sketch below).

Optical Flow
Where do pixels move to?
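A minimal sketch of that depth recovery, assuming the same standard form of the motion field equation as in the previous sketch (the pose, focal length, and image point below are made up): subtract the Z-independent rotational flow, then Z follows from the translational part by least squares.

import numpy as np

def depth_from_flow(u, v, du, dv, T, w, f=1.0):
    """Recover depth Z at image point (u, v) from measured image motion
    (du, dv), given known T, w, f (standard motion field equation)."""
    Tx, Ty, Tz = T
    wx, wy, wz = w
    # Rotational part of the flow (independent of depth).
    du_rot = -wy * f + wz * v + wx * u * v / f - wy * u**2 / f
    dv_rot = wx * f - wz * u + wx * v**2 / f - wy * u * v / f
    # Translational part satisfies (du - du_rot) * Z = Tz*u - Tx*f, and likewise for v.
    a = np.array([du - du_rot, dv - dv_rot])
    b = np.array([Tz * u - Tx * f, Tz * v - Ty * f])
    return float(a @ b) / float(a @ a)   # least-squares Z from the two equations

# Round trip: synthesize flow at depth Z = 4 and recover it.
T, w, f = (0.2, 0.0, 1.0), (0.01, -0.02, 0.005), 1.0
u, v, Z = 0.3, -0.1, 4.0
du = (T[2]*u - T[0]*f)/Z - w[1]*f + w[2]*v + w[0]*u*v/f - w[1]*u**2/f
dv = (T[2]*v - T[1]*f)/Z + w[0]*f - w[2]*u + w[0]*v**2/f - w[1]*u*v/f
print(depth_from_flow(u, v, du, dv, T, w, f))   # ~4.0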

Estimating the Motion Field from Images
1. Feature-based (Sect. 8.4.2 of Trucco & Verri): detect features (corners) in an image, then search for the same features nearby (feature tracking).
2. Differential techniques (Sect. 8.4.1); a sketch appears at the end of these notes.

Final Exam
Closed book, one cheat sheet: a single piece of paper, handwritten, no photocopying, no physical cut and paste. You can start with your sheet from the midterm.
What to study: basically the material presented in class, and the supporting material from the text. If it was in the text but never mentioned in class, it is very unlikely to be on the exam.
Question style: short answers, plus some longer problems to be worked out.
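As an illustration of the differential approach mentioned in item 2 above, here is a minimal Lucas-Kanade-style sketch (an example implementation, not taken from the course): it solves the brightness-constancy equations over a small window by least squares.

import numpy as np

def lucas_kanade_flow(I0, I1, y, x, half=7):
    """Estimate the flow (du, dv) at pixel (y, x) from two grayscale frames
    by solving the over-determined brightness-constancy system
    Ix*du + Iy*dv = -It over a (2*half+1)^2 window (least squares)."""
    Iy, Ix = np.gradient(I0.astype(float))       # spatial gradients (rows, cols)
    It = I1.astype(float) - I0.astype(float)     # temporal difference
    win = (slice(y - half, y + half + 1), slice(x - half, x + half + 1))
    A = np.column_stack([Ix[win].ravel(), Iy[win].ravel()])
    b = -It[win].ravel()
    flow, *_ = np.linalg.lstsq(A, b, rcond=None)
    return flow                                   # (du, dv) in pixels per frame

# Synthetic check: shift a smooth pattern one pixel to the right.
ys, xs = np.mgrid[0:64, 0:64]
I0 = np.sin(xs / 5.0) + np.cos(ys / 7.0)
I1 = np.roll(I0, 1, axis=1)
print(lucas_kanade_flow(I0, I1, 32, 32))          # approximately (1, 0)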