Announcements. Motion. Structure-from-Motion (SFM) Motion. Discrete Motion: Some Counting

Announcements
Introduction to Computer Vision, CSE 152, Lecture 20 (Motion)
HW 4 due Friday at midnight.
Final Exam: Tuesday, 6/12, 8:00AM-11:00AM, regular classroom.
Extra office hours: Monday 6/11, 9:00AM-10:00AM.
Fill out your CAPEs.

Motion
What are the problems that we solve using motion?
1. Correspondence: Where have elements of the image moved between image frames?
2. Ego motion: How has the camera moved?
3. Reconstruction: Given correspondences, what is the 3-D geometry of the scene?
4. Segmentation: What are the regions of the image corresponding to different moving objects?
5. Tracking: Where have objects moved in the image? (Related to correspondence and segmentation.)
Variations: small motion (video) vs. wide baseline (multi-view).

Structure-from-Motion (SFM)
Goal: Take as input two or more images or video, without any information on camera position/motion, and estimate the camera positions and the 3-D structure of the scene.
Two approaches:
1. Discrete motion (wide baseline)
   - Orthographic (affine) vs. perspective
   - Two view vs. multi-view
   - Calibrated vs. uncalibrated
2. Continuous (infinitesimal) motion

Discrete Motion: Some Counting
Consider M images of N points. How many unknowns?
1. Camera locations: affix the coordinate system to the first camera location, leaving (M-1)*6 unknowns.
2. 3-D structure: 3*N unknowns.
3. Structure and motion can only be recovered up to scale (scaling the scene and the translations by a common factor leaves the images unchanged), so one unknown is removed.
Total number of unknowns: (M-1)*6 + 3*N - 1
Total number of measurements: 2*M*N
A solution is possible when (M-1)*6 + 3*N - 1 <= 2*M*N: M=2 requires N >= 5, and M=3 requires N >= 4. (A small numeric check of this inequality appears below.)

The Eight-Point Algorithm (Longuet-Higgins, 1981)
Let F denote the essential matrix here, and set F_33 to 1.
Solve for F from the point correspondences, then solve for R and t.
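
To make the counting concrete, here is a minimal Python check (not part of the lecture; the function name is just illustrative) that finds the smallest N satisfying the inequality for a few values of M:

def min_points(m):
    # Smallest N with (M-1)*6 + 3*N - 1 <= 2*M*N (unknowns vs. measurements)
    n = 1
    while (m - 1) * 6 + 3 * n - 1 > 2 * m * n:
        n += 1
    return n

for m in (2, 3, 4):
    print(m, min_points(m))   # prints: 2 5, 3 4, 4 4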

Sketch of Two View SFM Algorithm
Input: two images.
1. Detect feature points.
2. Find 8 matching feature points (easier said than done, but usually done with the RANSAC algorithm).
3. Compute the essential matrix E using the normalized 8-point algorithm.
4. Compute R and T (recall that E = RS, where S is a skew-symmetric matrix).
5. Perform stereo matching using the recovered epipolar geometry expressed via E.
6. Reconstruct the 3-D geometry of the corresponding points.
(A hedged OpenCV sketch of steps 1-4 appears below.)

Continuous Motion
Consider a video camera moving continuously along a trajectory (rotating and translating). How do points in the image move, and what does that tell us about the 3-D motion and scene structure?

Motion
"When objects move at equal speed, those more remote seem to move more slowly." - Euclid, 300 BC

Image Differences
The simplest idea for video processing is to look at the brightness change at each location: given images I(u,v,t) and I(u,v,t+δt), compute I(u,v,t+δt) - I(u,v,t). Divided by δt, this approximates the partial derivative ∂I/∂t. At object boundaries ∂I/∂t is large and is a cue for segmentation, but it doesn't tell which way stuff is moving.

Background Subtraction
Gather an image I(x,y,t_0) of the background without objects of interest (perhaps computed as an average over many images). At time t, pixels where |I(x,y,t) - I(x,y,t_0)| > τ are labeled as coming from foreground objects. [Figures in the slides: the raw image and the resulting foreground region.] (A minimal NumPy version of differencing and thresholding appears below, after the SFM sketch.)

The Motion Field
Where in the image did a point move? (In the lecture example: down and left.)
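
As a rough illustration of steps 1-4 of the two-view sketch above, the following Python/OpenCV fragment is one possible implementation (not the course's code); it assumes the intrinsic matrix K is known and uses ORB features for matching:

import cv2
import numpy as np

def two_view_sfm(img1, img2, K):
    # 1. Detect feature points (ORB keypoints and descriptors)
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    # 2. Match features; RANSAC inside findEssentialMat rejects bad matches
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    # 3. Essential matrix from the point correspondences
    E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    # 4. Decompose E into rotation R and translation t (t is only known up to scale)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
    return R, t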

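A minimal NumPy sketch of the image differencing and background subtraction described above; the threshold value is a placeholder:

import numpy as np

def temporal_difference(I_t, I_t_next):
    # Brightness change at each pixel; large at the boundaries of moving objects
    return I_t_next.astype(np.float32) - I_t.astype(np.float32)

def foreground_mask(I_t, I_background, tau=25.0):
    # Label pixels whose brightness differs from the background by more than tau
    diff = np.abs(I_t.astype(np.float32) - I_background.astype(np.float32))
    return diff > tau
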
Motion Field
What causes a motion field?
1. The camera moves (translates, rotates).
2. Objects in the scene move rigidly.
3. Objects articulate (pliers, humans, quadrupeds).
4. Objects bend and deform (fish).
5. Blowing smoke, clouds.

Motion Field Yields 3-D Motion Information
The motion field is the instantaneous velocity of points in an image (e.g., LOOMING). Is motion estimation inherent in humans? (Demo.)

The Focus of Expansion (FOE)
The FOE is the intersection of the velocity vector with the image plane. With just this information it is possible to calculate:
1. The direction of motion.
2. The time to collision.

Rigid Motion: General Case
Position and orientation; rotational and linear velocity.

Rigid Motion and the Motion Field
How do we relate the 3-D motion of a camera to the 2-D motion of points in the image?
The position and orientation of a rigid body are given by a rotation matrix and a translation vector. For a point p, rigid motion with velocity vector T and angular velocity vector ω (or Ω) gives
  dp/dt = T + ω × p
(A tiny numeric example of this velocity equation appears below.)
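
To make the rigid velocity equation concrete, a small NumPy example (the numbers are illustrative only):

import numpy as np

T = np.array([0.1, 0.0, 1.0])        # linear velocity T
omega = np.array([0.0, 0.02, 0.0])   # angular velocity ω
p = np.array([2.0, -1.0, 10.0])      # 3-D point p = (x, y, z)

# Velocity of the point under rigid motion: dp/dt = T + ω × p
p_dot = T + np.cross(omega, p)
print(p_dot)   # [0.3  0.   0.96]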

General Motion: Motion Field Equation
Start from the rigid velocity dp/dt = T + Ω × p, where p = (x,y,z)^T, and substitute the perspective projection u = f*x/z, v = f*y/z (writing Z = z for the depth). Differentiating the projection gives the motion field:
  du/dt = (T_x*f - T_z*u)/Z + ω_y*f - ω_z*v + (ω_y*u^2 - ω_x*u*v)/f
  dv/dt = (T_y*f - T_z*v)/Z - ω_x*f + ω_z*u + (ω_y*u*v - ω_x*v^2)/f
T: components of 3-D linear motion
ω: angular velocity vector
(u,v): image point coordinates
Z: depth
f: focal length
(A short Python transcription of these equations appears below.)

Pure Translation (ω = 0)
Forward translation and the Focus of Expansion [Gibson, 1950].
The field is radial about the FOE; if T_Z = 0 the field is parallel (the FOE is a point at infinity) and the motion is parallel to the image plane.

Pure Rotation (T = 0)
Independent of T_x, T_y, T_z.
Independent of Z.
Only a function of (u,v), f and ω.
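
A short Python transcription of the motion field equations above (a sketch under the sign convention dp/dt = T + Ω × p; the example values are illustrative):

def motion_field(u, v, Z, f, T, w):
    # Image velocity (du/dt, dv/dt) of the point that projects to (u, v) at depth Z,
    # for linear velocity T = (Tx, Ty, Tz) and angular velocity w = (wx, wy, wz).
    Tx, Ty, Tz = T
    wx, wy, wz = w
    du = (Tx * f - Tz * u) / Z + wy * f - wz * v + (wy * u**2 - wx * u * v) / f
    dv = (Ty * f - Tz * v) / Z - wx * f + wz * u + (wy * u * v - wx * v**2) / f
    return du, dv

# Pure rotation (T = 0): the flow does not depend on the depth Z
print(motion_field(10.0, 5.0, Z=2.0,  f=500.0, T=(0, 0, 0), w=(0, 0.01, 0)))   # (5.002, 0.001)
print(motion_field(10.0, 5.0, Z=50.0, f=500.0, T=(0, 0, 0), w=(0, 0.01, 0)))   # (5.002, 0.001)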

Rotational Motion Field
Pure rotation: the instantaneous velocity of points in the image, and the motion field on the sphere, for a pure rotation such as ω = (0,0,1)^T.

Motion Field Equation: Estimate Depth
If T, ω, and f are known or measured, then for each image point (u,v) one can solve for the depth Z given the measured motion (du/dt, dv/dt) at (u,v). (A small sketch of this appears below.)

Optical Flow
Where do pixels move to? Estimating the motion field from images:
1. Feature-based (Sect. 8.4.2 of Trucco & Verri)
   1. Detect features (corners) in an image.
   2. Search for the same features nearby (feature tracking).
2. Differential techniques (Sect. 8.4.1)
(Sketches of both approaches appear below.)
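
Continuing the depth-from-motion idea above: each motion field equation can be rearranged to isolate Z, since Z appears only in the translational term. A minimal sketch, reusing the hypothetical motion_field convention from the earlier example:

def depth_from_flow(u, v, du, f, T, w):
    # Z from the du/dt equation; assumes the translational term (Tx*f - Tz*u) is nonzero
    Tx, Ty, Tz = T
    wx, wy, wz = w
    rot_u = wy * f - wz * v + (wy * u**2 - wx * u * v) / f   # rotational part of du/dt
    return (Tx * f - Tz * u) / (du - rot_u)

The dv/dt equation gives a second estimate of Z; in practice the two would be combined, for example by least squares.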

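For the feature-based approach listed above, a typical OpenCV sketch (one possible implementation, not the course's code) detects corners and tracks them into the next frame:

import cv2

def track_features(frame0, frame1):
    # 1. Detect corner features in the first (grayscale) frame
    pts0 = cv2.goodFeaturesToTrack(frame0, maxCorners=500, qualityLevel=0.01, minDistance=7)
    # 2. Search for the same features nearby in the next frame (pyramidal Lucas-Kanade)
    pts1, status, err = cv2.calcOpticalFlowPyrLK(frame0, frame1, pts0, None)
    good = status.ravel() == 1
    return pts0[good], pts1[good]   # matched point positions in the two frames
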
Mathematical Formulation: Optical Flow Constraint
Let I(x,y,t) be the brightness at image point (x,y) at time t. Consider the scene (or camera) to be moving, so (x,y) is a function of time (i.e., x(t), y(t)) and the point is moving with velocity (dx/dt, dy/dt).
Brightness constancy assumption: I(x(t), y(t), t) = constant, so dI/dt = 0.
Optical flow constraint equation: (∂I/∂x)(dx/dt) + (∂I/∂y)(dy/dt) + ∂I/∂t = 0.
(A differential Lucas-Kanade sketch built on this constraint appears at the end of this section.)

Normal Flow
The single constraint equation determines only the flow component along the image gradient (the normal flow); this is why the Barber Pole Illusion works.

Final Exam
Closed book, no calculator, one cheat sheet: a single piece of paper, handwritten, no photocopying, no physical cut and paste. You can start with your sheet from the midterm.
What to study: basically the material presented in class, and the supporting material from the text. If it was in the text but never mentioned in class, it is very unlikely to be on the exam.
Question style: short answer, plus some longer problems to be worked out.
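
For the differential approach, a standard choice (a hedged sketch, not necessarily how the course implements it) is Lucas-Kanade: assume the flow (vx, vy) is constant over a small window and solve the stacked constraint equations by least squares:

import numpy as np

def lucas_kanade_flow(I0, I1, x, y, win=7):
    # Spatial and temporal brightness derivatives (frame difference approximates ∂I/∂t)
    Iy, Ix = np.gradient(I0.astype(np.float32))
    It = I1.astype(np.float32) - I0.astype(np.float32)
    h = win // 2
    # Stack Ix*vx + Iy*vy + It = 0 over the window centered at (x, y)
    A = np.stack([Ix[y-h:y+h+1, x-h:x+h+1].ravel(),
                  Iy[y-h:y+h+1, x-h:x+h+1].ravel()], axis=1)
    b = -It[y-h:y+h+1, x-h:x+h+1].ravel()
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v   # (vx, vy), in pixels per frame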