Introduction to Computer Vision. Week 3, Fall 2010 Instructor: Prof. Ko Nishino


Last Week
- Image Sensing
  - Our eyes: rods and cones
  - CCD, CMOS, rolling shutter
  - Sensing brightness and sensing color
- Projective Geometry / Camera Calibration
  - Projection models: perspective, orthographic, weak-perspective, and affine
  - Homogeneous coordinates (to and from)
  - Camera parameters: intrinsic and extrinsic
  - Camera calibration
  - Direct linear calibration: total least squares
  - Non-linear calibration

Measurements on Planes
Approach: unwarp, then measure. What kind of warp is this? (figure: oblique view of a plane with a measuring grid, unwarped to a fronto-parallel view)

Image Rectification
To unwarp (rectify) an image, solve for the homography H given matching points p and p': solve equations of the form w p' = H p, which are linear in the unknowns w and the coefficients of H. H is defined only up to an arbitrary scale factor. How many points are necessary to solve for H?

Solving for Homographies

Solving for Homographies
- Total least squares: stack the constraints into A h = 0, where A is 2n x 9 and h is the 9-vector of entries of H
- Since h is only defined up to scale, solve for a unit vector
- Minimize $\|A h\|^2$ subject to $\|h\| = 1$
- Solution: h = eigenvector of $A^T A$ with the smallest eigenvalue
- Works with 4 or more points (more points, more accurate)
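
Below is a minimal NumPy sketch of this total-least-squares fit (not the course skeleton code); the function name fit_homography and the example correspondences are illustrative. The eigenvector of $A^T A$ with the smallest eigenvalue is obtained as the last right singular vector of A.

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate H (3x3, defined up to scale) such that dst ~ H @ src in homogeneous coordinates."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    n = src.shape[0]
    A = np.zeros((2 * n, 9))
    for i, ((x, y), (xp, yp)) in enumerate(zip(src, dst)):
        # Each correspondence contributes two rows of the 2n x 9 matrix A.
        A[2 * i]     = [x, y, 1, 0, 0, 0, -xp * x, -xp * y, -xp]
        A[2 * i + 1] = [0, 0, 0, x, y, 1, -yp * x, -yp * y, -yp]
    # The unit-norm h minimizing ||A h|| is the eigenvector of A^T A with the
    # smallest eigenvalue, i.e. the last right singular vector of A.
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 3)

# Example: map the corners of a tilted quadrilateral onto a square (4 points suffice).
src = [(10, 15), (200, 20), (220, 210), (5, 190)]
dst = [(0, 0), (200, 0), (200, 200), (0, 200)]
H = fit_homography(src, dst)
```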

Homography
A homography is a singular case of the Fundamental Matrix:
- Two views of coplanar points
- Two views that share the same center of projection

Project #1: Homography
- Rectification
  - Take two images of an object with a planar surface
  - Make a fronto-parallel image of one of the planar surfaces
  - Submit results for three images, including the test image
- Compositing
  - Take two images
  - Composite all or part of one image into the other using the homography of corresponding regions
  - Submit results for three images, including the test image
- Planar Mosaic (Extra Credit)

Example: Rectification (this is your test image)

Example: Rectification

Example: Compositing (this is your test image set)

Example: Composite. The composited region need not be rectangular; use masking and blending.

Example: Planar Mosaic (Extra Credit). Note that the COP (center of projection) is the same.

by I. Zoghlami

Ingredients
- Take good images
- Specify correspondences (manual)
- Compute homography
  - Solve with eigendecomposition
- Apply homography
  - Warping
  - Interpolation
  - Masking
  - Blending

Image Warping
Given a coordinate transform (x', y') = h(x, y) and a source image f(x, y), how do we compute a transformed image g(x', y') = f(h(x, y))? (figure: source image f(x, y) mapped by h into destination image g(x', y'))

Forward Warping
Send each pixel f(x, y) to its corresponding location (x', y') = h(x, y) in the second image. Q: what if a pixel lands between two pixels?

Forward Warping
Send each pixel f(x, y) to its corresponding location (x', y') = h(x, y) in the second image. Q: what if a pixel lands between two pixels? A: distribute its color among the neighboring pixels of (x', y'); this is known as splatting.

Inverse Warping
Get each pixel g(x', y') from its corresponding location (x, y) = h⁻¹(x', y') in the first image. Q: what if a pixel comes from between two pixels?

Inverse Warping
Get each pixel g(x', y') from its corresponding location (x, y) = h⁻¹(x', y') in the first image. Q: what if a pixel comes from between two pixels? A: resample the color value.

Forward vs. Inverse Warping
Q: which is better? A: usually inverse, since it eliminates holes; however, it requires an invertible warp function, which is not always possible...
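
As a concrete illustration, here is a minimal NumPy sketch of inverse warping with a homography (not the project skeleton code); warp_inverse and its arguments are illustrative names. It maps every destination pixel back through H⁻¹ and uses nearest-neighbour lookup for brevity; bilinear resampling (next slide) is the usual refinement.

```python
import numpy as np

def warp_inverse(f, H, out_shape):
    """Fill each output pixel g(x', y') from f at (x, y) = H^{-1} (x', y', 1)."""
    H_inv = np.linalg.inv(H)
    h_out, w_out = out_shape
    ys, xs = np.mgrid[0:h_out, 0:w_out]
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(xs.size)])  # homogeneous (x', y', 1)
    src = H_inv @ pts
    src_x, src_y = src[0] / src[2], src[1] / src[2]             # back to inhomogeneous coords
    # Nearest-neighbour lookup keeps the sketch short; see bilinear interpolation below.
    xi, yi = np.round(src_x).astype(int), np.round(src_y).astype(int)
    valid = (xi >= 0) & (xi < f.shape[1]) & (yi >= 0) & (yi < f.shape[0])
    g = np.zeros(h_out * w_out)
    g[valid] = f[yi[valid], xi[valid]]
    return g.reshape(out_shape)
```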

Bilinear Interpolation
A simple method for resampling images: interpolate from the four pixels around (i, j) using the fractional offsets a and b.
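
A minimal sketch of bilinear resampling, assuming NumPy and a grayscale image indexed as f[row, col]; the fractional offsets a and b correspond to the weights in the slide's figure.

```python
import numpy as np

def bilinear(f, x, y):
    """Interpolate f at real-valued (x, y) from its four integer neighbours."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, f.shape[1] - 1), min(y0 + 1, f.shape[0] - 1)
    a, b = x - x0, y - y0  # fractional offsets in [0, 1)
    return ((1 - a) * (1 - b) * f[y0, x0] +
            a       * (1 - b) * f[y0, x1] +
            (1 - a) * b       * f[y1, x0] +
            a       * b       * f[y1, x1])
```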

Image Blending

Feathering
(figure: the two images are added after weighting each by a window that ramps between 1 and 0 across the overlap)

Effect of Window Size
(figure: left and right weighting windows, ramping between 1 and 0)

Effect of Window Size
(figure: the same blend with a different window width)

Good Window Size
Optimal window: smooth but not ghosted. Doesn't always work...
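
For concreteness, here is a hedged NumPy sketch of feathering across a horizontal overlap; the function name, the linear ramp, and the assumption that both images are already aligned on one canvas are illustrative choices, and the ramp width plays the role of the window size discussed above.

```python
import numpy as np

def feather(left, right, overlap_start, overlap_end):
    """Blend two aligned grayscale images with weights that ramp across [overlap_start, overlap_end)."""
    h, w = left.shape
    w_left = np.ones(w)
    w_left[overlap_start:overlap_end] = np.linspace(1.0, 0.0, overlap_end - overlap_start)
    w_left[overlap_end:] = 0.0               # past the overlap, only the right image contributes
    w_right = 1.0 - w_left
    return left * w_left + right * w_right   # weights broadcast across rows
```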

Pyramid Blending
Create a Laplacian pyramid and blend each level. Burt, P. J. and Adelson, E. H., "A Multiresolution Spline with Application to Image Mosaics," ACM Transactions on Graphics, 2(4), October 1983, pp. 217-236.
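
Below is a simplified sketch in the spirit of Burt and Adelson's method, assuming NumPy/SciPy and grayscale float images A, B with a mask of the same shape. For brevity it uses a Laplacian stack (band-pass layers without downsampling) rather than a true pyramid, blending each band with a progressively smoother mask.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def multiband_blend(A, B, mask, levels=5):
    """Blend each band-pass layer of A and B with an increasingly blurred mask."""
    gA, gB, gm = A.astype(float), B.astype(float), mask.astype(float)
    blended = np.zeros_like(gA)
    for level in range(levels):
        sigma = 2.0 ** level
        gA_next, gB_next = gaussian_filter(gA, sigma), gaussian_filter(gB, sigma)
        # Band-pass (Laplacian) layer of each image, combined with the current mask.
        blended += gm * (gA - gA_next) + (1 - gm) * (gB - gB_next)
        gA, gB, gm = gA_next, gB_next, gaussian_filter(gm, sigma)
    # Finally add the blended low-pass residual.
    return blended + gm * gA + (1 - gm) * gB
```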

Alpha Blending
(figure: the color at pixel p is a weighted combination of the overlapping images I1, I2, I3)
Encoding blend weights: I(x, y) = (αR, αG, αB, α). Implement this in two steps:
1. accumulate: add up the (α-premultiplied) (αR, αG, αB, α) values at each pixel
2. normalize: divide each pixel's accumulated RGB by its α value
Q: what if α = 0?
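
A minimal NumPy sketch of the accumulate/normalize steps, assuming the inputs are already warped into a common mosaic frame and stored as premultiplied RGBA arrays; the small epsilon guard is one way to handle the α = 0 case.

```python
import numpy as np

def alpha_composite(premultiplied_images):
    """Average overlapping RGBA images whose colors are premultiplied by alpha."""
    acc = np.zeros_like(premultiplied_images[0], dtype=float)
    for img in premultiplied_images:
        acc += img                              # 1. accumulate premultiplied (aR, aG, aB, a)
    alpha = acc[..., 3:4]
    # 2. normalize: divide accumulated RGB by accumulated alpha (0 where alpha == 0).
    return np.where(alpha > 0, acc[..., :3] / np.maximum(alpha, 1e-8), 0.0)
```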

Poisson Image Editing
For more info: Perez et al., SIGGRAPH 2003, http://research.microsoft.com/vision/cambridge/papers/perez_siggraph03.pdf

Project #1: Homography
- Due 10/17, Sunday, midnight
- See the assignment web page for details (3 artifacts for each task)
- Skeleton code on the web
  - Fill in the empty functions
  - Write additional functions for extra credit
- Cameras!

Image Filtering (Continuous)
Reading: Robot Vision, Chapter 6

What is an Image?

Image as a Function
- We can think of an image as a function, f, from R² to R:
  - f(x, y) gives the intensity at position (x, y)
  - Realistically, we expect the image to be defined only over a rectangle, with a finite range: f: [a, b] × [c, d] → [0, 1]
- A color image is just three functions pasted together. We can write this as a vector-valued function: f(x, y) = (r(x, y), g(x, y), b(x, y))

Image as a Function
(figure: an image rendered as a surface plot of f(x, y))

Image Processing
- Define a new image g in terms of an existing image f
  - We can transform either the domain or the range of f
- Range transformation: g(x, y) = t(f(x, y)). What kinds of operations can this perform?

Image Processing
Some operations preserve the range but change the domain of f: g(x, y) = f(t_x(x, y), t_y(x, y)). What kinds of operations can this perform?

Image Processing
Still other operations operate on both the domain and the range of f.

Linear Shift Invariant Systems
Linearity: if f1(x) → g1(x) and f2(x) → g2(x), then a f1(x) + b f2(x) → a g1(x) + b g2(x).
Shift invariance: if f(x) → g(x), then f(x - a) → g(x - a).

Example of LSIS
A defocused image (g) is a processed version of the focused image (f); an ideal lens is an LSIS.
Linearity: brightness variation. Shift invariance: scene movement. (Not valid for lenses with non-linear distortions.)
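
As a quick numerical sanity check (not from the lecture), the sketch below verifies that Gaussian smoothing, used here as a stand-in for an ideal lens, is linear and shift invariant; circular boundary handling (mode='wrap') is assumed so that shifts commute exactly with the filter.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
f1 = rng.standard_normal((32, 32))
f2 = rng.standard_normal((32, 32))
S = lambda f: gaussian_filter(f, sigma=2.0, mode='wrap')   # a simple LSIS

# Linearity: the response to a weighted sum is the weighted sum of responses.
assert np.allclose(S(2 * f1 + 3 * f2), 2 * S(f1) + 3 * S(f2))
# Shift invariance: shifting the input just shifts the output.
assert np.allclose(S(np.roll(f1, 5, axis=1)), np.roll(S(f1), 5, axis=1))
```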

Convolution
An LSIS performs convolution, and convolution is linear and shift invariant:
$g(x) = (f * h)(x) = \int_{-\infty}^{\infty} f(\tau)\, h(x - \tau)\, d\tau$, written $g = f * h$, with kernel $h$.

Convolution
(animation from Eric Weisstein's MathWorld)

Example of Convolution
(figure: a worked 1-D convolution of a simple signal with a small kernel)
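
Here is a small discrete 1-D convolution example in NumPy; the signal and kernel values are illustrative and not necessarily those shown on the slide.

```python
import numpy as np

f = np.array([0, 0, 1, 1, 1, 0, 0], dtype=float)  # a box signal
h = np.array([1, 0, -1], dtype=float)             # a finite-difference kernel
g = np.convolve(f, h, mode='same')                # g = f * h
print(g)  # responds at the edges of the box: [ 0.  1.  1.  0. -1. -1.  0.]
```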

Convolution Kernel
What h will give us g = f? The Dirac delta function (unit impulse function).
Sifting property: $\int_{-\infty}^{\infty} f(x)\,\delta(x)\,dx = \int_{-\infty}^{\infty} f(0)\,\delta(x)\,dx = f(0)\int_{-\infty}^{\infty}\delta(x)\,dx = f(0)$
Convolving with the delta: $(f * \delta)(x) = \int_{-\infty}^{\infty} f(\tau)\,\delta(x - \tau)\,d\tau = f(x)$, so $h(x) = \delta(x)$.

Point Spread Function
Ideally, the optical system (scene to image) is a Dirac delta function; however, optical systems are never ideal. Imaging a point source through the optical system gives its point spread function (for example, the point spread function of the human eye).

Point Spread Function
(figures: point spread functions for normal vision, myopia, hyperopia, and astigmatism; images by Richmond Eye Associates)

Properties of Convolution
- Commutative: $f * g = g * f$
- Associative: $(f * g) * h = f * (g * h)$, so a cascade of LSIS systems is equivalent to convolution with a single combined kernel

2D Convolution
$g(x, y) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(\tau_x, \tau_y)\, h(x - \tau_x,\, y - \tau_y)\, d\tau_x\, d\tau_y$
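
A discrete counterpart of this integral, assuming SciPy; convolving a 2-D impulse with a kernel also illustrates the sifting property from the previous slides. The kernel values are illustrative.

```python
import numpy as np
from scipy.signal import convolve2d

f = np.zeros((7, 7))
f[3, 3] = 1.0                                   # a discrete 2-D impulse
h = np.array([[1, 2, 1],
              [2, 4, 2],
              [1, 2, 1]], dtype=float) / 16.0   # a small smoothing kernel
g = convolve2d(f, h, mode='same')               # 2-D convolution g = f * h
# Convolving an impulse reproduces the kernel, so the 3x3 block around g[3, 3] equals h.
assert np.allclose(g[2:5, 2:5], h)
```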

Jean Baptiste Joseph Fourier (1768-1830)
- Had a crazy idea (1807): any periodic function can be rewritten as a weighted sum of sines and cosines of different frequencies.
- Don't believe it?
  - Neither did Lagrange, Laplace, Poisson and other big wigs
  - Not translated into English until 1878!
- But it's true!
  - Called the Fourier series
  - Possibly the greatest tool used in engineering

A Sum of Sinusoids
Our building block is a sinusoid $A \sin(\omega x + \phi)$: add enough of them to get any signal f(x) you want!

Time and Frequency

Time and Frequency
(figure: an example signal decomposed as the sum of two sinusoids)

Frequency Spectra
(figure: the same decomposition shown with its frequency spectrum)

Frequency Spectra
Usually, frequency is more interesting than the phase.

Frequency Spectra
(figures: progressively adding higher-frequency sinusoids to build up the signal, alongside the resulting frequency spectrum)

Fourier Transform
Represent the signal as an infinite weighted sum of an infinite number of sinusoids (u: oscillation frequency):
$F(u) = \int_{-\infty}^{\infty} f(x)\, e^{-i 2\pi u x}\, dx$
This takes an arbitrary function in the spatial domain (x) to a single analytic expression in the frequency domain (u), the frequency spectrum F(u). Note: phase is encoded by the sin/cos pair (the complex exponential).
Inverse Fourier Transform (IFT): $f(x) = \int_{-\infty}^{\infty} F(u)\, e^{i 2\pi u x}\, du$

Fourier Transform (Physicists' Definition)
Represent the signal as an infinite weighted sum of an infinite number of sinusoids (u: angular frequency):
$F(u) = \int_{-\infty}^{\infty} f(x)\, e^{-i u x}\, dx$
Again, an arbitrary function in the spatial domain (x) becomes a single analytic expression in the frequency domain (u), the frequency spectrum F(u).
Inverse Fourier Transform (IFT): $f(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} F(u)\, e^{i u x}\, du$

Fourier Transform Pairs (I)
(table of common transform pairs) Play with http://www.falstad.com/fourier/. Note that these are derived using the angular-frequency definition.

Fourier Transform Pairs (II)
(table of common transform pairs) Note that these are derived using the angular-frequency definition.

Fourier Transform and Convolution
Let $g = f * h$. Then
$G(u) = \int_{-\infty}^{\infty} g(x)\, e^{-i 2\pi u x}\, dx = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(\tau)\, h(x - \tau)\, e^{-i 2\pi u x}\, d\tau\, dx$
$= \left[\int_{-\infty}^{\infty} f(\tau)\, e^{-i 2\pi u \tau}\, d\tau\right]\left[\int_{-\infty}^{\infty} h(x')\, e^{-i 2\pi u x'}\, dx'\right]$ (substituting $x' = x - \tau$)
$= F(u)\, H(u)$
Convolution in the spatial domain corresponds to multiplication in the frequency domain.

Fourier Transform and Convolution
Spatial domain (x): g = f * h   <->   Frequency domain (u): G = F H
Spatial domain (x): g = f h     <->   Frequency domain (u): G = F * H
So we can find g(x) by: take the FT of f and h, multiply to get G = F H, then apply the IFT.
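
A short numerical check of this relationship, assuming NumPy: circular convolution computed directly in the spatial domain matches multiplying the FFTs and inverting.

```python
import numpy as np

N = 64
rng = np.random.default_rng(0)
f = rng.standard_normal(N)
h = rng.standard_normal(N)

# Circular convolution computed directly in the spatial domain...
g_spatial = np.array([sum(f[m] * h[(k - m) % N] for m in range(N)) for k in range(N)])
# ...and via multiplication in the frequency domain: G = F H, then the inverse FFT.
g_freq = np.real(np.fft.ifft(np.fft.fft(f) * np.fft.fft(h)))

assert np.allclose(g_spatial, g_freq)
```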