Computational Cameras: Exploiting Spatial-Angular-Temporal Tradeoffs in Photography

Mitsubishi Electric Research Labs (MERL) Computational Cameras Computational Cameras: Exploiting Spatial-Angular-Temporal Tradeoffs in Photography. Amit Agrawal, Mitsubishi Electric Research Labs (MERL), Cambridge, MA, USA

Mitsubishi Electric Research Labs (MERL) Computational Cameras Where are the cameras?

Mitsubishi Electric Research Labs (MERL) Computational Cameras Cameras in Mobile Phones Source: isuppli

Mitsubishi Electric Research Labs (MERL) Computational Cameras Have Cameras Evolved? Lens Based Camera Obscura, 1568 Digital Cameras

Mitsubishi Electric Research Labs (MERL) Computational Cameras Conventional Cameras: tradeoffs in photography among aperture size, shutter speed, and ISO. A fast lens gathers more light but gives a shallow depth of field, and it allows short shutter times (macro, wildlife, sports). High ISO helps in low-light scenes, but adds more noise.

Mitsubishi Electric Research Labs (MERL) Computational Cameras Have Projectors Evolved? Similar trends in form factor/cost Film/Slide projectors Digital projectors Pocket Projectors Pico Projectors Projectors in smartphones

Mitsubishi Electric Research Labs (MERL) Computational Cameras Projector vs Cameras Current projectors offer capabilities far beyond current cameras Each projector pixel can be independently controlled Allows coding and modulation of outgoing light How about cameras where each pixel can be independently controlled? Allow coding and modulation of incoming light?

Mitsubishi Electric Research Labs (MERL) Computational Cameras Projectors vs Cameras Exposure, Frame Rate, Resolution etc. High level controls Brightness, color temperature Per Pixel Control?

Mitsubishi Electric Research Labs (MERL) Computational Cameras Projectors vs Cameras Exposure, Frame Rate, Resolution etc. High level controls Brightness, color temperature Computational Cameras Per Pixel Control?

Mitsubishi Electric Research Labs (MERL) Computational Cameras Computational Cameras: Flutter Shutter Camera; Coded Aperture; Mask-based light field camera; Reinterpretable Camera; Wide-angle light field camera

Mitsubishi Electric Research Labs (MERL) Computational Cameras Computational Cameras
Camera: Coding/Modulation Dimension
Flutter Shutter: Time (Exposure)
Coded Aperture: Space
Light Field Camera: Space and Angle
Reinterpretable Camera: Space, Time, Angle
Flexible Voxels: Space and Time

Mitsubishi Electric Research Labs (MERL) Computational Cameras

Mitsubishi Electric Research Labs (MERL) Computational Cameras Coded Exposure [Raskar, Agrawal, Tumblin SIGGRAPH 2006]

Mitsubishi Electric Research Labs (MERL) Computational Cameras Coded Exposure (Flutter Shutter) Camera Raskar, Agrawal, Tumblin [Siggraph2006] Coding in Time: Shutter is opened and closed

Mitsubishi Electric Research Labs (MERL) Computational Cameras Blurring == Convolution: the blurred photo is the sharp photo convolved with the PSF. Traditional camera: the shutter stays OPEN for the whole exposure, so the PSF is a box filter whose frequency response is a sinc function with deep nulls.

Mitsubishi Electric Research Labs (MERL) Computational Cameras Flutter Shutter: the shutter is OPENED and CLOSED during the exposure, so the PSF becomes a broadband function that preserves high spatial frequencies, making the blurred photo invertible back to the sharp photo.
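
To make the frequency argument concrete, here is a minimal numpy sketch. It is illustrative only: the published flutter-shutter work searches for a binary chop sequence that maximizes the minimum of |H(ω)|, whereas this sketch uses an arbitrary pseudorandom on/off code.

```python
# Illustrative sketch, not the published flutter-shutter code: compare the
# frequency response of a box-filter exposure with a pseudorandom on/off
# (coded) exposure of the same length. Requires only numpy.
import numpy as np

n = 52                                   # number of time slices in the exposure
box = np.ones(n)                         # traditional shutter: open the whole time
rng = np.random.default_rng(0)
flutter = rng.integers(0, 2, size=n).astype(float)   # placeholder binary code

def min_gain(psf, pad=1024):
    """Smallest magnitude of the PSF's frequency response; a value near 0 means
    some frequencies are destroyed and cannot be recovered by deconvolution."""
    return np.abs(np.fft.rfft(psf, pad)).min()

print("box exposure   min |H|:", min_gain(box))      # ~0: the sinc nulls
print("coded exposure min |H|:", min_gain(flutter))  # typically well above 0
```

The near-zero minima of the box exposure are exactly the sinc nulls that make traditional motion deblurring ill-posed.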

Mitsubishi Electric Research Labs (MERL) Computational Cameras Deblurred images from a traditional exposure and from a coded exposure, compared with an image of the static object.

Mitsubishi Electric Research Labs (MERL) Computational Cameras Flutter Shutter Video Camera: a Point Grey Dragonfly2 camera using Trigger Mode 5, implemented on-chip, additional cost = $0

Mitsubishi Electric Research Labs (MERL) Computational Cameras How to handle focus blur?

Mitsubishi Electric Research Labs (MERL) Computational Cameras Coded Exposure (Flutter Shutter), Raskar, Agrawal, Tumblin, SIGGRAPH 2006: a temporal 1-D broadband code for motion deblurring. Coded Aperture, with Veeraraghavan, Raskar, Tumblin & Mohan, SIGGRAPH 2007: a spatial 2-D broadband code for focus deblurring.
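
As a rough illustration of the spatial case, the sketch below simulates defocus blur as convolution with a 7×7 aperture code and inverts it with Wiener deconvolution. The code values, test image, and SNR constant are placeholders rather than the published mask; the point is only that a broadband code keeps the inversion well-conditioned.

```python
# Minimal Wiener-deconvolution sketch for coded-aperture focus deblurring.
# The 7x7 code below is a random placeholder, not the published mask; in a
# real camera the defocus PSF is the aperture code scaled by the blur size.
import numpy as np

def wiener_deblur(blurred, psf, snr=1e-3):
    """Deblur a single-channel image by frequency-domain Wiener filtering."""
    H = np.fft.fft2(psf, s=blurred.shape)
    W = np.conj(H) / (np.abs(H) ** 2 + snr)           # regularized inverse filter
    return np.real(np.fft.ifft2(np.fft.fft2(blurred) * W))

rng = np.random.default_rng(1)
sharp = rng.random((128, 128))                         # stand-in for a sharp image
code = rng.integers(0, 2, size=(7, 7)).astype(float)   # placeholder 7x7 aperture code
code /= code.sum()                                     # normalize light throughput
blurred = np.real(np.fft.ifft2(np.fft.fft2(sharp) * np.fft.fft2(code, s=sharp.shape)))
print("RMSE:", np.sqrt(np.mean((wiener_deblur(blurred, code) - sharp) ** 2)))
```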

Mitsubishi Electric Research Labs (MERL) Computational Cameras LED In Focus Photo

Mitsubishi Electric Research Labs (MERL) Computational Cameras Out of Focus Photo: Open Aperture

Mitsubishi Electric Research Labs (MERL) Computational Cameras Out of Focus Photo: Coded Aperture

Blurred Photos: Open Aperture vs. Coded Aperture (7×7 mask)

Deblurred Photos: Open Aperture vs. Coded Aperture (7×7 mask)

Mitsubishi Electric Research Labs (MERL) Computational Cameras Captured Blurred Photo

Mitsubishi Electric Research Labs (MERL) Computational Cameras Refocused on Person

Mitsubishi Electric Research Labs (MERL) Computational Cameras Coded Imaging: blocking light == more information. Coded Exposure: coding in time. Coded Aperture: coding in space.

Mitsubishi Electric Research Labs (MERL) Computational Cameras Computational Cameras
Camera: Coding/Modulation Dimension
Flutter Shutter: Time (Exposure)
Coded Aperture: Space
Light Field Camera: Space and Angle

Where should the mask go? Mask at the aperture: full-resolution digital refocusing (Coded Aperture Camera). Mask near the sensor: a 4D light field from a 2D photo (Heterodyne Light Field Camera).

Mitsubishi Electric Research Labs (MERL) Computational Cameras Lytro: Lenslet-based Light Field camera Adelson and Wang, 1992, Ng et al. 2005

Mask-based Light Field Camera (SIGGRAPH 2007): a mask placed in front of the sensor. Mask choices: sum-of-cosines mask, pinhole-array mask, tiled broadband mask.

MERL, Northwestern Univ. Mask-Enhanced Cameras: Heterodyned Light Fields & Coded Aperture. Veeraraghavan, Raskar, Agrawal, Mohan & Tumblin. Optical heterodyning, by analogy with radio: a high-frequency carrier (e.g. 100.1 MHz) carries the baseband audio signal, and the receiver demodulates the incoming signal against a reference carrier. In the camera, the photographic signal (the light field) passes through the main lens, is modulated by the mask (the carrier) before reaching the sensor, and software demodulation against the reference carrier recovers the light field.
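
A toy flatland version of this modulate-then-demodulate structure is sketched below, under strong simplifying assumptions: a 1D sensor, a 2D light field, and an idealized modulation in which each angular sample rides on its own cosine carrier. The real camera instead uses a physical sum-of-cosines mask and demultiplexes spectral tiles of the 2D photo, so treat this purely as intuition.

```python
# Flatland heterodyning sketch (illustrative assumptions, not the actual 4D
# pipeline): each of n_ang angular samples of a band-limited "light field"
# reaches the 1D sensor riding on its own harmonic of a carrier at f0 cycles,
# so the photo's spectrum contains shifted copies of every angular slice.
# Software demodulation mixes the photo with the reference carrier and
# low-pass filters, recovering each slice.
import numpy as np

n, n_ang, f0 = 512, 4, 32          # sensor samples, angular samples, carrier frequency
x = np.arange(n) / n
rng = np.random.default_rng(2)

def bandlimited():
    """A smooth random signal whose spectrum lies below f0/2 cycles."""
    spec = np.fft.rfft(rng.standard_normal(n))
    spec[f0 // 2:] = 0
    return np.fft.irfft(spec, n)

lf = np.stack([bandlimited() for _ in range(n_ang)])           # (n_ang, n) light field

# Sensor photo: every angular slice rides on its own carrier harmonic.
photo = sum(lf[k] * np.cos(2 * np.pi * f0 * k * x) for k in range(n_ang))

def lowpass(sig):
    spec = np.fft.rfft(sig)
    spec[f0 // 2:] = 0
    return np.fft.irfft(spec, n)

# Demodulation: mix with the reference carrier for each angular index, low-pass.
recovered = np.stack([2 * lowpass(photo * np.cos(2 * np.pi * f0 * k * x))
                      for k in range(n_ang)])
recovered[0] /= 2       # k = 0 mixes with cos(0) = 1 and was never halved; undo the 2x
print("max reconstruction error:", np.abs(recovered - lf).max())
```

The error is near machine precision only because the toy signals are exactly band-limited; a real sensor adds noise and the physical mask has finite contrast.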

Captured Light Field Digital Refocusing

Recovering a Full-Resolution 2D Image for the in-focus scene: inserting the mask == spatially varying attenuation of the image, which is compensated using a calibration image. The slide compares the captured photo and the calibration photo of the pinhole array for in-focus and out-of-focus regions.
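
A minimal sketch of that compensation step follows; the function and variable names are hypothetical, and the normalization is an assumption for illustration.

```python
# Sketch: undo the mask's spatially varying attenuation at in-focus pixels by
# dividing by a calibration image of the mask pattern (hypothetical names).
import numpy as np

def compensate_attenuation(captured, calibration, eps=1e-6):
    gain = calibration / calibration.max()       # per-pixel mask transmission
    return captured / np.maximum(gain, eps)      # eps guards near-opaque pixels

# full_res_image = compensate_attenuation(captured_photo, calibration_photo)
```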

Recovered Image In Focus Out of Focus

Lens Glare Reduction using Light Field

Mitsubishi Electric Research Labs (MERL) Computational Cameras Effects of Glare on the Image: glare from lens inter-reflections is hard to model and is low-frequency in 2D, but reflection glare is an outlier in 4D ray-space, showing up as angular variation at a single pixel.
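
The sketch below illustrates that ray-space view under an assumed simple rule (a median over the angular samples at each pixel; the actual method's outlier test may differ): because glare contaminates only a few of the rays a pixel integrates, a robust statistic over the angular dimension suppresses it, whereas a conventional photo is just the mean of those rays.

```python
# Glare reduction sketch: treat reflection glare as an outlier along the
# angular dimension of the recovered light field and replace the plain mean
# with a robust statistic. The median is an illustrative choice.
import numpy as np

def glare_reduced_photo(light_field):
    """light_field: (n_u, n_v, H, W) array of angular samples per pixel."""
    n_u, n_v, h, w = light_field.shape
    rays = light_field.reshape(n_u * n_v, h, w)
    return np.median(rays, axis=0)

def traditional_photo(light_field):
    """What a conventional camera records: the mean over all angular samples."""
    return light_field.mean(axis=(0, 1))
```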

Mitsubishi Electric Research Labs (MERL) Computational Cameras Captured Photo: LED On

Mitsubishi Electric Research Labs (MERL) Computational Cameras Light field coordinates: spatial (x, y) and angular (u, v).

Mitsubishi Electric Research Labs (MERL) Computational Cameras Sequence of Sub-Aperture Views; Traditional Camera Photo vs. Glare-Reduced Photo

Mitsubishi Electric Research Labs (MERL) Computational Cameras

Mitsubishi Electric Research Labs (MERL) Computational Cameras Computational Cameras
Camera: Coding/Modulation Dimension
Flutter Shutter: Time (Exposure)
Coded Aperture: Space
Light Field Camera: Space and Angle
Reinterpretable Camera: Space, Time, Angle

Captured Photo

Video from Single-Shot (Temporal Frames)

Captured Photo

Rotating Doll in Focus

Reinterpretable Camera. Resolution tradeoff for conventional imaging: fixed before capture (video camera, light field camera) and scene independent. Resolution tradeoff for the Reinterpretable Camera: variable post-capture, scene dependent, and different for different parts of the scene / captured photo.

Captured 2D Photo

Captured 2D Photo, interpreted per region:
Static scene parts, in focus: high-resolution 2D image
Static scene parts, out of focus: 4D light field
Dynamic scene parts, in focus: video
Dynamic scene parts, out of focus: 1D parallax + motion

Coded Aperture (Veeraraghavan et al., SIGGRAPH 2007): static aperture mask + sensor; digital refocusing.
Optical Heterodyning (SIGGRAPH 2007): static near-sensor mask + sensor; light field capture.
Reinterpretable Camera (Eurographics 2010): dynamic aperture mask + static near-sensor mask + sensor; post-capture resolution control.

Implementation: camera, motor, wheel, and shutter; the aperture mask is mounted on the wheel, with an additional near-sensor mask.

Captured Photo

Static Object (in-focus)

Static Objects (Out-of-focus)

Moving Object (in depth)

Rotating Object (in focus)

Reconstructed Sub-Aperture Views (3×3 Light Field)

For Static Objects: variation over angle × angle

For Moving Toy in Middle: variation over angle × time

For Rotating Toy on Right: variation over time × time

High Resolution Image Refocused on Static Toy

Digital Refocusing on Static Objects

Digital Refocusing on Toy Moving in Depth

Video frames of the rotating toy, in focus

Mitsubishi Electric Research Labs (MERL) Computational Cameras Computational Cameras
Camera: Coding/Modulation Dimension
Flutter Shutter: Time (Exposure)
Coded Aperture: Space
Light Field Camera: Space and Angle
Reinterpretable Camera: Space, Time, Angle
Flexible Voxels: Space and Time

Mitsubishi Electric Research Labs (MERL) Computational Cameras Flexible Voxels: the same idea as the Reinterpretable Camera, but for video. Traditional video camera: spatial/temporal resolution is fixed and scene independent. Flexible Voxels: a motion-aware video camera with scene-dependent, variable resolution.

Sampling of the Space-Time Volume. Conventional sampling scheme: every pixel on the sensor plane integrates over the same camera integration time in each of frames 1 to N. Our sampling scheme: within each frame, different pixels integrate over different sub-intervals of the integration time.
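
The post-capture flexibility can be illustrated with a toy rearrangement, assuming (purely for illustration) that the pixels inside each K×K tile of a frame are exposed during K² successive sub-intervals: the same raw frame can then be read either as one full-spatial-resolution image or as K² low-resolution frames at K² times the frame rate.

```python
# Toy Flexible-Voxels-style reinterpretation (the KxK tile exposure layout is
# an assumption for illustration): one coded frame can be read either as a
# full-resolution image or as k*k high-speed, low-resolution frames.
import numpy as np

def as_high_speed_frames(raw, k):
    """Rearrange an (H, W) coded frame into (k*k, H//k, W//k) temporal frames,
    taking from every tile the pixel exposed during sub-interval t."""
    h, w = raw.shape
    tiles = raw.reshape(h // k, k, w // k, k)               # split into k x k tiles
    return tiles.transpose(1, 3, 0, 2).reshape(k * k, h // k, w // k)

raw = np.arange(16, dtype=float).reshape(4, 4)   # tiny 4x4 coded frame, k = 2
print(as_high_speed_frames(raw, 2).shape)        # (4, 2, 2): four fast frames
```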

Co-located Projector-Camera Setup: the projector and camera view the scene through a beam splitter so their image planes are aligned; the projected pattern controls, for each pixel (pixel 1 through pixel K), when light reaches the camera within its integration time.

Multiple Balls Bouncing and Colliding (15 FPS): the close-up shows large motion blur

Motion-Aware Video: a single captured frame admits different spatio-temporal interpretations with increasing temporal resolution; motion analysis (optical-flow magnitudes) selects the interpretation, yielding a motion-aware video.
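
A hedged sketch of that selection step follows; the per-tile flow threshold and the names are an illustrative stand-in for the actual motion analysis.

```python
# Motion-aware decoding sketch (assumed thresholding rule, hypothetical names):
# fast-moving tiles are taken from the high-speed interpretation of the coded
# frame, static tiles keep the high-spatial-resolution pixels.
import numpy as np

def motion_aware_decode(raw, flow_mag, k, thresh=1.0):
    """raw: (H, W) coded frame; flow_mag: (H//k, W//k) per-tile flow magnitude.
    Returns k*k output frames, each of size (H, W)."""
    h, w = raw.shape
    fast = (raw.reshape(h // k, k, w // k, k)
               .transpose(1, 3, 0, 2)
               .reshape(k * k, h // k, w // k))               # high-speed frames
    fast_up = np.kron(fast, np.ones((1, k, k)))               # nearest-neighbour upsample
    moving = np.kron(flow_mag > thresh, np.ones((k, k))) > 0  # per-pixel moving mask
    return np.where(moving[None], fast_up, raw[None])         # (k*k, H, W)
```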

Multiple Balls Bouncing: input sequence vs. motion-aware video

Mitsubishi Electric Research Labs (MERL) Computational Cameras Computational Cameras
Camera: Coding/Modulation Dimension
Flutter Shutter: Time (Exposure)
Coded Aperture: Space
Light Field Camera: Space and Angle
Reinterpretable Camera: Space, Time, Angle
Flexible Voxels: Space and Time
Common implementation using fast programmable LCDs

Light Fields: camera arrays and hand-held light field cameras [Wilburn et al. 05, Ng et al. 05, Georgiev et al. 06, Veeraraghavan et al. 07] are represented by a set of perspective cameras, parameterized by (x, y, u, v), and typically capture a narrow field-of-view (FOV) light field.

Wide FOV Light Field? Comparison of normal and wide-FOV images.

Wide FOV Light Field! Spherical Mirror Array Refractive Sphere Array

Wide FOV Refocusing (150° × 150°)

Focus Back

Focus Ball

Focus Person

All-in-Focus

Depth Map

Wide FOV Refocusing (90° × 80°)

Focus Back

Focus Tree

Focus Board

All-in-Focus

Depth Map

Refocusing in a Traditional Light Field: each real camera's view of objects A and B is projected onto the refocusing geometry and combined at the refocus viewpoint; this is an efficient operation using projective texture mapping on the GPU.
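
For a regular grid of views, that projection-and-combine step reduces to shift-and-add synthetic-aperture refocusing; the sketch below is a CPU stand-in for the GPU projective-texture-mapping version, with hypothetical names and wrap-around integer shifts as simplifications.

```python
# Shift-and-add refocusing sketch (CPU stand-in for projective texture mapping
# on the GPU; integer wrap-around shifts via np.roll are a simplification).
import numpy as np

def refocus(views, offsets, slope):
    """views: list of (H, W) sub-aperture images; offsets: list of (du, dv)
    camera offsets; slope: disparity per unit offset for the chosen focal plane."""
    acc = np.zeros_like(views[0], dtype=float)
    for img, (du, dv) in zip(views, offsets):
        acc += np.roll(img, (int(round(slope * dv)), int(round(slope * du))), axis=(0, 1))
    return acc / len(views)
```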

Axial-Cone Modeling of a Spherical Mirror Array: real camera, spherical mirrors, and the corresponding virtual cameras.

Axial-Cone Modeling of a Rotationally Symmetric Mirror: a cone of rays at a given angle in the real camera maps, in the captured photo, to a cone of rays in a virtual camera placed on the mirror axis at distance d with its own cone angle.

Axial-Cone Modeling of a Refractive Sphere Array: real camera, refractive spheres, and the corresponding virtual cameras.

Pipeline: captured photo, per-sphere image, axial-cone modeling, projection to the refocusing geometry, one light field view.

Light Field Views (100° × 100°)

Refocusing Result (100° × 100°)

Rendering using a Single Perspective Camera shows perspective distortion: FOV 100° × 100° vs. FOV 150° × 150°

Refocusing Result: Cube Map (150° × 150°)

Refocusing Result: Mercator Projection (150° × 150°)

Dense Depth Estimation: plane sweeping over candidate depth layers (d1, d2, d3, ...) from the refocus viewpoint, checking color consistency across the light field views at each depth layer.
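
A compact sketch of that plane sweep follows; using per-pixel variance as the color-consistency score, and the function names, are assumptions for illustration.

```python
# Plane-sweep depth sketch (variance as the color-consistency score is an
# illustrative assumption): produces a per-pixel depth index and an
# all-in-focus image from a stack of regularly spaced sub-aperture views.
import numpy as np

def plane_sweep(views, offsets, slopes):
    """views: list of (H, W) images; offsets: (du, dv) per view;
    slopes: disparity-per-offset value for each candidate depth layer."""
    h, w = views[0].shape
    best_cost = np.full((h, w), np.inf)
    depth_index = np.zeros((h, w), dtype=int)
    all_in_focus = np.zeros((h, w))
    for d, slope in enumerate(slopes):
        warped = np.stack([np.roll(img,
                                   (int(round(slope * dv)), int(round(slope * du))),
                                   axis=(0, 1))
                           for img, (du, dv) in zip(views, offsets)])
        cost = warped.var(axis=0)            # low variance == color-consistent depth
        better = cost < best_cost
        best_cost = np.where(better, cost, best_cost)
        depth_index = np.where(better, d, depth_index)
        all_in_focus = np.where(better, warped.mean(axis=0), all_in_focus)
    return depth_index, all_in_focus
```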

Axial-Cones (Taguchi, Agrawal, Veeraraghavan, Ramalingam & Raskar, MERL / MIT Media Lab). Plane sweeping for dense depth estimation: depth map.

Plane sweeping for dense depth estimation: all-in-focus rendering.

Prototypes: spherical mirror array and refractive sphere array. Advantages: single-shot, flexible camera placement, low cost, portable.

Array of 1 Refractive Spheres

Refocusing, Perspective Projection (90° × 80°)

All-in-Focus, Perspective Projection (90° × 80°)

Depth Map, Perspective Projection (90° × 80°)

Mitsubishi Electric Research Labs (MERL) Computational Cameras Light Field Mode? Flutter Shutter mode? Reinterpretable Mode?

Mitsubishi Electric Research Labs (MERL) Computational Cameras Computational Cameras: sensing with per-pixel control, wide-angle light fields, modulation in other dimensions (e.g. wavelength), and slicing and sampling of the plenoptic function; reconstruction algorithms using image/video-based priors, compressive sensing, and statistical properties of the plenoptic function.

Mitsubishi Electric Research Labs (MERL) Computational Cameras Acknowledgements: Ramesh Raskar, MIT; Jack Tumblin, Northwestern Univ.; Ashok Veeraraghavan, Rice Univ.; Mohit Gupta, Columbia Univ.; Ankit Mohan, Flutter; Srinivasa Narasimhan, CMU; Cyrus Wilson; Yuichi Taguchi, MERL; Srikumar Ramalingam, MERL; MERL: Jay Thornton, Joseph Katz, John Barnwell; MELCO, Japan.

Mitsubishi Electric Research Labs (MERL) Computational Cameras Computational Cameras: Flutter Shutter Camera; Coded Aperture; Mask-based light field camera; Reinterpretable Camera; Wide-angle light field camera
