Computational Photography: Real Time Plenoptic Rendering

Computational Photography: Real Time Plenoptic Rendering. Andrew Lumsdaine, Georgi Chunev (Indiana University); Todor Georgiev (Adobe Systems)

Who was at the Keynote Yesterday?

Overview: plenoptic cameras; rendering with GPUs; effects (choosing focus, choosing viewpoint/parallax, stereo, choosing depth of field, HDR, polarization, super resolution); demos; conclusion.

Making (and Recreating) Memories

What's Wrong with this Picture?

Perspective

Film: CAPTURE (color, exposure, focus) -> PROCESS -> VIEW (color, exposure). Difficult to adjust.

Along Came Photoshop: CAPTURE (color, exposure, focus) -> PROCESS (color, exposure) -> VIEW (color, exposure).

What's Wrong with This Picture?

What's Wrong? It's Only a Picture!

Can We Create More than Pictures? Can we ask that photography render the full variety offered by the direct observation of objects? "Is it possible to create a photographic print in such a manner that it represents the exterior world framed, in appearance, between the boundaries of the print, as if those boundaries were that of a window opened on reality?" (Gabriel Lippmann, 1908)

Pixels and Cores. Moore's Law: megapixels keep growing. 7.2 MP = 8 by 10 inches at 300 dpi, available on cell phones; 60 MP sensors are available now, and larger ones soon (can a use be found?). Use those pixels to capture richer information about a scene, then computationally process the captured data. GPU power is also riding the Moore's Law curve.

Infinite Variety (Focusing)

Focusing

Focusing

Different Views

Different Views

Different Views

Computational Photography. With traditional photography, light rays in a scene go through optical elements and are captured by a sensor. With computational photography, we capture the light rays themselves and apply the optical elements computationally. CAPTURE (color, exposure, focus) -> PROCESS (color, exposure, focus) -> VIEW (color, exposure, focus, 3D).

Real Time Plenoptic Rendering. Capture the information about the intensity of light rays in a scene (the radiance, or plenoptic function), not just a 2D picture. Render images (take pictures) later, computationally, and explore the full variety. This requires real-time rendering (made possible by the GPU), opens new ways of working with photography, and provides new photographic effects.

Taking Pictures. A traditional camera places optical elements into the light rays in a scene. A pixel on the sensor is illuminated by rays from all directions; the sensor records the intensity of the sum of those rays. We lose all of the information about the individual rays.

Radiance (Plenoptic Function, Lightfield). Instead of integrating rays from all directions, capture the rays themselves (the radiance). Record all the information about the scene.

Computational Photography: The Basic Idea. Using radiance, we can take the picture computationally: choose and apply optical elements computationally, render computationally, and explore the full variety computationally. CAPTURE (color, exposure, focus) -> PROCESS (color, exposure, focus) -> VIEW (color, exposure, focus, 3D).

Radiance (and Transformations). The radiance is a density function over 4D ray space; each ray is a point in that space, with a directional coordinate and a spatial coordinate. (2D phase-space diagrams are shown because they are easier to draw.)

Radiance (and Transformations). The radiance is a density function over 4D ray space. The effects of optical elements (lenses, free space) are linear transformations. Rendering (taking a picture) is integration over all p at a given q.
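The linear transformations on ray space can be sketched as standard 2x2 ray-transfer matrices acting on a ray (q, p) in the 2D phase space of the slides (the real radiance lives in 4D; the function names here are illustrative, not from the talk):

```python
import numpy as np

# A ray is a point (q, p) in 2D phase space:
# q = spatial coordinate, p = directional coordinate (slope).

def free_space(d):
    """Travel through free space a distance d: q' = q + d*p, p' = p."""
    return np.array([[1.0, d], [0.0, 1.0]])

def thin_lens(f):
    """Thin lens of focal length f: q' = q, p' = p - q/f."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# A ray at q = 1.0 travelling with slope 0.2, transported a unit
# distance and then refracted by an f = 2.0 lens:
ray = np.array([1.0, 0.2])
out = thin_lens(2.0) @ free_space(1.0) @ ray   # -> [1.2, -0.4]
```

Composing optical elements is just matrix multiplication, which is why the camera's optics can be "applied" to the captured radiance after the fact.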

Capturing the 4D Radiance with a 2D Sensor. To capture individual rays, we first have to separate them. At a particular spatial point, we have a set of rays in all directions; if we let those rays travel through a pinhole, they separate into distinguishable individual rays. Two pinholes will sample two positions.

A Plenoptic Camera. A camera with an array of pinholes will capture an image that represents the 4D radiance. In practice, one might use a microlens array to capture more light.

The Focused Plenoptic Camera. With the Adobe camera, we make one important modification: we use the microlenses to create an array of relay cameras, sampling the plenoptic function with higher spatial resolution. Note that the image plane can also be behind the sensor (a virtual image is captured).

Rendering: Taking a Computational Picture. To take a picture (render), we integrate over all directions p.
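On a discretely sampled radiance this integration is just a sum over the directional axes. A toy NumPy sketch (the array layout is an assumption for illustration):

```python
import numpy as np

# Toy 4D radiance r[qy, qx, py, px]: an 8x8 spatial grid,
# each position sampled at 4x4 directions.
rng = np.random.default_rng(0)
radiance = rng.random((8, 8, 4, 4))

# Rendering = integrating over all directions p at each position q.
image = radiance.sum(axis=(2, 3))   # an 8x8 picture
```

A traditional sensor performs exactly this sum optically, and irreversibly; capturing the radiance lets us defer it and transform ray space first.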

The Story So Far. A plenoptic camera takes a 2D picture: a radiance image (or "flat"). The pixels are samples of the radiance in the 4D ray space. Optical elements (lenses, free space) transform the ray space. We take a picture by rendering (computationally), and we adjust the picture by transforming the ray space (computationally).

The Part of the Talk Where We Reveal the Magic

First, the Camera

Plenoptic Image (Flat)

GPU Programming. Basic alternatives for programming the GPU: general purpose (CUDA) or graphics based (GLSL). The OpenGL Shading Language (GLSL) is a natural fit, built around texture mapping.

Rendering with the GPU using OpenGL. Read in the plenoptic radiance image; create a 2D texture object for the radiance; serialize the image data to an OpenGL-compatible format; define the texture to OpenGL.

GLSL Implementation of Rendering

GLSL Implementation of Rendering: gl_TexCoord[0].st, gl_FragColor

GLSL Implementation of Rendering. Given the output pixel coordinate gl_TexCoord[0].st: find the relevant microimage, then find the offset within it, relative to the microimage center.
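The per-fragment logic the shader performs can be sketched on the CPU with NumPy. This is a hedged reconstruction of basic focused-plenoptic rendering, not the talk's actual GLSL: each output tile comes from the central patch of the corresponding microimage (the function name and parameters are hypothetical):

```python
import numpy as np

def render_flat(flat, micro, patch):
    """Basic focused-plenoptic rendering (CPU sketch of the shader logic).

    flat  : 2D plenoptic image ("flat"), dimensions divisible by micro
    micro : microimage pitch in pixels
    patch : size of the central patch taken from each microimage
    """
    ny, nx = flat.shape[0] // micro, flat.shape[1] // micro
    off = (micro - patch) // 2          # offset of the central patch
    out = np.empty((ny * patch, nx * patch), dtype=flat.dtype)
    for j in range(ny):
        for i in range(nx):
            # Find the relevant microimage, then the offset within it.
            tile = flat[j*micro + off : j*micro + off + patch,
                        i*micro + off : i*micro + off + patch]
            out[j*patch:(j+1)*patch, i*patch:(i+1)*patch] = tile
    return out
```

In the fragment shader the same index arithmetic runs per output pixel on gl_TexCoord[0].st, with the flat bound as a texture; shifting the patch off-center changes the rendered viewpoint, which is how the "Choosing View" slides work.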

GLSL Rendering

Choosing View

Choosing View

Choosing View

GLSL Rendering

Choosing View

Choosing View

Choosing View

Computational Refocusing What does the sensor capture with different focal planes? What is this in terms of phase space?

Computational Refocusing. We capture radiance r1; how can we compute r2? One option is to apply a translation transform to the radiance and render from the transformed radiance. Very expensive.

Plenoptic 2.0 Refocusing Principle Rendering for two different focal planes Comments?

Plenoptic 2.0 Refocusing Principle. A new focal plane can be rendered directly from the original radiance.
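The idea can be sketched in phase space: free-space translation by t shears the radiance, r2(q, p) = r1(q - t*p, p), so instead of resampling the whole radiance and then integrating, we integrate r1 directly along the sheared lines. A toy 1D-spatial sketch (integer shear and periodic boundaries for simplicity; the function name is illustrative):

```python
import numpy as np

def refocus_render(r1, t):
    """Render at a new focal plane directly from the original radiance.

    r1 : radiance sampled as r1[q, p] on an integer grid
    t  : integer translation distance (shear amount), for simplicity
    """
    nq, n_p = r1.shape
    img = np.zeros(nq)
    for p in range(n_p):
        # Each direction p contributes its samples shifted by t*p,
        # i.e. img[q] += r1[q - t*p, p] (periodic wrap at the border).
        img += np.roll(r1[:, p], t * p)
    return img
```

With t = 0 this reduces to the ordinary rendering sum over p; varying t sweeps the focal plane without ever materializing a transformed radiance, which is what makes real-time refocusing in the shader feasible.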

GLSL Rendering

Computational Focusing

Computational Focusing

Computational Focusing

To Find Out More: Georgiev, T. and Lumsdaine, A., "Focused Plenoptic Camera and Rendering," Journal of Electronic Imaging, Vol. 19, No. 2, 2010. http://www.tgeorgiev.net/gtc2010/

GLSL Implementation (Basic Rendering)