Light: Properties of Light

Today: What is light? How do we measure it? How does light propagate? How does light interact with matter? (Image by Ted Adelson.)

Readings: Andrew Glassner, Principles of Digital Image Synthesis (Vol. 1), Morgan Kaufmann Publishers, 1995, pp. 5-32. Watt & Policarpo, The Computer Image, Addison-Wesley, 1998, pp. 64-71, 103-114 (5.3 is optional).

What is light? Electromagnetic radiation (EMR) moving along rays in space. R(λ) is EMR, measured in units of power (watts); λ is wavelength.

Light field: we can describe all of the light in a scene by specifying the radiance along all light rays, i.e. the radiation arriving at every point in space from every direction.

The light field (Stanford light field gantry): known as the plenoptic function. If you know R, you can predict how the scene would appear from any viewpoint. How?

Parameterize rays by their intersection with two planes; here t is not time (it is different from the t above!). Assume radiance does not change along a ray: what does this assume about the world? Usually the λ and time parameters are dropped. How could you capture a light field? (Lytro.)

What is light? Electromagnetic radiation (EMR) moving along rays in space. R(λ) is EMR, measured in units of power (watts); λ is wavelength.

Perceiving light: how do we convert radiation into color? What part of the spectrum do we see? (Image: Frank O'Connell / The New York Times.)
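
As a rough sketch of that two-plane idea (the plane placements z = 0 and z = 1 and the function name are illustrative assumptions, not the actual gantry setup), a ray can be reduced to the four numbers (u, v, s, t):

```python
import numpy as np

def two_plane_coords(origin, direction, z_uv=0.0, z_st=1.0):
    """Parameterize a ray by its intersections with two parallel planes.

    Returns (u, v, s, t): the (x, y) hits on the planes z = z_uv and z = z_st.
    Assumes the ray is not parallel to the planes (direction[2] != 0).
    """
    o, d = np.asarray(origin, float), np.asarray(direction, float)
    hit_uv = o + d * (z_uv - o[2]) / d[2]   # intersection with the first plane
    hit_st = o + d * (z_st - o[2]) / d[2]   # intersection with the second plane
    return hit_uv[0], hit_uv[1], hit_st[0], hit_st[1]

# Example: a ray from a camera behind the first plane, looking slightly off-axis.
print(two_plane_coords(origin=[0, 0, -1], direction=[0.1, 0.0, 1.0]))
```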

The visible light spectrum: we see electromagnetic radiation in a range of wavelengths.

Light spectrum: the appearance of light depends on its power spectrum, i.e. how much power (or energy) there is at each wavelength (e.g. daylight vs. a tungsten bulb). Our visual system converts a light spectrum into color; this is a rather complex transformation.

The human visual system: density of rods and cones. Color perception: light hits the retina, which contains photosensitive cells, rods and cones. These cells convert the spectrum into a few discrete values. Rods and cones are non-uniformly distributed on the retina; rods are responsible for intensity, cones for color. Fovea: a small region (1 or 2 degrees) at the center of the visual field containing the highest density of cones (and no rods). There is less visual acuity in the periphery, where many rods are wired to the same neuron.

Demonstrations of visual acuity: with one eye shut, at the right distance, all of these letters should appear equally legible (Glassner, 1.7). With the left eye shut, look at the cross on the left; at the right distance, the circle on the right should disappear (Glassner, 1.8).

Brightness contrast and constancy: the apparent brightness depends on the surrounding region. Brightness contrast: a constant colored region seems lighter or darker depending on the surround. Brightness constancy: a surface looks the same under widely varying lighting conditions.

Light response is nonlinear: our visual system has a large dynamic range; we can resolve both light and dark things at the same time. One mechanism for achieving this is that we sense light intensity on a logarithmic scale, so an exponential intensity ramp will be seen as a linear ramp. Another mechanism is adaptation: rods and cones adapt to be more sensitive in low light and less sensitive in bright light.
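
A minimal numeric illustration of that logarithmic-response point (the intensity values are arbitrary, and log base 2 is just a stand-in for whatever compression the visual system actually applies):

```python
import numpy as np

# An exponential intensity ramp: each step doubles the physical intensity.
intensity = 2.0 ** np.arange(8)   # 1, 2, 4, ..., 128 (arbitrary units)

# A logarithmic response turns equal ratios into equal steps,
# so the perceived ramp comes out (approximately) linear.
perceived = np.log2(intensity)
print(perceived)                  # 0, 1, 2, ..., 7: evenly spaced
```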

Visual dynamic range.

Afterimages: tired photoreceptors send out a negative response after a strong stimulus. http://www.michaelbach.de/ot/mot_adaptspiral/index.html

Color perception (figure: S, M, and L cone response curves, power vs. wavelength). There are three types of cones; each is sensitive in a different region of the spectrum, but the regions overlap. Short (S) corresponds to blue, medium (M) to green, and long (L) to red. They have different sensitivities: we are more sensitive to green than to red, and sensitivity varies from person to person (and with age). Colorblindness is a deficiency in at least one type of cone.

Rods and cones act as filters on the spectrum. To get the output of a filter, multiply its response curve by the spectrum and integrate over all wavelengths; each cone yields one number. Q: How can we represent an entire spectrum with 3 numbers? A: We can't! Most of the information is lost. As a result, two different spectra may appear indistinguishable; such spectra are known as metamers. http://www.cs.brown.edu/exploratories/freesoftware/repository/edu/brown/cs/exploratories/applets/spectrum/metamers_guide.html
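
A minimal sketch of that filter idea in Python, using made-up Gaussian sensitivity curves in place of the real measured S/M/L cone responses and a flat example spectrum:

```python
import numpy as np

wavelengths = np.arange(400, 701, 5)   # visible range, in nm, 5 nm steps

def gaussian(center, width):
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

# Stand-in cone sensitivity curves (the real S/M/L curves are measured, not Gaussian).
cones = {"S": gaussian(440, 25), "M": gaussian(545, 35), "L": gaussian(570, 40)}

# An example spectrum: flat power at every visible wavelength.
spectrum = np.ones_like(wavelengths, dtype=float)

# Each cone's output: multiply its response curve by the spectrum, then integrate
# over wavelength (here a simple Riemann sum with step 5 nm).
responses = {name: float(np.sum(curve * spectrum) * 5.0)
             for name, curve in cones.items()}
print(responses)   # an entire spectrum collapses to just three numbers
```

Any two spectra that produce the same three integrals are indistinguishable to this model, which is exactly the metamer situation described above.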

Perception summary: the mapping from radiance to perceived color is quite complex! We throw away most of the data, we apply a logarithm, brightness is affected by pupil size, and there are brightness contrast and constancy effects and afterimages.

Camera response function: now how about the mapping from radiance to pixels? It's also complex, but better understood. This mapping is known as the film or camera response function. How can we recover radiance values given pixel values? Why should we care? It is useful if we want to estimate material properties, shape from shading requires radiance, and it enables creating high dynamic range images. What does the response function depend on?

Recovering the camera response, Method 1: carefully model every step in the pipeline (measure the aperture, model the film, the digitizer, etc.); this is *really* hard to get right. Method 2: calibrate (estimate) the response function. Image several objects with known radiance, measure the pixel values, and fit a function: pixel intensity = f(radiance), where f is the response curve.

Recovering the camera response, Method 3: calibrate the response function from several images. Consider taking images with shutter speeds 1/1000, 1/100, 1/10, and 1. Q: What is the relationship between the radiance or pixel values in consecutive images? A: Each image receives 10 times as much exposure (exposure = radiance × time). We can use this to recover the camera response function (the curve mapping radiance to pixel intensity); then find its inverse, which maps pixel intensity back to radiance. For more info: P. E. Debevec and J. Malik, Recovering High Dynamic Range Radiance Maps from Photographs, SIGGRAPH 97, August 1997.
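
A minimal sketch of the inverse-mapping step, assuming a known toy gamma response curve (a stand-in; the Debevec-Malik method actually solves for the unknown curve from the images themselves):

```python
import numpy as np

def response(exposure, gamma=2.2):
    """Toy camera response: maps exposure in [0, 1] to pixel values in [0, 255]."""
    return 255.0 * np.clip(exposure, 0.0, 1.0) ** (1.0 / gamma)

def inverse_response(pixel, gamma=2.2):
    """Inverse of the toy response: maps pixel values back to exposure."""
    return (pixel / 255.0) ** gamma

# Simulate one scene point photographed at several shutter speeds.
radiance = 0.37                                 # arbitrary units
times = np.array([1/1000, 1/100, 1/10, 1.0])
pixels = response(radiance * times)             # exposure = radiance * time

# Recover a radiance estimate from each image and compare them.
estimates = inverse_response(pixels) / times
print(estimates)                                # each should be close to 0.37
```

In practice, saturated or very dark pixels would be excluded or down-weighted before combining the per-image estimates into one high dynamic range value.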

High dynamic range imaging. Techniques: Debevec: http://www.debevec.org/research/hdr/ ; Columbia: http://www.cs.columbia.edu/cave/software/rascal/rrgallery.php

Light transport. Light sources, basic types: point source; directional source (a point source that is infinitely far away); area source (a union of point sources). (From Steve Marschner.) More generally, a light field can describe *any* distribution of light sources.
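
As a small illustration of the point vs. directional distinction (the dictionary format and the inverse-square falloff convention are assumptions made for this sketch):

```python
import numpy as np

def light_at(point, source):
    """Return (unit direction toward the light, intensity arriving at `point`).

    `source` is either {"type": "directional", "dir": ..., "intensity": ...}
    or {"type": "point", "pos": ..., "intensity": ...}; point sources fall off
    with the inverse square of distance, directional sources do not fall off.
    """
    if source["type"] == "directional":
        d = -np.asarray(source["dir"], float)     # direction back toward the light
        return d / np.linalg.norm(d), source["intensity"]
    to_light = np.asarray(source["pos"], float) - np.asarray(point, float)
    r = np.linalg.norm(to_light)
    return to_light / r, source["intensity"] / (r * r)

# Example: a point source two units above the surface point at the origin.
print(light_at([0, 0, 0], {"type": "point", "pos": [0, 2, 0], "intensity": 10.0}))
```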

(From Steve Marschner.) The interaction of light and matter: what happens when a light ray hits a point on an object? Some of the light gets absorbed (converted to other forms of energy, e.g., heat). Some gets transmitted through the object, possibly bent through refraction. Some gets reflected; as we saw before, it could be reflected in multiple directions at once. Let's consider the case of reflection in detail. In the most general case, a single incoming ray could be reflected in all directions. How can we describe the amount of light reflected in each direction?

The BRDF, the Bidirectional Reflection Distribution Function: given an incoming ray and an outgoing ray (relative to the surface normal), what proportion of the incoming light is reflected along the outgoing ray?

Diffuse reflection: dull, matte surfaces like chalk or latex paint. Microfacets scatter incoming light randomly, so the effect is that light is reflected equally in all directions. The answer is given by the BRDF, which for such a surface is a constant:
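
As a sketch of the BRDF as a function of incoming and outgoing directions, here is that simplest case, the constant (Lambertian) BRDF, with an arbitrary albedo value:

```python
import numpy as np

def lambertian_brdf(albedo, w_in, w_out):
    """BRDF f(w_in, w_out): fraction of incoming light reflected per unit solid angle.

    For an ideal diffuse (Lambertian) surface it is the constant albedo / pi,
    independent of both the incoming and outgoing directions.
    """
    return albedo / np.pi

# The returned value is the same for any pair of directions.
print(lambertian_brdf(0.8, w_in=[0, 0, 1], w_out=[0.5, 0, 0.87]))
```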

Diffuse reflection is governed by Lambert's law: the viewed brightness does not depend on the viewing direction, but it does depend on the direction of illumination. This is the model most often used in computer vision. The common model, Lambert's Law: I_e = k_d I_i (N · L), where L, N, V are unit vectors (light, normal, view directions), I_e is the outgoing radiance, I_i is the incoming radiance, and the constant k_d is the BRDF of the Lambertian surface.

Specular reflection: for a perfect mirror, light is reflected about N; near-perfect mirrors have a highlight around the mirror direction R.

Phong illumination model, a Phong approximation of surface reflectance: assume reflectance is modeled by three components, a diffuse term, a specular term, and an ambient term (to compensate for inter-reflected light): I_e = k_a I_a + k_d I_i (N · L)_+ + k_s I_i (V · R)_+^n_s, where (x)_+ = max(x, 0), L, N, V are unit vectors, I_e is the outgoing radiance, I_i is the incoming radiance, I_a is the ambient light, and k_a is the ambient light reflectance factor. (Figures: moving the light source; changing n_s.)
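
A minimal sketch of evaluating that three-term model at one surface point; the coefficient values k_a, k_d, k_s and the exponent n_s are made up for illustration:

```python
import numpy as np

def normalize(v):
    v = np.asarray(v, float)
    return v / np.linalg.norm(v)

def phong(N, L, V, I_i, I_a, k_a=0.1, k_d=0.6, k_s=0.3, n_s=32):
    """Outgoing radiance I_e = ambient + diffuse + specular (Phong model)."""
    N, L, V = normalize(N), normalize(L), normalize(V)
    R = 2.0 * np.dot(N, L) * N - L                      # L mirrored about N
    diffuse = k_d * I_i * max(np.dot(N, L), 0.0)        # Lambertian term
    specular = k_s * I_i * max(np.dot(V, R), 0.0) ** n_s
    return k_a * I_a + diffuse + specular

# Example: light and viewer both above the surface, slightly off the normal.
print(phong(N=[0, 0, 1], L=[0.3, 0, 1], V=[-0.2, 0, 1], I_i=1.0, I_a=0.5))
```

Setting k_s = 0 (and k_a = 0) recovers the pure Lambertian model described above.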

Measuring the BRDF: a gonioreflectometer (traditional design by Greg Ward) is a device for capturing the BRDF by moving a camera and a light source; it needs careful control of the illumination and the environment. The Columbia-Utrecht Database provides captured BRDF models for a variety of materials: http://www.cs.columbia.edu/cave/software/curet/index.php