Advanced Rendering 1/36


[Robert L. Cook et al.] 2/31

Shutter Speed The shutter allows light to hit the sensor for a finite duration of time. Objects which move while the shutter is open create multiple images on the sensor, resulting in motion blur. A faster shutter speed prevents motion blur, but can severely limit the amount of light available to the sensor, making the resulting image too dark (especially when the aperture size is small) 3/31

Motion Blur Set up animations for moving objects during the time interval [T_0, T_1] in which the shutter is open. E.g. describe the transform of the object by a function F(t) for t ∈ [T_0, T_1]. For each ray: Assign a random time t_ray = (1-α)T_0 + αT_1 with α ∈ [0, 1]. All objects in the scene are placed in their time t_ray locations, i.e. given by F(t_ray). Trace the ray with the time t_ray scene and get a color for that ray. This works significantly better when multiple rays per pixel are used to combat temporal aliasing 4/31
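A minimal sketch of this per-ray time sampling in Python; the `trace_at_time` callback standing in for a full scene trace is a hypothetical placeholder, not the slides' code:

```python
import random

def sample_ray_time(t0, t1, rng):
    """Pick a random time in the shutter interval [t0, t1] for one ray."""
    alpha = rng.random()
    return (1.0 - alpha) * t0 + alpha * t1

def trace_with_motion_blur(trace_at_time, t0, t1, n_rays, rng):
    """Average the colors of n_rays rays, each traced against the scene
    posed at its own random time, approximating motion blur."""
    total = 0.0
    for _ in range(n_rays):
        t = sample_ray_time(t0, t1, rng)
        total += trace_at_time(t)  # scene objects placed at F(t) inside
    return total / n_rays
```

Using many rays per pixel, each with its own time, is what suppresses the temporal aliasing mentioned above.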

[Robert L. Cook et al.] 5/31

Focal Length The distance over which initially parallel rays are brought into focus onto a focal point by a lens or lens system A stronger lens system has a shorter focal length Individual elements of a lens system can be adjusted to change the overall focal length of a camera (but each individual lens/element has a fixed focal length) If the object is far enough away (infinity), the image plane or sensor should be placed near (or at) the focal point 6/31

Field of View The part of the world that is visible to the sensor Zoom in/out by increasing/decreasing the focal length of the lens system The sensor needs to be moved out/in to adjust for the new focal length Since the size of the sensor does not change, the field of view will shrink/expand 7/31

Field of View Zooming in the camera shrinks the FOV 8/31

Field of View Zooming in the camera shrinks the FOV 500mm 1000mm 9/31

Field of View The field of view for a ray tracer can be adjusted by changing the distance between the eye point and the image plane. Alternatively, one can change the size of the sensor (unlike in a real camera). A common mistake in computer graphics is to place the film plane too close to the main objects in the scene. The desired field of view is then obtained by placing the eye point very close to the film plane or by making a very large film plane (resulting in an unnatural fish-eye lens effect) 10/31
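The geometric relationship between sensor size, focal length, and field of view can be sketched as follows, assuming a simple camera focused at infinity (so the sensor sits at the focal point); the function name is an illustration, not from the slides:

```python
import math

def field_of_view(sensor_size, focal_length):
    """Angular field of view (radians): half the sensor subtends
    atan((sensor_size/2) / focal_length) at the lens, so the full
    angle is twice that. Zooming in (longer focal length) shrinks it."""
    return 2.0 * math.atan(sensor_size / (2.0 * focal_length))
```

For example, a 36 mm-wide sensor behind a 50 mm lens gives roughly a 39.6° horizontal field of view, and doubling the focal length shrinks it.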

Circle of confusion An optical spot caused by a cone of light rays from a lens not coming into perfect focus when imaging a point source. When the spot is approximately equal to the size of a pixel on the sensor, the object seems to be in focus. Objects at varying distances from the camera require the sensor to be placed at different distances from the lens in order for the object to be in focus. Depth of Field - the distance between the nearest and farthest objects in a scene that appear roughly in focus (the circle of confusion is not too big) 11/31

Depth of Field A pinhole camera has infinite depth of field. Making the aperture smaller improves the depth of field. However, this limits the amount of light entering the camera, making the image darker. Decreasing the shutter speed to offset this can result in motion blur. Small apertures also cause undesirable light diffraction. The total depth of field is approximately D ≈ 2NCU²/f², where f is the focal length, N = f/d is the F-Number (with d the aperture diameter), U is the real world object distance, and C is the allowable circle of confusion 12/36
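One common far-field approximation ties these quantities together as D ≈ 2NCU²/f² (valid when the subject distance U is much larger than f but well inside the hyperfocal distance); a sketch, with the function name mine:

```python
def depth_of_field(f, N, U, C):
    """Approximate total depth of field D ≈ 2*N*C*U^2 / f^2.
    f: focal length, N: F-number f/d, U: object distance,
    C: allowable circle of confusion. All lengths in the same unit."""
    return 2.0 * N * C * U * U / (f * f)
```

For a 50 mm lens focused at 3 m with C = 0.03 mm, stopping down from f/2.8 to f/8 grows the in-focus range from about 0.6 m to about 1.7 m, matching the slide's point that smaller apertures improve depth of field.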

Aperture & Depth of Field 13/31

Depth of Field Specify a focal plane (red plane, below) where objects will be in focus For each pixel, the focal point is calculated as the intersection of the standard ray tracing ray (green ray, below) and the pre-specified focal plane For each pixel, replace the pinhole eye with a circular region Then to shade that pixel, shoot multiple rays from sampled points in the circular region through the focal point (and average the results) Objects further away from the focal plane will have more blurring pinhole ray tracer lens ray tracer 14/31
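A minimal 2D sketch (x-z plane) of this lens-sampling procedure; the 2D setup and names are assumptions of this sketch, not the slides' code:

```python
import math, random

def lens_rays(eye, pixel_dir, focal_dist, lens_radius, n, rng):
    """Find the focal point where the pinhole ray meets the focal plane
    z = focal_dist, then shoot n rays from random points on the lens
    aperture (a segment in 2D) through that focal point. Averaging the
    colors of these rays blurs objects away from the focal plane."""
    ex, ez = eye
    dx, dz = pixel_dir
    t = (focal_dist - ez) / dz              # pinhole ray hits focal plane
    focal_pt = (ex + t * dx, focal_dist)
    rays = []
    for _ in range(n):
        lx = ex + (2.0 * rng.random() - 1.0) * lens_radius  # lens sample
        mx, mz = focal_pt[0] - lx, focal_pt[1] - ez
        norm = math.hypot(mx, mz)
        rays.append(((lx, ez), (mx / norm, mz / norm)))
    return rays, focal_pt
```

Every sampled ray passes exactly through the focal point, so geometry on the focal plane stays sharp while everything else spreads over the pixel.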

Depth of Field 15/31


Dispersion Dispersion occurs because the phase velocity (and thus the index of refraction) depends on the frequency/wavelength of light. Dispersive media: glass, raindrops. Index of refraction: air n_1(λ) ≈ 1; glass/water n_2(λ) > 1. For visible light in most transparent materials, n decreases towards 1 as the wavelength increases. So blue light (λ ≈ 400nm) bends more than red light (λ ≈ 700nm). Cauchy's approximation: n(λ) = A + B/λ² + C/λ⁴, with material parameters A, B, C 17/36
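Cauchy's approximation is easy to evaluate directly; the default coefficients below are commonly quoted values for BK7 crown glass (with the C term dropped) and should be treated as illustrative:

```python
def cauchy_n(lambda_um, A=1.5046, B=0.00420, C=0.0):
    """Cauchy's approximation n(λ) = A + B/λ² + C/λ⁴, λ in micrometres.
    n falls toward A as λ grows, so blue refracts more than red."""
    l2 = lambda_um * lambda_um
    return A + B / l2 + C / (l2 * l2)
```

With these coefficients, n(400 nm) ≈ 1.531 versus n(700 nm) ≈ 1.513, reproducing the statement that blue light bends more than red.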

Dispersion Humans have tri-stimulus color perception. This has allowed us to optimize rendering by only working with R, G, B. However, the interaction of light with an arbitrary surface cannot (in general) be described correctly using only three components: the same RGB value could map to many different spectral power distributions (and therefore wavelengths). Need to more accurately describe light and its interactions with surfaces using spectral power distributions. Consider a prism dispersion setup with two orange light sources that have identical RGB values but different spectral power distributions: 18/36

Dispersion Glass prism lit by sodium light source Ocean Light Simulator 19/36

Dispersion Glass prism lit by red+green light source Ocean Light Simulator 20/36

Wavelength Light Map When tracing photons from the light source, importance sample the light's spectral power distribution to obtain a λ for each photon. Use λ and the reflectance/transmittance spectrum at each intersection point to trace the photon throughout the scene. Store the incident power of the photon along with the wavelength of the photon in the photon map 21/36

Gathering (Camera Rays) When tracing rays from the camera, estimate the spectral power distribution at an intersection point using the nearby photon samples Multiply/Integrate this estimated spectral power distribution by the tristimulus response functions to obtain R, G, B values Requires significantly more samples in the photon map, although many optimization strategies exist 22/36
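The integration of an estimated spectrum against response curves can be sketched as below; the Gaussian responses are toy stand-ins for the real tristimulus response functions, chosen only to make the sketch self-contained:

```python
import math

def gaussian(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

def spd_to_rgb(spd, lambdas):
    """Integrate a sampled spectral power distribution (spd[i] at
    wavelength lambdas[i], in nm) against three response functions to
    obtain R, G, B. Real renderers would use measured tristimulus
    curves; the Gaussians here are illustrative placeholders."""
    r = g = b = 0.0
    dl = lambdas[1] - lambdas[0]            # uniform wavelength spacing
    for lam, p in zip(lambdas, spd):
        r += p * gaussian(lam, 600.0, 40.0) * dl
        g += p * gaussian(lam, 550.0, 40.0) * dl
        b += p * gaussian(lam, 450.0, 40.0) * dl
    return r, g, b
```

A spectrum concentrated near 620 nm maps to a red-dominant RGB triple, while two spectra with the same integrals against all three curves become indistinguishable, which is exactly the metamerism problem raised on the dispersion slide.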


Atmospheric Effects Caused by light being scattered towards the camera by mist or dust 24/36

Absorption Light traveling through a medium may interact with the medium in such a way that the light energy is converted to another form of non-visible energy (such as heat). Define an absorption coefficient, σ_a(x). As we move a small distance dx along a ray, a fraction of the current radiance L(x, ω) given by σ_a(x) L(x, ω) is absorbed: dL(x, ω) = -σ_a(x) L(x, ω) dx 25/36

Out-Scattering Light traveling through a medium in a straight line may interact with the medium and be scattered off to travel in a different direction. At sunset, the atmosphere scatters away most of the blue light, leaving mostly red light traveling to our eyes from the sun. Define a scattering coefficient, σ_s(x). As we move a small distance dx along a ray, a fraction of the radiance L(x, ω) given by σ_s(x) L(x, ω) is scattered off in another direction (and no longer travels along the ray): dL(x, ω) = -σ_s(x) L(x, ω) dx 26/36

Total Attenuation The probability that light is attenuated (either absorbed or out-scattered) per unit length is c(x) = σ_a(x) + σ_s(x). As we move a small distance dx along a ray, a fraction of the radiance L(x, ω) given by c(x) L(x, ω) is attenuated: dL(x, ω) = -c(x) L(x, ω) dx. Both the primary rays shot from a camera pixel and the secondary shadow rays used in lighting calculations can pass through transparent objects or participating media. The lighting contribution along these rays needs to be attenuated. Attenuation in a participating medium can be modeled using Beer's law 27/36

Recall: Beer's Law If the medium is homogeneous, the attenuation along the ray can be described using Beer's Law: dI = -cI dx, where I is the light intensity, x is the distance along the ray, and c is the attenuation constant (which varies based on color/wavelength). Solving this Ordinary Differential Equation (ODE) with the initial value I(0) = I_0 results in: I(x) = I_0 e^(-cx) 28/36

Inhomogeneous Beer's Law For non-homogeneous media, the attenuation constant varies spatially based on the concentration of the inhomogeneities. Discretize the ray into N small segments, and treat the attenuation as constant over each small segment. This converges to the correct answer as the number of segments is increased. The attenuation along the i-th segment is set to be e^(-c(.5(x_{i-1}+x_i)) Δx), where Δx = (x_N - x_0)/N is the segment length, c(.5(x_{i-1}+x_i)) is the attenuation constant evaluated at the center of the segment, and x_i = x_0 + iΔx. The total attenuation along the ray is computed via multiplication: e^(-c(.5(x_0+x_1)) Δx) · e^(-c(.5(x_1+x_2)) Δx) ⋯ e^(-c(.5(x_{N-1}+x_N)) Δx) 29/36
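This midpoint-rule discretization can be sketched directly; the function name and the callable attenuation coefficient are illustrative choices:

```python
import math

def transmittance(c, x0, xN, n_segments):
    """Discretized inhomogeneous Beer's law: transmittance over
    [x0, xN] is the product of per-segment factors exp(-c(mid)*dx),
    with c evaluated at each segment midpoint. c is a callable giving
    the spatially varying attenuation coefficient."""
    dx = (xN - x0) / n_segments
    T = 1.0
    for i in range(n_segments):
        mid = x0 + (i + 0.5) * dx
        T *= math.exp(-c(mid) * dx)
    return T
```

For a constant coefficient the product collapses to the homogeneous Beer's law e^(-cx) exactly, and for smoothly varying coefficients it converges as the segment count grows.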

Attenuation Note how the smoke, acting as a participating medium, casts a shadow on the ground. The shadow rays cast from the ground plane to the light source have their light attenuated by the smoke volume. The shadow is not completely black, since some light passes through the smoke volume 30/36

Camera Rays Send out rays from the camera into the scene, intersecting objects and calculating their lighting as usual. For any ray that travels through participating media on the way from the camera to the object, the radiance along that ray needs to be attenuated, just as for shadow rays. An object's color could be partially attenuated or completely attenuated (not visible - occluded) by the participating media 31/36

Volumetric Light Map Represent a 3D light map with a uniform grid enclosing the participating media Could also use an octree, or any other spatial partition with sample points For each sample point that lies within the participating medium, send out a shadow ray to the light source and compute the attenuated radiance that reaches that sample point This computes and stores self-shadowing effects (e.g., LEFT: clouds appear darker on the side away from the light; RIGHT: smoke has light and dark regions from self-shadowing) 32/36

In-Scattering At each point along a ray, some light will be in-scattered from the surrounding medium into the direction of the ray In-scattering increases the radiance along the direction of the ray The sky appears blue because the particles in the atmosphere scatter blue light in every direction (and some of it towards our eyes) Without in-scattering, the sky would appear black 33/36

In-Scattering The radiance contribution due to in-scattering from the participating media needs to be added along the rays from the camera to the objects. Without in-scattering, an object whose outgoing radiance was completely attenuated by participating media produces black pixels, since there would be zero radiance along those rays. In order to add the in-scattering contribution, add in-scattered light along each segment of the ray discretization already used for attenuation. In-scattered light is added to the total light at each point, and thus gets attenuated by subsequent segments along the discretized ray. Thus the calculation needs to be done from object to camera. The precomputed volumetric light map stores candidate light for in-scattering 34/36

In-Scattering At the center of each discretized segment of the camera ray, interpolate the radiance L(x, ω) from the sample points of the volumetric light map. The direction of the incoming light, ω, is the direction from the light source to the center of the discretized segment; a separate light map is needed for each light source. A phase function p(ω, ω') gives the probability that an incoming ray from the light source with direction ω is scattered into the direction ω' of the camera ray. The radiance at this point x scattered towards the camera along direction ω' is p(ω, ω') L(x, ω) σ_s(x). The in-scattered radiance contribution from the entire discretized segment is then p(ω, ω') L(x, ω) σ_s(x) Δx. N.B. σ_s is the probability of scattering in any direction, and p selects the subset of these that scatter in the desired direction 35/36
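The object-to-camera accumulation of attenuation plus in-scattering can be sketched as below; the per-segment inputs (attenuation coefficient, in-scattered source term, segment length) are hypothetical values a renderer would gather from the volumetric light map:

```python
import math

def march_ray(L_background, segments):
    """Back-to-front accumulation along a discretized camera ray: each
    segment attenuates the radiance coming from behind it and adds its
    own in-scattered contribution p(ω,ω')·L(x,ω)·σ_s(x)·Δx. `segments`
    is ordered from the object toward the camera; each entry is
    (attenuation_coeff, inscatter_source, dx)."""
    L = L_background
    for c, S, dx in segments:
        L = L * math.exp(-c * dx) + S * dx
    return L
```

With the in-scatter term zeroed, the march reduces to pure Beer-law attenuation of the object's radiance; adding in-scattering can only brighten the pixel, which is why smoke in front of a black object is not rendered black.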

Phase Functions Energy conservation: ∫_sphere p(ω, ω') dω' = 1. Many phase functions are parameterized by the phase angle: cos θ = ω · ω'. 1. Isotropic: p(cos θ) = 1/(4π). 2. Rayleigh: p(cos θ) = (3/(16π))(1 + cos²θ). Models scattering due to particles smaller than the wavelength of light, such as in the atmosphere. 3. Henyey-Greenstein: p(cos θ) = (1/(4π)) (1 - g²)/(1 + g² - 2g cos θ)^(3/2). g is the asymmetry parameter (the mean cosine of the scattering angle) and can be treated as a tunable parameter which allows one to adjust the appearance of a medium; g = 0 results in the isotropic phase function 36/36
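The three phase functions above can be evaluated directly, and the energy-conservation property checked numerically by integrating over the sphere (2π ∫ p(μ) dμ for functions of cos θ only):

```python
import math

def p_isotropic(cos_theta):
    """Uniform scattering in all directions."""
    return 1.0 / (4.0 * math.pi)

def p_rayleigh(cos_theta):
    """Rayleigh phase function, normalized over the sphere."""
    return 3.0 / (16.0 * math.pi) * (1.0 + cos_theta * cos_theta)

def p_henyey_greenstein(cos_theta, g):
    """Henyey-Greenstein phase function; g is the asymmetry parameter
    (mean cosine of the scattering angle), g = 0 gives isotropic."""
    denom = (1.0 + g * g - 2.0 * g * cos_theta) ** 1.5
    return (1.0 / (4.0 * math.pi)) * (1.0 - g * g) / denom
```

A quick midpoint-rule integration over cos θ ∈ [-1, 1] (weighted by 2π for the azimuth) confirms each integrates to 1.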

Volumetric Emission Some participating media emit light e.g. fire Hot carbon soot emits blackbody radiation based on temperature Electrons emit light energy as they fall from higher energy excited states to lower energy states This lighting information can be added as a separate volumetric light map Note that this volumetric emission is in every direction, as opposed to the previously calculated information for self-shadowing Thus when calculating the in-scattering contribution, need to integrate the product of the phase function and incoming radiance over all incoming directions 37/36

Volumetric Emission Adding volumetric emission to the light map gives the desired orange/blue/etc. colors. But adding it only to the light map doesn't allow the emitted light to cast shadows or illuminate the rest of the scene. To do this, treat the emissive region as a volume light, often modeled by breaking it up into many small point lights (similar to an area light). These point lights are then used just like every other light in the scene for shadow rays, creating photon maps, etc., and participate in the creation of the volumetric light map for the self-shadowing of the participating media 38/36


Lens Flare Light can be reflected and scattered by lenses in the lens system, resulting in generally unwanted but impressive effects This is caused in part by material inhomogeneities in the lens Model and ray trace a full optical model of the entire camera lens system Geometry of lens surfaces and characteristic planes (entrance, aperture, sensor plane) Absorption and dispersion of lens elements Antireflective coatings Diffraction 40/36

Camera Lens Geometry The camera lens consists of a large number of elements Specify the geometry of lens elements with parameters such as radius (for spherical elements), diameter, distance between elements, index of refraction, etc. Trace rays through these elements Use Snell s law to compute the reflected/transmitted ray directions Use Fresnel Equations to compute the transmission and reflection components Canon EF-S 17-55mm f/2.8 IS USM 41/36
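The Snell/Fresnel computation at each lens interface can be sketched in scalar form (angles only, unpolarized light); the function names are illustrative:

```python
import math

def snell_refract(cos_i, n1, n2):
    """Cosine of the transmitted angle from Snell's law
    n1 sin(θi) = n2 sin(θt), or None on total internal reflection."""
    sin_i = math.sqrt(max(0.0, 1.0 - cos_i * cos_i))
    sin_t = n1 / n2 * sin_i
    if sin_t >= 1.0:
        return None                      # total internal reflection
    return math.sqrt(1.0 - sin_t * sin_t)

def fresnel_reflectance(cos_i, n1, n2):
    """Unpolarized Fresnel reflectance: the average of the squared s-
    and p-polarized amplitude reflection coefficients. The transmitted
    fraction is 1 minus this value."""
    cos_t = snell_refract(cos_i, n1, n2)
    if cos_t is None:
        return 1.0                       # everything reflects
    rs = (n1 * cos_i - n2 * cos_t) / (n1 * cos_i + n2 * cos_t)
    rp = (n2 * cos_i - n1 * cos_t) / (n2 * cos_i + n1 * cos_t)
    return 0.5 * (rs * rs + rp * rp)
```

At normal incidence on glass (n = 1.5) this gives the familiar 4% reflection per surface; it is these small residual reflections, bounced between many lens elements, that build up into flare.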

Absorption and Dispersion Light is attenuated as it moves through each lens element Treat each lens element as a participating medium with an absorption probability Lens elements are characterized by the Abbe number which measures the material s dispersion Results in chromatic aberration Model these effects using spectral rendering techniques Panasonic DMC-FZ5 by Tony & Marilyn Karp 42/36

Antireflective Coatings Optical coatings are applied to the surfaces of lenses to reduce reflection Characterized by a residual reflectivity RR(λλ, θθ) where λλ is the wavelength of the light and θθ is the incident angle Residual reflectivity denotes the amount of light reflected after the destructive interference of the coating is taken into account Requires spectral rendering techniques and multiple evaluations of the Fresnel equations 43/36

Diffraction Light tends to spread out as it goes through small openings. Happens when the aperture gets too small; we call this being diffraction-limited. Tends to create constructive and destructive interference (Airy disk). Difficult to account for in a ray tracer due to its geometric optics assumptions; can be faked with a texture 44/36

Ray Tracing Lens Flare Trace rays from the lights into the camera lens Intersect the rays with the camera lens elements Use spectral rendering to compute reflection and transmission in a physically accurate manner Store wavelength-dependent illumination samples on the film Estimate the spectral power distribution on each pixel of the film to obtain the image J.J. Abrams apologizes for being addicted to lens flare 45/36

Rogue One: December 16, 2016 Star Wars: December 18, 2015 46/36

Star Wars: The Last Jedi December 15, 2017 47/36
