Projections and Hardware Rendering. Brian Curless CSE 557 Fall 2014


Reading

Required: Shirley, Ch. 7, Sec. 8.2, Ch. 18

Further reading:
- Foley et al., Chapter 5.6 and Chapter 6.
- David F. Rogers and J. Alan Adams, Mathematical Elements for Computer Graphics, 2nd Ed., McGraw-Hill, New York, 1990, Chapter 2.
- I. E. Sutherland, R. F. Sproull, and R. A. Schumacker, A characterization of ten hidden surface algorithms, ACM Computing Surveys 6(1): 1-55, March 1974.

Going back to the pinhole camera

Recall that the Trace project uses, by default, the pinhole camera model. If we just consider finding out which surface point is visible at each image pixel, then we are ray casting.

For each pixel center P_ij:
- Send a ray from the eye point (COP), C, through P_ij into the scene.
- Intersect the ray with each object.
- Select the nearest intersection.
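The loop above can be sketched directly. This is a minimal illustration, not the Trace project's actual code: the scene API (objects with an `intersect()` method and a `color` attribute, and the `Slab` toy object) is hypothetical.

```python
# A minimal sketch of the ray-casting loop described above.
# The scene API (objects with intersect() and color) is hypothetical.

def ray_cast(eye, pixel_centers, objects, background):
    """For each pixel center P_ij, send a ray from the eye point (COP)
    through P_ij, intersect it with every object, keep the nearest hit."""
    image = {}
    for (i, j), center in pixel_centers.items():
        direction = tuple(c - e for c, e in zip(center, eye))
        nearest_t, nearest = float("inf"), None
        for obj in objects:
            t = obj.intersect(eye, direction)
            if t is not None and 0.0 < t < nearest_t:
                nearest_t, nearest = t, obj
        image[(i, j)] = nearest.color if nearest else background
    return image

class Slab:
    """Toy object: the plane z = z0, hit wherever the ray crosses it."""
    def __init__(self, z0, color):
        self.z0, self.color = z0, color
    def intersect(self, origin, direction):
        if direction[2] == 0.0:
            return None
        t = (self.z0 - origin[2]) / direction[2]
        return t if t > 0.0 else None

# One pixel straight ahead; the nearer slab (z = 3) wins.
image = ray_cast((0.0, 0.0, 0.0), {(0, 0): (0.0, 0.0, 1.0)},
                 [Slab(5.0, "red"), Slab(3.0, "blue")], "black")
```

Note the `0 < t` test: intersections behind the eye are discarded, and "select the nearest intersection" becomes a running minimum over t.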

Warping space

A very different approach is to take the imaging setup, warp all of space so that all the rays are parallel (and distant objects appear smaller than closer ones), and then just draw everything onto the image plane, keeping track of what is in front.

3D Geometry Pipeline

Graphics hardware follows the warping-space approach. Before being turned into pixels, a piece of geometry goes through a number of transformations...

3D Geometry Pipeline (cont'd)

Projections

Projections transform points in n-space to m-space, where m < n. In 3D, we map points from 3-space to the projection plane (PP) (a.k.a. image plane) along projectors (a.k.a. viewing rays) emanating from the center of projection (COP).

There are two basic types of projections:
- Perspective: the distance from the COP to the PP is finite.
- Parallel: the distance from the COP to the PP is infinite.

Parallel projections

For parallel projections, we specify a direction of projection (DOP) instead of a COP. There are two types of parallel projections:
- Orthographic projection: DOP perpendicular to PP.
- Oblique projection: DOP not perpendicular to PP.

We can write orthographic projection onto the z = 0 plane with a simple matrix:

$$\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} =
\begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix}$$

Normally, we do not drop the z value right away. Why not?
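As a sanity check, the matrix above can be applied numerically. The helper `mat_vec` and the name `ORTHO` are just for this sketch:

```python
# Orthographic projection onto z = 0, written as the 3x4 matrix above
# applied to a homogeneous point (a minimal sketch).

def mat_vec(M, v):
    """Multiply matrix M (list of rows) by column vector v."""
    return [sum(M[r][c] * v[c] for c in range(len(v))) for r in range(len(M))]

ORTHO = [
    [1, 0, 0, 0],   # x' = x
    [0, 1, 0, 0],   # y' = y
    [0, 0, 0, 1],   # homogeneous 1; the z coordinate is dropped
]

def project_ortho(p):
    """p = (x, y, z, 1) -> (x', y', 1)."""
    return mat_vec(ORTHO, p)

p_img = project_ortho([3.0, 4.0, 7.0, 1.0])
```

The z value vanishes from the output, which is exactly why, in practice, we keep z around a while longer: the Z-buffer on the next slide needs it.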

Z-buffer

The Z-buffer or depth buffer algorithm [Catmull, 1974] can be used to determine which surface point is visible at each pixel. Here is pseudocode for the Z-buffer hidden surface algorithm:

for each pixel (i, j) do
    Z-buffer[i, j] <- FAR
    Framebuffer[i, j] <- <background color>
end for
for each triangle A do
    for each pixel (i, j) in A do
        Compute depth z of A at (i, j)
        if z > Z-buffer[i, j] then
            Z-buffer[i, j] <- z
            Framebuffer[i, j] <- color of A
        end if
    end for
end for

Q: What should FAR be set to?
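The pseudocode transcribes almost line for line. In this sketch, real rasterization is replaced by pre-computed per-pixel depths (a map from pixel to z) so the loop structure stays visible; the slide's comparison `z > Z-buffer` means larger z is nearer, so FAR is the most distant representable depth, here negative infinity.

```python
import math

# Sketch of the Z-buffer hidden surface algorithm above. Each "triangle"
# is a (pixel -> depth) map plus a color, standing in for rasterization.
# Convention from the slide: larger z is closer, so FAR = -infinity.

def zbuffer(width, height, triangles, background):
    zbuf = [[-math.inf] * width for _ in range(height)]
    frame = [[background] * width for _ in range(height)]
    for depths, color in triangles:
        for (i, j), z in depths.items():
            if z > zbuf[j][i]:          # nearer than what is stored
                zbuf[j][i] = z
                frame[j][i] = color
    return frame

# Two triangles cover the same pixel; the nearer one (z = -2) wins
# regardless of drawing order.
triangles = [({(0, 0): -5.0}, "red"), ({(0, 0): -2.0}, "green")]
frame = zbuffer(1, 1, triangles, "black")
```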

Rasterization

The process of filling in the pixels inside of a polygon is called rasterization. During rasterization, the z value can be computed incrementally (fast!).

Curious fact: the Z-buffer was described as the "brute-force image space algorithm" by [SSS], mentioned only in Appendix B of [SSS] as a point of comparison for huge memories, and written off as totally impractical. Today, Z-buffers are commonly implemented in hardware.

Properties of parallel projection

- Not realistic looking.
- Good for exact measurements.
- Parallel projections are actually a kind of affine transformation:
  - Parallel lines remain parallel.
  - Ratios are preserved.
  - Angles are not (in general) preserved.
- Most often used in CAD, architectural drawings, etc., where taking exact measurements is important.

Derivation of perspective projection

Consider the projection of a point (x, y, z) onto a projection plane at distance d from the COP. By similar triangles, we can compute how much the x and y coordinates are scaled:

$$x' = \frac{d}{z}\,x \qquad y' = \frac{d}{z}\,y$$

Homogeneous coordinates revisited

Remember how we said that affine transformations work with the last coordinate always set to one. What happens if the coordinate is not one? We divide all the coordinates by w:

$$\begin{bmatrix} x \\ y \\ z \\ w \end{bmatrix} \rightarrow
\begin{bmatrix} x/w \\ y/w \\ z/w \\ 1 \end{bmatrix}$$

If w = 1, then nothing changes. Sometimes we call this division step the perspective divide.

Homogeneous coordinates and perspective projection

Now we can re-write the perspective projection as a matrix equation:

$$\begin{bmatrix} x' \\ y' \\ z' \\ w' \end{bmatrix} =
\begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 1/d & 0 \end{bmatrix}
\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix} =
\begin{bmatrix} x \\ y \\ z \\ z/d \end{bmatrix}$$

After division by w, we get:

$$\begin{bmatrix} \dfrac{d}{z}\,x \\ \dfrac{d}{z}\,y \\ d \\ 1 \end{bmatrix}$$

Again, projection implies dropping the z coordinate to give a 2D image, but we usually keep it around a little while longer.
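The two steps above — multiply by the projection matrix, then divide by w — can be written out in a few lines (a sketch; d is the COP-to-PP distance):

```python
# Perspective projection as a matrix multiply followed by the
# perspective divide, matching the 1/d matrix above (sketch).

def perspective_project(p, d):
    x, y, z, w = p
    # Apply the projection matrix: (x, y, z, 1) -> (x, y, z, z/d)
    xp, yp, zp, wp = x, y, z, z / d
    # Perspective divide by w'
    return (xp / wp, yp / wp, zp / wp, 1.0)

# d = 2, point at depth z = 8: x and y are scaled by d/z = 1/4,
# and the third coordinate comes out as d itself.
q = perspective_project((2.0, 4.0, 8.0, 1.0), 2.0)
```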

Projective normalization

After applying the perspective transformation and dividing by w, we are free to do a simple parallel projection to get the 2D image. What does this imply about the shape of things after the perspective transformation + divide?

Viewing angle

An alternative to specifying the distance from the COP to the PP is to specify a viewing angle θ. Given the height of the image h and the angle θ, what is d? What happens to d as θ increases (while h is constant)?
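Under the standard convention (stated here as an assumption: θ is the full vertical viewing angle and h the image height, with the PP centered on the view axis), half the image height and d form a right triangle with half-angle θ/2, giving d = (h/2)/tan(θ/2):

```python
import math

# Relating viewing angle to COP-to-PP distance (sketch; assumes theta is
# the full vertical viewing angle and h the image height).

def pp_distance(h, theta):
    """d = (h/2) / tan(theta/2)."""
    return (h / 2) / math.tan(theta / 2)

d_right_angle = pp_distance(2.0, math.pi / 2)   # 90-degree view: d = 1
```

As θ increases with h fixed, tan(θ/2) grows, so d shrinks — a wider viewing angle means the projection plane sits closer to the COP.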

Zoom and dolly

Vanishing points

What happens to two parallel lines that are not parallel to the projection plane? Think of train tracks receding into the horizon...

The equation for a line is:

$$l = p + t\,v = \begin{bmatrix} p_x \\ p_y \\ p_z \\ 1 \end{bmatrix} + t \begin{bmatrix} v_x \\ v_y \\ v_z \\ 0 \end{bmatrix}$$

After the perspective transformation we get:

$$x' = p_x + t\,v_x \qquad y' = p_y + t\,v_y \qquad w' = (p_z + t\,v_z)/d$$

Vanishing points (cont'd)

Dividing by w:

$$x' = d\,\frac{p_x + t\,v_x}{p_z + t\,v_z} \qquad y' = d\,\frac{p_y + t\,v_y}{p_z + t\,v_z} \qquad w' = 1$$

Letting t go to infinity:

$$x' \rightarrow d\,\frac{v_x}{v_z} \qquad y' \rightarrow d\,\frac{v_y}{v_z}$$

We get a point! Note that the limit depends only on the direction v, not on the starting point p. What happens to the line l = q + t v? Each set of parallel lines intersects at a vanishing point on the PP.

Q: How many vanishing points are there?
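The limit can be checked numerically: project points far along two distinct parallel lines (same v, different p) and both land next to (d·vx/vz, d·vy/vz). The x' = d·x/z projection and the example numbers are just for this sketch:

```python
# Numerical check of the vanishing point: points far along two parallel
# lines project to (almost) the same image point, d * (vx/vz, vy/vz).

def project(p, d):
    """Perspective projection x' = d*x/z, y' = d*y/z (sketch)."""
    x, y, z = p
    return (d * x / z, d * y / z)

def on_line(p, v, t):
    return tuple(pi + t * vi for pi, vi in zip(p, v))

d = 1.0
v = (1.0, 0.0, 2.0)                              # shared direction
p1, p2 = (0.0, -1.0, 4.0), (3.0, 2.0, 5.0)       # two parallel lines
vanish = (d * v[0] / v[2], d * v[1] / v[2])      # (0.5, 0.0)

far1 = project(on_line(p1, v, 1e6), d)
far2 = project(on_line(p2, v, 1e6), d)
```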

Clipping and the viewing frustum

The center of projection and the portion of the projection plane that maps to the final image form an infinite pyramid. The sides of the pyramid are clipping planes. Frequently, additional clipping planes are inserted to restrict the range of depths. These clipping planes are called the near and far (or the hither and yon) clipping planes. Together, all of the clipping planes bound the viewing frustum.

Properties of perspective projections

The perspective projection is an example of a projective transformation. Here are some properties of projective transformations:
- Lines map to lines.
- Parallel lines do not necessarily remain parallel.
- Ratios are not preserved.

One of the advantages of perspective projection is that size varies inversely with distance, so images look realistic. A disadvantage is that we can't judge distances as exactly as we can with parallel projections.

Rasterization with color

Recall that the z-buffer works by interpolating z-values across a triangle that has been projected into image space, a process called rasterization. During rasterization, colors can be smeared across a triangle as well.

Gouraud interpolation

Recall from the shading lecture that rendering with per-triangle normals leads to a faceted appearance. An improvement is to compute per-vertex normals and use graphics hardware to do Gouraud interpolation:
1. Compute normals at the vertices.
2. Shade only the vertices.
3. Interpolate the resulting vertex colors.

Gouraud interpolation artifacts

Gouraud interpolation has significant limitations:
1. If the polygonal approximation is too coarse, we can miss specular highlights.
2. We will encounter Mach banding (derivative discontinuity enhanced by the human eye).

This is what graphics hardware does by default. A substantial improvement is to do Phong interpolation.

Phong interpolation

To get an even smoother result with fewer artifacts, we can perform Phong interpolation. Here's how it works:
1. Compute normals at the vertices.
2. Interpolate normals and normalize.
3. Shade using the interpolated normals.
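The difference between the two schemes — shade then interpolate, versus interpolate normals then shade — can be seen on a single edge. This toy example uses a made-up viewer-facing specular term (`shade` and its parameters are assumptions for illustration); the true highlight sits at the edge midpoint, where Gouraud interpolation nearly misses it:

```python
# Toy comparison of Gouraud vs Phong interpolation at an edge midpoint.
# shade() is a stand-in specular term peaking when the normal points
# straight at the viewer (an assumption for illustration only).

def normalize(n):
    mag = sum(c * c for c in n) ** 0.5
    return tuple(c / mag for c in n)

def shade(n, view=(0.0, 0.0, 1.0), shininess=50):
    return max(0.0, sum(a * b for a, b in zip(normalize(n), view))) ** shininess

def gouraud_mid(n0, n1):
    # Shade the vertices, then interpolate the colors.
    return 0.5 * (shade(n0) + shade(n1))

def phong_mid(n0, n1):
    # Interpolate the normals, normalize, then shade.
    mid = tuple(0.5 * (a + b) for a, b in zip(n0, n1))
    return shade(mid)

# Vertex normals tilt away from the viewer; the highlight is mid-edge.
n0 = normalize((-0.5, 0.0, 1.0))
n1 = normalize((0.5, 0.0, 1.0))
g = gouraud_mid(n0, n1)   # both vertex colors are dim, so the lerp is dim
p = phong_mid(n0, n1)     # interpolated normal faces the viewer: bright
```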

Default pipeline: Gouraud interpolation

The default pipeline stages are: vertex processor -> primitive assembler -> rasterizer -> fragment processor.

Default vertex processing:
- L <- determine lighting direction
- V <- determine viewing direction
- N <- normalize(n_e)
- c_Blinn-Phong <- shade with L, V, N, k_d, k_s, n_s
- attach c_Blinn-Phong to vertex as a varying
- v_i <- project v to image

Primitive assembly: v_1, v_2, v_3 -> triangle

Default fragment processing (on the rasterizer-interpolated varyings), for each fragment p:
- color_p <- c_Blinn-Phong

Programmable pipeline: Phong-interpolated normals!

Vertex shader:
- attach n_e to vertex as a varying
- attach v_e to vertex as a varying
- v_i <- project v to image

Primitive assembly: v_1, v_2, v_3 -> triangle

Fragment shader (on the rasterizer-interpolated varyings), for each fragment p:
- L <- determine lighting direction
- V <- determine viewing direction
- N_p <- normalize(n_e)
- color_p <- shade with L, V, N_p, k_d, k_s, n_s

Texture mapping and the z-buffer

Texture mapping can also be handled in z-buffer algorithms.

Method:
- Scan conversion is done in screen space, as usual.
- Each pixel is colored according to the texture.
- Texture coordinates are found by Gouraud-style interpolation.

Note: the mapping is more complicated to handle perspective correctly!
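The note about perspective can be made concrete. Naive Gouraud-style (affine) interpolation of a texture coordinate u in screen space is wrong once the vertices have different w values; one standard fix (stated here as background, not from the slide) is to interpolate u/w and 1/w linearly and divide at each pixel:

```python
# Affine vs perspective-correct interpolation of a texture coordinate u
# between two projected vertices (sketch). w0, w1 are the vertices'
# homogeneous w values; s is the screen-space interpolation parameter.

def lerp(a, b, s):
    return a + s * (b - a)

def affine_u(u0, u1, s):
    # What naive screen-space (Gouraud-style) interpolation gives.
    return lerp(u0, u1, s)

def correct_u(u0, w0, u1, w1, s):
    # Interpolate u/w and 1/w linearly in screen space, then divide.
    return lerp(u0 / w0, u1 / w1, s) / lerp(1.0 / w0, 1.0 / w1, s)

# Midpoint of an edge whose far vertex has w = 3: affine says u = 0.5,
# but the correct value is biased toward the near vertex.
a = affine_u(0.0, 1.0, 0.5)
c = correct_u(0.0, 1.0, 1.0, 3.0, 0.5)
```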

Shading in OpenGL

The OpenGL lighting model allows you to associate different lighting colors according to the material properties they will influence. Thus, our original shading equation:

$$I = k_e + k_a I_{La} + \sum_j \frac{1}{a_j + b_j r_j + c_j r_j^2}\, I_{L,j}\, B_j \left[ k_d (N \cdot L_j)_+ + k_s (N \cdot H_j)_+^{\,n_s} \right]$$

becomes:

$$I = k_e + k_a I_{La} + \sum_j \frac{1}{a_j + b_j r_j + c_j r_j^2} \left[ k_a I_{La,j} + B_j \left( k_d I_{Ld,j} (N \cdot L_j)_+ + k_s I_{Ls,j} (N \cdot H_j)_+^{\,n_s} \right) \right]$$

where you can have a global ambient light with intensity I_La in addition to an ambient light intensity I_La,j associated with each individual light, as well as separate diffuse and specular intensities, I_Ld,j and I_Ls,j, respectively.

Materials in OpenGL

The OpenGL code to specify the surface shading properties is fairly straightforward. For example:

GLfloat ke[] = { 0.1, 0.15, 0.05, 1.0 };
GLfloat ka[] = { 0.1, 0.15, 0.1, 1.0 };
GLfloat kd[] = { 0.3, 0.3, 0.2, 1.0 };
GLfloat ks[] = { 0.2, 0.2, 0.2, 1.0 };
GLfloat ns[] = { 50.0 };
glMaterialfv(GL_FRONT, GL_EMISSION, ke);
glMaterialfv(GL_FRONT, GL_AMBIENT, ka);
glMaterialfv(GL_FRONT, GL_DIFFUSE, kd);
glMaterialfv(GL_FRONT, GL_SPECULAR, ks);
glMaterialfv(GL_FRONT, GL_SHININESS, ns);

Notes:
- The GL_FRONT parameter tells OpenGL that we are specifying the materials for the front of the surface.
- Only the alpha value of the diffuse color is used for blending. It's usually set to 1.

Shading in OpenGL, cont'd

In OpenGL this equation, for one light source (the 0th), is specified something like:

GLfloat La[] = { 0.2, 0.2, 0.2, 1.0 };
GLfloat La0[] = { 0.1, 0.1, 0.1, 1.0 };
GLfloat Ld0[] = { 1.0, 1.0, 1.0, 1.0 };
GLfloat Ls0[] = { 1.0, 1.0, 1.0, 1.0 };
GLfloat pos0[] = { 1.0, 1.0, 1.0, 0.0 };
GLfloat a0[] = { 1.0 };
GLfloat b0[] = { 0.5 };
GLfloat c0[] = { 0.25 };
GLfloat S0[] = { -1.0, -1.0, 0.0 };
GLfloat beta0 = 45;
GLfloat e0 = 2;
glLightModelfv(GL_LIGHT_MODEL_AMBIENT, La);
glLightfv(GL_LIGHT0, GL_AMBIENT, La0);
glLightfv(GL_LIGHT0, GL_DIFFUSE, Ld0);
glLightfv(GL_LIGHT0, GL_SPECULAR, Ls0);
glLightfv(GL_LIGHT0, GL_POSITION, pos0);
glLightfv(GL_LIGHT0, GL_CONSTANT_ATTENUATION, a0);
glLightfv(GL_LIGHT0, GL_LINEAR_ATTENUATION, b0);
glLightfv(GL_LIGHT0, GL_QUADRATIC_ATTENUATION, c0);
glLightfv(GL_LIGHT0, GL_SPOT_DIRECTION, S0);
glLightf(GL_LIGHT0, GL_SPOT_CUTOFF, beta0);
glLightf(GL_LIGHT0, GL_SPOT_EXPONENT, e0);

Shading in OpenGL, cont'd

Notes:
- You can have as many as GL_MAX_LIGHTS lights in a scene. This number is system-dependent.
- For directional lights, you specify a light direction, not a position, and the attenuation and spotlight terms are ignored.
- The directions of directional lights and spotlights are specified in the coordinate systems of the lights, not relative to the surface points as we've been doing in lecture.