18. Projections and Z-buffers


Reading

Required:
- Watt, Sections 5.2.2-5.2.4, 6.3, 6.6 (esp. intro and subsections 1, 4, and 8-10)

Further reading:
- Foley, et al., Chapter 5.6 and Chapter 6.
- David F. Rogers and J. Alan Adams, Mathematical Elements for Computer Graphics, 2nd Ed., McGraw-Hill, New York, 1990, Chapter 2.
- I. E. Sutherland, R. F. Sproull, and R. A. Schumacker, "A characterization of ten hidden surface algorithms," ACM Computing Surveys 6(1): 1-55, March 1974.

3D Geometry Pipeline

Before being turned into pixels by graphics hardware, a piece of geometry goes through a number of transformations...

Projections

Projections transform points in n-space to m-space, where m < n. In 3-D, we map points from 3-space to the projection plane (PP) along projectors emanating from the center of projection (COP). The center of projection is exactly the same as the pinhole in a pinhole camera.

There are two basic types of projections:
- Perspective: distance from COP to PP is finite
- Parallel: distance from COP to PP is infinite

Parallel projections

For parallel projections, we specify a direction of projection (DOP) instead of a COP. There are two types of parallel projections:
- Orthographic projection: DOP perpendicular to PP
- Oblique projection: DOP not perpendicular to PP

We can write orthographic projection onto the z=0 plane with a simple matrix:

    [x']   [1 0 0 0] [x]
    [y'] = [0 1 0 0] [y]
    [1 ]   [0 0 0 1] [z]
                     [1]

But normally, we do not drop the z value right away. Why not?
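The orthographic projection matrix above can be sketched in a few lines of Python (the helper name `mat_vec` is mine, not from the notes): x and y pass through unchanged, z is dropped, and the homogeneous coordinate stays 1.

```python
def mat_vec(M, v):
    """Multiply a matrix (list of rows) by a 4-vector."""
    return [sum(M[r][c] * v[c] for c in range(4)) for r in range(len(M))]

# Orthographic projection onto z=0: a 3x4 matrix that keeps x and y,
# drops z, and passes the homogeneous 1 through.
ORTHO = [
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 1],
]

p = [3.0, 2.0, -5.0, 1.0]      # a point in homogeneous coordinates
print(mat_vec(ORTHO, p))       # [3.0, 2.0, 1.0] -- z is gone
```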

Properties of parallel projection

Properties of parallel projection:
- Not realistic looking
- Good for exact measurements
- Actually a kind of affine transformation
- Parallel lines remain parallel
- Angles are not (in general) preserved
- Most often used in CAD, architectural drawings, etc., where taking exact measurements is important

Derivation of perspective projection

Consider the projection of a point onto the projection plane. By similar triangles, we can compute how much the x and y coordinates are scaled:

    x' = d*x/z        y' = d*y/z

[Note: Watt uses a left-handed coordinate system, and he looks down the +z axis, so his PP is at +d.]

Homogeneous coordinates revisited

Remember how we said that affine transformations work with the last coordinate always set to one. What happens if the coordinate is not one? We divide all the coordinates by w:

    [x]    [x/w]
    [y] -> [y/w]
    [z]    [z/w]
    [w]    [ 1 ]

If w = 1, then nothing changes. Sometimes we call this division step the perspective divide.

Homogeneous coordinates and perspective projection

Now we can re-write the perspective projection as a matrix equation:

    [x']   [1 0  0  0] [x]   [ x ]
    [y'] = [0 1  0  0] [y] = [ y ]
    [w']   [0 0 1/d 0] [z]   [z/d]
                       [1]

After division by w, we get:

    [x']   [d*x/z]
    [y'] = [d*y/z]
    [1 ]   [  1  ]

Again, projection implies dropping the z coordinate to give a 2D image, but we usually keep it around a little while longer.
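The matrix-then-divide recipe above can be sketched directly (the function name is mine; the math follows the derivation):

```python
def perspective_project(p, d):
    """Project (x, y, z) onto the plane z = d through the origin (COP)."""
    x, y, z = p
    # Applying the projection matrix gives homogeneous (x, y, z/d).
    w = z / d
    # Perspective divide: divide every coordinate by w.
    return (x / w, y / w)      # = (d*x/z, d*y/z)

print(perspective_project((2.0, 4.0, 8.0), d=2.0))   # (0.5, 1.0)
```

Note that the result matches the similar-triangles derivation: each coordinate is scaled by d/z, so more distant points shrink.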

Projective normalization

After applying the perspective transformation and dividing by w, we are free to do a simple parallel projection to get the 2D image. What does this imply about the shape of things after the perspective transformation + divide?

Vanishing points

What happens to two parallel lines that are not parallel to the projection plane? Think of train tracks receding into the horizon...

The equation for a line is:

    l = p + t*v = [px]     [vx]
                  [py] + t [vy]
                  [pz]     [vz]
                  [1 ]     [0 ]

After perspective transformation we get:

    [x']   [  px + t*vx   ]
    [y'] = [  py + t*vy   ]
    [w']   [(pz + t*vz)/d ]

Vanishing points (cont'd)

Dividing by w:

    [x']   [d(px + t*vx)/(pz + t*vz)]
    [y'] = [d(py + t*vy)/(pz + t*vz)]
    [1 ]   [            1           ]

Letting t go to infinity, the p terms become negligible:

    [x']   [d*vx/vz]
    [y'] = [d*vy/vz]
    [1 ]   [   1   ]

We get a point! What happens to the line l = q + t*v? Each set of parallel lines intersects at a vanishing point on the PP.

Q: How many vanishing points are there?

Properties of perspective projections

The perspective projection is an example of a projective transformation. Here are some properties of projective transformations:
- Lines map to lines
- Parallel lines do not necessarily remain parallel
- Ratios are not preserved

One of the advantages of perspective projection is that size varies inversely with distance, which looks realistic. A disadvantage is that we can't judge distances as exactly as we can with parallel projections.

Q: Why did nature give us eyes that perform perspective projections?
Q: Do our eyes see in 3D?
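We can check the limit numerically: projecting points far along two different parallel lines (same v, different p) lands near the same vanishing point (d*vx/vz, d*vy/vz). The function below is an illustrative sketch, not code from the notes.

```python
def project_on_line(p, v, t, d):
    """Project the point p + t*v, using x' = d*(px+t*vx)/(pz+t*vz), etc."""
    w = p[2] + t * v[2]
    return (d * (p[0] + t * v[0]) / w,
            d * (p[1] + t * v[1]) / w)

d = 1.0
v = (1.0, 2.0, 4.0)                            # shared direction
for p in [(0.0, 0.0, 5.0), (3.0, -1.0, 2.0)]:  # two distinct parallel lines
    print(project_on_line(p, v, t=1e9, d=d))   # both approach (0.25, 0.5)
```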

Z-buffer

We can use projections for hidden surface elimination. The Z-buffer (or depth buffer) algorithm [Catmull, 1974] is probably the simplest and most widely used of these techniques. Here is pseudocode for the Z-buffer hidden surface algorithm:

    for each pixel (i,j) do
        Z-buffer[i,j] <- FAR
        Framebuffer[i,j] <- <background color>
    end for
    for each polygon A do
        for each pixel (i,j) in A do
            Compute depth z and shade s of A at (i,j)
            if z > Z-buffer[i,j] then
                Z-buffer[i,j] <- z
                Framebuffer[i,j] <- s
            end if
        end for
    end for

Z-buffer, cont'd

The process of filling in the pixels inside of a polygon is called rasterization. During rasterization, the z value and shade s can be computed incrementally (fast!).

Curious fact: described as the "brute-force image space algorithm" by [SSS], and mentioned only in Appendix B of [SSS] as a point of comparison for huge memories, but written off as totally impractical. Today, Z-buffers are commonly implemented in hardware.
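A minimal sketch of the pseudocode above, keeping its convention that larger z means closer (so FAR is the smallest representable depth). The per-polygon fragment lists stand in for the rasterization step, which is not shown here.

```python
FAR = float("-inf")
W, H = 4, 4
zbuf = [[FAR] * W for _ in range(H)]       # Z-buffer[i,j] <- FAR
frame = [["bg"] * W for _ in range(H)]     # Framebuffer <- background

def draw(fragments):
    """Each fragment is (i, j, depth z, shade s), as rasterization yields."""
    for i, j, z, s in fragments:
        if z > zbuf[j][i]:                 # closer than stored depth
            zbuf[j][i] = z
            frame[j][i] = s

draw([(1, 1, -5.0, "red"), (2, 1, -5.0, "red")])   # a far polygon
draw([(1, 1, -2.0, "blue")])                       # a nearer, overlapping one
print(frame[1])   # ['bg', 'blue', 'red', 'bg'] -- blue won the shared pixel
```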

Ray tracing vs. Z-Buffer

Ray tracing:

    for each ray {
        for each object {
            test for intersection
        }
    }

Z-Buffer:

    for each object {
        project_onto_screen;
        for each ray {
            test for intersection
        }
    }

In both cases, optimizations are applied to the inner loop. Biggest differences:
- ray order vs. object order
- Z-buffer does some work in screen space
- Z-buffer restricted to rays from a single center of projection!

Gouraud vs. Phong interpolation

Does Z-buffer graphics hardware do a full shading calculation at every point? Not in the past, but this has changed in the last three years!

Smooth surfaces are often approximated by polygonal facets, because:
- Graphics hardware generally wants polygons (esp. triangles).
- Sometimes it is easier to write ray-surface intersection algorithms for polygonal models.

How do we compute the shading for such a surface?

Faceted shading

Assume each face has a constant normal. For a distant viewer and a distant light source, how will the color of each triangle vary? Result: faceted, not smooth, appearance.

Gouraud interpolation

To get a smoother result that is easily performed in hardware, we can do Gouraud interpolation. Here's how it works:
1. Compute normals at the vertices.
2. Shade only the vertices.
3. Interpolate the resulting vertex colors.
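The three Gouraud steps can be sketched for a single triangle (the simple clamped diffuse term and all numbers here are illustrative, not from the notes): shade only at the vertices, then interpolate colors with barycentric weights.

```python
def shade(normal, light=(0.0, 0.0, 1.0)):
    """Illustrative diffuse shade: clamped dot(normal, light)."""
    return max(0.0, sum(n * l for n, l in zip(normal, light)))

def gouraud(vertex_colors, bary):
    """Interpolate per-vertex colors with barycentric weights (a, b, c)."""
    return sum(c * w for c, w in zip(vertex_colors, bary))

# Step 1: normals at the three vertices (unit length).
normals = [(0.0, 0.0, 1.0), (0.6, 0.0, 0.8), (0.0, 0.6, 0.8)]
# Step 2: shade only the vertices.
colors = [shade(n) for n in normals]           # [1.0, 0.8, 0.8]
# Step 3: interpolate -- here, the color at the triangle's centroid.
print(gouraud(colors, (1/3, 1/3, 1/3)))
```

Because only three shades are computed per triangle, anything the shading model does between vertices (e.g. a sharp highlight) is averaged away, which is exactly the limitation discussed next.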

Gouraud interpolation, cont'd

Gouraud interpolation has significant limitations:
1. If the polygonal approximation is too coarse, we can miss specular highlights.
2. We will encounter Mach banding (derivative discontinuity enhanced by the human eye).

Alas, this is usually what graphics hardware supported until very recently. But new graphics hardware supports per-pixel shading.

Phong interpolation

To get an even smoother result with fewer artifacts, we can perform Phong interpolation. Here's how it works:
1. Compute normals at the vertices.
2. Interpolate normals and normalize.
3. Shade using the interpolated normals.

Gouraud vs. Phong interpolation

Texture mapping and the z-buffer

Texture mapping can also be handled in z-buffer algorithms. Method:
- Scan conversion is done in screen space, as usual
- Each pixel is colored according to the texture
- Texture coordinates are found by Gouraud-style interpolation

Note: Mapping is more complicated if you want to do perspective right! Linear in world space != linear in screen space.
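The standard fix for "linear in world space != linear in screen space" is to interpolate u/z and 1/z linearly in screen space and divide at the end. A sketch (function name and numbers are illustrative):

```python
def perspective_correct_u(u0, z0, u1, z1, s):
    """Texture coordinate at screen-space fraction s along an edge whose
    endpoints carry (u0, z0) and (u1, z1). Interpolate u/z and 1/z
    linearly in screen space, then divide."""
    u_over_z = (1 - s) * (u0 / z0) + s * (u1 / z1)
    one_over_z = (1 - s) * (1 / z0) + s * (1 / z1)
    return u_over_z / one_over_z

# Edge from (u=0, z=1) to (u=1, z=3): the screen-space midpoint is NOT
# the world-space midpoint, so naive linear interpolation (0.5) is wrong.
print(perspective_correct_u(0.0, 1.0, 1.0, 3.0, 0.5))   # 0.25
```

Naive Gouraud-style interpolation of u would give 0.5 here; the perspective-correct answer is 0.25 because the far half of the edge is compressed on screen.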

Antialiasing textures

If you render an object with a texture map using point-sampling, you can get aliasing. Proper antialiasing requires area averaging over pixels. (From Crow, SIGGRAPH '84.) In some cases, you can average directly over the texture pixels to do the anti-aliasing.

Computing the average color

The computationally difficult part is summing over the covered pixels. Several methods have been used. The simplest is brute force: figure out which texels are covered and add up their colors to compute the average.

Mip maps

A faster method is mip maps, developed by Lance Williams in 1983:
- Stands for "multum in parvo": many things in a small place
- Keep textures prefiltered at multiple resolutions
- Has become the graphics hardware standard

Mip map pyramid

The mip map hierarchy can be thought of as an image pyramid:
- Level 0 (T_0[i,j]) is the original image.
- Level 1 (T_1[i,j]) averages over 2x2 neighborhoods of the original.
- Level 2 (T_2[i,j]) averages over 4x4 neighborhoods of the original.
- Level 3 (T_3[i,j]) averages over 8x8 neighborhoods of the original.

What's a fast way to pre-compute the texture map for each level?
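One answer to the pre-computation question: build each level by averaging 2x2 blocks of the previous level rather than going back to level 0, so level k ends up averaging 2^k x 2^k neighborhoods of the original. A sketch for square, power-of-two, single-channel textures:

```python
def next_level(t):
    """Halve a square texture by averaging each 2x2 block."""
    n = len(t) // 2
    return [[(t[2*i][2*j] + t[2*i][2*j+1] +
              t[2*i+1][2*j] + t[2*i+1][2*j+1]) / 4.0
             for j in range(n)] for i in range(n)]

def build_pyramid(t0):
    """Levels T_0, T_1, ... down to a single texel."""
    levels = [t0]
    while len(levels[-1]) > 1:
        levels.append(next_level(levels[-1]))
    return levels

t0 = [[0, 0, 8, 8],
      [0, 0, 8, 8],
      [4, 4, 4, 4],
      [4, 4, 4, 4]]
pyr = build_pyramid(t0)
print(pyr[1])   # [[0.0, 8.0], [4.0, 4.0]]
print(pyr[2])   # [[4.0]] -- the average of the whole texture
```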

Mip map resampling

What would the mip map return for an average over a 5x5 neighborhood at location (u_0, v_0)? How do we measure the fractional distance between levels? What if you need to average over a non-square region?

Summed area tables

A more accurate method than mip maps is summed area tables, invented by Frank Crow in 1984. Recall from calculus:

    integral[a..b] f(x) dx = integral[0..b] f(x) dx - integral[0..a] f(x) dx

In discrete form:

    sum[i=k..m] f[i] = sum[i=0..m] f[i] - sum[i=0..k-1] f[i]

Q: If we wanted to do this real fast, what might we pre-compute?

Summed area tables (cont'd)

We can extend this idea to 2D by creating a table, S[i,j], that contains the sum of everything below and to the left.

Q: How do we compute the average over a region from (l, b) to (r, t)?

Comparison of techniques

(Comparison figure from Crow, SIGGRAPH '84: point sampled vs. MIP-mapped vs. summed area table.)

Characteristics of summed area tables:
- Require more memory and precision
- Give less blurry textures
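A sketch of the 2D table and the rectangle query it enables (the half-open index convention and function names are mine; the four-lookup identity mirrors the 1D prefix-sum difference above):

```python
def build_sat(t):
    """S[i][j] = sum of all texels t[y][x] with y < i and x < j
    (one row/column of zero padding keeps the query branch-free)."""
    h, w = len(t), len(t[0])
    S = [[0.0] * (w + 1) for _ in range(h + 1)]
    for i in range(h):
        for j in range(w):
            S[i+1][j+1] = t[i][j] + S[i][j+1] + S[i+1][j] - S[i][j]
    return S

def region_average(S, l, b, r, t):
    """Average over texels with l <= col < r and b <= row < t:
    four table lookups, regardless of region size."""
    total = S[t][r] - S[b][r] - S[t][l] + S[b][l]
    return total / ((r - l) * (t - b))

tex = [[1, 2, 3],
       [4, 5, 6],
       [7, 8, 9]]
S = build_sat(tex)
print(region_average(S, 0, 0, 2, 2))   # (1+2+4+5)/4 = 3.0
```

This answers the question above: the average over any axis-aligned rectangle costs four lookups, one add, two subtracts, and a divide, independent of the region's size or aspect ratio.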

Cost of Z-buffering

Z-buffering is the algorithm of choice for hardware rendering (today), so let's think about how to make it run as fast as possible. The steps involved in the Z-buffer algorithm are:
1. Send a triangle to the graphics hardware.
2. Transform the vertices of the triangle using the modeling matrix.
3. Transform the vertices using the projection matrix.
4. Set up for incremental rasterization calculations.
5. Rasterize (generate fragments = potential pixels).
6. Shade at each fragment.
7. Update the framebuffer according to z.

What is the overall cost of Z-buffering?

Cost of Z-buffering, cont'd

We can approximate the cost of this method as:

    k_bus * v_bus + k_xform * v_xform + k_setup * t + k_shade * d * m^2

where:
- k_bus = bus cost to send a vertex
- v_bus = number of vertices sent over the bus
- k_xform = cost of transforming a vertex
- v_xform = number of vertices transformed
- k_setup = cost of setting up for rasterization
- t = number of triangles being rasterized
- k_shade = cost of shading a fragment
- d = depth complexity (average number of times a pixel is covered)
- m^2 = number of pixels in the frame buffer
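Plugging illustrative constants into the cost model shows which term tends to dominate (every number below is made up for the example, not from the notes):

```python
def zbuffer_cost(k_bus, v_bus, k_xform, v_xform, k_setup, t, k_shade, d, m):
    """The cost model above, term by term."""
    return (k_bus * v_bus + k_xform * v_xform
            + k_setup * t + k_shade * d * m * m)

# 100k triangles (300k vertices), a 1024x1024 framebuffer, depth complexity 4.
cost = zbuffer_cost(k_bus=1, v_bus=300_000, k_xform=10, v_xform=300_000,
                    k_setup=50, t=100_000, k_shade=5, d=4, m=1024)
print(cost)
```

With these numbers the shading term k_shade * d * m^2 is about 21M of the roughly 29M total, which is why the acceleration tricks discussed next focus so heavily on reducing d (overdraw) and the number of shaded fragments.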

Visibility tricks for Z-buffers

Given this cost function:

    k_bus * v_bus + k_xform * v_xform + k_setup * t + k_shade * d * m^2

what can we do to accelerate Z-buffering? For each acceleration method, which of v_bus, v_xform, t, d, and m does it reduce?

Next class: Visual Perception

Topics:
- How does the human visual system work?
- How do humans perceive color?
- How do we represent color in computations?

Read:
- Glassner, Principles of Digital Image Synthesis, pp. 5-32. [Course reader pp. 1-28]
- Watt, Chapter 15.
- Brian Wandell, Foundations of Vision, Sinauer Associates, Sunderland, MA, pp. 45-50 and 69-97, 1995. [Course reader pp. 29-34 and pp. 35-63]