Chapter IV Fragment Processing and Output Merging
3D Graphics for Game Programming

Fragment Processing The per-fragment attributes may include a normal vector, a set of texture coordinates, a set of color values, a depth, and so on. Using these data, the fragment processing stage determines the final color of each fragment. The two most important tasks in the fragment processing stage are per-fragment lighting (presented in Chapter 5) and texturing (presented in Chapters 8, 9, and 10). Let's see what texturing is.
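The attribute list above can be pictured as a small per-fragment record. The struct below is only an illustrative sketch; the field names are not taken from any particular API.

    // Per-fragment data entering the fragment processing stage (illustrative only).
    struct Fragment {
        float normal[3];    // interpolated surface normal
        float uv[2];        // interpolated texture coordinates (u, v)
        float color[4];     // interpolated color (RGBA)
        float depth;        // z-value, consumed later by z-buffering
        int   x, y;         // screen-space position of the fragment
    };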

Texture Coordinates An image texture is a 2D array of texels (texture elements). Each texel has a unique address, i.e., a 2D array index. Let's map the texture onto a square in the parameter space and use normalized texture coordinates (u,v) instead of the texel address. Then, multiple images of different resolutions can be glued to a surface without changing the texture coordinates.

Surface Parameterization The texture coordinates are assigned to the vertices of the polygon mesh. This process is called surface parameterization or simply parameterization. In general, parameterization requires unfolding a 3D surface onto a 2D planar domain.
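For the simplest case, a flat quad, parameterization amounts to storing a (u,v) pair alongside each vertex so that the whole texture covers the quad. A minimal sketch, with an illustrative vertex layout:

    // Texture coordinates assigned per vertex (layout is illustrative only).
    struct Vertex {
        float position[3];
        float uv[2];        // normalized texture coordinates
    };

    // A quad parameterized so the full [0,1]x[0,1] texture maps onto it.
    Vertex quad[4] = {
        { { -1.0f, -1.0f, 0.0f }, { 0.0f, 0.0f } },
        { {  1.0f, -1.0f, 0.0f }, { 1.0f, 0.0f } },
        { {  1.0f,  1.0f, 0.0f }, { 1.0f, 1.0f } },
        { { -1.0f,  1.0f, 0.0f }, { 0.0f, 1.0f } },
    };

Curved surfaces require genuine unfolding, but the end result is the same: a (u,v) pair stored at each vertex.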

Chart and Atlas The surface to be textured is often subdivided into a (hopefully small) number of patches. Each patch is unfolded and parameterized. Then, the artist draws an image for each patch. An image for a patch is often called a chart. Multiple charts are usually packed and arranged in a texture, which is often called an atlas.

Texture Coordinates to Texel Address Scan conversion is done with the texture coordinates. Direct3D performs the following computation to convert the texture coordinates (u,v) into the texel address (tx, ty), where Sx × Sy is the resolution of the texture: tx = u × Sx − 0.5 and ty = v × Sy − 0.5.
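Written out as code, the conversion is a one-liner per axis. A sketch; Sx and Sy are the texture's width and height in texels:

    // Direct3D-style mapping from normalized texture coordinates to a texel address.
    void toTexelAddress(float u, float v, int Sx, int Sy, float& tx, float& ty) {
        tx = u * Sx - 0.5f;
        ty = v * Sy - 0.5f;
    }

For example, (u,v) = (0.5, 0.5) on a 256×256 texture maps to (127.5, 127.5), the exact center of the texel grid.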

Texturing Examples Observe that multiple images of different resolutions can be glued to a surface.

Texture Coordinates (revisited) Direct3D and OpenGL adopt different parameter spaces for texture coordinates. When a textured model is exported between Direct3D and OpenGL, the v-coordinates should be converted to (1 − v).
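The conversion is a single flip applied to every vertex's v-coordinate. A sketch reusing the Vertex struct from the parameterization example above; the function name is made up for illustration:

    #include <vector>

    // Flip v for every vertex when moving a model between Direct3D and OpenGL.
    void flipVForExport(std::vector<Vertex>& vertices) {
        for (Vertex& vtx : vertices)
            vtx.uv[1] = 1.0f - vtx.uv[1];   // v' = 1 - v
    }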

Texture Addressing Mode The texture coordinates (u,v) are not necessarily in the range of [0,1]. The texture addressing mode handles (u,v) values outside the range. Border: the out-of-range coordinates are rendered with a separately defined border color. Wrap, repeat, or tile: the texture is tiled at every integer junction. Mirror: the texture is mirrored or reflected at every integer junction, so we have a smooth transition at the boundaries.
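The wrap and mirror modes can be expressed as a remapping of the coordinate back into [0,1]; border mode does not remap but makes out-of-range lookups return the border color instead. A per-axis sketch of the remapping (an assumed illustration, not any API's actual implementation):

    #include <cmath>

    enum class AddressMode { Wrap, Mirror };

    float remap(float u, AddressMode mode) {
        if (mode == AddressMode::Wrap)
            return u - std::floor(u);               // tile at every integer junction
        float f = u - 2.0f * std::floor(u * 0.5f);  // Mirror: fold into [0, 2)
        return (f <= 1.0f) ? f : 2.0f - f;          // reflect the upper half
    }

For example, u = 1.3 remaps to 0.3 under wrap and to 0.7 under mirror.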

Texture Filtering: Magnification & Minification Consider a quad. For each fragment located at (x,y) on the screen, its texture coordinates (u,v) are mapped to the texel address (tx, ty). We say that the fragment at (x,y) is projected onto (tx, ty). Note that (tx, ty) are floating-point values in almost all cases. Consequently, texels around (tx, ty) are sampled. This sampling process is called texture filtering. The screen-space quad may appear larger than the image texture, and therefore the texture is magnified so as to fit the quad. There are more pixels than texels.

Texture Filtering: Magnification & Minification (cont'd) In contrast, the screen-space quad may appear smaller than the image texture, and the texture is minified. The pixels are sparsely projected onto the texture space. Summary: magnification means more pixels than texels; minification means fewer pixels than texels.

Filtering for Magnification Option 1: Nearest point sampling. A block of pixels can be mapped to a single texel. Consequently, adjacent pixel blocks can change abruptly from one texel to the next, and a blocky image is often produced.

Filtering for Magnification (cont'd) Option 2: Bilinear interpolation. It is preferred over nearest point sampling, not only because the result suffers much less from the blocky-image problem but also because graphics hardware is usually optimized for bilinear interpolation.
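In software terms, bilinear interpolation samples the four texels surrounding (tx, ty) and blends them by the fractional offsets. A minimal sketch, assuming an RGB float texture stored row by row; the names and storage layout are illustrative:

    #include <algorithm>
    #include <cmath>
    #include <vector>

    struct Color3 { float r, g, b; };

    Color3 lerp(const Color3& a, const Color3& b, float t) {
        return { a.r + t * (b.r - a.r), a.g + t * (b.g - a.g), a.b + t * (b.b - a.b) };
    }

    Color3 sampleBilinear(const std::vector<Color3>& texels, int Sx, int Sy, float u, float v) {
        float tx = u * Sx - 0.5f, ty = v * Sy - 0.5f;   // texel address
        int x0 = (int)std::floor(tx), y0 = (int)std::floor(ty);
        float fx = tx - x0, fy = ty - y0;               // fractional offsets
        auto at = [&](int x, int y) {                   // clamp at the texture edges
            x = std::clamp(x, 0, Sx - 1);
            y = std::clamp(y, 0, Sy - 1);
            return texels[y * Sx + x];
        };
        Color3 top    = lerp(at(x0, y0),     at(x0 + 1, y0),     fx);
        Color3 bottom = lerp(at(x0, y0 + 1), at(x0 + 1, y0 + 1), fx);
        return lerp(top, bottom, fy);                   // blend the two rows
    }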

Filtering for Minification Consider a checker-board image texture of alternating dark-gray and light-gray texels. If all pixels happen to be surrounded by dark-gray texels, the textured primitive appears dark gray; if all pixels are surrounded by light-gray texels, it appears light gray.

Z-buffering The output of the fragment program is often called the RGBAZ fragment: A (alpha) represents the fragment's opacity, and Z (depth) is used for z-buffering. Using the alpha and depth values, the fragment competes or is merged with the pixel of the color buffer. The z-buffer has the same resolution as the color buffer and records the z-values of the pixels currently stored in the color buffer. When a fragment at (x,y) is passed from the fragment program, its z-value is compared with the z-buffer value at (x,y). If the fragment has a smaller z-value, its color and z-value are used to update the color buffer and z-buffer at (x,y), respectively. Otherwise, the fragment is discarded.
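The comparison described above is just a per-pixel test against the stored depth. A minimal sketch with illustrative buffer names:

    #include <vector>

    struct FrameBuffers {
        int width = 0, height = 0;
        std::vector<unsigned int> color;   // color buffer (one RGBA value per pixel)
        std::vector<float>        zbuf;    // z-buffer, initialized to the maximum depth
    };

    void mergeFragment(FrameBuffers& fb, int x, int y, unsigned int rgba, float z) {
        int idx = y * fb.width + x;
        if (z < fb.zbuf[idx]) {            // the fragment is closer than the stored pixel
            fb.color[idx] = rgba;          // update the color buffer
            fb.zbuf[idx]  = z;             // and the z-buffer
        }                                  // otherwise the fragment is discarded
    }

Because a fragment survives only when its depth is smaller than what is already stored, the final image does not depend on the order in which opaque triangles are rendered.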

Z-buffering (cont'd) Assume MaxZ is 1.0, the red triangle's depth is 0.8, and the blue triangle's is 0.5.

Z-buffering (cont'd) Rendering-order independence!!

Ed Catmull Founder of Pixar and now president of Walt Disney Animation Studios and Pixar Animation Studios. Academy awards: 1993 Academy Scientific and Technical Award "for the development of PhotoRealistic RenderMan software which produces images used in motion pictures from 3D computer descriptions of shape and appearance"; 1996 Academy Scientific and Technical Award "for pioneering inventions in Digital Image Compositing"; 2001 Oscar "for significant advancements to the field of motion picture rendering as exemplified in Pixar's RenderMan"; 2008 Gordon E. Sawyer Award for an individual in the motion picture industry whose technological contributions have brought credit to the industry. Ed Catmull's work includes texture mapping, z-buffering, bicubic patches, and subdivision surfaces.

Alpha Blending The alpha channel value is in the range of [0,1]: 0 denotes fully transparent, and 1 denotes fully opaque. A typical blending equation is c = α·cf + (1 − α)·cp, where α is the fragment's opacity, cf is the fragment color, and cp is the pixel color currently stored in the color buffer. For alpha blending, the primitives cannot be rendered in an arbitrary order. They must be rendered after all opaque primitives, and in back-to-front order. Therefore, the partially transparent objects should be sorted.
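Written out per color channel, the equation above becomes the small function below, reusing the Color3 struct from the bilinear sketch; this is a sketch, not any API's definition:

    // c = alpha * cf + (1 - alpha) * cp, applied to each of R, G, and B.
    Color3 blend(const Color3& cf, float alpha, const Color3& cp) {
        return { alpha * cf.r + (1.0f - alpha) * cp.r,
                 alpha * cf.g + (1.0f - alpha) * cp.g,
                 alpha * cf.b + (1.0f - alpha) * cp.b };
    }

This is, for example, the behavior configured in OpenGL by glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA).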

Z-culling The z-buffering test is done after the fragment processing stage. When the z-value of the fragment is greater than the buffered value, the fragment is determined to be occluded and thus discarded. This is inefficient: such a fragment is fully processed only to be thrown away. Let's kill the fragments before the fragment processing stage if possible. A tile is composed of n×n pixels and records the maximum among their z-values. Let's compare it with the minimum z-coordinate of the three vertices of a triangle.
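The tile test amounts to one comparison per tile the triangle overlaps: if the triangle's nearest vertex is still farther than everything stored in the tile, no fragment of the triangle inside that tile can survive z-buffering. A sketch with illustrative names, not an actual GPU interface:

    // Each n x n tile of the z-buffer caches the maximum of its pixels' z-values.
    struct Tile { float maxZ; };

    // Returns true if every fragment the triangle could produce in this tile is
    // guaranteed to be occluded, so those fragments can be killed before
    // fragment processing.
    bool zCullTriangleInTile(const Tile& tile, float triMinZ) {
        return triMinZ > tile.maxZ;
    }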

Z-culling (cont'd) Z-culling is powerful and is enabled by default. Consider a scene where the first object occludes almost all triangles of the other objects. In a test, the average frame rate with the objects ordered front to back was 12.75 fps, whereas that with the objects ordered back to front was 2.71 fps. This shows that the triangles may need to be sorted in front-to-back order to maximize the performance increase brought by z-culling.

Pre-Z Pass Two-pass algorithm. 1st pass: the scene is rendered with no shading, e.g., no lighting and no texturing. The color buffer is not filled, but the z-buffer is filled with the depths of the visible surfaces of the scene. 2nd pass: the scene is rendered with full shading. Then, z-culling may cull all fragments that are occluded by the visible surfaces, so no occluded fragment enters the fragment processing stage. The pre-z pass algorithm, even with back-to-front ordered objects, outperforms single-pass front-to-back rendering.
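As a concrete illustration, the two passes map naturally onto depth-test and write-mask state. A minimal OpenGL-style sketch; drawScene is an assumed application call, and the depth-only first pass would use a trivial shader:

    #include <GL/gl.h>

    void drawScene(bool fullShading);   // application draw call (assumed to exist)

    void renderWithPreZPass() {
        // 1st pass: fill only the z-buffer with the depths of the visible surfaces.
        glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);   // no color writes
        glDepthMask(GL_TRUE);
        glDepthFunc(GL_LESS);
        drawScene(false);                                      // no lighting, no texturing

        // 2nd pass: full shading; occluded fragments fail the depth test and are culled.
        glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
        glDepthMask(GL_FALSE);                                 // depths are already final
        glDepthFunc(GL_LEQUAL);                                // visible fragments match stored depths
        drawScene(true);                                       // full lighting and texturing
    }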