Rasterization Overview


Rendering Overview
- The process of generating an image given a virtual camera, objects, and light sources
- Various techniques: rasterization (topic of this course), raytracing (topic of the course Advanced Computer Graphics)
- Major research topics in computer graphics: rendering, animation, geometry processing

Rasterization Overview
- Rendering algorithm for generating 2D images from 3D scenes
- Transforms geometric primitives such as lines and polygons into a raster image
[Akenine-Moeller et al.: Real-time Rendering]

Model Data
- 3D objects are approximately represented by points, lines, and polygons
- These primitives are processed to obtain a 2D image
[Akenine-Moeller]

Rendering Pipeline Overview
- Processing stages comprise the rendering pipeline
- Supported by commodity graphics hardware: the GPU (graphics processing unit) implements a rasterization-based rendering pipeline
- OpenGL is a software interface to this hardware
- This course focuses on the concepts of the rendering pipeline and assumes OpenGL in pipeline details

Rendering Pipeline - Task
- 3D input
  - a virtual camera: position, orientation, focal length
  - objects: points, lines, polygons with geometry and material properties (position, normal, color, texture coordinates)
  - light sources: direction, position, color, intensity
  - textures (images)
- 2D output
  - per-pixel color values in the framebuffer
Scene Description → Vertex Processing → Primitive Processing → Rasterization → Fragment Processing → Display

Rendering Pipeline - Some Functionality
- Resolving visibility
- Evaluating a lighting model
- Computing shadows (not core functionality)
- Applying textures
[Wright et al.: OpenGL SuperBible]

Rendering Pipeline - Main Stages
- Vertex processing: processes all vertices independently; transformations per vertex
- Primitive assembly: points, lines, triangles
- Rasterization: converts primitives into a raster image; generates fragments
Scene Description → Vertex Processing → Primitive Processing → Rasterization → Fragment Processing → Display

Rendering Pipeline - Main Stages
- Vertex position transform
- Primitive assembly: combine vertices into polygons
- Rasterization: computes pixel positions
- Fragment generation with interpolated attributes, e.g. color
[Lighthouse 3D]

Rendering Pipeline - Main Stages + lighting [Lighthouse 3D]

Vertex Processing - Overview
- Input: vertices in object space (model space)
  - 3D positions
  - attributes such as normal, material properties, texture coordinates
- Output: vertices in window space
  - 2D pixel positions
  - attributes such as normal, material properties, texture coordinates
  - additional or updated attributes such as normalized depth (distance to the viewer) and color (result of the evaluation of the lighting model)

Vertex Processing - Overview
object space → (modelview transform) → eye space / camera space → (lighting, projection) → clip space / normalized device coordinates → (clipping, viewport transform) → window space

Vertex Processing (Geometry Stage)
model transform → view transform → lighting → projection transform → clipping → viewport transform

Model Transform, View Transform
- Each object and its vertices are positioned, oriented, and scaled in the scene with a model transform
- The camera is positioned and oriented in the scene, represented by the view transform
- The inverse view transform is the transform that places the camera at the origin of the coordinate system, facing in the negative z-direction
- The entire scene is transformed with the inverse view transform

Model Transform, View Transform
[Akenine-Moeller et al.: Real-time Rendering]
- M1, M2, M3, M4, V are matrices representing transformations
- M1, M2, M3, M4 are model transforms that place the objects in the scene
- V places and orients the camera in space
- V^-1 transforms the camera to the origin, looking along the negative z-axis
- Model and view transforms are combined in the modelview transform
- The modelview transform V^-1 M1..4 is applied to the objects
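As a minimal sketch of composing the modelview transform V^-1 M and applying it to a vertex (plain Python rather than OpenGL; the helper names and the concrete translations are made up for illustration):

```python
def mat_mul(A, B):
    # 4x4 matrix product (row-major lists of lists)
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def transform(M, p):
    # apply a 4x4 matrix to a 3D point (homogeneous w = 1)
    x, y, z = p
    v = [x, y, z, 1.0]
    r = [sum(M[i][k] * v[k] for k in range(4)) for i in range(4)]
    return (r[0] / r[3], r[1] / r[3], r[2] / r[3])

def translation(tx, ty, tz):
    return [[1, 0, 0, tx], [0, 1, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

# model transform M1: place the object at (2, 0, 0) in the scene
M1 = translation(2, 0, 0)
# the camera sits at (0, 0, 5); the inverse view transform V^-1
# moves the whole scene so the camera ends up at the origin
V_inv = translation(0, 0, -5)
# modelview transform V^-1 * M1, applied to an object-space vertex
MV = mat_mul(V_inv, M1)
print(transform(MV, (0, 0, 0)))  # -> (2.0, 0.0, -5.0)
```

The object ends up 5 units in front of the camera, along the negative z-axis, as expected.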

Lighting
- The interaction of light sources and surfaces is represented with a lighting / illumination model
- Lighting computes a color for each vertex
  - based on light source positions and properties
  - based on the transformed position, the transformed normal, and the material properties of the vertex
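A Lambertian (diffuse) term is one common ingredient of such a per-vertex lighting computation; the sketch below is illustrative only (function names are made up) and ignores ambient and specular contributions:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def diffuse_lighting(position, normal, light_pos, light_color, material):
    # Lambertian diffuse term max(0, n . l), evaluated once per vertex
    l = normalize(tuple(lp - p for lp, p in zip(light_pos, position)))
    n = normalize(normal)
    intensity = max(0.0, dot(n, l))
    return tuple(m * c * intensity for m, c in zip(material, light_color))

# red vertex facing a white light head-on -> fully lit
print(diffuse_lighting((0, 0, 0), (0, 0, 1), (0, 0, 10), (1, 1, 1), (1, 0, 0)))
# -> (1.0, 0.0, 0.0)
```

Tilting the normal away from the light reduces the intensity, and a normal facing away yields black, which is exactly the max(0, n . l) clamp.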

Projection Transform
- Transforms the view volume into the canonical view volume
- The view volume depends on the camera properties
  - orthographic projection: cuboid
  - perspective projection: pyramidal frustum
- The canonical view volume is a cube from (-1,-1,-1) to (1,1,1)
- The view volume is specified by near, far, left, right, bottom, top
[Song Ho Ahn]

Projection Transform
- The view volume (cuboid or frustum) is transformed into a cube (the canonical view volume)
- Objects inside (and outside) the view volume are transformed accordingly
- Orthographic: a combination of translation and scaling; all objects are translated and scaled in the same way
- Perspective: a more complex transformation; the scaling factor depends on the distance of an object to the viewer, so objects farther away from the camera appear smaller
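The two projection matrices can be sketched in OpenGL-style conventions (row-major Python lists here; the function names are made up for illustration):

```python
def project(M, p):
    # apply a 4x4 matrix (row-major) and perform the homogeneous divide
    x, y, z = p
    v = [x, y, z, 1.0]
    c = [sum(M[i][k] * v[k] for k in range(4)) for i in range(4)]
    return tuple(c[i] / c[3] for i in range(3))

def orthographic(l, r, b, t, n, f):
    # translation and scaling: maps the cuboid [l,r] x [b,t] x [-n,-f]
    # to the canonical view volume [-1,1]^3
    return [[2 / (r - l), 0, 0, -(r + l) / (r - l)],
            [0, 2 / (t - b), 0, -(t + b) / (t - b)],
            [0, 0, -2 / (f - n), -(f + n) / (f - n)],
            [0, 0, 0, 1]]

def perspective(l, r, b, t, n, f):
    # maps the pyramidal frustum to the canonical view volume;
    # the divide by w = -z_eye shrinks distant objects
    return [[2 * n / (r - l), 0, (r + l) / (r - l), 0],
            [0, 2 * n / (t - b), (t + b) / (t - b), 0],
            [0, 0, -(f + n) / (f - n), -2 * f * n / (f - n)],
            [0, 0, -1, 0]]

P = perspective(-1, 1, -1, 1, 1, 3)
print(project(P, (0, 0, -1)))  # near plane -> z = -1.0
print(project(P, (0, 0, -3)))  # far plane  -> z = 1.0
```

In the perspective case the same eye-space offset from the axis yields a smaller normalized coordinate at the far plane than at the near plane, which is where the "farther objects appear smaller" behavior comes from.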

Clipping
- Primitives that intersect the boundary of the view volume are clipped
- Primitives that are inside are passed to the next processing stage
- Primitives that are outside are discarded
- Clipping deletes and generates vertices and primitives
[Akenine-Moeller et al.: Real-time Rendering]

Viewport Transform - Screen Mapping
- Projected primitive coordinates (x_p, y_p, z_p) are transformed to screen coordinates (x_s, y_s)
- Screen coordinates together with the depth value form the window coordinates (x_s, y_s, z_w)
[Akenine-Moeller et al.: Real-time Rendering]

Viewport Transform - Screen Mapping
- (x_p, y_p) are translated and scaled from the range (-1, 1) to actual pixel positions (x_s, y_s) on the display
- z_p is generally translated and scaled from the range (-1, 1) to (0, 1), yielding z_w
- Screen coordinates (x_s, y_s) represent the pixel position of a fragment that is generated in a subsequent step
- z_w, the depth value, is an attribute of this fragment used for further processing
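This mapping is a plain translate-and-scale; a minimal sketch, assuming the viewport covers the full window (the function name is illustrative):

```python
def viewport_transform(x_p, y_p, z_p, width, height):
    # translate and scale (x_p, y_p) from [-1, 1] to pixel positions,
    # and z_p from [-1, 1] to the depth range [0, 1]
    x_s = (x_p + 1) * 0.5 * width
    y_s = (y_p + 1) * 0.5 * height
    z_w = (z_p + 1) * 0.5
    return x_s, y_s, z_w

# the center of the canonical view volume maps to the screen center
print(viewport_transform(0, 0, 0, 640, 480))  # -> (320.0, 240.0, 0.5)
```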

Rasterization - Overview
- Input: vertices with attributes and connectivity information
  - attributes: color, depth, texture coordinates
- Output: fragments with attributes
  - pixel position
  - interpolated color, depth, texture coordinates
[Akenine-Moeller]
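One common way to generate these fragments is edge-function rasterization: test every pixel center against the three triangle edges and interpolate vertex attributes with the resulting barycentric weights. A minimal sketch (made-up helper names, one scalar attribute per vertex, counter-clockwise winding assumed):

```python
def edge(ax, ay, bx, by, px, py):
    # twice the signed area of triangle (a, b, p);
    # >= 0 if p lies to the left of (or on) the edge a -> b
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize(v0, v1, v2, width, height):
    # v = (x, y, attribute); emits fragments with barycentrically
    # interpolated attributes for every covered pixel center
    area = edge(v0[0], v0[1], v1[0], v1[1], v2[0], v2[1])
    fragments = []
    for y in range(height):
        for x in range(width):
            px, py = x + 0.5, y + 0.5
            w0 = edge(v1[0], v1[1], v2[0], v2[1], px, py)
            w1 = edge(v2[0], v2[1], v0[0], v0[1], px, py)
            w2 = edge(v0[0], v0[1], v1[0], v1[1], px, py)
            if w0 >= 0 and w1 >= 0 and w2 >= 0:   # inside the triangle
                a = (w0 * v0[2] + w1 * v1[2] + w2 * v2[2]) / area
                fragments.append((x, y, a))
    return fragments

# counter-clockwise triangle with one attribute (e.g. red channel) per vertex
frags = rasterize((0, 0, 1.0), (4, 0, 0.0), (0, 4, 0.0), 5, 5)
print(len(frags))   # 10 covered pixels
print(frags[0])     # (0, 0, 0.75): attribute interpolated at the pixel center
```

The same weights w0, w1, w2 would interpolate color, depth, and texture coordinates alike; GPUs evaluate the edge functions incrementally rather than per pixel from scratch.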

Illustration
vertices with connectivity → fragments (pixel candidates) → fragment processing → final image (pixels)

Fragment Processing - Overview
- Fragment attributes are processed and tests are performed
  - fragments are discarded, or
  - fragments pass a test and finally update the framebuffer
- Processing and testing make use of
  - fragment attributes
  - textures
  - framebuffer data that is available for each pixel position: depth buffer, color buffer, stencil buffer, accumulation buffer

Illustration
Vertex (color, depth) + connectivity → Rasterization → Fragment (color, depth) → Framebuffer (color, depth) → final image
Additional data: texture (color); the vertices are the points of a triangle, the fragments its pixel candidates

Depth Test
- Compares the depth value of the fragment with the depth value stored in the framebuffer at the position of the fragment
- Used for resolving visibility
- If the depth value of the fragment is larger than the framebuffer depth value, the fragment is discarded
- If the depth value of the fragment is smaller than the framebuffer depth value, the fragment passes and (potentially) overwrites the current color and depth values in the framebuffer

Depth Test
current framebuffer + incoming fragments (tri 1) → updated framebuffer
current framebuffer + incoming fragments (tri 2) → updated framebuffer
[Wikipedia]
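A minimal sketch of the test itself (illustrative data structures; a real depth buffer lives in GPU memory and the comparison function is configurable):

```python
def depth_test(framebuffer, depthbuffer, fragment):
    # fragment = (x, y, color, depth); smaller depth means closer to the viewer
    x, y, color, depth = fragment
    if depth < depthbuffer[y][x]:      # fragment passes the test
        depthbuffer[y][x] = depth      # update the depth value
        framebuffer[y][x] = color      # overwrite the pixel color
        return True
    return False                       # fragment is discarded

# 2x2 framebuffer cleared to a background color and the far depth 1.0
fb = [["bg", "bg"], ["bg", "bg"]]
db = [[1.0, 1.0], [1.0, 1.0]]
depth_test(fb, db, (0, 0, "red", 0.7))   # passes: 0.7 < 1.0
depth_test(fb, db, (0, 0, "blue", 0.9))  # discarded: 0.9 > 0.7
print(fb[0][0], db[0][0])  # red 0.7
```

Note that the result is independent of the order in which the two fragments arrive, which is exactly why the depth test resolves visibility without sorting the primitives.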

Rendering Pipeline - Summary
Scene Description → Vertex Processing → Primitive Processing → Rasterization → Fragment Processing → Display

Rendering Pipeline - Summary
- Primitives consist of vertices
- Vertices have attributes (color, depth, texture coordinates)
- Vertices are transformed and lit
- Primitives are rasterized into fragments with interpolated attributes
- Fragments are processed using
  - their attributes such as color, depth, texture coordinates
  - texture data / image data
  - framebuffer data / data per pixel position (color, depth, stencil, accumulation)
- If a fragment passes all tests, it replaces the pixel data in the framebuffer