3D Rendering Pipeline


3D Rendering Pipeline
References: Real-Time Rendering, 3rd Edition, Chapters 2-4; OpenGL SuperBible, 6th Edition

Overview
- Rendering Pipeline
- Modern CG Inside a Desktop
- Architecture
- Shaders
- Tools Stage
- Asset Conditioning Stage
- The Application Stage
- The Geometry Stage
- The Rasterizer Stage

https://youtu.be/gxi0l3yqbra


Fundamentals - Rendering Pipeline
- The graphics rendering pipeline generates/renders a 2D image given a virtual camera, 3D objects, light sources, shading equations, textures, etc.
- Similar to mechanical pipelines such as ski lifts and car assembly lines
- Dividing a non-pipelined system into n stages can speed it up by up to a factor of n
- The stages of the pipeline are executed in parallel
- Throughput is limited by the slowest stage (the bottleneck)

Fundamentals - From 3D to 2D
A high-level view of our pipeline, hardware and application:
Assets (3D models, textures) feed the application; input devices (mouse, keyboard, tablet) feed the processor; the processor drives the graphics processor and memory, which fill the frame buffer shown on the display device.
(From Chapter 1, Interactive Computer Graphics, 5th Edition)
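The speed-up and bottleneck claims above can be illustrated with a toy throughput model (the stage timings below are hypothetical, purely for illustration):

```python
# Toy model of pipeline throughput: once a pipeline is full, it emits one
# result per "tick", where a tick lasts as long as the slowest stage.
def pipeline_rate(stage_times_ms):
    bottleneck = max(stage_times_ms)      # the slowest stage limits the pipeline
    return 1000.0 / bottleneck            # results per second

def serial_rate(stage_times_ms):
    return 1000.0 / sum(stage_times_ms)   # non-pipelined: stages run back-to-back

stages = [2.0, 5.0, 3.0]      # hypothetical ms per stage: app, geometry, rasterizer
print(pipeline_rate(stages))  # 200.0 frames/s, limited by the 5 ms stage
print(serial_rate(stages))    # 100.0 frames/s without pipelining
```

Note that shaving time off the 2 ms or 3 ms stage changes nothing: only the bottleneck stage matters.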

Architecture
- Tools Stage (offline): creation of assets through authoring tools
- Asset Conditioning Stage (offline): conversion of assets to a compatible format
- Application Stage: software running on the CPU (drawing API calls, collision detection, animation, physics, etc.)
- Geometry Stage: geometry transformations/projections; what is drawn, where it is drawn, and whether it should be drawn (typically on the GPU)
- Rasterizer Stage: draws the image using the generated data and per-pixel operations (completely on the GPU)

Controlling the Pipeline
We need a way to control this system! It requires communication with the graphics hardware.
- Tools Stage (offline): through authoring tools
- Asset Conditioning Stage (offline): through exporters / converters
- Application Stage: through a programming language and Application Programming Interfaces (APIs)
- Geometry Stage: through vertex, tessellation, and geometry shaders and various state settings
- Rasterizer Stage: through fragment shaders and various state settings

[Diagrams: Fixed Pipeline vs. Programmable Pipeline, from http://www.khronos.org/opengles/2_x/]

Controlling the Pipeline
API (Application Programming Interface): the programmer uses an interface to control the graphics subsystem
Graphics subsystems can be:
- Workstation (Quadro card)
- Desktop or laptop (GeForce / Radeon)
- Console system (custom graphics chipset)
- Mobile phone or tablet (Tegra / Intel)
A standardized subsystem interface increases portability and lets developers ignore the platform (to a point)

Controlling the Pipeline
The goal of any interface: provide an abstraction layer that separates the application from the graphics subsystem. Your app does not need to know the hardware.
APIs try to strike a balance in abstraction level:
- Game engine: high abstraction; potentially needs significant rewrites to work for purposes outside games
- Console games: low abstraction allows designers to get maximum performance, but does not port well between consoles. First-generation games are written while developers are still unfamiliar with the hardware, and it takes time to understand it

Controlling the Pipeline
- The application invokes commands, which a driver converts into commands for the underlying graphics hardware
- The hardware works on the commands as efficiently and quickly as possible
- Commands are queued / partially completed
- Multiple stages can be processed in parallel
- Many commands are repetitive tasks (vertex or pixel commands) and independent of one another

Controlling the Pipeline
- GPUs consist of large numbers of small programmable processors (shader cores)
- Cores run small programs called shaders
- Individually, cores are slow and lack advanced processor features
- A GPU can contain hundreds to thousands of cores to perform large amounts of work

The Six Types of Shaders
- Vertex Shader: enables operations to be performed per vertex
- Tessellation Control Shader: determines the level of tessellation and generates data for the tessellation engine
- Tessellation Evaluation Shader: operates on the output vertices of the tessellation engine
- Geometry Shader: allows the creation and destruction of geometric primitives (points, lines, triangles) at run time
- Pixel Shader: enables operations to be performed per fragment
- Compute Shader: an independent pipeline that operates on work items

Effects
- Shaders require matching incoming and outgoing data between the stages; shaders do not exist in a vacuum
- The various shader stages can be utilized to create a myriad of effects
- Effects ranging from non-realistic to hyper-realistic are possible with shaders

Fundamentals - Tools Stage
- Artists use digital content creation (DCC) applications to produce content without needing to understand how the pipeline works
- Maya in this course; free student edition: http://www.autodesk.com/education/student-software
- Utilize an image editing program such as Adobe Photoshop or GIMP to create 2D images
- You can use the Rowan Cloud's copy: www.rowan.edu/cloud
- Tools in this stage are expected to be easy to use and reliable so that non-technical people can get their work done!

Fundamentals - Asset Conditioning Stage
DCC data is usually far more complex than the game engine can accept:
- Graph structure
- History of edits
- Animation controls
- Application-specific features
The application's data format is usually a closed design
We will utilize the FBX format to convert models from Maya to Unity in this course

The Geometry Stage
Moving to the GPU! The Geometry Stage (conceptual) is divided into several functional stages.

The Geometry Stage - The Concept
Primary purpose: get your objects to the proper place
- An object is created with local coordinates
- It must be converted into world coordinates
- Then oriented according to the view and distorted by the projection
- Finally, the 3D world must be flattened into a 2D image (screen-space coordinates)

The Geometry Stage
[Figures: local coordinates vs. world coordinates]

The Geometry Stage - Model & World Transform
A model transitions between many spaces, or coordinate systems
A model begins in Local Space, or Model Space
- The origin of the model is situated in a convenient place for the artist/programmer
- TIP: From here on out, remember this to keep your axes straight: XYZ = RGB

The Geometry Stage - Model & World Transform
This model needs a model transform
- It dictates the necessary transformations (translate, rotate, scale) to move it from model space into world space.
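The translate/rotate/scale composition above can be sketched with plain 4x4 matrices. This is a minimal pure-Python illustration, not any particular API: it assumes column vectors and the convention world = T * R * S * local.

```python
import math

# Minimal 4x4 matrix helpers (column-vector convention).
def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translate(tx, ty, tz):
    return [[1,0,0,tx],[0,1,0,ty],[0,0,1,tz],[0,0,0,1]]

def scale(sx, sy, sz):
    return [[sx,0,0,0],[0,sy,0,0],[0,0,sz,0],[0,0,0,1]]

def rotate_y(deg):
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c,0,s,0],[0,1,0,0],[-s,0,c,0],[0,0,0,1]]

def transform(m, p):                      # p is (x, y, z), treated as (x, y, z, 1)
    v = [p[0], p[1], p[2], 1.0]
    return tuple(sum(m[i][j] * v[j] for j in range(4)) for i in range(3))

# Model transform: scale first, then rotate, then translate into world space.
model = mat_mul(translate(10, 0, 0), mat_mul(rotate_y(90), scale(2, 2, 2)))
p = transform(model, (1, 0, 0))
print(p)  # close to (10, 0, -2): +X scaled to length 2, rotated onto -Z, translated
```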

The Geometry Stage - Model & World Transform
A single model may have multiple model transforms
- We call the model data a mesh
- Each reference to that model data is an instance
- This allows us to have multiple copies of the same model without duplicating the base geometry (saves memory)

Local vs World
In Unity, we can affect an object through either its local or world coordinates
- If you modify the position or rotation, you are doing so in World Space
- If you modify the local position or local rotation, you do so with respect to the parent's space
Let's create a set of planets with carefully set up pivot points and groups
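The mesh/instance distinction can be sketched like this. The Mesh and Instance classes are hypothetical, for illustration only; a real engine stores the mesh data once (on the GPU) plus a full per-instance transform, simplified here to a translation.

```python
# Instancing sketch: one shared mesh, many instances that only store a transform.
class Mesh:
    def __init__(self, vertices):
        self.vertices = vertices          # shared geometry, stored once

class Instance:
    def __init__(self, mesh, position):
        self.mesh = mesh                  # a reference, not a copy
        self.position = position          # per-instance transform (translation only)

    def world_vertices(self):
        px, py, pz = self.position
        return [(x + px, y + py, z + pz) for (x, y, z) in self.mesh.vertices]

tree = Mesh([(0, 0, 0), (0, 1, 0)])
forest = [Instance(tree, (x * 5.0, 0.0, 0.0)) for x in range(3)]
assert all(inst.mesh is tree for inst in forest)   # the geometry is never duplicated
print(forest[2].world_vertices())  # [(10.0, 0.0, 0.0), (10.0, 1.0, 0.0)]
```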

Local vs World
- Create the following hierarchy
- Download textures of the Earth, Moon, and Sun
- Set up materials and assign one to each object
- The pivot objects are empty game objects meant only to provide a pivot point to offset about

The Geometry Stage - Model & World Transform
A model is composed of vertices and normals
- Vertices can be thought of as points that dictate the shape of the object
- Normals are utilized primarily in lighting to determine the amount of illumination on a surface
- Vertices and normals are transformed inside the pipeline in the vertex shader

The Geometry Stage
[Figure: view frustum - eye coordinates and projection coordinates]

Unity Camera
Our camera component defines several important items:
- How the screen is cleared and with what color
- The projection type
- The near and far planes, which can impact the visuals of your scene in several ways

The Geometry Stage - World & View Transform
After world space, we transform into view space
- The visible field is called the View Frustum
- The purpose of the view transform is to orient the entire world so that the camera faces down the Z axis (DirectX looks down +Z, OpenGL down -Z)

The Geometry Stage - World & View Transform
"There is no spoon" - The Matrix
The reality behind the camera is that there is no camera
- While we can create a camera class that operates like a real-life camera, it is still a mathematical abstraction
- It does not need to follow real-world rules
- We don't move a camera; we move the world in the opposite direction

The Geometry Stage - World & View Transform
[Figure: transforming the world into view space]

The Geometry Stage - View Transform
To transform all of the objects from world space to camera / eye / view space, we apply the inverse of the camera's transformations: translate by the negated camera position, then rotate by the inverse (transposed) camera rotation.
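A minimal sketch of "moving the world instead of the camera", assuming a camera that only translates and yaws: apply the negated translation, then the negated rotation.

```python
import math

# View-transform sketch: the "camera" is just the inverse of the camera's
# world transform applied to every point (translation + Y-axis yaw assumed).
def view_transform(cam_pos, cam_yaw_deg, p):
    # 1. Translate the world by the negated camera position.
    x = p[0] - cam_pos[0]
    y = p[1] - cam_pos[1]
    z = p[2] - cam_pos[2]
    # 2. Rotate by the negated (inverse) camera yaw.
    c = math.cos(math.radians(-cam_yaw_deg))
    s = math.sin(math.radians(-cam_yaw_deg))
    return (c * x + s * z, y, -s * x + c * z)

# A camera at (0, 0, 5) looking down -Z sees the origin 5 units ahead of it:
print(view_transform((0, 0, 5), 0.0, (0, 0, 0)))  # the point lands on -Z in view space
```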

The Geometry Stage - Projections
- Our geometry still exists as 3D coordinates
- We convert the 3D points into homogeneous coordinates
- Two projection types: Orthographic Projection and Perspective Projection
- Projection involves a volume: orthographic uses a rectangular box, perspective a frustum
- All points are normalized into the volume between [-1,-1,-1] and [1,1,1]

The Geometry Stage - Orthographic Projection
An orthographic view keeps lines parallel after being transformed
Easiest method: disregard the z value through the following transform:

        | 1 0 0 0 |
  P_o = | 0 1 0 0 |
        | 0 0 0 0 |
        | 0 0 0 1 |

The Geometry Stage - Orthographic Projection
This form of orthographic projection is non-invertible (its determinant is zero)
Once we step down from 3D to 2D, the data is lost!

The Geometry Stage - Orthographic Projection
The final cube is called the canonical view volume
Its coordinates are referred to as normalized device coordinates

The Geometry Stage - Orthographic Projection
A canonical view volume is created through a translate and a scale transform:

                     | 2/(r-l)    0        0      0 |   | 1 0 0 -(l+r)/2 |
  P_o = S(s)T(t) =   |    0    2/(t-b)     0      0 | * | 0 1 0 -(t+b)/2 |
                     |    0       0     2/(f-n)   0 |   | 0 0 1 -(f+n)/2 |
                     |    0       0        0      1 |   | 0 0 0     1    |

The Geometry Stage - Perspective Projection
In a perspective projection, parallel lines converge
- Meant to represent our own way of viewing the world
- The farther away an object is, the smaller it appears
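The scale-and-translate matrix above collapses into a simple per-axis formula; here is a sketch mapping points of an axis-aligned view box into the canonical view volume:

```python
# Orthographic projection sketch: map an axis-aligned view box
# [l,r] x [b,t] x [n,f] into the canonical view volume [-1,1]^3,
# using the scale-and-translate form of P_o shown above.
def ortho_project(p, l, r, b, t, n, f):
    x, y, z = p
    return (
        (2.0 * x - (l + r)) / (r - l),
        (2.0 * y - (b + t)) / (t - b),
        (2.0 * z - (n + f)) / (f - n),
    )

# The center of the box maps to the origin; a corner maps to a cube corner:
print(ortho_project((0, 0, 0), -2, 2, -1, 1, -10, 10))   # (0.0, 0.0, 0.0)
print(ortho_project((2, 1, 10), -2, 2, -1, 1, -10, 10))  # (1.0, 1.0, 1.0)
```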

The Geometry Stage - Projection Transform
[Figure: the view volume as seen from the side of the camera]

The Geometry Stage - Projection Transform
[Figure: a scene waiting to be projected into 2D space]

The Geometry Stage - Projection Transform
[Figure: an example of an orthographic view transformation]

The Geometry Stage - Projection Transform
[Figure: an example of a perspective view transformation]

GPU Pipeline Overview
The GPU implements the geometry and rasterization stages of the pipeline
Each GPU stage has a different level of configurability / programmability:
- Green: fully programmable
- Yellow: configurable
- Blue: completely fixed

GPU Pipeline Overview
- Vertex Shader: implements the model and view transforms, shading, and projection (output is a vertex in eye coordinates)
- Geometry Shader: optional; operates on points, lines, and triangles to destroy primitives or create new ones
- Clipping, Screen Mapping, Triangle Setup, Triangle Traversal: fixed-function stages (implemented in hardware)
- Pixel Shader: computes the final pixel value from fragments
- Merger: customizable stage for merging buffers

Vertex Shader
The first stage that performs any graphical processing
- Deals exclusively with vertices
- Has access to vertex position, color, normal, UVs, and more
- It is also responsible for transforming vertices from model space to homogeneous coordinate space

Vertex Shader
Each incoming vertex is processed and output
- The vertex shader cannot create or destroy vertices
- There is no communication between vertices, and you don't know the order of processing

Vertex Shader
Some uses:
- Object deformations (twists, bends)
- Procedural deformations for flags and cloth
- Per-vertex lighting
- Page curls, heat haze, water ripples
The vertex information is passed on to the optional Tessellation/Geometry stages, then Clipping, Screen Mapping, Triangle Setup, and Triangle Traversal
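A flag-ripple deformation like the one listed above can be sketched as a "vertex shader" in Python: the same function runs on every vertex independently, with no communication between vertices. The wave parameters are made up for illustration.

```python
import math

# Vertex-shader sketch: each vertex is processed in isolation, here with a
# simple sine-wave "flag ripple" deformation along the x axis.
def ripple_vertex(v, time_s, amplitude=0.2, frequency=4.0):
    x, y, z = v
    return (x, y + amplitude * math.sin(frequency * x + time_s), z)

flag = [(x * 0.25, 0.0, 0.0) for x in range(5)]
# This mirrors the GPU model: the same program runs on every vertex independently.
deformed = [ripple_vertex(v, time_s=0.0) for v in flag]
print(deformed[0])  # (0.0, 0.0, 0.0) - sin(0) leaves the first vertex in place
```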

Tessellation Control & Evaluation Shaders
The process of breaking large primitive patches into smaller primitives before rendering
- The most common use is to add geometric detail to lower-fidelity meshes
Has three phases:
1. Tessellation Control Shader
2. Fixed-function tessellation engine
3. Tessellation Evaluation Shader
These stages are sandwiched between the vertex shader and the geometry shader
This shader is OPTIONAL and does not need to be active!

Tessellation Control & Evaluation Shaders
A patch is a group of vertices; input vertices are referred to as control points
The Tessellation Control Shader is responsible for:
- Per-patch inner and outer tessellation factors
- Position and other attributes for each output control point
- Per-patch user-defined varyings
The Tessellation Control Shader defines for the engine how to split up the patch based on the inner and outer factors
The Tessellation Evaluation Shader takes in tessellated points in a coordinate system relative to the control points
- It is up to YOU to determine the final location for the points

Geometry Shader
Located after the vertex/tessellation shaders, and is optional
- Input is an object such as triangles, line segments, or points, and the associated vertices
- Unique in that it can generate / transform / delete data passing through the pipeline
- Additional vertices outside the processed object can be passed in, which can be utilized for algorithms dependent on nearest neighbors

Geometry Shader
An optional phase; when not included, vertex/tessellation data is interpolated and fed to the fragment shader
We specify with layout qualifiers the type of primitive input and the expected primitive output
Input / Output primitive modes: points, lines, triangles, lines_adjacency, triangles_adjacency
Yes, we can change the input from one mode to another with geometry shaders!

Clipping
There is no reason to compute anything outside the view volume
- Any primitive that falls completely outside the -1 to +1 view cube is not passed on to the next stage
- Primitives that are on the boundary require clipping

Clipping
You can also define additional clipping planes to slice the object, referred to as sectioning
The clipping stage is almost always implemented as a fixed-function stage in the 3D hardware.
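Trivial accept/reject against the canonical view cube can be sketched as follows (real hardware clips whole primitives against the volume's planes, not just points):

```python
# Clipping sketch: trivially accept or reject points against the canonical
# [-1, 1]^3 view cube.
def inside_view_volume(p):
    return all(-1.0 <= c <= 1.0 for c in p)

print(inside_view_volume((0.5, -0.2, 0.9)))   # True  - passed to the next stage
print(inside_view_volume((0.5, -0.2, 1.4)))   # False - culled
```

A primitive with some vertices inside and some outside would fall into the "requires clipping" case: new vertices are computed where its edges cross the cube's faces.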

Screen Mapping
The X and Y coordinates are transformed into screen coordinates
- At this point, we still retain the X, Y, and Z coordinates
- All three coordinates together are referred to as window coordinates
- All that remains is scaling and translating from the -1 to +1 range to screen space

Screen Mapping
We can map the camera's image to only a portion of the window by manipulating the Viewport Rect
- We could set up multiple cameras and have them all be visible by manipulating their Viewport Rects
- Here we see four cameras with four separate views in the window
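The scale-and-translate from normalized device coordinates to window coordinates can be sketched directly. The viewport origin and size are in pixels; this mirrors what a viewport transform does, ignoring the depth range.

```python
# Screen-mapping sketch: scale and translate NDC x/y in [-1, 1] to window
# (pixel) coordinates for a viewport with the given origin and size.
def ndc_to_window(ndc_x, ndc_y, vp_x, vp_y, vp_w, vp_h):
    wx = vp_x + (ndc_x + 1.0) * 0.5 * vp_w
    wy = vp_y + (ndc_y + 1.0) * 0.5 * vp_h
    return (wx, wy)

# Full-window 800x600 viewport: NDC (-1,-1) is one corner, (1,1) the other.
print(ndc_to_window(-1, -1, 0, 0, 800, 600))  # (0.0, 0.0)
print(ndc_to_window(1, 1, 0, 0, 800, 600))    # (800.0, 600.0)
print(ndc_to_window(0, 0, 0, 0, 800, 600))    # (400.0, 300.0)
```

A Viewport-Rect-style split screen is just a different `vp_x, vp_y, vp_w, vp_h` per camera.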

Rasterization Stage
The conversion of 2D vertices into colored pixels (picture elements)
- This process is called Rasterization or Scan Conversion
- These are fixed-function stages!
This stage is broken down into the following sub-stages:

Rasterization Stage
Triangle Setup
- Data for the triangles is computed for scan conversion and interpolation
Triangle Traversal
- Each pixel whose center is covered by a triangle has a fragment computed
- Determining which pixels are covered is called triangle traversal, or scan conversion
- Fragment values are computed by interpolating between the three vertices that make up the triangle
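Triangle traversal can be sketched with edge functions, a standard building block for deciding which pixel centers a triangle covers (the tiny 4x4 pixel grid below is illustrative only):

```python
# Triangle-traversal sketch: test pixel centers against a 2D triangle using
# edge functions, the same idea hardware scan conversion is built on.
def edge(ax, ay, bx, by, px, py):
    return (px - ax) * (by - ay) - (py - ay) * (bx - ax)

def covers(tri, px, py):
    (ax, ay), (bx, by), (cx, cy) = tri
    w0 = edge(ax, ay, bx, by, px, py)
    w1 = edge(bx, by, cx, cy, px, py)
    w2 = edge(cx, cy, ax, ay, px, py)
    # Inside if the point is on the same side of all three edges.
    return (w0 >= 0 and w1 >= 0 and w2 >= 0) or (w0 <= 0 and w1 <= 0 and w2 <= 0)

tri = [(0, 0), (4, 0), (0, 4)]
fragments = [(x + 0.5, y + 0.5)                 # pixel centers on a 4x4 grid
             for y in range(4) for x in range(4)
             if covers(tri, x + 0.5, y + 0.5)]
print(len(fragments))  # 10 pixel centers are covered by this triangle
```

The same edge-function weights, once normalized, give the barycentric coordinates used to interpolate colors, depths, and UVs across the triangle.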

Pixel Shader
The pixel shader can only operate on the fragments passed to it
- You cannot operate on neighboring pixels
- There is one exception to this rule, and it has to do with gradient calculation
You can return:
- A fragment color
- Depth information
- Rejection of fragments (discard)
- Fog computations
- Alpha testing, and more
You can have multiple render targets (MRTs)
GTA V: http://www.adriancourreges.com/blog/2015/11/02/gta-v-graphics-study/
Doom: http://www.adriancourreges.com/blog/2016/09/09/doom-2016-graphics-study/

http://www.valvesoftware.com/publications/2004/gdc2004_half-life2_shading.pdf


Rasterization - Pixel Shading
Applying textures is performed in this stage
- UV coordinates are passed from the vertex shader
- The coordinates are used to look up the color value for the fragment being processed

Merging Stage
The depths and colors of the fragments are merged into the frame buffer
- Stencil buffer and z-buffer operations are performed, as well as color blending for transparency and compositing
- This stage features a suite of highly configurable options, but it is not programmable
- You can specify the mathematical operation used when combining fragments: addition, multiplication, etc., and even clamps and bitwise logical operations
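One of those configurable blend operations, standard "over" alpha blending of a source fragment onto the color already in the frame buffer, can be sketched as:

```python
# Merging-stage sketch: "over" alpha blending of an incoming fragment (src)
# onto the color already stored in the frame buffer (dst).
def blend_over(src_rgb, src_alpha, dst_rgb):
    return tuple(s * src_alpha + d * (1.0 - src_alpha)
                 for s, d in zip(src_rgb, dst_rgb))

# A 25%-opaque red fragment over a white background gives a pale pink:
print(blend_over((1.0, 0.0, 0.0), 0.25, (1.0, 1.0, 1.0)))  # (1.0, 0.75, 0.75)
```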

Rasterization - Merging
The z-buffer contains the depth of the current closest fragment at each pixel on the screen
- If a new fragment is closer, the z-buffer value is overwritten
- If it is farther away, the fragment is discarded
There is a significant weakness with the z-buffer:
- It requires semi-transparent objects to be rendered in the proper order, from back to front

Rasterization - Merging
There are additional buffers beyond the color and z-buffer:
- Alpha channel: alpha checks can be performed, and fragments discarded if they do not match a set alpha value
- Stencil buffer: an offscreen buffer to record the locations of rendered primitives. Can create cutouts and allow only certain parts of the screen to be rendered (good for reflections)
- Frame buffer: strictly, every buffer utilized in the application, though it sometimes refers to the combined z and color buffers
- Accumulation buffer: accumulates multiple frames into one buffer (can be used for motion blur, depth of field, and a number of other effects)
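The z-buffer test described above can be sketched in a few lines (a minimal illustration; depths here are in [0, 1] with smaller values meaning closer):

```python
# Z-buffer sketch: keep the closest fragment per pixel.
class ZBuffer:
    def __init__(self, width, height, far=1.0):
        self.depth = [[far] * width for _ in range(height)]
        self.color = [[(0.0, 0.0, 0.0)] * width for _ in range(height)]

    def write(self, x, y, z, rgb):
        if z < self.depth[y][x]:          # new fragment is closer: overwrite
            self.depth[y][x] = z
            self.color[y][x] = rgb
            return True
        return False                      # farther away: discard

zb = ZBuffer(4, 4)
zb.write(1, 1, 0.8, (1.0, 0.0, 0.0))      # red fragment at depth 0.8
zb.write(1, 1, 0.3, (0.0, 1.0, 0.0))      # green fragment is closer, wins
zb.write(1, 1, 0.5, (0.0, 0.0, 1.0))      # blue is behind green, discarded
print(zb.color[1][1])  # (0.0, 1.0, 0.0)
```

Note how the final color is order-independent for opaque fragments; the weakness mentioned above appears only once blending (transparency) is involved.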

Rasterization - Merging
This is the end of the pipeline; all primitives are now rasterized
- A double buffer is utilized, where graphics are drawn to an internal (not visible) buffer
- Either a block copy (blit) or a pointer swap is performed, and the monitor is updated with the new image
In this lab, a technique called quad buffering is employed
- It utilizes 4 buffers: for each eye, one front and one back buffer
- This requires additional memory and processing time