The 3D Graphics Rendering Pipeline


The 3D Graphics Rendering Pipeline Animação e Visualização Tridimensional João Madeiras Pereira

Which Visualization? Interactive real-time rendering of 3D scenes: realism plus real-time interaction. Application paradigm: games.

The Graphics Rendering Pipeline. The process of creating a 2D image from a 3D model (scene) is called rendering. The rendering pipeline has three functional stages: Application, Geometry, Rasterizer. The speed of the pipeline is that of its slowest stage.
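Because the stages run concurrently, overall throughput is set by the slowest one. A minimal sketch of that arithmetic (the per-stage timings are made-up numbers for illustration):

```python
# Throughput of a three-stage pipeline is limited by its slowest stage.
# Stage times are hypothetical, in milliseconds per frame.
stage_ms = {"application": 4.0, "geometry": 6.0, "rasterizer": 10.0}

bottleneck = max(stage_ms, key=stage_ms.get)   # slowest stage
frame_ms = stage_ms[bottleneck]                # pipeline throughput limit
fps = 1000.0 / frame_ms

print(bottleneck, fps)   # the rasterizer caps the pipeline at 100 fps
```

Speeding up a non-bottleneck stage leaves the frame rate unchanged, which is why profiling which stage limits you comes first.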

Low-Level API vs. High-Level API

Low level rendering pipeline

OpenGL rendering pipeline(s)

The Rendering Pipeline Application Geometry Rasterizer

The application stage is done entirely in software by the CPU. It reads input devices (such as gloves or a mouse); it changes the coordinates of the virtual camera; it performs collision detection and collision response (based on object properties) for haptics. One form of collision response is force feedback.

Application stage optimization: reduce model complexity (models with fewer polygons mean less to feed down the pipe). Low-resolution model: ~600 polygons; higher-resolution model: 134,754 polygons.

Computing Architectures The Rendering Pipeline Application Geometry Rasterizer Rendering pipeline

The Graphics Rendering Pipeline (revisited) Application Geometry Rasterizer The Geometry Functional Sub-Stages Model & View Transformation Lighting Projection Clipping Screen Mapping

The Rendering Pipeline Application Geometry Rasterizer The Geometry Functional Sub-Stages Model & View Transformation Lighting Projection Clipping Screen Mapping

Viewing Transformation. The viewing transform maps objects from world coordinates to camera coordinates (eye space). Its matrix captures the position and orientation of the virtual camera in the virtual world. The camera is located at the origin of the camera coordinate system, looking down the negative Z axis, with Y pointing upward and X to the right. The normalized view direction is n = normalize(lookAt − CamPos); OpenGL uses a right-handed frame, with n = −z_v.

Viewing Transform. Default (and initial) VRC left-handed frame: basis vectors u = [1 0 0], v = [0 1 0], n = [0 0 −1]; origin: VRP. Moving the camera means orienting it and then translating it to the new VRP (vrp_x, vrp_y, vrp_z): T[VRP]·R. There are two ways of understanding this transformation: (1) In a single frame (the WCS), moving the virtual camera is equivalent to transforming the objects by the inverse camera transformation: (T[VRP]·R)^(−1) = R^(−1)·T[−VRP]. (2) Align both frames: first perform the translation T[−VRP] and then a rotation. This corresponds to performing a translation followed by a rotation. Viewing transform: M_view = R_rot·T_trans, with T_trans = T[−VRP]. And the rotation?

Calculating the camera rotation. Each column of a rotation matrix represents an axis of the resulting frame. The camera frame is initially aligned with the world frame, except that it points down the negative camera z-direction: u: [1 0 0] → [u_x u_y u_z]; v: [0 1 0] → [v_x v_y v_z]; −n: [0 0 1] → [−n_x −n_y −n_z]. The object rotation is (R_Camera)^(−1). A rotation matrix is orthogonal, so inverting it amounts to taking its transpose; the rows of the resulting matrix are the vectors u, v and n:

R_Camera = | u_x  v_x  n_x  0 |        R_rot = (R_Camera)^T = | u_x  u_y  u_z  0 |
           | u_y  v_y  n_y  0 |                               | v_x  v_y  v_z  0 |
           | u_z  v_z  n_z  0 |                               | n_x  n_y  n_z  0 |
           |  0    0    0   1 |                               |  0    0    0   1 |

Viewing Transformation. M_vis = R_rot·T_trans:

T_trans = | 1  0  0  −VRP_x |        M_vis = | u_x  u_y  u_z  −u·VRP |
          | 0  1  0  −VRP_y |                | v_x  v_y  v_z  −v·VRP |
          | 0  0  1  −VRP_z |                | n_x  n_y  n_z  −n·VRP |
          | 0  0  0    1    |                |  0    0    0     1    |
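The viewing matrix can be sanity-checked numerically: the camera position (VRP) must map to the origin of eye space. A minimal sketch, assuming a world-space up vector of +Y (the camera position and look-at point are arbitrary example values):

```python
import math

def normalize(a):
    m = math.sqrt(sum(x * x for x in a))
    return [x / m for x in a]

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0]]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def view_matrix(vrp, lookat, up=(0.0, 1.0, 0.0)):
    # n points from the camera toward the look-at point (the slide's convention).
    n = normalize([l - c for l, c in zip(lookat, vrp)])
    u = normalize(cross(n, list(up)))   # camera right axis
    v = cross(u, n)                     # camera up axis
    # M_vis = R_rot * T[-VRP]: rows u, v, n with translation -axis.VRP.
    return [u + [-dot(u, vrp)],
            v + [-dot(v, vrp)],
            n + [-dot(n, vrp)],
            [0.0, 0.0, 0.0, 1.0]]

vrp = [1.0, 2.0, 3.0]
M = view_matrix(vrp, [1.0, 2.0, 0.0])
eye = [sum(M[r][c] * p for c, p in enumerate(vrp + [1.0])) for r in range(3)]
print(eye)   # the VRP lands at the eye-space origin
```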

Back-Face Culling. Done in the world coordinate system or the camera coordinate system (preferred); based on the dot product between the face normal and the viewing vector.
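The dot-product test can be sketched as follows (the triangle and eye positions are made-up values; the convention assumed here is counter-clockwise winding, so a positive dot product between the face normal and the direction from the face to the eye means front-facing):

```python
def sub(a, b):
    return [x - y for x, y in zip(a, b)]

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0]]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def is_front_facing(v0, v1, v2, eye):
    # Face normal from counter-clockwise winding of the vertices.
    normal = cross(sub(v1, v0), sub(v2, v0))
    # Viewing vector: from a point on the face toward the eye.
    view = sub(eye, v0)
    return dot(normal, view) > 0.0

# Triangle in the z = 0 plane, CCW as seen from +z; eye on the +z side.
tri = ([0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0])
print(is_front_facing(*tri, eye=[0.0, 0.0, 5.0]))   # front-facing: kept
print(is_front_facing(*tri, eye=[0.0, 0.0, -5.0]))  # back-facing: culled
```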

The lighting sub-stage calculates the vertex color based on: the type and number of simulated light sources; the lighting model; the surface material properties; atmospheric effects such as fog or smoke. Lighting results in object shading, which makes the scene more realistic.

Phong Lighting Model. The intensity at a point P combines ambient, diffuse, and specular terms: I = k_a L_a + k_d L_d max(l · n, 0) + k_s L_s max((r · v)^α, 0), where l is the unit vector toward the light, n the surface normal, r the reflection of l about n, and v the unit vector toward the viewer. The diffuse term is largest when the angle between l and n is small; the specular term is largest when the angle between r and v is small.

Phong Lighting Model with several light sources (indexed by i). Specular component: I_s = k_s Σ_i L_is (r_i · v)^α. Diffuse component: I_d = k_d Σ_i L_id (l_i · n).

Phong Lighting Model: light attenuation. Intensity varies with the distance between the surface and the light source, given by L / (a + b d + c d²), where d is the distance from the object to the light source, a, b, c are empirical constants, and L is the intensity at the light source.

Phong Lighting Model, all terms combined, summing over the light sources that illuminate the point P:

I = k_a L_ia + Σ_i [ 1 / (a + b d_i + c d_i²) ] ( k_d L_id max(l_i · n, 0) + k_s L_is max((r_i · v)^α, 0) )

Phong Lighting Model coefficients. Object: k_ra, k_ga, k_ba — ambient reflection coefficients; k_rd, k_gd, k_bd — diffuse reflection coefficients; k_rs, k_gs, k_bs — specular reflection coefficients. Light source: L_ra, L_ga, L_ba — ambient light intensity; L_rd, L_gd, L_bd — diffuse light intensity; L_rs, L_gs, L_bs — specular light intensity.

Blinn Approximation. Computing r is expensive. Instead, the halfway vector h is used: the normal of a hypothetical purely reflective facet, the normalized average of the unit vectors l and v: h = (l + v) / |l + v|.

Blinn Approximation: estimation of the specular component. Use (n · h) instead of (r · v), selecting a value for the exponent α' that guarantees (n · h)^α' ≈ (r · v)^α.

Modified Phong Lighting Model (Blinn-Phong Lighting Model):

I = k_a L_a + Σ_i [ 1 / (a + b d_i + c d_i²) ] ( k_d L_i max(l_i · n, 0) + k_s L_i max((n · h_i)^α, 0) )
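A minimal numerical sketch of the Blinn-Phong model above, for a single light with no attenuation and scalar intensities (all material and light values are made-up example numbers):

```python
import math

def normalize(a):
    m = math.sqrt(sum(x * x for x in a))
    return [x / m for x in a]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def blinn_phong(n, l, v, ka=0.1, kd=0.6, ks=0.3, La=1.0, L=1.0, alpha=32):
    # h = (l + v) / |l + v|: the halfway vector replaces the reflection vector r.
    h = normalize([a + b for a, b in zip(l, v)])
    diffuse = kd * L * max(dot(l, n), 0.0)
    specular = ks * L * max(dot(n, h), 0.0) ** alpha
    return ka * La + diffuse + specular

n = [0.0, 0.0, 1.0]             # surface normal
l = [0.0, 0.0, 1.0]             # light straight above the surface
v = [0.0, 0.0, 1.0]             # viewer straight above as well
print(blinn_phong(n, l, v))     # perfect alignment: ka + kd + ks, about 1.0
```

With l, n, and v all aligned, both max(...) terms are 1, so the result is just the sum of the three coefficients; tilting v away from n collapses the specular term quickly because of the exponent α.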

Illumination and Materials in OpenGL

VR Modeling The Rendering Pipeline Application Geometry Rasterizer The Geometry Functional Sub-Stages Model & View Transformation Lighting Projection Clipping Screen Mapping

Projection Transformation: models what portion (volume) of the virtual world the camera actually sees. There are two kinds of projection, parallel projection and perspective projection; both are defined by a projection plane and a projection reference point.

Orthogonal View-Volume

Frustum. The portion of the virtual world seen by the camera at a given time is limited by front and back clipping planes, at z = n and z = f. Only what is within the viewing volume (the frustum, with corners (l, t, f) and (r, b, n)) is sent down the rendering pipe.

Aperture of the Perspective Volume. FOV: Field of View. Side view of the volume: with the view window at distance D from the VRP and h the window height measured from its center CW, the vertical aperture θ_V satisfies tan(θ_V / 2) = h / D. Top view of the volume: with window width w measured from CW, the horizontal aperture θ_W satisfies tan(θ_W / 2) = w / D.

Projection transformation

Default camera in OpenGL: the camera sits at the origin of the world frame, pointing down −z. The viewing volume is a cube centered at the origin with side 2.

Specification in OpenGL: glMatrixMode(GL_MODELVIEW); gluLookAt(); ... glMatrixMode(GL_PROJECTION); ...

Example in OpenGL: compute the dimensions of the view window of the symmetric frustum defined by gluPerspective(120.0f, 1.33f, 15.0, 120.0). Note — synopsis of the gluPerspective command: void gluPerspective(GLdouble fovy, GLdouble aspect, GLdouble near, GLdouble far)
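One way to work the example: at the near plane, the full window height follows from tan(fovy/2) = (height/2)/near, and the width follows from the aspect ratio. A sketch of that arithmetic:

```python
import math

fovy, aspect, near = 120.0, 1.33, 15.0   # arguments from the gluPerspective call

# tan(fovy/2) relates half the window height to the near-plane distance.
height = 2.0 * near * math.tan(math.radians(fovy / 2.0))
width = aspect * height                   # aspect = width / height

print(round(height, 2), round(width, 2))  # about 51.96 by 69.11 units
```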

void glOrtho( GLdouble left, GLdouble right, GLdouble bottom, GLdouble top, GLdouble nearVal, GLdouble farVal);

Canonical Mapping from the Frustum: the perspective transform maps the viewing volume, with corners (l, t, f) and (r, b, n), to a unit cube with extreme points at (−1, −1, −1) and (1, 1, 1). This is also called the canonical (normalized) view volume.

T_projection = | 2n/(r−l)     0       −(r+l)/(r−l)      0       |
               |    0      2n/(t−b)   −(t+b)/(t−b)      0       |
               |    0         0        (f+n)/(f−n)  −2fn/(f−n)  |
               |    0         0            1             0      |
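The matrix can be checked numerically: after the perspective divide by w (the bottom row makes w = z under this slide's convention), the near-plane frustum corners must land on faces of the canonical cube. A sketch, using arbitrary example frustum bounds:

```python
def projection(l, r, b, t, n, f):
    # The matrix from the slide; the bottom row copies z into w.
    return [
        [2*n/(r-l), 0.0, -(r+l)/(r-l), 0.0],
        [0.0, 2*n/(t-b), -(t+b)/(t-b), 0.0],
        [0.0, 0.0, (f+n)/(f-n), -2*f*n/(f-n)],
        [0.0, 0.0, 1.0, 0.0],
    ]

def project(M, p):
    x, y, z, w = (sum(M[i][j] * p[j] for j in range(4)) for i in range(4))
    return [x / w, y / w, z / w]          # perspective divide

l, r, b, t, n, f = -2.0, 2.0, -1.0, 1.0, 1.0, 10.0
M = projection(l, r, b, t, n, f)
print(project(M, [l, b, n, 1.0]))   # near lower-left corner: about (-1, -1, -1)
print(project(M, [r, t, n, 1.0]))   # near upper-right corner: about (1, 1, -1)
```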

void glFrustum( GLdouble left, GLdouble right, GLdouble bottom, GLdouble top, GLdouble nearVal, GLdouble farVal);

void gluPerspective( GLdouble fovy, GLdouble aspect, GLdouble zNear, GLdouble zFar);

VR Modeling The Rendering Pipeline Application Geometry Rasterizer The Geometry Functional Sub-Stages Model & View Transformation Lighting Projection Clipping Screen Mapping

Clipping Transformation: since the frustum maps to the unit cube, only objects inside the cube will be rendered. Some objects are only partly inside the unit cube (e.g., the line and the rectangle in the figure); they need to be clipped. The vertex V1 is replaced by a new vertex at the intersection between the line and the view-volume boundary, and so on.

VR Modeling The Rendering Pipeline Application Geometry Rasterizer The Geometry Functional Sub-Stages Model & View Transformation Lighting Projection Clipping Screen Mapping

Screen Mapping (Viewport Transformation): the scene is rendered into a window with corners (x1, y1) and (x2, y2). Screen mapping is a translation followed by a scaling that affects the x and y coordinates of the primitives (objects), but not their z coordinates. Screen (window, in OpenGL) coordinates plus z in [0, 1] are passed to the rasterizer stage of the pipeline. In OpenGL this coordinate space is called window space.
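A sketch of that mapping: NDC x and y in [−1, 1] are scaled and translated into the window rectangle, and depth is remapped from [−1, 1] to [0, 1] for the Z-buffer (the window corners below are example values):

```python
def screen_map(ndc, x1, y1, x2, y2):
    x, y, z = ndc
    # Scale-and-translate from [-1, 1] to the window rectangle in x and y.
    sx = x1 + (x + 1.0) * 0.5 * (x2 - x1)
    sy = y1 + (y + 1.0) * 0.5 * (y2 - y1)
    # Depth is remapped from [-1, 1] to [0, 1]; z keeps its own range.
    sz = (z + 1.0) * 0.5
    return [sx, sy, sz]

print(screen_map([-1.0, -1.0, -1.0], 0, 0, 640, 480))  # -> [0.0, 0.0, 0.0]
print(screen_map([1.0, 1.0, 1.0], 0, 0, 640, 480))     # -> [640.0, 480.0, 1.0]
print(screen_map([0.0, 0.0, 0.0], 0, 0, 640, 480))     # window center, depth 0.5
```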

Rendering speed vs. surface polygon type. The way surfaces are described influences rendering speed: if surfaces are described by triangle meshes, rendering will be faster than for the same object described by independent quadrilaterals or higher-order polygons. This is due to the graphics board architecture, which may be optimized to render triangles.

Computing Architectures The Rendering Pipeline Application Geometry Rasterizer

The Rasterizer Stage performs operations in hardware for speed. It converts vertex information from the geometry stage (x, y, z, color, texture) into pixel information on the screen. The pixel color information is stored in the color buffer; the pixel z-value is stored in the Z-buffer (which has the same size as the color buffer). It ensures that the primitives visible from the point of view of the camera are displayed.

The Rasterizer Stage (continued). The scene is rendered into the back buffer; it is then swapped with the front buffer, which stores the current image being displayed. This process eliminates flicker and is called double buffering. All the buffers on the system are grouped into the frame buffer.

OpenGL rendering pipeline(s)

Detailed Steps in OpenGL

Shading Techniques. Wire-frame is the simplest: it only shows the polygons' visible edges. The flat-shaded model assigns the same color to all pixels on a polygon (or side) of the object. Gouraud or smooth shading interpolates colors inside the polygons based on the colors of the edges. Phong shading interpolates the vertex normals before calculating the light intensity based on the lighting model described above; it is the most realistic of these shading models.

Local illumination methods: interpolation along a scanline, I_p = I_b − (I_b − I_a) · (x_b − x_p) / (x_b − x_a). Flat shading model; Gouraud shading model; Phong shading model.
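The interpolation formula above can be sketched directly (endpoint intensities and x positions are example values):

```python
def interpolate(Ia, Ib, xa, xb, xp):
    # I_p = I_b - (I_b - I_a) * (x_b - x_p) / (x_b - x_a)
    return Ib - (Ib - Ia) * (xb - xp) / (xb - xa)

# Intensity 0.25 at x = 0 and 1.0 at x = 8, along one scanline.
print(interpolate(0.25, 1.0, 0.0, 8.0, 0.0))  # -> 0.25 (left endpoint)
print(interpolate(0.25, 1.0, 0.0, 8.0, 8.0))  # -> 1.0 (right endpoint)
print(interpolate(0.25, 1.0, 0.0, 8.0, 4.0))  # -> 0.625 (midpoint)
```

Gouraud shading applies this twice: once along the polygon edges to get intensities at the span ends, then across each scanline between them.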

Computing architectures Wire-frame model Flat shading model Gouraud shading model

Scene illumination. Local methods (flat, Gouraud, and Phong shading) treat objects in isolation; they are computationally faster than global illumination methods. Global illumination treats the influence of one object on another object's appearance; it is more demanding from a computation point of view but produces more realistic scenes.

Flat shaded Utah Teapot Phong shaded Utah Teapot

Global scene illumination The inter-reflections and shadows cast by objects on each other.

How to create textures: models are available online in texture libraries (cars, people, construction materials, etc.); custom textures can be made from scanned photographs or by using an interactive paint program to create bitmaps.

Texturing methods. Objects rendered using the Phong reflection model and Gouraud- or Phong-interpolated shading often appear rather plastic and floating in air; texture effects can be added to give a more realistic-looking surface appearance. Texture mapping uses a pattern to be put on the surface of an object. Light maps combine texture and lighting through a modulation process. Bump mapping distorts a smooth surface to get variation of the surface. Environment mapping (reflection maps) enables ray-tracing-like output.

Where does mapping take place? Most mapping techniques are implemented at the end of the rendering pipeline: texture mapping is part of the shading process, but a part that is done on a fragment-by-fragment basis. This is very efficient because few polygons make it past the clipper.

How to set (s, t) texture coordinates? Either set the coordinates manually, specifying the texture coordinates for each vertex ourselves, or compute them automatically, using an algorithm that sets the texture coordinates for us.
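As an illustration of the automatic option, one common scheme (an assumption here; the slide names no particular algorithm) is spherical mapping, which derives (s, t) from the longitude and latitude of each vertex direction:

```python
import math

def spherical_st(p):
    # Longitude/latitude mapping of a 3D direction to (s, t) in [0, 1];
    # one common automatic scheme, chosen here for illustration only.
    x, y, z = p
    r = math.sqrt(x*x + y*y + z*z)
    s = (math.atan2(z, x) / (2.0 * math.pi)) + 0.5   # longitude -> [0, 1]
    t = math.acos(y / r) / math.pi                   # latitude  -> [0, 1]
    return s, t

print(spherical_st([1.0, 0.0, 0.0]))  # a point on the equator: s = 0.5, t = 0.5
print(spherical_st([0.0, 1.0, 0.0]))  # the north pole: t = 0.0
```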