The Application Stage. The Game Loop, Resource Management and Renderer Design



Application Stage Responsibilities

- Set up the rendering pipeline
- Resource management: 3D meshes, textures, etc.
- Prepare data for rendering (primitives, textures, etc.)
- Collision detection
- Frustum culling
- Send data down the pipeline and modify pipeline state: draw calls; changing shaders, textures, rasterizer states, blend states, etc.

Virtual GPU Device Creation and Setup: The Graphics API and the Renderer

The Graphics API

We use a graphics API to set up the GPU pipeline and to send data down the pipeline. A graphics API is a low-level library that communicates with the GPU driver and gives us easy access to GPU features. The most common APIs are OpenGL and Direct3D.

The Graphics API

Graphics API (user mode) -> Driver (kernel mode) -> GPU

The graphics API sends commands to the kernel-mode driver, and the driver then sends the actual commands to the GPU. A level of overhead is introduced by both the driver and the API. What are the benefits of an API?

The Renderer and the Application (MVC Design Pattern)

- The Renderer (the View): responsible for rendering the scene; written against a graphics API; handles all rendering and all rendering resource management.
- The Application (the Model): determines what needs to be drawn; runs all object updates (physics, collision detection, etc.).
- The User Input Handler (the Controller).

The Game Loop

while (!exit):
- Process user input: keyboard/mouse input, GUI input (the Controller)
- Update the game: AI, physics, collision detection, GUI updates (the Model)
- Render the scene: 3D scene, GUI (the View)
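A minimal sketch of such a loop is shown below. The class and function names (InputHandler, Application, Renderer, processInput, update, renderScene) are illustrative, not a specific engine's API; the point is the controller/model/view split and the fixed order of work per frame.

    // Minimal game-loop sketch (illustrative names, not a specific engine API).
    // InputHandler = Controller, Application = Model, Renderer = View.
    #include <chrono>

    class InputHandler { public: void processInput() { /* keyboard/mouse and GUI input */ } };
    class Application  { public: void update(float dt) { /* AI, physics, collision, GUI */ } };
    class Renderer     { public: void renderScene()    { /* draw the 3D scene, then the GUI */ } };

    void runGameLoop(bool& exitRequested)
    {
        InputHandler input;    // Controller
        Application  game;     // Model
        Renderer     renderer; // View

        auto previous = std::chrono::steady_clock::now();
        while (!exitRequested)
        {
            auto now = std::chrono::steady_clock::now();
            float dt = std::chrono::duration<float>(now - previous).count();
            previous = now;

            input.processInput();   // Controller: keyboard/mouse and GUI input
            game.update(dt);        // Model: AI, physics, collision detection, GUI updates
            renderer.renderScene(); // View: render the 3D scene and the GUI
        }
    }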

GPU Resource Management: Vertex Buffers, Index Buffers, Textures, Shader Constants

Resource Types

The most common GPU resources are vertex buffers, index buffers, and textures.

Vertex Buffers

- Large arrays containing vertex data
- Stored in video memory
- Require an input layout describing the memory layout of each element in the array

Example layout: float3 position (offset 0), float3 normal (offset 12), float4 color (offset 24), float flag (offset 40).
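As a hedged illustration, this is roughly how the example layout above could be described in Direct3D 10. The struct and variable names are assumptions for the sketch; only the D3D10_INPUT_ELEMENT_DESC fields and DXGI formats are standard API.

    // Sketch: describing the example vertex layout to Direct3D 10.
    // Offsets match the example above: position 0, normal 12, color 24, flag 40.
    #include <d3d10.h>

    struct Vertex
    {
        float position[3]; // 12 bytes, offset 0
        float normal[3];   // 12 bytes, offset 12
        float color[4];    // 16 bytes, offset 24
        float flag;        //  4 bytes, offset 40
    };

    const D3D10_INPUT_ELEMENT_DESC layoutDesc[] =
    {
        { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT,    0,  0, D3D10_INPUT_PER_VERTEX_DATA, 0 },
        { "NORMAL",   0, DXGI_FORMAT_R32G32B32_FLOAT,    0, 12, D3D10_INPUT_PER_VERTEX_DATA, 0 },
        { "COLOR",    0, DXGI_FORMAT_R32G32B32A32_FLOAT, 0, 24, D3D10_INPUT_PER_VERTEX_DATA, 0 },
        { "FLAG",     0, DXGI_FORMAT_R32_FLOAT,          0, 40, D3D10_INPUT_PER_VERTEX_DATA, 0 },
    };
    // The layout object itself is created with ID3D10Device::CreateInputLayout,
    // which also needs the compiled vertex-shader input signature.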

Vertex Buffers

- Static buffers: initialized with vertex data when created; cannot be changed by the CPU; useful for static meshes.
- Dynamic buffers: empty memory when created; can be accessed and modified by the CPU (CPU access is performed via mapping); useful for changing vertex data, e.g. particle systems and text.
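A minimal sketch of the two creation paths in Direct3D 10, assuming a device and the Vertex struct from the layout sketch above; the helper function names are illustrative.

    // Sketch: creating a static (immutable) and a dynamic vertex buffer in Direct3D 10.
    #include <d3d10.h>

    ID3D10Buffer* CreateStaticVB(ID3D10Device* device, const Vertex* vertices, UINT vertexCount)
    {
        D3D10_BUFFER_DESC desc = {};
        desc.ByteWidth = vertexCount * sizeof(Vertex);
        desc.Usage     = D3D10_USAGE_IMMUTABLE;        // filled once at creation, no CPU access
        desc.BindFlags = D3D10_BIND_VERTEX_BUFFER;

        D3D10_SUBRESOURCE_DATA init = {};
        init.pSysMem = vertices;                        // initial vertex data

        ID3D10Buffer* buffer = nullptr;
        device->CreateBuffer(&desc, &init, &buffer);
        return buffer;
    }

    ID3D10Buffer* CreateDynamicVB(ID3D10Device* device, UINT vertexCount)
    {
        D3D10_BUFFER_DESC desc = {};
        desc.ByteWidth      = vertexCount * sizeof(Vertex);
        desc.Usage          = D3D10_USAGE_DYNAMIC;      // CPU-writable via mapping
        desc.BindFlags      = D3D10_BIND_VERTEX_BUFFER;
        desc.CPUAccessFlags = D3D10_CPU_ACCESS_WRITE;

        ID3D10Buffer* buffer = nullptr;
        device->CreateBuffer(&desc, nullptr, &buffer);  // created empty
        return buffer;
    }

    // Updating the dynamic buffer through mapping:
    //   void* data = nullptr;
    //   buffer->Map(D3D10_MAP_WRITE_DISCARD, 0, &data);
    //   memcpy(data, newVertices, byteCount);
    //   buffer->Unmap();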

Index Buffers

- Contain a list of vertex buffer indices
- Stored in video memory

Example: a vertex buffer holding five vertices (0-4) can be referenced repeatedly by an index buffer such as 1 3 2, 1 3 4, 0 1 2, ... so that shared vertices are stored only once. Why is this useful?
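A small worked example of the benefit: a quad drawn as two triangles. The array names are illustrative.

    // Without indexing, the quad needs 6 vertices (two of them duplicates);
    // with indexing, 4 unique vertices plus 6 small indices suffice.
    float quadVertices[4][3] = {
        { 0.0f, 0.0f, 0.0f },   // 0: bottom-left
        { 1.0f, 0.0f, 0.0f },   // 1: bottom-right
        { 1.0f, 1.0f, 0.0f },   // 2: top-right
        { 0.0f, 1.0f, 0.0f },   // 3: top-left
    };
    unsigned int quadIndices[6] = {
        0, 1, 2,    // first triangle
        0, 2, 3,    // second triangle (reuses vertices 0 and 2)
    };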

Textures

- Large arrays of data values stored in video memory (1D, 2D, or 3D)
- Textures don't necessarily contain image data: height maps, normal maps, depth textures

(Images on the slide: a depth texture, a normal map, and a height map.)
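As a hedged sketch of the non-image case, this is roughly how a single-channel float height map might be created as a 2D texture in Direct3D 10; the function and parameter names are assumptions.

    // Sketch: a 2D texture holding non-image data (one float per texel).
    #include <d3d10.h>

    ID3D10Texture2D* CreateHeightMap(ID3D10Device* device, const float* heights,
                                     UINT width, UINT height)
    {
        D3D10_TEXTURE2D_DESC desc = {};
        desc.Width            = width;
        desc.Height           = height;
        desc.MipLevels        = 1;
        desc.ArraySize        = 1;
        desc.Format           = DXGI_FORMAT_R32_FLOAT;   // a height value, not a color
        desc.SampleDesc.Count = 1;
        desc.Usage            = D3D10_USAGE_IMMUTABLE;
        desc.BindFlags        = D3D10_BIND_SHADER_RESOURCE;

        D3D10_SUBRESOURCE_DATA init = {};
        init.pSysMem     = heights;
        init.SysMemPitch = width * sizeof(float);         // bytes per row

        ID3D10Texture2D* texture = nullptr;
        device->CreateTexture2D(&desc, &init, &texture);
        return texture;
    }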

Resource Management

- Video memory is limited.
- Remember to RELEASE resources when finished with them; do not just delete the pointer!
- Low-level GPU resource management is the job of the renderer; the simulation/game determines which resources need to be loaded or freed.
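Direct3D resources are COM objects, so releasing means calling Release() on the interface. A common helper for this (the name SafeRelease is just a convention, not part of the API) looks like:

    // Free the GPU allocation by releasing the COM reference, not deleting the pointer.
    template <typename T>
    void SafeRelease(T*& resource)
    {
        if (resource)
        {
            resource->Release();   // decrements the COM reference count
            resource = nullptr;
        }
    }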

3D Rendering Basics: A Trip Down the Pipeline

Rendering Basics - 3D Meshes

All 3D objects are composed of a set of connected convex polygons (faces). Each face is defined by a set of points (vertices). A model is often defined as a list of vertices, together with information on how to connect those vertices to form the required faces. Each vertex is positioned relative to the model's own coordinate system (space); this coordinate system is known as model space. If needed, texture mapping and lighting information is defined at each vertex.

Model and art courtesy of quigmire.com

Rendering Basics - Texturing

To add more detail to a model, we skin or wrap the model with a 2D image that defines how the surface of the object should appear. Each face is mapped to a portion of the 2D image using a technique called texture mapping or UV mapping. UV mapping refers to the coordinate system used by the texture: the texture is 2D, with u representing the horizontal axis and v representing the vertical axis. Mapping is the process of representing one coordinate system in terms of another.

Model and art courtesy of quigmire.com

Rendering Basics - Lighting

The previous mesh had very little detail, so the artist has improved it by increasing the polygon (face) count; this is called a high-polygon model. High-polygon models are smoother in appearance and allow for more detailed lighting. Additional lighting parameters are now added to the model so that it can be correctly shaded. In the model we can see the illusion of details like button holes and stitching; these illusions are achieved by advanced lighting techniques such as normal mapping.

Model and art courtesy of quigmire.com

Rendering Basics - Final Result

The image shown here is the result of combining the three basic techniques just described. All the information needed to render the model is defined in the application stage: the vertex positions relative to the model's coordinate system, all the necessary resources such as textures and bump maps, and all the necessary lighting information such as materials and normal vectors are set up before anything is sent to the GPU. Once all the data and resources for the model have been allocated, we send the data down the pipeline using draw calls.

Model and art courtesy of quigmire.com

A Trip Down the Pipeline

So how do we send a 3D object down the pipeline? The first step is to set up the Input Assembler (IA) stage:

- Bind the vertex buffer containing the object to the IA stage
- Set the input layout of the vertex buffer
- Set the primitive topology of the object's vertex buffer
- Bind any index buffers used to draw the object

(Pipeline diagram: Input Assembler -> Vertex Shader -> Rasterizer -> Pixel Shader -> Output Merger, with the vertex buffer, input layout, primitive topology and index buffer feeding the IA stage.)
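In Direct3D 10 these four bindings are one call each on the device; a brief sketch, assuming the device, layout and buffers have already been created:

    // Sketch: binding the object's data to the Input Assembler stage in Direct3D 10.
    UINT stride = sizeof(Vertex);
    UINT offset = 0;

    device->IASetInputLayout(inputLayout);                                 // vertex format
    device->IASetVertexBuffers(0, 1, &vertexBuffer, &stride, &offset);     // vertex data
    device->IASetIndexBuffer(indexBuffer, DXGI_FORMAT_R32_UINT, 0);        // index data
    device->IASetPrimitiveTopology(D3D10_PRIMITIVE_TOPOLOGY_TRIANGLELIST); // topology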

A Trip Down the Pipeline

Next we set the vertex and pixel shader programs, then set up all vertex shader data (matrices, constants, textures, samplers, etc.) and all pixel shader data (matrices, constants, textures, samplers, etc.).

(Pipeline diagram: the vertex and pixel shader stages now have their programs VS1 and PS1 bound, each with its own matrices, textures, samplers, etc.)
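Without the effect system, this stage setup maps onto the per-stage device calls shown below; the variable names (vertexShader, psConstantBuffer, diffuseTextureView, samplerState, etc.) are assumptions for the sketch.

    // Sketch: setting shader programs and their data in Direct3D 10.
    device->VSSetShader(vertexShader);                       // VS program
    device->VSSetConstantBuffers(0, 1, &vsConstantBuffer);   // matrices, constants

    device->PSSetShader(pixelShader);                        // PS program
    device->PSSetConstantBuffers(0, 1, &psConstantBuffer);   // constants
    device->PSSetShaderResources(0, 1, &diffuseTextureView); // textures
    device->PSSetSamplers(0, 1, &samplerState);              // samplers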

A Trip Down the Pipeline

The final step is to configure the Rasterizer and Output Merger stages. Now we can call the DRAW function to send all the data down the pipeline.

(Pipeline diagram: all stages configured, with rasterizer and output merger parameters set, and the draw call feeding the Input Assembler.)
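A brief Direct3D 10 sketch of this last step, assuming the state objects and indexCount already exist:

    // Sketch: configuring the Rasterizer and Output Merger stages, then drawing.
    float blendFactor[4] = { 0.0f, 0.0f, 0.0f, 0.0f };

    device->RSSetState(rasterizerState);                          // cull mode, fill mode, etc.
    device->OMSetBlendState(blendState, blendFactor, 0xFFFFFFFF); // blending
    device->OMSetDepthStencilState(depthStencilState, 0);         // depth/stencil tests

    device->DrawIndexed(indexCount, 0, 0);                        // send the data down the pipeline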

Direct3D 10 Effect System

The Direct3D 10 effect system contains the entire pipeline state apart from the IA stage. All stage data (matrices, textures, etc.) needs to be set via the effect interface. When you apply a technique, you set the pipeline state (including all stage data).

(Effect file diagram: each technique bundles a VS program, GS program, PS program, rasterizer state, output merger state, texture samplers, etc.)
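A hedged sketch of the effect-framework flow: set stage data through effect variables, then apply a pass of the technique before drawing. The variable and technique names ("World", "DiffuseTexture", "Technique1") are illustrative, not from the slides.

    // Sketch: setting per-object data and applying a technique with the D3D10 effect framework.
    ID3D10EffectMatrixVariable* worldVar =
        effect->GetVariableByName("World")->AsMatrix();
    ID3D10EffectShaderResourceVariable* texVar =
        effect->GetVariableByName("DiffuseTexture")->AsShaderResource();

    worldVar->SetMatrix((float*)&worldMatrix);   // stage data goes through the effect interface
    texVar->SetResource(diffuseTextureView);

    ID3D10EffectTechnique* technique = effect->GetTechniqueByName("Technique1");
    D3D10_TECHNIQUE_DESC techDesc;
    technique->GetDesc(&techDesc);
    for (UINT p = 0; p < techDesc.Passes; ++p)
    {
        technique->GetPassByIndex(p)->Apply(0);  // sets shaders and pipeline state for this pass
        device->DrawIndexed(indexCount, 0, 0);
    }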