NVIDIA nfinitefx Engine: Programmable Pixel Shaders

The NVIDIA nfinitefx Engine

The NVIDIA nfinitefx engine gives developers the ability to program a virtually infinite number of special effects and custom looks. Instead of every developer choosing from the same hard-coded palette of effects and ending up with the same generic look and feel, developers can specify personalized combinations of graphics operations and create their own custom effects. Games and other graphics-intensive applications are differentiated and offer more exciting and stylized visual effects.

Two patented architectural advancements enable the nfinitefx engine's programmability and its multitude of effects: Vertex Shaders and Pixel Shaders. This paper covers the engine's programmable Pixel Shaders.

Realism for Lighting and Materials: Programmable Pixel Shaders

The final output of any 3D graphics hardware consists of pixels. Depending on resolution, in excess of 2 million pixels may need to be rendered, lit, shaded, and colored. Pixel Shaders create lighting and other custom shading effects at the pixel level. This is an unprecedented level of hardware control for consumers.

Pixel Shader Basics

Today's 3D graphics hardware lets developers create virtual worlds. Ideally, users can freely interact with and navigate through these worlds at high frame rates.

[Figure: Last year's NVIDIA GeForce2 series performed fixed-function per-pixel lighting. (Image shown is from Giants by Planet Moon.)]

As 3D graphics hardware has evolved over the years, new features have been continuously added, resulting in steady improvements to image quality. In the early days of 3D accelerators, the best technologies consisted of bilinear filtering of relatively low-resolution texture maps. These low-resolution texture maps were mapped onto relatively large polygons, resulting in blurry textures and relatively poor image quality.

[Figure: When first introduced, the image quality in Tomb Raider II was considered excellent, but wouldn't pass muster today.]

In 2000, the NVIDIA GeForce2 series introduced a new technology: the NVIDIA Shading Rasterizer (NSR). The NSR gave developers the ability to add per-pixel lighting effects, including sophisticated techniques such as dot3 bump mapping. Today, the latest NVIDIA technology gives developers full control of the pixel shading process through the use of programmable Pixel Shaders. Part of the NVIDIA nfinitefx engine, programmable Pixel Shaders provide unprecedented control for determining the light, shade, and color of each individual pixel.
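To make this per-pixel control concrete, the following is a minimal sketch of a DirectX 8 pixel shader (version ps.1.1, the model exposed by the nfinitefx engine) that computes dot3 per-pixel diffuse lighting and modulates it with a base texture. The texture-stage assignments and the use of the vertex diffuse color v0 to carry the light vector are illustrative assumptions, not a listing from this paper.

    ps.1.1                          ; DirectX 8 pixel shader, version 1.1
    tex t0                          ; stage 0: normal map (RGB encodes a tangent-space normal)
    tex t1                          ; stage 1: base (diffuse) texture
    dp3_sat r0, t0_bx2, v0_bx2      ; N.L: expand normal and light vector from [0,1] to [-1,1],
                                    ; take the dot product, clamp the result to [0,1]
    mul r0, r0, t1                  ; modulate the base texture by the per-pixel lighting term

Here the light vector is assumed to arrive per vertex, encoded as a color in v0; the _bx2 modifier expands the unsigned [0,1] color range back into signed [-1,1] vectors before the dot product.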

Architectural Details

The NVIDIA GeForce2 GTS performed dot3 and other per-pixel effects through its fixed-function pixel pipelines. Textures could be combined, using either single-pass or multi-pass techniques, to create effects. There was no loopback, so effects that needed dependent texture reads, such as true reflective bump mapping, were simply not possible.

[Figure: The GeForce2 texture stage pipeline. Note that the GeForce2 GTS supported only two texture-blend operations per pass.]

While programmers were somewhat limited by the NSR's fixed-function pipeline, it was possible to create superb per-pixel effects, including dot3 bump mapping so realistic looking that surfaces seemed to have a huge amount of additional geometric detail.

NVIDIA nfinitefx Programmable Pixel Shaders

The NVIDIA nfinitefx engine adds a high degree of flexibility for several reasons:

- It's programmable, allowing developers to create their own custom pixel-shading effects.
- There are more texture operations and texture address registers available.
- Dependent texture reads are now possible, giving programmers even greater flexibility (a sketch of a dependent-read shader follows the figure below).

[Figure: The NVIDIA nfinitefx texture pipeline supports up to eight texture-blend operations, dependent texture reads, and programmable Pixel Shaders.]
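As an illustration of the dependent texture reads mentioned above, here is a minimal ps.1.1 sketch of reflective (environment-mapped) bump mapping: the per-pixel normal fetched from one texture is used to reflect the eye vector and then look up a cube map. The texture-coordinate setup, with rows of a tangent-to-world transform and the eye vector supplied in the coordinates of stages 1 through 3, is assumed to come from the application or vertex shader; it is not specified in this paper.

    ps.1.1                          ; DirectX 8 pixel shader, version 1.1
    tex t0                          ; stage 0: normal map (per-pixel surface normal)
    texm3x3pad  t1, t0_bx2          ; row 1 of the 3x3 transform applied to the normal
    texm3x3pad  t2, t0_bx2          ; row 2 of the 3x3 transform
    texm3x3vspec t3, t0_bx2         ; row 3, then reflect the eye vector about the transformed
                                    ; normal and sample the cube map bound to stage 3:
                                    ; a dependent texture read
    mov r0, t3                      ; output the reflected environment color

The cube map lookup in the last texture instruction depends on values fetched from the normal map earlier in the same pass, which is exactly the loopback the fixed-function GeForce2 pipeline could not perform.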

Note that the NVIDIA nfinitefx architecture doesn't preclude the use of Microsoft DirectX 7 texture operations. However, if the developer uses the more sophisticated Pixel Shader application programming interface (API) in Microsoft DirectX 8, more complex operations are possible. Unlike Vertex Shaders, however, there is no feasible way of emulating Pixel Shaders in software. Using Pixel Shaders on graphics hardware that doesn't support them forces the entire graphics pipeline to run in software, causing performance to fall by 100x.

The NVIDIA nfinitefx Pixel Shader architecture has unbelievably fast floating-point performance and is capable of generating these per-pixel effects with superb performance. The figure below shows one of the more sophisticated techniques possible. The object is an animated, bumpy, reflective surface, something impossible to achieve in real time with previous 3D hardware.

[Figure: A shiny, bumpy surface "breathing" in real time.]

The Joy of Four Textures

Because of its ability to handle four textures in a single pass, the NVIDIA nfinitefx engine achieves excellent performance and enables previously impossible pixel-level effects on consumer-level platforms. Applying multiple textures in one pass almost always yields better performance than performing multiple passes. Multiple passes translate into multiple geometry transformations and multiple z-buffer calculations, slowing the overall rendering process.

Making Bump Mapping Work

Bump mapping creates the illusion of additional geometry, such as the crinkles in the warping NVIDIA logo or the deep wrinkles in the earlier Giants shot. Dot3 bump mapping requires the application of multiple layers of textures. The key textures are derived from the height map and the normal map; these help define the visible surface geometry. The height map is a simple grayscale image, where the shades of gray represent the relative height or indentation. The normal map is an RGB color map that defines the effect of light shining on the surface. Note that normal in this instance refers to a mathematical normal, which defines a vector direction.

[Figure: The height map is on the left; the normal map is on the right.]

The Pixel Shader architecture of the NVIDIA nfinitefx engine is capable of applying the required textures in a single pass, as sketched below.

[Figure: The final, bump-mapped surface.]
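The following is a minimal ps.1.1 sketch of how dot3 bump mapping and single-pass multitexturing might be combined, in the spirit of the technique described above. The particular choice of four textures (normal map, base texture, light map, gloss map) and the use of the interpolated colors v0 and v1 for the light vector and specular color are illustrative assumptions, not a listing from this paper.

    ps.1.1                          ; DirectX 8 pixel shader, version 1.1
    tex t0                          ; stage 0: normal map (derived from the height map)
    tex t1                          ; stage 1: base (diffuse) texture
    tex t2                          ; stage 2: light map
    tex t3                          ; stage 3: gloss map
    dp3_sat r0, t0_bx2, v0_bx2      ; per-pixel N.L (light vector interpolated in v0)
    mul r0, r0, t1                  ; apply the base texture
    mul r0, r0, t2                  ; apply the precomputed light map
    mad r0, t3, v1, r0              ; add a gloss-masked specular term (specular color in v1)

All four textures are applied in one pass. On fixed-function hardware limited to two texture-blend operations per pass, the same result would require multiple passes, with the extra geometry and z-buffer work described above.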

Pixel Shader Programmability

When people think of programs, they almost always think of high-level languages such as C++ or BASIC. However, Pixel Shader programs are much lower-level programs: assembly language programs. The native assembly language for the NVIDIA nfinitefx programmable Pixel Shaders is the one defined in the Microsoft DirectX 8 API.

[Figure: A simple cartoon effect created with programmable shaders.]

Developers can program Pixel Shaders to create personalized effects or shading algorithms. For example, Pixel Shaders could be programmed to provide a low-tech cartoon look (a minimal sketch of such a shader appears at the end of this paper) or a myriad of other stylized looks. The possible range of effects is limited only by each developer's imagination. The nfinitefx engine, coupled with the DirectX 8 or OpenGL drivers, gives the programmer an unprecedented level of control with unbelievable performance.

Conclusion

The NVIDIA nfinitefx Pixel Shader architecture represents an incredible breakthrough in 3D graphics hardware. Incorporating blazing-fast floating-point performance, total control at the pixel level, and programmable shaders, NVIDIA's nfinitefx engine brings advanced image quality to the PC desktop.
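As a closing illustration of the cartoon look mentioned under Pixel Shader Programmability, here is a minimal ps.1.1 sketch of a common toon-shading approach: a per-pixel lookup into a small 1D ramp texture that quantizes the lighting into a few flat bands. The ramp texture, indexed by an N.L value computed by the application or vertex shader, is an assumption for illustration; this paper does not specify how its cartoon demo was implemented.

    ps.1.1                          ; DirectX 8 pixel shader, version 1.1
    tex t0                          ; stage 0: 1D ramp texture indexed by N.L (a few flat bands)
    tex t1                          ; stage 1: base texture holding the character's flat colors
    mul r0, t0, t1                  ; banded lighting applied to the base colors gives the cartoon look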