DirectX 11 First Elements


0. Project changes

Changed in Dx11Base.h:

void Update( float dt );
void Render( );

became

virtual void Update( float dt ) = 0;
virtual void Render( ) = 0;

New App3D.h:

#ifndef _APP_3D_H_
#define _APP_3D_H_

#include "Dx11Base.h"

class App3D : public Dx11Base
{
public:
    App3D( );
    virtual ~App3D( );

    bool LoadContent( );
    void UnloadContent( );

    void Update( float dt );
    void Render( );
};

#endif

New App3D.cpp:

#include "App3D.h"
#include <xnamath.h>

App3D::App3D( ) { }
App3D::~App3D( ) { }

void App3D::UnloadContent( ) { }

void App3D::Update( float dt ) { }

bool App3D::LoadContent( )
{
    return true;
}

void App3D::Render( )
{
    if( d3dContext_ == 0 )
        return;

    float clearColor[4] = { 0.0f, 0.0f, 0.25f, 1.0f };
    d3dContext_->ClearRenderTargetView( backBufferTarget_, clearColor );

    swapChain_->Present( 0, 0 );
}

Removed from Dx11Base.cpp:

void Dx11Base::Update( float dt )
{
}

void Dx11Base::Render( )
{
    if( d3dContext_ == 0 )
        return;

    float clearColor[4] = { 0.0f, 0.0f, 0.25f, 1.0f };
    d3dContext_->ClearRenderTargetView( backBufferTarget_, clearColor );

    swapChain_->Present( 0, 0 );
}

Changed in main.cpp: added #include "App3D.h" and replaced

Dx11Base app3d;

with

App3D app3d;

1. Textures

Map A map is often an image laid over a surface, but that is not its only use; there are many others.

Color map A texture mapped over a surface, usually to give the appearance of a more complex makeup than the surface really has.

Normal map Perturbs the surface normals, altering reflection and refraction angles. Specular map Masks where specular highlights appear.

Ambient occlusion map Stores how occluded each point is from ambient (global illumination) light, darkening the occluded places.

Light map Prerendered lights and shadows.

Shadow map Generated and used during real-time rendering to produce shadows.

Displacement map Used to deform the geometry.

Cubemap Used for reflections and environment lighting.

Alpha map Used for transparency.

2. Sprites

Sprite 2D graphical element that appears on the screen.

Billboard Sprite that is always facing the camera.

3. 3D Graphics Geometry

Vertex In game graphics, shapes are defined as collections of points connected by lines, whose interiors are shaded in by the graphics hardware. The points are referred to as vertices (the plural of vertex), and the lines are referred to as edges.

Triangle Every 3D model can be described as a group of contiguous polygons, and each polygon can in turn be composed of a group of contiguous triangles. Mesh A collection of triangles describing a 3D model or a part of one.

Basic Primitive Types Triangle data in 3D graphics, like the other primitive types, is stored as an array of vertices. This array can be interpreted in different ways: Point list Line list Line strip Triangle list Triangle strip Triangle fan

Point list A collection of vertices that are rendered as isolated points. Point lists can be used in 3D scenes for star fields or for dotted lines on the surface of a polygon.

Line list A list of isolated, straight line segments. Line lists are useful for tasks such as adding sleet or heavy rain to a 3D scene. Applications create a line list by filling an array of vertices.

Line strip A primitive composed of connected line segments. Line strips can be used to draw open shapes, such as polygon outlines that are not closed.

Triangle list A triangle list is a list of isolated triangles. They might or might not be near each other. A triangle list must have at least three vertices and the total number of vertices must be divisible by three.

Triangle strip A series of connected triangles. The first three vertices define the first triangle; the fourth vertex, together with the previous two, defines the second triangle, and so on.
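As a minimal sketch (not part of the original project code), the loop below lists which vertex indices form each triangle in a strip: a strip of n vertices encodes n - 2 triangles, and every other triangle has two indices swapped so all faces keep the same winding order.

#include <cstdio>

// Sketch: enumerate the triangles encoded in a triangle strip.
// A strip of vertexCount vertices describes vertexCount - 2 triangles.
void PrintStripTriangles( unsigned int vertexCount )
{
    for( unsigned int i = 0; i + 2 < vertexCount; ++i )
    {
        if( i % 2 == 0 )
            printf( "triangle %u: %u %u %u\n", i, i, i + 1, i + 2 );
        else
            printf( "triangle %u: %u %u %u\n", i, i + 1, i, i + 2 );  // swapped to keep winding
    }
}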

Triangle fan Similar to a triangle strip, except that all of the triangles share one common vertex.

Triangle fans are no longer supported Since Direct3D 10, the triangle fan is no longer one of the basic primitive topologies, because it is not cache friendly: drawing a fan requires continually jumping back to the first element in the list.

Indexed geometry Stores only unique vertices in the vertex array and keeps a second array of indices into that list to define which vertices make up each triangle.
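As a minimal sketch (the array names are illustrative, not from the slides), a quad made of two triangles needs only four unique vertices when indexed, instead of six:

#include <xnamath.h>

struct VertexPos
{
    XMFLOAT3 pos;
};

// Four unique vertices for a quad...
VertexPos quadVertices[] =
{
    { XMFLOAT3( -0.5f,  0.5f, 0.5f ) },   // 0: top left
    { XMFLOAT3(  0.5f,  0.5f, 0.5f ) },   // 1: top right
    { XMFLOAT3(  0.5f, -0.5f, 0.5f ) },   // 2: bottom right
    { XMFLOAT3( -0.5f, -0.5f, 0.5f ) }    // 3: bottom left
};

// ...and six indices describing its two triangles.
unsigned int quadIndices[] =
{
    0, 1, 2,    // first triangle
    0, 2, 3     // second triangle
};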

3D model

Lowpoly

3D model advanced techniques

Vertex properties Each vertex carries the information the shaders will need to produce an effect. The most common properties are: Position Color Normal Texture coordinates Bone weights and indices
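As an illustrative sketch (this struct is not part of the project code), a vertex carrying those common properties might look like:

#include <xnamath.h>

// Hypothetical vertex layout holding the common per-vertex properties.
struct VertexFull
{
    XMFLOAT3 position;          // object-space position
    XMFLOAT4 color;             // per-vertex color
    XMFLOAT3 normal;            // surface normal for lighting
    XMFLOAT2 texCoord;          // texture coordinates
    XMFLOAT4 boneWeights;       // skinning weights (up to four bones)
    unsigned int boneIndices[4]; // indices of the bones influencing this vertex
};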

Pixel data Per-pixel data is calculated using interpolation. The pixel shader receives interpolated data from the vertex shader (or from the geometry shader, if one is present). This includes positions, colors, texture coordinates and all other attributes provided by the previous shader stage.

Pixel data If it is necessary to specify per-pixel data explicitly, it is often provided in the form of texture images, such as a normal map.

4. Vertex Buffers

Vertex Buffers A buffer is a block of memory of a specific size; for example, a 40-byte buffer can store 10 integers (if each integer is 4 bytes). A vertex buffer is a Direct3D buffer of type ID3D11Buffer that is used to store all the vertex data for a mesh.

Vertex Buffers These buffers reside in an optimal location in memory, such as on the GPU, chosen by the device driver. Most advanced commercial 3D video games use techniques to determine beforehand what geometry is visible, and only submit to the graphics hardware the objects that are visible or potentially visible.

Vertex Buffer Creation

struct VertexPos
{
    XMFLOAT3 pos;
};

VertexPos vertices[] =
{
    XMFLOAT3(  0.5f,  0.5f, 0.5f ),
    XMFLOAT3(  0.5f, -0.5f, 0.5f ),
    XMFLOAT3( -0.5f, -0.5f, 0.5f )
};

Vertex Buffer Creation

D3D11_BUFFER_DESC vertexDesc;
ZeroMemory( &vertexDesc, sizeof( vertexDesc ) );
vertexDesc.Usage = D3D11_USAGE_DEFAULT;
vertexDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
vertexDesc.ByteWidth = sizeof( VertexPos ) * 3;

D3D11_SUBRESOURCE_DATA resourceData;
ZeroMemory( &resourceData, sizeof( resourceData ) );
resourceData.pSysMem = vertices;

ID3D11Buffer* vertexBuffer;
HRESULT result = d3dDevice_->CreateBuffer( &vertexDesc, &resourceData, &vertexBuffer );

Vertex Buffer Creation Buffer descriptor: describes the buffer details, since the same structure can be used to create other types of buffers besides vertex buffers. Subresource data: used to pass the vertex data into the buffer. CreateBuffer: the function that finally creates the buffer.

5. Input Layout

Input Layout A vertex buffer is just a chunk of data; we need to specify which attributes it contains, their ordering and their sizes. The input layout is used for this task, built from: D3D11_INPUT_ELEMENT_DESC

Input Layout

typedef struct D3D11_INPUT_ELEMENT_DESC
{
    LPCSTR SemanticName;
    UINT SemanticIndex;
    DXGI_FORMAT Format;
    UINT InputSlot;
    UINT AlignedByteOffset;
    D3D11_INPUT_CLASSIFICATION InputSlotClass;
    UINT InstanceDataStepRate;
} D3D11_INPUT_ELEMENT_DESC;

Input Layout SemanticName. Describes the purpose of the element: POSITION, COLOR,... SemanticIndex. A vertex can use multiple elements with the same semantic (texture coordinates, for example), distinguished by this index. Format. For example, DXGI_FORMAT_R32G32B32_FLOAT can be used for a position with three 4-byte (32-bit) floating-point axes.

Input Layout InputSlot. Which vertex buffer the element is found in. AlignedByteOffset. Starting byte offset into the vertex at which this element can be found. InputSlotClass. Describes whether the element is used per vertex or per instance (per object); instancing is used to draw multiple objects with a single draw call.

Input Layout InstanceDataStepRate. The number of instances to draw using the same per-instance data before advancing to the next element. Input layouts are created using:

HRESULT CreateInputLayout(
    const D3D11_INPUT_ELEMENT_DESC* pInputElementDescs,
    UINT NumElements,
    const void* pShaderBytecodeWithInputSignature,
    SIZE_T BytecodeLength,
    ID3D11InputLayout** ppInputLayout );

Input Layout creation pInputElementDescs and NumElements. The array of elements in the vertex layout and the number of elements in that array. pShaderBytecodeWithInputSignature. The compiled vertex shader code; the vertex shader's input signature must match our input layout. BytecodeLength. The size of the shader's bytecode. ppInputLayout. A pointer to the object that will be created by this function call.
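As a sketch for the position-only VertexPos vertex above (the names vsBuffer and inputLayout_ are assumed, not fixed by the slides), creating the input layout could look like:

D3D11_INPUT_ELEMENT_DESC solidColorLayout[] =
{
    // semantic, index, format, slot, offset, class, step rate
    { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0,
      D3D11_INPUT_PER_VERTEX_DATA, 0 }
};
UINT totalLayoutElements = ARRAYSIZE( solidColorLayout );

// vsBuffer is the blob returned by the vertex shader compilation.
HRESULT d3dResult = d3dDevice_->CreateInputLayout( solidColorLayout,
    totalLayoutElements, vsBuffer->GetBufferPointer( ),
    vsBuffer->GetBufferSize( ), &inputLayout_ );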

6. Basic Shaders

Basic shaders The shaders that are not optional are the vertex and pixel shaders:

ID3D11VertexShader* solidColorVS;
ID3D11PixelShader* solidColorPS;

The vertex shader is needed before creating the input layout, since the vertex shader's input signature must match the input layout.

Shader compilation Shaders are compiled code executed on the GPU, written in HLSL, and can be loaded and compiled with:

HRESULT D3DX11CompileFromFile(
    LPCTSTR pSrcFile,
    const D3D10_SHADER_MACRO* pDefines,
    LPD3D10INCLUDE pInclude,
    LPCSTR pFunctionName,
    LPCSTR pProfile,
    UINT Flags1,
    UINT Flags2,
    ID3DX11ThreadPump* pPump,
    ID3D10Blob** ppShader,
    ID3D10Blob** ppErrorMsgs,
    HRESULT* pHResult );
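As an illustrative sketch (the file name "SolidGreenColor.fx", entry point "VS_Main" and profile "vs_4_0" are assumptions, not fixed by the slides), compiling a vertex shader from a file might look like:

ID3D10Blob* vsBuffer = 0;
ID3D10Blob* errorBuffer = 0;

// Compile the VS_Main entry point as a shader model 4.0 vertex shader.
// The narrow string literal assumes a non-Unicode build; use L"..." otherwise.
HRESULT d3dResult = D3DX11CompileFromFile( "SolidGreenColor.fx", 0, 0,
    "VS_Main", "vs_4_0", 0, 0, 0, &vsBuffer, &errorBuffer, 0 );

if( FAILED( d3dResult ) )
{
    if( errorBuffer != 0 )
    {
        // The error blob holds the compiler messages as plain text.
        OutputDebugStringA( ( char* )errorBuffer->GetBufferPointer( ) );
        errorBuffer->Release( );
    }
}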

HLSL HLSL is the High Level Shading Language for DirectX, used to create C-like programmable shaders. It was introduced with DirectX 9 to set up the programmable 3D pipeline. The pipeline can be programmed using a combination of assembly instructions, HLSL instructions and fixed-function statements.

Vertex & Pixel shader creation With the compiled shader bytecode, a vertex shader can be created with:

HRESULT CreateVertexShader(
    const void* pShaderBytecode,
    SIZE_T BytecodeLength,
    ID3D11ClassLinkage* pClassLinkage,
    ID3D11VertexShader** ppVertexShader );

CreatePixelShader has the same signature (returning an ID3D11PixelShader instead).
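Continuing the compilation sketch above (vsBuffer and solidColorVS_ are assumed names), the compiled blob is passed straight to CreateVertexShader:

// Create the vertex shader object from the compiled bytecode.
HRESULT d3dResult = d3dDevice_->CreateVertexShader( vsBuffer->GetBufferPointer( ),
    vsBuffer->GetBufferSize( ), 0, &solidColorVS_ );

if( FAILED( d3dResult ) )
{
    if( vsBuffer )
        vsBuffer->Release( );
    return false;   // e.g. inside LoadContent, which returns bool
}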

Basic Vertex shader A basic example of a vertex shader:

float4 VS( float4 pos : POSITION ) : SV_POSITION
{
    return pos;
}

SV stands for system-value. System-value semantics mark values that are inputs/outputs between shader stages or are generated entirely by the GPU, and cannot be modified by the CPU.

Basic Pixel shader A basic example of a pixel shader:

float4 PS( float4 pos : SV_POSITION ) : SV_TARGET
{
    return float4( 0.0f, 1.0f, 0.0f, 1.0f );
}

Note that the input is the pixel's position (SV_POSITION) and the output is a color value written to the render target (SV_TARGET).

Pixel limitations Modern computer monitors are commonly raster displays, which means the screen is actually a two-dimensional grid of small dots called pixels. Each pixel contains a color independent of other pixels. When we render a triangle on the screen, we don't really render the triangle as one entity. Rather, we light up the group of pixels that are covered by the triangle's area.


7. Rendering

Input Assembler For rendering in Direct3D, we need to set up the input assembler, bind our shaders and other assets (such as textures), and draw each mesh. The input assembler is set up with: IASetInputLayout IASetVertexBuffers IASetPrimitiveTopology

IASetInputLayout Used to bind the vertex layout that we created when we called the device's CreateInputLayout. This is done each time we are about to render geometry that uses a specific input layout.

IASetVertexBuffers Used to set one or more vertex buffers:

void IASetVertexBuffers(
    UINT StartSlot,
    UINT NumBuffers,
    ID3D11Buffer* const* ppVertexBuffers,
    const UINT* pStrides,
    const UINT* pOffsets );

IASetPrimitiveTopology Used to tell Direct3D what type of geometry we are rendering, for example: D3D11_PRIMITIVE_TOPOLOGY_POINTLIST D3D11_PRIMITIVE_TOPOLOGY_LINELIST D3D11_PRIMITIVE_TOPOLOGY_LINESTRIP D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST D3D11_PRIMITIVE_TOPOLOGY_TRIANGLESTRIP
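A minimal input assembler setup for the triangle above might look like this sketch (inputLayout_ and vertexBuffer_ are assumed member names):

// Bind the vertex layout, the vertex buffer and the primitive topology.
unsigned int stride = sizeof( VertexPos );
unsigned int offset = 0;

d3dContext_->IASetInputLayout( inputLayout_ );
d3dContext_->IASetVertexBuffers( 0, 1, &vertexBuffer_, &stride, &offset );
d3dContext_->IASetPrimitiveTopology( D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST );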

Setting Shaders After setting up the input assembler we can set the shaders; this is done with the functions: VSSetShader PSSetShader

Draw Once we've set and bound all of the necessary data for our geometry, we call the Draw function:

void Draw( UINT VertexCount, UINT StartVertexLocation );

Present The last step is calling the swap chain's Present function, which presents the rendered image to the screen:

HRESULT Present( UINT SyncInterval, UINT Flags );
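Putting the pieces together, a Render function for the triangle could look roughly like this sketch (member names such as solidColorVS_, solidColorPS_, inputLayout_ and vertexBuffer_ are assumptions, not fixed by the slides):

void App3D::Render( )
{
    if( d3dContext_ == 0 )
        return;

    // Clear the back buffer.
    float clearColor[4] = { 0.0f, 0.0f, 0.25f, 1.0f };
    d3dContext_->ClearRenderTargetView( backBufferTarget_, clearColor );

    // Input assembler: layout, vertex buffer, topology.
    unsigned int stride = sizeof( VertexPos );
    unsigned int offset = 0;
    d3dContext_->IASetInputLayout( inputLayout_ );
    d3dContext_->IASetVertexBuffers( 0, 1, &vertexBuffer_, &stride, &offset );
    d3dContext_->IASetPrimitiveTopology( D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST );

    // Shaders.
    d3dContext_->VSSetShader( solidColorVS_, 0, 0 );
    d3dContext_->PSSetShader( solidColorPS_, 0, 0 );

    // Draw the three vertices and present the result.
    d3dContext_->Draw( 3, 0 );
    swapChain_->Present( 0, 0 );
}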