Computer Graphics with OpenGL ES (J. Han) Chapter XI Normal Mapping


Chapter XI Normal Mapping

Bumpy Surfaces
- Image texturing only: fast, but not realistic.
- Highly tessellated mesh: realistic, but slow.

Surface Normal and Lighting Recall the interaction among light sources, surfaces, and the camera/eye: why are some areas bright and others dark? Surface normals play a key role in lighting.

Normal Mapping Store the surface normals of the high-frequency surface in a texture called the normal map. At run time, use the normal map for lighting.

Height Field A popular method to represent a high-frequency surface is a height field. It is a function h(x,y) that returns a height (z value) for given (x,y) coordinates. The height field is sampled at a 2D array of regularly spaced (x,y) coordinates, and the height values are stored in a texture called the height map. The height map can be drawn in gray scale: if the height is in the range [0,255], the lowest height 0 is colored black, and the highest 255 is colored white.
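The height-field idea can be sketched in a few lines of C++. The struct and names below are illustrative, not from the book: an 8-bit gray-scale texel in [0,255] is rescaled to a world-space height in [0, maxHeight].

```cpp
#include <cstdint>
#include <vector>

// Minimal height-field sketch (illustrative names, not the book's code).
struct HeightMap {
    int width, height;                 // grid resolution
    std::vector<std::uint8_t> texels;  // row-major gray-scale samples in [0,255]

    // h(x, y): world-space height at integer grid coordinates.
    float heightAt(int x, int y, float maxHeight) const {
        return texels[y * width + x] / 255.0f * maxHeight;
    }
};
```

A real renderer would instead sample the height map through the texturing hardware with filtering; this sketch only shows the [0,255]-to-height mapping.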

Height Map for Terrain Rendering [figures only]

Normal Map Simple image-editing operations can create a gray-scale image, i.e., a height map (b), from an image texture (a). The next step, from the height map (b) to the normal map (c), is done automatically.

Normal Map (cont'd) Creation of a normal map from a height map; visualization of a normal map.
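The automatic step from height map to normal map can be sketched as follows, assuming a regular grid. The slopes along x and y are approximated with central differences, and the normal is the (normalized) cross product of the two tangent directions; `heightScale` is an assumed parameter controlling how strong the bumps appear.

```cpp
#include <algorithm>
#include <array>
#include <cmath>
#include <vector>

// Sketch: per-texel normal from a height map via central differences.
// (Illustrative code, not the book's; heightScale is an assumed parameter.)
std::array<float, 3> normalFromHeightMap(const std::vector<float>& h,
                                         int w, int rows, int x, int y,
                                         float heightScale)
{
    auto H = [&](int i, int j) { return h[j * w + i]; };
    // Central-difference slopes, clamped at the borders.
    float dx = (H(std::min(x + 1, w - 1), y) - H(std::max(x - 1, 0), y)) * 0.5f;
    float dy = (H(x, std::min(y + 1, rows - 1)) - H(x, std::max(y - 1, 0))) * 0.5f;
    // Cross product of the tangents (1, 0, s*dx) and (0, 1, s*dy).
    float nx = -heightScale * dx, ny = -heightScale * dy, nz = 1.0f;
    float len = std::sqrt(nx * nx + ny * ny + nz * nz);
    return {nx / len, ny / len, nz / len};  // unit normal; z is dominant
}
```

To store the result in an RGB normal map, each component in [-1,1] is typically remapped to [0,1] (hence the bluish look of normal-map visualizations, since z dominates).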

Normal Mapping (cont'd) The polygon mesh is rasterized, and the texture coordinates (s,t) are used to access the normal map. The normal at (s,t) is obtained by filtering the normal map. Recall the diffuse reflection term, max(n·l, 0) s_d ⊗ m_d. The normal n is fetched from the normal map; m_d is fetched from the image texture.
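The per-fragment diffuse computation above can be sketched in C++, with plain arrays standing in for shader vector types and the componentwise product modeling ⊗ (all names are illustrative):

```cpp
#include <algorithm>
#include <array>

// Sketch of the diffuse reflection term max(n.l, 0) s_d (x) m_d,
// where n would be fetched from the normal map and m_d from the image texture.
std::array<float, 3> diffuse(const std::array<float, 3>& n,   // from normal map
                             const std::array<float, 3>& l,   // unit light vector
                             const std::array<float, 3>& sd,  // light's diffuse color
                             const std::array<float, 3>& md)  // from image texture
{
    float nDotL = std::max(n[0]*l[0] + n[1]*l[1] + n[2]*l[2], 0.0f);
    return { nDotL * sd[0] * md[0], nDotL * sd[1] * md[1], nDotL * sd[2] * md[2] };
}
```

The max with 0 clamps surfaces facing away from the light to black, exactly as in ordinary per-fragment Phong lighting; normal mapping changes only where n comes from.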

Normal Mapping (cont'd) [figures only]

Tangent-space Normal Mapping Recall that texturing is described as wrapping a texture onto an object surface.

Tangent-space Normal Mapping (cont'd) Every point on a surface has a surface normal. The normals in the normal map should be considered perturbed instances of the surface normals. In the example, n(s_p, t_p) replaces n_p, and n(s_q, t_q) replaces n_q. A tricky problem arises: lighting assumes normals defined in the world space, but does the normal map contain world-space normals? The answer is no.

Tangent-space Normal Mapping (cont'd) For a surface point, consider a tangent space whose z-axis corresponds to the surface normal. Assuming that a tangent space is defined for each surface point to be normal-mapped, the normal fetched from the normal map is taken as being defined in the tangent space of that point, not in the world space. In this respect, it is called a tangent-space normal.

Tangent Space The basis of the tangent space is {T, B, N}:
- The vertex normal N is defined per vertex at the modeling stage.
- The tangent T needs to be computed.
- The binormal B needs to be computed.

Tangent Space (cont'd)

void ImlRenderer::ComputeTangent()
{
    vector<vec3> triTangents;

    // Compute a tangent for each triangle
    for (int i = 0; i < mIndexArray.size(); i += 3) {
        vec3 p0 = mVertexArray.at(mIndexArray.at(i)).pos;
        vec3 p1 = mVertexArray.at(mIndexArray.at(i + 1)).pos;
        vec3 p2 = mVertexArray.at(mIndexArray.at(i + 2)).pos;
        vec3 uv0 = vec3(mVertexArray.at(mIndexArray.at(i)).tex, 0);
        vec3 uv1 = vec3(mVertexArray.at(mIndexArray.at(i + 1)).tex, 0);
        vec3 uv2 = vec3(mVertexArray.at(mIndexArray.at(i + 2)).tex, 0);

        vec3 deltaPos1 = p1 - p0;
        vec3 deltaPos2 = p2 - p0;
        vec3 deltaUV1 = uv1 - uv0;
        vec3 deltaUV2 = uv2 - uv0;

        // Compute the tangent
        float r = 1.0f / (deltaUV1.x * deltaUV2.y - deltaUV1.y * deltaUV2.x);
        vec3 computedTangent = (deltaPos1 * deltaUV2.y - deltaPos2 * deltaUV1.y) * r;
        triTangents.push_back(computedTangent);
        triTangents.push_back(computedTangent);
        triTangents.push_back(computedTangent);
    }

    // Initialize mTangentArray
    for (int i = 0; i < mVertexArray.size(); ++i)
        mTangentArray.push_back(vec3(0));

    // Accumulate tangents by indices
    for (int i = 0; i < mIndexArray.size(); ++i)
        mTangentArray.at(mIndexArray.at(i)) = mTangentArray.at(mIndexArray.at(i)) + triTangents.at(i);
}
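The listing above accumulates per-triangle tangents into per-vertex sums. A common follow-up step, sketched below with illustrative helpers (not the book's code), makes each summed tangent unit length and orthogonal to the vertex normal via Gram-Schmidt, then derives the binormal as B = N x T so that {T, B, N} is orthonormal:

```cpp
#include <array>
#include <cmath>

using vec3f = std::array<float, 3>;

// Illustrative vector helpers (a real codebase would use its vec3 type).
vec3f sub(vec3f a, vec3f b) { return {a[0]-b[0], a[1]-b[1], a[2]-b[2]}; }
vec3f scale(vec3f a, float s) { return {a[0]*s, a[1]*s, a[2]*s}; }
float dot(vec3f a, vec3f b) { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }
vec3f cross(vec3f a, vec3f b) {
    return {a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0]};
}
vec3f normalize(vec3f a) { return scale(a, 1.0f / std::sqrt(dot(a, a))); }

// Finish the per-vertex basis: T is made orthonormal to N, B completes it.
void finishBasis(const vec3f& N, vec3f& T, vec3f& B)
{
    T = normalize(sub(T, scale(N, dot(N, T))));  // Gram-Schmidt: T perpendicular to N
    B = cross(N, T);                             // completes {T, B, N}
}
```

Because the accumulated tangent averages several triangles, it is generally neither unit length nor exactly perpendicular to N, which is why this orthonormalization step is needed before the basis is stored in the vertex array.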

Tangent-space Normal Mapping (cont'd) Consider the diffuse term of the Phong lighting model, max(n·l, 0) s_d ⊗ m_d. A light source is defined in the world space, and so is l. In contrast, n fetched from the normal map is defined in the tangent space. To resolve this inconsistency, the two vectors must be brought into the same space. Typically, the per-vertex TBN basis is pre-computed, stored in the vertex array, and passed to the fragment shader. The fragment shader then constructs a matrix that rotates the world-space light vector into the per-vertex tangent space.
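The rotation described above can be sketched as follows: for an orthonormal basis {T, B, N}, the matrix whose rows are T, B, and N maps a world-space vector into tangent space, so applying it to l yields a light vector that can be dotted directly with the tangent-space normal (names are illustrative):

```cpp
#include <array>

using vec3 = std::array<float, 3>;

// Sketch: rotate a world-space vector l into the tangent space spanned by
// the orthonormal basis {T, B, N}. The rows of the rotation matrix are the
// basis vectors, so the result is simply the three dot products.
vec3 toTangentSpace(const vec3& T, const vec3& B, const vec3& N, const vec3& l)
{
    auto dot = [](const vec3& a, const vec3& b) {
        return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
    };
    return { dot(T, l), dot(B, l), dot(N, l) };
}
```

In a GLSL ES fragment shader, the same idea is usually written as multiplying l by a mat3 built from the interpolated T, B, and N varyings.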

Tangent-space Normal Mapping (cont'd) [figures only]

Tangent-space Normal Mapping (cont'd) An Unreal Engine demo from 2005.