CS380: Introduction to Computer Graphics
Texture Mapping (Chapter 15)
Min H. Kim
KAIST School of Computing
18/05/03

SUMMARY

Materials

[Figure: light blob (specular highlight) on PVC plastic]

Recall: Given any vector w (not necessarily of unit norm) and a unit normal vector n, we can compute the bounce vector (mirror reflection) of w as

B(w) = 2(w . n)n - w

Diffuse reflectance model

Diffuse materials, like rough wood, appear equally bright when observed from all directions. Thus, when calculating the color of a point on a diffuse surface, we do not need to use the v vector at all:

f_r = const. = rho / pi

(Szymon Rusinkiewicz, Princeton)

Blinn-Phong reflectance model

A simple BRDF describing specular reflection, as modeled in OpenGL. Here n is the normal and h is halfway between l and v:

f_r = rho / pi + k_s (n . h)^alpha

(Szymon Rusinkiewicz, Princeton)
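As a concrete reference, here is a minimal GLSL fragment-shader sketch of this model. It is only a sketch: the names uLight, vNormal and vPosition, and the material constants, are assumptions rather than the course's exact code.

#version 130

uniform vec3 uLight;   // light position in eye coordinates (assumed name)

in vec3 vNormal;       // interpolated surface normal, eye coordinates
in vec4 vPosition;     // surface position, eye coordinates

out vec4 fragColor;

void main() {
  vec3 n = normalize(vNormal);
  vec3 l = normalize(uLight - vec3(vPosition)); // direction to light
  vec3 v = normalize(vec3(-vPosition));         // direction to eye
  vec3 h = normalize(l + v);                    // halfway vector

  float diffuse  = max(dot(n, l), 0.0);             // the rho/pi term (scaled)
  float specular = pow(max(dot(n, h), 0.0), 64.0);  // k_s (n.h)^alpha, alpha = 64 assumed

  vec3 color = vec3(0.6) * diffuse + vec3(0.3) * specular;
  fragColor = vec4(color, 1.0);
}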

Chapter 15: TEXTURE MAPPING

Texture mapping

We have already seen and used texture mapping. In basic texturing, we simply glue part of an image onto a triangle by specifying texture coordinates at its three vertices.

A bunch of OpenGL calls are needed to load a texture and set its various parameters (linear/constant filtering, mipmapping, wrapping rules). A uniform variable is used to point to the desired texture unit.

Varying variables are used to store texture coordinates. In this simplest incarnation, we just fetch r,g,b values from the texture and send them directly to the frame buffer.

Alternatively, the texture data could be interpreted as, say, the diffuse material color of the surface point, which would then be fed into the diffuse material computation described earlier.

Texture mapping (main.cpp)

initGLState():

glActiveTexture(GL_TEXTURE0);
glGenTextures(1, &h_texture);
glBindTexture(GL_TEXTURE_2D, h_texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
int twidth, theight;
packed_pixel_t *pixdata = ppmread("reachup.ppm", &twidth, &theight);
assert(pixdata);
glTexImage2D(GL_TEXTURE_2D, 0, GL_SRGB, twidth, theight, 0,
             GL_RGB, GL_UNSIGNED_BYTE, pixdata);
free(pixdata);
...

initShaders():

h_texUnit0 = safe_glGetUniformLocation(h_program, "uTexUnit0");
h_aTexCoord = safe_glGetAttribLocation(h_program, "aTexCoord");

display():

safe_glUniform1i(h_texUnit0, 0);

Texture locations: (0,0) maps to the lower left of the image, (1,1) to the upper right:

GLfloat sqTex[12] = {
  0, 0,
  1, 1,
  1, 0,
  0, 0,
  0, 1,
  1, 1
};

Min H. Kim (KAIST); Foundations of 3D Computer Graphics, S. Gortler, MIT Press, 2012

Texture mapping

Vertex shader:

#version 130

uniform float uVertexScale;
uniform mat4 uProjMatrix;
uniform mat4 uModelViewMatrix;

in vec2 aVertex;
in vec2 aTexCoord;
in vec3 aColor;

out vec3 vColor;
out vec2 vTexCoord;

void main() {
  gl_Position = uProjMatrix * uModelViewMatrix
              * vec4(aVertex.x * uVertexScale, aVertex.y, 0.0, 1.0);
  vColor = aColor;
  vTexCoord = aTexCoord;
}

Fragment shader:

#version 130

uniform sampler2D uTexUnit0;  // texture data

in vec2 vTexCoord;            // texture coordinates

out vec4 fragColor;

void main() {
  vec4 texColor0 = texture2D(uTexUnit0, vTexCoord);
  fragColor = texColor0;
}
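Where the texel is treated as the diffuse material color (as mentioned above), the fragment shader can combine the fetch with the diffuse computation. A minimal sketch, assuming the names uLight, vNormal and vPosition, which are not in the course's code above:

#version 130

uniform sampler2D uTexUnit0;
uniform vec3 uLight;       // light position, eye coordinates (assumed name)

in vec2 vTexCoord;
in vec3 vNormal;
in vec4 vPosition;

out vec4 fragColor;

void main() {
  vec3 n = normalize(vNormal);
  vec3 l = normalize(uLight - vec3(vPosition));
  float diffuse = max(dot(n, l), 0.0);
  vec3 rho = texture2D(uTexUnit0, vTexCoord).rgb;  // texel as diffuse albedo
  fragColor = vec4(rho * diffuse, 1.0);
}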

Normal mapping

The data from a texture can also be interpreted in more interesting ways. In normal mapping, the r,g,b values from a texture are interpreted as the three coordinates of the normal at the point. This normal data can then be used as part of some material simulation.

[Figure: original mesh, 4M triangles; simplified mesh, 500 triangles; simplified mesh with normal mapping, 500 triangles]

Normal data consists of the three coordinates of a unit vector, each in the range [-1, 1], while an RGB texture stores three values, each in the range [0, 1] (i.e., 0..255). So we need some conversion:

R = normal_x/2.0 + 0.5;
normal_x = 2*R - 1;

Sometimes we also need to deal with coordinate-system conversions (see the sketch below).
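A minimal fragment-shader sketch of this decoding step; the names uNormalTex and uLight are assumptions, and the decoded normal is assumed to already be in the same frame as the light vector:

#version 130

uniform sampler2D uNormalTex;   // normal map (assumed name)
uniform vec3 uLight;            // light position, eye coordinates (assumed)

in vec2 vTexCoord;
in vec4 vPosition;

out vec4 fragColor;

void main() {
  // fetch [0,1] color data and remap each channel to a [-1,1] coordinate
  vec3 n = normalize(texture2D(uNormalTex, vTexCoord).rgb * 2.0 - 1.0);
  // use the decoded normal in some material computation, e.g. diffuse
  vec3 l = normalize(uLight - vec3(vPosition));
  fragColor = vec4(vec3(max(dot(n, l), 0.0)), 1.0);
}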

Environment cube maps

Textures can also be used to model the environment in the distance around the object being rendered. In this case, we typically use six square textures representing the faces of a large cube surrounding the scene.

[Figure: unfolded cube map with faces +y, -x, -z, +x, +z, -y]
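A hedged sketch of how the six faces might be loaded in OpenGL; the handle h_cubemap and the face file names are assumptions, and ppmread is the course's loader from main.cpp above:

GLuint h_cubemap;

glActiveTexture(GL_TEXTURE0);
glGenTextures(1, &h_cubemap);
glBindTexture(GL_TEXTURE_CUBE_MAP, h_cubemap);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// The six face targets have consecutive enum values: +x, -x, +y, -y, +z, -z.
const char *faces[6] = { "posx.ppm", "negx.ppm", "posy.ppm",
                         "negy.ppm", "posz.ppm", "negz.ppm" };
for (int i = 0; i < 6; ++i) {
  int w, h;
  packed_pixel_t *pix = ppmread(faces[i], &w, &h);
  glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X + i, 0, GL_SRGB,
               w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, pix);
  free(pix);
}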

Capturing environment maps

Photographing a light probe (a mirrored ball) produces an environment map representing incident radiance from all directions, again via the bounce relation

B(w) = 2(w . n)n - w

How do we remove the camera (and photographer) from the environment map?

[Figure: sphere map vs. cube map]

Spherical environment map and cube map: the Galileo's Tomb light probe, captured as an equirectangular (EQ) image (http://www.pauldebevec.com/probes/) and as a six-face cube map.

Environment cube maps

Each texture pixel represents the color as seen along one direction in the environment. This is called a cube map. GLSL provides a cube-texture data type, samplerCube, specifically for this purpose.

During the shading of a point, we can treat the material at that point as a perfect mirror and fetch the environment data from the appropriate incoming direction.

Environment map shader

We calculated B(v) in the previous lecture. This bounced vector points towards the environment direction that would be observed in a mirrored surface. By looking up the cube map using this direction, we give the surface the appearance of a mirror:

B(w) = 2(w . n)n - w

Fragment shader:

#version 130

uniform samplerCube uTexUnit0;

in vec3 vNormal;
in vec4 vPosition;

out vec4 fragColor;

vec3 reflect(vec3 w, vec3 n) {
  return n * (dot(w, n) * 2.0) - w;  // bounce vector
}

void main() {
  vec3 normal = normalize(vNormal);
  vec3 reflected = reflect(normalize(vec3(-vPosition)), normal);
  vec4 texColor0 = textureCube(uTexUnit0, reflected);
  fragColor = vec4(texColor0.r, texColor0.g, texColor0.b, 1.0);
}

Environment map shader

-vPosition represents the view vector v. textureCube is a special GLSL function that takes a direction vector and returns the color stored at this direction in the cube texture map. Here we assume the cube map is represented in eye coordinates; otherwise frame changes may be needed.

The same machinery can also be used for refraction.
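For instance, here is a hedged variant of the fragment shader above that uses GLSL's built-in refract instead of the mirror bounce; the index-of-refraction ratio 1.0/1.5 is an illustrative assumption:

#version 130

uniform samplerCube uTexUnit0;

in vec3 vNormal;
in vec4 vPosition;

out vec4 fragColor;

void main() {
  vec3 n = normalize(vNormal);
  vec3 incident = normalize(vec3(vPosition));   // from eye towards surface
  // built-in refract() applies Snell's law; eta is the ratio of refractive indices
  vec3 transmitted = refract(incident, n, 1.0 / 1.5);
  fragColor = vec4(textureCube(uTexUnit0, transmitted).rgb, 1.0);
}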

Projector texture mapping (camera mapping)

There are times when we wish to glue our texture onto our triangles using a projector model, instead of the affine gluing model. For example, we may wish to simulate a slide projector illuminating some triangles in space.

[Figure: slide projector illuminating a scene]

RECAP: projection

[Figure: point p projected towards the eye at [0,0,0,1]^t, with axes y_e and z_e]

Canonical square space: x_n = -x_e/z_e, y_n = -y_e/z_e

$$\begin{bmatrix} x_c \\ y_c \\ w_c \end{bmatrix} = \begin{bmatrix} w_n x_n \\ w_n y_n \\ w_n \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & -1 & 0 \end{bmatrix} \begin{bmatrix} x_e \\ y_e \\ z_e \\ 1 \end{bmatrix}$$

Projector texture mapping

The slide projector (a camera) is modeled using 4-by-4 modelview and projection matrices, M_s and P_s:

$$\begin{bmatrix} x_t w_t \\ y_t w_t \\ z_t w_t \\ w_t \end{bmatrix} = P_s M_s \begin{bmatrix} x_o \\ y_o \\ z_o \\ 1 \end{bmatrix}$$

Projector texture mapping

With the texture coordinates defined as x_t = (x_t w_t)/w_t and y_t = (y_t w_t)/w_t, to color a point on a triangle with object coordinates [x_o, y_o, z_o, 1]^t, we fetch the texture data stored at location [x_t, y_t]^t.

The three quantities x_t w_t, y_t w_t and w_t are all affine functions of (x_o, y_o, z_o). Thus these quantities will be properly interpolated over a triangle when implemented as varying variables. In the fragment shader, we need to divide by w_t to obtain the actual texture coordinates.

When doing projector texture mapping, we do not need to pass any texture coordinates of triangles as attribute variables to our vertex shader.

Projector texture mapping

We simply use the object coordinates already available to us. We do need to pass in the necessary projector matrices (modelview/projection), using uniform variables.

Projector vertex shader:

#version 130

uniform mat4 uModelViewMatrix;
uniform mat4 uProjMatrix;
uniform mat4 uSProjMatrix;
uniform mat4 uSModelViewMatrix;

in vec4 aVertex;

out vec4 vTexCoord;

void main() {
  vTexCoord = uSProjMatrix * uSModelViewMatrix * aVertex;
  gl_Position = uProjMatrix * uModelViewMatrix * aVertex;
}

Projector texture mapping

Projector fragment shader:

#version 130

uniform sampler2D uTexUnit0;

in vec4 vTexCoord;

out vec4 fragColor;

void main() {
  vec2 tex2;
  tex2.x = vTexCoord.x / vTexCoord.w;
  tex2.y = vTexCoord.y / vTexCoord.w;
  vec4 texColor0 = texture2D(uTexUnit0, tex2);
  fragColor = texColor0;
}

Conveniently, OpenGL even gives us a special call, texture2DProj(uTexUnit0, vTexCoord), that does the divide for us. Inconveniently, when designing our slide projector matrix uSProjMatrix, we have to deal with the fact that the canonical texture image domain in OpenGL is the unit square, whose lower left and upper right corners have coordinates [0,0]^t and [1,1]^t, rather than the canonical square with corners [-1,-1]^t and [1,1]^t used for the display window.
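One common way to handle this, sketched here as an assumption rather than the course's exact solution, is to fold a scale-and-bias matrix mapping [-1,1] to [0,1] into the projector matrix, either on the CPU or in the vertex shader:

// Remap the projector's canonical square [-1,1]^2 to texture space [0,1]^2.
// GLSL mat4 constructors are column-major: this is scale(0.5) then translate(0.5).
const mat4 biasMatrix = mat4(0.5, 0.0, 0.0, 0.0,
                             0.0, 0.5, 0.0, 0.0,
                             0.0, 0.0, 0.5, 0.0,
                             0.5, 0.5, 0.5, 1.0);

// In the projector vertex shader one would then write:
// vTexCoord = biasMatrix * uSProjMatrix * uSModelViewMatrix * aVertex;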

Multipass rendering

More interesting rendering effects can be obtained using multiple rendering passes over the geometry in the scene. In this approach, the results of all but the final pass are stored offscreen and not drawn to the screen. To do this, the data is rendered into something called a framebuffer object, or FBO. After rendering, the FBO data is then loaded as a texture, and thus can be used as input data in the next rendering pass.

[Figure: 1st pass rendering (cube map); 2nd pass rendering (mirror ball)]
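A minimal sketch of FBO creation in OpenGL; the handle names and the 512x512 size are assumptions, and the course's code may differ:

GLuint fbo, colorTex, depthRb;

glGenTextures(1, &colorTex);              // texture that will receive the pass
glBindTexture(GL_TEXTURE_2D, colorTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 512, 512, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);

glGenRenderbuffers(1, &depthRb);          // depth buffer so z-buffering works
glBindRenderbuffer(GL_RENDERBUFFER, depthRb);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, 512, 512);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, colorTex, 0);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                          GL_RENDERBUFFER, depthRb);

// ... draw the first pass here, then rebind the default framebuffer
glBindFramebuffer(GL_FRAMEBUFFER, 0);
// and bind colorTex as an input texture in the next pass.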

[Figure: another multipass example; 1st pass rendering (cube map), 2nd pass rendering (mirror ball)]

Shadow mapping

The idea is to first create and store a z-buffered image from the point of view of the light, and then compare what we see in our view to what the light saw in its view.

Shadow mapping

If a point observed by the eye is not observed by the light, then there must be some occluding object in between, and we should draw that point as if it were in shadow.

[Figure: 1st pass rendering, from the light's point of view]

In a first pass, we render into an FBO the scene as observed from some camera whose origin coincides with the position of the point light source. Let us model this camera transform as

$$\begin{bmatrix} x_t w_t \\ y_t w_t \\ z_t w_t \\ w_t \end{bmatrix} = P_s M_s \begin{bmatrix} x_o \\ y_o \\ z_o \\ 1 \end{bmatrix}$$

for appropriate matrices P_s and M_s.

Shadow mapping

During this first pass, we render the scene to an FBO using M_s as the modelview matrix and P_s as the projection matrix. In the FBO (the light's view), we store not the color of the point but rather its z_t value. Due to z-buffering, the data stored at a pixel in the FBO represents the z_t value of the geometry closest to the light along the relevant line of sight. This FBO is then transferred to a texture.

During the second rendering pass, we render our desired image from the eye's point of view, but for each pixel (in the fragment shader) we compare the depth of the observed point as seen from the light against the depth stored in the light's depth map, to check whether the point we are observing was also observed by the light, or whether it was blocked by something closer in the light's view.

Shadow mapping

To do this, we use the same computation that was done in projector texture mapping. Doing so, in the fragment shader, we can obtain the varying variables x_t, y_t and z_t associated with the point [x_o, y_o, z_o, 1]^t. We then compare this z_t value (from the projector math) with the z_t value (of the light's view) stored at [x_t, y_t]^t in the light's texture.

[Figure: 2nd pass rendering, from the eye's point of view]

If these values agree, then we are looking at a point that was also seen by the light; such a point is not in shadow and should be shaded accordingly in the fragment shader. Conversely, if these values disagree, then the point we are looking at was occluded in the light's image; it is in shadow and should be shaded as such.
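Putting the pieces together, a hedged fragment-shader sketch of this depth comparison; the names uShadowTex and vShadowCoord, and the small depth bias, are assumptions rather than the course's exact code:

#version 130

uniform sampler2D uShadowTex;  // light-view depth stored in the first pass (assumed)

in vec4 vShadowCoord;          // uSProjMatrix * uSModelViewMatrix * aVertex,
                               // remapped to [0,1] as in projector texturing

out vec4 fragColor;

void main() {
  // divide by w_t to get [x_t, y_t, z_t]
  vec3 shadowCoord = vShadowCoord.xyz / vShadowCoord.w;
  float lightDepth = texture2D(uShadowTex, shadowCoord.xy).r;

  // a small bias avoids self-shadowing from numerical error (assumed value)
  float visibility = (shadowCoord.z - 0.005 <= lightDepth) ? 1.0 : 0.3;

  fragColor = vec4(vec3(visibility), 1.0);  // real code would apply the BRDF here
}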