Texture Mapping. Computer Graphics, 2015, Lecture 9. Johan Nysjö, Centre for Image Analysis, Uppsala University


What we have rendered so far: Looks OK, but how do we add more details (and colors)?

Texture mapping allows us to add extra details without adding extra geometry (polygons). Figures: the original mesh, and the same mesh with Blinn-Phong shading plus diffuse, gloss, and normal-map textures. Model and texture source: http://www.turbosquid.com

Techniques covered in this lecture:
- Basic texture mapping
- Displacement mapping
- Bump and normal mapping
- Environment mapping
- Procedural textures
- Billboarding

Textures. In most real-time rendering applications (e.g., games), textures are image-based 2D arrays of data. They can be created from, e.g., photos or hand-drawn images. 1D and 3D textures are also used in some applications.

Texture elements. The elements in a texture are called texels (texture pixels) and can hold many different types of information, e.g.:
- Color (most common)
- Normals
- Opacity
- Intensity
- Gloss
- Height
- Ambient occlusion
Figure: different textures of a brick wall.

Texture coordinates. Texture coordinates range from 0 to 1 and are typically denoted s, t, and p, where s is used for 1D textures, (s,t) for 2D textures, and (s,t,p) for 3D textures. Figure: texture space for 2D texture maps.

Assigning 2D texture coordinates to 3D models. Each vertex in the rendered model must be assigned a texture coordinate. This can be done manually when defining simple geometric objects like planes, cubes, spheres, etc. For more complex models, texture coordinates can be assigned semi-automatically by unwrapping the mesh onto a 2D grid in a 3D modelling software like Blender; this is usually done by the 3D artist who created the model. It is also possible to use projector functions to generate texture coordinates automatically.

Mesh unwrapping Original mesh with diffuse shading Same mesh with diffuse texture Diffuse texture displayed on the unwrapped mesh

Projector functions Image source: http://www.realtimerendering.com

Specifying texture coordinates in OpenGL. Old fixed-function OpenGL: glMultiTexCoord sets the current texture coordinates for a vertex. Modern shader-based OpenGL: provide texture coordinates as vertex attributes, as sketched below.
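A minimal sketch of the modern approach, assuming a bound VBO with interleaved positions (3 floats) and texture coordinates (2 floats), and a shader that declares the attributes at locations 0 and 1 (this layout is an assumption, not from the slides):

// Interleaved vertex layout (assumed): x, y, z, s, t
GLsizei stride = 5 * sizeof(GLfloat);

// Position attribute at location 0 (assumed location)
glEnableVertexAttribArray(0);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, stride, (GLvoid*)0);

// Texture coordinate attribute at location 1 (assumed location)
glEnableVertexAttribArray(1);
glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, stride,
                      (GLvoid*)(3 * sizeof(GLfloat)));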

Mapping a 2D texture on a sphere using longitude/latitude as texcoords 2D texture Texture mapping

Multiple textures can be applied Sphere rendered with four different textures

Another multi-texture example Diffuse Gloss Normal

Loading texture images from file. OpenGL does not provide any function for loading images, but there are plenty of third-party image libraries that can do this for you:
- LodePNG (see example on next slide)
- libpng
- libjpeg
- FreeImage (supports most popular image formats)
- SOIL (reads BMP, JPEG, PNG, TGA, DDS, etc.)
- DevIL

Loading a PNG image with the LodePNG library. LodePNG consists of two files: lodepng.cpp and lodepng.h. Include these files in your project/code base, and load the PNG image as:

#include "lodepng.h"
...
std::string filename = "diffuse.png";
std::vector<unsigned char> data;
unsigned width, height;
unsigned error = lodepng::decode(data, width, height, filename);
if (error != 0) {
    std::cout << "Error: " << lodepng_error_text(error) << std::endl;
    exit(EXIT_FAILURE);
}
...

Creating an OpenGL 2D texture from the loaded image:

// Generate a name for the texture
GLuint texture;
glGenTextures(1, &texture);

// Bind the texture to the target GL_TEXTURE_2D
glBindTexture(GL_TEXTURE_2D, texture);

// Set sampler parameters
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);

// Upload the image data into the texture
glTexImage2D(GL_TEXTURE_2D,     // target
             0,                 // level of detail
             GL_RGBA8,          // internal format
             width, height,     // width, height
             0,                 // border
             GL_RGBA,           // external format
             GL_UNSIGNED_BYTE,  // type
             &(data[0]));       // pixels

// Unbind the texture
glBindTexture(GL_TEXTURE_2D, 0);

Activating the texture. At rendering, you need to select the active texture unit (GL_TEXTURE0, GL_TEXTURE1, etc.), bind your texture to this unit*, and assign the texture unit number to a uniform sampler variable in the shader. Example:

// Select texture unit 0 as active texture unit
glActiveTexture(GL_TEXTURE0);

// Bind your texture to this unit
glBindTexture(GL_TEXTURE_2D, texture);

// Assign the texture unit number to a uniform sampler variable
// (you can call this variable whatever you want)
program.setUniform1i("u_texture0", 0);

// Draw...

*If you have multiple textures, you can bind the first texture to GL_TEXTURE0, the second to GL_TEXTURE1, and so on.

Accessing the 2D texture in a fragment shader. Pass the texture coordinate of each vertex as a varying vec2 variable (e.g., v_texcoord) from the vertex shader to the fragment shader. In the fragment shader, declare the uniform sampler variable (texture unit) as

uniform sampler2D u_texture0;

and access the 2D texture as

vec4 color = texture(u_texture0, v_texcoord);
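Putting the pieces together, a minimal GLSL 3.30 shader pair for basic texturing might look as follows (a sketch: u_texture0 and v_texcoord come from the slides, while the remaining names, the #version, and the u_mvp uniform are assumptions):

// Vertex shader
#version 330
layout(location = 0) in vec3 a_position;
layout(location = 1) in vec2 a_texcoord;
uniform mat4 u_mvp;  // model-view-projection matrix (assumed uniform)
out vec2 v_texcoord;

void main() {
    v_texcoord = a_texcoord;
    gl_Position = u_mvp * vec4(a_position, 1.0);
}

// Fragment shader
#version 330
uniform sampler2D u_texture0;
in vec2 v_texcoord;
out vec4 frag_color;

void main() {
    frag_color = texture(u_texture0, v_texcoord);
}

Note that modern GLSL uses in/out pairs instead of the older varying keyword.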

Links to some useful documentation: glGenTextures, glBindTexture, glTexParameter, glTexImage1D, glTexImage2D, glTexImage3D, glActiveTexture, glDeleteTextures

Texture coordinate range. Texture coordinates range from 0.0 to 1.0. What happens if we go outside this range? Figure: a quad with corner texture coordinates (0,0), (1,0), (0,1), and (1,1).

Corresponder functions (wrapping). OpenGL uses so-called corresponder functions to determine how the image should be wrapped when the texture coordinates lie outside the [0,1] range:
- GL_CLAMP_TO_EDGE
- GL_CLAMP_TO_BORDER
- GL_REPEAT
- GL_MIRRORED_REPEAT

GL_CLAMP_TO_EDGE

GL_CLAMP_TO_BORDER

GL_REPEAT

GL_MIRRORED_REPEAT

Texture filtering. The resolution of the texture map rarely matches the resolution of the viewport: we can have many texels within one pixel (minification) or many pixels representing one texel (magnification). OpenGL handles this via filtering operations:
- GL_NEAREST
- GL_LINEAR
- GL_NEAREST_MIPMAP_NEAREST, GL_NEAREST_MIPMAP_LINEAR, GL_LINEAR_MIPMAP_NEAREST, GL_LINEAR_MIPMAP_LINEAR

Nearest-neighbor (GL_NEAREST) filtering Original image Magnification

Bilinear (GL_LINEAR) filtering Original image Magnification

Bicubic filtering* Original image Magnification *Not available in OpenGL, but can be implemented in fragment shaders

Texture aliasing artifacts. The mismatch between texture resolution and viewport resolution can lead to visible aliasing artifacts ("jaggies"). Figures: nearest sampling vs. linear filtering (better, but still aliasing). Image source: Edward Angel

Antialiasing and mipmapping. Aliasing artifacts can be efficiently suppressed with a technique called mipmapping: the original texture is filtered down repeatedly into smaller images, often called a mipmap chain. During rendering, OpenGL selects the mipmap level that best matches the viewport resolution. Figure: mipmap chain from level 0 (full resolution) to level N. Image source: http://www.realtimerendering.com

Mipmap linear filtering Linear filtering Mipmap linear filtering (less aliasing!) Image source: Edward Angel

Mipmaps. Can be generated automatically with the function

void glGenerateMipmap(GLenum target);

where target is one of the symbolic constants GL_TEXTURE_2D or GL_TEXTURE_CUBE_MAP. The resulting texture will be about 33% larger than the original texture. You should almost always use mipmapping! It improves the visual quality. See the documentation for more information.
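For example, enabling trilinear (mipmap linear) filtering for the 2D texture created earlier could look like this (a sketch; it assumes the texture is still bound and its image data has been uploaded):

// Use the mipmap chain for minification
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);

// Generate the mipmap chain for the currently bound texture
glGenerateMipmap(GL_TEXTURE_2D);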

Displacement mapping. New geometry (vertices) is added and then displaced (in the normal direction) using a map containing height values. Main drawback: it is expensive to store and render the extra geometry! Figures: original surface vs. displacement mapping. Image source: Wikipedia

Bump mapping. Uses a 2D grayscale texture containing height values to perturb the surface normals of the rendered object. More efficient than displacement mapping, since no extra geometry (vertices) is needed. The eye is made to believe that the surface is bumpy (has many details), but the contour is unchanged. Introduced by Blinn in 1978.

Bump mapping Bump map Sphere rendered with this bump map Image source: http://www.realtimerendering.com

Bump mapping vs. displacement mapping Displacement mapping Image source: Wikipedia

Bump mapping vs. displacement mapping Bump mapping Displacement mapping Image source: Wikipedia

Normal mapping. Similar to bump mapping, but uses a 2D RGB texture containing normal vectors (a so-called normal map) to perturb the surface normals of the rendered object. A common use of normal mapping is to generate ("bake") a normal map of a high-polygon model and apply this normal map to a low-polygon version of the same model. This allows us to reduce the number of polygons without reducing the amount of detail.

Normal map example The RGB colors used in the normal map show the orientation of the normals Normal map of a brick wall

Normal mapping The normal is mapped to the surface

Applying the brick normal map on a sphere Without normal mapping With normal mapping

Normal mapping case study. The left bunny contains 69566 triangles, the right one 1392 triangles. How can we make the low-polygon bunny look like the high-polygon bunny (without adding vertices)? Figures: high-poly model and low-poly model.

Unwrapping the lowpoly model Seams marked by the user Unwrapped mesh

Baking a normal map of the highpoly bunny Highpoly model and lowpoly model (overlaid) Normal map of the highpoly model 2D texture

Applying the normal map on the lowpoly bunny (without shading) Lowpoly model with normal map

Applying the normal map on the lowpoly bunny (with shading) Lowpoly model with normal map and shading

Looks almost the same as the highpoly model! Original highpoly model 69566 triangles Lowpoly model with normal map 1392 triangles

Bump vs. normal mapping:

Bump mapping:
- Gray-value map containing height values
- Smaller size on disk
- Normals are computed on-the-fly in the shader
- Somewhat slower rendering (in general)

Normal mapping:
- RGB map containing normal vectors
- Larger size on disk
- Normals are already precomputed
- Somewhat faster rendering (in general)

Tangent space Which direction should we displace the surface normals in when we apply bump or normal mapping in 3D? Not obvious... We need to define a local coordinate system for each vertex and displace the normal in that coordinate system The coordinate system used for normal/bump mapping is called tangent space

Tangent space Defined by three orthogonal unit vectors: the surface normal n, the binormal b, and the tangent t. See Chapter 7.10 in the course book for more details on how to define the tangent space Image source: http://www.realtimerendering.com
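As an illustration, a fragment-shader sketch of tangent-space normal mapping that builds the tangent-to-world matrix from the interpolated basis vectors (all variable and sampler names below are assumptions; the normal map is assumed to store components remapped to [0,1]):

uniform sampler2D u_normalmap;            // assumed sampler name
in vec2 v_texcoord;
in vec3 v_normal, v_tangent, v_binormal;  // interpolated per-vertex basis

vec3 perturbedNormal() {
    // Fetch the tangent-space normal and remap from [0,1] to [-1,1]
    vec3 n_ts = texture(u_normalmap, v_texcoord).rgb * 2.0 - 1.0;
    // Columns t, b, n transform tangent-space vectors to the
    // space in which v_normal is expressed
    mat3 TBN = mat3(normalize(v_tangent),
                    normalize(v_binormal),
                    normalize(v_normal));
    return normalize(TBN * n_ts);
}

The returned normal is then used in place of the interpolated surface normal in the shading computation.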

Environment mapping An image-based lighting (IBL) technique for approximating global illumination effects such as reflection and refraction The incoming light from the environment is stored in textures and mapped onto the reflecting/refracting object

Cube environment mapping A cube map is a type of texture that stores the incoming light from the environment in the six sides of a cube

Cube environment mapping. Find where a perfect reflection hits the cube map. This can be done by calculating a reflection vector R from the surface normal N and the (normalized) incident vector* I:

R = I - 2(N·I)N

Use the reflection vector as texture coordinate to fetch the incoming light from the cube map texture.

*A vector that points from the camera to the reflecting point on the surface
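In GLSL this is available as the built-in reflect() function; a fragment-shader sketch (the sampler and varying names are assumptions):

uniform samplerCube u_cubemap;  // assumed sampler name
in vec3 v_normal;               // surface normal
in vec3 v_view;                 // vector from the camera to the surface point
out vec4 frag_color;

void main() {
    vec3 R = reflect(normalize(v_view), normalize(v_normal));
    frag_color = texture(u_cubemap, R);
}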

Example: a cube map with its six faces (+X, -X, +Y, -Y, +Z, -Z) unfolded. Texture courtesy: Humus

Reflection

Refraction
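Refraction can be sketched analogously with GLSL's built-in refract() function, which also takes the ratio of refraction indices; the value 1.0/1.5 below assumes an air-to-glass interface:

// Same inputs as in the reflection sketch above (assumed names)
vec3 T = refract(normalize(v_view), normalize(v_normal), 1.0 / 1.5);
frag_color = texture(u_cubemap, T);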

Pre-filtered cube maps. Figures: glossy surface vs. rough surface.

Physically-based rendering Textures and model: Andrew Maximov

Procedural textures Procedural textures can be generated on-the-fly on the GPU by evaluating a function that describes the pattern of interest Unlike image-based textures, procedural textures require no storage and can be rendered at arbitrary resolution Other advantages: easy to create animated patterns and avoid ugly seams

Perlin noise. A method invented by Ken Perlin for producing textures from structured chaos. Perlin noise has been used extensively for rendering natural-looking objects with noisy or fractal patterns, e.g., terrain, vegetation, fur, clouds, water, smoke, and fire. Image source: http://www.realtimerendering.com

Perlin noise. Figures: 2D Perlin noise with one octave vs. five octaves.
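Octaves are typically combined as a fractal sum, doubling the frequency and halving the amplitude in each step. A GLSL sketch, assuming a 2D noise function snoise(vec2) such as those in the repositories linked later in the lecture:

// Fractal sum of five octaves (snoise() is assumed to be provided)
float fbm(vec2 p) {
    float value = 0.0;
    float amplitude = 0.5;
    for (int i = 0; i < 5; ++i) {
        value += amplitude * snoise(p);
        p *= 2.0;          // double the frequency each octave
        amplitude *= 0.5;  // halve the amplitude each octave
    }
    return value;
}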

Demo: Vertex displacement with animated 3D Perlin noise

Worley noise Introduced by Steven Worley in 1996 Produces a distance map to irregularly positioned feature points The distance map can be used to create patterns that look like cells, scales, blobs, etc Worley noise generates a different class of patterns than Perlin noise, but has similar applications

Worley noise 2D Worley noise 2D Worley noise, thresholded

Procedural noise textures in GLSL GLSL implementations of Perlin and Worley noise can be obtained from the following Git repositories: Link 1 Link 2 (includes a few demos) Just include the noise functions in your vertex or fragment shader

Generating noise textures. Two approaches:
1. Compute the noise on the CPU and store the result in a 1D, 2D, or 3D texture, which is then uploaded to the GPU memory and used as a conventional texture.
2. Compute the noise on-the-fly on the GPU in a vertex or fragment shader. Possible to do in real time on modern GPUs.

Skyboxes. Used for rendering backgrounds in 3D scenes. Basic idea: place the viewer inside a large cube and use, e.g., cube mapping to project 2D background images onto the cube's faces. This creates an illusion of a 3D surrounding. Skyboxes can be stationary or move along with the viewer. Skydomes are similar, but are constructed from spheres or hemispheres.
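A skybox fragment shader can sample the cube map directly with the direction from the cube center to the fragment; a sketch (the variable names are assumptions):

uniform samplerCube u_skybox;  // assumed sampler name
in vec3 v_position;            // interpolated cube-local vertex position
out vec4 frag_color;

void main() {
    // The direction from the cube center selects the texel
    frag_color = texture(u_skybox, normalize(v_position));
}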

Skybox texture example

Alpha mapping and billboarding Instead of representing objects such as trees or grass as solid surfaces, we can use a set of billboards (quads with semi-transparent textures) to create an illusion of 3D objects Image source: http://www.realtimerendering.com
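A screen-aligned billboard quad can be constructed on the CPU from the camera's right and up vectors, which are the first two rows of the view matrix's rotation part. A sketch assuming GLM (the helper name and parameters are hypothetical):

#include <glm/glm.hpp>

// Compute the four corners of a screen-aligned quad of half-size 'size'
// centered at 'center', given the current view matrix
void billboardCorners(const glm::mat4& view, const glm::vec3& center,
                      float size, glm::vec3 corners[4])
{
    // GLM is column-major, so row i of the rotation part is
    // (view[0][i], view[1][i], view[2][i])
    glm::vec3 right(view[0][0], view[1][0], view[2][0]);
    glm::vec3 up(view[0][1], view[1][1], view[2][1]);
    corners[0] = center + size * (-right - up);
    corners[1] = center + size * ( right - up);
    corners[2] = center + size * ( right + up);
    corners[3] = center + size * (-right + up);
}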

Impostor rendering. An impostor of, e.g., a sphere can be created by rendering an image of the sphere as a screen- or world-aligned billboard. Thousands or millions of such billboards can be rendered at little cost. Figures: impostors vs. real spheres. Image source: http://www.realtimerendering.com

Particle systems. Used for modeling realistic visual effects such as smoke, fire, water, dust, explosions, cloth, etc. Very common in games. Basic idea:
- Define a set of initial attributes (position, color, opacity, size, etc.) for each particle
- In each frame, update the particle simulation on the CPU/GPU and render the particles as semi-transparent textured quads (a minimal update sketch follows below)
Image source: http://www.realtimerendering.com
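A minimal CPU-side sketch of the per-frame update (the attribute set and the gravity constant are assumptions):

#include <vector>
#include <glm/glm.hpp>

struct Particle {
    glm::vec3 position;
    glm::vec3 velocity;
    glm::vec4 color;  // RGBA; alpha gives the opacity
    float size;
    float life;       // remaining lifetime in seconds
};

// Advance the simulation one time step; expired particles would
// typically be respawned or removed by an emitter (not shown)
void updateParticles(std::vector<Particle>& particles, float dt)
{
    const glm::vec3 gravity(0.0f, -9.82f, 0.0f);  // assumed force
    for (Particle& p : particles) {
        p.velocity += gravity * dt;
        p.position += p.velocity * dt;
        p.life -= dt;
    }
}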

Particle systems, example Image source: Wikipedia

Volume rendering The 3D texture mapping capabilities of modern GPUs enable efficient visualization of volumetric (3D) images One of the most common volume rendering techniques is GPU-accelerated ray-casting I'll talk more about this topic tomorrow