CS 475 / CS 675 Computer Graphics. Lecture 11 : Texture


CS 475 / CS 675 Computer Graphics Lecture 11 : Texture

Texture
Add surface detail: paste a photograph over a surface to provide detail. Texture can change the surface colour or modulate the surface colour.
http://en.wikipedia.org/wiki/UV_mapping

Texture Mapping
Why use texture mapping to increase detail?

Expensive solution: add more detail to the model.
+ detail is incorporated as a part of the object model
- modeling tools aren't very good for adding detail
- the model takes longer to render
- the model takes up more space in memory
- complex detail cannot be reused
http://en.wikipedia.org/wiki/Microsoft_Flight_Simulator

Efficient solution: map a texture onto the model.
+ texture maps can be reused
+ texture maps take up space in memory, but can be shared, and compression and caching techniques can reduce the overhead significantly compared to real detail
+ texture mapping can be done quickly (we'll see how)
+ placement and creation of texture maps can be made intuitive (e.g., tools for adjusting the mapping, painting directly onto the object)
- texture maps do not affect the geometry of the object

Texture Mapping
What kind of detail can go into these maps?
- diffuse, ambient and specular colors
- specular exponents
- transparency, reflectivity
- fine detail
- surface normals (bumps)
- data to visualize
- projected lighting and shadows
Games use billboards for distant detail.
http://www.flipcode.com/archives/light_mapping_theory_and_implementation.shtml
Courtesy of Mark Harris, UNC Chapel Hill

Texture Mapping
What are mappings? A function is a mapping: functions map values from a domain into a co-domain, and each value in the domain is mapped to exactly one value in the co-domain. We can transform one space into another with a function. The viewing pipeline takes the WCS to the VCS to the CCS to the NDCS; these are all mappings.

What is texture mapping? We have points on a surface in object space (the domain). We want to get values from a texture map (the co-domain). What function(s) should we use?

Texture Mapping
Basic idea: texture mapping is the process of mapping a geometric point to a color in a texture map. We want to map arbitrary geometry to a pix(-cture)map of arbitrary dimension. We do this in two steps:
1. map a point on the arbitrary geometry to a point on an abstract unit square (the uv plane);
2. map a point on the abstract unit square to a point on the actual pixmap of arbitrary dimension.
[Figure: the unit uv square, with axes u and v running from (0,0) to (1,1); texture image: Van Gogh]
The second step is easier, so we present it first.

Texture Mapping
From a unit square to a pixmap. A 2D example: mapping from the unit u,v square to a texture map (an arbitrary pixmap, possibly with alpha values).
[Figure: the unit uv square (0,0)-(1,1) mapped onto a 100 x 100 texture image]
Step 1: transform a point on the abstract, continuous texture plane to a point in the discrete texture map.
Step 2: get the color at the transformed point in the texture image, e.g., (0.0, 0.0) => (0, 0), (1.0, 1.0) => (100, 100), (0.75, 0.45) => (75, 45).

Texture Mapping
In general, for any point (u, v) on the unit plane, the corresponding point on the texture image is (u * pixmap width, v * pixmap height). There are infinitely many points on the unit plane, so we will get sampling errors. The uv plane is a continuous version of the texture pixmap. The unit uv square now acts as a stretchable rubber sheet that is wrapped around the texture-mapped object.
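As a minimal sketch of this lookup (the Texture struct, its field names and the nearest-neighbour rounding are assumptions for illustration, not part of the lecture), the continuous-to-discrete step can look like:

    typedef struct { unsigned char r, g, b; } Color;

    /* Hypothetical texture: 'texels' is a row-major width x height array. */
    typedef struct { int width, height; const Color *texels; } Texture;

    /* Map a continuous (u, v) in [0,1]^2 to a discrete texel (nearest neighbour).
       Scaling by (size - 1) keeps u = 1 and v = 1 inside the image. */
    Color sample_nearest(const Texture *tex, double u, double v)
    {
        if (u < 0.0) u = 0.0; if (u > 1.0) u = 1.0;   /* clamp to the unit square */
        if (v < 0.0) v = 0.0; if (v > 1.0) v = 1.0;
        int x = (int)(u * (tex->width  - 1) + 0.5);
        int y = (int)(v * (tex->height - 1) + 0.5);
        return tex->texels[y * tex->width + x];
    }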

Texture Mapping Polygons: Interpolating Texture Coordinates
Pre-calculate texture coordinates for each vertex, e.g., as shown for the cuboid below. Vertices will have different texture coordinates on different faces. Interpolate uv coordinates linearly across triangles as part of Gouraud shading (a sketch follows).
[Figure: a cuboid face with corner texture coordinates (0,0), (1,0), (0,1), (1,1); a triangle with vertex coordinates (u1, v1), (u2, v2), (u3, v3) and an interior point (up, vp)]
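As a rough sketch (names are illustrative; the barycentric weights w1, w2, w3 would come from the rasterizer), the linear interpolation of per-vertex texture coordinates is just a weighted sum:

    /* Interpolate vertex texture coordinates (u1,v1), (u2,v2), (u3,v3) at a
       point inside the triangle with barycentric weights w1 + w2 + w3 = 1. */
    void interp_uv(double w1, double w2, double w3,
                   const double uv1[2], const double uv2[2], const double uv3[2],
                   double out_uv[2])
    {
        out_uv[0] = w1 * uv1[0] + w2 * uv2[0] + w3 * uv3[0];
        out_uv[1] = w1 * uv1[1] + w2 * uv2[1] + w3 * uv3[1];
    }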

Texture Mapping Polygons: Interpolating Texture Coordinates
Affine texture mapping will interpolate texture coordinates linearly across the screen:

    u_t = (1 - t) u_0 + t u_1,   0 <= t <= 1

This results in distorted textures. Perspective texture mapping accounts for the actual 3D positions of vertices. This is expensive. Instead of interpolating the texture coordinates directly, the coordinates are divided by their depth (relative to the viewer), and the reciprocal of the depth value is also interpolated and used to recover the perspective-correct coordinate:

    u_t = [ (1 - t) (u_0 / z_0) + t (u_1 / z_1) ] / [ (1 - t) (1 / z_0) + t (1 / z_1) ],   0 <= t <= 1
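A small sketch of both formulas for a single parameter t between two vertices (function names are illustrative; u0, u1 are the vertex texture coordinates and z0, z1 their depths):

    /* Affine (screen-space) interpolation: cheap, but distorts under perspective. */
    double interp_affine(double u0, double u1, double t)
    {
        return (1.0 - t) * u0 + t * u1;
    }

    /* Perspective-correct: interpolate u/z and 1/z, then divide to recover u. */
    double interp_perspective(double u0, double z0, double u1, double z1, double t)
    {
        double u_over_z   = (1.0 - t) * (u0 / z0) + t * (u1 / z1);
        double one_over_z = (1.0 - t) * (1.0 / z0) + t * (1.0 / z1);
        return u_over_z / one_over_z;
    }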

Texture Mapping Polygons: Interpolating Texture Coordinates
http://en.wikipedia.org/wiki/Texture_mapping

Texture Mapping Cylinders and Cones
Imagine a standard cylinder or cone as a stack of circles. Use the position of a point on the perimeter to determine u, and the height of the point in the stack to determine v. Map the top and bottom caps separately, as onto a plane.

Calculating v: the height of the point in object space, which ranges over [-0.5, 0.5], gets mapped to a v range of [0, 1]:

    v = P_y + 0.5

Calculating u: map points on the circular perimeter to u values between 0 and 1; 0 radians maps to 0 and 2*pi radians maps to 1, so u = theta / (2*pi). These mappings are arbitrary: any function mapping the angle theta around the cylinder to the range 0 to 1 will work.

Texture Mapping Cylinders and Cones
We want to convert a point P on the perimeter to an angle. theta is measured clockwise due to the right-handed coordinate system, but one still expects u to increase as we travel around the circle counter-clockwise, as shown by the coloured arrows in the original figure.
[Figure: circle in the xz plane; theta = 0 at u = 0.0, theta = -pi/2 at u = 0.25, theta = +/-pi at u = 0.5, theta = pi/2 at u = 0.75]

    theta = atan2(P_z, P_x)
    if theta <= 0:  u = -theta / (2*pi)       which covers [0.0, 0.5]
    if theta >  0:  u = 1 - theta / (2*pi)    which covers [0.5, 1.0]

The circle is not necessarily a unit circle, so use the arctangent instead of sine or cosine, and use atan2 to get the whole circle (atan gives only half a circle).
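Combining the two formulas, a sketch of the full cylinder-side mapping for a point P = (px, py, pz) on the canonical cylinder (the function name and the assumption that py is already in [-0.5, 0.5] are mine):

    #include <math.h>

    /* (u, v) for a point on the side of the canonical cylinder (py in [-0.5, 0.5]). */
    void cylinder_uv(double px, double py, double pz, double *u, double *v)
    {
        const double PI = 3.14159265358979323846;
        double theta = atan2(pz, px);        /* whole circle, radius-independent */
        if (theta <= 0.0)
            *u = -theta / (2.0 * PI);        /* covers [0.0, 0.5] */
        else
            *u = 1.0 - theta / (2.0 * PI);   /* covers [0.5, 1.0] */
        *v = py + 0.5;                       /* height in [-0.5, 0.5] -> v in [0, 1] */
    }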

Texture Mapping Spheres
A sphere is a stack of circles of varying radii. Calculate u as explained previously.

Calculating v: v is the latitude and varies from 0 to 1. For a point P on a sphere of radius r:

    phi = asin(P_y / r),   phi in [-pi/2, pi/2]
    v = phi / pi + 0.5

When v is 0 or 1 there is a singularity, and u should be set to some specific value. Do not check for this explicitly, because we will rarely see this value exactly within floating-point precision.
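A sketch of the latitude computation (function name is illustrative; u would be computed exactly as in the cylinder case):

    #include <math.h>

    /* v for a point with height py on a sphere of radius r:
       phi = asin(py / r) lies in [-pi/2, pi/2]; shift and scale it into [0, 1]. */
    double sphere_v(double py, double r)
    {
        const double PI = 3.14159265358979323846;
        double phi = asin(py / r);
        return phi / PI + 0.5;
    }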

Texture Mapping Complex Geometries
Texture mapping of simple polygons was easy. How do we texture map more complicated shapes, e.g., a house? How should we texture map it? We could texture map each polygonal face of the house (we already know how to do this, since the faces are planar), but this causes discontinuities at the edges of polygons. We want a smooth mapping without edge discontinuities. Intuitive approach: reduce to a solved problem and pretend the house is a sphere for texture-mapping purposes. (Texture mapping in ray tracing.)

Texture Mapping Complex Geometries
Intuitive approach: place a bounding sphere around the complicated shape and compute uv coordinates using spherical mapping.
[Figure: a house inside a bounding sphere, with a radial ray]

Texture Mapping Complex Geometries
Another method uses an inscribed sphere: assume the intersection point lies on some sphere and calculate its uv coordinates using spherical projection. Use the distance from the center of the object to the intersection point as the radius of the sphere. The sphere therefore changes with the point's location.
[Figure: two intersection points at different distances from the object's center, each on its own sphere]

Texture Mapping Complex Geometries
[Figures: cylindrical texture mapping; spherical texture mapping]
http://www.siggraph.org/education/materials/hypergraph/mapping/r_wolfe/r_wolfe_mapping_1.htm

Environment Mapping
Faking reflections: a cube (environment) map is made by stitching together six projective textures.
http://www.siggraph.org/education/materials/hypergraph/mapping/r_wolfe/r_wolfe_mapping_1.htm

Shadow Mapping
Faking shadows:
- Transform the camera to each directional light source.
- Render the scene from the light source's point of view (keeping the same far clipping plane), updating only the Z buffer.
- Read the Z buffer into a texture map (the shadow map).
- While rendering the scene from the original eye point, convert every pixel to light space. If, in light space, the distance from the pixel to the light is greater than the value in the shadow map, the pixel is in shadow.
There is a major aliasing problem because the size of the shadow map is limited. Implemented in hardware.
http://en.wikipedia.org/wiki/Shadow_mapping
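A schematic sketch of the per-pixel test (all names are assumptions; the transform into light space and the exact depth convention depend on the renderer, and the bias term is a common trick against self-shadowing, not something the slide specifies):

    /* Returns 1 if the point is in shadow, 0 otherwise.
       (light_x, light_y) are the point's coordinates in the light's texture
       space, light_depth its distance from the light, shadow_map the depths
       rendered from the light's point of view. */
    int in_shadow(double light_x, double light_y, double light_depth,
                  const float *shadow_map, int map_w, int map_h, double bias)
    {
        int sx = (int)(light_x * (map_w - 1));
        int sy = (int)(light_y * (map_h - 1));
        if (sx < 0 || sy < 0 || sx >= map_w || sy >= map_h)
            return 0;                           /* outside the map: treat as lit */
        double stored = shadow_map[sy * map_w + sx];
        return light_depth - bias > stored;     /* farther than what the light saw */
    }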

Texture Aliasing
[Figures: the reason for texture aliasing; solution 1; solution 2]
http://www.siggraph.org/education/materials/hypergraph/mapping/r_wolfe/r_wolfe_mapping_1.htm

Texture Aliasing This is expensive to do if averaging is done over large areas repeatedly. http://www.siggraph.org/education/materials/hypergraph/mapping/r_wolfe/r_wolfe_mapping_1.htm

Texture Aliasing
Mipmapping alleviates this problem by pre-calculating the average colours. In the new, smaller texture map, each pixel contains the average of four pixels from the original texture.
http://www.siggraph.org/education/materials/hypergraph/mapping/r_wolfe/r_wolfe_mapping_1.htm
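A minimal sketch of building one mip level this way (assumes even dimensions and a single 8-bit channel; names are illustrative):

    /* Build the next mip level: each destination texel is the average of a
       2x2 block of source texels. src is w x h, dst is (w/2) x (h/2). */
    void downsample_2x2(const unsigned char *src, int w, int h, unsigned char *dst)
    {
        for (int y = 0; y < h / 2; y++) {
            for (int x = 0; x < w / 2; x++) {
                int sum = src[(2*y)     * w + 2*x] + src[(2*y)     * w + 2*x + 1]
                        + src[(2*y + 1) * w + 2*x] + src[(2*y + 1) * w + 2*x + 1];
                dst[y * (w / 2) + x] = (unsigned char)(sum / 4);
            }
        }
    }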

Texture Aliasing http://www.siggraph.org/education/materials/hypergraph/mapping/r_wolfe/r_wolfe_mapping_1.htm

Surface Detail
- Ideal geometry.
- Adjust normals with a bump map.
- Approximate geometry with a displacement map.
- Replace surface normals with a normal map.

Bump Mapping
Use an array of values to perturb surface normals: calculate the gradient at every image pixel and add it to the normal.
[Figures: original surface, bump map, and bump-mapped surface; an object and the bump-mapped object]
http://en.wikipedia.org/wiki/Bump_mapping
http://freespace.virgin.net/hugo.elias/graphics/x_polybm.htm
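A rough sketch of that perturbation for one texel of a grayscale height map (everything here is an assumption for illustration: the gradient uses central differences with wrap-around, the normal is perturbed directly in its first two components, and proper tangent-space handling is omitted):

    #include <math.h>

    /* Perturb a unit normal n by the gradient of a height map at texel (x, y).
       'scale' controls the bump strength. */
    void bump_perturb(const float *height, int w, int h, int x, int y,
                      double scale, double n[3])
    {
        double du = 0.5 * (height[y * w + (x + 1) % w] - height[y * w + (x - 1 + w) % w]);
        double dv = 0.5 * (height[((y + 1) % h) * w + x] - height[((y - 1 + h) % h) * w + x]);
        n[0] -= scale * du;                     /* tilt the normal against the slope */
        n[1] -= scale * dv;
        double len = sqrt(n[0]*n[0] + n[1]*n[1] + n[2]*n[2]);
        n[0] /= len; n[1] /= len; n[2] /= len;  /* re-normalize */
    }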

Normal Mapping
Bump mapping uses a single-channel (grayscale) map to perturb the existing normals on the surface:
- treat the grayscale map as a height field
- perturb the surface normals by the gradient of the map
- lighting is calculated using the perturbed normal
Normal mapping uses a multichannel (RGB) map to completely replace the existing normals:
- the RGB values of each pixel correspond to the x, y, z components of the normal vector
- the x, y, z components of the mapped normals are usually in object space, but other spaces are sometimes used
- the level of detail of the output renders is limited by the resolution of the normal maps instead of by the number of polygons in the rendered meshes!
- runtime surface-normal determination is trivial (it's just a look-up; see the sketch below), making this technique particularly useful for real-time applications
Limitations:
- silhouettes do not reflect the added detail
- though runtime is easy, the initial creation of normal maps is not straightforward
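Since the runtime work really is just a look-up, here is a sketch of decoding one normal-map texel (an assumption for illustration; component ranges and spaces vary between tools):

    #include <math.h>

    /* Decode an 8-bit RGB normal-map texel into a unit normal:
       each channel in [0, 255] maps to a component in [-1, 1]. */
    void decode_normal(unsigned char r, unsigned char g, unsigned char b, double n[3])
    {
        n[0] = r / 255.0 * 2.0 - 1.0;
        n[1] = g / 255.0 * 2.0 - 1.0;
        n[2] = b / 255.0 * 2.0 - 1.0;
        double len = sqrt(n[0]*n[0] + n[1]*n[1] + n[2]*n[2]);
        if (len > 0.0) { n[0] /= len; n[1] /= len; n[2] /= len; }
    }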

Normal Mapping
A normal map can be used to fake additional geometry.
Image courtesy of ATI/AMD

Normal Mapping Images courtesy of www.unrealtechnology.com Unreal Engine 3 (Gears of War, Epic Games)

Normal Mapping
Creation of meaningful normal maps is not simple. Unlike bump maps, normal maps cannot simply be painted by hand, because the color in a normal map is constantly varying and the r, g, b values are spatially meaningful.
[Figures: a normal map for a face; a normal map for a brick wall]
http://developer.nvidia.com/object/real-time-normal-map-dxt-compression.html
www.bitmanagement.com
Normal maps must be generated by specialized software, such as Pixologic's ZBrush (www.pixologic.com). A high-resolution model is created by the artist and its normals are used to generate maps for lower-resolution versions of the model.

Displacement Mapping
The actual geometric positions of points on the surface are displaced along the surface normal according to the values in the texture.
http://www.xbitlabs.com/articles/video/display/matrox parhelia.html
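A minimal sketch of the displacement step for one vertex (the names and the 'scale' factor are assumptions for illustration):

    /* Displace a vertex p along its unit normal n by the value sampled from
       the displacement map; 'scale' converts texture values to object units. */
    void displace_vertex(double p[3], const double n[3], double height, double scale)
    {
        p[0] += n[0] * height * scale;
        p[1] += n[1] * height * scale;
        p[2] += n[2] * height * scale;
    }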