Buffers. Angel and Shreiner: Interactive Computer Graphics 7E Addison-Wesley 2015


Buffers

Objectives
Introduce additional WebGL buffers
Reading and writing buffers
Buffers and images

Buffer
Define a buffer by its spatial resolution (n x m) and its depth (or precision) k, the number of bits per pixel

WebGL Frame Buffer
(figure: the buffers that make up the WebGL frame buffer)

Where are the Buffers?
HTML5 canvas: default front and back color buffers, under control of the local window system, physically on the graphics card
Depth buffer: also on the graphics card
Stencil buffer: holds masks
Most RGBA buffers use 8 bits per component; the latest are floating point (IEEE)

Other Buffers
Desktop OpenGL supported other buffers: auxiliary color buffers and an accumulation buffer; these were on the application side and are now deprecated
GPUs have their own or attached memory: texture buffers and off-screen buffers, not under control of the window system, which may be floating point

Images
Framebuffer contents are unformatted: usually RGB or RGBA, one byte per component, no compression
Standard Web image formats: JPEG, GIF, PNG
WebGL has no conversion functions but understands standard Web formats for texture images

The (Old) Pixel Pipeline
OpenGL has a separate pipeline for pixels
Writing pixels involves moving pixels from processor memory to the frame buffer, format conversions, mapping, lookups, and tests
Reading pixels involves format conversion

Packing and Unpacking
Pixel data may be compressed or uncompressed, indexed or RGB, and in little- or big-endian bit format
WebGL (and shader-based OpenGL) lacks most functions for packing and unpacking; use texture functions instead, and implement any desired functionality in fragment shaders

Deprecated Functionality
glDrawPixels
glCopyPixels
glBitmap

Buffer Reading
WebGL can read pixels from the framebuffer with gl.readPixels
Returns only 8-bit RGBA values
In general, the format of pixels in the frame buffer differs from that of processor memory, and these two types of memory reside in different places, so packing and unpacking are needed and reading can be slow
Drawing is done through texture functions and off-screen memory (frame buffer objects)

WebGL Pixel Function
gl.readPixels(x, y, width, height, format, type, myImage)
x, y: start pixel in the frame buffer
width, height: size of the block to read
format: type of pixels
type: type of image data
myImage: destination array in processor memory

var myImage = new Uint8Array(512*512*4);
gl.readPixels(0, 0, 512, 512, gl.RGBA, gl.UNSIGNED_BYTE, myImage);
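A minimal usage sketch (an assumption, not from the slides; canvas here is the HTML5 canvas element): reading back the whole canvas after drawing. With the default frame buffer the read should happen in the same callback as the draw unless the context was created with preserveDrawingBuffer set.

var pixels = new Uint8Array(canvas.width * canvas.height * 4);
gl.readPixels(0, 0, canvas.width, canvas.height, gl.RGBA, gl.UNSIGNED_BYTE, pixels);
// pixels[0..3] now hold the RGBA value of the lower-left pixel
console.log(pixels[0], pixels[1], pixels[2], pixels[3]);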

Render to Texture
GPUs now include a large amount of texture memory that we can write into
Advantage: fast (not under the control of the window system)
Using frame buffer objects (FBOs) we can render into texture memory instead of the frame buffer and then read from this memory
Applications: image processing, GPGPU
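A minimal render-to-texture sketch in WebGL; the names fbTexture and fbo are illustrative, not from the slides, and error checking is omitted.

var fbTexture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, fbTexture);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 512, 512, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);  // empty 512 x 512 texture
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

var fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, fbTexture, 0);

// draw here: the results go into fbTexture instead of the canvas
gl.bindFramebuffer(gl.FRAMEBUFFER, null);  // return to the default frame buffer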

BitBlt

Objectives
Introduce reading and writing of blocks of bits or bytes
Prepare for the later discussion of compositing and blending

Writing into Buffers
WebGL does not contain a function for writing bits into the frame buffer; use texture functions instead
We can use the fragment shader to do bit-level operations on graphics memory
Bit Block Transfer (BitBlt) operations act on blocks of bits with a single instruction

BitBlt
Conceptually, we can consider all of memory as a large two-dimensional array of pixels
We read and write rectangular blocks of pixels
The frame buffer is part of this memory
(figure: a source block in memory is written into the frame buffer, the destination)

Writing Model
Read the destination pixel before writing the source

Bit Writing Modes
Source and destination bits are combined bitwise
There are 16 possible functions (one per column in the table), e.g. replace, XOR, OR

XOR Mode
XOR is especially useful for swapping blocks of memory, such as menus that are stored off screen
If S represents the screen and M represents a menu, the sequence
  S <- S xor M
  M <- S xor M
  S <- S xor M
swaps S and M
The same strategy is used for rubber-band lines and cursors
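A quick JavaScript sketch of the same idea on two equally sized pixel blocks (names are illustrative, not from the slides):

function xorSwap(screen, menu) {   // two Uint8Array blocks of equal length
    for (var i = 0; i < screen.length; i++) {
        screen[i] ^= menu[i];      // S <- S xor M
        menu[i]   ^= screen[i];    // M <- M xor (S xor M) = S
        screen[i] ^= menu[i];      // S <- (S xor M) xor S = M
    }
}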

Cursor Movement
Consider what happens as we move a cursor across the display
We cover parts of objects
We must return them to their original colors when the cursor moves away

Rubber Band Line
Fix one point
Draw a line to the location of the cursor
We must restore the state of crossed objects when the line moves

Texture Mapping

Objectives
Introduce mapping methods: texture mapping, environment mapping, bump mapping
Consider basic strategies: forward vs. backward mapping, point sampling vs. area averaging

The Limits of Geometric Modeling
Although graphics cards can render over 10 million polygons per second, that number is insufficient for many phenomena: clouds, grass, terrain, skin

Modeling an Orange
Consider the problem of modeling an orange (the fruit)
Start with an orange-colored sphere: too simple
Replace the sphere with a more complex shape: still does not capture surface characteristics (small dimples), and it takes too many polygons to model all the dimples

Modeling an Orange (2)
Take a picture of a real orange, scan it, and paste it onto a simple geometric model
This process is known as texture mapping
It still might not be sufficient because the resulting surface will be smooth; we need to change the local shape: bump mapping

Three Types of Mapping
Texture mapping: uses images to fill the inside of polygons
Environment (reflection) mapping: uses a picture of the environment for texture maps; allows simulation of highly specular surfaces
Bump mapping: emulates altering normal vectors during the rendering process

Texture Mapping
(figure: a geometric model, and the same model texture mapped)

Environment Mapping

Bump Mapping

Where does mapping take place?
Mapping techniques are implemented at the end of the rendering pipeline
Very efficient, because few polygons make it past the clipper

Objectives
Basic mapping strategies
Forward vs. backward mapping
Point sampling vs. area averaging

Is it simple?
Although the idea is simple (map an image to a surface), there are 3 or 4 coordinate systems involved
(figure: a 2D image mapped onto a 3D surface)

Coordinate Systems
Parametric coordinates: may be used to model curves and surfaces
Texture coordinates: used to identify points in the image to be mapped
Object or world coordinates: conceptually, where the mapping takes place
Window coordinates: where the final image is really produced

Texture Mapping
(figure: the mapping pipeline from parametric coordinates to texture coordinates to world coordinates to window coordinates)

Mapping Functions
The basic problem is how to find the maps
Consider mapping from texture coordinates to a point on a surface
We appear to need three functions:
  x = x(s,t)
  y = y(s,t)
  z = z(s,t)
But we really want to go the other way

Backward Mapping
We really want to go backwards:
Given a pixel, we want to know to which point on an object it corresponds
Given a point on an object, we want to know to which point in the texture it corresponds
We need a map of the form
  s = s(x,y,z)
  t = t(x,y,z)
Such functions are difficult to find in general

Two-part mapping
One solution to the mapping problem is to first map the texture to a simple intermediate surface
Example: map to a cylinder

Cylindrical Mapping
Parametric cylinder:
  x = r cos 2πu
  y = r sin 2πu
  z = v/h
maps a rectangle in (u,v) space to a cylinder of radius r and height h in world coordinates
  s = u
  t = v
maps from texture space
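As a small illustration (not from the slides), the forward map and the corresponding backward map for a point known to lie on this cylinder might look like this in JavaScript, following the slide's parametrization:

// forward map: (u,v) -> point on a cylinder of radius r, with z = v/h
function cylinderPoint(u, v, r, h) {
    return [ r * Math.cos(2 * Math.PI * u),
             r * Math.sin(2 * Math.PI * u),
             v / h ];
}

// backward map: point (x,y,z) on that cylinder -> texture coordinates (s,t)
function cylinderTexCoord(x, y, z, h) {
    var s = Math.atan2(y, x) / (2 * Math.PI);  // angle around the axis, as a fraction of a turn
    if (s < 0) s += 1;                         // wrap into [0,1)
    var t = z * h;                             // invert z = v/h, with t = v
    return [ s, t ];
}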

Spherical Map
We can use a parametric sphere
  x = r cos 2πu
  y = r sin 2πu cos 2πv
  z = r sin 2πu sin 2πv
in a similar manner to the cylinder, but we have to decide where to put the distortion
Spheres are used in environment maps

Box Mapping
Easy to use with a simple orthographic projection
Also used in environment maps

Second Mapping
Map from the intermediate object to the actual object:
Normals from intermediate to actual
Normals from actual to intermediate
Vectors from the center of the intermediate object
(figure: actual and intermediate surfaces)

Aliasing
Point sampling of the texture can lead to aliasing errors
(figure: point samples in u,v (or x,y,z) space miss the blue stripes when mapped to point samples in texture space)

Area Averaging
A better but slower option is to use area averaging
(figure: a pixel and its preimage in texture space)
Note that the preimage of a pixel is curved

WebGL Texture Mapping I

Objectives
Introduce WebGL texture mapping:
two-dimensional texture maps
assigning texture coordinates
forming texture images

Basic Strategy
Three steps to applying a texture:
1. Specify the texture: read or generate an image, assign it to a texture, enable texturing
2. Assign texture coordinates to vertices: the proper mapping function is left to the application
3. Specify texture parameters: wrapping, filtering

Texture Mapping
(figure: geometry in x,y,z mapped to the display, with an image in s,t texture space)

Texture Example
The texture is a 256 x 256 image that has been mapped to a rectangular polygon viewed in perspective

Texture Mapping and the WebGL Pipeline
Images and geometry flow through separate pipelines that join during fragment processing, so complex textures do not affect geometric complexity
(figure: vertices enter the geometry pipeline, the image enters the texel pipeline, and both meet at the fragment processor)

Specifying a Texture Image
Define a texture image from an array of texels (texture elements) in CPU memory, or use an image in a standard format such as JPEG (a scanned image, or one generated by application code)
WebGL supports only two-dimensional texture maps; there is no need to enable texturing as in desktop OpenGL
Desktop OpenGL supports 1- to 4-dimensional texture maps

Define Image as a Texture
glTexImage2D(target, level, components, w, h, border, format, type, texels);
target: type of texture, e.g. GL_TEXTURE_2D
level: used for mipmapping (discussed later)
components: elements per texel
w, h: width and height of the image in pixels
border: used for smoothing (discussed later)
format and type: describe the texels
texels: pointer to the texel array

glTexImage2D(GL_TEXTURE_2D, 0, 3, 512, 512, 0, GL_RGB, GL_UNSIGNED_BYTE, my_texels);
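In WebGL the corresponding call takes an internal format (e.g. gl.RGB) in place of the component count, and the texels as a typed array; a sketch of the equivalent call, assuming my_texels is a Uint8Array:

gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, 512, 512, 0,
              gl.RGB, gl.UNSIGNED_BYTE, my_texels);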

Using a GIF Image

// specify the image in the JS file
var image = new Image();
image.onload = function() { configureTexture( image ); };
image.src = "SA2011_black.gif";

// or specify the image in the HTML file with an <img> tag
// <img id = "texImage" src = "SA2011_black.gif"></img>
var image = document.getElementById("texImage");
window.onload = function() { configureTexture( image ); };

Mapping a Texture
Based on parametric texture coordinates, specified as a 2D vertex attribute
(figure: a triangle a, b, c in texture space, whose corners lie at (0,0), (1,0), (0,1), (1,1), mapped to a triangle A, B, C in object space; the vertices carry texture coordinates (s, t) = (0.2, 0.8), (0.4, 0.2), and (0.8, 0.4))

Cube Example

var texCoord = [
    vec2(0, 0),
    vec2(0, 1),
    vec2(1, 1),
    vec2(1, 0)
];

function quad(a, b, c, d) {
    pointsArray.push(vertices[a]);
    colorsArray.push(vertexColors[a]);
    texCoordsArray.push(texCoord[0]);

    pointsArray.push(vertices[b]);
    colorsArray.push(vertexColors[a]);
    texCoordsArray.push(texCoord[1]);
    // etc.

Interpolation
WebGL uses interpolation to find the proper texels from the specified texture coordinates
There can be distortions
(figure: good selection of texture coordinates vs. poor selection, and a texture stretched over a trapezoid showing the effects of bilinear interpolation)

WebGL Texture Mapping II

Objectives
Introduce the WebGL texture functions and options:
texture objects
texture parameters
example code

Using Texture Objects
1. Specify textures in texture objects
2. Set the texture filter
3. Set the texture function
4. Set the texture wrap mode
5. Set the optional perspective correction hint
6. Bind the texture object
7. Enable texturing
8. Supply texture coordinates for the vertices (coordinates can also be generated)

Texture Parameters
WebGL has a variety of parameters that determine how a texture is applied:
Wrapping parameters determine what happens if s and t are outside the (0,1) range
Filter modes allow us to use area averaging instead of point samples
Mipmapping allows us to use textures at multiple resolutions
Environment parameters determine how texture mapping interacts with shading

Wrapping Mode
Clamping: if s,t > 1 use 1, if s,t < 0 use 0
Wrapping: use s,t modulo 1

gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.REPEAT);

(figure: the original texture, gl.REPEAT wrapping, and clamp-to-edge wrapping)

Magnification and Minification
More than one texel can cover a pixel (minification), or more than one pixel can cover a texel (magnification)
We can use point sampling (nearest texel) or linear filtering (a 2 x 2 filter) to obtain texture values
(figure: texture vs. polygon footprint for magnification and minification)

Filter Modes
Modes are set with gl.texParameteri(target, type, mode)

gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);

Mipmapped Textures
Mipmapping allows for prefiltered texture maps of decreasing resolution
Lessens interpolation errors for smaller textured objects
Declare the mipmap level during texture definition:
gl.texImage2D(gl.TEXTURE_2D, level, ...)
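For a texture already bound to gl.TEXTURE_2D, the full chain of levels is usually generated automatically (in WebGL 1 this requires power-of-two dimensions); a minimal sketch:

gl.generateMipmap(gl.TEXTURE_2D);  // build levels 1..n from level 0
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER,
                 gl.LINEAR_MIPMAP_LINEAR);  // trilinear minification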

Example
(figure comparing point sampling, linear filtering, mipmapped point sampling, and mipmapped linear filtering)

Applying Textures
A texture can be applied in many ways: the texture fully determines the color, is modulated with a computed color, or is blended with an environment color
The fixed-function pipeline has a function, glTexEnv, to set the mode; it is deprecated, and we can get all the desired functionality via the fragment shader
We can also use multiple texture units

Other Texture Features
Environment maps: start with an image of the environment through a wide-angle lens; it can be either a real scanned image or an image created in OpenGL; use this texture to generate a spherical map, or alternatively use a cube map
Multitexturing: apply a sequence of textures through cascaded texture units
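A minimal cube-map sketch (the six image objects posX through negZ are hypothetical placeholders assumed to be already loaded):

var cubeMap = gl.createTexture();
gl.bindTexture(gl.TEXTURE_CUBE_MAP, cubeMap);
gl.texImage2D(gl.TEXTURE_CUBE_MAP_POSITIVE_X, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, posX);
gl.texImage2D(gl.TEXTURE_CUBE_MAP_NEGATIVE_X, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, negX);
gl.texImage2D(gl.TEXTURE_CUBE_MAP_POSITIVE_Y, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, posY);
gl.texImage2D(gl.TEXTURE_CUBE_MAP_NEGATIVE_Y, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, negY);
gl.texImage2D(gl.TEXTURE_CUBE_MAP_POSITIVE_Z, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, posZ);
gl.texImage2D(gl.TEXTURE_CUBE_MAP_NEGATIVE_Z, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, negZ);
gl.texParameteri(gl.TEXTURE_CUBE_MAP, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
// in the fragment shader the map is declared as a samplerCube and sampled with textureCube(map, direction)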

Applying Textures
Textures are applied during fragment shading by a sampler
Samplers return a texture color from a texture object

varying vec4 color;         // color from rasterizer
varying vec2 texCoord;      // texture coordinate from rasterizer
uniform sampler2D texture;  // texture object from application

void main()
{
    gl_FragColor = color * texture2D(texture, texCoord);
}

Vertex Shader
Usually the vertex shader will output texture coordinates to be rasterized
It must do all the other standard tasks too: compute the vertex position, and compute the vertex color if needed

attribute vec4 vPosition;  // vertex position in object coordinates
attribute vec4 vColor;     // vertex color from application
attribute vec2 vTexCoord;  // texture coordinate from application
varying vec4 color;        // output color to be interpolated
varying vec2 texCoord;     // output tex coordinate to be interpolated
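The slide omits the shader body; a sketch of main() under the assumption that the model-view and projection matrices are passed in as uniforms (the uniform names are assumptions, not from the slide):

uniform mat4 modelViewMatrix;   // assumed uniform, not shown on the slide
uniform mat4 projectionMatrix;  // assumed uniform, not shown on the slide

void main()
{
    color = vColor;        // pass the color through to the rasterizer
    texCoord = vTexCoord;  // pass the texture coordinate through
    gl_Position = projectionMatrix * modelViewMatrix * vPosition;
}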

A Checkerboard Image

var image1 = new Uint8Array(4*texSize*texSize);
for (var i = 0; i < texSize; i++) {
    for (var j = 0; j < texSize; j++) {
        var patchX = Math.floor(i/(texSize/numChecks));
        var patchY = Math.floor(j/(texSize/numChecks));
        var c;
        if (patchX % 2 ^ patchY % 2) c = 255;
        else c = 0;
        // c = 255*(((i & 0x8) == 0) ^ ((j & 0x8) == 0));
        image1[4*i*texSize + 4*j]     = c;
        image1[4*i*texSize + 4*j + 1] = c;
        image1[4*i*texSize + 4*j + 2] = c;
        image1[4*i*texSize + 4*j + 3] = 255;
    }
}
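To hand this array to WebGL, the typed-array form of gl.texImage2D can be used (a sketch; texSize is assumed to be a power of two, e.g. 64):

gl.bindTexture(gl.TEXTURE_2D, gl.createTexture());
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, texSize, texSize, 0,
              gl.RGBA, gl.UNSIGNED_BYTE, image1);
gl.generateMipmap(gl.TEXTURE_2D);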

Cube Example (repeated)
(same texCoord array and quad() code as on the earlier Cube Example slide)

Texture Object

function configureTexture(image) {
    var texture = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, gl.RGB, gl.UNSIGNED_BYTE, image);
    gl.generateMipmap(gl.TEXTURE_2D);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST_MIPMAP_LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
    gl.activeTexture(gl.TEXTURE0);
    gl.uniform1i(gl.getUniformLocation(program, "texture"), 0);
}

Linking with Shaders

var vTexCoord = gl.getAttribLocation(program, "vTexCoord");
gl.enableVertexAttribArray(vTexCoord);
gl.vertexAttribPointer(vTexCoord, 2, gl.FLOAT, false, 0, 0);

// Set the value of the fragment shader texture sampler variable
// ("texture") to the appropriate texture unit. In this case,
// zero for gl.TEXTURE0, which was previously set by calling
// gl.activeTexture().
gl.uniform1i(gl.getUniformLocation(program, "texture"), 0);
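The gl.vertexAttribPointer call above assumes that a buffer holding texCoordsArray is currently bound to gl.ARRAY_BUFFER; a sketch of that setup (tBuffer is an illustrative name, and flatten is the helper from the book's MV.js utilities):

var tBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, tBuffer);
gl.bufferData(gl.ARRAY_BUFFER, flatten(texCoordsArray), gl.STATIC_DRAW);
// the attribute set up above now reads its data from this buffer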