Introduction to Computer Graphics with WebGL
Ed Angel

Lighting in WebGL

WebGL Lighting
The application must specify
- Normals
- Material properties
- Lights
State-based shading functions (glNormal, glMaterial, glLight) have been deprecated
- Replace them with vertex attributes and uniforms sent to the shaders
Lighting can be computed in the application or in the shaders

Normalization
Cosine terms in lighting calculations can be computed using dot products
Unit-length vectors simplify the calculation
Usually we want to set the magnitudes to unit length, but
- Length can be affected by transformations
- Rotation and translation preserve length (rigid-body transformations)
- Scaling does not preserve length
GLSL has a normalize function
Normal for a Triangle
The plane through the triangle satisfies n · (p - p0) = 0
The normal is given by the cross product n = (p2 - p0) × (p1 - p0)
Normalize: n ← n / |n|
Note that the right-hand rule determines the outward face

Adding Normals for Quads

function quad(a, b, c, d) {
    var t1 = subtract(vertices[b], vertices[a]);
    var t2 = subtract(vertices[c], vertices[b]);
    var normal = normalize(vec3(cross(t1, t2)));
    pointsArray.push(vertices[a]);
    normalsArray.push(normal);
    ...

Specifying a Point Light Source
For each light source, we can set an RGBA value for the diffuse, specular, and ambient components, and a position

var diffuse0 = vec4(1.0, 0.0, 0.0, 1.0);
var ambient0 = vec4(1.0, 0.0, 0.0, 1.0);
var specular0 = vec4(1.0, 0.0, 0.0, 1.0);
var light0_pos = vec4(1.0, 2.0, 3.0, 1.0);
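The cross-product construction above can be sketched in plain JavaScript. The helpers below are minimal stand-ins for the MV.js functions used in these notes (subtract, cross, normalize), not MV.js itself:

```javascript
// Minimal stand-ins for the MV.js vector helpers used in quad()
function subtract(u, v) { return [u[0]-v[0], u[1]-v[1], u[2]-v[2]]; }
function cross(u, v) {
  return [u[1]*v[2]-u[2]*v[1], u[2]*v[0]-u[0]*v[2], u[0]*v[1]-u[1]*v[0]];
}
function normalize(v) {
  var len = Math.sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
  return [v[0]/len, v[1]/len, v[2]/len];
}

// n = (p2 - p0) x (p1 - p0), scaled to unit length
function triangleNormal(p0, p1, p2) {
  return normalize(cross(subtract(p2, p0), subtract(p1, p0)));
}

// A triangle in the z = 0 plane; the vertex order (right-hand rule)
// determines which side the normal points toward
var n = triangleNormal([0, 0, 0], [0, 1, 0], [1, 0, 0]);
console.log(n); // → [0, 0, 1]
```

Swapping p1 and p2 flips the sign of the normal, which is why consistent winding order matters when defining faces.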
Distance and Direction
The source colors are specified in RGBA
The position is given in homogeneous coordinates
- If w = 1.0, we are specifying a finite location
- If w = 0.0, we are specifying a parallel source with the given direction vector
The distance attenuation term is usually quadratic, 1/(a + b*d + c*d*d), where d is the distance from the point being rendered to the light source

Moving Light Sources
Light sources are geometric objects whose positions or directions are affected by the model-view matrix
Depending on where we place the position (direction) setting function, we can
- Move the light source(s) with the object(s)
- Fix the object(s) and move the light source(s)
- Fix the light source(s) and move the object(s)
- Move the light source(s) and object(s) independently

Light Properties

var lightPosition = vec4(1.0, 1.0, 1.0, 0.0);
var lightAmbient = vec4(0.2, 0.2, 0.2, 1.0);
var lightDiffuse = vec4(1.0, 1.0, 1.0, 1.0);
var lightSpecular = vec4(1.0, 1.0, 1.0, 1.0);
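The quadratic attenuation term can be written as a small helper. The coefficient names a, b, c follow the formula above; the sample values are illustrative only:

```javascript
// Distance attenuation 1/(a + b*d + c*d*d) for a point light
function attenuation(d, a, b, c) {
  return 1.0 / (a + b * d + c * d * d);
}

// With a = 1, b = c = 0 there is no falloff at all
console.log(attenuation(5.0, 1.0, 0.0, 0.0)); // → 1
// With a quadratic coefficient, intensity falls off with distance:
// 1/(1 + 2*2) = 0.2
console.log(attenuation(2.0, 1.0, 0.0, 1.0)); // → 0.2
```

In a shader this would typically multiply the diffuse and specular terms, while the constant a prevents division by zero at the light's own position.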
Material Properties
Material properties should match the terms in the lighting model
- Reflectivities
- The w component gives opacity

var materialAmbient = vec4(1.0, 0.0, 1.0, 1.0);
var materialDiffuse = vec4(1.0, 0.8, 0.0, 1.0);
var materialSpecular = vec4(1.0, 0.8, 0.0, 1.0);
var materialShininess = 100.0;

Using MV.js for Products

var ambientProduct = mult(lightAmbient, materialAmbient);
var diffuseProduct = mult(lightDiffuse, materialDiffuse);
var specularProduct = mult(lightSpecular, materialSpecular);

gl.uniform4fv(gl.getUniformLocation(program, "ambientProduct"), flatten(ambientProduct));
gl.uniform4fv(gl.getUniformLocation(program, "diffuseProduct"), flatten(diffuseProduct));
gl.uniform4fv(gl.getUniformLocation(program, "specularProduct"), flatten(specularProduct));
gl.uniform4fv(gl.getUniformLocation(program, "lightPosition"), flatten(lightPosition));
gl.uniform1f(gl.getUniformLocation(program, "shininess"), materialShininess);

Front and Back Faces
Every face has a front and a back
For many objects, we never see the back face, so we don't care how, or if, it is rendered
If it matters, we can handle it in the shader
[figure: back faces not visible vs. back faces visible]
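For same-length vectors, the mult used here forms the componentwise product of the light and material vectors. A stand-alone sketch of that product computation (this stand-in is not MV.js itself):

```javascript
// Componentwise product of two RGBA vectors, as used when
// forming ambient/diffuse/specular products for the shaders
function mult(u, v) {
  return u.map(function(x, i) { return x * v[i]; });
}

var lightAmbient = [0.2, 0.2, 0.2, 1.0];
var materialAmbient = [1.0, 0.0, 1.0, 1.0];

// Each channel of the light is scaled by the material reflectivity
var ambientProduct = mult(lightAmbient, materialAmbient);
console.log(ambientProduct); // → [0.2, 0, 0.2, 1]
```

Computing these products once in the application saves a per-vertex (or per-fragment) multiply in the shaders.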
Transparency
Material properties are specified as RGBA values
The A value can be used to make the surface translucent
The default is that all surfaces are opaque
Later we will enable blending and use this feature
However, with the HTML5 canvas, A < 1 will mute the colors
- This allows blending other elements onto the HTML5 canvas
Introduction to Computer Graphics with WebGL
Ed Angel

Polygonal Shading

Polygonal Shading
In per-vertex shading, shading calculations are done for each vertex
- Vertex colors become vertex shades and can be sent to the vertex shader as a vertex attribute
- Alternately, we can send the parameters to the vertex shader and have it compute the shade
By default, vertex shades are interpolated across a triangle if passed to the fragment shader as a varying variable (smooth shading)
We can also use uniform variables to shade each triangle with a single shade (flat shading)

Polygon Normals
A triangle has a single normal
- Shades at the vertices as computed by the modified Phong model can be almost the same
- They are identical for a distant viewer (the default) or if there is no specular component
Consider a polygonal model of a sphere
- We want different normals at each vertex, even though this concept is not quite correct mathematically
Smooth Shading
We can set a new normal at each vertex
Easy for the sphere model
- If the sphere is centered at the origin, n = p
Now smooth shading works
Note the silhouette edge

Mesh Shading
The previous example is not general, because we knew the normal at each vertex analytically
For polygonal models, Gouraud proposed using the average of the normals around a mesh vertex
  n = (n1 + n2 + n3 + n4) / |n1 + n2 + n3 + n4|

Gouraud and Phong Shading
Gouraud shading
- Find the average normal at each vertex (vertex normals)
- Apply the modified Phong model at each vertex
- Interpolate vertex shades across each polygon
Phong shading
- Find vertex normals
- Interpolate vertex normals across edges
- Interpolate edge normals across the polygon
- Apply the modified Phong model at each fragment
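Gouraud's vertex-normal average can be sketched directly from the formula above: sum the face normals around the vertex, then divide by the length of the sum (the function name here is invented for illustration):

```javascript
// n = (n1 + n2 + ... + nk) / |n1 + n2 + ... + nk|
function averageNormal(normals) {
  var sum = [0, 0, 0];
  for (var i = 0; i < normals.length; i++) {
    sum[0] += normals[i][0];
    sum[1] += normals[i][1];
    sum[2] += normals[i][2];
  }
  var len = Math.sqrt(sum[0]*sum[0] + sum[1]*sum[1] + sum[2]*sum[2]);
  return [sum[0]/len, sum[1]/len, sum[2]/len];
}

// Four face normals tilted symmetrically about +z average to +z itself
var n = averageNormal([[1, 0, 1], [-1, 0, 1], [0, 1, 1], [0, -1, 1]]);
console.log(n); // → [0, 0, 1]
```

Note that dividing by the length of the sum (rather than by the count) both averages and renormalizes in one step.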
Comparison
If the polygon mesh approximates a surface with high curvature, Phong shading may look smooth while Gouraud shading may show edges
Phong shading requires much more work than Gouraud shading
- Until recently it was not available in real-time systems
- Now it can be done using fragment shaders
Both need data structures that represent the mesh so we can obtain vertex normals
An implementation of Gouraud shading is per-vertex shading
An implementation of Phong shading is per-fragment shading
Introduction to Computer Graphics with WebGL
Ed Angel

Per-Fragment vs Per-Vertex Shading

Vertex Lighting Shaders I

// vertex shader
attribute vec4 vPosition;
attribute vec4 vNormal;
varying vec4 fColor;

uniform vec4 ambientProduct, diffuseProduct, specularProduct;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
uniform vec4 lightPosition;
uniform float shininess;

void main()
{

Vertex Lighting Shaders II

    vec3 pos = -(modelViewMatrix * vPosition).xyz;
    vec3 light = lightPosition.xyz;
    vec3 L = normalize(light - pos);
    vec3 E = normalize(-pos);
    vec3 H = normalize(L + E);

    // Transform vertex normal into eye coordinates
    vec3 N = normalize((modelViewMatrix * vNormal).xyz);

    // Compute terms in the illumination equation
Vertex Lighting Shaders III

    // Compute terms in the illumination equation
    vec4 ambient = ambientProduct;
    float Kd = max(dot(L, N), 0.0);
    vec4 diffuse = Kd * diffuseProduct;
    float Ks = pow(max(dot(N, H), 0.0), shininess);
    vec4 specular = Ks * specularProduct;
    if (dot(L, N) < 0.0) specular = vec4(0.0, 0.0, 0.0, 1.0);

    fColor = ambient + diffuse + specular;
    fColor.a = 1.0;

    gl_Position = projectionMatrix * modelViewMatrix * vPosition;
}

Vertex Lighting Shaders IV

// fragment shader
precision mediump float;

varying vec4 fColor;

void main()
{
    gl_FragColor = fColor;
}

Fragment Lighting Shaders I

// vertex shader
attribute vec4 vPosition;
attribute vec4 vNormal;
varying vec3 N, L, E;

uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
uniform vec4 lightPosition;
Fragment Lighting Shaders II

void main()
{
    vec3 pos = -(modelViewMatrix * vPosition).xyz;
    vec3 light = lightPosition.xyz;
    L = normalize(light - pos);
    E = -pos;
    N = normalize((modelViewMatrix * vNormal).xyz);
    gl_Position = projectionMatrix * modelViewMatrix * vPosition;
}

Fragment Lighting Shaders III

// fragment shader
precision mediump float;

uniform vec4 ambientProduct;
uniform vec4 diffuseProduct;
uniform vec4 specularProduct;
uniform float shininess;
varying vec3 N, L, E;

void main()
{

Fragment Lighting Shaders IV

    vec4 fColor;
    vec3 H = normalize(L + E);
    vec4 ambient = ambientProduct;
    float Kd = max(dot(L, N), 0.0);
    vec4 diffuse = Kd * diffuseProduct;
    float Ks = pow(max(dot(N, H), 0.0), shininess);
    vec4 specular = Ks * specularProduct;
    if (dot(L, N) < 0.0) specular = vec4(0.0, 0.0, 0.0, 1.0);
    fColor = ambient + diffuse + specular;
    fColor.a = 1.0;
    gl_FragColor = fColor;
}
Teapot Examples
[figure: teapots rendered with per-vertex and per-fragment shading]
Introduction to Computer Graphics with WebGL
Ed Angel

Buffers

Buffer
Define a buffer by its spatial resolution (n x m) and its depth (or precision) k, the number of bits per pixel
[figure: a buffer as an n x m grid of k-bit pixels]

WebGL Frame Buffer
[figure: the WebGL frame buffer]
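As a quick check of the definition, an n x m buffer of depth k occupies n*m*k bits. A small sketch (the function name is invented for illustration):

```javascript
// Memory required by an n x m buffer of depth k bits/pixel, in bytes
function bufferBytes(n, m, k) {
  return (n * m * k) / 8;
}

// A 1920 x 1080 RGBA color buffer at 8 bits per component (k = 32)
console.log(bufferBytes(1920, 1080, 32)); // → 8294400, about 8 MB
```

Adding a depth buffer (e.g. 24-bit) and a stencil buffer (8-bit) roughly doubles this figure, which is why buffer memory was a real constraint on early graphics hardware.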
Where are the Buffers?
HTML5 canvas
- Default front and back color buffers
- Under control of the local window system
- Physically on the graphics card
The depth buffer is also on the graphics card
Stencil buffer
- Holds masks
Most RGBA buffers are 8 bits per component
The latest are floating point (IEEE)

Other Buffers
Desktop OpenGL supported other buffers
- Auxiliary color buffers
- Accumulation buffer
- These were on the application side
- Now deprecated
GPUs have their own or attached memory
- Texture buffers
- Off-screen buffers: not under control of the window system, may be floating point

Images
Framebuffer contents are unformatted
- Usually RGB or RGBA
- One byte per component
- No compression
Standard Web image formats: jpeg, gif, png
WebGL has no conversion functions
- It understands standard Web formats for texture images
The (Old) Pixel Pipeline
OpenGL has a separate pipeline for pixels
- Writing pixels involves
  - Moving pixels from processor memory to the frame buffer
  - Format conversions
  - Mapping, lookups, tests
- Reading pixels involves
  - Format conversion
  - Packing and unpacking
    - Compressed or uncompressed
    - Indexed or RGB
    - Bit format: little- or big-endian
WebGL (and shader-based OpenGL) lacks most functions for packing and unpacking
- Use texture functions instead
- Can implement the desired functionality in fragment shaders

Deprecated Functionality
glDrawPixels
glCopyPixels
glBitmap
Buffer Reading
WebGL can read pixels from the framebuffer with gl.readPixels
- Returns only 8-bit RGBA values
In general, the format of pixels in the frame buffer differs from that of processor memory, and these two types of memory reside in different places
- Need packing and unpacking
- Reading can be slow
An alternative is to draw through texture functions and off-screen memory (frame buffer objects)

WebGL Pixel Function

gl.readPixels(x, y, width, height, format, type, myImage)

- x, y: start pixel in the frame buffer
- width, height: size of the rectangle to read
- format: type of pixels (e.g. gl.RGBA)
- type: type of the image data (e.g. gl.UNSIGNED_BYTE)
- myImage: pointer to processor memory

var myImage = new Uint8Array(512*512*4);
gl.readPixels(0, 0, 512, 512, gl.RGBA, gl.UNSIGNED_BYTE, myImage);

Render to Texture
GPUs now include a large amount of texture memory that we can write into
Advantage: fast (not under control of the window system)
Using frame buffer objects (FBOs), we can render into texture memory instead of the frame buffer and then read from this memory
- Image processing
- GPGPU
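Once the pixels are in the Uint8Array, each pixel occupies four consecutive bytes. A small indexing helper for finding a pixel in the array (the function name and the row-major layout convention are mine, matching the 512 x 512 RGBA example above):

```javascript
// Index of the red byte of pixel (x, y) in a width-wide RGBA array;
// green, blue, and alpha follow at offsets +1, +2, +3
function pixelOffset(x, y, width) {
  return 4 * (y * width + x);
}

// For a 512-wide image, pixel (1, 0) starts right after pixel (0, 0)
console.log(pixelOffset(1, 0, 512)); // → 4
// Pixel (0, 1) starts after one full 512-pixel row of 4-byte pixels
console.log(pixelOffset(0, 1, 512)); // → 2048
```

This is the same flat RGBA layout used by the checkerboard texture example later in these notes.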
Introduction to Computer Graphics with WebGL
Ed Angel

Texture Mapping

The Limits of Geometric Modeling
Although graphics cards can render over 10 million polygons per second, that number is insufficient for many phenomena
- Clouds
- Grass
- Terrain
- Skin

Modeling an Orange
Consider the problem of modeling an orange (the fruit)
Start with an orange-colored sphere
- Too simple
Replace the sphere with a more complex shape
- Does not capture surface characteristics (small dimples)
- Takes too many polygons to model all the dimples
Modeling an Orange (2)
Take a picture of a real orange, scan it, and paste it onto a simple geometric model
- This process is known as texture mapping
Still might not be sufficient, because the resulting surface will be smooth
- Need to change the local shape
- Bump mapping

Three Types of Mapping
Texture mapping
- Uses images to fill the inside of polygons
Environment (reflection) mapping
- Uses a picture of the environment for texture maps
- Allows simulation of highly specular surfaces
Bump mapping
- Emulates altering normal vectors during the rendering process

Texture Mapping
[figure: geometric model vs. texture-mapped model]
Environment Mapping
[figure: environment mapping example]

Bump Mapping
[figure: bump mapping example]

Where does mapping take place?
Mapping techniques are implemented at the end of the rendering pipeline
- Very efficient, because few polygons make it past the clipper
Introduction to Computer Graphics with WebGL
Ed Angel

Texture Mapping 2

Is it simple?
Although the idea is simple---map an image to a surface---there are 3 or 4 coordinate systems involved
[figure: a 2D image mapped onto a 3D surface]

Coordinate Systems
Parametric coordinates
- May be used to model curves and surfaces
Texture coordinates
- Used to identify points in the image to be mapped
Object or world coordinates
- Conceptually, where the mapping takes place
Window coordinates
- Where the final image is really produced
Texture Mapping
[figure: parametric coordinates → texture coordinates → world coordinates → window coordinates]

Mapping Functions
The basic problem is how to find the maps
Consider mapping from texture coordinates (s, t) to a point (x, y, z) on a surface
We appear to need three functions
  x = x(s, t)
  y = y(s, t)
  z = z(s, t)
But we really want to go the other way

Backward Mapping
We really want to go backwards
- Given a pixel, we want to know to which point on an object it corresponds
- Given a point on an object, we want to know to which point in the texture it corresponds
Need a map of the form
  s = s(x, y, z)
  t = t(x, y, z)
Such functions are difficult to find in general
Two-Part Mapping
One solution to the mapping problem is to first map the texture to a simple intermediate surface
Example: map to a cylinder

Cylindrical Mapping
The parametric cylinder
  x = r cos 2πu
  y = r sin 2πu
  z = v/h
maps a rectangle in (u, v) space to a cylinder of radius r and height h in world coordinates
  s = u
  t = v
maps from texture space

Spherical Map
We can use a parametric sphere
  x = r cos 2πu
  y = r sin 2πu cos 2πv
  z = r sin 2πu sin 2πv
in a similar manner to the cylinder, but we have to decide where to put the distortion
Spheres are used in environment maps
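A sketch of the cylindrical map in JavaScript. Here I assume u and v each run over [0, 1] and scale z by the height h so that the unit square covers the full cylinder; the function name is invented for illustration:

```javascript
// Map (u, v) in the unit square to a point on a cylinder of radius r
// and height h (z = h*v is the scaling assumed here, so v = 1 reaches
// the top); texture coordinates are simply s = u, t = v
function cylinderPoint(u, v, r, h) {
  return {
    x: r * Math.cos(2 * Math.PI * u),
    y: r * Math.sin(2 * Math.PI * u),
    z: h * v,
    s: u,
    t: v
  };
}

// u = 0 lands on the +x side of the cylinder at the base
console.log(cylinderPoint(0, 0, 2, 5));   // x = 2, y = 0, z = 0
// u = 0.25 is a quarter-turn around; v = 1 is the top
console.log(cylinderPoint(0.25, 1, 2, 5)); // x ≈ 0, y = 2, z = 5
```

Because u wraps around the axis, the left and right edges of the texture meet in a seam, which is one reason textures meant for cylindrical mapping are often made tileable horizontally.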
Box Mapping
Easy to use with a simple orthographic projection
Also used in environment maps

Second Mapping
Map from the intermediate object to the actual object
- Normals from the intermediate to the actual
- Normals from the actual to the intermediate
- Vectors from the center of the intermediate
[figure: actual and intermediate surfaces]

Aliasing
Point sampling of the texture can lead to aliasing errors
[figure: point samples in u,v (or x,y,z) space map to point samples in texture space that miss the blue stripes]
Area Averaging
A better but slower option is to use area averaging
[figure: a pixel and its preimage in texture space]
Note that the preimage of a pixel is curved
Introduction to Computer Graphics with WebGL
Ed Angel

WebGL Texture Mapping

Basic Strategy
Three steps to applying a texture
1. Create a texture object and specify texture parameters
   - wrapping, filtering
2. Specify the texture
   - read or generate the image
   - assign it to the texture object
3. Assign texture coordinates to vertices
   - The proper mapping function is left to the application

Texture Mapping
[figure: geometry and an (s, t) texture image combined in the display]
Specifying a Texture Image
Specify a texture image from an array of texels (texture elements) in CPU memory
Use an image in a standard format such as JPEG
- Scanned image
- Generated by application code
WebGL supports only two-dimensional texture maps
- No need to enable texturing as in desktop OpenGL
- Desktop OpenGL supports 1-4 dimensional texture maps

Define Image as a Texture

gl.texImage2D(target, level, components, w, h, border, format, type, texels);

- target: type of texture (gl.TEXTURE_2D)
- level: used for mipmapping (discussed later)
- components: elements per texel
- w, h: width and height of the image in pixels
- border: used for smoothing (no longer used)
- format and type: describe the texels
- texels: pointer to the texel array

gl.texImage2D(gl.TEXTURE_2D, 0, 3, 512, 512, 0, gl.RGB, gl.UNSIGNED_BYTE, myTexels);

A Checkerboard Image

var image1 = new Uint8Array(4*texSize*texSize);
for (var i = 0; i < texSize; i++) {
  for (var j = 0; j < texSize; j++) {
    var patchx = Math.floor(i/(texSize/numChecks));
    var patchy = Math.floor(j/(texSize/numChecks));
    var c;
    if (patchx%2 ^ patchy%2) c = 255;
    else c = 0;
    image1[4*i*texSize+4*j] = c;
    image1[4*i*texSize+4*j+1] = c;
    image1[4*i*texSize+4*j+2] = c;
    image1[4*i*texSize+4*j+3] = 255;
  }
}
Using a GIF Image

// specify image in JS file
var image = new Image();
image.onload = function() {
    configureTexture(image);
}
image.src = "SA2011_black.gif";

// or specify image in HTML file with <img> tag
// <img id = "texImage" src = "SA2011_black.gif"></img>
var image = document.getElementById("texImage");
window.onload = configureTexture(image);

Mapping a Texture
Based on parametric texture coordinates
Specify as a 2D vertex attribute
[figure: texture space with corners (0,0), (1,0), (0,1), (1,1) mapped to object space, with vertex A at (s, t) = (0.2, 0.8), B at (0.4, 0.2), and C at (0.8, 0.4)]

Cube Example

var texCoord = [
    vec2(0, 0),
    vec2(0, 1),
    vec2(1, 1),
    vec2(1, 0)
];

function quad(a, b, c, d) {
    pointsArray.push(vertices[a]);
    colorsArray.push(vertexColors[a]);
    texCoordsArray.push(texCoord[0]);

    pointsArray.push(vertices[b]);
    colorsArray.push(vertexColors[a]);
    texCoordsArray.push(texCoord[1]);
    // etc.
Interpolation
WebGL uses interpolation to find the proper texels from the specified texture coordinates
There can be distortions
[figure: good vs. poor selection of texture coordinates; a texture stretched over a trapezoid showing the effects of bilinear interpolation]