Introduction to Computer Graphics with WebGL


Ed Angel

Lighting in WebGL

The application must specify:
- Normals
- Material properties
- Lights

State-based shading functions (glNormal, glMaterial, glLight) have been deprecated; they are replaced by vertex attributes and by uniforms sent to the shaders. Lighting can be computed in the application or in the shaders.

Normalization

Cosine terms in lighting calculations can be computed using the dot product, and unit-length vectors simplify that calculation. Usually we want to set the magnitudes to unit length, but:
- Length can be affected by transformations
- Rotation and translation preserve length (rigid-body transformations)
- Scaling does not preserve length

GLSL has a normalization function.
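A small GLSL sketch of this point, in the style of the lighting shaders later in this section (the variable names are assumptions):

```glsl
// Vertex-shader sketch: re-normalize after transforming, since any
// scaling in modelViewMatrix changes vector lengths
attribute vec4 vPosition;
attribute vec4 vNormal;
uniform mat4 modelViewMatrix;
uniform vec4 lightPosition;
varying float diffuseTerm;

void main() {
    vec3 pos = (modelViewMatrix * vPosition).xyz;
    vec3 N = normalize((modelViewMatrix * vNormal).xyz);  // GLSL built-in
    vec3 L = normalize(lightPosition.xyz - pos);
    // For unit vectors, the cosine term is just a dot product
    diffuseTerm = max(dot(L, N), 0.0);
    gl_Position = modelViewMatrix * vPosition;  // projection omitted in this sketch
}
```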

Normal for Triangle

The plane of the triangle satisfies n · (p − p₀) = 0 for any point p in the plane, with normal

n = (p₂ − p₀) × (p₁ − p₀)

which we then normalize: n ← n / |n|. Note that the right-hand rule determines the outward face.

Adding Normals for Quads

function quad(a, b, c, d) {
    var t1 = subtract(vertices[b], vertices[a]);
    var t2 = subtract(vertices[c], vertices[b]);
    var normal = vec3(cross(t1, t2));
    normal = normalize(normal);

    pointsArray.push(vertices[a]);
    normalsArray.push(normal);
    ...

Specifying a Point Light Source

For each light source, we can set an RGBA value for the diffuse, specular, and ambient components, and for the position:

var diffuse0 = vec4(1.0, 0.0, 0.0, 1.0);
var ambient0 = vec4(1.0, 0.0, 0.0, 1.0);
var specular0 = vec4(1.0, 0.0, 0.0, 1.0);
var light0_pos = vec4(1.0, 2.0, 3.0, 1.0);

Distance and Direction

The source colors are specified in RGBA. The position is given in homogeneous coordinates:
- If w = 1.0, we are specifying a finite location
- If w = 0.0, we are specifying a parallel source with the given direction vector

The coefficients in the distance term are usually quadratic, 1/(a + b·d + c·d²), where d is the distance from the point being rendered to the light source.

Moving Light Sources

Light sources are geometric objects whose positions or directions are affected by the model-view matrix. Depending on where we place the position (direction) setting function, we can:
- Move the light source(s) with the object(s)
- Fix the object(s) and move the light source(s)
- Fix the light source(s) and move the object(s)
- Move the light source(s) and object(s) independently

Light Properties

var lightPosition = vec4(1.0, 1.0, 1.0, 0.0);
var lightAmbient = vec4(0.2, 0.2, 0.2, 1.0);
var lightDiffuse = vec4(1.0, 1.0, 1.0, 1.0);
var lightSpecular = vec4(1.0, 1.0, 1.0, 1.0);
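Tying the pieces above together, the quadratic attenuation term in GLSL might look like this sketch (the coefficient names attA, attB, attC are illustrative assumptions):

```glsl
// Quadratic distance attenuation for a finite (w = 1.0) light source
uniform vec4 lightPosition;
uniform float attA, attB, attC;   // constant, linear, quadratic coefficients

float attenuation(vec3 pos) {     // pos: eye-space position being rendered
    float d = distance(lightPosition.xyz, pos);
    return 1.0 / (attA + attB * d + attC * d * d);
}
// Diffuse and specular contributions are scaled by attenuation(pos);
// ambient light is typically left unattenuated.
```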

Material Properties

Material properties should match the terms in the light model:
- Reflectivities
- The w component gives opacity

var materialAmbient = vec4(1.0, 0.0, 1.0, 1.0);
var materialDiffuse = vec4(1.0, 0.8, 0.0, 1.0);
var materialSpecular = vec4(1.0, 0.8, 0.0, 1.0);
var materialShininess = 100.0;

Using MV.js for Products

var ambientProduct = mult(lightAmbient, materialAmbient);
var diffuseProduct = mult(lightDiffuse, materialDiffuse);
var specularProduct = mult(lightSpecular, materialSpecular);

gl.uniform4fv(gl.getUniformLocation(program, "ambientProduct"), flatten(ambientProduct));
gl.uniform4fv(gl.getUniformLocation(program, "diffuseProduct"), flatten(diffuseProduct));
gl.uniform4fv(gl.getUniformLocation(program, "specularProduct"), flatten(specularProduct));
gl.uniform4fv(gl.getUniformLocation(program, "lightPosition"), flatten(lightPosition));
gl.uniform1f(gl.getUniformLocation(program, "shininess"), materialShininess);

Front and Back Faces

Every face has a front and a back. For many objects we never see the back face, so we don't care how, or if, it is rendered. If it matters, we can handle back faces in the shader, as in the sketch below.

[Figures: back faces not visible vs. back faces visible]
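One way to handle this is GLSL's built-in gl_FrontFacing in the fragment shader; a minimal sketch (the dimmed back-face color is an illustrative choice, not from the lecture):

```glsl
precision mediump float;
varying vec4 fColor;

void main() {
    if (gl_FrontFacing)
        gl_FragColor = fColor;                       // shade front faces normally
    else
        gl_FragColor = vec4(0.5 * fColor.rgb, 1.0);  // dim visible back faces
}
```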

Transparency

Material properties are specified as RGBA values, and the A value can be used to make the surface translucent. The default is that all surfaces are opaque. Later we will enable blending and use this feature. However, with the HTML5 canvas, A < 1 will mute colors, which allows blending other elements onto the HTML5 canvas.
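As a preview of how blending will be enabled, a minimal sketch (the blend factors shown are the common source-over choice, and the canvas id is an assumption):

```javascript
// An opaque canvas keeps other page elements from blending through
var canvas = document.getElementById("gl-canvas");
var gl = canvas.getContext("webgl", { alpha: false });

// Enable blending so the alpha component composites source over destination
gl.enable(gl.BLEND);
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
```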

Polygonal Shading

In per-vertex shading, shading calculations are done for each vertex:
- Vertex colors become vertex shades and can be sent to the vertex shader as a vertex attribute
- Alternately, we can send the parameters to the vertex shader and have it compute the shade

By default, vertex shades are interpolated across a triangle if passed to the fragment shader as a varying variable (smooth shading). We can also use uniform variables to shade each triangle with a single shade (flat shading); a minimal sketch contrasting the two follows this passage.

Polygon Normals

Triangles have a single normal:
- Shades at the vertices as computed by the modified Phong model can be almost the same
- Identical for a distant viewer (the default) or if there is no specular component

Consider a model of a sphere: we want different normals at each vertex, even though this concept is not quite correct mathematically.
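A minimal sketch of the smooth vs. flat contrast; this is not the lecture's code, just an illustration of a varying versus a uniform (names are assumptions):

```glsl
// Smooth shading: per-vertex shades interpolated across the triangle
attribute vec4 vPosition;
attribute vec4 vColor;      // shade computed per vertex by the application
varying vec4 fColor;        // interpolated by the rasterizer

void main() {
    fColor = vColor;
    gl_Position = vPosition;
}

// Flat shading alternative: replace vColor with
//   uniform vec4 triangleColor;   // one shade per draw call
// and set fColor = triangleColor; every fragment of the triangle then
// receives the same shade.
```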

Smooth Shading

We can set a new normal at each vertex. This is easy for the sphere model: if it is centered at the origin, n = p. Now smooth shading works; note the silhouette edge.

Mesh Shading

The previous example is not general, because we knew the normal at each vertex analytically. For polygonal models, Gouraud proposed using the average of the normals around a mesh vertex:

n = (n₁ + n₂ + n₃ + n₄) / |n₁ + n₂ + n₃ + n₄|

Gouraud and Phong Shading

Gouraud shading:
- Find the average normal at each vertex (vertex normals)
- Apply the modified Phong model at each vertex
- Interpolate vertex shades across each polygon

Phong shading:
- Find vertex normals
- Interpolate vertex normals across edges
- Interpolate edge normals across the polygon
- Apply the modified Phong model at each fragment
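For a polygonal model stored as an indexed triangle list, Gouraud's averaging can be sketched with the MV.js helpers used earlier (the mesh layout is an assumption):

```javascript
// Average the face normals incident on each vertex (Gouraud's proposal).
// vertices: array of vec3 positions; indices: flat list of triangle indices.
function vertexNormals(vertices, indices) {
    var normals = vertices.map(function () { return vec3(0.0, 0.0, 0.0); });
    for (var i = 0; i < indices.length; i += 3) {
        var a = indices[i], b = indices[i + 1], c = indices[i + 2];
        var faceN = cross(subtract(vertices[b], vertices[a]),
                          subtract(vertices[c], vertices[a]));
        normals[a] = add(normals[a], faceN);   // accumulate unnormalized
        normals[b] = add(normals[b], faceN);   // face normals at each corner
        normals[c] = add(normals[c], faceN);
    }
    // Normalizing the sum gives n = (n1 + n2 + ...) / |n1 + n2 + ...|
    return normals.map(function (n) { return normalize(n); });
}
```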

Comparison

If the polygon mesh approximates a surface with high curvature, Phong shading may look smooth while Gouraud shading may show edges. Phong shading requires much more work than Gouraud shading:
- Until recently it was not available in real-time systems
- Now it can be done using fragment shaders

Both need data structures to represent meshes so we can obtain vertex normals. An implementation of Gouraud shading is per-vertex shading; an implementation of Phong shading is per-fragment shading.

Per Fragment vs Per Vertex Shading

Vertex Lighting Shaders I

// vertex shader
attribute vec4 vPosition;
attribute vec4 vNormal;
varying vec4 fColor;

uniform vec4 ambientProduct, diffuseProduct, specularProduct;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
uniform vec4 lightPosition;
uniform float shininess;

void main()
{

Vertex Lighting Shaders II

    vec3 pos = -(modelViewMatrix * vPosition).xyz;
    vec3 light = lightPosition.xyz;
    vec3 L = normalize(light - pos);
    vec3 E = normalize(-pos);
    vec3 H = normalize(L + E);

    // Transform vertex normal into eye coordinates
    vec3 N = normalize((modelViewMatrix * vNormal).xyz);

    // Compute terms in the illumination equation

Vertex Lighting Shaders III

    // Compute terms in the illumination equation
    vec4 ambient = ambientProduct;

    float Kd = max(dot(L, N), 0.0);
    vec4 diffuse = Kd * diffuseProduct;

    float Ks = pow(max(dot(N, H), 0.0), shininess);
    vec4 specular = Ks * specularProduct;
    if (dot(L, N) < 0.0)
        specular = vec4(0.0, 0.0, 0.0, 1.0);

    gl_Position = projectionMatrix * modelViewMatrix * vPosition;

    fColor = ambient + diffuse + specular;
    fColor.a = 1.0;
}

Vertex Lighting Shaders IV

// fragment shader
precision mediump float;
varying vec4 fColor;

void main()
{
    gl_FragColor = fColor;
}

Fragment Lighting Shaders I

// vertex shader
attribute vec4 vPosition;
attribute vec4 vNormal;
varying vec3 N, L, E;

uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
uniform vec4 lightPosition;

Fragment Lighting Shaders II

void main()
{
    vec3 pos = -(modelViewMatrix * vPosition).xyz;
    vec3 light = lightPosition.xyz;

    L = normalize(light - pos);
    E = -pos;
    N = normalize((modelViewMatrix * vNormal).xyz);

    gl_Position = projectionMatrix * modelViewMatrix * vPosition;
}

Fragment Lighting Shaders III

// fragment shader
precision mediump float;

uniform vec4 ambientProduct;
uniform vec4 diffuseProduct;
uniform vec4 specularProduct;
uniform float shininess;
varying vec3 N, L, E;

void main()
{

Fragment Lighting Shaders IV

    vec4 fColor;

    vec3 H = normalize(L + E);
    vec4 ambient = ambientProduct;

    float Kd = max(dot(L, N), 0.0);
    vec4 diffuse = Kd * diffuseProduct;

    float Ks = pow(max(dot(N, H), 0.0), shininess);
    vec4 specular = Ks * specularProduct;
    if (dot(L, N) < 0.0)
        specular = vec4(0.0, 0.0, 0.0, 1.0);

    fColor = ambient + diffuse + specular;
    fColor.a = 1.0;

    gl_FragColor = fColor;
}
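The application compiles and links these shader pairs; Angel's examples use an initShaders utility, but the raw WebGL calls amount to roughly this sketch:

```javascript
// Compile a vertex/fragment source pair into a linked, active program
function makeProgram(gl, vertexSource, fragmentSource) {
    function compile(type, source) {
        var shader = gl.createShader(type);
        gl.shaderSource(shader, source);
        gl.compileShader(shader);
        if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS))
            throw new Error(gl.getShaderInfoLog(shader));
        return shader;
    }
    var program = gl.createProgram();
    gl.attachShader(program, compile(gl.VERTEX_SHADER, vertexSource));
    gl.attachShader(program, compile(gl.FRAGMENT_SHADER, fragmentSource));
    gl.linkProgram(program);
    if (!gl.getProgramParameter(program, gl.LINK_STATUS))
        throw new Error(gl.getProgramInfoLog(program));
    gl.useProgram(program);
    return program;
}
```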

Teapot Examples

[Images: teapots rendered with the preceding shaders]

Buffers

Buffer

Define a buffer by its spatial resolution (n × m) and its depth (or precision) k, the number of bits per pixel. For example, a 1024 × 1024 buffer with k = 32 bits per pixel (8 bits for each RGBA component) occupies 4 MB.

WebGL Frame Buffer

[Diagram of the WebGL frame buffer]

Where are the Buffers?

HTML5 Canvas:
- Default front and back color buffers
- Under control of the local window system
- Physically on the graphics card

The depth buffer is also on the graphics card. The stencil buffer holds masks. Most RGBA buffers are 8 bits per component; the latest are floating point (IEEE).

Other Buffers

Desktop OpenGL supported other buffers:
- Auxiliary color buffers
- Accumulation buffer
- These were on the application side and are now deprecated

GPUs have their own or attached memory:
- Texture buffers
- Off-screen buffers: not under control of the window system, may be floating point

Images

Framebuffer contents are unformatted:
- Usually RGB or RGBA
- One byte per component
- No compression

Standard Web image formats: JPEG, GIF, PNG. WebGL has no conversion functions, but it understands standard Web formats for texture images.

The (Old) Pixel Pipeline

OpenGL has a separate pipeline for pixels.
- Writing pixels involves: moving pixels from processor memory to the frame buffer; format conversions; mapping, lookups, tests
- Reading pixels involves: format conversion; packing and unpacking; compressed or uncompressed; indexed or RGB; bit format (little or big endian)

WebGL (and shader-based OpenGL) lacks most functions for packing and unpacking:
- Use texture functions instead
- The desired functionality can be implemented in fragment shaders

Deprecated Functionality

glDrawPixels, glCopyPixels, glBitmap

Buffer Reading

WebGL can read pixels from the framebuffer with gl.readPixels, which returns only 8-bit RGBA values. In general, the format of pixels in the frame buffer differs from that of processor memory, and these two types of memory reside in different places:
- Packing and unpacking are needed
- Reading can be slow

Drawing is done through texture functions and off-screen memory (frame buffer objects).

WebGL Pixel Function

gl.readPixels(x, y, width, height, format, type, myImage);

- x, y: start pixel in the frame buffer
- width, height: size of the block to read
- format: type of pixels
- type: type of image
- myImage: pointer to processor memory

var myImage = new Uint8Array(512 * 512 * 4);
gl.readPixels(0, 0, 512, 512, gl.RGBA, gl.UNSIGNED_BYTE, myImage);

Render to Texture

GPUs now include a large amount of texture memory that we can write into. Advantage: it is fast (not under control of the window system). Using frame buffer objects (FBOs), we can render into texture memory instead of the frame buffer and then read from this memory:
- Image processing
- GPGPU
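A minimal render-to-texture sketch with an FBO (the 512 × 512 size and RGBA format are assumptions):

```javascript
// Create a texture to serve as the color attachment
var texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 512, 512, 0,
              gl.RGBA, gl.UNSIGNED_BYTE, null);       // allocate, no data yet
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);

// Attach it to a framebuffer object and render into it
var fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                        gl.TEXTURE_2D, texture, 0);

// ... draw here: output goes into the texture, not the canvas ...

gl.bindFramebuffer(gl.FRAMEBUFFER, null);             // restore default buffer
```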

Texture Mapping

The Limits of Geometric Modeling

Although graphics cards can render over 10 million polygons per second, that number is insufficient for many phenomena:
- Clouds
- Grass
- Terrain
- Skin

Modeling an Orange

Consider the problem of modeling an orange (the fruit). Start with an orange-colored sphere: too simple. Replace the sphere with a more complex shape: this does not capture the surface characteristics (small dimples), and it takes too many polygons to model all the dimples.

Modeling an Orange (2)

Take a picture of a real orange, scan it, and paste it onto a simple geometric model. This process is known as texture mapping. It still might not be sufficient, because the resulting surface will be smooth: we need to change the local shape, which is bump mapping.

Three Types of Mapping

- Texture mapping: uses images to fill the inside of polygons
- Environment (reflection) mapping: uses a picture of the environment for texture maps; allows simulation of highly specular surfaces
- Bump mapping: emulates altering normal vectors during the rendering process

Texture Mapping

[Images: a geometric model and the same model texture mapped]

Environment Mapping

[Example image]

Bump Mapping

[Example image]

Where does mapping take place?

Mapping techniques are implemented at the end of the rendering pipeline. This is very efficient, because few polygons make it past the clipper.

Texture Mapping 2

Is it simple?

Although the idea is simple (map an image to a surface), there are 3 or 4 coordinate systems involved: the 2D image and the 3D surface.

Coordinate Systems

- Parametric coordinates: may be used to model curves and surfaces
- Texture coordinates: used to identify points in the image to be mapped
- Object or world coordinates: conceptually, where the mapping takes place
- Window coordinates: where the final image is really produced

Texture Mapping

[Figure: parametric coordinates → texture coordinates → world coordinates → window coordinates]

Mapping Functions

The basic problem is how to find the maps. Consider mapping from texture coordinates (s, t) to a point (x, y, z) on a surface. We appear to need three functions:

x = x(s, t)
y = y(s, t)
z = z(s, t)

But we really want to go the other way.

Backward Mapping

We really want to go backwards:
- Given a pixel, we want to know the point on the object to which it corresponds
- Given a point on an object, we want to know the point in the texture to which it corresponds

We need a map of the form

s = s(x, y, z)
t = t(x, y, z)

Such functions are difficult to find in general.

Two-part mapping

One solution to the mapping problem is to first map the texture to a simple intermediate surface. Example: map to a cylinder.

Cylindrical Mapping

A parametric cylinder

x = r cos 2πu
y = r sin 2πu
z = v/h

maps a rectangle in (u, v) space to a cylinder of radius r and height h in world coordinates;

s = u
t = v

maps from texture space.

Spherical Map

We can use a parametric sphere

x = r cos 2πu
y = r sin 2πu cos 2πv
z = r sin 2πu sin 2πv

in a similar manner to the cylinder, but we have to decide where to put the distortion. Spheres are used in environment maps.
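The forward cylindrical map is easy to write as a helper; a sketch (a hypothetical function, here scaling z as h·v so the unit square covers the full height):

```javascript
// Map (u, v) in the unit square to a point on a cylinder of radius r
// and height h; the texture map itself is then simply s = u, t = v.
function cylinderPoint(u, v, r, h) {
    var theta = 2.0 * Math.PI * u;
    return vec3(r * Math.cos(theta),
                r * Math.sin(theta),
                h * v);            // z runs along the cylinder axis
}
```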

Box Mapping

Easy to use with a simple orthographic projection; also used in environment maps.

Second Mapping

Map from the intermediate object to the actual object:
- Normals from intermediate to actual
- Normals from actual to intermediate
- Vectors from the center of the intermediate

[Figure: actual and intermediate objects]

Aliasing

Point sampling of the texture can lead to aliasing errors.

[Figure: point samples in (u, v) (or (x, y, z)) space map to point samples in texture space and miss the blue stripes]

Area Averaging

A better but slower option is to use area averaging.

[Figure: a pixel and its preimage; note that the preimage of a pixel is curved]

WebGL Texture Mapping

Basic Strategy

Three steps to applying a texture (see the sketch after the figure below):
1. Create a texture object and specify texture parameters (wrapping, filtering)
2. Specify the texture: read or generate the image, assign it to the texture object
3. Assign texture coordinates to vertices

The proper mapping function is left to the application.

Texture Mapping

[Figure: geometry in (x, y, z) plus an image in (s, t) space produce the displayed result]
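A sketch of steps 1 and 2 in code, close in spirit to the configureTexture function used by the examples below (the filtering choices are typical defaults, and WebGL 1 expects power-of-two dimensions for mipmapping):

```javascript
function configureTexture(image) {
    var texture = gl.createTexture();            // 1. create a texture object
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA,     // 2. assign the image to it
                  gl.RGBA, gl.UNSIGNED_BYTE, image);
    gl.generateMipmap(gl.TEXTURE_2D);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER,
                     gl.NEAREST_MIPMAP_LINEAR);  // filtering parameters
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
}
// Step 3, assigning texture coordinates to vertices, is shown later.
```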

Specifying a Texture Image

Specify a texture image from an array of texels (texture elements) in CPU memory, or use an image in a standard format such as JPEG:
- Scanned image
- Generated by application code

WebGL supports only two-dimensional texture maps:
- No need to enable as in desktop OpenGL
- Desktop OpenGL supports 1-4 dimensional texture maps

Define an Image as a Texture

gl.texImage2D(target, level, internalFormat, w, h, border, format, type, texels);

- target: type of texture (gl.TEXTURE_2D)
- level: used for mipmapping (discussed later)
- internalFormat: how texel components are stored (in desktop OpenGL this could be a component count; WebGL requires a format constant such as gl.RGB)
- w, h: width and height of the image in pixels
- border: used for smoothing (no longer used; must be 0)
- format and type: describe the texels
- texels: pointer to the texel array

gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, 512, 512, 0, gl.RGB, gl.UNSIGNED_BYTE, myTexels);

A Checkerboard Image

var image1 = new Uint8Array(4 * texSize * texSize);

for (var i = 0; i < texSize; i++) {
    for (var j = 0; j < texSize; j++) {
        var patchx = Math.floor(i / (texSize / numChecks));
        var patchy = Math.floor(j / (texSize / numChecks));
        var c;
        if (patchx % 2 ^ patchy % 2) c = 255;
        else c = 0;
        image1[4*i*texSize + 4*j]     = c;
        image1[4*i*texSize + 4*j + 1] = c;
        image1[4*i*texSize + 4*j + 2] = c;
        image1[4*i*texSize + 4*j + 3] = 255;
    }
}

Using a GIF Image

// specify image in JS file
var image = new Image();
image.onload = function() {
    configureTexture(image);
};
image.src = "SA2011_black.gif";

// or specify image in HTML file with <img> tag
// <img id = "texImage" src = "SA2011_black.gif">
var image = document.getElementById("texImage");
window.onload = function() { configureTexture(image); };

Mapping a Texture

Mapping is based on parametric texture coordinates, specified as a 2D vertex attribute.

[Figure: texture space, the unit square with corners (0, 0), (1, 0), (0, 1), (1, 1); object space, triangle ABC with (s, t) = (0.2, 0.8) at a, (0.4, 0.2) at b, and (0.8, 0.4) at c]

Cube Example

var texCoord = [
    vec2(0, 0),
    vec2(0, 1),
    vec2(1, 1),
    vec2(1, 0)
];

function quad(a, b, c, d) {
    pointsArray.push(vertices[a]);
    colorsArray.push(vertexColors[a]);
    texCoordsArray.push(texCoord[0]);

    pointsArray.push(vertices[b]);
    colorsArray.push(vertexColors[a]);
    texCoordsArray.push(texCoord[1]);
    // etc.
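The per-vertex texture coordinates are sent to the vertex shader as an attribute, in the same way as positions and colors; a sketch (buffer and attribute names follow the style of the examples and are assumptions):

```javascript
// Load the texture coordinates into a buffer and bind it to the
// vTexCoord attribute of the vertex shader
var tBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, tBuffer);
gl.bufferData(gl.ARRAY_BUFFER, flatten(texCoordsArray), gl.STATIC_DRAW);

var vTexCoord = gl.getAttribLocation(program, "vTexCoord");
gl.vertexAttribPointer(vTexCoord, 2, gl.FLOAT, false, 0, 0);
gl.enableVertexAttribArray(vTexCoord);
```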

Interpolation

WebGL uses interpolation to find the proper texels from the specified texture coordinates. There can be distortions.

[Figures: a good selection of texture coordinates, a poor selection, and a texture stretched over a trapezoid showing the effects of bilinear interpolation]