Discrete Techniques
11th Week, 2010

Buffer
Define a buffer by its spatial resolution (n × m) and its depth (or precision) k, the number of bits/pixel.
OpenGL Frame Buffer / OpenGL Buffers
Color buffers (can be displayed): front, back, auxiliary, overlay
Depth
Accumulation: a high-resolution buffer
Stencil: holds masks
Writing Buffers
Conceptually, we can consider all of memory as a large two-dimensional array of pixels. We read and write rectangular blocks of pixels: bit-block transfer (bitblt) operations. The frame buffer is part of this memory.

Writing Model
Read the destination pixel before writing the source.
Bit Writing Modes
Source and destination bits are combined bitwise; there are 16 possible write functions (replace, OR, XOR, ...).

XOR Mode
We can use XOR by enabling logic operations and selecting the XOR write mode. In OpenGL:
glEnable( GL_COLOR_LOGIC_OP );
glLogicOp( GL_XOR ); // default: GL_COPY
XOR is especially useful for swapping blocks of memory, such as menus that are stored off screen.
Property: d = (d ⊕ s) ⊕ s
If S represents the screen and M represents a menu, the sequence
S ← S ⊕ M;  M ← S ⊕ M;  S ← S ⊕ M
swaps S and M.
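The three-step XOR swap above can be sketched in plain C on byte buffers (a minimal illustration of what three GL_XOR bitblt passes achieve, not actual OpenGL code; the helper name xor_swap_blocks is hypothetical):

```c
#include <stddef.h>

/* Swap two pixel blocks in place using three XOR passes:
 *   S <- S ^ M;  M <- S ^ M;  S <- S ^ M
 * After the third pass, S holds the original M and vice versa. */
void xor_swap_blocks(unsigned char *s, unsigned char *m, size_t n)
{
    for (size_t i = 0; i < n; i++) s[i] ^= m[i]; /* S <- S ^ M */
    for (size_t i = 0; i < n; i++) m[i] ^= s[i]; /* M <- (S^M) ^ M = original S */
    for (size_t i = 0; i < n; i++) s[i] ^= m[i]; /* S <- (S^M) ^ S = original M */
}
```

No temporary buffer is needed, which is exactly why XOR mode is attractive for pop-up menus stored off screen.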
The Pixel Pipeline
OpenGL has a separate pipeline for pixels. Writing pixels involves moving pixels from processor memory to the frame buffer, format conversions, and mappings, lookups, and tests. Reading pixels also involves format conversion.

Buffer Selection
OpenGL can draw into or read from any of the color buffers (front, back, auxiliary); the default is the back buffer. Change it with the glDrawBuffer and glReadBuffer functions. Note that the format of the pixels in the frame buffer differs from that of processor memory, and these two types of memory reside in different places, so packing and unpacking are needed; drawing and reading can be slow.
Bitmaps: 1-bit digital images
OpenGL treats 1-bit pixels (bitmaps) differently from multi-bit pixels (pixelmaps). Bitmaps are masks that determine whether the corresponding pixel in the frame buffer is drawn with the present raster color:
0: color unchanged
1: color changed based on the writing mode
Bitmaps are useful for raster text, e.g. the GLUT font GLUT_BITMAP_8_BY_13.

Drawing Bitmaps
glBitmap( width, height, x0, y0, xi, yi, bitmap );
(x0, y0): offset from the raster position
(xi, yi): increments added to the raster position after the bitmap is drawn
Raster Position
Bitmaps appear at a location determined by the raster position, which is part of the OpenGL state. OpenGL functions:
glRasterPos{234}{sidf}( type x, type y, type z, type w );
glRasterPos{234}{sidf}v( type *array );
The position is transformed to window coordinates using the current model-view and projection matrices. The bitmap is a mask.

Raster Color
Where there is a one in the bitmap, we see a color based upon the current raster color, which is part of the OpenGL state. Where there is a zero, the bitmap does not affect the corresponding pixel in the frame buffer. The raster color is the same as the drawing color: set by glColor*(), but fixed by the last call to glRasterPos*().
Ex)
glColor3f( 1.0f, 0.0f, 0.0f );
glRasterPos3f( x, y, z );
glColor3f( 0.0f, 0.0f, 1.0f );
glBitmap( ... );      // ones in bitmap drawn in red
glBegin( GL_LINES );
glVertex3f( ... );    // geometry drawn in blue
Example: Checkerboard
GLubyte checker[512];
GLubyte wb[2] = { 0x00, 0xff };
for( int i = 0; i < 64; i++ )
    for( int j = 0; j < 8; j++ )
        checker[i*8+j] = wb[(i/8+j)%2];
glColor3f( 0.0f, 0.0f, 0.0f );
glRasterPos3f( 1.0f, 1.0f, 1.0f );
glBitmap( 64, 64, 0.0, 0.0, 0.0, 0.0, checker );

Pixel Maps
OpenGL works with rectangular arrays of pixels called pixel maps or images. Pixels are in one-byte (8-bit) chunks:
Luminance (gray scale) images: 1 byte/pixel
RGB: 3 bytes/pixel
Three functions:
Draw pixels: processor memory to frame buffer
Read pixels: frame buffer to processor memory
Copy pixels: frame buffer to frame buffer
OpenGL Pixel Functions
glReadPixels( x, y, width, height, format, type, myimage );
(x, y): start pixel in the frame buffer; (width, height): size; format: type of pixels; type: type of image; myimage: pointer to processor memory
Ex)
GLubyte myimage[512][512][3];
glReadPixels( 0, 0, 512, 512, GL_RGB, GL_UNSIGNED_BYTE, myimage );
glDrawPixels( width, height, format, type, myimage ); // starts at the raster position

Image Formats
We often work with images in a standard format (JPEG, TIFF, GIF). How do we read/write such images with OpenGL? There is no support in OpenGL: OpenGL knows nothing of image formats. Some code is available on the Web, and readers/writers for simple formats can be written for OpenGL programs.
The Limits of Geometric Modeling
Although graphics cards can render over 10 million polygons per second, that number is insufficient for many phenomena: clouds, grass, terrain, skin, etc. Consider the problem of modeling an orange:
An orange-colored sphere: too simple → texture mapping
A more complex shape: too many polygons to model all the dimples → bump mapping

Three Types of Mapping
Texture mapping: uses images to fill the inside of polygons
Environment (reflection) mapping: uses a picture of the environment for texture maps; allows simulation of highly specular surfaces
Bump mapping: emulates altering normal vectors during the rendering process
Texture Mapping (figure: geometric model vs. texture-mapped result)
Environment Mapping (figure)
Bump Mapping (figure)

Where Does Mapping Take Place?
Mapping techniques are implemented at the end of the rendering pipeline. This is very efficient because only a few polygons make it past the clipper.
Vertices → Geometry processing → Rasterization → Fragment processing → Frame buffer
Pixels → Pixel processing → (joins the pipeline at rasterization)
Is It Simple?
Mapping a pattern (texture) to a surface: 2D image → 3D surface. Although the idea is simple (map an image to a surface), there are 3 or 4 coordinate systems involved.

Coordinate Systems
Parametric coordinates: may be used to model curves and surfaces
Texture coordinates: used to identify points in the image to be mapped
Object or world coordinates: conceptually, where the mapping takes place
Window or screen coordinates: where the final image is really produced
Texture Mapping
Parametric coordinates → texture coordinates → world coordinates → screen coordinates

Terminology for Texture Mapping
Texel (texture element): textures are brought into processor memory as arrays
Texture coordinates T(s, t): a continuous rectangular 2D texture pattern, generally varying over the interval (0, 1)
Texture map: world coordinates ↔ texture coordinates
x = x(s, t)      s = s(x, y, z, w)
y = y(s, t)      t = t(x, y, z, w)
z = z(s, t)
w = w(s, t)
Mapping Functions
The basic problem is how to find the maps. Consider mapping from texture coordinates to a point (x, y, z, w) on a surface. We appear to need four functions:
x = x(s, t)
y = y(s, t)
z = z(s, t)
w = w(s, t)
But we really want to go the other way.

Backward Mapping
Given a texel, we could ask to which point on an object it corresponds (forward); what we really want is: given a point on an object, to which point in the texture it corresponds (backward). We need a map of the form
s = s(x, y, z, w)
t = t(x, y, z, w)
Such functions are difficult to find in general.
Two-Part Mapping
One solution to the mapping problem is to first map the texture to a simple intermediate surface such as a cylinder, a sphere, or a box (e.g. texture mapping with a cylinder, texture mapping with a box).

First Mapping
Cylindrical mapping: parametric cylinder (r: radius, h: height), with s = u, t = v:
x = r cos 2πu
y = r sin 2πu
z = v / h
Spherical mapping: parametric sphere, with s = u, t = v:
x = r cos 2πu
y = r sin 2πu cos 2πv
z = r sin 2πu sin 2πv
Spheres are used in environment maps.
Box mapping: easy to use with a simple orthographic projection; also used in environment maps.
Second Mapping
Map from the intermediate object to the actual object:
Using the normals from intermediate to actual
Using the normals from actual to intermediate
Using the vectors from the center of the object to the intermediate surface

Aliasing
Point sampling of the texture can lead to aliasing errors: point samples in u,v (or x,y,z) space can entirely miss features (e.g. the blue stripes) in texture space.
Area Averaging
A better but slower option is to use area averaging. The preimage of a pixel is the projection of the corners of the pixel backward into object space; the preimage of the pixel is curved.

Magnification and Minification
More than one texel can cover a pixel (minification), or more than one pixel can cover a texel (magnification). We can use point sampling (nearest texel) or linear filtering (a 2 × 2 filter) to obtain texture values.
Mipmapped Textures
Mipmapping allows for prefiltered texture maps of decreasing resolutions, to lessen interpolation errors for smaller textured objects. It is fast and easy for hardware.

Example: Texture Filtering
Point sampling; linear filtering; mipmap + point sampling; mipmap + linear filtering
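The "decreasing resolutions" form a pyramid: each level halves both dimensions until 1 × 1. A small C sketch counting the levels (the helper name mip_levels is hypothetical; the count equals floor(log2(max(w, h))) + 1):

```c
/* Number of mipmap levels for a w x h base texture:
 * halve each dimension (never below 1) until both reach 1. */
int mip_levels(int w, int h)
{
    int levels = 1;
    while (w > 1 || h > 1) {
        w = w > 1 ? w / 2 : 1;
        h = h > 1 ? h / 2 : 1;
        levels++;
    }
    return levels;
}
```

A 64 × 64 texture has 7 levels (64, 32, 16, 8, 4, 2, 1), so the whole pyramid adds only about one third more storage than the base level.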
Other Texture Features
Environment maps: start with an image of the environment through a wide-angle lens; it can be either a real scanned image or an image created in OpenGL. Use this texture to generate a spherical map, with automatic texture coordinate generation.
Multitexturing: apply a sequence of textures through multiple texture units.
Bump mapping: render objects so that they appear to have fine details (bumps) that give the surface a rough appearance affected by the light position.
Opacity and Transparency
Opaque surfaces permit no light to pass through; transparent surfaces permit all light to pass; translucent surfaces pass some light.
Translucency = 1 − Opacity (α)
An opaque surface has α = 1.

Physical Models
Dealing with translucency in a physically correct manner is difficult due to the complexity of the internal interactions of light and matter and the constraints of a pipeline renderer. (Figure: scene with translucent objects.)
Writing Model for Blending
Use the A component of RGBA (or RGBα) color to store opacity. During rendering we can expand our writing model:
(source component × source blending factor) blended with (destination component × destination blending factor) → color buffer

Blending
Fragments from multiple objects contribute to the color of the same pixel.
Alpha blending: creating images with transparent objects
Alpha channel: RGBA color mode
Opacity: a measure of how much light penetrates through the surface; 1: completely opaque, 0: transparent
Transparency = 1 − Opacity
Blending Equation
We can define source and destination blending factors for each RGBA component:
s = [s_r, s_g, s_b, s_α]
d = [d_r, d_g, d_b, d_α]
Suppose that the source and destination colors are
b = [b_r, b_g, b_b, b_α]
c = [c_r, c_g, c_b, c_α]
Blend as c' = b s + c d:
c' = [b_r s_r + c_r d_r, b_g s_g + c_g d_g, b_b s_b + c_b d_b, b_α s_α + c_α d_α]

Example: Blending
Suppose that we start with the opaque background color (R0, G0, B0, 1); this color becomes the initial destination color. We now want to blend in a translucent polygon with color (R1, G1, B1, α1). Select GL_SRC_ALPHA and GL_ONE_MINUS_SRC_ALPHA as the source and destination blending factors:
R'1 = α1 R1 + (1 − α1) R0
G'1 = α1 G1 + (1 − α1) G0
B'1 = α1 B1 + (1 − α1) B0
Note that this formula is correct whether the polygon is opaque or transparent.
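The GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA case above can be sketched as a small C routine (an illustration of the blending equation, not GL code; the function name blend_over is hypothetical):

```c
/* Blend a translucent source fragment onto the destination color,
 * matching GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA:
 *   c' = alpha_s * c_s + (1 - alpha_s) * c_d   per RGB component.
 * src is RGBA; dst is RGB and is updated in place. */
void blend_over(const double src[4], double dst[3])
{
    double a = src[3];
    for (int i = 0; i < 3; i++)
        dst[i] = a * src[i] + (1.0 - a) * dst[i];
}
```

With α = 1 the source replaces the destination (opaque), and with α = 0 the destination is untouched (fully transparent), matching the note above.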
Is this image correct? Probably not.

Order Dependency
Polygons are rendered in the order they pass down the pipeline; blending functions are order dependent.

Opaque and Translucent Polygons
Suppose that we have a group of polygons, some of which are opaque and some translucent. How do we use hidden-surface removal? Opaque polygons block all polygons behind them and affect the depth buffer. Translucent polygons should not affect the depth buffer: render with glDepthMask( GL_FALSE ), which makes the depth buffer read-only. Sort the polygons first to remove the order dependency.
Fog
We can composite with a fixed color and have the blending factors depend on depth, which simulates a fog effect. Blend the source color Cs and fog color Cf by
C's = f Cs + (1 − f) Cf
where f is the fog factor.

Fog Functions
Exponential
Gaussian
Linear (depth cueing)
Line Aliasing
An ideal raster line is one pixel wide. All line segments, other than vertical and horizontal segments, partially cover pixels. Simple algorithms color only whole pixels, which leads to the jaggies, or aliasing. A similar issue arises for polygons.

Antialiasing
We can try to color a pixel by adding a fraction of its color to the frame buffer. The fraction depends on the percentage of the pixel covered by the fragment: set the alpha value for the corresponding pixel to a number between 0 and 1 that is the amount of the pixel covered by the fragment. The fraction also depends on whether fragments overlap.
Area Averaging
Use the combined area α1 + α2 − α1 α2 as the blending factor.

Example: Antialiasing
Without antialiasing vs. with antialiasing (figure)
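The combined-coverage formula is the inclusion-exclusion of the two fragment areas, assuming their overlap is α1 α2 (i.e. the fragments are placed independently within the pixel). A one-line C sketch (the function name is hypothetical):

```c
/* Combined pixel coverage of two fragments with coverages a1 and a2,
 * assuming an overlap of a1 * a2:  a = a1 + a2 - a1 * a2. */
double combined_coverage(double a1, double a2)
{
    return a1 + a2 - a1 * a2;
}
```

For example, two fragments each covering half the pixel combine to a coverage of 0.75, not 1.0, because the assumed overlap of 0.25 is counted once rather than twice.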
Accumulation Buffer
Compositing and blending are limited by the resolution of the frame buffer, typically 8 bits per color component. The accumulation buffer is a high-resolution buffer (16 or more bits per component) that avoids this problem. Write into it or read from it with a scale factor; this is slower than direct compositing into the frame buffer.

Compositing Applications
Image filtering (convolution)
Whole-scene antialiasing
Motion effects