
Discrete Techniques
11th Week, 2010

Buffer
Define a buffer by its spatial resolution (n x m) and its depth (or precision) k, the number of bits/pixel.

OpenGL Frame Buffer
OpenGL buffers:
- Color buffers (can be displayed): front, back, auxiliary, overlay
- Depth buffer
- Accumulation buffer: a high-resolution buffer
- Stencil buffer: holds masks

Writing Buffers
Conceptually, we can consider all of memory as a large two-dimensional array of pixels. We read and write rectangular blocks of pixels: bit-block-transfer (bitblt) operations. The frame buffer is part of this memory; writing into the frame buffer copies a source block in memory to a destination block in the frame buffer.

Writing Model
Read the destination pixel before writing the source.

Bit Writing Modes
Source and destination bits are combined bitwise; there are 16 possible write functions (replace, OR, XOR, ...).

XOR Mode
We can use XOR by enabling logic operations and selecting the XOR write mode. In OpenGL:
  glEnable( GL_COLOR_LOGIC_OP );
  glLogicOp( GL_XOR );    // default: GL_COPY
XOR is especially useful for swapping blocks of memory, such as menus that are stored off screen.
Property: d = (d ^ s) ^ s
If S represents a screen block and M represents a menu, the sequence
  S <- S ^ M;  M <- S ^ M;  S <- S ^ M
swaps S and M.

The Pixel Pipeline
OpenGL has a separate pipeline for pixels.
- Writing pixels involves moving pixels from processor memory to the frame buffer, format conversions, and mapping, lookups, and tests
- Reading pixels involves format conversion

Buffer Selection
OpenGL can draw into or read from any of the color buffers (front, back, auxiliary); the default is the back buffer. Change the selection with the glDrawBuffer and glReadBuffer functions. Note that the format of pixels in the frame buffer differs from that of processor memory, and these two types of memory reside in different places, so packing and unpacking are needed; drawing and reading can be slow.

Bitmaps
1-bit digital images. OpenGL treats 1-bit pixels (bitmaps) differently from multi-bit pixels (pixelmaps). Bitmaps are masks that determine whether the corresponding pixel in the frame buffer is drawn with the present raster color:
- 0: color unchanged
- 1: color changed based on the writing mode
Bitmaps are useful for raster text (e.g., the GLUT font GLUT_BITMAP_8_BY_13).

Drawing Bitmaps
  glBitmap( width, height, x0, y0, xi, yi, bitmap );
(x0, y0) is the offset from the raster position; (xi, yi) are the increments in the raster position after the bitmap is drawn (first raster position, second raster position).

Raster Position
Bitmaps appear at a location determined by the raster position, which is part of the OpenGL state:
  glRasterPos{234}{sifd}( TYPE x, TYPE y, TYPE z, TYPE w );
  glRasterPos{234}{sifd}v( TYPE *array );
The position is transformed to window coordinates using the current model-view and projection matrices. The bitmap is a mask.

Raster Color
Where there is a one in the bitmap, we see a color based upon the current raster color, which is part of the OpenGL state; where there is a zero, the bitmap does not affect the corresponding pixel in the frame buffer. The raster color is the same as the drawing color set by glColor*(), fixed by the last call to glRasterPos*().
Ex)
  glColor3f( 1.0f, 0.0f, 0.0f );
  glRasterPos3f( x, y, z );
  glColor3f( 0.0f, 0.0f, 1.0f );
  glBitmap( ... );          // ones in bitmap drawn in red
  glBegin( GL_LINES );
  glVertex3f( ... );        // geometry drawn in blue

Example: Checker Board
  GLubyte checker[512];
  GLubyte wb[2] = { 0x00, 0xff };
  for( int i = 0; i < 64; i++ )
      for( int j = 0; j < 8; j++ )
          checker[i*8+j] = wb[(i/8+j)%2];
  glColor3f( 0.0f, 0.0f, 0.0f );
  glRasterPos3f( 1.0f, 1.0f, 1.0f );
  glBitmap( 64, 64, 0.0, 0.0, 0.0, 0.0, checker );

Pixel Maps
OpenGL works with rectangular arrays of pixels called pixel maps or images. Pixels are in one-byte (8-bit) chunks:
- Luminance (gray scale) images: 1 byte/pixel
- RGB: 3 bytes/pixel
Three functions:
- Draw pixels: processor memory to frame buffer
- Read pixels: frame buffer to processor memory
- Copy pixels: frame buffer to frame buffer

OpenGL Pixel Functions
  glReadPixels( x, y, width, height, format, type, myimage );
(x, y) is the start pixel in the frame buffer; (width, height) is the size; format is the type of pixels; type is the type of image; myimage is a pointer to processor memory.
Ex)
  GLubyte myimage[512][512][3];
  glReadPixels( 0, 0, 512, 512, GL_RGB, GL_UNSIGNED_BYTE, myimage );
  glDrawPixels( width, height, format, type, myimage );
glDrawPixels starts at the raster position.

Image Formats
We often work with images in a standard format (JPEG, TIFF, GIF). How do we read/write such images with OpenGL? There is no support in OpenGL: OpenGL knows nothing of image formats. Some code is available on the Web, and we can write readers/writers for some simple formats ourselves.

The Limits of Geometric Modeling
Although graphics cards can render over 10 million polygons per second, that number is insufficient for many phenomena: clouds, grass, terrain, skin, etc. Consider the problem of modeling an orange:
- An orange-colored sphere: too simple -> texture mapping
- A more complex shape: too many polygons to model all the dimples -> bump mapping

Three Types of Mapping
- Texture mapping: uses images to fill the inside of polygons
- Environment (reflection) mapping: uses a picture of the environment for texture maps; allows simulation of highly specular surfaces
- Bump mapping: emulates altering normal vectors during the rendering process

Texture Mapping (figures: geometric model vs. texture-mapped model)
Environment Mapping

Bump Mapping

Where Does Mapping Take Place?
Mapping techniques are implemented at the end of the rendering pipeline. This is very efficient because only a few polygons make it past the clipper.
  Vertices -> geometry processing -> rasterization -> fragment processing -> frame buffer
  Pixels enter through pixel processing.

Is It Simple?
Mapping a pattern (texture) to a surface: from a 2D image to a 3D surface. Although the idea is simple (map an image to a surface), there are 3 or 4 coordinate systems involved.

Coordinate Systems
- Parametric coordinates: may be used to model curves and surfaces
- Texture coordinates: used to identify points in the image to be mapped
- Object or world coordinates: conceptually, where the mapping takes place
- Window or screen coordinates: where the final image is really produced

Texture Mapping
Parametric coordinates -> texture coordinates -> world coordinates -> screen coordinates

Terminology for Texture Mapping
- Texel (texture element): textures are brought into processor memory as arrays
- Texture coordinates T(s, t): a continuous rectangular 2D texture pattern, generally varying over the interval (0, 1)
- Texture map: between world coordinates and texture coordinates:
    x = x(s, t)   y = y(s, t)   z = z(s, t)   w = w(s, t)
    s = s(x, y, z, w)   t = t(x, y, z, w)

Mapping Functions
The basic problem is how to find the maps. Consider mapping from texture coordinates (s, t) to a point (x, y, z, w) on a surface; we appear to need four functions:
  x = x(s, t)
  y = y(s, t)
  z = z(s, t)
  w = w(s, t)
But we really want to go the other way.

Backward Mapping
We really want to go backward:
- Forward: given a texel, to which point on an object does it correspond?
- Backward: given a point on an object, to which point in the texture does it correspond?
We need a map of the form
  s = s(x, y, z, w)
  t = t(x, y, z, w)
Such functions are difficult to find in general.

Two-Part Mapping
One solution to the mapping problem is to first map the texture to a simple intermediate surface such as a cylinder, a sphere, or a box (examples: texture mapping with a cylinder, texture mapping with a box).

First Mapping
- Cylindrical mapping: parametric cylinder (r: radius, h: height)
    x = r cos 2πu
    y = r sin 2πu
    z = v / h
  with s = u, t = v.
- Spherical mapping: parametric sphere
    x = r cos 2πu
    y = r sin 2πu cos 2πv
    z = r sin 2πu sin 2πv
  with s = u, t = v. Spheres are used in environment maps.
- Box mapping: easy to use with a simple orthographic projection; also used in environment maps.

Second Mapping
Map from the intermediate object to the actual object:
- Using the normals from intermediate to actual
- Using the normals from actual to intermediate
- Using the vectors from the center of the object to the intermediate surface

Aliasing
Point sampling of the texture can lead to aliasing errors: point samples in (u, v) (or x, y, z) space can miss the blue stripes in texture space.

Area Averaging
A better but slower option is to use area averaging. The preimage of a pixel is the projection of its corners backward into object space; the preimage of the pixel is curved.

Magnification and Minification
More than one texel can cover a pixel (minification), or more than one pixel can cover a texel (magnification). We can use point sampling (nearest texel) or linear filtering (a 2 x 2 filter over the texels) to obtain texture values.

Mipmapped Textures
Mipmapping allows for prefiltered texture maps of decreasing resolutions:
- Lessens interpolation errors for smaller textured objects
- Fast and easy for hardware

Example: Texture Filtering
- Point sampling
- Linear filtering
- Mipmap + point sampling
- Mipmap + linear filtering

Other Texture Features
- Environment maps: start with an image of the environment through a wide-angle lens (either a real scanned image or an image created in OpenGL); use this texture to generate a spherical map; use automatic texture coordinate generation
- Multitexturing: apply a sequence of textures through multiple texture units

Bump Mapping
Render objects so that they appear to have fine details (bumps) that give the surface a rough appearance affected by the light position.

Opacity and Transparency
- Opaque surfaces permit no light to pass through
- Transparent surfaces permit all light to pass
- Translucent surfaces pass some light
Translucency = 1 - Opacity (α); an opaque surface has α = 1.

Physical Models
Dealing with translucency in a physically correct manner is difficult due to the complexity of the internal interactions of light and matter, so instead we use a pipeline renderer (example: a scene with translucent objects).

Writing Model for Blending
Use the A component of RGBA (or RGBα) color to store opacity. During rendering we can expand our writing model: the source component, scaled by a source blending factor, is blended with the destination component, scaled by a destination blending factor, into the color buffer.

Blending
Fragments from multiple objects contribute to the color of the same pixel.
- Alpha blending: creating images with transparent objects
- Alpha channel: the A of RGBA color mode
- Opacity: a measure of how much light penetrates through the surface (1: completely opaque, 0: transparent); Transparency = 1 - Opacity

Blending Equation
We can define source and destination blending factors for each RGBA component:
  s = [s_r, s_g, s_b, s_α]
  d = [d_r, d_g, d_b, d_α]
Suppose that the source and destination colors are
  b = [b_r, b_g, b_b, b_α]
  c = [c_r, c_g, c_b, c_α]
Blend as c' = b s + c d:
  c' = [b_r s_r + c_r d_r, b_g s_g + c_g d_g, b_b s_b + c_b d_b, b_α s_α + c_α d_α]

Example: Blending
Suppose that we start with the opaque background color (R0, G0, B0, 1); this color becomes the initial destination color. We now want to blend in a translucent polygon with color (R1, G1, B1, α1). Select GL_SRC_ALPHA and GL_ONE_MINUS_SRC_ALPHA as the source and destination blending factors:
  R'1 = α1 R1 + (1 - α1) R0
  G'1 = α1 G1 + (1 - α1) G0
  B'1 = α1 B1 + (1 - α1) B0
Note that this formula is correct whether the polygon is opaque or transparent.

Is this image correct? Probably not.

Order Dependency
Polygons are rendered in the order they pass down the pipeline; blending functions are order dependent.

Opaque and Translucent Polygons
Suppose that we have a group of polygons, some opaque and some translucent. How do we use hidden-surface removal?
- Opaque polygons block all polygons behind them and affect the depth buffer
- Translucent polygons should not affect the depth buffer: render with glDepthMask(GL_FALSE), which makes the depth buffer read-only
- Sort the polygons first to remove the order dependency

Fog
We can composite with a fixed color and have the blending factors depend on depth, which simulates a fog effect. Blend the source color C_s and the fog color C_f by
  C_s' = f C_s + (1 - f) C_f
where f is the fog factor.

Fog Functions
- Exponential
- Gaussian
- Linear (depth cueing)

Line Aliasing
The ideal raster line is one pixel wide, but all line segments other than vertical and horizontal ones partially cover pixels. Simple algorithms color only whole pixels, leading to the jaggies, or aliasing. Similar issues arise for polygons.

Antialiasing
We can try to color a pixel by adding a fraction of its color to the frame buffer:
- The fraction depends on the percentage of the pixel covered by the fragment: set the alpha value for the corresponding pixel to a number between 0 and 1 equal to the amount of the pixel covered by the fragment
- The fraction also depends on whether fragments overlap

Area Averaging
Use the average area α1 + α2 - α1 α2 as the blending factor.

Example: Antialiasing
Without antialiasing vs. with antialiasing.

Accumulation Buffer
Compositing and blending are limited by the resolution of the frame buffer, typically 8 bits per color component. The accumulation buffer is a high-resolution buffer (16 or more bits per component) that avoids this problem: write into it or read from it with a scale factor. It is slower than direct compositing into the frame buffer.

Compositing Applications
- Image filtering (convolution)
- Whole-scene antialiasing
- Motion effects