Imaging and Raster Primitives

Realtime 3D Computer Graphics & Virtual Reality: Bitmaps and Textures. Imaging and Raster Primitives. Vicki Shreiner

Imaging and Raster Primitives. Describe OpenGL's raster primitives: bitmaps and image rectangles. Demonstrate how to get OpenGL to read and render pixel rectangles.
Pixel-based primitives. Bitmaps: 2D arrays of bit masks for pixels; update pixel color based on the current color. Images: 2D arrays of pixel color information; complete color information for each pixel. OpenGL doesn't understand image formats.

Pixel Pipeline. Programmable pixel storage and transfer operations. For glBitmap() and glDrawPixels(): CPU -> Pixel Storage Modes -> Pixel-Transfer Operations (and Pixel Map) -> Rasterization (including Pixel Zoom) -> Per-Fragment Operations -> Frame Buffer, with Texture Memory fed by glCopyTex*Image(); glReadPixels() and glCopyPixels() read back from the frame buffer.
Positioning Image Primitives. glRasterPos3f( x, y, z ); the raster position is transformed like geometry; it is discarded if the raster position is outside of the viewport; you may need to fine-tune the viewport for desired results.

Rendering Bitmaps. glBitmap( GLsizei width, GLsizei height, GLfloat xorig, GLfloat yorig, GLfloat xmove, GLfloat ymove, GLubyte *bitmap ); renders the bitmap in the current color with its origin at (x - xorig, y - yorig); advances the raster position by (xmove, ymove) after rendering.
Rendering Fonts using Bitmaps. OpenGL uses bitmaps for font rendering; each character is stored in a display list containing a bitmap; window-system-specific routines access system fonts: glXUseXFont(), wglUseFontBitmaps().

Rendering Images. glDrawPixels( width, height, format, type, pixels ); renders pixels with the lower left of the image at the current raster position; numerous formats and data types are available for specifying storage in memory; best performance comes from using a format and type that match the hardware.
Reading Pixels. glReadPixels( x, y, width, height, format, type, pixels ); reads pixels from the specified (x, y) position in the framebuffer; pixels are automatically converted from the framebuffer format into the requested format and type.
Framebuffer pixel copy: glCopyPixels( x, y, width, height, type ).

Pixel Zoom. glPixelZoom( x, y ); expand, shrink or reflect pixels around the current raster position; fractional zoom is supported. For example, glPixelZoom(1.0, -1.0) flips the image vertically about the raster position.
Storage and Transfer Modes. Storage modes control how memory is accessed: byte alignment in host memory, extracting a subimage. Transfer modes allow modifying pixel values: scale and bias pixel component values; replace colors using pixel maps.

Texture Mapping (some slides taken from Ed Angel). Apply a 1D, 2D, or 3D image to geometric primitives. Uses of texturing: simulating materials; reducing geometric complexity; image warping; reflections. (Figure: an image in (s, t) texture space mapped onto geometry in (x, y, z) space and projected to the screen.)

Texture Mapping and the OpenGL Pipeline. Images and geometry flow through separate pipelines that join at the rasterizer: vertices enter the geometry pipeline, images enter the pixel pipeline, and both meet in the rasterizer; complex textures do not affect geometric complexity.
Rendering a texture. The following pictures are courtesy of F. S. Hill Jr., Computer Graphics Using OpenGL, 2nd edition.

Rendering a texture: scanline Rendering a texture: linear interpolation

Rendering a texture. Affine and projective transformations preserve straightness. If we move in equal steps across Ls, how do we step in Lt?
Affine combination of two points. Given the points A = (A1, A2, A3, 1) and B = (B1, B2, B3, 1) and the scalars s and t, the linear combination is sA + tB = (sA1 + tB1, sA2 + tB2, sA3 + tB3, s + t). This is a valid vector if s + t = 0. It is a valid point if s + t = 1. 1. If the coefficients of a linear combination sum to unity, we call it an affine combination. 2. Any affine combination of points is a legitimate point. Why not build an arbitrary linear combination of points P = sA + tB where s + t does not sum to unity?

Affine combination of two points. A shift of the origin is the problem. Shift it by a vector v, so A is shifted to A + v and B is shifted to B + v. If P is a valid point it must be shifted to P' = P + v. But we get P' = sA + tB + (s + t)v, which in general is not P + v; it is only if s + t = 1.
Linear interpolation of two points. Let P be a point defined by a point A and a vector v scaled by s, and substitute for v the difference of a point B and A: P = A + sv, P = A + s(B - A). This can be rewritten as an affine combination of points: P = A + s(B - A) = sB + (1 - s)A. This performs a linear interpolation between the points A and B. For each component c of P, Pc(s) is the value lying the fraction s of the way between Ac and Bc. This important operation has its own popular name, lerp() (linear interpolation):
double lerp( double a, double b, double s ) { return a + (b - a) * s; }

Correspondence of motion along transformed lines. Let M be an affine or general perspective transformation. The points A and B of a segment map to a and b. The point R(g) maps to a point r(f). How does g vary as f changes? Why in the direction f -> g? Because the process is to be embedded in the raster stage of the rendering pipeline.
Correspondence of motion along transformed lines. Let a~ = (a1, a2, a3, a4) be the homogeneous representation of a; then a = (a1/a4, a2/a4, a3/a4) is calculated by perspective division. M maps A to a, so a~ = M (A, 1)^T and b~ = M (B, 1)^T. Then R(g) = lerp(A, B, g) maps to M (lerp(A, B, g), 1)^T = lerp(a~, b~, g) = ( lerp(a1, b1, g), lerp(a2, b2, g), lerp(a3, b3, g), lerp(a4, b4, g) ) = r~(f), the latter being the homogeneous-coordinate version of the point r(f).

Correspondence of motion along transformed lines. From r~(f) = ( lerp(a1, b1, g), lerp(a2, b2, g), lerp(a3, b3, g), lerp(a4, b4, g) ), component-wise perspective division gives (for the first component) r1(f) = lerp(a1, b1, g) / lerp(a4, b4, g). But we also have r(f) = lerp(a, b, f), and hence (for the first component) r1(f) = lerp(a1/a4, b1/b4, f). Equating the two results in g = f / lerp(b4/a4, 1, f).
Correspondence of motion along transformed lines. g = f / lerp(b4/a4, 1, f): R(g) maps to r(f) with different fractions f and g. We get g = f only when f = 0, f = 1, or b4 = a4.

Finding the point. The point R(g) that maps to r(f) is as follows (for the first component): R1 = lerp(A1/a4, B1/b4, f) / lerp(1/a4, 1/b4, f). Is there a difference if the matrix M is affine rather than a perspective projection? In the affine case, a4 and b4 are both unity, the relation between R(g) and r(f) degenerates to a linear dependency (remember the lerp() definition), and equal steps along ab correspond to equal steps along AB.
Finding the point. In the perspective transformation case, take a matrix M (here the one that transforms from eye coordinates to clip coordinates):
M = ( N 0 0 0 ; 0 N 0 0 ; 0 0 c d ; 0 0 -1 0 )
Given a point A this leads to M (A, 1)^T = ( N A1, N A2, c A3 + d, -A3 ). The last component, a4 = -A3, is the position of the point along the z-axis, the view-plane normal in camera coordinates (the depth of the point in front of the eye, since the camera looks down the negative z-axis). If a4 and b4, interpreted as depths, are equal, the segment lies on a line parallel to the view plane, and hence there is no foreshortening effect.

Texture processing during the scanline process: hyperbolic interpolation. We search for (s_left, t_left) and (s_right, t_right). Given f = (y - ybott) / (ytop - ybott), it follows that
s_left(y) = lerp(sA/a4, sB/b4, f) / lerp(1/a4, 1/b4, f)
and likewise
t_left(y) = lerp(tA/a4, tB/b4, f) / lerp(1/a4, 1/b4, f).
Texture processing during the scanline process: hyperbolic interpolation. The denominator is the same for s_left and t_left: a linear interpolation. The numerator is a linear interpolation of the texture coordinates divided by a4 and b4. This is called rational linear rendering [Heckbert91] or hyperbolic interpolation [Blinn96]. For a given y, the terms sA/a4, sB/b4, tA/a4, tB/b4, 1/a4, 1/b4 are constant, so the needed values for numerator and denominator can be found incrementally (as in Gouraud shading). One division is required for each of s_left and t_left.

Scanline processing using hyperbolic interpolation in the pipeline. Each vertex V is associated with texture coordinates (s, t) and a normal n. The modelview matrix M transforms into eye coordinates A, with sA = s, tA = t. The perspective transformation alters only A, which results in a~. What happens during clipping?
Hyperbolic interpolation in the pipeline. During clipping a new point D = (d1, d2, d3, d4) is created by di = lerp(ai, bi, t), i = 1, ..., 4, for some t. The same is done for color and texture data. This creates a new vertex; for a given vertex we have the array (a1, a2, a3, a4, sA, tA, c, 1), which finally undergoes perspective division, leading to (x, y, z, 1, sA/a4, tA/a4, c, 1/a4). What is (x, y, z)? It is the position of the point in normalized device coordinates. Why don't we divide c? Color is deliberately interpolated linearly in screen space, a cheaper approximation whose error is usually hard to see.

Hyperbolic interpolation in the pipeline. What happens now if we calculate (s, t) in the described way? Referencing arbitrary points in (s, t) space may still produce visual artifacts due to sampling errors.
Hyperbolic interpolation in the pipeline. Point sampling vs. bilinear filtering.

Texture Example. The texture (below) is a 256 x 256 image that has been mapped to a rectangular polygon which is viewed in perspective.
Applying Textures I. Three steps: specify the texture (read or generate the image, assign it to a texture); assign texture coordinates to vertices; specify texture parameters (wrapping, filtering).

Applying Textures II. Specify textures in texture objects; set the texture filter; set the texture function; set the texture wrap mode; set the optional perspective correction hint; bind the texture object; enable texturing; supply texture coordinates for each vertex (coordinates can also be generated).
Texture Objects. Like display lists for texture images: one image per texture object; may be shared by several graphics contexts. Generate texture names: glGenTextures( n, *texIds );

Texture Objects (cont.). Create texture objects with texture data and state. Bind textures before using: glBindTexture( target, id );
Specify Texture Image. Define a texture image from an array of texels in CPU memory: glTexImage2D( target, level, components, w, h, border, format, type, *texels ); the dimensions of the image must be powers of 2. Texel colors are processed by the pixel pipeline: pixel scales, biases and lookups can be done.

Converting A Texture Image. If the dimensions of the image are not powers of 2: gluScaleImage( format, w_in, h_in, type_in, *data_in, w_out, h_out, type_out, *data_out ); *data_in is the source image, *data_out the destination image. The image is interpolated and filtered during scaling.
Example.
class RGB { // holds a color triple, each with 256 possible intensities
public:
    unsigned char r, g, b;
};
// The RGBpixmap class stores the number of rows and columns
// in the pixmap, as well as the address of the first pixel
// in memory:
class RGBpixmap {
public:
    int nRows, nCols;             // dimensions of the pixmap
    RGB* pixel;                   // array of pixels
    int readBMPFile(char* fname); // read BMP file into this pixmap
    void makeCheckerboard();
    void setTexture(GLuint textureName);
};

Example cont.
void RGBpixmap::makeCheckerboard()
{   // make checkerboard pattern
    nRows = nCols = 64;
    pixel = new RGB[nRows * nCols];
    if (!pixel) { cout << "out of memory!"; return; }
    long count = 0;
    for (int i = 0; i < nRows; i++)
        for (int j = 0; j < nCols; j++) {
            int c = (((i / 8) + (j / 8)) % 2) * 255;
            pixel[count].r = c;   // red
            pixel[count].g = c;   // green
            pixel[count++].b = 0; // blue
        }
}
Example cont.
void RGBpixmap::setTexture(GLuint textureName)
{
    glBindTexture(GL_TEXTURE_2D, textureName);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, nCols, nRows, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixel);
}

Specifying a Texture: Other Methods. Use the frame buffer as the source of a texture image (uses the current buffer as the source image): glCopyTexImage2D(...), glCopyTexImage1D(...). Modify part of a defined texture: glTexSubImage2D(...), glTexSubImage1D(...). Do both with glCopyTexSubImage2D(...), etc.
Mapping a Texture. Based on parametric texture coordinates; glTexCoord*() specified at each vertex. (Figure: triangle abc in (s, t) texture space, with corners a = (0, 1), b = (0, 0), c = (1, 0) region markers, mapped to triangle ABC in object space with texture coordinates (0.2, 0.8) at A, (0.4, 0.2) at B, and (0.8, 0.4) at C.)

Generating Texture Coordinates. Automatically generate texture coordinates: glTexGen{ifd}[v](). Specify a plane: texture coordinates are generated based upon the distance from the plane Ax + By + Cz + D = 0. Generation modes: GL_OBJECT_LINEAR, GL_EYE_LINEAR, GL_SPHERE_MAP.
Tutorial: Texture

Texture Application Methods. Filter modes: minification or magnification; special mipmap minification filters. Wrap modes: clamping or repeating. Texture functions: how to mix the primitive's color with the texture's color (blend, modulate or replace texels).
Filter Modes. Example: glTexParameteri( target, type, mode ); (Figures: a texture magnified onto a larger polygon; a texture minified onto a smaller polygon.)

Mipmapped Textures. Mipmapping allows prefiltered texture maps of decreasing resolutions; it lessens interpolation errors for smaller textured objects. Declare the mipmap level during texture definition: glTexImage*D( GL_TEXTURE_*D, level, ... ). GLU mipmap builder routines: gluBuild*DMipmaps( ... ). OpenGL 1.2 introduces advanced LOD controls.
Wrapping Mode. Example:
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT );
(Figures: GL_REPEAT wrapping tiles the texture; GL_CLAMP wrapping extends the edge texels.)

Texture Functions. Controls how the texture is applied: glTexEnv{fi}[v]( GL_TEXTURE_ENV, prop, param ); GL_TEXTURE_ENV_MODE modes: GL_MODULATE, GL_BLEND, GL_REPLACE. Set the blend color with GL_TEXTURE_ENV_COLOR.
Perspective Correction Hint. Texture coordinate and color interpolation: either linearly in screen space, or using depth/perspective values (slower). Noticeable for polygons on edge. glHint( GL_PERSPECTIVE_CORRECTION_HINT, hint ), where hint is one of GL_DONT_CARE, GL_NICEST, GL_FASTEST.

Is There Room for a Texture? Query the largest dimension of a texture image (typically the largest square texture; doesn't consider internal format size): glGetIntegerv( GL_MAX_TEXTURE_SIZE, &size ).
Texture proxy: will memory accommodate the requested texture size? No image is specified; it is a placeholder. If the texture won't fit, the texture state variables are set to 0. The proxy doesn't know about other textures; it only considers whether this one texture will fit in all of memory.
Texture Residency. Working set of textures: high-performance, usually hardware accelerated; textures must be in texture objects; a texture in the working set is resident. For residency of the current texture, check the GL_TEXTURE_RESIDENT state. If there are too many textures, not all are resident; you can set priorities to have some kicked out first: establish 0.0 to 1.0 priorities for texture objects.

Advanced OpenGL Topics. Dave Shreiner. Display Lists and Vertex Arrays; Alpha Blending and Antialiasing; Using the Accumulation Buffer; Fog; Feedback & Selection; Fragment Tests and Operations; Using the Stencil Buffer.