Image I/O and OpenGL Textures

1 Image I/O and OpenGL Textures

2 Creating Images Using dynamic memory allocation we can create an array for an RGB image. The easiest way to do this is as follows: create an array based on the width, height and pixel depth of the image; loop through every pixel and fill in each RGB component; write the data to some image file format; free the array. Most of these steps are simple, but saving the image file relies on another library.

3 ImageMagick / Magick++ Magick++ provides a simple C++ API to the ImageMagick image processing library, which supports reading and writing a huge number of image formats as well as a broad spectrum of traditional image processing operations. Magick++ gives access to most of the features available from the C API but in a simple object-oriented and well-documented framework (for more details see the ImageMagick / Magick++ documentation). The simplest operation with the Magick++ library is dumping an array to an image file, which is what the following example does.

4

#include <iostream>
#include <Magick++.h>
#include <math.h>
#include <cstdlib>

// define width and height of image
const int WIDTH=720;
const int HEIGHT=576;

int main(void)
{
  // allocate an array of char for the image where data is packed
  // in RGB format, 0 = no intensity, 255 = full intensity
  char *image = new char [WIDTH*HEIGHT*3*sizeof(char)];
  // index into our image array
  unsigned long int index=0;
  // now loop for width and height of image and fill in
  for(int y=0; y<HEIGHT; ++y)
  {
    for(int x=0; x<WIDTH; ++x)
    {
      // set red channel to full
      image[index]=255;
      // G&B to off
      image[index+1]=0;
      image[index+2]=0;
      // now skip to next RGB block
      index+=3;
    } // end of width loop
  } // end of height loop
  // now create an image data block
  Magick::Image output(WIDTH,HEIGHT,"RGB",Magick::CharPixel,image);
  // set the output image depth to 16 bit
  output.depth(16);
  // write the file
  output.write("test.tiff");
  // delete the image data.
  delete [] image;
  return EXIT_SUCCESS;
}

(Slide annotations: include the ImageMagick headers; allocate space for image data; loop to create the image; write the image to file.)

5 Building Like SDL, ImageMagick has a config script we can use in qmake:

TEMPLATE = app
TARGET = SimpleImageWrite
CONFIG -= app_bundle
DEPENDPATH += .
INCLUDEPATH += .
QMAKE_CXXFLAGS+=$$system(Magick++-config --cppflags )
LIBS+=$$system(Magick++-config --ldflags --libs )
macx:INCLUDEPATH+=/usr/local/include/ImageMagick
macx:LIBS+= -L/usr/local/lib -lMagick++-Q16
# Input
SOURCES += SimpleImageWrite.cpp

6 Why not use char[][]? You will notice that the array used for the image data is a char []. You may think it would be easier to use a two-dimensional array for the x,y co-ordinates; however, as we will see in various examples, this is not the case. It doesn't take much code to allow us to set individual pixels in a single char [] array.

7 new / delete The first example uses new and delete for the image data array. Obviously we could forget to delete the array, so the next examples will use the boost smart pointers; in particular we will use the boost::scoped_array template.
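
A minimal sketch of this pattern (not from the slides, just illustrating the idea; WIDTH and HEIGHT are the constants from the earlier listing):

#include <boost/scoped_array.hpp>
#include <cstring>

const int WIDTH=720;
const int HEIGHT=576;

int main()
{
  // the array is released automatically when image goes out of scope,
  // so there is no matching delete [] to forget
  boost::scoped_array<char> image(new char [WIDTH*HEIGHT*3]);
  // clear to black; pass image.get() wherever a raw char * is needed
  std::memset(image.get(),0,WIDTH*HEIGHT*3);
  return 0;
}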

8 Setting Individual Pixels We can set individual pixels by accessing the memory based on the x,y co-ordinates, then setting the block of 3 values for the R,G,B components. The following code does this:

void setPixel(char *_data,unsigned int _x,unsigned int _y, char _r,char _g, char _b)
{
  unsigned int index=(_y*WIDTH*3)+_x*3;
  _data[index]=_r;
  _data[index+1]=_g;
  _data[index+2]=_b;
}

9 Setting Background Colour Once the setPixel function is written we can use it to set the background colour:

void setBgColour(char *_data,char _r, char _g, char _b)
{
  for(unsigned int y=0; y<HEIGHT; ++y)
  {
    for(unsigned int x=0; x<WIDTH; ++x)
    {
      setPixel(_data,x,y,_r,_g,_b);
    }
  }
}

10 Example

int main()
{
  boost::scoped_array<char> image(new char [WIDTH*HEIGHT*3*sizeof(char)]);
  // clear to white
  setBgColour(image.get(),255,255,255);
  int checkSize=20;
  for(int y=0; y<HEIGHT; ++y)
  {
    for(int x=0; x<WIDTH; ++x)
    {
      if(abs((x/checkSize + y/checkSize)) % 2 < 1 )
      {
        setPixel(image.get(),x,y,255,0,0);
      }
      else
      {
        setPixel(image.get(),x,y,255,255,255);
      }
    }
  }
  Magick::Image output(WIDTH,HEIGHT,"RGB",Magick::CharPixel,image.get());
  output.depth(16);
  output.write("test.png");
  return EXIT_SUCCESS;
}

11 The % (modulus) Operator The remainder operator (%) returns the integer remainder of dividing the first operand by the second. For example, 7/2 = 3 remainder 1, so 7 % 2 = 1; similarly 299/100 = 2 remainder 99 (2*100 = 200 and 299-200 = 99), so 299 % 100 = 99.

12 The % (modulus) Operator The magnitude of m % n must always be less than the divisor n. The table below shows some results of the % operator:

3 % 5 = 3    5 % 3 = 2
4 % 5 = 4    5 % 4 = 1
5 % 5 = 0    15 % 5 = 0
6 % 5 = 1    15 % 6 = 3
7 % 5 = 2    15 % -7 varies (1 under gcc)
8 % 5 = 3    15 % 0 is undefined (core dump under gcc)
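
A small sketch (not from the slides) that prints a few of these results:

#include <iostream>

int main()
{
  // the magnitude of the result is always less than the divisor
  std::cout<< 7 % 5 <<"\n";     // 2
  std::cout<< 15 % 6 <<"\n";    // 3
  std::cout<< 299 % 100 <<"\n"; // 99
  // with a negative divisor the sign follows the first operand (C++11)
  std::cout<< 15 % -7 <<"\n";   // 1
  // 15 % 0 is undefined behaviour - do not do it
  return 0;
}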

13 (The example images on the slide use x%40 y%40, x%20 y%10 and x%100 y%2.)

int main()
{
  boost::scoped_array<char> image(new char [WIDTH*HEIGHT*3*sizeof(char)]);
  // clear to white
  setBgColour(image.get(),255,255,255);
  for(int y=0; y<HEIGHT; ++y)
  {
    for(int x=0; x<WIDTH; ++x)
    {
      if( (y%20) && (x%20) )
      {
        setPixel(image.get(),x,y,255,0,0);
      }
      else
      {
        setPixel(image.get(),x,y,255,255,255);
      }
    }
  }
  Magick::Image output(WIDTH,HEIGHT,"RGB",Magick::CharPixel,image.get());
  output.depth(16);
  output.write("test.png");
  return EXIT_SUCCESS;
}

14 Fake Sphere The following function is used to describe a sphere:

// code modified from Computer Graphics with OpenGL, F.S. Hill
// get the value on the sphere at co-ord s,t
float fakeSphere(float _s, float _t)
{
  float r=sqrt((_s-0.5)*(_s-0.5)+(_t-0.5)*(_t-0.5));
  if(r<0.5)
    return 1-r/0.5;
  else
    return 1.0;
}

15 Fake Sphere This function will work for any value of s and t in the range 0-1. The returned values are 1.0 (white) outside the sphere, falling to 0 (black) at the sphere's edge and rising back to white in the centre, as shown in the image on the slide. Using the template code, add this function and draw a sphere.

16

int main()
{
  boost::scoped_array<float> image(new float [WIDTH*HEIGHT*3*sizeof(float)]);
  // index into our data structure
  unsigned long int index=0;
  // Our step in texture space from 0-1 within the width of the image
  float sStep=1.0/WIDTH;
  float tStep=1.0/HEIGHT;
  // actual S,T value for texture space
  float s=0.0;
  float t=0.0;
  // loop for the image dimensions
  for(int y=0; y<HEIGHT; y++)
  {
    for(int x=0; x<WIDTH; x++)
    {
      // fill the data values with sphere values
      image[index]=fakeSphere(s,t);
      image[index+1]=fakeSphere(s,t);
      image[index+2]=fakeSphere(s,t);
      // update the S value
      s+=sStep;
      // step to the next image index
      index+=3;
    }
    // update the T value
    t+=tStep;
    // reset S to the left hand value
    s=0.0;
  }
  Magick::Image output(WIDTH,HEIGHT,"RGB",Magick::FloatPixel,image.get());
  output.depth(16);
  output.write("test.png");
  return EXIT_SUCCESS;
}

17 Repeating Patterns As the previous function works from 0-1, if we make the sphere values range from 0-8 and only use the part after the decimal point we can create a repeating pattern as shown on the slide. To do this we use the C++ function fmod. The fmod() function computes the floating-point remainder of x/y. Specifically, it returns the value x-i*y for some integer i such that, if y is non-zero, the result has the same sign as x and magnitude less than the magnitude of y. So to make the values of S and T repeat 8 times we would use:

float ss=fmod(s*8,1);
float tt=fmod(t*8,1);

18

int main()
{
  boost::scoped_array<float> image(new float [WIDTH*HEIGHT*3*sizeof(float)]);
  // index into our data structure
  unsigned long int index=0;
  // Our step in texture space from 0-1 within the width of the image
  float sStep=1.0/WIDTH;
  float tStep=1.0/HEIGHT;
  // actual S,T value for texture space
  float s=0.0;
  float t=0.0;
  float ss,tt;
  // loop for the image dimensions
  for(int y=0; y<HEIGHT; y++)
  {
    for(int x=0; x<WIDTH; x++)
    {
      ss=fmod(s*40,1.0);
      tt=fmod(t*40,1.0);
      // fill the data values with sphere values
      image[index]=fakeSphere(ss,tt);
      image[index+1]=fakeSphere(ss,tt);
      image[index+2]=fakeSphere(ss,tt);
      // update the S value
      s+=sStep;
      // step to the next image index
      index+=3;
    }
    // update the T value
    t+=tStep;
    // reset S to the left hand value
    s=0.0;
  }
  Magick::Image output(WIDTH,HEIGHT,"RGB",Magick::FloatPixel,image.get());
  output.depth(16);
  output.write("test.png");
  return EXIT_SUCCESS;
}

(The other example images on the slide use combinations such as ss=fmod(s*6,1) tt=fmod(t*4,1), ss=fmod(s*2,1) tt=fmod(t*6,1), ss=fmod(s*4,2) tt=fmod(t*4,2) and ss=fmod(s*16,4) tt=fmod(t*16,4).)

19 Reading Images The read method of the image will attempt to read the image file and determine its format. We can then access the different elements of the image (size, pixels etc.) via the various methods. The following example loads an image and generates mipmaps.

20 mipmapping Mipmapping is a technique where an image is repeatedly reduced in size (by a power of 2). This is done by sampling the image and storing the averaged pixels in the new mipmap level. The algorithm used can vary, using different filtering techniques.
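
The following slides build the levels by hand; note that when the image is eventually uploaded as an OpenGL texture the whole chain can also be built in one call (as used later in the QImage example):

glGenerateMipmap(GL_TEXTURE_2D); // build all levels from the currently bound level-0 image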

21

Magick::Image image;
image.read(argv[1]);
int width=image.size().width();
int height=image.size().height();
// only going to deal with RGB for now
unsigned char *sourceImage= new unsigned char[width*height*3];
unsigned int i=-1;
// this is slow and we could use image.getPixels to access the raw data, however this will mean
// we have to manage bits per pixel and other type information; the method below is easy to use
// as the quantum will always be converted for us to the correct type (uchar)
Magick::Color c;
for(int h=0; h<height; ++h)
{
  for(int w=0; w<width; ++w)
  {
    c=image.pixelColor(w,h);
    sourceImage[++i]= c.redQuantum();
    sourceImage[++i]= c.greenQuantum();
    sourceImage[++i]= c.blueQuantum();
  }
}

22 Getting Data

void getRGB( const unsigned char *_data, int _x, int _y,
             unsigned char &o_r, unsigned char &o_g, unsigned char &o_b,
             int _width )
{
  o_r=_data[((_width*3)*_y)+(_x*3)];
  o_g=_data[((_width*3)*_y)+(_x*3)+1];
  o_b=_data[((_width*3)*_y)+(_x*3)+2];
}

23

// loop until we run out of mip levels
int mipLevel=2;
for(int ml=width/2; ml>=2; ml/=2)
{
  unsigned char *destImage = new unsigned char[(width/2*height/2)*3];
  i=0;
  unsigned char r1,g1,b1;
  unsigned char r2,g2,b2;
  unsigned char r3,g3,b3;
  unsigned char r4,g4,b4;
  // now loop and average the source image data into the new one
  for(int h=0; h<height/mipLevel; ++h)
  {
    for(int w=0; w<width/mipLevel; ++w)
    {
      int dw=w*mipLevel;
      int dh=h*mipLevel;
      getRGB(sourceImage,dw,dh,r1,g1,b1,width);
      getRGB(sourceImage,dw+1,dh,r2,g2,b2,width);
      getRGB(sourceImage,dw,dh+1,r3,g3,b3,width);
      getRGB(sourceImage,dw+1,dh+1,r4,g4,b4,width);
      destImage[i]=sqrt((r1*r1+r2*r2+r3*r3+r4*r4)/4);
      destImage[i+1]=sqrt((g1*g1+g2*g2+g3*g3+g4*g4)/4);
      destImage[i+2]=sqrt((b1*b1+b2*b2+b3*b3+b4*b4)/4);
      i+=3;
    }
  }
  // write out image and close
  Magick::Image output(width/mipLevel,height/mipLevel,"RGB",Magick::CharPixel,destImage);
  output.depth(16);
  char str[40];
  static int f=0;
  sprintf(str,"%02dmipmap%dx%d.png",f++, ml,ml);
  output.write(str);
  mipLevel*=2;
} // end of each mip

24 QImage Qt has a built-in image loading class called QImage. It is built as a wrapper around other system image libraries, a bit like ImageMagick, so it should load the same types of images as ImageMagick (although not always). The following example loads in an image and uses the red channel to generate the height of the geometry.

25 In this example we get the width and height from the image and use these for the steps. We then generate a series of points equally spaced in x,z, with y set to the value of the red channel.

26

// load our image and get size
QImage image(m_imageName.c_str());
int imageWidth=image.size().width()-1;
int imageHeight=image.size().height()-1;
std::cout<<"image size "<<imageWidth<<" "<<imageHeight<<"\n";
// calculate the deltas for the x,z values of our point
float wStep=_width/(float)imageWidth;
float dStep=_depth/(float)imageHeight;
// now we assume that the grid is centered at 0,0,0 so we make
// it flow from -w/2 -d/2
float xPos=-(_width/2.0);
float zPos=-(_depth/2.0);
// now loop from top left to bottom right and generate points
std::vector <ngl::Vec3> gridPoints;
for(int z=0; z<=imageHeight; ++z)
{
  for(int x=0; x<=imageWidth; ++x)
  {
    // grab the colour and use for the Y (height); only use the red channel
    QColor c(image.pixel(x,z));
    gridPoints.push_back(ngl::Vec3(xPos,c.redF()*4,zPos));
    // now store the colour as well
    gridPoints.push_back(ngl::Vec3(c.redF(),c.greenF(),c.blueF()));
    // calculate the new position
    xPos+=wStep;
  }
  // now increment to next z row
  zPos+=dStep;
  // we need to re-set the xPos for new row
  xPos=-(_width/2.0);
}

27 Indices Next we create a series of indices for the triangle strip. Once we have a complete row, we add a special index value to indicate that we are at the end of a row. This will be used later by the OpenGL primitive restart mechanism.

28

std::vector <GLuint> indices;
// some unique index value to indicate we have finished with a row and
// want to draw a new one
GLuint restartFlag=imageWidth*imageHeight+9999;
for(int z=0; z<imageHeight; ++z)
{
  for(int x=0; x<imageWidth; ++x)
  {
    // Vertex in actual row
    indices.push_back(z * (imageWidth+1) + x);
    // Vertex row below
    indices.push_back((z + 1) * (imageWidth+1) + x);
  }
  // now we have a row of tri strips signal a re-start
  indices.push_back(restartFlag);
}

29

// we could use an ngl::VertexArrayObject but in this case this will show how to
// create our own as a demo / reminder
// so first create a vertex array
glGenVertexArrays(1, &m_vaoID);
glBindVertexArray(m_vaoID);
// now a VBO for the grid point data
GLuint vboID;
glGenBuffers(1, &vboID);
glBindBuffer(GL_ARRAY_BUFFER, vboID);
glBufferData(GL_ARRAY_BUFFER,gridPoints.size()*sizeof(ngl::Vec3),&gridPoints[0].m_x,GL_STATIC_DRAW);
// and one for the index values
GLuint iboID;
glGenBuffers(1, &iboID);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, iboID);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, indices.size()*sizeof(GLuint),&indices[0], GL_STATIC_DRAW);
// setup our attribute pointers, we are using 0 for the verts (note the stride is going to
// be 2*Vec3)
glVertexAttribPointer(0,3,GL_FLOAT,GL_FALSE,sizeof(ngl::Vec3)*2,0);
// this one is the colour pointer and we need to offset by 3 floats
glVertexAttribPointer(1,3,GL_FLOAT,GL_FALSE,sizeof(ngl::Vec3)*2,((float *)NULL + (3)));
// enable the pointers
glEnableVertexAttribArray(0);
glEnableVertexAttribArray(1);

30

glEnable(GL_PRIMITIVE_RESTART);
glPrimitiveRestartIndex(restartFlag);

We now tell OpenGL to enable the primitive restart system and tell it what index value should trigger a restart. This is very similar to the old glBegin / glEnd style commands but works on indexed buffer data. When glDrawElements encounters the restartFlag value it will re-start the draw with a new primitive.
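
The draw call itself is not shown on the slide; a minimal sketch (assuming the m_vaoID and indices built earlier) could be:

glEnable(GL_PRIMITIVE_RESTART);
glPrimitiveRestartIndex(restartFlag);
glBindVertexArray(m_vaoID);
// one triangle strip per row of the grid; every restartFlag index starts a new strip
glDrawElements(GL_TRIANGLE_STRIP, GLsizei(indices.size()), GL_UNSIGNED_INT, 0);
glBindVertexArray(0);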

31 Texture Mapping The realism of an image is greatly enhanced by adding surface textures to the various faces of a mesh object. In part a) of the figure, images have been pasted onto each face of a box. Part b) shows an image which has been wrapped around a cylinder. The wall also appears to be made of bricks; however, it is just a flat plane with a repeated texture applied to it.

32 Basic Texture Techniques The basic technique begins with some texture function, texture(s,t), in texture space, which is traditionally marked off by parameters named s and t. The function texture(s,t) produces a colour or intensity value for each value of s and t between 0 and 1. The figure shows two examples of texture functions, where the value of texture(s,t) varies between 0 (dark) and 1 (light). Part a shows a bitmap texture and part b shows a procedurally produced texture.

33 Bitmap Textures Textures are often formed from bitmap representations of images. Such a representation consists of an array of colour values such as texture[c][r], often called texels. If the array has C columns and R rows, the indices c and r vary from 0 to C-1 and 0 to R-1 respectively. In the simplest case the function texture(s,t) accesses samples in the array as in the code:

Colour texture(float s, float t)
{
  return texture[(int)(s*C)][(int)(t*R)];
}

34 Bitmap Textures In this case Colour holds an RGB triple. For example, if R=400 and C=600, then texture(0.261,0.783) evaluates to texture[156][313]. Note the variation in s from 0 to 1 encompasses 600 pixels, whereas the same variation in t encompasses 400 pixels. To avoid distortion during rendering, this texture must be mapped onto a rectangle with aspect ratio 6/4.

35 Procedural Textures An alternative way to define a texture is to use a mathematical function or procedure. For instance the spherical shape that appears in the earlier image could be generated by the following code:

float fakeSphere(float s, float t)
{
  float r=sqrt((s-0.5)*(s-0.5)+(t-0.5)*(t-0.5));
  if(r<0.3) return 1-r/0.3;
  else return 0.2;
}

This function varies from 1 (white) at the centre to 0 (black) at the edges of the apparent sphere. Anything that can be computed can provide a texture: smooth blends and swirls of colour, fractals, solid objects etc. This is the way most modern rendering tools provide their shaders.

36 Pasting Textures onto a Flat Surface Since texture space is flat, it is simplest to paste a texture onto a flat surface. The figure on the slide shows a texture image mapped to a portion of a planar polygon F. We must specify how to associate points on the texture with points on F. In OpenGL 2.x we use the function glTexCoord2f() to associate a point in texture space, Pi=(si,ti), with each vertex Vi of the face. The function glTexCoord2f(s,t) sets the current texture coordinates to (s,t) and they are attached to subsequently defined vertices.

37 Pasting Textures onto a Flat Surface II Normally each call to glVertex3f is preceded by a call to glTexCoord2f so each vertex gets a new pair of texture coordinates. For example, to define a quadrilateral face and to position a texture on it we send OpenGL four texture coordinates and four 3D points as follows:

glBegin(GL_QUADS);
glTexCoord2f(0.0,0.0); glVertex3f(1.0,2.5,1.5);
glTexCoord2f(0.0,0.6); glVertex3f(1.0,3.7,1.5);
glTexCoord2f(0.8,0.6); glVertex3f(2.0,3.7,1.5);
glTexCoord2f(0.8,0.0); glVertex3f(2.0,2.5,1.5);
glEnd();

Attaching a Pi to each Vi is equivalent to prescribing a polygon P in texture space that has the same number of vertices as F. Usually P has the same shape as F, so the mapping is linear and adds little distortion.

38 OpenGL 3.x In OpenGL 3.2 and above we just pass in the texture co-ordinates using attributes. We then access these values in the shader to determine the s,t values. Depending upon how these are created we may also have to do other transformations on the co-ordinates.

struct vertData
{
  GLfloat u; // texture cords
  GLfloat v;
  GLfloat nx; // normal cords
  GLfloat ny;
  GLfloat nz;
  GLfloat x; // vert cords
  GLfloat y;
  GLfloat z;
};

39 Mapping a Square to a Rectangle The figure shows the common case in which the four corners of the texture square are associated with the four corners of a rectangle. In this example the texture is a 640 by 480 pixel bitmap, and it is pasted onto a rectangle with aspect ratio 640/480 so it appears without distortion. Note that the texture coordinates still range from 0 to 1 even though the image is 640 by 480 pixels.

40 Repeating Textures The figure on the slide shows the use of texture coordinates that tile the texture, making it repeat. To do this, some texture coordinates that lie outside the interval [0,1] are used. When the rendering routine encounters a value of s or t outside the unit square, such as s=2.67, it ignores the integral part and uses only the fractional part, 0.67.

41 Repeating Textures II Thus the point on a face that requires (s,t)=(2.66,3.77) is textured with texture(0.66,0.77). By default OpenGL tiles textures this way; if desired, it may be set to clamp texture values instead. Thus, a coordinate pair (s,t) is sent down the pipeline along with each vertex of the face. The notion is that points inside F will be filled with texture values lying inside P by finding the internal coordinate values (s,t) through interpolation.
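
A small sketch (standard OpenGL calls, not from the slide) of switching between the two behaviours on the currently bound texture:

// tile the texture (the default repeat behaviour described above)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
// or clamp, so coordinates outside [0,1] use the edge texels instead
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);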

42 OpenGL Texture Mapping Steps To use texture mapping, you perform the following steps:
1. Create a texture object and specify a texture for that object.
2. Indicate how the texture is to be applied to each pixel.
3. Enable texture mapping.
4. Draw the scene, supplying both texture and geometric coordinates.

43 Creating a Texture Object A texture is usually thought of as being a 2D image but can also be either a 1D modulation value or a 3D volume data set. The data describing the texture may consist of one, two, three or four elements per texel. Typically image data is loaded from an image file to represent either R,G,B or R,G,B,A data; however, procedural texture functions may also be used, as shown below:

float fakeSphere(float s, float t)
{
  float r=sqrt((s-0.5)*(s-0.5)+(t-0.5)*(t-0.5));
  if(r<0.3) return 1-r/0.3;
  else return 0.2;
}

44 Indicate how the Texture is to be applied to Each Pixel You can choose any of four possible functions for computing the final RGBA value from the fragment colour and the texture image data. One possibility is simply to use the texture colour as the final colour (replace mode). Another method is to use the texture to modulate, or scale, the fragment's colour. In modern OpenGL this is done in the shader.
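
In a fragment shader these modes reduce to a one-line choice; a sketch (GLSL; tex, vertUV and outColour are as in the shader examples later in these notes, vertColour is an assumed interpolated vertex colour):

// replace mode: use the texture colour directly
outColour = texture(tex, vertUV);
// modulate mode (alternative): scale the interpolated vertex colour by the texture colour
// outColour = vertColour * texture(tex, vertUV);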

45 Enable Texture Mapping Texture mapping must be enabled before drawing the scene with textures. Texturing is enabled or disabled using glEnable() and glDisable(). The type of texturing to enable is specified using one of GL_TEXTURE_1D, GL_TEXTURE_2D or GL_TEXTURE_3D.
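
For example (fixed-function / compatibility profile OpenGL):

glEnable(GL_TEXTURE_2D);   // turn 2D texturing on before drawing
// ... draw textured geometry ...
glDisable(GL_TEXTURE_2D);  // and off again afterwards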

46 Specifying a Texture

void glTexImage2D(
  GLenum target, GLint level,
  GLint internalFormat,
  GLsizei width, GLsizei height,
  GLint border, GLenum format,
  GLenum type, const GLvoid *texels);

The function glTexImage2D defines a 2D texture; it takes several arguments as shown above. target is set to either GL_TEXTURE_2D or GL_PROXY_TEXTURE_2D. level is used to specify the level for multiple images (mipmaps); if this is not used, set it to 0. internalFormat specifies the format of the data; there are 38 different constants, but the most common are GL_RGB and GL_RGBA.

47 Specifying a Texture II width and height specify the extents of the image, and the values must be a power of 2 (128, 256, 512 etc.). border specifies the width of a border, which is either 0 (no border) or 1. format and type describe the format and data type of the texture image data: format is usually GL_RGB, GL_RGBA or GL_LUMINANCE, while type tells how the data in the image is actually stored (i.e. unsigned int, float, char etc.) and is set using GL_BYTE, GL_INT, GL_FLOAT, GL_UNSIGNED_BYTE etc. Finally texels contains the texture image data.

48 glTexParameter

glTexParameter{if}(GLenum target, GLenum pname, TYPE param);

glTexParameter is used to specify how textures behave. It has many different parameters, as follows. The target parameter is GL_TEXTURE_[1D,2D,3D] depending on the texture type; the pname and param values are shown in the following table.

49 glTexParameter values

Parameter                  Values
GL_TEXTURE_WRAP_S          GL_CLAMP, GL_CLAMP_TO_EDGE, GL_REPEAT
GL_TEXTURE_WRAP_T          GL_CLAMP, GL_CLAMP_TO_EDGE, GL_REPEAT
GL_TEXTURE_WRAP_R          GL_CLAMP, GL_CLAMP_TO_EDGE, GL_REPEAT
GL_TEXTURE_MAG_FILTER      GL_NEAREST, GL_LINEAR
GL_TEXTURE_MIN_FILTER      GL_NEAREST, GL_LINEAR, GL_NEAREST_MIPMAP_NEAREST, GL_NEAREST_MIPMAP_LINEAR, GL_LINEAR_MIPMAP_NEAREST, GL_LINEAR_MIPMAP_LINEAR
GL_TEXTURE_BORDER_COLOR    any four colour values in [0.0, 1.0]
GL_TEXTURE_PRIORITY        [0.0, 1.0] for the current texture object
GL_TEXTURE_MIN_LOD         any floating point value
GL_TEXTURE_MAX_LOD         any floating point value
GL_TEXTURE_BASE_LEVEL      any non-negative integer
GL_TEXTURE_MAX_LEVEL       any non-negative integer

50 Creating a Texture Object with OpenGL

GLuint textureName;
float *data = /* some image data */;

glGenTextures(1,&textureName);
glBindTexture(GL_TEXTURE_2D,textureName);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_S,GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_T,GL_CLAMP);
glTexImage2D(GL_TEXTURE_2D,0,GL_RGB,size,size,0,GL_RGB,GL_FLOAT,data);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);

In the above example textureName is the id of the texture object, and data is an array of the RGB tuple data created for the texture (either procedurally or loaded in from a file).

51 Texture Co-ordinates The following example shows how to specify vertex and texture co-ordinates in OpenGL 3.x. First we create an array of vertices and an array of texture co-ordinates:

GLfloat vertices[] = {
  -1,1,-1, 1,1,-1, 1,-1,-1, -1,1,-1, -1,-1,-1, 1,-1,-1,  // back
  -1,1,1, 1,1,1, 1,-1,1, -1,-1,1, 1,-1,1, -1,1,1,        // front
  -1,1,-1, 1,1,-1, 1,1,1, -1,1,1, 1,1,1, -1,1,-1,        // top
  -1,-1,-1, 1,-1,-1, 1,-1,1, -1,-1,1, 1,-1,1, -1,-1,-1,  // bottom
  -1,1,-1, -1,1,1, -1,-1,-1, -1,-1,-1, -1,-1,1, -1,1,1,  // left
  1,1,-1, 1,1,1, 1,-1,-1, 1,-1,-1, 1,-1,1, 1,1,1         // right
};

GLfloat texture[] = {
  0,0, 0,1, 1,1, 0,0, 1,0, 1,1,  // back
  0,1, 1,0, 1,1, 0,0, 1,0, 0,1,  // front
  0,0, 1,0, 1,1, 0,1, 1,1, 0,0,  // top
  0,0, 1,0, 1,1, 0,1, 1,1, 0,0,  // bottom
  1,0, 1,1, 0,0, 0,0, 0,1, 1,1,  // left
  1,0, 1,1, 0,0, 0,0, 0,1, 1,1   // right
};

52

// now we repeat for the UV data using the second VBO
glBindBuffer(GL_ARRAY_BUFFER, vboID[1]);
glBufferData(GL_ARRAY_BUFFER, sizeof(texture), texture, GL_STATIC_DRAW);
glVertexAttribPointer(1,2,GL_FLOAT,GL_FALSE,0,0);
glEnableVertexAttribArray(1);
...
shader->bindAttribute("TextureShader",0,"inVert");
shader->bindAttribute("TextureShader",1,"inUV");

53 Vertex Shader

#version 400
// MVP passed from app
uniform mat4 MVP;
// first attribute - the vertex values from our VAO
layout (location=0) in vec3 inVert;
// second attribute - the UV values from our VAO
layout (location=1) in vec2 inUV;
// we use this to pass the UV values to the frag shader
out vec2 vertUV;

void main()
{
  // calculate the vertex position
  gl_Position = MVP*vec4(inVert, 1.0);
  // pass the UV values to the frag shader
  vertUV=inUV.st;
}

54 Fragment Shader

#version 400
// this is a pointer to the current 2D texture object
uniform sampler2D tex;
// the vertex UV
smooth in vec2 vertUV;
// the final fragment colour
layout (location =0) out vec4 outColour;

void main ()
{
  // set the fragment colour to the current texture value
  outColour = texture(tex,vertUV);
}

55 Loading Images There are many ways to load image data, and a number of libraries are available under Linux. Qt provides us with QImage and we can use this to load the image data. Ultimately, when dealing with images for OpenGL, we need the data in a contiguous block of RGB(A) memory. To get this data we can build a simple texture structure which loads from file and stores the data.

56 Loading Images

void GLWindow::loadTexture()
{
  QImage *image = new QImage();
  bool loaded=image->load("textures/crate.bmp");
  if(loaded == true)
  {
    int width=image->width();
    int height=image->height();
    unsigned char *data = new unsigned char[ width*height*3];
    unsigned int index=0;
    QRgb colour;
    for( int y=0; y<height; ++y)
    {
      for( int x=0; x<width; ++x)
      {
        colour=image->pixel(x,y);
        data[index++]=qRed(colour);
        data[index++]=qGreen(colour);
        data[index++]=qBlue(colour);
      }
    }
    glGenTextures(1,&m_textureName);
    glBindTexture(GL_TEXTURE_2D,m_textureName);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST_MIPMAP_NEAREST);
    glTexImage2D(GL_TEXTURE_2D,0,GL_RGB,width,height,0,GL_RGB,GL_UNSIGNED_BYTE,data);
    glGenerateMipmap(GL_TEXTURE_2D); // Allocate the mipmaps
  }
}

57 Qt Texture Loading QGLWidget has a static method, convertToGLFormat, which takes any QImage and returns a QImage suitable for OpenGL texturing; this is shown in the following code.

// QGLWidget can convert a QImage to a format suitable for OpenGL
// we call this and then load the result into OpenGL
finalImage = QGLWidget::convertToGLFormat(finalImage);
// the image is in RGBA format and unsigned byte - load it ready for later
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, finalImage.width(), finalImage.height(),
             0, GL_RGBA, GL_UNSIGNED_BYTE, finalImage.bits());

58 ngl::Texture ngl has a very simple texture class which will load in an image file using QImage. It will determine if the image is either RGB or RGBA and allocate the correct texture data. By default it will make the current active texture unit texture 0; however, we can set other texture units by calling setMultiTexture before generating the texture id.

59 MultiTexture In the previous examples only one texture unit is active at a time. This can be quite limiting, as we may have several texture maps we need to access in the shader at the same time. To do this we use the OpenGL multitexture features.
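
Under the hood this maps onto glActiveTexture plus one sampler uniform per unit; a minimal sketch in raw OpenGL (the texture ids and the shaderID handle are assumed to exist already):

// bind a different texture object to each texture unit
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, colourTexID);
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, specTexID);
glActiveTexture(GL_TEXTURE2);
glBindTexture(GL_TEXTURE_2D, normalTexID);
// tell each sampler2D uniform in the shader which unit to read from
glUniform1i(glGetUniformLocation(shaderID,"tex"),0);
glUniform1i(glGetUniformLocation(shaderID,"spec"),1);
glUniform1i(glGetUniformLocation(shaderID,"normalMap"),2);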

60 Normal Mapping In the following example we will be using three texture maps, as shown on the slide. One will be used for the base colour, one for the normals and one for the specular highlights.

61

// set our samplers for each of the textures, this will correspond to the
// multitexture id below
shader->setShaderParam1i("tex",0);
shader->setShaderParam1i("spec",1);
shader->setShaderParam1i("normalMap",2);
// load and set a texture for the colour
ngl::Texture t("textures/trollcolour.tiff");
t.setMultiTexture(0);
t.setTextureGL();
// mip map the textures
glGenerateMipmap(GL_TEXTURE_2D);
// now one for the specular map
ngl::Texture spec("textures/2k_troll_spec_map.jpg");
spec.setMultiTexture(1);
spec.setTextureGL();
// mip map the textures
glGenerateMipmap(GL_TEXTURE_2D);
// this is our normal map
ngl::Texture normal("textures/2k_ct_normal.tif");
normal.setMultiTexture(2);
normal.setTextureGL();
// mip map the textures
glGenerateMipmap(GL_TEXTURE_2D);

62 Normal Mapping In this case we are using normal maps generated from ZBrush, which are expressed in tangent space. When normal mapping we calculate the normal, tangent and bi-tangent (sometimes called the binormal) of the current surface point so that all our calculations are done in the same space. Thus we must do some extra calculations on our data.

63 Normal Mapping We are going to load our mesh from an obj file and use its normals to calculate the tangent and bitangent. These will then be passed to our shader and used to transform our lights into tangent space for the shading calculations. The following data structure will be passed to the shader for each vertex.

64

// a simple structure to hold our vertex data
struct vertData
{
  GLfloat u; // tex cords from obj
  GLfloat v; // tex cords
  GLfloat nx; // normal from obj mesh
  GLfloat ny;
  GLfloat nz;
  GLfloat x; // position from obj
  GLfloat y;
  GLfloat z;
  GLfloat tx; // tangent
  GLfloat ty;
  GLfloat tz;
  GLfloat bx; // binormal
  GLfloat by;
  GLfloat bz;
};

65 ngl::Obj The ngl::Obj class will load in an obj file and allow us access to the stored vertex, uv and normal data. It also gives us the face structure, from which we can access all of the data. The following code shows the basic parsing of the face data:

std::vector <ngl::Vec3> verts=mesh.getVertexList();
std::vector <ngl::Face> faces=mesh.getFaceList();
std::vector <ngl::Vec3> tex=mesh.getTextureCordList();
std::vector <ngl::Vec3> normals=mesh.getNormalList();

66

for(unsigned int i=0;i<nFaces;++i)
{
  // now for each triangle in the face (remember we ensured tri above)
  for(int j=0;j<3;++j)
  {
    // pack in the vertex data first
    d.x=verts[faces[i].m_vert[j]].m_x;
    d.y=verts[faces[i].m_vert[j]].m_y;
    d.z=verts[faces[i].m_vert[j]].m_z;
    d.nx=normals[faces[i].m_norm[j]].m_x;
    d.ny=normals[faces[i].m_norm[j]].m_y;
    d.nz=normals[faces[i].m_norm[j]].m_z;
    d.u=tex[faces[i].m_tex[j]].m_x;
    d.v=tex[faces[i].m_tex[j]].m_y;

67 Tangent Calculations

    // now we calculate the tangent / bi-normal (tangent) based on the article here
    //
    ngl::Vec3 c1 = normals[faces[i].m_norm[j]].cross(ngl::Vec3(0.0, 0.0, 1.0));
    ngl::Vec3 c2 = normals[faces[i].m_norm[j]].cross(ngl::Vec3(0.0, 1.0, 0.0));
    ngl::Vec3 tangent;
    ngl::Vec3 binormal;
    if(c1.length()>c2.length())
    {
      tangent = c1;
    }
    else
    {
      tangent = c2;
    }
    // now we normalize the tangent so we don't need to do it in the shader
    tangent.normalize();
    // now we calculate the binormal using the model normal and tangent (cross)
    binormal = normals[faces[i].m_norm[j]].cross(tangent);
    // normalize again so we don't need to in the shader
    binormal.normalize();
    d.tx=tangent.m_x;
    d.ty=tangent.m_y;
    d.tz=tangent.m_z;
    d.bx=binormal.m_x;
    d.by=binormal.m_y;
    d.bz=binormal.m_z;
    // finally add it to our mesh VAO structure
    vboMesh.push_back(d);

68 Vertex Shader

#version 400
// first attribute - the vertex values from our VAO
layout (location =0) in vec3 inVert;
// second attribute - the UV values from our VAO
layout (location =1) in vec2 inUV;
// third attribute - the normal values from our VAO
layout (location =2) in vec3 inNormal;
// fourth attribute - the tangent values from our VAO
layout (location =3) in vec3 inTangent;
// fifth attribute - the binormal values from our VAO
layout (location =4) in vec3 inBinormal;
...
void main()
{
  // calculate the vertex position
  gl_Position = MVP*vec4(inVert, 1.0);
  // pass the UV values to the frag shader
  vertUV=inUV.st;
  vec4 worldPosition = MV * vec4(inVert, 1.0);
  // now fill the array of light pos and half vectors for the available lights
  for (int i=0; i<3; ++i)
  {
    vec3 lightDir = normalize(light[i].position.xyz - worldPosition.xyz);
    // transform light and half angle vectors by the tangent basis
    // this is based on code from here
    //
    // as our values are already normalized we don't need to here
    lightVec[i].x = dot (lightDir, inTangent);
    lightVec[i].y = dot (lightDir, inBinormal);
    lightVec[i].z = dot (lightDir, inNormal);
    vec3 halfVector = normalize(worldPosition.xyz + lightDir);
    halfVec[i].x = dot (halfVector, inTangent);
    halfVec[i].y = dot (halfVector, inBinormal);
    halfVec[i].z = dot (halfVector, inNormal);
  }
}

69 Fragment Shader

// our output fragment colour
out vec4 fragColour;

void main ()
{
  // lookup normal from normal map, move from [0,1] to [-1,1] range, normalize
  vec3 normal=normalize( texture(normalMap, vertUV.st).xyz * 2.0 - 1.0);
  // we need to flip the z as this is done in zbrush
  normal.z = -normal.z;
  // default material values to be accumulated
  float lamberFactor;
  vec4 diffuseMaterial = texture(tex, vertUV.st);
  // compute specular lighting
  vec4 specularMaterial=texture(spec, vertUV.st);
  float shininess;
  for (int i=0; i<3; ++i)
  {
    lamberFactor= max (dot (lightVec[i], normal), 0.0);
    // the light is hitting us here so calculate and accumulate values
    if (lamberFactor > 0.0)
    {
      // get the phong / blinn values
      shininess = pow (max (dot ( halfVec[i],normal), 0.0), specPower);
      fragColour += diffuseMaterial * light[i].diffuse * lamberFactor;
      //fragColour += specularMaterial * light[i].specular * shininess;
    }
  }
}

70 References
Computer Graphics With OpenGL, 2nd Ed., F.S. Hill Jr.
The OpenGL Programming Guide, 4th Ed., Shreiner et al.


CSE 167. Discussion 03 ft. Glynn 10/16/2017

CSE 167. Discussion 03 ft. Glynn 10/16/2017 CSE 167 Discussion 03 ft Glynn 10/16/2017 Announcements - Midterm next Tuesday(10/31) - Sample midterms are up - Project 1 late grading until this Friday - You will receive 75% of the points you ve earned

More information

Texturing. Slides done bytomas Akenine-Möller and Ulf Assarsson Department of Computer Engineering Chalmers University of Technology

Texturing. Slides done bytomas Akenine-Möller and Ulf Assarsson Department of Computer Engineering Chalmers University of Technology Texturing Slides done bytomas Akenine-Möller and Ulf Assarsson Department of Computer Engineering Chalmers University of Technology 1 Texturing: Glue n-dimensional images onto geometrical objects l Purpose:

More information

Introduction to Computer Graphics with WebGL

Introduction to Computer Graphics with WebGL Introduction to Computer Graphics with WebGL Ed Angel The Mandelbrot Set Fractals Fractal (fractional geometry) objects generate some of the most complex and beautiful graphics - The mathematics describing

More information

Building Models. Prof. George Wolberg Dept. of Computer Science City College of New York

Building Models. Prof. George Wolberg Dept. of Computer Science City College of New York Building Models Prof. George Wolberg Dept. of Computer Science City College of New York Objectives Introduce simple data structures for building polygonal models - Vertex lists - Edge lists Deprecated

More information

EDAF80 Introduction to Computer Graphics. Seminar 3. Shaders. Michael Doggett. Slides by Carl Johan Gribel,

EDAF80 Introduction to Computer Graphics. Seminar 3. Shaders. Michael Doggett. Slides by Carl Johan Gribel, EDAF80 Introduction to Computer Graphics Seminar 3 Shaders Michael Doggett 2017 Slides by Carl Johan Gribel, 2010-13 Today OpenGL Shader Language (GLSL) Shading theory Assignment 3: (you guessed it) writing

More information

Computer Graphics. Bing-Yu Chen National Taiwan University

Computer Graphics. Bing-Yu Chen National Taiwan University Computer Graphics Bing-Yu Chen National Taiwan University Introduction to OpenGL General OpenGL Introduction An Example OpenGL Program Drawing with OpenGL Transformations Animation and Depth Buffering

More information

Computer Graphics. Three-Dimensional Graphics VI. Guoying Zhao 1 / 73

Computer Graphics. Three-Dimensional Graphics VI. Guoying Zhao 1 / 73 Computer Graphics Three-Dimensional Graphics VI Guoying Zhao 1 / 73 Texture mapping Guoying Zhao 2 / 73 Objectives Introduce Mapping Methods Texture Mapping Environment Mapping Bump Mapping Consider basic

More information

Imaging and Raster Primitives

Imaging and Raster Primitives Realtime 3D Computer Graphics & Virtual Reality Bitmaps and Textures Imaging and Raster Primitives Vicki Shreiner Imaging and Raster Primitives Describe OpenGL s raster primitives: bitmaps and image rectangles

More information

Texture Mapping 1/34

Texture Mapping 1/34 Texture Mapping 1/34 Texture Mapping Offsets the modeling assumption that the BRDF doesn t change in u and v coordinates along the object s surface Store a reflectance as an image called a texture Map

More information

Texture Mapping. Texture (images) lecture 16. Texture mapping Aliasing (and anti-aliasing) Adding texture improves realism.

Texture Mapping. Texture (images) lecture 16. Texture mapping Aliasing (and anti-aliasing) Adding texture improves realism. lecture 16 Texture mapping Aliasing (and anti-aliasing) Texture (images) Texture Mapping Q: Why do we need texture mapping? A: Because objects look fake and boring without it. Adding texture improves realism.

More information

Methodology for Lecture

Methodology for Lecture Basic Geometry Setup Methodology for Lecture Make mytest1 more ambitious Sequence of steps Demo Review of Last Demo Changed floor to all white, added global for teapot and teapotloc, moved geometry to

More information

Texture mapping. Computer Graphics CSE 167 Lecture 9

Texture mapping. Computer Graphics CSE 167 Lecture 9 Texture mapping Computer Graphics CSE 167 Lecture 9 CSE 167: Computer Graphics Texture Mapping Overview Interpolation Wrapping Texture coordinates Anti aliasing Mipmaps Other mappings Including bump mapping

More information

last time put back pipeline figure today will be very codey OpenGL API library of routines to control graphics calls to compile and load shaders

last time put back pipeline figure today will be very codey OpenGL API library of routines to control graphics calls to compile and load shaders last time put back pipeline figure today will be very codey OpenGL API library of routines to control graphics calls to compile and load shaders calls to load vertex data to vertex buffers calls to load

More information

Lets assume each object has a defined colour. Hence our illumination model is looks unrealistic.

Lets assume each object has a defined colour. Hence our illumination model is looks unrealistic. Shading Models There are two main types of rendering that we cover, polygon rendering ray tracing Polygon rendering is used to apply illumination models to polygons, whereas ray tracing applies to arbitrary

More information

Computação Gráfica. Computer Graphics Engenharia Informática (11569) 3º ano, 2º semestre. Chap. 4 Windows and Viewports

Computação Gráfica. Computer Graphics Engenharia Informática (11569) 3º ano, 2º semestre. Chap. 4 Windows and Viewports Computação Gráfica Computer Graphics Engenharia Informática (11569) 3º ano, 2º semestre Chap. 4 Windows and Viewports Outline : Basic definitions in 2D: Global coordinates (scene domain): continuous domain

More information

OpenGL. Jimmy Johansson Norrköping Visualization and Interaction Studio Linköping University

OpenGL. Jimmy Johansson Norrköping Visualization and Interaction Studio Linköping University OpenGL Jimmy Johansson Norrköping Visualization and Interaction Studio Linköping University Background Software interface to graphics hardware 250+ commands Objects (models) are built from geometric primitives

More information

9.Texture Mapping. Chapter 9. Chapter Objectives

9.Texture Mapping. Chapter 9. Chapter Objectives Chapter 9 9.Texture Mapping Chapter Objectives After reading this chapter, you ll be able to do the following: Understand what texture mapping can add to your scene Specify texture images in compressed

More information

Mipmaps. Lecture 23 Subsection Fri, Oct 30, Hampden-Sydney College. Mipmaps. Robb T. Koether. Discrete Sampling.

Mipmaps. Lecture 23 Subsection Fri, Oct 30, Hampden-Sydney College. Mipmaps. Robb T. Koether. Discrete Sampling. Lecture 23 Subsection 8.8.2 Hampden-Sydney College Fri, Oct 30, 2009 Outline 1 2 3 4 5 dumay.info Outline 1 2 3 4 5 dumay.info Suppose we are drawing a 2-dimensional black-and-white checkerboard pattern.

More information

CS770/870 Spring 2017 Open GL Shader Language GLSL

CS770/870 Spring 2017 Open GL Shader Language GLSL Preview CS770/870 Spring 2017 Open GL Shader Language GLSL Review traditional graphics pipeline CPU/GPU mixed pipeline issues Shaders GLSL graphics pipeline Based on material from Angel and Shreiner, Interactive

More information

CS770/870 Spring 2017 Open GL Shader Language GLSL

CS770/870 Spring 2017 Open GL Shader Language GLSL CS770/870 Spring 2017 Open GL Shader Language GLSL Based on material from Angel and Shreiner, Interactive Computer Graphics, 6 th Edition, Addison-Wesley, 2011 Bailey and Cunningham, Graphics Shaders 2

More information

OpenGL Performances and Flexibility. Visual Computing Laboratory ISTI CNR, Italy

OpenGL Performances and Flexibility. Visual Computing Laboratory ISTI CNR, Italy OpenGL Performances and Flexibility Visual Computing Laboratory ISTI CNR, Italy The Abstract Graphics Pipeline Application 1. The application specifies vertices & connectivity. Vertex Processing 2. The

More information

Lecture 5 3D graphics part 3

Lecture 5 3D graphics part 3 Lecture 5 3D graphics part 3 Shading; applying lighting Surface detail: Mappings Texture mapping Light mapping Bump mapping Surface detail Shading: takes away the surface detail of the polygons Texture

More information

CPSC / Texture Mapping

CPSC / Texture Mapping CPSC 599.64 / 601.64 Introduction and Motivation so far: detail through polygons & materials example: brick wall problem: many polygons & materials needed for detailed structures inefficient for memory

More information

lecture 16 Texture mapping Aliasing (and anti-aliasing)

lecture 16 Texture mapping Aliasing (and anti-aliasing) lecture 16 Texture mapping Aliasing (and anti-aliasing) Texture (images) Texture Mapping Q: Why do we need texture mapping? A: Because objects look fake and boring without it. Adding texture improves realism.

More information

Best practices for effective OpenGL programming. Dan Omachi OpenGL Development Engineer

Best practices for effective OpenGL programming. Dan Omachi OpenGL Development Engineer Best practices for effective OpenGL programming Dan Omachi OpenGL Development Engineer 2 What Is OpenGL? 3 OpenGL is a software interface to graphics hardware - OpenGL Specification 4 GPU accelerates rendering

More information

Tutorial 12: Real-Time Lighting B

Tutorial 12: Real-Time Lighting B Tutorial 12: Real-Time Lighting B Summary The last tutorial taught you the basics of real time lighting, including using the normal of a surface to calculate the diffusion and specularity. Surfaces are

More information

CSE 167: Introduction to Computer Graphics Lecture #9: Textures. Jürgen P. Schulze, Ph.D. University of California, San Diego Fall Quarter 2013

CSE 167: Introduction to Computer Graphics Lecture #9: Textures. Jürgen P. Schulze, Ph.D. University of California, San Diego Fall Quarter 2013 CSE 167: Introduction to Computer Graphics Lecture #9: Textures Jürgen P. Schulze, Ph.D. University of California, San Diego Fall Quarter 2013 Announcements Added Tuesday office hours for Krishna: 11am-12

More information

The Application Stage. The Game Loop, Resource Management and Renderer Design

The Application Stage. The Game Loop, Resource Management and Renderer Design 1 The Application Stage The Game Loop, Resource Management and Renderer Design Application Stage Responsibilities 2 Set up the rendering pipeline Resource Management 3D meshes Textures etc. Prepare data

More information