Computer Graphics Labs: OpenGL (2/2): Vertex and Fragment Shaders


University of Liège
Department of Aerospace and Mechanical Engineering

Computer Graphics Labs: OpenGL (2/2): Vertex and Fragment Shaders

Exercise 1: Introduction to shaders (folder square in the archive OpenGL2.zip available on the course web page)

Introduction to the dynamic pipeline

Initially, the graphic pipeline used for rendering on graphics cards was fixed. While graphics card manufacturers could optimize the hardware architecture, users were unable to modify any algorithm; only computations on the CPU (Central Processing Unit) could work around this limitation. The first practical course fits into this framework: we used a set of functionalities provided by OpenGL to interface with the graphics card.

Under the pressure of the film and video game markets, the development of configurable graphics cards changed the graphic pipeline into a dynamic pipeline using programmable shaders. A shader can be defined as a small piece of software executed by the GPU (Graphics Processing Unit), written in a language close to C, performing part of the computations needed for rendering. There are several types of shaders. Here we will mainly look at two types: Vertex Shaders, executed on each vertex of the mesh to display, and Fragment Shaders (also called Pixel Shaders in Microsoft DirectX), executed for each displayed pixel. The algorithm below shows where the shaders operate inside the pipeline of a standard rendering.

Rendering pipeline for the display of a triangle mesh. Given a triangle mesh M:

For each triangle T of mesh M
    For each vertex S of triangle T          <- Vertex Shader
        Transform S into the camera frame
        Project S on the camera projection plane
        Compute the illumination of S
    End For
    For each pixel P of triangle T           <- Fragment Shader
        Compute the colour of P
        Compute the depth of P
    End For
End For

The use of shaders makes obsolete some functionalities used during the first practical course, such as the geometric transformations, the illumination functionalities and the depth test. Despite the apparent complexity caused by these changes, GPU programming opens a new field for customizing visual effects. We will henceforth work in the "all shader" framework initiated by version 3.0 of OpenGL. Before programming shaders, we will introduce the data management needed for this new graphic pipeline.

Data management: VBOs and VAOs

The first step before entering the graphic pipeline is to provide OpenGL with the geometry to be stored on the GPU. During the last practical course, we used a specific primitive for displaying a square (GL_POLYGON) together with the commands glBegin [...] glEnd. However, graphics cards only manage points, segments and triangles, so these functionalities have been removed from OpenGL. The scene needs to be triangulated before being displayed on the screen. Thus, a square is the union of two triangles, themselves made of three vertices (vertex in the singular).

In OpenGL, a vertex is made of a set of attributes, such as its position, colour, normal, texture coordinates, and so on. It is possible to associate any data type (of geometric character or not) with a vertex; the only limitation is that these data must have a numeric representation (e.g., a temperature, a force vector). Once the set of attributes is defined, the vertex list has to be stored inside a storage space of the graphics card called a Vertex Buffer Object (VBO). The data storage management in VBOs provided by OpenGL is very flexible. For instance, consider a triangle where each vertex contains data on its position and colour (figure 1). Several storage options are available: a) non-interleaved with two VBOs, b) non-interleaved with one VBO, or c) interleaved with one VBO.

Figure 1: Data management of VBOs.

VBOs allow an efficient storage of the data, but they do not suffice in themselves: OpenGL knows neither the type of the data stored inside the VBO, nor how to group them in order to interpret them. The solution is to use a Vertex Array Object (VAO) in order to give OpenGL enough information for interpreting the scene. A VAO stores the active attributes, the storage layout (interleaved or not inside the active VBO), as well as the data format (e.g., 4 floating-point values for the position in homogeneous coordinates).
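The three storage options of Figure 1 can be sketched as plain C++ arrays (the positions and colours below are illustrative values, not data from the exercise):

```cpp
// Sketch of the storage options of Figure 1 for a triangle whose vertices
// carry a position (x, y) and a colour (r, g, b). Values are illustrative.

// a) non-interleaved, two VBOs: one buffer per attribute
float positions[3 * 2] = { 0.0f, 1.0f,  -1.0f, -1.0f,  1.0f, -1.0f };
float colours[3 * 3]   = { 1.0f, 0.0f, 0.0f,  0.0f, 1.0f, 0.0f,  0.0f, 0.0f, 1.0f };

// b) non-interleaved, one VBO: all positions first, then all colours
float nonInterleaved[3 * 2 + 3 * 3] = {
    0.0f, 1.0f,  -1.0f, -1.0f,  1.0f, -1.0f,                  // positions
    1.0f, 0.0f, 0.0f,  0.0f, 1.0f, 0.0f,  0.0f, 0.0f, 1.0f   // colours
};

// c) interleaved, one VBO: position then colour of vertex 0, then vertex 1, ...
float interleaved[3 * (2 + 3)] = {
    0.0f,  1.0f,   1.0f, 0.0f, 0.0f,   // vertex 0
   -1.0f, -1.0f,   0.0f, 1.0f, 0.0f,   // vertex 1
    1.0f, -1.0f,   0.0f, 0.0f, 1.0f    // vertex 2
};
```

Option c) is the one we will adopt later in this exercise, since keeping all the attributes of a vertex contiguous is generally more cache-friendly for the GPU.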
The dynamic pipeline requires data to be transmitted to the graphics card, but it also needs to be told how these data have to be processed. This role is devoted to the shaders. These small pieces of software are typically used for computing images (mainly 3D transformations and illumination), but they can also be used for other computations (physical simulations, digital arts...). The following figure illustrates the main interactions with the graphic pipeline: the inputs are the data (vertex list, attribute lists such as colours and normals, connectivity relations, and others such as camera, lights and textures); the graphic pipeline, driven by the shaders, produces a 2D image made of pixels.

Figure 2: Inputs/outputs of the dynamic pipeline

Through the following paragraphs you will learn how to use VBOs and VAOs, and you will load and then program your first shaders.

Creating a square

For this first exercise, we will geometrically represent a square.

1. Creating the geometry. Start by describing the mesh vertices inside an array.
o For the vertex list, 4 homogeneous coordinates are needed for each vertex. In the file square.cpp, inside the function UpdateVertexBuffer, begin by allocating the variable vertexdata. This variable will store the data list associated with the mesh nodes (for the moment, only coordinates) as floating-point data (example: 0.75f). At first, we will duplicate the vertices shared by the two triangles, and we will sort the data by triangle. Compute the array length to declare. We will store this array length in a variable called size, which will be reused in that function.
o Then, at the end of the function, add the following command in order to free the storage associated with the array:
delete[] vertexdata;
o Finally, inside the function BuildVertexData, initialize the array data with the corresponding coordinates of the vertices, following the figure below:

Figure 3: Mesh of the square

2. Sending data to the GPU thanks to the VBO. Although the data have been generated, they cannot be directly used by OpenGL: by default, OpenGL does not have access to them. Therefore, the first task at hand consists in assigning a storage space visible to OpenGL and then filling this space with our data. This operation is performed thanks to the buffer introduced earlier: the VBO. Note that the notion of VBO can be extended to data different from vertices; we then simply refer to a Buffer Object. This type of object is instanced and managed by OpenGL. The user can control this storage only indirectly, but benefits from the fast GPU storage access.
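As a sketch of step 1, the duplicated-vertex array could be built as follows. The corner coordinate 0.75f and the triangle ordering are assumptions for illustration; use the coordinates of Figure 3:

```cpp
// Sketch (not the official solution): building vertexdata for the square of
// Figure 3 as two triangles whose shared vertices are duplicated.
const int nbVertices = 2 * 3;          // two triangles, three vertices each
const int size = nbVertices * 4;       // 4 homogeneous coordinates per vertex

float* BuildSquareVertexData() {
    float* vertexdata = new float[size];
    const float c = 0.75f;             // assumed half-side of the square
    // Triangle 1: UL, LL, LR; Triangle 2: UL, LR, UR (UL and LR are duplicated)
    const float coords[nbVertices][2] = {
        {-c,  c}, {-c, -c}, { c, -c},
        {-c,  c}, { c, -c}, { c,  c}
    };
    for (int i = 0; i < nbVertices; ++i) {
        vertexdata[4 * i + 0] = coords[i][0];  // x
        vertexdata[4 * i + 1] = coords[i][1];  // y
        vertexdata[4 * i + 2] = 0.0f;          // z
        vertexdata[4 * i + 3] = 1.0f;          // w (homogeneous coordinate)
    }
    return vertexdata;                         // caller must delete[] it
}
```

Note that size counts 24 floats for only 4 distinct corners; step 5 will remove this duplication.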
Introduction to the method:
o In order to be handled, almost all OpenGL objects (buffers and others) are identified by an unsigned integer (of type GLuint).
o The first step consists in generating the identifiers of the objects to be created. To this aim, a command of the following form is used:
glGen*(nb_objects, GLuint_ptr)
where the symbol * should be replaced by the object type, nb_objects corresponds to the number of objects to create, and GLuint_ptr corresponds to the address of the identifier. The identifier targeted by the pointer is then generated, but without allocating storage for the object.

o The object is then linked to a context thanks to the function:
glBind*(target, GLuint)
The symbol * should be replaced by the object type. The parameter target, chosen among a list of admissible targets (depending on the context), changes the function's behaviour.
o We can then allocate storage for the object depending on the data to store.
o Finally, we break the previously created link between the object and the context by replacing the object identifier by 0 inside the command glBind*(target, GLuint).
o The last stage consists in freeing the storage. This is done with a command of the following form:
glDelete*(nb_objects, GLuint_ptr)
Its use is similar to that of its dual glGen*.

Application:
o The identifier (of type GLuint) for the buffer object used for storing the vertices is vertexbufferobject.
o The initialization of the buffer corresponding to the identifier vertexbufferobject is performed in the function InitializeVertexBuffer.
o Create one object associated with vertexbufferobject with the function glGenBuffers(GLsizei, GLuint*).
o Then, go to the end of the function UpdateVertexBuffer, just before delete[] vertexdata.
o Link the object vertexbufferobject to the target GL_ARRAY_BUFFER thanks to the command glBindBuffer.
o You can then allocate storage for the object in order to store the array vertexdata by adding the command:
glBufferData(GL_ARRAY_BUFFER, size*sizeof(float), vertexdata, GL_STATIC_DRAW);
This command sizes the GPU storage to be allocated to size*sizeof(float) bytes, where size is the number of elements inside the array vertexdata, and then copies the data contained in vertexdata.
o Unlink the object vertexbufferobject from the target GL_ARRAY_BUFFER by calling the function glBindBuffer again, replacing the object identifier by 0.
o Finally, in the function DeleteVertexBuffer, add the command which frees the resources allocated to vertexbufferobject.

3. Identifying the data of the VBO via the VAO.
We have just sent the vertex data to the GPU storage. However, Buffer Objects are not formatted: so far we have only created a Buffer Object and filled it with binary data. We now have to tell OpenGL that the data contained inside the buffer object correspond to vertex coordinates, and what the data format is.
o In the function Display, add the command glBindBuffer in order to link the vertexbufferobject to the target GL_ARRAY_BUFFER.
o It is mandatory to enable the attribute array in order to be able to use it. For this, add the following command, whose argument is the index of the considered array:
glEnableVertexAttribArray(0);

o Finally, add the following line:
glVertexAttribPointer(0, 4, GL_FLOAT, GL_FALSE, 0, 0);
This call to the function glVertexAttribPointer tells OpenGL that the data format used for the vertices has 4 floats per vertex. The parameters are the following: the index of the array of vertices, the number of values per vertex, the data format of one value, a boolean indicating whether the data have to be normalized; the last two arguments are set to 0 and will be introduced later on.

4. Image rendering. Now that OpenGL knows what the vertex coordinates are, we can use these coordinates for rendering triangles.
o Use the following command for drawing the triangles:
glDrawArrays(GL_TRIANGLES, 0, 2*3);
The first parameter tells OpenGL that we want to draw triangles from a list of vertices. The second parameter is the number of the first vertex and the last parameter is the total number of vertices.
o Disable the attribute array, then unlink the object vertexbufferobject from the target GL_ARRAY_BUFFER with the following commands:
glDisableVertexAttribArray(0);
glBindBuffer(GL_ARRAY_BUFFER, 0);
o Finally, run the code in order to visualize a white square (default colour).

Although the obtained result is the expected one, some vertices are duplicated by the code. For our elementary case, the impact of this duplication is negligible. However, for more complex cases, this duplication may significantly degrade performance, because the array to send to the GPU may become much bigger than necessary.

5. Creating an indexed array. In order to avoid this unnecessary overhead, we will use two arrays in parallel.
Introduction to the method:
o The first array will contain the vertex list without duplication.
o The second array will contain, for each triangle, the list of its three successive vertex indices.
o In the case of a vertex shared between several triangles, only its index will be duplicated.
Application:
o Begin by deleting the duplicated vertices from the array vertexdata.
Do not forget to change the declaration and initialization of this array.
o Then, in the function UpdateElementBuffer, allocate an array of GLuint called elementarray with the right size.
o Delete this array at the end of the function.
o Initialize elementarray in the function BuildElementArray with the vertex indices of each triangle.
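The indexed structure of step 5 can be sketched as follows (the corner coordinate 0.75f, the vertex numbering and the winding order are assumptions for illustration):

```cpp
// Sketch (assumed layout, not the official solution): the deduplicated vertex
// list and the element array for the square of Figure 3.
const int nbUniqueVertices = 4;
const int vertexDataSize = nbUniqueVertices * 4;   // 4 floats per vertex
const int nbTriangles = 2;
const int elementSize = nbTriangles * 3;           // 3 indices per triangle

// The four corners, stored once each (corner coordinate 0.75f assumed).
float vertexdata[vertexDataSize] = {
    -0.75f,  0.75f, 0.0f, 1.0f,   // 0: upper left
    -0.75f, -0.75f, 0.0f, 1.0f,   // 1: lower left
     0.75f, -0.75f, 0.0f, 1.0f,   // 2: lower right
     0.75f,  0.75f, 0.0f, 1.0f    // 3: upper right
};

// Each triangle references vertices by index; vertices 0 and 2 are shared
// between the two triangles, so only their indices are duplicated, not
// their 4-float data.
unsigned int elementarray[elementSize] = {
    0, 1, 2,    // first triangle
    0, 2, 3     // second triangle
};
```

Compared with the duplicated version, the vertex data shrink from 24 floats to 16, and the saving grows quickly on large meshes where most vertices are shared by several triangles.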

6. Sending data to the GPU and identification. Once this new data structure is set, we will send it to OpenGL.
o Similarly to the vertexbufferobject, begin by initializing a buffer object called elementbufferobject inside the function InitializeElementBuffer.
o Afterwards, in the function UpdateElementBuffer, and in the same fashion as for vertexbufferobject, allocate storage for the object, using the target GL_ELEMENT_ARRAY_BUFFER.
o Finish by freeing the storage allocated for the elementbufferobject in the function DeleteElementBuffer.

7. Image rendering. Now, it only remains to display the square.
o In the function Display, replace the command glDrawArrays(GL_TRIANGLES, 0, 2*3) by the two following ones:
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, elementbufferobject);
glDrawElements(GL_TRIANGLES, 2*3, GL_UNSIGNED_INT, 0);
o After the command glBindBuffer(GL_ARRAY_BUFFER, 0), unlink the object elementbufferobject from the target GL_ELEMENT_ARRAY_BUFFER in a fashion similar to this command.
o Eventually, run the code in order to display the same white square as obtained previously.

During this first stage, we have sent a list of vertices to OpenGL. We will now process these vertices inside the graphic pipeline thanks to shaders. Without shaders, no transformations of the vertex coordinates can be computed (their positions are used as is) and the pixel colour of a given object is set to white. In order to address this new stage, we introduce below some functionalities for interfacing the shaders with OpenGL.

Introducing the GLSL language

In order to program directly on a graphics card, the language GLSL (OpenGL Shading Language) has been specifically developed for OpenGL. The shaders are programs written in this language and executed in the OpenGL rendering process. However, these shaders need to be compiled before being executed. Hence, the OpenGL code includes two compilations: one for the Vertex Shader and the other for the Fragment Shader.
Moreover, these two compilations have to be followed by a linking stage between these two shaders and the OpenGL program. Similarly to other objects integrated in OpenGL, objects have to be created for containing the shaders, and dedicated functionalities load the shaders from external files. In our code, the function InitializeProgram first loads the shaders with the command LoadShader. This command takes as arguments the shader type (GL_VERTEX_SHADER or GL_FRAGMENT_SHADER) and the file name. LoadShader, in turn, reads the corresponding shader file and calls CreateShader, which compiles the shader and checks that there is no error. Secondly, a program is created by CreateProgram: a new OpenGL object is created with an identifier, and the shaders are then linked to this program.

8. Compilation and linking of the shaders. Load the two shaders available in the archive in order to enable them:
o In the function init, add the following command before the call to InitializeVertexBuffer:
InitializeProgram();
o At the end of the del function, add the following command:
DeleteProgram();
o You should see a gray square when running the code.

9. GLSL language syntax. We will now take a closer look at the syntax of the two shaders provided with the archive.
o Open the files vertex.glsl and frag.glsl located in the folder data. These files contain: the GLSL version number, followed by one or more attributes, and by a main function.
In a shader, several attributes can be defined by the user. These attributes correspond to inputs and/or outputs. The variable type may be a scalar (int, float...), a vector (vec2, vec3...) and so on. The inputs can be preceded by the keyword in or uniform. The first keyword is used for a data array, each of its entries being processed in parallel by the shader. On the contrary, the second keyword refers to an entry that is constant for the whole execution of the shader. Output variables defined by the user are identified by the keyword out. Moreover, the GLSL language provides a certain number of built-in output variables, all prefixed by gl_. The most used one is gl_Position, which sends the vertex position to the next stage.

10. Introduction to the Vertex Shader. As you may guess, the Vertex Shader takes as input (as attributes) the data associated with a vertex. This shader should output at least one value: the built-in variable gl_Position, initialized with the vertex position in the camera space. This output value will then be used by the rasterizer for filling the triangles with fragments.
This shader may also compute output variables which will be interpolated between vertices and sent to the Fragment Shader for each fragment computed by the rasterizer (in practical course 1, the colour gradient in the square was obtained by such an interpolation). Furthermore, the Vertex Shader has access to the uniform variables, which typically contain the transformations to apply to the vertices of a given object.

The following scheme sketches how the Vertex Shader operates: its inputs are the vertex attributes ("in") and the "uniform" variables; its outputs are gl_Position, the other "built-in" output variables, and the user-defined "out" variables.

Figure 4: Inputs/outputs of the Vertex Shader

In addition to the prefix in, input attributes are prefixed by a code of the form layout(location = index). This code specifies the index associated with the attribute, as defined at the VAO initialization. A common alternative is to resort to the function glBindAttribLocation.

11. Introduction to the Fragment Shader. After the processing of each vertex, the Fragment Shader is called for each fragment produced by the rasterizer. Fragments can be thought of as pixels covering the visible surfaces of the triangles in the scene. The rasterization stage, among other things, interpolates the output variables of the Vertex Shader in order to use them as input variables in the Fragment Shader. The Fragment Shader then has to compute and output the colour that will be displayed in the final image. This output colour can be set by declaring and setting an output variable of type vec4. The Fragment Shader also has access to the uniform attributes, which are mainly used for textures; we will use them in the next practical course. The following scheme shows how the Fragment Shader operates: its inputs are the interpolated values ("in") and the "uniform" variables; its output is the colour ("out").

Figure 5: Inputs/outputs of the Fragment Shader

The Fragment Shader given in the archive is very simple: it sets all the fragments it receives to a gray colour.
o Change the value assigned to the colour at the Fragment Shader output and check the change by running the code.

The following scheme provides a global view of the graphic pipeline and its processes: the VBO stores the interleaved positions and colours, the VAO describes them, the Vertex Shader processes each vertex, the rasterizer produces fragments, the Fragment Shader processes each fragment, and the result is written to the frame buffer.

Figure 6: Main processes of the dynamic pipeline

Note: in addition to the two kinds of shaders introduced here, there exist two more recent types: geometry shaders (for modifying the mesh) and tessellation shaders (decomposing the mesh into subelements in order to add details to objects at a low computational cost, using, for instance, the displacement mapping technique; see Blender practical course 4 for more details).

The program is now ready for the use of customized shaders. We will now program our shaders in order to add colours to our square.

Vertex processing

Firstly, we will use a Vertex Shader mimicking the behaviour of the fixed pipeline for defining the vertex positions and colours.

12. Adding the colour attribute. We represent a colour with 4 floating-point values between 0 and 1: the first 3 correspond to the RGB channels, while the last one corresponds to the alpha channel. As the storage of contiguous data is more efficient, and in order to avoid storing the vertex attributes in two different storage zones, you will have to add the colour after the position of each vertex in the array vertexdata.
o Begin by adding the colour data associated with each vertex.
Then, we have to send this information to the graphics card via the VBO and state the data format via the VAO. As explained earlier, several storage methods are possible. Here, we choose to use one VBO with interleaved data for each vertex.
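The interleaved layout chosen here can be sketched as follows (the corner coordinate 0.75f and the colours are illustrative assumptions; the stride and offset values are the ones needed when declaring the attribute pointers):

```cpp
// Sketch (illustrative values): one interleaved VBO holding, for each vertex,
// 4 position floats followed by 4 colour floats (RGBA).
const int floatsPerVertex = 4 + 4;                 // position + colour
const int nbVertices = 4;
const int size = nbVertices * floatsPerVertex;

float vertexdata[size] = {
    // x,      y,     z,    w,     r,    g,    b,    a
    -0.75f,  0.75f, 0.0f, 1.0f,  1.0f, 0.0f, 0.0f, 1.0f,
    -0.75f, -0.75f, 0.0f, 1.0f,  0.0f, 1.0f, 0.0f, 1.0f,
     0.75f, -0.75f, 0.0f, 1.0f,  0.0f, 0.0f, 1.0f, 1.0f,
     0.75f,  0.75f, 0.0f, 1.0f,  1.0f, 1.0f, 1.0f, 1.0f
};

// Byte quantities later passed to glVertexAttribPointer:
const unsigned long stride = floatsPerVertex * sizeof(float); // one whole vertex
const unsigned long colourOffset = 4 * sizeof(float);         // skip the position
```

With 4-byte floats, stride is 32 bytes and the colour of a vertex starts 16 bytes after its position, which matches the 8*sizeof(GLfloat) and 4*sizeof(GLfloat) values used in the next paragraph.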

It is not necessary to make changes to the VBO. However, two attribute pointers are necessary for pointing to the interleaved data: a first one, of index 0, will point to the position, while a second one, of index 1, will point to the colour. Take a closer look at the prototype of the attribute initialization function:
void glVertexAttribPointer(GLuint index, GLint size, GLenum type, GLboolean normalized, GLsizei stride, const GLvoid* pointer);
where stride corresponds to the size (in bytes) of a whole vertex and pointer to the offset in bytes of the first attribute. These two arguments have to be modified. We use 8 floating-point values per vertex (4 for the position and 4 for the colour). The value of the parameter stride is then 8*sizeof(GLfloat). The shift in bytes for getting the first address of the colour attribute in the VBO is 4*sizeof(GLfloat). However, we cannot pass this value directly as an argument, because the type associated with pointer is const GLvoid*: a cast of this value is needed. A straightforward solution is to use the expression:
(const GLvoid*) (4*sizeof(GLfloat))
However, we will prefer the use of a function making this cast operation explicit:
BUFFER_OFFSET(4*sizeof(GLfloat))
o Write the needed changes in the function Display. Begin by changing the stride parameter of the position attribute, and then define a new attribute pointer of index 1 for the colour. As the same VBO is used by the two attributes, one call to the function glBindBuffer is required inside the function Display.
o Finally, disable the new attribute array associated with the colour by calling the function glDisableVertexAttribArray before disabling the one associated with the position. In this way, the deactivation is always done in the reverse order of the activation.

13. Programming the Vertex Shader. In order to add colour information to a vertex, it is mandatory to change the inputs/outputs of the shader.
o Open the file vertex.glsl.
o Declare a new input attribute color for the model colour, choosing the right index value.
o Then add an output attribute associated with the colour, called thecolor.
o In the main function, initialize the new output with the value of color.

14. Programming the Fragment Shader
o Open the file frag.glsl.
o Declare a new input attribute for the colour. Be careful to give it the same name as the corresponding output variable of the Vertex Shader.
o Change the initialization of the output attribute outputcolor by setting it to the value of the input colour.
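Once steps 13 and 14 are done, the two shaders may look roughly like the following sketch. The attribute name position, the GLSL version number and the interpolation qualifiers are assumptions; adapt them to the files provided in the archive:

```glsl
// vertex.glsl -- a minimal sketch, not the official solution
#version 330
layout(location = 0) in vec4 position;  // matches the attribute pointer of index 0
layout(location = 1) in vec4 color;     // new input, matching index 1
smooth out vec4 thecolor;               // interpolated by the rasterizer
void main() {
    gl_Position = position;
    thecolor = color;
}

// frag.glsl -- a minimal sketch
#version 330
smooth in vec4 thecolor;                // same name as the Vertex Shader output
out vec4 outputcolor;
void main() {
    outputcolor = thecolor;
}
```

The only contract between the two files is the shared name thecolor: the rasterizer interpolates the vertex values and delivers one blended vec4 per fragment.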

We can note that for the moment no computations are done in this shader: it only sends back the colour interpolated by the rasterizer.
o Modify the Fragment Shader to change the colour by using the mathematical functions available in the GLSL language (cos, sin, exp, abs...).
o For instance, add a halo effect for which the fragment colours tone down as the distance to the origin increases. The computations are done in the Fragment Shader with the formula given below. However, this calculation needs the position of the fragment obtained by interpolation. This information will be obtained by adding a new output variable in the Vertex Shader (the built-in variable gl_Position is not available in the Fragment Shader).
Note: use the function distance(vec4, vec4) for computing the distance from the origin.
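The lab's exact halo formula is not reproduced in this text, so the attenuation below is only one plausible choice (an exponential fall-off) shown here in C++ for clarity; in the shader you would write the same expression in GLSL using distance() and exp():

```cpp
#include <cmath>

// Sketch of a possible halo attenuation (the exponential fall-off is an
// assumption, not the formula from the handout): the further the fragment
// lies from the origin, the more its colour tones down.
float HaloFactor(float x, float y) {
    float d = std::sqrt(x * x + y * y);   // plays the role of distance(pos, origin)
    return std::exp(-d);                  // 1 at the origin, decreasing with d
}
```

In the Fragment Shader, the final colour would then be the interpolated colour multiplied by this factor.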

Exercise 2: Representing a 3D map (folder surf in OpenGL2.zip available on the course web page)

Surface display

Creating a surface

This exercise aims at representing a terrain using a 3D surface. The altitude data are given in the form of a grey-level image. The software provides a function that samples the altitude image (BuildVertexData(float *vertexdata)). This sampling function generates a regular grid of points spread over the image. The number of image subdivisions along the height and the width is defined by the variable nbsubdiv. The result is a grid whose dimensions are (nbsubdiv+1) times (nbsubdiv+1) points, arranged as presented in Figure 1. The grid is defined in the XY plane; the altitude of a point is carried by its Z coordinate. The vertices are stored in the float array vertexdata, in a linear way, beginning from the upper left corner in Figure 1 (coordinates x=-1, y=1). The function BuildVertexData stores the position and the normal of each vertex in the array vertexdata in an interleaved fashion.

Figure 1: Arrangement and numbering of the grid points (from vertex 0 at the upper left to vertex (nbsubdiv+1)*(nbsubdiv+1)-1 at the lower right)

The number of subdivisions may be modified by the user at run time, but the maximum number of subdivisions is fixed by the constant NB_SUBDIV_MAX. The maximum number of vertices is thus:
(NB_SUBDIV_MAX+1)*(NB_SUBDIV_MAX+1)
For each vertex, there are 4 floats for the position and 4 floats for the normal. The array vertexdata is thus declared as follows:
float vertexdata[(4+4)*(NB_SUBDIV_MAX+1)*(NB_SUBDIV_MAX+1)];
Then, the function BuildElementArray(GLuint *elementarray) initialises a GLuint array, elementarray, which is to be sent into the element buffer.
It contains the sequence of vertex indices which allows representing the surface with triangles taken from the vertex buffer through indexed drawing. The maximum number of triangles is:
2*NB_SUBDIV_MAX*NB_SUBDIV_MAX
The array elementarray is thus declared as follows:
GLuint elementarray[2*3*NB_SUBDIV_MAX*NB_SUBDIV_MAX];
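The index generation can be sketched as follows: for a grid of (n+1) x (n+1) vertices numbered row by row from the upper left, each grid cell yields two triangles. The cell traversal order and the winding are assumptions for illustration, not necessarily those of the provided BuildElementArray:

```cpp
// Sketch (assumed traversal and winding): filling elementarray for a regular
// grid of (n+1) x (n+1) vertices, two triangles per grid cell.
void BuildElementArraySketch(unsigned int* elementarray, int n /* nbsubdiv */) {
    int k = 0;
    for (int i = 0; i < n; ++i) {          // row of cells, from the top
        for (int j = 0; j < n; ++j) {      // column of cells, from the left
            unsigned int v0 = i * (n + 1) + j;   // upper left corner of the cell
            unsigned int v1 = v0 + 1;            // upper right
            unsigned int v2 = v0 + (n + 1);      // lower left (next row)
            unsigned int v3 = v2 + 1;            // lower right
            elementarray[k++] = v0; elementarray[k++] = v2; elementarray[k++] = v1;
            elementarray[k++] = v1; elementarray[k++] = v2; elementarray[k++] = v3;
        }
    }
}
```

For n = NB_SUBDIV_MAX this fills exactly 2*3*NB_SUBDIV_MAX*NB_SUBDIV_MAX entries, matching the declared size of elementarray.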

Buffer loading and display

Both arrays are then copied into the memory of the graphics card. To this aim, two buffers are created by the functions InitializeVertexBuffer() and InitializeElementBuffer(); both are called by the initialisation function at program start-up. The data transfer to these buffers is carried out by the functions UpdateVertexBuffer() and UpdateElementBuffer(), which are called each time a buffer modification is made necessary by the function UpdateSurface(). The Display() function is in charge of linking both buffers to the entry points of the Vertex Shader and of triggering the display of the triangles.

Defining the Vertex Shader

The Vertex Shader uses a Lambertian illumination computation for an infinitely far light source. The shader's input variables used by the software are the following:
rotmat: rotation matrix allowing the manipulation of the object.
worldtoclipmat: transformation matrix towards the canonical space.
tolightdir: vector pointing to the light source at an infinite distance.
lightcolor: colour of the light source.

You may run your program. It shows a surface whose altitude at each point has been computed from an altitude map. So far, the surface is plain-coloured and the tint differences come from the Lambertian illumination model. It is possible to manipulate the surface using the mouse. Besides, these keyboard shortcuts are implemented:
[e], [f] and [v] respectively toggle the display into the modes Edge, Face and Vertex.
[+] increases while [-] decreases the grid density.
[p] increases while [o] decreases the relief amplitude.
The technique used here is called displacement mapping, since the grid itself is deformed through the use of an image.

Using textures

Using textures under OpenGL was introduced during the first tutorial. Yet, when one wants to use textures programmatically with shaders, a few extra stages are needed in the initialisation phase. Figure 2 displays all the links necessary to ensure the proper functioning of a program using a texture. As shown, all links are articulated around a table called Texture Image Units. This unique table is managed by OpenGL; each of its slots can be assigned to a texture. It links the Texture Object, the Sampler (which performs the sampling of the texture) and the entry point of said Sampler in the program. We will first set all the links in the C++ program, and then describe how to access the texture in the shaders using the Sampler.

Figure 2: Texture use diagram (from: //

1. Texture initialisation
o Start by initialising the texture, implementing the needed commands in the function InitializeTexture(). Reminder: texture initialisation is done this way:
glGenTextures(1, &pictexture);
glBindTexture(GL_TEXTURE_2D, pictexture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, PICTURE_SIZE, PICTURE_SIZE, 0, GL_RGB, GL_UNSIGNED_BYTE, image_pic);
glBindTexture(GL_TEXTURE_2D, 0);
The parameter pictexture is the texture object's identifier, which is to be defined as a global variable (GLuint). The parameters of glTexImage2D() are the following:
GL_TEXTURE_2D: because it is a bidimensional image.
0: the level of the image loaded in the active texture (used for multiresolution textures).
GL_RGB: internal storage format of the texture.
PICTURE_SIZE: width of the image (constant defined in the upper part of the file surf.cpp).

PICTURE_SIZE: height of the image (constant defined in the upper part of the file surf.cpp).
0: the texture has no border.
GL_RGB: format of the provided image.
GL_UNSIGNED_BYTE: storage format of the image.
image_pic: the array the image is stored in.

2. Sampler initialisation. The Sampler allows sampling the texture, that is, providing the texture value (colour) for given coordinates (u,v) in the case of a 2D texture. The Sampler initialisation is carried out in the C++ program before it can be used by a shader. The corresponding function is InitializeSampler().
o Use the following commands for initialising the Sampler:
glGenSamplers(1, &picsampler);
glSamplerParameteri(picsampler, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glSamplerParameteri(picsampler, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
As for most OpenGL objects, the Sampler is identified by a GLuint; in this case it is picsampler, which you need to define as a global variable. The last two lines of the Sampler generation contain commands that were already used during the first tutorial, with one exception: this time they are applied to the Sampler, not to the texture. Both lines define the interpolation type to be used.

3. Sampler declaration in the Vertex Shader.
o Switch now to the Vertex Shader and insert this uniform declaration as an input:
uniform sampler2D picsampler;
This line defines an entry point into the shader for the Sampler that we have just defined. This Sampler will be used by the Vertex Shader for determining the colour of a vertex based on its coordinates (u,v).
o As for all uniform objects, add the following line in the function InitializeProgram() of the file surf.cpp in order to retrieve its identifier, where picsampleruniform is to be defined as a global variable:
picsampleruniform = glGetUniformLocation(theprogram, "picsampler");

4. Linking the texture, the Sampler and the shader to a Texture Image Unit.
o As previously said, the Texture Image Units table is the meeting point of the different objects linked to a texture. Switch to the Display() function and add the following lines before the display command:
glUniform1i(picsampleruniform, picunit);
glActiveTexture(GL_TEXTURE0 + picunit);
glBindTexture(GL_TEXTURE_2D, pictexture);
glBindSampler(picunit, picsampler);
The first line links the Vertex Shader's uniform to a location in the Texture Image Units table designated by the GLuint picunit. The choice of the picunit value is left to the user; you only need to define it and assign a value to it (e.g., 0) just before using it.

We then activate the location corresponding to picUnit among the Texture image units. The glActiveTexture() command is peculiar in that it must be given the absolute location of the Texture image unit; hence the addition of GL_TEXTURE0 to picUnit. We then use glBindTexture(), which links the texture to the currently active Texture image unit. Finally we use glBindSampler() with, as arguments, the identifiers of the Texture image unit and of the Sampler.
5. Removing the links after rendering.
o To end the Display() function, add the link removal commands after the display command:
glBindSampler(picUnit, 0);
glBindTexture(GL_TEXTURE_2D, 0);
6. Defining texture coordinates. Before the texture can be used in the Vertex Shader, we need to define the texture coordinates of each vertex and send them to the Vertex Shader. Since these coordinates are a per-vertex attribute, we use the Vertex Buffer to transfer them.
o Modify the initial size of the array vertexData in order to add the coordinates (u,v) of each vertex (two floats per vertex).
o Add the texture coordinate definition to the function BuildVertexData(). Texture coordinates must lie between 0 and 1. The texture coordinate origin coincides with vertex 0 (see Figure 1); the U axis is parallel to X and the V axis to -Y.
o Do not forget to modify the size of the array sent to the Vertex Buffer in the function UpdateVertexBuffer().
o In the Vertex Shader, add a Vertex Attribute as an input to receive the texture coordinates:
layout(location = 2) in vec2 texCoord;
o In the Display() function, enable the Vertex Attribute used to send the texture coordinates:
glEnableVertexAttribArray(2);
glVertexAttribPointer(2, 2, GL_FLOAT, GL_FALSE, vertexSize*sizeof(GLfloat), BUFFER_OFFSET(8*sizeof(GLfloat)));
where vertexSize designates the dimension of a vertex (you need to update this value).
7. Using the Sampler in the Vertex Shader.
To use the texture in the Vertex Shader, we use the previously defined Sampler through the texture() function. This function takes as arguments the Sampler object and the texture coordinates of the point we want to evaluate; it returns the value of the texture linked to that Sampler at the given point.
o In the present case, replace the line defining the default colour, so that the texture is used as the colour of the current vertex:
vec4 color = texture(picSampler, texCoord);
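The texture-coordinate rule of step 6 above can be sketched in plain C++. This is a minimal sketch, not the lab code: the names TexCoord, texCoordAt and gridN are illustrative assumptions for a regular grid of (gridN+1) x (gridN+1) vertices.

```cpp
#include <cassert>

// Illustrative sketch: u follows the X direction and v the -Y direction,
// both normalised to [0, 1]; vertex 0 is the texture-coordinate origin.
struct TexCoord { float u; float v; };

TexCoord texCoordAt(int i, int j, int gridN) {
    // i counts vertices along X (u axis), j counts vertices along -Y (v axis)
    return TexCoord{ static_cast<float>(i) / static_cast<float>(gridN),
                     static_cast<float>(j) / static_cast<float>(gridN) };
}
```

In BuildVertexData() the same normalisation would be applied inside the loops that already fill in the vertex positions.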

When running the program in its current state, you will see that the result is not realistic: the number of vertices would have to be greatly increased for the texture image to appear clearly. This is because the texture is evaluated only at the vertices and then interpolated between them by the Fragment Shader. In doing so, we lose the main advantage of textures, namely avoiding the modelling of details. To solve this issue, the texture must be used in the Fragment Shader, so that it is evaluated at every pixel and not only at the vertices.
8. Using the Sampler in the Fragment Shader. The Sampler is used in the Fragment Shader the same way as in the Vertex Shader, that is through the texture() function. However, the texture coordinates are unknown in the Fragment Shader, so an output value must be added to the Vertex Shader in order to pass these coordinates on to the Fragment Shader.
o We define the extra output value in the header of the Vertex Shader so that the texture coordinates are interpolated and sent to the Fragment Shader. By default, all output values of the Vertex Shader are interpolated over each element.
out vec2 smoothTexCoord;
o In the body of the main function, we then add the following line to define smoothTexCoord for each vertex:
smoothTexCoord = texCoord;
Everything is now in place in the Vertex Shader; we can proceed to the Fragment Shader.
o In the Fragment Shader's header, declare two new input variables: one for the texture coordinates and one for the Sampler:
in vec2 smoothTexCoord;
uniform sampler2D picSampler;
The sampler2D objects declared in the Vertex Shader and in the Fragment Shader bear the same name; they are therefore considered identical.
o Then, in the body of the main function, call the texture() function to define the output colour:
outputColor = texture(picSampler, smoothTexCoord);
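The interpolation performed on Vertex Shader outputs such as smoothTexCoord can be illustrated in one dimension. This is only an illustration of what the rasteriser does, not part of the lab code; the name interpolate is an assumption.

```cpp
#include <cassert>

// Linear interpolation between two per-vertex values: each fragment between
// the two vertices receives its own intermediate value.
float interpolate(float a, float b, float t) {
    // t = 0 gives the first vertex value, t = 1 the second one
    return a + (b - a) * t;
}
```

Because the coordinates, and not the sampled colours, are interpolated, the Fragment Shader can look the texture up at every pixel.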

The program can now be run. As you may notice, the texture is displayed at maximum resolution whatever the number of vertices. However, the Lambertian reflection model is no longer applied to the rendering. We could, as in the previous exercise, transfer this model to the Fragment Shader by interpolating the normals of the geometry. We will instead use an alternative method, which is also based on textures.
Normal mapping
Another possible use of textures is normal mapping. This technique consists in using a texture (the normal map) to determine the normals pixel by pixel. Normal mapping allows obtaining diffuse reflections or realistic speculars without resorting to a very accurate geometrical representation. In practice, the normal map is an RGB image in which each colour channel corresponds to one direction:
Red: X axis
Green: Y axis
Blue: Z axis
Knowing the coordinates (u,v) of a point, we can reconstruct the normal at this point from these components.
Figure 3 – Normal map
The normal map that you are going to use in this tutorial is shown in Figure 3, each colour channel taken separately. The sea around the island is greyish in the red and green channels, which corresponds to a value of 0. One can also notice that, in both channels, the mountains have a white slope (positive component) and a black slope (negative component). The blue channel is very different from the other two and is almost white, because all normals in the scene have a positive component along the Z axis.

This normal map is provided by the image ground_1024_norm.jpg from the data folder. The image is loaded into the array image_norm by the function main(). You will use it for the current scene:
1. Creation of an extra texture. Following the same path as described previously, we will define a second texture.
o Add three new global variables normTexture, normSampler and normSamplerUniform of type GLuint, identifying the texture, the Sampler and the Shader's input variable.
o Then redo all the initialisation stages for the new texture.
o Then do the same for the new Sampler.
o Finally, declare a new input variable of Sampler type directly in the Fragment Shader.
2. As previously explained, to take full advantage of textures it is more efficient to use the new texture in the Fragment Shader. Since the normal is determined in the Fragment Shader, the lighting calculations must necessarily be carried out there as well.
o Switch to the main function of the Fragment Shader and add the following line:
vec4 normalTexture = texture(normSampler, smoothTexCoord);
where normSampler is the name of the sampler2D. As previously explained, this command asks the Sampler for the components of the normal at the current point.
o Nevertheless, the normalTexture vector does not correspond exactly to the normal. Its components are all between 0 and 1, whereas the components of the original normals lie between -1 and 1. So that both negative and positive components could be represented, the normals were transformed when the map was built; we now need to apply the opposite transformation. This transformation is fortunately rather easy: just define the cameraNormal vector as follows:
vec4 cameraNormal = vec4((normalTexture.x-0.5)*2, (normalTexture.y-0.5)*2, (normalTexture.z-0.5)*2, 0);
o This vector must then be transformed by the rotation matrix before the lighting calculation. Define the matrix in the Fragment Shader and implement the Lambertian model there.
Get inspiration from the Vertex Shader. You may now run the program and test the effect of the normal map. To do so, zero out the return value of the function Elevation() so that the map is flat. You may also notice that the result does not depend on the number of vertices.
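The decode step above can be checked numerically. This is a C++ sketch for verification only (in the Shader it is written in GLSL); the name decodeChannel is illustrative.

```cpp
#include <cassert>

// A channel value c stored in the normal map lies in [0, 1]; the original
// normal component in [-1, 1] is recovered with (c - 0.5) * 2.
float decodeChannel(float c) {
    return (c - 0.5f) * 2.0f;
}
```

The greyish sea (c = 0.5 in red and green) thus decodes to components of 0, while white and black slopes decode to +1 and -1.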

Using several light sources
So far we have used two types of OpenGL buffers: the Vertex Buffer and the Element Buffer. In this exercise, we will use a third type: the Uniform Buffer. This buffer type is meant to contain uniform data for one or more Shader Programs. Its main advantages are that it allows data to be shared between several programs and that it allows the use of Uniform Blocks (that is, structured variables) in the Shaders. The code already contains the definition of a structured variable LightBlock that stores the properties of several light sources (four in the current state):
const int NUMBER_OF_LIGHTS = 4;
struct LightBlock {
glm::vec4 modelSpaceLightsPos[NUMBER_OF_LIGHTS];
glm::vec4 lightsColor[NUMBER_OF_LIGHTS];
float lightsState[NUMBER_OF_LIGHTS];
};
This structure contains three arrays. The first one contains the light source positions; the second, their colours; and the third, their state (On/Off) represented by a float. A global instance of this structured variable (sceneLights) is created and initialised in the function InitializeSceneLight(). It now only remains to:
Create the Uniform Buffer.
Define the functions that upload sceneLights into the buffer.
Implement the handling of the light sources in the Fragment Shader.
Link the buffer with the corresponding entry point of the Fragment Shader.
These stages are covered in the following:
1. Creating and initialising the Uniform Buffer.
o Creating a Uniform Buffer is carried out as usual. First declare a new global variable of type GLuint called sceneLightsBO. Then insert the following line into the function InitializeSceneLight(), after the initialisation of sceneLights:
glGenBuffers(1, &sceneLightsBO);
o It remains to initialise it to the right size:
glBindBuffer(GL_UNIFORM_BUFFER, sceneLightsBO);
glBufferData(GL_UNIFORM_BUFFER, sizeof(sceneLights), NULL, GL_DYNAMIC_DRAW);
glBindBuffer(GL_UNIFORM_BUFFER, 0);
The buffer is initialised with NULL.
We indicate GL_DYNAMIC_DRAW as the usage hint, because this buffer's data are expected to change frequently (contrary to the Vertex and Element Buffers). This parameter tells OpenGL which optimisation level to use.
2. The uploading of sceneLights is carried out by three functions: LoadLightsModelSpacePos(), LoadLightsColor() and LoadLightsState(). Each function is responsible for sending the sub-variable of sceneLights corresponding to its name. The advantage of using three functions rather than a single one is to limit data transfers when only one part of the variable needs to be updated. All three functions need to be implemented.

o Take, for instance, LoadLightsColor(). As its name indicates, this function is in charge of sending the array lightsColor contained in sceneLights. This data transfer is carried out with the function glBufferSubData():
glBindBuffer(GL_UNIFORM_BUFFER, sceneLightsBO);
glBufferSubData(GL_UNIFORM_BUFFER, offset, sizeof(sceneLights.lightsColor), sceneLights.lightsColor);
glBindBuffer(GL_UNIFORM_BUFFER, 0);
The second argument of glBufferSubData() designates the location, relative to the beginning of the buffer, where the data are to be replaced. The third argument gives the length of the data to be replaced. The fourth designates the data themselves. The offset variable thus designates the start location of the block of data corresponding to lightsColor in the buffer. Given that only the modelSpaceLightsPos array is placed before it, the offset variable may be defined as follows:
int offset = sizeof(sceneLights.modelSpaceLightsPos);
o In the same manner, define LoadLightsModelSpacePos() and LoadLightsState(), taking the different offset values into account.
o Add a call to the three functions at the end of the initialisation function InitializeSceneLight().
3. Let us now switch to the Fragment Shader, where we will implement the handling of several light sources.
o First of all, we need to create a variable equivalent to sceneLights. To do so, we first define the number of light sources (statically):
const int numberOfLights = 4;
o Then we define a Uniform Block that has exactly the same structure as the C++ structured variable:
uniform Light {
vec4 modelSpaceLightsPos[numberOfLights];
vec4 lightsColor[numberOfLights];
float lightsState[numberOfLights];
} Lgt;
Light is the name of the Uniform Block; it is this name that will be used when retrieving its address from the C++ program. Then come, in the same order, the data corresponding to the C++ structure. Finally, Lgt is the instance name.
If no instance name is specified, the names of the Uniform Block members are global. Otherwise, the instance name must be prepended to access them. For instance, Lgt.modelSpaceLightsPos gives access to the positions array.
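The byte offsets used by the three Load functions can be checked with plain sizeof/offsetof arithmetic. This is a sketch under the assumption that the C++ LightBlock layout matches the GLSL block layout, as in the lab code; Vec4 here is a 16-byte stand-in for glm::vec4 and the three helper names are illustrative.

```cpp
#include <cassert>
#include <cstddef>

const int NUMBER_OF_LIGHTS = 4;
struct Vec4 { float x, y, z, w; };   // same size as glm::vec4
struct LightBlock {
    Vec4  modelSpaceLightsPos[NUMBER_OF_LIGHTS];
    Vec4  lightsColor[NUMBER_OF_LIGHTS];
    float lightsState[NUMBER_OF_LIGHTS];
};

// LoadLightsModelSpacePos() writes at offset 0, LoadLightsColor() just after
// the positions array, LoadLightsState() after positions and colours.
std::size_t posOffset()   { return 0; }
std::size_t colorOffset() { return offsetof(LightBlock, lightsColor); }
std::size_t stateOffset() { return offsetof(LightBlock, lightsState); }
```

Each offset is simply the total size of the members placed before the one being updated, which is why sizeof(sceneLights.modelSpaceLightsPos) works for LoadLightsColor().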

o In the Fragment Shader, implement the lighting calculation (a Lambertian model for light sources at finite distance). To keep the code clear, you may implement in the Shader a function that computes the lighting of a point, of the form:
vec4 computeLight(vec4 cameraNormal)
{
vec4 lighting = vec4(0,0,0,0);
for(int i=0; i<numberOfLights; i++)
{
}
return lighting;
}
Use this function in the Shader's main function. Take care of the variable lightsState, which defines whether a light source is active or not. Here, we would like the light sources to be fixed relative to the ground.
4. Finally, link the Uniform Buffer to the Shader's Light block. As with textures, this link is drawn in two stages, as illustrated by Figure 4.
Figure 4 – Linking a Uniform Buffer
o These links may be added at the end of InitializeSceneLight(). First, we link the Uniform Buffer to a slot of an array managed by OpenGL, named the Uniform Buffer Binding Points, using the function glBindBufferRange():
glBindBufferRange(GL_UNIFORM_BUFFER, sceneLightsGlobalIndex, sceneLightsBO, 0, sizeof(sceneLights));
sceneLightsGlobalIndex designates the index to which the buffer is linked in the Uniform Buffer Binding Points array. It is a GLuint whose value you should define.
sceneLightsBO designates the buffer.
0 designates the start of the region of the buffer to link (an offset may be introduced here).
sizeof(sceneLights) corresponds to the size of the data to be linked.

o Then link the Shader's Uniform Block to the chosen Binding Point:
GLuint sceneLightsBOUniform = glGetUniformBlockIndex(theProgram, "Light");
glUniformBlockBinding(theProgram, sceneLightsBOUniform, sceneLightsGlobalIndex);
Note: the function used to obtain the Uniform Block address is different from the one previously used for simple uniform variables.
The program can now be compiled and run. The new light sources are fixed relative to the ground. Their coordinates are recalculated and rewritten into the buffer at each rotation of the scene. Moreover, it is now possible to pull the light sources away from the ground using the x key, or to move them closer using the w key. Furthermore, the keys 0, 1, 2 and 3 allow each of the light sources to be switched on and off. Because each light source is placed above one corner of the map, you can now modify the direction of the incoming light and see its influence on the normal map.
Fog
How do you introduce fog into the scene? Tip: in the Fragment Shader, gl_FragCoord may be used to recover a fragment's coordinates in window space. The value gl_FragCoord.z gives you an image of the distance between the current fragment and the camera. You can use the following fog model:
C_out = f_p * C_tex + (1 - f_p) * C_fog
f_p = exp(-(f_d * gl_FragCoord.z)^2)
where:
o C_out is the output colour.
o C_tex is the input texture colour.
o C_fog is the fog colour.
o f_p is the fog parameter (to be clamped between 0 and 1).
o f_d is the fog density.
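The fog model above can be sketched numerically in C++ before writing the GLSL version. The function names fogFactor and blendChannel are illustrative assumptions, not from the lab code.

```cpp
#include <cassert>
#include <cmath>

// f_p = exp(-(f_d * z)^2), clamped to [0, 1]; z stands for gl_FragCoord.z.
float fogFactor(float fd, float z) {
    float fp = std::exp(-(fd * z) * (fd * z));
    if (fp < 0.0f) fp = 0.0f;   // clamp; exp() already keeps fp in (0, 1]
    if (fp > 1.0f) fp = 1.0f;
    return fp;
}

// One colour channel of C_out = f_p * C_tex + (1 - f_p) * C_fog.
float blendChannel(float fp, float ctex, float cfog) {
    return fp * ctex + (1.0f - fp) * cfog;
}
```

Near the camera f_p is close to 1 and the texture colour dominates; as gl_FragCoord.z grows, f_p decays and the fragment fades towards the fog colour.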


More information

Comp 410/510 Computer Graphics Spring Programming with OpenGL Part 3: Shaders

Comp 410/510 Computer Graphics Spring Programming with OpenGL Part 3: Shaders Comp 410/510 Computer Graphics Spring 2018 Programming with OpenGL Part 3: Shaders Objectives Basic shaders - Vertex shader - Fragment shader Programming shaders with GLSL Finish first program void init(void)

More information

The Graphics Pipeline and OpenGL III: OpenGL Shading Language (GLSL 1.10)!

The Graphics Pipeline and OpenGL III: OpenGL Shading Language (GLSL 1.10)! ! The Graphics Pipeline and OpenGL III: OpenGL Shading Language (GLSL 1.10)! Gordon Wetzstein! Stanford University! EE 267 Virtual Reality! Lecture 4! stanford.edu/class/ee267/! Lecture Overview! Review

More information

The Graphics Pipeline and OpenGL III: OpenGL Shading Language (GLSL 1.10)!

The Graphics Pipeline and OpenGL III: OpenGL Shading Language (GLSL 1.10)! ! The Graphics Pipeline and OpenGL III: OpenGL Shading Language (GLSL 1.10)! Gordon Wetzstein! Stanford University! EE 267 Virtual Reality! Lecture 4! stanford.edu/class/ee267/! Updates! for 24h lab access:

More information

Today s Agenda. Basic design of a graphics system. Introduction to OpenGL

Today s Agenda. Basic design of a graphics system. Introduction to OpenGL Today s Agenda Basic design of a graphics system Introduction to OpenGL Image Compositing Compositing one image over another is most common choice can think of each image drawn on a transparent plastic

More information

Lecture 5 Vertex and Fragment Shaders-1. CITS3003 Graphics & Animation

Lecture 5 Vertex and Fragment Shaders-1. CITS3003 Graphics & Animation Lecture 5 Vertex and Fragment Shaders-1 CITS3003 Graphics & Animation E. Angel and D. Shreiner: Interactive Computer Graphics 6E Addison-Wesley 2012 Objectives The rendering pipeline and the shaders Data

More information

Supplement to Lecture 22

Supplement to Lecture 22 Supplement to Lecture 22 Programmable GPUs Programmable Pipelines Introduce programmable pipelines - Vertex shaders - Fragment shaders Introduce shading languages - Needed to describe shaders - RenderMan

More information

Information Coding / Computer Graphics, ISY, LiTH GLSL. OpenGL Shading Language. Language with syntax similar to C

Information Coding / Computer Graphics, ISY, LiTH GLSL. OpenGL Shading Language. Language with syntax similar to C GLSL OpenGL Shading Language Language with syntax similar to C Syntax somewhere between C och C++ No classes. Straight ans simple code. Remarkably understandable and obvious! Avoids most of the bad things

More information

Tutorial 04. Harshavardhan Kode. September 14, 2015

Tutorial 04. Harshavardhan Kode. September 14, 2015 Tutorial 04 Harshavardhan Kode September 14, 2015 1 About This tutorial an extension of the Tutorial 03. So you might see quite a lot similarities. The following things are new. A Plane is added underneath

More information

Computer Graphics CS 543 Lecture 4 (Part 2) Building 3D Models (Part 2)

Computer Graphics CS 543 Lecture 4 (Part 2) Building 3D Models (Part 2) Computer Graphics CS 543 Lecture 4 (Part 2) Building 3D Models (Part 2) Prof Emmanuel Agu Computer Science Dept. Worcester Polytechnic Institute (WPI) Modeling a Cube In 3D, declare vertices as (x,y,z)

More information

CGT520 Lighting. Lighting. T-vertices. Normal vector. Color of an object can be specified 1) Explicitly as a color buffer

CGT520 Lighting. Lighting. T-vertices. Normal vector. Color of an object can be specified 1) Explicitly as a color buffer CGT520 Lighting Lighting Color of an object can be specified 1) Explicitly as a color buffer Bedrich Benes, Ph.D. Purdue University Department of Computer Graphics 2) Implicitly from the illumination model.

More information

GEOMETRIC OBJECTS AND TRANSFORMATIONS I

GEOMETRIC OBJECTS AND TRANSFORMATIONS I Computer UNIT Graphics - 4 and Visualization 6 Hrs GEOMETRIC OBJECTS AND TRANSFORMATIONS I Scalars Points, and vectors Three-dimensional primitives Coordinate systems and frames Modelling a colored cube

More information

Programming with OpenGL Shaders I. Adapted From: Ed Angel Professor of Emeritus of Computer Science University of New Mexico

Programming with OpenGL Shaders I. Adapted From: Ed Angel Professor of Emeritus of Computer Science University of New Mexico Programming with OpenGL Shaders I Adapted From: Ed Angel Professor of Emeritus of Computer Science University of New Mexico Objectives Shader Programming Basics Simple Shaders Vertex shader Fragment shaders

More information

Best practices for effective OpenGL programming. Dan Omachi OpenGL Development Engineer

Best practices for effective OpenGL programming. Dan Omachi OpenGL Development Engineer Best practices for effective OpenGL programming Dan Omachi OpenGL Development Engineer 2 What Is OpenGL? 3 OpenGL is a software interface to graphics hardware - OpenGL Specification 4 GPU accelerates rendering

More information

Programming with OpenGL Shaders I. Adapted From: Ed Angel Professor of Emeritus of Computer Science University of New Mexico

Programming with OpenGL Shaders I. Adapted From: Ed Angel Professor of Emeritus of Computer Science University of New Mexico Programming with OpenGL Shaders I Adapted From: Ed Angel Professor of Emeritus of Computer Science University of New Mexico 0 Objectives Shader Basics Simple Shaders Vertex shader Fragment shaders 1 Vertex

More information

Tutorial 12: Real-Time Lighting B

Tutorial 12: Real-Time Lighting B Tutorial 12: Real-Time Lighting B Summary The last tutorial taught you the basics of real time lighting, including using the normal of a surface to calculate the diffusion and specularity. Surfaces are

More information

The Graphics Pipeline

The Graphics Pipeline The Graphics Pipeline Ray Tracing: Why Slow? Basic ray tracing: 1 ray/pixel Ray Tracing: Why Slow? Basic ray tracing: 1 ray/pixel But you really want shadows, reflections, global illumination, antialiasing

More information

CS770/870 Spring 2017 Open GL Shader Language GLSL

CS770/870 Spring 2017 Open GL Shader Language GLSL Preview CS770/870 Spring 2017 Open GL Shader Language GLSL Review traditional graphics pipeline CPU/GPU mixed pipeline issues Shaders GLSL graphics pipeline Based on material from Angel and Shreiner, Interactive

More information

CS770/870 Spring 2017 Open GL Shader Language GLSL

CS770/870 Spring 2017 Open GL Shader Language GLSL CS770/870 Spring 2017 Open GL Shader Language GLSL Based on material from Angel and Shreiner, Interactive Computer Graphics, 6 th Edition, Addison-Wesley, 2011 Bailey and Cunningham, Graphics Shaders 2

More information

Announcement. Homework 1 has been posted in dropbox and course website. Due: 1:15 pm, Monday, September 12

Announcement. Homework 1 has been posted in dropbox and course website. Due: 1:15 pm, Monday, September 12 Announcement Homework 1 has been posted in dropbox and course website Due: 1:15 pm, Monday, September 12 Today s Agenda Primitives Programming with OpenGL OpenGL Primitives Polylines GL_POINTS GL_LINES

More information

Advanced Lighting Techniques Due: Monday November 2 at 10pm

Advanced Lighting Techniques Due: Monday November 2 at 10pm CMSC 23700 Autumn 2015 Introduction to Computer Graphics Project 3 October 20, 2015 Advanced Lighting Techniques Due: Monday November 2 at 10pm 1 Introduction This assignment is the third and final part

More information

Computer graphics Labs: Blender (2/3) LuxRender: Interior Scene Rendering

Computer graphics Labs: Blender (2/3) LuxRender: Interior Scene Rendering Computer graphics Labs: Blender (2/3) LuxRender: Interior Scene Rendering University of Liège Department of Aerospace and Mechanical engineering Designed with Blender 2.76b LuxRender During the first tutorial

More information

WebGL A quick introduction. J. Madeira V. 0.2 September 2017

WebGL A quick introduction. J. Madeira V. 0.2 September 2017 WebGL A quick introduction J. Madeira V. 0.2 September 2017 1 Interactive Computer Graphics Graphics library / package is intermediary between application and display hardware Application program maps

More information

CS452/552; EE465/505. Texture Mapping in WebGL

CS452/552; EE465/505. Texture Mapping in WebGL CS452/552; EE465/505 Texture Mapping in WebGL 2-26 15 Outline! Texture Mapping in WebGL Read: Angel, Chapter 7, 7.3-7.5 LearningWebGL lesson 5: http://learningwebgl.com/blog/?p=507 Lab3 due: Monday, 3/2

More information

Computer Graphics Coursework 1

Computer Graphics Coursework 1 Computer Graphics Coursework 1 Deadline Deadline: 4pm, 24/10/2016 4pm 23/10/2015 Outline The aim of the coursework is to modify the vertex and fragment shaders in the provided OpenGL framework to implement

More information

Pipeline Operations. CS 4620 Lecture Steve Marschner. Cornell CS4620 Spring 2018 Lecture 11

Pipeline Operations. CS 4620 Lecture Steve Marschner. Cornell CS4620 Spring 2018 Lecture 11 Pipeline Operations CS 4620 Lecture 11 1 Pipeline you are here APPLICATION COMMAND STREAM 3D transformations; shading VERTEX PROCESSING TRANSFORMED GEOMETRY conversion of primitives to pixels RASTERIZATION

More information

The Application Stage. The Game Loop, Resource Management and Renderer Design

The Application Stage. The Game Loop, Resource Management and Renderer Design 1 The Application Stage The Game Loop, Resource Management and Renderer Design Application Stage Responsibilities 2 Set up the rendering pipeline Resource Management 3D meshes Textures etc. Prepare data

More information

CS179: GPU Programming

CS179: GPU Programming CS179: GPU Programming Lecture 4: Textures Original Slides by Luke Durant, Russel McClellan, Tamas Szalay Today Recap Textures What are textures? Traditional uses Alternative uses Recap Our data so far:

More information

Programming with OpenGL Part 3: Shaders. Ed Angel Professor of Emeritus of Computer Science University of New Mexico

Programming with OpenGL Part 3: Shaders. Ed Angel Professor of Emeritus of Computer Science University of New Mexico Programming with OpenGL Part 3: Shaders Ed Angel Professor of Emeritus of Computer Science University of New Mexico 1 Objectives Simple Shaders - Vertex shader - Fragment shaders Programming shaders with

More information

CS 130 Final. Fall 2015

CS 130 Final. Fall 2015 CS 130 Final Fall 2015 Name Student ID Signature You may not ask any questions during the test. If you believe that there is something wrong with a question, write down what you think the question is trying

More information

CMSC427 Advanced shading getting global illumination by local methods. Credit: slides Prof. Zwicker

CMSC427 Advanced shading getting global illumination by local methods. Credit: slides Prof. Zwicker CMSC427 Advanced shading getting global illumination by local methods Credit: slides Prof. Zwicker Topics Shadows Environment maps Reflection mapping Irradiance environment maps Ambient occlusion Reflection

More information

Mobile Application Programing: Android. OpenGL Operation

Mobile Application Programing: Android. OpenGL Operation Mobile Application Programing: Android OpenGL Operation Activities Apps are composed of activities Activities are self-contained tasks made up of one screen-full of information Activities start one another

More information

OUTLINE. Implementing Texturing What Can Go Wrong and How to Fix It Mipmapping Filtering Perspective Correction

OUTLINE. Implementing Texturing What Can Go Wrong and How to Fix It Mipmapping Filtering Perspective Correction TEXTURE MAPPING 1 OUTLINE Implementing Texturing What Can Go Wrong and How to Fix It Mipmapping Filtering Perspective Correction 2 BASIC STRAGEGY Three steps to applying a texture 1. specify the texture

More information

SUMMARY. CS380: Introduction to Computer Graphics Texture Mapping Chapter 15. Min H. Kim KAIST School of Computing 18/05/03.

SUMMARY. CS380: Introduction to Computer Graphics Texture Mapping Chapter 15. Min H. Kim KAIST School of Computing 18/05/03. CS380: Introduction to Computer Graphics Texture Mapping Chapter 15 Min H. Kim KAIST School of Computing Materials SUMMARY 2 1 Light blob from PVC plastic Recall: Given any vector w (not necessarily of

More information

Adaptive Point Cloud Rendering

Adaptive Point Cloud Rendering 1 Adaptive Point Cloud Rendering Project Plan Final Group: May13-11 Christopher Jeffers Eric Jensen Joel Rausch Client: Siemens PLM Software Client Contact: Michael Carter Adviser: Simanta Mitra 4/29/13

More information

CS451Real-time Rendering Pipeline

CS451Real-time Rendering Pipeline 1 CS451Real-time Rendering Pipeline JYH-MING LIEN DEPARTMENT OF COMPUTER SCIENCE GEORGE MASON UNIVERSITY Based on Tomas Akenine-Möller s lecture note You say that you render a 3D 2 scene, but what does

More information

CS 464 Review. Review of Computer Graphics for Final Exam

CS 464 Review. Review of Computer Graphics for Final Exam CS 464 Review Review of Computer Graphics for Final Exam Goal: Draw 3D Scenes on Display Device 3D Scene Abstract Model Framebuffer Matrix of Screen Pixels In Computer Graphics: If it looks right then

More information

How OpenGL Works. Retained Mode. Immediate Mode. Introduction To OpenGL

How OpenGL Works. Retained Mode. Immediate Mode. Introduction To OpenGL How OpenGL Works Introduction To OpenGL OpenGL uses a series of matrices to control the position and way primitives are drawn OpenGL 1.x - 2.x allows these primitives to be drawn in two ways immediate

More information

Overview. By end of the week:

Overview. By end of the week: Overview By end of the week: - Know the basics of git - Make sure we can all compile and run a C++/ OpenGL program - Understand the OpenGL rendering pipeline - Understand how matrices are used for geometric

More information

Module Contact: Dr Stephen Laycock, CMP Copyright of the University of East Anglia Version 1

Module Contact: Dr Stephen Laycock, CMP Copyright of the University of East Anglia Version 1 UNIVERSITY OF EAST ANGLIA School of Computing Sciences Main Series PG Examination 2013-14 COMPUTER GAMES DEVELOPMENT CMPSME27 Time allowed: 2 hours Answer any THREE questions. (40 marks each) Notes are

More information

CSE 167: Introduction to Computer Graphics Lecture #7: Textures. Jürgen P. Schulze, Ph.D. University of California, San Diego Fall Quarter 2018

CSE 167: Introduction to Computer Graphics Lecture #7: Textures. Jürgen P. Schulze, Ph.D. University of California, San Diego Fall Quarter 2018 CSE 167: Introduction to Computer Graphics Lecture #7: Textures Jürgen P. Schulze, Ph.D. University of California, San Diego Fall Quarter 2018 Announcements Project 2 due this Friday at 2pm Grading in

More information

last time put back pipeline figure today will be very codey OpenGL API library of routines to control graphics calls to compile and load shaders

last time put back pipeline figure today will be very codey OpenGL API library of routines to control graphics calls to compile and load shaders last time put back pipeline figure today will be very codey OpenGL API library of routines to control graphics calls to compile and load shaders calls to load vertex data to vertex buffers calls to load

More information

DH2323 DGI13. Lab 2 Raytracing

DH2323 DGI13. Lab 2 Raytracing DH2323 DGI13 Lab 2 Raytracing In this lab you will implement a Raytracer, which draws images of 3D scenes by tracing the light rays reaching the simulated camera. The lab is divided into several steps.

More information

Pipeline Operations. CS 4620 Lecture 14

Pipeline Operations. CS 4620 Lecture 14 Pipeline Operations CS 4620 Lecture 14 2014 Steve Marschner 1 Pipeline you are here APPLICATION COMMAND STREAM 3D transformations; shading VERTEX PROCESSING TRANSFORMED GEOMETRY conversion of primitives

More information

3d Programming I. Dr Anton Gerdelan

3d Programming I. Dr Anton Gerdelan 3d Programming I Dr Anton Gerdelan gerdela@scss.tcd.ie 3d Programming 3d programming is very difficult 3d programming is very time consuming 3d Programming Practical knowledge of the latest, low-level

More information

Mali & OpenGL ES 3.0. Dave Shreiner Jon Kirkham ARM. Game Developers Conference 27 March 2013

Mali & OpenGL ES 3.0. Dave Shreiner Jon Kirkham ARM. Game Developers Conference 27 March 2013 Mali & OpenGL ES 3.0 Dave Shreiner Jon Kirkham ARM Game Developers Conference 27 March 2013 1 Agenda Some foundational work Instanced geometry rendering Transform feedback Occlusion Queries 2 What s New

More information

Terrain Rendering (Part 1) Due: Thursday November 30 at 10pm

Terrain Rendering (Part 1) Due: Thursday November 30 at 10pm CMSC 23700 Autumn 2017 Introduction to Computer Graphics Project 5 November 16, 2015 Terrain Rendering (Part 1) Due: Thursday November 30 at 10pm 1 Summary The final project involves rendering large-scale

More information

Spring 2011 Prof. Hyesoon Kim

Spring 2011 Prof. Hyesoon Kim Spring 2011 Prof. Hyesoon Kim Application Geometry Rasterizer CPU Each stage cane be also pipelined The slowest of the pipeline stage determines the rendering speed. Frames per second (fps) Executes on

More information

The Rendering Pipeline (1)

The Rendering Pipeline (1) The Rendering Pipeline (1) Alessandro Martinelli alessandro.martinelli@unipv.it 30 settembre 2014 The Rendering Pipeline (1) Rendering Architecture First Rendering Pipeline Second Pipeline: Illumination

More information

CS 543 Lecture 1 (Part 3) Prof Emmanuel Agu. Computer Science Dept. Worcester Polytechnic Institute (WPI)

CS 543 Lecture 1 (Part 3) Prof Emmanuel Agu. Computer Science Dept. Worcester Polytechnic Institute (WPI) Computer Graphics CS 543 Lecture 1 (Part 3) Prof Emmanuel Agu Computer Science Dept. Worcester Polytechnic Institute (WPI) Recall: OpenGL Skeleton void main(int argc, char** argv){ // First initialize

More information

Texture Mapping. Mike Bailey.

Texture Mapping. Mike Bailey. Texture Mapping 1 Mike Bailey mjb@cs.oregonstate.edu This work is licensed under a Creative Commons Attribution-NonCommercial- NoDerivatives 4.0 International License TextureMapping.pptx The Basic Idea

More information

CS 548: COMPUTER GRAPHICS PORTRAIT OF AN OPENGL PROGRAM SPRING 2015 DR. MICHAEL J. REALE

CS 548: COMPUTER GRAPHICS PORTRAIT OF AN OPENGL PROGRAM SPRING 2015 DR. MICHAEL J. REALE CS 548: COMPUTER GRAPHICS PORTRAIT OF AN OPENGL PROGRAM SPRING 2015 DR. MICHAEL J. REALE INTRODUCTION We re going to talk a little bit about the structure and logic of a basic, interactive OpenGL/GLUT

More information