An Introduction to OpenGL Programming

An Introduction to OpenGL Programming Ed Angel University of New Mexico Dave Shreiner ARM, Inc. With some changes by João Madeiras Pereira

What Is OpenGL? OpenGL is a computer graphics rendering application programming interface, or API. With it, you can generate high-quality color images by rendering with geometric and image primitives. It forms the basis of many interactive applications that include 3D graphics. By using OpenGL, the graphics part of your application can be operating-system independent and window-system independent.

Course Ground Rules We'll concentrate on the latest versions of OpenGL. They enforce a new way to program with OpenGL that allows more efficient use of GPU resources. Modern OpenGL doesn't support many of the classic ways of doing things, such as fixed-function graphics operations like vertex lighting and transformations. All applications must use shaders for their graphics processing; we only introduce a subset of OpenGL's shader capabilities in this course.

Evolution of the OpenGL Pipeline

In the Beginning OpenGL 1.0 was released on July 1st, 1994. Its pipeline was entirely fixed-function: the only operations available were fixed by the implementation. Pipeline stages: Vertex Data / Pixel Data -> Vertex Transform and Lighting -> Primitive Setup and Rasterization -> Fragment Coloring and Texturing -> Blending (with a Texture Store). The pipeline evolved but remained based on fixed-function operation through OpenGL versions 1.1 through 2.0 (Sept. 2004).

Beginnings of The Programmable Pipeline OpenGL 2.0 (officially) added programmable shaders: vertex shading augmented the fixed-function transform and lighting stage, and fragment shading augmented the fragment coloring stage. However, the fixed-function pipeline was still available. Pipeline stages: Vertex Data / Pixel Data -> Vertex Transform and Lighting -> Primitive Setup and Rasterization -> Fragment Coloring and Texturing -> Blending (with a Texture Store).

An Evolutionary Change OpenGL 3.0 introduced the deprecation model, the method used to remove features from OpenGL. The pipeline remained the same until OpenGL 3.1 (released March 24th, 2009), which introduced a change in how OpenGL contexts are used:

Context Type: Full - includes all features (including those marked deprecated) available in the current version of OpenGL
Context Type: Forward Compatible - includes all non-deprecated features (i.e., creates a context that would be similar to the next version of OpenGL)

The Exclusively Programmable Pipeline OpenGL 3.1 removed the fixed-function pipeline; programs were required to use only shaders. Pipeline stages: Vertex Data / Pixel Data -> Vertex Shader -> Primitive Setup and Rasterization -> Fragment Shader -> Blending (with a Texture Store). Additionally, almost all data is GPU-resident: all vertex data is sent using buffer objects.

What We Can't Do Any use of the fixed-function vertex or fragment operations; shaders are mandatory. Use of glBegin/glEnd and display lists to define primitives; use vertex buffer objects for geometry. Use of quad or polygon primitives; only triangles. Use of most of the built-in attribute and uniform variables in GLSL; pass them manually to shaders.

Vertex Shaders The main application of vertex shaders is to change the vertices of the primitives you have already defined and to set up variables, such as lighting, that depend on the vertices. The vertex shader is a one-vertex-in, one-vertex-out process; it can't create more vertices. (Geometry and tessellation shaders do that in recent versions.)

Vertex Shader (GLSL version < 1.2)

Fragment Shader (GLSL version < 1.2)

More Programmability OpenGL 3.2 (released August 3rd, 2009) added an additional shading stage: geometry shaders, which modify geometric primitives within the graphics pipeline. Pipeline stages: Vertex Data -> Vertex Shader -> Geometry Shader -> Primitive Setup and Rasterization -> Fragment Shader -> Blending (with Pixel Data and a Texture Store).

Geometry Shader (GLSL version 1.5)

More Evolution - Context Profiles OpenGL 3.2 also introduced context profiles. Profiles control which features are exposed (it's like GL_ARB_compatibility, only not insane). There are currently two types of profiles: core and compatible.

Full + core: all features of the current release
Full + compatible: all features ever in OpenGL
Forward Compatible + core: all non-deprecated features
Forward Compatible + compatible: not supported

The Latest Pipelines OpenGL 4.1 (released July 25th, 2010) included additional shading stages: tessellation-control and tessellation-evaluation shaders. The latest version is 4.5. Pipeline stages: Vertex Data -> Vertex Shader -> Tessellation Control Shader -> Tessellation Evaluation Shader -> Geometry Shader -> Primitive Setup and Rasterization -> Fragment Shader -> Blending (with Pixel Data and a Texture Store).

OpenGL ES and WebGL OpenGL ES 2.0: designed for embedded and hand-held devices such as cell phones; based on OpenGL 3.1; shader based. WebGL: a JavaScript implementation of ES 2.0; runs on most recent browsers.

OpenGL Application Development

A Simplified Pipeline Model Application -> GPU -> Framebuffer. Data flow: Vertices -> Vertex Processing (Vertex Shader) -> Assembly -> Rasterizer / Interpolator -> Fragments -> Fragment Processing (Fragment Shader) -> Tests and Blending -> Pixels.

Pipeline OpenGL 4.3

OpenGL Programming in a Nutshell Modern OpenGL programs essentially do the following steps: create shader programs; create buffer objects and load data into them; connect data locations with shader variables; render.

Application Framework Requirements OpenGL applications need a place to render into, usually an on-screen window, and need to communicate with the native windowing system. Each windowing system interface is different. We use GLUT (more specifically, freeglut - http://freeglut.sourceforge.net), a simple, open-source library that works everywhere and handles all windowing operations: opening windows, input processing.

Simplifying Working with OpenGL Operating systems deal with library functions differently; compiler linkage and runtime libraries may expose different functions. Additionally, OpenGL has many versions and profiles which expose different sets of functions. Managing function access is cumbersome and window-system dependent. We use another open-source library, GLEW, to hide those details - http://glew.sourceforge.net

Representing Geometric Objects Geometric objects are represented using vertices. A vertex is a collection of generic attributes: positional coordinates, colors, texture coordinates, and any other data associated with that point in space. Position is stored in 4-dimensional homogeneous coordinates (x, y, z, w). Vertex data must be stored in vertex buffer objects (VBOs), and VBOs must be associated with vertex array objects (VAOs).

OpenGL's Geometric Primitives All primitives are specified by vertices: GL_POINTS, GL_LINES, GL_LINE_STRIP, GL_LINE_LOOP, GL_TRIANGLES, GL_TRIANGLE_STRIP, GL_TRIANGLE_FAN.

A First Program

Our First Program We'll render a cube with colors at each vertex. Our example demonstrates: initializing vertex data; organizing data for rendering; simple object modeling; building up 3D objects from geometric primitives; building geometric primitives from vertices.

Classic arrays with indices: now we have to use triangles. Vertex attribute arrays with 8 elements work because we have one normal per vertex. If there are different normals at the shared vertices, arrays with 24 elements should be used.

Initializing the Cube's Data We'll build each cube face from individual triangles (arrays without indices). We need to determine how much storage is required: (6 faces)(2 triangles/face)(3 vertices/triangle) gives const int NumVertices = 36; To simplify communicating with GLSL, we'll use a vec4 class (implemented in C++) similar to GLSL's vec4 type. We'll also typedef it to add logical meaning: typedef vec4 point4; typedef vec4 color4;

Initializing the Cube's Data (cont'd) Before we can initialize our VBO, we need to stage the data. Our cube has two attributes per vertex: position and color. We create two arrays to hold the VBO data: point4 vPositions[NumVertices]; color4 vColors[NumVertices];

Cube Data Vertices of a unit cube centered at the origin, sides aligned with the axes: point4 positions[8] = { point4( -0.5, -0.5, 0.5, 1.0 ), point4( -0.5, 0.5, 0.5, 1.0 ), point4( 0.5, 0.5, 0.5, 1.0 ), point4( 0.5, -0.5, 0.5, 1.0 ), point4( -0.5, -0.5, -0.5, 1.0 ), point4( -0.5, 0.5, -0.5, 1.0 ), point4( 0.5, 0.5, -0.5, 1.0 ), point4( 0.5, -0.5, -0.5, 1.0 ) };

Cube Data (cont'd) We'll also set up an array of RGBA colors: color4 colors[8] = { color4( 0.0, 0.0, 0.0, 1.0 ), // black color4( 1.0, 0.0, 0.0, 1.0 ), // red color4( 1.0, 1.0, 0.0, 1.0 ), // yellow color4( 0.0, 1.0, 0.0, 1.0 ), // green color4( 0.0, 0.0, 1.0, 1.0 ), // blue color4( 1.0, 0.0, 1.0, 1.0 ), // magenta color4( 1.0, 1.0, 1.0, 1.0 ), // white color4( 0.0, 1.0, 1.0, 1.0 ) // cyan };

Generating a Cube Face from Vertices To simplify generating the geometry, we use a convenience function quad() that creates two triangles for each face and assigns colors to the vertices.

int Index = 0; // global variable indexing into VBO arrays
void quad( int a, int b, int c, int d )
{
    vColors[Index] = colors[a]; vPositions[Index] = positions[a]; Index++;
    vColors[Index] = colors[b]; vPositions[Index] = positions[b]; Index++;
    vColors[Index] = colors[c]; vPositions[Index] = positions[c]; Index++;
    vColors[Index] = colors[a]; vPositions[Index] = positions[a]; Index++;
    vColors[Index] = colors[c]; vPositions[Index] = positions[c]; Index++;
    vColors[Index] = colors[d]; vPositions[Index] = positions[d]; Index++;
}

Generating the Cube from Faces Generate 12 triangles for the cube: 36 vertices with 36 colors. void colorcube() { quad( 1, 0, 3, 2 ); quad( 2, 3, 7, 6 ); quad( 3, 0, 4, 7 ); quad( 6, 5, 1, 2 ); quad( 4, 5, 6, 7 ); quad( 5, 4, 0, 1 ); }
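As a self-contained sketch (not from the slides), the quad()/colorcube() idea can be checked without any OpenGL: the helper types below are minimal stand-ins for the slide's point4/color4, and the loop-based quad() is equivalent to the six unrolled assignments.

```cpp
#include <cassert>

// Minimal stand-ins for the slide's vec4-based point4/color4 types.
struct vec4 { float x, y, z, w; };
typedef vec4 point4;
typedef vec4 color4;

const int NumVertices = 36;
point4 positions[8];            // the 8 cube corners
color4 colors[8];               // the 8 corner colors
point4 vPositions[NumVertices]; // staged VBO data
color4 vColors[NumVertices];
int Index = 0;                  // global index into the VBO arrays

// Two triangles per face: a-b-c and a-c-d.
void quad(int a, int b, int c, int d) {
    int order[6] = { a, b, c, a, c, d };
    for (int k = 0; k < 6; ++k) {
        vColors[Index]    = colors[order[k]];
        vPositions[Index] = positions[order[k]];
        ++Index;
    }
}

void colorcube() {
    quad(1, 0, 3, 2); quad(2, 3, 7, 6); quad(3, 0, 4, 7);
    quad(6, 5, 1, 2); quad(4, 5, 6, 7); quad(5, 4, 0, 1);
}
```

After colorcube() runs, Index equals NumVertices, confirming the (6 faces)(2 triangles)(3 vertices) storage count.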

Vertex Array Objects (VAOs) VAOs store the data of a geometric object. Steps in using a VAO: generate VAO names by calling glGenVertexArrays(); bind a specific VAO for initialization by calling glBindVertexArray(); update the VBOs associated with this VAO; bind the VAO for use in rendering. This approach allows a single function call to specify all the data for an object; previously, you might have needed to make many calls to make all the data current.

VAOs in Code Create a vertex array object: GLuint vao; glGenVertexArrays( 1, &vao ); glBindVertexArray( vao );

Storing Vertex Attributes Vertex data must be stored in a VBO and associated with a VAO. The code flow is similar to configuring a VAO: generate VBO names by calling glGenBuffers(); bind a specific VBO for initialization by calling glBindBuffer( GL_ARRAY_BUFFER, ... ); load data into the VBO using glBufferData( GL_ARRAY_BUFFER, ... ); bind the VAO for use in rendering with glBindVertexArray().

VBOs in Code Create and initialize a buffer object:

GLuint buffer;
glGenBuffers( 1, &buffer );
glBindBuffer( GL_ARRAY_BUFFER, buffer );
glBufferData( GL_ARRAY_BUFFER, sizeof(vPositions) + sizeof(vColors), NULL, GL_STATIC_DRAW );
glBufferSubData( GL_ARRAY_BUFFER, 0, sizeof(vPositions), vPositions );
glBufferSubData( GL_ARRAY_BUFFER, sizeof(vPositions), sizeof(vColors), vColors );

Connecting Vertex Shaders with Geometric Data Application vertex data enters the OpenGL pipeline through the vertex shader. We need to connect vertex data to shader variables, which requires knowing the attribute location. The attribute location can either be queried by calling glGetAttribLocation() after linkage, or set by calling glBindAttribLocation() before linkage.

Vertex Array Code Associate shader variables with vertex arrays (do this after shaders are loaded):

GLuint vPosition = glGetAttribLocation( program, "vPosition" );
glEnableVertexAttribArray( vPosition );
glVertexAttribPointer( vPosition, 4, GL_FLOAT, GL_FALSE, 0, BUFFER_OFFSET(0) );
GLuint vColor = glGetAttribLocation( program, "vColor" );
glEnableVertexAttribArray( vColor );
glVertexAttribPointer( vColor, 4, GL_FLOAT, GL_FALSE, 0, BUFFER_OFFSET(sizeof(vPositions)) );

VAO and VBO

Drawing Geometric Primitives For contiguous groups of vertices: glDrawArrays( GL_TRIANGLES, 0, NumVertices ); When reusing (repeating) vertices, an index buffer should be used to minimize storage: glDrawElements( GL_TRIANGLES, NumIndices, GL_UNSIGNED_INT, (GLvoid*)indices ); Usually invoked in the display callback. Initiates the vertex shader.
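To see why indexed drawing minimizes storage, here is a back-of-the-envelope comparison for the cube (a sketch, not from the slides; it assumes 16-byte vec4 positions and 4-byte unsigned int indices):

```cpp
#include <cassert>
#include <cstddef>

// Assumed sizes: a vec4 position is 4 floats (16 bytes),
// an index is one unsigned int (4 bytes).
const std::size_t kVec4Bytes  = 16;
const std::size_t kIndexBytes = 4;

// glDrawArrays path: every one of the 36 triangle corners stored explicitly.
std::size_t arraysBytes()   { return 36 * kVec4Bytes; }

// glDrawElements path: only the 8 unique corners, plus 36 indices into them.
std::size_t elementsBytes() { return 8 * kVec4Bytes + 36 * kIndexBytes; }
```

For the cube's positions alone this is 576 bytes versus 272 bytes, and the gap widens as per-vertex attributes are added.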

Shaders and GLSL

GLSL Data Types Scalar types: float, int, bool. Vector types: vec2, vec3, vec4; ivec2, ivec3, ivec4; bvec2, bvec3, bvec4. Matrix types: mat2, mat3, mat4. Texture sampling: sampler1D, sampler2D, sampler3D, samplerCube. C++-style constructors: vec3 a = vec3(1.0, 2.0, 3.0);

Operators Standard C/C++ arithmetic and logic operators, plus overloaded operators for matrix and vector operations: mat4 m; vec4 a, b, c; b = a*m; c = m*a;

Components and Swizzling Access vector components using either [ ] (C-style array indexing) or the named components xyzw, rgba, or stpq. For example: vec3 v; v[1], v.y, v.g, v.t all refer to the same element. Component swizzling: vec3 a, b; a.xy = b.yx;

Qualifiers in, out: copy vertex attributes and other variables into and out of shaders. in vec2 texcoord; out vec4 color; uniform: a shader-constant variable set from the application. uniform float time; uniform vec4 rotation;

Functions Built-in - arithmetic: sqrt, pow, abs; trigonometric: sin, asin; graphical: length, reflect. User-defined functions are also supported.

Built-in Variables gl_Position (required): output position from the vertex shader. gl_FragCoord: input fragment position. gl_FragDepth: output depth value from the fragment shader.

Simple Vertex Shader for Cube Example #version 430 in vec4 vPosition; in vec4 vColor; out vec4 color; void main() { color = vColor; gl_Position = vPosition; }

The Simplest Fragment Shader #version 430 in vec4 color; out vec4 fColor; // fragment's final color void main() { fColor = color; }

Getting Your Shaders into OpenGL Shaders need to be compiled and linked to form an executable shader program. OpenGL provides the compiler and linker. A program must contain vertex and fragment shaders; other shaders are optional. Steps: Create Program - glCreateProgram(); Create Shader - glCreateShader(); Load Shader Source - glShaderSource(); Compile Shader - glCompileShader(); Attach Shader to Program - glAttachShader(); Link Program - glLinkProgram(); Use Program - glUseProgram(). These steps need to be repeated for each type of shader in the shader program.

A Simpler Way We've created a routine for this course to make it easier to load your shaders, available at the course website: GLuint InitShaders( const char* vFile, const char* fFile ); InitShaders takes two filenames: vFile, the path to the vertex shader file, and fFile, the path to the fragment shader file. It fails if the shaders don't compile or the program doesn't link.

Associating Shader Variables and Data We need to associate a shader variable with an OpenGL data source: vertex shader attributes map to app vertex attributes; shader uniforms map to app-provided uniform values. OpenGL relates shader variables to indices for the app to set. There are two methods for determining the variable/index association: specify the association before program linkage, or query the association after program linkage.

Determining Locations After Linking This assumes you already know the variable names: GLint loc = glGetAttribLocation( program, name ); GLint loc = glGetUniformLocation( program, name );

Initializing Uniform Variable Values glUniform4f( index, x, y, z, w ); GLboolean transpose = GL_TRUE; // since we're C programmers GLfloat mat[3][4][4] = { }; glUniformMatrix4fv( index, 3, transpose, mat );

Finishing the Cube Program

int main( int argc, char **argv )
{
    glutInit( &argc, argv );
    glutInitDisplayMode( GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH );
    glutInitWindowSize( 512, 512 );
    glutCreateWindow( "Color Cube" );
    glewInit();
    init();
    glutDisplayFunc( display );
    glutKeyboardFunc( keyboard );
    glutMainLoop();
    return 0;
}

Cube Program's GLUT Callbacks

void display( void )
{
    glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
    glBindVertexArray( vao );
    glUseProgram( program );
    glDrawArrays( GL_TRIANGLES, 0, NumVertices );
    glutSwapBuffers();
}

void keyboard( unsigned char key, int x, int y )
{
    switch( key ) {
        case 033: case 'q': case 'Q':
            exit( EXIT_SUCCESS );
            break;
    }
}

Vertex Shader Examples A vertex shader is initiated for each vertex output by glDrawArrays(). A vertex shader must output a position in clip coordinates to the rasterizer. Basic uses of vertex shaders: transformations, lighting, moving vertex positions.

Transformations

Camera Analogy 3D is just like taking a photograph (lots of photographs!). [Figure: camera, viewing volume, tripod, model.]

Transformations Transformations take us from one space to another; all of our transforms are 4x4 matrices. Pipeline: Vertex Data (World Coords.) -> Model-View Transform -> Eye Coords. -> Projection Transform -> Clip Coords. -> Perspective Division (divide by w) -> Normalized Device Coords. -> Viewport Transform -> 2D Window Coordinates.

Camera Analogy and Transformations Projection transformations adjust the lens of the camera. Viewing transformations (the tripod) define the position and orientation of the viewing volume in the world. Modeling transformations move the model. Viewport transformations enlarge or reduce the physical photograph.

Full Vertex Transformation Pipeline

3D Transformations A vertex is transformed by 4x4 matrices; all affine operations are matrix multiplications. All matrices are stored column-major in OpenGL; this is the opposite of what C programmers expect. Matrices are always post-multiplied: the product of matrix and vector is Mv.

M = | m0  m4  m8   m12 |
    | m1  m5  m9   m13 |
    | m2  m6  m10  m14 |
    | m3  m7  m11  m15 |
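The column-major convention can be demonstrated without OpenGL. In this sketch (names are illustrative, not from the slides), element (row r, column c) lives at index c*4 + r of a flat array, so a translation's tx, ty, tz occupy indices 12, 13, 14:

```cpp
#include <cassert>
#include <cmath>

// Column-major 4x4 matrix times a column vector, exactly as OpenGL
// lays matrices out in memory: element (row r, col c) is m[c*4 + r].
void multMV(const float m[16], const float v[4], float out[4]) {
    for (int r = 0; r < 4; ++r) {
        out[r] = 0.0f;
        for (int c = 0; c < 4; ++c)
            out[r] += m[c * 4 + r] * v[c];
    }
}
```

With a translation by (2, 3, 4) stored column-major, the point (1, 1, 1, 1) maps to (3, 4, 5, 1), confirming both the layout and the Mv (post-multiplied) convention.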

Specifying What You Can See Set up a viewing frustum to specify how much of the world we can see. This is done in two steps: specify the size of the frustum (projection transform), and specify its location in space (model-view transform). Anything outside of the viewing frustum is clipped: the primitive is either modified or discarded (if entirely outside the frustum).

Specifying What You Can See (cont'd) The OpenGL projection model uses eye coordinates: the eye is located at the origin looking down the -z axis. Projection matrices use a six-plane model: the near (image) plane and far (infinite) plane, both distances from the eye (positive values), and the enclosing planes: top & bottom, left & right.

Specifying What You Can See (cont'd) Orthographic view and perspective view projection matrices:

O = \begin{pmatrix} \frac{2}{r-l} & 0 & 0 & -\frac{r+l}{r-l} \\ 0 & \frac{2}{t-b} & 0 & -\frac{t+b}{t-b} \\ 0 & 0 & \frac{-2}{f-n} & -\frac{f+n}{f-n} \\ 0 & 0 & 0 & 1 \end{pmatrix}

P = \begin{pmatrix} \frac{2n}{r-l} & 0 & \frac{r+l}{r-l} & 0 \\ 0 & \frac{2n}{t-b} & \frac{t+b}{t-b} & 0 \\ 0 & 0 & -\frac{f+n}{f-n} & \frac{-2fn}{f-n} \\ 0 & 0 & -1 & 0 \end{pmatrix}

Orthogonal View-Volume (2014/2015, AVT)

Frustum The portion of the virtual world seen by the camera at a given time is limited by front and back clipping planes. These are at z = n and z = f. Only what is within the viewing cone (also called the frustum) is sent down the rendering pipe. [Figure: frustum between the clipping planes, with corners (l, t, f) and (r, b, n), the projection reference point, and the X, Y, Z axes.]

Symmetric frustum FOV: Field of View. Side view of the volume: theta_V is the vertical aperture, with tan(theta_V / 2) = h / D, where h is the height of the view window above its center CW and D is the distance from the VRP to the view window. Top view of the volume: theta_W is the horizontal aperture, with tan(theta_W / 2) = w / D, where w is the width of the view window from its center. [Figure: side and top views of the volume showing VRP, view window, CW, and the axes X_v, Y_v, Z_v, u, v, n.] (2014/2015, AVT)

Symmetric frustum Compute the dimensions of the view window of the symmetric frustum defined by: Perspective(120.0f, 1.33f, 15.0, 120.0) Note: void Perspective( GLdouble fovy, GLdouble aspect, GLdouble near, GLdouble far )
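The exercise reduces to the FOV relations from the previous slide: on the near plane, the window height is H = 2 * near * tan(fovy/2) and the width is W = aspect * H. A small sketch of the arithmetic (the function name is illustrative):

```cpp
#include <cassert>
#include <cmath>

// Near-plane view-window size of a symmetric frustum, from
// tan(fovy/2) = (H/2)/near and W = aspect * H.
void windowSize(double fovyDeg, double aspect, double near,
                double& w, double& h) {
    const double pi = std::acos(-1.0);
    double half = fovyDeg * pi / 180.0 / 2.0;  // half vertical aperture, radians
    h = 2.0 * near * std::tan(half);
    w = aspect * h;
}
```

For Perspective(120.0f, 1.33f, 15.0, 120.0) this gives H = 2 * 15 * tan(60 degrees), roughly 51.96, and W = 1.33 * H, roughly 69.11.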

Viewing Transformations Position the camera/eye in the scene (place the tripod down; aim the camera). To fly through a scene, change the viewing transformation and redraw the scene. LookAt( eyex, eyey, eyez, lookx, looky, lookz, upx, upy, upz ): the up vector determines a unique orientation; be careful of degenerate positions.

Creating the LookAt Matrix

\hat{n} = \frac{eye - look}{\|eye - look\|}, \quad \hat{u} = \frac{up \times \hat{n}}{\|up \times \hat{n}\|}, \quad \hat{v} = \hat{n} \times \hat{u}

LookAt = \begin{pmatrix} u_x & u_y & u_z & -\hat{u} \cdot eye \\ v_x & v_y & v_z & -\hat{v} \cdot eye \\ n_x & n_y & n_z & -\hat{n} \cdot eye \\ 0 & 0 & 0 & 1 \end{pmatrix}
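As a runnable sketch of that construction (helper types and names are illustrative, not the course library), the matrix rows are the camera axes u, v, n with translations that move the eye to the origin:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x};
}
static Vec3 normalize(Vec3 a) {
    double len = std::sqrt(dot(a, a));
    return {a.x/len, a.y/len, a.z/len};
}

// Builds the viewing matrix from the slide (row-major here for clarity):
// rows u, v, n with translations -u.eye, -v.eye, -n.eye.
void lookAtMatrix(Vec3 eye, Vec3 look, Vec3 up, double m[4][4]) {
    Vec3 n = normalize(sub(eye, look));  // camera backward axis
    Vec3 u = normalize(cross(up, n));    // camera right axis
    Vec3 v = cross(n, u);                // camera up axis
    Vec3 axes[3] = {u, v, n};
    for (int r = 0; r < 3; ++r) {
        m[r][0] = axes[r].x; m[r][1] = axes[r].y; m[r][2] = axes[r].z;
        m[r][3] = -dot(axes[r], eye);
    }
    m[3][0] = m[3][1] = m[3][2] = 0.0; m[3][3] = 1.0;
}
```

A quick sanity check: with the eye at (0, 0, 5) looking at the origin, the matrix maps the eye to the origin and the look point onto the negative z axis, as eye coordinates require.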

Translation Move the origin to a new location:

T(t_x, t_y, t_z) = \begin{pmatrix} 1 & 0 & 0 & t_x \\ 0 & 1 & 0 & t_y \\ 0 & 0 & 1 & t_z \\ 0 & 0 & 0 & 1 \end{pmatrix}

Scale Stretch, mirror or decimate a coordinate direction:

S(s_x, s_y, s_z) = \begin{pmatrix} s_x & 0 & 0 & 0 \\ 0 & s_y & 0 & 0 \\ 0 & 0 & s_z & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}

Note, there's a translation applied in the figure to make things easier to see.

Rotation Rotate the coordinate system about an axis in space. Note, there's a translation applied in the figure to make things easier to see.

Rotation (cont'd) For a unit axis \hat{u} = (u_x, u_y, u_z) = \vec{v} / \|\vec{v}\|, let S be the skew-symmetric cross-product matrix:

S = \begin{pmatrix} 0 & -u_z & u_y \\ u_z & 0 & -u_x \\ -u_y & u_x & 0 \end{pmatrix}

M = \hat{u}\hat{u}^t + \cos(\theta)\,(I - \hat{u}\hat{u}^t) + \sin(\theta)\,S

R(\theta) = \begin{pmatrix} M & 0 \\ 0 & 1 \end{pmatrix}
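The axis-angle formula above can be checked numerically. This sketch (function name illustrative) builds M term by term and applies it to a vector:

```cpp
#include <cassert>
#include <cmath>

// Rotation about a unit axis u by angle theta (radians), built as
// M = u u^T + cos(theta)(I - u u^T) + sin(theta) S(u),
// where S(u) is the skew-symmetric cross-product matrix.
void rotate3(const double u[3], double theta, const double v[3],
             double out[3]) {
    double c = std::cos(theta), s = std::sin(theta);
    double S[3][3] = {{0, -u[2], u[1]}, {u[2], 0, -u[0]}, {-u[1], u[0], 0}};
    double M[3][3];
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j) {
            double kron = (i == j) ? 1.0 : 0.0;  // identity matrix entry
            M[i][j] = u[i]*u[j] + c*(kron - u[i]*u[j]) + s*S[i][j];
        }
    for (int i = 0; i < 3; ++i)
        out[i] = M[i][0]*v[0] + M[i][1]*v[1] + M[i][2]*v[2];
}
```

Rotating the x-axis vector (1, 0, 0) about the z axis by 90 degrees yields (0, 1, 0), as expected.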

Exercises Consider the following excerpt of code in OpenGL 3.3. Assume that a stack mechanism, like in AVT_MathLib, was implemented for all three types of matrices MODEL, VIEW and PROJECTION used by the Geometric Transform stage of the OpenGL pipeline.

void renderScene(void) {
    .
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    loadIdentity(VIEW);
    loadIdentity(MODEL);
    lookAt(0.0, 0.0, -2.0, 2.0, 2.0, -2.0, 0.0, 0.0, 1.0);
    pushMatrix(MODEL);
    translate(MODEL, 2.0f, 2.5f, 1.0f);
    send_matrices(); // send the matrices PROJECTION, VIEW and MODEL to GLSL
    draw_obj1();
    pushMatrix(MODEL);
    scale(MODEL, 1.5f, 1.5f, 1.5f);
    send_matrices(); // send the matrices PROJECTION, VIEW and MODEL to GLSL
    draw_obj2();
    popMatrix(MODEL);
    popMatrix(MODEL);
    send_matrices(); // send the matrices PROJECTION, VIEW and MODEL to GLSL
    draw_obj3();

Calculate the matrix VIEW sent to the GLSL in order to draw each of the three objects. Calculate the matrix MODEL sent to the GLSL in order to draw each of the three objects.

Vertex Shader for Rotation of Cube in vec4 vPosition; in vec4 vColor; out vec4 color; uniform vec3 theta; void main() { // Compute the sines and cosines of theta for // each of the three axes in one computation. vec3 angles = radians( theta ); vec3 c = cos( angles ); vec3 s = sin( angles );

Vertex Shader for Rotation of Cube (cont'd) // Remember: these matrices are column-major mat4 rx = mat4( 1.0, 0.0, 0.0, 0.0, 0.0, c.x, s.x, 0.0, 0.0, -s.x, c.x, 0.0, 0.0, 0.0, 0.0, 1.0 ); mat4 ry = mat4( c.y, 0.0, -s.y, 0.0, 0.0, 1.0, 0.0, 0.0, s.y, 0.0, c.y, 0.0, 0.0, 0.0, 0.0, 1.0 );

Vertex Shader for Rotation of Cube (cont'd) mat4 rz = mat4( c.z, -s.z, 0.0, 0.0, s.z, c.z, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0 ); color = vColor; gl_Position = rz * ry * rx * vPosition; }

Sending Angles from Application Here, we compute our angles (Theta) in our mouse callback.

GLuint theta; // theta uniform location
vec3 Theta;   // axis angles

void display( void )
{
    glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
    glBindVertexArray( vao );
    glUseProgram( program );
    glUniform3fv( theta, 1, Theta );
    glDrawArrays( GL_TRIANGLES, 0, NumVertices );
    glutSwapBuffers();
}

Lighting

Lighting Principles Lighting simulates how objects reflect light: the material composition of the object, the light's color and position, and global lighting parameters. It is usually implemented in the vertex shader for speed, or in the fragment shader for nicer shading.

Modified Phong Model Computes a color for each vertex using surface normals, diffuse and specular reflections, the viewer's position and viewing direction, ambient light, and emission. Vertex colors are interpolated across polygons by the rasterizer. Phong shading does the same computation per pixel, interpolating the normal across the polygon, which gives more accurate results.

Surface Normals Normals define how a surface reflects light. The application usually provides normals as a vertex attribute. The current normal is used to compute the vertex's color. Use unit normals for proper lighting; scaling affects a normal's length.

Phong Lighting Model [Figure: vectors l, n, r, v at point P; the specular highlight is narrow when the exponent alpha is large and broad when it is small.]

I = k_a L_a + k_d L_d \max(\vec{l} \cdot \vec{n}, 0) + k_s L_s \max(\vec{r} \cdot \vec{v}, 0)^{\alpha}

Blinn Approximation Computation of r is expensive. Instead, the halfway vector is used: the normal to a hypothetical purely reflecting facet, the normalized mean vector

h = \frac{\vec{l} + \vec{v}}{\|\vec{l} + \vec{v}\|}

with unit vectors l and v. [Figure: vectors l, n, h, v.]

Blinn Approximation Estimation of the specular component: use n . h instead of r . v, selecting a value for the shininess exponent that guarantees (n . h) raised to it approximates (r . v)^alpha.

Modified Phong Lighting Model (Blinn-Phong Lighting Model)

I = k_a L_a + \sum_i \frac{1}{a + b d_i + c d_i^2} \left( k_d L_i \max(\vec{l}_i \cdot \vec{n}, 0) + k_s L_i \max(\vec{n} \cdot \vec{h}_i, 0)^{\beta} \right)

[Figure: lights L_1, L_2, L_3 with directions l_1, l_2, l_3, and vectors h, n, v, r at point P.]
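A minimal numeric sketch of one light's contribution in this model (scalars stand in for RGB channels; the function name and parameters are illustrative, not from the slides):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Ambient term plus one light's attenuated Blinn-Phong contribution:
// ka*La + [1/(a + b*d + c*d^2)] * (kd*L*max(l.n,0) + ks*L*max(n.h,0)^beta)
double blinnPhong(double ka, double La, double kd, double ks, double L,
                  double ldotn, double ndoth, double beta,
                  double a, double b, double c, double d) {
    double att = 1.0 / (a + b*d + c*d*d);          // distance attenuation
    double diffuse  = kd * L * std::max(ldotn, 0.0);
    double specular = ks * L * std::pow(std::max(ndoth, 0.0), beta);
    return ka * La + att * (diffuse + specular);
}
```

With l . n = n . h = 1 and no attenuation (a = 1, b = c = 0), the result is simply ka*La + kd*L + ks*L; doubling the constant attenuation term halves the light-dependent part but leaves the ambient term untouched.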

Material Properties Define the surface properties of a primitive:

Diffuse - base object color
Specular - highlight color
Ambient - low-light color
Emission - glow color
Shininess - surface smoothness

You can have separate materials for front and back faces.

Adding Lighting to Cube // vertex shader in vec4 vPosition; in vec3 vNormal; out vec4 color; uniform vec4 AmbientProduct, DiffuseProduct, SpecularProduct; uniform mat4 ModelView; uniform mat4 Projection; uniform vec4 LightPosition; uniform float Shininess;

Adding Lighting to Cube (cont'd) void main() { // Transform vertex position into eye coordinates vec3 pos = vec3( ModelView * vPosition ); vec3 L = normalize( LightPosition.xyz - pos ); vec3 E = normalize( -pos ); vec3 H = normalize( L + E ); // Transform vertex normal into eye coordinates vec3 N = normalize( vec3(ModelView * vec4(vNormal, 0.0)) );

Adding Lighting to Cube (cont'd) // Compute terms in the illumination equation vec4 ambient = AmbientProduct; float Kd = max( dot(L, N), 0.0 ); vec4 diffuse = Kd * DiffuseProduct; float Ks = pow( max(dot(N, H), 0.0), Shininess ); vec4 specular = Ks * SpecularProduct; if( dot(L, N) < 0.0 ) specular = vec4(0.0, 0.0, 0.0, 1.0); gl_Position = Projection * ModelView * vPosition; color = ambient + diffuse + specular; color.a = 1.0; }

Fragment Shaders

Fragment Shaders
A shader that's executed for each potential pixel; fragments still need to pass several tests before making it to the framebuffer. There are lots of effects we can do in fragment shaders: per-fragment lighting, texture and bump mapping, environment (reflection) maps.

Shader Examples
Vertex shaders:
  Moving vertices: height fields
  Per-vertex lighting: height fields
  Per-vertex lighting: cartoon shading
Fragment shaders:
  Per-vertex vs. per-fragment lighting: cartoon shader
  Samplers: reflection map
  Bump mapping

Height Fields
A height field is a function y = f(x, z), where y represents the height of a point for a location in the x-z plane. Height fields are usually rendered as a rectangular mesh of triangles or rectangles sampled from a grid: samples y_ij = f(x_i, z_j).

Displaying a Height Field
First, generate mesh data and use it to initialize data for a VBO:

float dx = 1.0 / N, dz = 1.0 / N;
for ( int i = 0; i < N; ++i ) {
    float x = i * dx;
    for ( int j = 0; j < N; ++j ) {
        float z = j * dz;
        float y = f( x, z );
        vertex[index++] = vec3( x, y, z );
        vertex[index++] = vec3( x, y, z + dz );
        vertex[index++] = vec3( x + dx, y, z + dz );
        vertex[index++] = vec3( x + dx, y, z );
    }
}

Finally, display each quad:

for ( int i = 0; i < NumVertices; i += 4 )
    glDrawArrays( GL_LINE_LOOP, i, 4 );

Time-Varying Vertex Shader
in vec4 vPosition;
in vec4 vColor;

uniform float time;  // in milliseconds
uniform mat4 ModelViewProjectionMatrix;

void main()
{
    vec4 v = vPosition;
    vec4 u = sin( time + 5.0 * v );
    v.y = 0.1 * u.x * u.z;
    gl_Position = ModelViewProjectionMatrix * v;
}

Mesh Display

Adding Lighting
Solid mesh: create two triangles for each quad and display with

glDrawArrays( GL_TRIANGLES, 0, NumVertices );

For better-looking results, we'll add lighting. We'll do per-vertex lighting to leverage the vertex shader, since we'll also use it to vary the mesh in a time-varying way.

Mesh Shader
in vec4 vPosition;
in vec3 Normal;
out vec4 color;

uniform float time, shininess;
uniform vec4 lightPosition, diffuseLight, specularLight;
uniform mat4 ModelViewMatrix, ModelViewProjectionMatrix;
uniform mat3 NormalMatrix;

void main()
{
    vec4 v = vPosition;
    vec4 u = sin( time + 5.0 * v );
    v.y = 0.1 * u.x * u.z;
    gl_Position = ModelViewProjectionMatrix * v;

    vec4 diffuse, specular;
    vec4 eyePosition = ModelViewMatrix * vPosition;
    vec4 eyeLightPos = lightPosition;

Mesh Shader (cont'd)
    vec3 N = normalize( NormalMatrix * Normal );
    vec3 L = normalize( vec3(eyeLightPos - eyePosition) );
    vec3 E = -normalize( eyePosition.xyz );
    vec3 H = normalize( L + E );

    float Kd = max( dot(L, N), 0.0 );
    float Ks = pow( max(dot(N, H), 0.0), shininess );
    diffuse = Kd * diffuseLight;
    specular = Ks * specularLight;
    color = diffuse + specular;
}

Shaded Mesh

Texture Mapping

Texture Mapping
(figure: geometry in x-y-z space is rasterized to the screen, with colors sampled from an image indexed by s-t texture coordinates)

Texture Mapping and the OpenGL Pipeline
Images and geometry flow through separate pipelines that join at the rasterizer: complex textures do not affect geometric complexity.
(diagram: vertices flow through the geometry pipeline and pixels through the pixel pipeline; both meet at the rasterizer, which feeds the fragment shader)

Applying Textures
Three basic steps to applying a texture:
1. Specify the texture: read or generate an image, assign it to a texture, enable texturing
2. Assign texture coordinates to vertices
3. Specify texture parameters: wrapping, filtering

Texture Objects
Have OpenGL store your images: one image per texture object; an object may be shared by several graphics contexts. Generate texture names with

glGenTextures( n, *texIds );

Texture Objects (cont'd)
Create texture objects with texture data and state, and bind textures before using them:

glBindTexture( target, id );

Specifying a Texture Image
Define a texture image from an array of texels in CPU memory:

glTexImage2D( target, level, internalFormat, w, h, border, format, type, *texels );

Mapping a Texture
Based on parametric texture coordinates; coordinates need to be specified at each vertex.
(figure: texture space spanning (0, 0) to (1, 1) along s and t; triangle vertices a, b, c map to object-space vertices A, B, C with (s, t) = (0.2, 0.8), (0.4, 0.2), and (0.8, 0.4) respectively)

Applying the Texture in the Shader
in vec2 texCoord;

// Declare the sampler
uniform float intensity;
uniform sampler2D diffuseMaterialTexture;

// Apply the material color
vec3 diffuse = intensity * texture(diffuseMaterialTexture, texCoord).rgb;

Applying Texture to Cube
// add texture coordinate attribute to quad function
quad( int a, int b, int c, int d )
{
    vColors[Index] = colors[a];
    vPositions[Index] = positions[a];
    vTexCoords[Index] = vec2( 0.0, 0.0 );
    Index++;

    vColors[Index] = colors[b];
    vPositions[Index] = positions[b];
    vTexCoords[Index] = vec2( 1.0, 0.0 );
    Index++;

    // rest of vertices
}

Creating a Texture Image
// Create a checkerboard pattern
for ( int i = 0; i < 64; i++ ) {
    for ( int j = 0; j < 64; j++ ) {
        // note the parentheses: == binds tighter than &
        GLubyte c = (((i & 0x8) == 0) ^ ((j & 0x8) == 0)) * 255;
        image[i][j][0] = c;
        image[i][j][1] = c;
        image[i][j][2] = c;
        image2[i][j][0] = c;
        image2[i][j][1] = 0;
        image2[i][j][2] = c;
    }
}

Texture Object
GLuint textures[1];
glGenTextures( 1, textures );

glActiveTexture( GL_TEXTURE0 );
glBindTexture( GL_TEXTURE_2D, textures[0] );
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB, TextureSize, TextureSize, 0,
              GL_RGB, GL_UNSIGNED_BYTE, image );

glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST );

Vertex Shader
in vec4 vPosition;
in vec4 vColor;
in vec2 vTexCoord;

out vec4 color;
out vec2 texCoord;

void main()
{
    color = vColor;
    texCoord = vTexCoord;
    gl_Position = vPosition;
}

Fragment Shader
in vec4 color;
in vec2 texCoord;

out vec4 fColor;

uniform sampler2D tex;

void main()
{
    fColor = color * texture( tex, texCoord );
}

Q & A Thanks for Coming!

Resources

Books
Modern discussion:
  The OpenGL Programming Guide, 8th Edition
  Interactive Computer Graphics: A Top-down Approach using OpenGL, 6th Edition
  The OpenGL SuperBible, 5th Edition
Older resources:
  The OpenGL Shading Language Guide, 3rd Edition
  OpenGL and the X Window System
  OpenGL Programming for Mac OS X
  OpenGL ES 2.0 Programming Guide
Not quite yet:
  WebGL Programming Guide: Interactive 3D Graphics Programming with WebGL

Online Resources The OpenGL Website: www.opengl.org API specifications Reference pages and developer resources Downloadable OpenGL (and other APIs) reference cards Discussion forums The Khronos Website: www.khronos.org Overview of all Khronos APIs Numerous presentations

Thanks! Feel free to drop us any questions: angel@cs.unm.edu shreiner@siggraph.org Course notes and programs available at www.daveshreiner.com/siggraph www.cs.unm.edu/~angel