Computer Graphics Exercise 15/16
3. Shading & Texturing
Jakob Wagner
for internal use only
Shaders
The programmable pipeline:
- Vertex Specification: define vertex format & data in model space
- Vertex Processing: transform to clip space (Vertex Shader)
- Vertex Post-processing: clipping, perspective divide, viewport transform
- Primitive Assembly: build points/lines/triangles from vertices
- Rasterisation: scan-convert primitives to fragments, property transfer
- Fragment Processing: compute fragment properties (Fragment Shader)
- Fragment Submission: testing and blending, write to buffer
https://glumpy.github.io/modern-gl.html
Shaders
- contain instructions for the Vertex/Fragment Processing stages
- use C-like syntax (GLSL)
- two types of variables:
  - Uniforms: explicitly uploaded from the CPU, value stays constant until a new value is uploaded
  - Varyings: shader inputs/outputs, varying between shader instances
- built-in variables: the minimal Vertex Shader outputs / Fragment Shader inputs necessary for pipeline processing, e.g. the vertex position output in the Vertex Shader and the fragment position input in the Fragment Shader
Vertex Shader
Vertex Attributes (varying input):
- 1st is position in Model Space
- 2nd is normal in Model Space
Uniforms (constant input):
- transformation matrices, uploaded with glUniformMatrix*v
Values passed to the Fragment Shader (varying output):
- normal in View Space
Vertex Processing:
- gl_Position (predefined output) is transformed from Model to Projection Space
- the normal is transformed from Model to View Space

```glsl
layout(location = 0) in vec3 in_position;
layout(location = 1) in vec3 in_normal;

uniform mat4 ModelMatrix;
uniform mat4 ViewMatrix;
uniform mat4 ProjectionMatrix;
uniform mat4 NormalMatrix;

out vec4 pass_normal;

void main() {
  gl_Position = (ProjectionMatrix * ViewMatrix * ModelMatrix) * vec4(in_position, 1.0f);
  pass_normal = normalize(NormalMatrix * vec4(in_normal, 0.0f));
}
```
Fragment Shader
Values interpolated from the vertices (varying input):
- normal in View Space
Values passed to Sample Processing (varying output):
- fragment color, 4th component is opacity
Fragment Processing:
- assign the normal's XYZ components to the fragment's RGB components and make the fragment opaque

```glsl
in vec4 pass_normal;
out vec4 out_color;

void main() {
  out_color = vec4(pass_normal.xyz, 1.0f);
}
```
Shader Uniforms
- are part of the ShaderProgram state
- value does not change until a new value is uploaded
- are shared between all stages of a ShaderProgram
Uploading a uniform value to a program:
a. query the uniform location, using the program handle and the uniform's name in the shader
b. bind the shader program
c. upload the value to the uniform location

```c
int location = glGetUniformLocation(program_handle, "NameInShader");
glUseProgram(program_handle);
glUniformMatrix*v(location, ..., data_ptr);  /* e.g. glUniformMatrix4fv */
```

- querying a location is expensive -> store it and only update it when the shader is reloaded
- querying locations in a ShaderProgram does not require the program to be bound
- the location of a uniform of the same name varies between ShaderPrograms
Star Shader
- different vertex layout? -> new vertex shader required
- 2 inputs: position (vec3) and color (vec3)
- the color needs to be passed to the fragment shader -> the vertex shader needs a color output variable
- the fragment shader assigns the input color to the fragment
- no Model matrix is necessary, because the stars are not transformed
- in update_shader_programs() the star shader needs to be (re)loaded
- the shader needs the View and Projection matrices
- uniform locations vary between shaders -> the matrix locations in the star shader need to be stored in a global variable
- in update_uniform_locations() the matrix locations in the star shader need to be queried and updated
- when modified in update_view() and update_camera(), the matrices need to be uploaded to the star shader at their respective locations
Rasterisation
(same pipeline diagram as before, now highlighting the Rasterisation stage: scan-convert primitives to fragments, property transfer)
https://glumpy.github.io/modern-gl.html
Property Transfer
Variable definition in the Vertex Shader:
  out vec3 pass_color;
  out vec4 gl_Position; // predefined
Variable definition in the Fragment Shader:
  in vec3 pass_color;
  in vec4 gl_FragCoord; // predefined
Variable interpolation (d1, d2, d3: barycentric weights of the fragment with respect to vertices v1, v2, v3):
  v1: pass_color1 = vec3(0.0f, 0.0f, 1.0f), gl_Position1 = vec4(10.0f, 20.0f, 0.0f)
  v2: pass_color2 = vec3(1.0f, 0.0f, 0.0f), gl_Position2 = vec4(0.0f, 0.0f, 0.0f)
  v3: pass_color3 = vec3(0.0f, 1.0f, 0.0f), gl_Position3 = vec4(20.0f, 0.0f, 0.0f)
  pass_color  = d1 * pass_color1  + d2 * pass_color2  + d3 * pass_color3
  gl_FragCoord = d1 * gl_Position1 + d2 * gl_Position2 + d3 * gl_Position3
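The property transfer above can be sketched in a few lines of Python (not part of the exercise framework, just an illustration): each fragment attribute is the weighted sum of the three vertex attributes, with barycentric weights that sum to 1.

```python
def interpolate(weights, attrs):
    """Barycentric property transfer: weighted sum of the three vertex attributes."""
    return tuple(sum(w * a[i] for w, a in zip(weights, attrs))
                 for i in range(len(attrs[0])))

# the vertex colors from the slide
pass_color = [(0.0, 0.0, 1.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]

# a fragment exactly on v1 gets v1's color
print(interpolate((1.0, 0.0, 0.0), pass_color))  # -> (0.0, 0.0, 1.0)
# a fragment at the centroid mixes all three colors equally
print(interpolate((1/3, 1/3, 1/3), pass_color))
```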
Perspective-correct Interpolation
- z-coordinates (depth) are not linear after the projective transformation:
  - very good depth accuracy close to the camera
  - low depth accuracy close to the far plane
  - the output distance between points depends on their distance to the camera
- linear depth: (v2 - v1) / (v4 - v3) = d1-2 / d3-4
- non-linear depth: (v2 - v1) / (v4 - v3) ≠ d1-2 / d3-4
- fragment property interpolation takes place after the perspective projection:
  - linear interpolation leads to wrong results
  - perspective-correct interpolation is necessary
  - explanation e.g. in Low, Perspective-Correct Interpolation, 2002
http://learnopengl.com/#!advanced-opengl/depth-testing
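A minimal Python sketch of the difference (illustration only): perspective-correct interpolation divides each attribute by its vertex's clip-space w, interpolates linearly in screen space, and then undoes the division. For vertices at different depths the result is biased toward the nearer vertex, unlike plain linear interpolation.

```python
def linear_interp(d, values):
    """Plain linear interpolation with weights d."""
    return sum(di * vi for di, vi in zip(d, values))

def perspective_correct(d, values, w):
    """Interpolate values/w linearly, then divide by the interpolated 1/w."""
    num = sum(di * vi / wi for di, vi, wi in zip(d, values, w))
    den = sum(di / wi for di, wi in zip(d, w))
    return num / den

# an attribute spanning two vertices, one near (w = 1) and one far (w = 10)
vals = [0.0, 1.0]
w = [1.0, 10.0]
mid = [0.5, 0.5]  # screen-space midpoint

print(linear_interp(mid, vals))           # -> 0.5
print(perspective_correct(mid, vals, w))  # -> ~0.091, pulled toward the near vertex
```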
Interpolation Qualifiers
- interpolation can be specified in GLSL:
  - flat: no interpolation, the value of one (provoking) vertex is used for the whole primitive -> Flat Shading
  - smooth: default, perspective-correct interpolation
  - noperspective: linear interpolation in screen space
- the qualifier for a variable must match between shaders
- normal vector interpolation: flat (1 normal per triangle) vs. non-flat (1 normal per vertex)
- texture coordinate interpolation: smooth (perspective-correct) vs. noperspective (linear)
https://commons.wikimedia.org/wiki/file:phong-shading-sample.jpg
http://s19.photobucket.com/user/coincoin/media/perspective-correction.png.html
Phong Reflection Model
Components:
- ambient: indirect light incoming from the general surroundings -> constant
- diffuse: Lambertian reflectance, diffusely reflected light from surface microfacets -> dependent on the angle α between the surface normal n and the light direction l
- specular: reflection of light directly to the viewer -> dependent on the angle ω between the viewer direction v and the light direction reflected off the surface, r -> specular highlight decay (glossiness) controlled by the exponent a
Formula (k_a, k_d, k_s, a: material parameters; i_a, i_d, i_s: light parameters):
  I = k_a i_a + k_d i_d cos(α) + k_s i_s cos(ω)^a
    = k_a i_a + k_d i_d <l, n> + k_s i_s <v, r>^a
since for unit vectors v1 and v2: cos(∠(v1, v2)) = <v1, v2> (dot product)
https://commons.wikimedia.org/wiki/file:phong_components_version_4.png
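The formula can be evaluated directly; here is a small Python sketch (illustration only, not framework code) with the usual clamping of negative dot products to zero:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def reflect(l, n):
    """Reflect the (unit) light direction l about the (unit) normal n: r = 2<l,n>n - l."""
    d = dot(l, n)
    return tuple(2 * d * ni - li for li, ni in zip(l, n))

def phong(ka, kd, ks, a, ia, idiff, ispec, l, n, v):
    l, n, v = normalize(l), normalize(n), normalize(v)
    r = reflect(l, n)
    diffuse = max(dot(l, n), 0.0)       # cos(alpha)
    specular = max(dot(v, r), 0.0) ** a  # cos(omega)^a
    return ka * ia + kd * idiff * diffuse + ks * ispec * specular

# light and viewer both straight above a horizontal surface:
# the viewer sits exactly in the reflection direction, so all terms are at maximum
print(phong(0.1, 0.6, 0.3, 16, 1.0, 1.0, 1.0, (0, 1, 0), (0, 1, 0), (0, 1, 0)))
# -> 0.1 + 0.6 + 0.3 = 1.0
```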
Blinn-Phong Reflection Model
- the reflection operation is computationally expensive -> Blinn's approximation does not require a reflection
- Blinn-Phong model: instead of the angle between the viewer and the reflected light direction, use the angle ρ between the normal n and the halfway vector h between viewer and light:
  h = (l + v) / ǁl + vǁ = normalize(v + l)
  I = k_a i_a + k_d i_d cos(α) + k_s i_s cos(ρ)^b
    = k_a i_a + k_d i_d <l, n> + k_s i_s <n, h>^b
- with the same exponent as the Phong model the highlight is too small -> use b ≈ 4a
- the Blinn approximation is actually empirically more accurate than Phong
https://commons.wikimedia.org/wiki/file:blinn_phong_comparison.png
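Only the specular term changes compared to Phong; a Python sketch of the halfway-vector version (illustration only):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def blinn_phong_specular(l, v, n, b):
    """Specular term <n, h>^b using the halfway vector, no reflection needed."""
    l, v, n = normalize(l), normalize(v), normalize(n)
    h = normalize(tuple(li + vi for li, vi in zip(l, v)))
    return max(dot(n, h), 0.0) ** b

# viewer and light both straight above the surface: h coincides with n, full highlight
print(blinn_phong_specular((0, 1, 0), (0, 1, 0), (0, 1, 0), 64))  # -> 1.0
```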
Blinn-Phong Implementation
- shading should be computed in View Space
 -> requires the light and fragment positions in View Space
 -> in the Vertex Shader, calculate the position in View Space and pass it to the Fragment Shader
 -> upload a uniform with the sun position (which is vec3(0.0f, 0.0f, 0.0f) in World Space) in View Space to the shader using glUniform3f() or glUniform3fv(), and update the value whenever the View Matrix changes
- the light color properties can be hardcoded in the Fragment Shader
- the planet diffuse color can be assigned through a single vec3 uniform that is set to the respective color before each planet is drawn
- the planet ambient color can be assumed to be the same as the diffuse color
- the planet specular color can be assumed to be white
- before calculating angles with the dot product, both vectors need to be normalized
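Transforming the World-Space sun position into View Space just means applying the View Matrix to it as a point (w = 1). A small Python sketch (illustration only; the matrix values are made up for the example):

```python
def transform_point(matrix, point):
    """Apply a 4x4 matrix (row-major nested lists) to a 3D point with w = 1."""
    p = (*point, 1.0)
    out = [sum(matrix[r][c] * p[c] for c in range(4)) for r in range(4)]
    return tuple(out[:3])

# example view matrix: camera moved 5 units back, i.e. the world shifts by -5 in z
view = [[1, 0, 0,  0],
        [0, 1, 0,  0],
        [0, 0, 1, -5],
        [0, 0, 0,  1]]

sun_world = (0.0, 0.0, 0.0)
print(transform_point(view, sun_world))  # -> (0.0, 0.0, -5.0)
```

The resulting vec3 is what would be uploaded with glUniform3fv whenever the View Matrix changes.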
Texture Mapping
- creating all surface details with modeling is too expensive -> paint the details on a texture and project it onto the surface
- project the texture onto the model by assigning each vertex a coordinate on the texture (UV coordinates)
- steps: define 3D coordinates, define texture coordinates, apply the texture to the fragments
- texture coordinates lie in the unit square, u, v ∈ [0, 1] with (0, 0) at the bottom left; e.g. v1 = (0.5, 0.8), v2 = (0.1, 0.1), v3 = (0.9, 0.1)
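Conceptually, applying the texture to a fragment means mapping its interpolated UV coordinates to a texel. A tiny Python sketch of nearest-texel lookup (illustration only, not how the GPU sampler is implemented):

```python
def sample_nearest(texture, u, v):
    """Nearest-texel lookup: map UV in [0,1]^2 to a texel of a row-major texture.
    Row 0 is treated as the bottom of the image, matching (0,0) = bottom left."""
    height = len(texture)
    width = len(texture[0])
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return texture[y][x]

# a tiny 2x2 checkerboard "texture" (0 = black, 1 = white)
tex = [[0, 1],
       [1, 0]]
print(sample_nearest(tex, 0.1, 0.1))  # bottom-left texel -> 0
print(sample_nearest(tex, 0.9, 0.1))  # bottom-right texel -> 1
```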
Texture Specification
Concept vs. Implementation (diagram):
- Texture Storage: Texture Data (uploaded with glTexImage*) and Texture Format (defined with glTexImage*)
- Texture Parameters: Sampling Parameters (defined with glTexParameter*)
- a texture can also serve as attachment of a Framebuffer Object
Texture Binding
- the OpenGL context has multiple Texture Units, named GL_TEXTURE*
- each Texture Unit has binding points for each texture type, like GL_TEXTURE_2D, GL_PROXY_TEXTURE_1D_ARRAY etc.
- there is always exactly one active Texture Unit
- all manipulation functions addressed to a texture binding point are applied to the object bound at the active Texture Unit's binding point
Texture Access
- in the shader, textures are accessed through sampler uniforms
- the sampler type defines which binding point is accessed
- a sampler holds as its value an integer with the index of the Texture Unit it should access
- the index must be uploaded to the sampler with the glUniform1i() function
- if two samplers of different type access the same unit, rendering will fail
- one Texture Object can be bound to multiple Texture Units
- the active Texture Unit has no effect on this process

```glsl
// index k uploaded from the CPU with glUniform1i(tex_location, k):
uniform sampler2D ColorTex;  // reads the GL_TEXTURE_2D binding point of unit GL_TEXTUREk
```
Texture Specification
Prepare for formatting:
1. activate the Texture Unit to which the texture will be bound
2. generate a Texture Object
3. bind the Texture Object to the 2D texture binding point of the unit

```c
glActiveTexture(GL_TEXTURE0);
glGenTextures(1, &texture_object);
glBindTexture(GL_TEXTURE_2D, texture_object);
```

Define mandatory sampling parameters:
4. define the interpolation type for when a fragment covers multiple texels (texture pixels)
5. define the interpolation type for when a fragment does not exactly cover one texel

```c
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
```

Define texture data and format:
6. format the Texture Object bound to the 2D binding point: with no mipmaps, data storage format, resolution, no border, input data format, channel type, and the data to upload

```c
glTexImage2D(GL_TEXTURE_2D, 0, internal_format, width, height, 0,
             input_format, channel_type, data_ptr);
```
Texture Formatting
glTexImage*(target, level, internal_format, width, height, border, input_format, channel_type, data_ptr)
- target: binding point on which to create the new image
- level: detail level to create the image in; 0 when not using mip-maps
- internal_format: specifies the number of color components: GL_RED, GL_RG, GL_RGB, GL_RGBA, or a special sized or compressed format
- width, height: the texture dimensions
- border: must be 0 (previously the width of a colored border)
- input_format: the format of the input data; like internal_format, but with additional types for compatibility, e.g. GL_BGR, or GL_RED_INTEGER for unnormalized data
- channel_type: datatype of the pixel data channels: GL_BYTE, GL_FLOAT, GL_INT, or packed types like GL_UNSIGNED_BYTE_3_3_2
Texture Usage
Prepare for rendering:
1. activate the Texture Unit to which the texture is bound
2. bind the Texture Object to the 2D texture binding point of the unit

```c
glActiveTexture(GL_TEXTUREk);
glBindTexture(GL_TEXTURE_2D, texture_object);
```

Upload the unit index to the shader:
3. get the location of the sampler uniform
4. bind the shader for uniform uploading
5. upload the index of the unit to the sampler

```c
int color_sampler_location = glGetUniformLocation(program_handle, "ColorTex");
glUseProgram(program_handle);
glUniform1i(color_sampler_location, k);
```

Use the sampler in the shader:
6. declare the sampler variable
7. read data from the sampler

```glsl
uniform sampler2D ColorTex;
vec4 color = texture(ColorTex, tex_coordinate);
```
Planet Texturing
Texture Creation:
- load png or tga files with the texture_loader::file() function
- the returned texture struct contains all variables necessary for specifying the texture format
- create a Texture Object from the texture struct for each planet
- query and store the location of the sampler uniform for the color texture
- upload the index of the Texture Unit you want to use for the color texture
- activate the Texture Unit that you want to use
- before drawing each planet, bind the respective Texture Object
Planet Texturing
Texture Coordinates:
- request the model loader to load the texture coordinates by replacing the last parameter model::normal of the model_loader::obj() function with model::normal | model::texcoord
- in the planet_object initialisation, add another attribute of the type model::texcoord
Texture Mapping:
- in the planet vertex shader, add another input attribute vec2 in_texcoord that is directly assigned to an output variable vec2 pass_texcoord
- in the fragment shader, add another input vec2 pass_texcoord
- look up the pixel color at pass_texcoord and use it as the diffuse and ambient color in the shading computation