Computergraphics Exercise 15/16 3. Shading & Texturing


Computergraphics Exercise 15/16, 3: Shading & Texturing. Jakob Wagner (for internal use only)

Shaders

Pipeline stages (https://glumpy.github.io/modern-gl.html):
- Vertex Specification: define vertex format & data in model space
- Vertex Processing (Vertex Shader): transform to clip space
- Vertex Post-processing: clipping, perspective divide, viewport transform
- Primitive Assembly: build points/lines/triangles from vertices
- Rasterisation: scan-convert primitives to fragments, property transfer
- Fragment Processing (Fragment Shader): compute fragment properties
- Fragment Submission: testing and blending, write to buffer

Shaders
- contain the instructions for the Vertex/Fragment Processing stages
- use a C-like syntax (GLSL)
- two types of user-defined variables:
  - Uniforms: explicitly uploaded from the CPU, keep a constant value until a new value is uploaded
  - Varyings: shader inputs/outputs, varying between shader instances
- built-in variables: the minimal Vertex Shader outputs / Fragment Shader inputs necessary for pipeline processing, e.g. the vertex position output in the Vertex Shader and the fragment position input in the Fragment Shader

Vertex Shader

Vertex attributes (varying input): the 1st is the position in Model Space, the 2nd is the normal in Model Space.
Uniforms (constant input): transformation matrices, uploaded with glUniform*v.
Values passed to the Fragment Shader (varying output): the normal in View Space.
Vertex processing: the position (predefined output) is transformed from Model Space to Clip Space; the normal is transformed from Model Space to View Space.

    layout(location = 0) in vec3 in_position;
    layout(location = 1) in vec3 in_normal;

    uniform mat4 ModelMatrix;
    uniform mat4 ViewMatrix;
    uniform mat4 ProjectionMatrix;
    uniform mat4 NormalMatrix;

    out vec4 pass_normal;

    void main() {
        gl_Position = (ProjectionMatrix * ViewMatrix * ModelMatrix) * vec4(in_position, 1.0);
        pass_normal = normalize(NormalMatrix * vec4(in_normal, 0.0));
    }

Fragment Shader

Values interpolated from the vertices (varying input): the normal in View Space.
Values passed to sample processing (varying output): the fragment color; the 4th component is opacity.
Fragment processing: assign the fragment's RGB components the value of the normal's XYZ components and make it opaque.

    in vec4 pass_normal;
    out vec4 out_color;

    void main() {
        out_color = vec4(pass_normal.xyz, 1.0);
    }

Shader Uniforms
- are part of the ShaderProgram state
- their value does not change until a new value is uploaded
- are shared between all stages of a ShaderProgram

Uploading a uniform value to a program:
a. query the uniform location using the program handle and the uniform's name in the shader
b. bind the shader program
c. upload the value to the uniform location

    int location = glGetUniformLocation(program_handle, "NameInShader");
    glUseProgram(program_handle);
    glUniformMatrix*v(location, ..., data/data_ptr);

Notes:
- querying a location is expensive -> store it and only update it when the shader is reloaded
- querying locations in a ShaderProgram does not require it to be bound
- the location of a uniform of the same name varies between ShaderPrograms
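Since querying a location is expensive, a common pattern is to cache locations per ShaderProgram and clear the cache on shader reload. A minimal C++ sketch of that pattern; `query_location` is a hypothetical stand-in for glGetUniformLocation so the example stays runnable without a GL context:

```cpp
#include <map>
#include <string>
#include <utility>

// Hypothetical stand-in for glGetUniformLocation(program, name);
// counts calls so we can observe how often the "driver" is queried.
static int query_count = 0;
int query_location(unsigned program, const std::string& name) {
    ++query_count;
    return static_cast<int>(name.size());  // dummy location value
}

// Cache keyed by (program handle, uniform name): locations of the same
// name vary between ShaderPrograms, so the handle must be part of the key.
struct UniformCache {
    std::map<std::pair<unsigned, std::string>, int> locations;

    int get(unsigned program, const std::string& name) {
        auto key = std::make_pair(program, name);
        auto it = locations.find(key);
        if (it != locations.end()) return it->second;  // cache hit: no query
        int loc = query_location(program, name);
        locations[key] = loc;
        return loc;
    }

    // Call after a shader reload: the stored locations may have changed.
    void invalidate() { locations.clear(); }
};
```

Repeated `get()` calls for the same program and name then hit the cache instead of the driver; `invalidate()` forces a fresh query after update_shader_programs().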

Star Shader
- different vertex layout? -> new vertex shader required
- 2 inputs: position (vec3) and color (vec3)
- the color needs to be passed to the fragment shader -> the vertex shader needs a color output variable
- the fragment shader assigns the input color to the fragment
- a Model matrix is not necessary because no transformation happens
- in update_shader_programs() the star shader needs to be (re)loaded
- the shader needs the View and Projection matrices
- uniform locations vary between shaders -> the matrix locations in the star shader need to be stored in a global variable
- in update_uniform_locations() the matrix locations in the star shader need to be queried and updated
- when modified in update_view() and update_camera(), the matrices need to be uploaded to the star shader at their respective locations

Rasterisation
- scan-convert primitives to fragments, property transfer
https://glumpy.github.io/modern-gl.html

Property Transfer

Variable definitions:

Vertex Shader:
    out vec3 pass_color;
    out vec4 gl_Position;  // predefined

Fragment Shader:
    in vec3 pass_color;
    in vec4 gl_FragCoord;  // predefined

Variable interpolation: each fragment property is the weighted sum of the three vertex values, with barycentric weights d1, d2, d3:

    v1: pass_color1 = vec3(0.0f, 0.0f, 1.0f), gl_Position1 = vec4(10.0f, 20.0f, 0.0f)
    v2: pass_color2 = vec3(1.0f, 0.0f, 0.0f), gl_Position2 = vec4(0.0f, 0.0f, 0.0f)
    v3: pass_color3 = vec3(0.0f, 1.0f, 0.0f), gl_Position3 = vec4(20.0f, 0.0f, 0.0f)

    pass_color   = d1 * pass_color1  + d2 * pass_color2  + d3 * pass_color3
    gl_FragCoord = d1 * gl_Position1 + d2 * gl_Position2 + d3 * gl_Position3

Perspective-correct Interpolation

z-coordinates (depth) are not linear after the projective transformation:
- very good depth accuracy close to the camera
- low depth accuracy close to the far plane
- the output distance between points depends on their distance to the camera

With projected points v1..v4 and their output distances d:
    linear depth:     (v2 - v1) / (v4 - v3) = d(1-2) / d(3-4)
    non-linear depth: (v2 - v1) / (v4 - v3) ≠ d(1-2) / d(3-4)

Fragment property interpolation takes place after the perspective projection:
- linear interpolation leads to wrong results
- perspective-correct interpolation is necessary
- explanation e.g. in Low, Perspective-Correct Interpolation, 2002

http://learnopengl.com/#!advanced-opengl/depth-testing

Interpolation Qualifiers

The interpolation of Fragment Shader inputs can be specified in GLSL:
- flat: no interpolation, the value of a single (provoking) vertex is used -> Flat Shading
- smooth: default, perspective-correct interpolation
- noperspective: linear interpolation in screen space

- the qualifier of a variable must match between shaders

Normal vector interpolation: flat (1 normal per triangle) vs. non-flat (1 normal per vertex)
Texture coordinate interpolation: smooth (perspective-correct) vs. noperspective (linear)

https://commons.wikimedia.org/wiki/file:phong-shading-sample.jpg
http://s19.photobucket.com/user/coincoin/media/perspective-correction.png.html

Phong Reflection Model

Components:
- ambient: indirect light incoming from the general surroundings -> constant
- diffuse: Lambertian reflectance, diffusely reflected light from surface microfacets -> dependent on the angle α between the surface normal n and the light direction l
- specular: reflection of light directly to the viewer -> dependent on the angle ω between the viewer v and the light direction l' reflected at the surface -> the specular highlight decay (glossiness) is controlled by the exponent a

Formula (k_a, k_d, k_s, a: material parameters; i_a, i_d, i_s: light parameters; for unit vectors v1, v2: cos(∠(v1, v2)) = <v1, v2>, the dot product):

    I = k_a i_a + k_d i_d cos(α) + k_s i_s cos(ω)^a
      = k_a i_a + k_d i_d <l, n> + k_s i_s <v, l'>^a

https://commons.wikimedia.org/wiki/file:phong_components_version_4.png
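The formula can be evaluated directly with dot products. A C++ sketch for one color channel (vec3 is a minimal helper; all direction vectors are assumed normalized; the reflected light direction is computed as l' = 2<l,n>n - l):

```cpp
#include <cmath>

struct vec3 { float x, y, z; };

float dot(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Reflection of the light direction l at the normal n: l' = 2<l,n>n - l.
vec3 reflect_dir(vec3 l, vec3 n) {
    float d = 2.0f * dot(l, n);
    return { d*n.x - l.x, d*n.y - l.y, d*n.z - l.z };
}

// Phong: I = ka*ia + kd*id*<l,n> + ks*is*<v,l'>^a, with the
// cosines clamped to 0 for surfaces facing away from the light.
float phong(vec3 n, vec3 l, vec3 v,
            float ka, float ia, float kd, float id,
            float ks, float is, float a) {
    float diff = std::fmax(dot(l, n), 0.0f);
    float spec = std::pow(std::fmax(dot(v, reflect_dir(l, n)), 0.0f), a);
    return ka*ia + kd*id*diff + ks*is*spec;
}
```

When viewer, light and normal coincide (n = l = v), both cosines are 1 and the intensity is simply ka*ia + kd*id + ks*is.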

Blinn-Phong Reflection Model

The reflection operation is computationally expensive -> Blinn's approximation does not require a reflection.
Blinn-Phong model: instead of the angle between the viewer and the reflected light direction, use the angle ρ between the normal n and the halfway vector h between viewer and light:

    h = (l + v) / ǁl + vǁ = normalize(v + l)

    I = k_a i_a + k_d i_d cos(α) + k_s i_s cos(ρ)^b
      = k_a i_a + k_d i_d <l, n> + k_s i_s <n, h>^b

- with the same exponent as the Phong model, a is too small -> use b = 4a
- the Blinn approximation is actually empirically more accurate than Phong

https://commons.wikimedia.org/wiki/file:blinn_phong_comparison.png
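A C++ sketch of the halfway vector and the Blinn-Phong specular term (vec3, dot and normalize are minimal helpers; the exponent is b = 4a as stated above):

```cpp
#include <cmath>

struct vec3 { float x, y, z; };

float dot(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

vec3 normalize(vec3 v) {
    float len = std::sqrt(dot(v, v));
    return { v.x/len, v.y/len, v.z/len };
}

// Halfway vector between light and view direction: h = normalize(l + v).
vec3 halfway(vec3 l, vec3 v) {
    return normalize({ l.x + v.x, l.y + v.y, l.z + v.z });
}

// Blinn-Phong specular term <n,h>^b with b = 4a, cosine clamped to 0.
float blinn_specular(vec3 n, vec3 l, vec3 v, float a) {
    float ndoth = std::fmax(dot(n, halfway(l, v)), 0.0f);
    return std::pow(ndoth, 4.0f * a);
}
```

When l = v = n, the halfway vector equals the normal, so the specular term is 1 regardless of the exponent; no reflection vector is ever computed.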

Blinn-Phong Implementation
- shading should be computed in View Space -> requires the light and fragment positions in View Space
- in the Vertex Shader, calculate the position in View Space and pass it to the Fragment Shader
- upload the sun position (which is vec3(0.0f, 0.0f, 0.0f) in World Space) transformed to View Space as a uniform, using glUniform3f() or glUniform3fv(); update the value whenever the View Matrix changes
- the light color properties can be hardcoded in the Fragment Shader
- the planet diffuse color can be assigned through a single vec3 uniform that is set to the respective color before each planet is drawn
- the planet ambient color can be assumed to be the same as the diffuse color
- the planet specular color can be assumed to be white
- before calculating angles with the dot product, both vectors need to be normalized

Texture Mapping

Creating all surface details with modeling is too expensive -> paint the details on a texture and project it onto the surface.
Project the texture onto the model by assigning a coordinate on the texture to each vertex (UV coordinates):
1. define 3D coordinates for the vertices v1, v2, v3
2. define texture coordinates in the UV square from (0.0, 0.0) to (1.0, 1.0), e.g. v1 -> (0.5, 0.8), v2 -> (0.1, 0.1), v3 -> (0.9, 0.1)
3. apply the texture to the fragments
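At sampling time, UV coordinates in [0,1]² are mapped to texel indices. A simplified C++ stand-in for what a texture() lookup does with nearest filtering on a row-major texture (the real GL additionally places its origin at the bottom-left and offsets texel centers by half a texel; this sketch ignores those details):

```cpp
#include <vector>

// Nearest-texel lookup: map (u, v) in [0,1]^2 to an integer texel
// index into a row-major texture and return the stored value.
int nearest_texel(const std::vector<int>& tex, int width, int height,
                  float u, float v) {
    int x = static_cast<int>(u * width);
    int y = static_cast<int>(v * height);
    if (x >= width)  x = width - 1;   // clamp u = 1.0 to the last column
    if (y >= height) y = height - 1;  // clamp v = 1.0 to the last row
    return tex[y * width + x];
}
```

In a 2x2 texture holding {1, 2, 3, 4}, the coordinate (0.1, 0.1) lands in the first texel and (0.9, 0.9) in the last.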

Texture Specification: concept and implementation
- Texture Storage holds the Texture Data in a defined Texture Format: the format is defined and the data uploaded with glTexImage*
- Texture Parameters (Sampling Parameters): defined with glTexParameter*
(the overview diagram also lists: Framebuffer Object)

Texture Binding
- the OpenGL context has multiple Texture Units, named GL_TEXTURE*
- each Texture Unit has binding points for each texture type, like GL_TEXTURE_2D, GL_PROXY_TEXTURE_1D_ARRAY etc.
- there is always exactly one active Texture Unit
- all manipulation functions addressed to a texture-type binding point are applied to the object bound at the active Texture Unit's binding point
(diagram: Context containing Texture Units GL_TEXTUREk (active), GL_TEXTUREk+1, ...; each unit with GL_TEXTURE_1D, GL_TEXTURE_2D, ... binding points; a Texture Object bound to the GL_TEXTURE_2D point of GL_TEXTUREk)

Texture Access
- in the shader, textures are accessed through sampler uniforms
- the sampler type defines which binding point is accessed
- the sampler holds as its value an integer with the index of the Texture Unit it should access
- the index must be uploaded to the sampler with the glUniform1i() function
- if two samplers of different type access the same unit, rendering will fail
- one Texture Object can be bound to multiple Texture Units
- the active Texture Unit has no effect on this process

    glUniform1i(tex_location, k);

(diagram: shader with uniform sampler2D ColorTex holding value k and a uniform sampler1D holding value k+1, each addressing the matching binding point of its Texture Unit)

Texture Specification

Prepare for formatting:
1. activate the Texture Unit to which the texture will be bound
2. generate a Texture Object
3. bind the Texture Object to the 2D texture binding point of the unit

    glActiveTexture(GL_TEXTURE0);
    glGenTextures(1, &texture_object);
    glBindTexture(GL_TEXTURE_2D, texture_object);

Define the mandatory sampling parameters:
4. define the interpolation type for when a fragment covers multiple texels (texture pixels)
5. define the interpolation type for when a fragment does not exactly cover one texel

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

Define the texture data and format:
6. format the Texture Object bound to the 2D binding point: no mipmaps, data storage format, resolution, no border, input data format, channel type, data to upload

    glTexImage2D(GL_TEXTURE_2D, 0, internal_format, width, height, 0, input_format, channel_type, data_ptr);
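GL_LINEAR blends the four texels nearest to the sampling point with bilinear weights. A C++ sketch of that filtering on a single-channel texture (texel centers assumed at integer coordinates for simplicity; GL uses half-texel centers, but the blending math is the same):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Bilinear filtering of a single-channel, row-major texture: blend the
// four surrounding texels by the fractional distances to the sample point.
float bilinear(const std::vector<float>& tex, int width, int height,
               float x, float y) {
    int x0 = static_cast<int>(std::floor(x));
    int y0 = static_cast<int>(std::floor(y));
    int x1 = std::min(x0 + 1, width - 1);   // clamp at the texture edge
    int y1 = std::min(y0 + 1, height - 1);
    float fx = x - x0, fy = y - y0;         // fractional weights
    float top = (1 - fx) * tex[y0*width + x0] + fx * tex[y0*width + x1];
    float bot = (1 - fx) * tex[y1*width + x0] + fx * tex[y1*width + x1];
    return (1 - fy) * top + fy * bot;
}
```

Sampling a 2x2 texture {0, 1, 2, 3} halfway between all four texels averages them to 1.5; GL_NEAREST would instead snap to a single texel.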

Texture Formatting

glTexImage*(target, level, internal_format, width, height, border, input_format, channel_type, data_ptr)

- target: binding point on which to create the new image
- level: detail level in which to create the image; 0 when not using mipmaps
- internal_format: specifies the number of color components: GL_RED, GL_RG, GL_RGB, GL_RGBA, or a special sized or compressed format
- width, height: the texture dimensions
- border: must be 0 (previously the width of a colored border)
- input_format: the format of the input data; like internal_format, but with additional types for compatibility, e.g. GL_BGR, or GL_RED_INTEGER for unnormalized data
- channel_type: datatype of the pixel data channels: GL_BYTE, GL_FLOAT, GL_INT, or packed types like GL_UNSIGNED_BYTE_3_3_2
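The size of the buffer behind data_ptr follows from these parameters, with one pitfall: glTexImage* reads each row padded to GL_UNPACK_ALIGNMENT, which defaults to 4 bytes. A C++ sketch of the expected sizes, assuming 1 byte per channel (e.g. GL_UNSIGNED_BYTE):

```cpp
// Bytes per row after padding to the given alignment
// (GL_UNPACK_ALIGNMENT, default 4), assuming 1 byte per channel.
int padded_row_bytes(int width, int channels, int alignment) {
    int row = width * channels;
    return ((row + alignment - 1) / alignment) * alignment;
}

// Total bytes glTexImage2D reads from data_ptr under that alignment.
int image_bytes(int width, int height, int channels, int alignment) {
    return padded_row_bytes(width, channels, alignment) * height;
}
```

A 3x3 GL_RGB image has 9-byte rows padded to 12, so 36 bytes are read rather than 27; calling glPixelStorei(GL_UNPACK_ALIGNMENT, 1) beforehand makes tightly packed data (27 bytes) valid.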

Texture Usage

Prepare for use:
1. activate the Texture Unit to which the texture is bound
2. bind the Texture Object to the 2D texture binding point of the unit

    glActiveTexture(GL_TEXTUREk);
    glBindTexture(GL_TEXTURE_2D, texture_object);

Upload the unit index to the shader:
3. get the location of the sampler uniform
4. bind the shader for uniform uploading
5. upload the index of the unit to the sampler

    int color_sampler_location = glGetUniformLocation(program_handle, "ColorTex");
    glUseProgram(program_handle);
    glUniform1i(color_sampler_location, k);

Use the sampler in the shader:
6. declare the sampler variable
7. read data from the sampler

    uniform sampler2D ColorTex;
    vec4 color = texture(ColorTex, tex_coordinate);

Planet Texturing: Texture Creation
- load png or tga files with the texture_loader::file() function
- the returned texture struct contains all variables necessary for specifying the texture format
- create a Texture Object from the texture struct for each planet
- query and store the location of the sampler uniform for the color texture
- upload the index of the Texture Unit you want to use for the color texture
- activate the Texture Unit that you want to use
- before drawing each planet, bind the respective Texture Object

Planet Texturing: Texture Coordinates
- request the model loader to load the texture coordinates by replacing the last parameter model::normal of the model_loader::obj() function with model::normal | model::texcoord
- in the planet_object initialisation, add another attribute of the type model::texcoord

Texture mapping:
- in the planet vertex shader, add another input attribute vec2 in_texcoord that is directly assigned to an output variable vec2 pass_texcoord
- in the fragment shader, add another input vec2 pass_texcoord
- look up the texture color at pass_texcoord and use it as the diffuse and ambient color in the shading computation