Artistic Rendering with Graphics Shaders

A Master Research Paper submitted to the Department of Computer Science, Eastern Michigan University, in partial fulfillment of the requirements for the Master of Science in Computer Science. Approved at Ypsilanti, Michigan, on April 14th, 2010. Professor Matthew Evett, Professor Susan Haynes, Professor William W. McMillan.

Contents

1 Introduction
2 Computer Graphics and Shaders
   2.1 Computer Graphics
   2.2 Graphics Shaders
   2.3 Separation of Shader Languages from Graphics API
3 OpenGL Shading Language (GLSL)
   3.1 Language Characteristics
   3.2 Example
   3.3 Differences to C
   3.4 Using several Shaders in the Pipeline
   3.5 Tools
4 Artistic Rendering
   4.1 Interior Shading: Color Quantization
   4.2 Interior Shading: Hatching
   4.3 Silhouette Drawing
5 Summary

1 Introduction

Artistic rendering is a field of computer graphics that creates a variety of graphical styles and presentations instead of photorealism. The goal of this research study is to develop universal artistic rendering techniques for arbitrary 3D models. These non-photorealistic rendering techniques should be easy to apply in other computer graphics applications such as games or any other kind of visualization. Different shader techniques and approaches for non-photorealistic rendering are analyzed and developed. The focus is on the following three non-photorealistic rendering effects:

- Reducing the number of colors in which the objects are displayed on the screen
- Darkening the silhouettes of objects for a cartoonish look
- Realizing shading in a sketchy look with pencil hatching

Chapter 2 explains the basic vocabulary of computer graphics and contains an introduction to graphics shaders. A concrete shader language is introduced in chapter 3. Both chapters are based on a literature study of Graphics Shaders - Theory and Practice [BC09], OpenGL Shading Language [Ros04] and the OpenGL Programming Guide [SWND07]. Chapter 4 presents different techniques to achieve artistic rendering, the papers they are based on, and explains how they have been realized and implemented during this research study. Finally, chapter 5 summarizes the results of this research study.

2 Computer Graphics and Shaders

This chapter first defines the computer graphics vocabulary that is used, then explains what graphics shaders are and how they fit into computer graphics.

2.1 Computer Graphics

A 3D model is a description of a three-dimensional geometric object. A collection of points in 3D space describes the shape of the object. Those points are called vertices. The vertices are connected into a mesh of triangles which forms the surface of the object. Each vertex has an associated normal vector and a texture coordinate. A normal vector is perpendicular to the triangle surfaces around it and is used to determine the lighting during the rendering process. A texture coordinate defines the way the surface is textured.

Figure 2.1: The Rendering Process (a 3D model, consisting of a vertex list, normal list, triangle list and texture coordinate list, is combined with textures and a camera view description by the rendering process into a 2D image)

The rendering process generates a 2D image representation of a 3D model. The most common technique for rendering is 3D projection, where the three-dimensional vertices are projected onto a two-dimensional plane using matrix multiplication. OpenGL and DirectX are two commonly used technologies supporting this type of rendering. Both make heavy use of the hardware acceleration of a graphics card. The CPU passes rendering data such as vertices, normals and textures to the GPU, where they are stored, processed and finally rendered. From this point on the CPU only controls the rendering process by passing control signals or updating the vertices.

2.2 Graphics Shaders

Graphics shaders are used in modern computer graphics to achieve special effects. They can replace parts of the fixed-function graphics pipeline on the graphics card, i.e. on the graphics processing unit (GPU), which operates on the data with massive parallelism. This parallelism is essential for rendering animated computer graphics in real time at a rate of at least 30 frames per second, with each frame consisting of 1 to 2 million pixels. There are different kinds of graphics shaders:

- Vertex shader
- Fragment shader (also called pixel shader)

Each shader can create different effects by computing its output data differently than the default fixed-function pipeline would. Usually, to create more complex effects, the vertex shader computes data and passes it on to the fragment shader, which acts differently depending on the passed values. Graphics shaders usually have to reimplement the necessary functions of the fixed-function pipeline:

- Transforming vectors from model to view space
- Applying lighting models
- Using textures to colorize polygons

There are a few different real-time rendering shader languages:

- OpenGL ARB low-level assembly language
- OpenGL Shading Language (GLSL)
- Microsoft DirectX High-Level Shader Language (HLSL)
- Cg by Nvidia
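To make the 3D projection mentioned in section 2.1 concrete, the following plain Python sketch multiplies a vertex in homogeneous coordinates by a simple perspective matrix and performs the homogeneous divide. This is an illustration only, not how the GPU implements it; the matrix and the helper names (mat_vec, project) are hypothetical.

```python
# Minimal sketch (not the GPU path): project a 3D point to the 2D image
# plane with a simple perspective matrix and a homogeneous divide.

def mat_vec(m, v):
    """Multiply a 4x4 matrix (list of rows) by a 4-component vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def project(vertex, d=1.0):
    """Project (x, y, z) onto the plane z = d; the camera sits at the origin."""
    # Perspective matrix: copies x and y, stores z/d in w for the divide.
    p = [[1, 0, 0, 0],
         [0, 1, 0, 0],
         [0, 0, 1, 0],
         [0, 0, 1 / d, 0]]
    x, y, z, w = mat_vec(p, [vertex[0], vertex[1], vertex[2], 1.0])
    return (x / w, y / w)  # homogeneous divide yields 2D coordinates

print(project((2.0, 4.0, 4.0)))  # a point at depth 4 shrinks toward the center
```

Doubling the depth of a point halves its projected coordinates, which is the expected perspective foreshortening.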

This is a data flow diagram of the graphics pipeline with shaders.

Figure 2.2: Graphics Pipeline with Shaders (vertex data passes through either the fixed-function transformation and lighting stage or a vertex shader; the transformed vertices go through geometry assembly, clipping, backface culling, viewport transformation and rasterization, and then through either the fixed-function texturing, filtering and blending stage or a fragment/pixel shader into the frame buffer)

The vertex data is defined by the application; usually it is a 3D model loaded from a file system. The fixed-function pipeline would transform the coordinates from object to eye space and apply a lighting model for each vertex according to its configuration. This fixed function can be replaced by a vertex shader, which can be programmed to fulfill this task differently to achieve a special effect. Next in the graphics pipeline the vertices are assembled into geometry primitives, usually triangles. Then clipping takes place, i.e. geometry that is too far from or too near to the camera will not be rendered. Backface culling avoids rendering the wrong side of a triangle. Then the coordinates are transformed from eye space to a 2D space by a perspective projection. Rasterization creates a pixel grid according to the size of the rendering target, i.e. a display. Instead of calculating the pixel color with the fixed function, a fragment shader can be executed for each pixel. The final result is written into the frame buffer.

2.3 Separation of Shader Languages from Graphics API

One may ask why the shader language is separate from the standard graphics API. The reason lies in the architecture of how computer graphics are created today. It is important to know that the graphics APIs OpenGL and DirectX can be used from several programming languages. A shader description inside the API (i.e. as objects, methods and functions) would mean that an adaptation would be necessary for each programming language. This would destroy the portability advantage of shaders. Because of this separation, the same shader code can be used with the same graphics API from other programming languages. The separation also makes it possible to develop the graphics API code (i.e. the game/graphics engine) and the shader effect code independently from each other, because they run on different components (the CPU and the GPU). The graphics shaders are executed on the graphics hardware and are meshed into the graphics pipeline. The graphics API is executed on the CPU and sends information to the GPU.

3 OpenGL Shading Language (GLSL)

GLSL is a high-level shading language with C-like syntax. It is considered high-level relative to its predecessor, the assembly-like OpenGL ARB language. This chapter describes GLSL version 1.20, which was introduced with OpenGL 2.1 in July 2006. This version is supported by most of the available tools.

3.1 Language Characteristics

The control flow statements of GLSL such as if-then-else, looping and function calls are identical to C.

Data types

The most important data types are:

- Scalars: bool, int, float
- Vectors: vec2, vec3, vec4
- Matrices: mat2, mat3, mat4, mat3x2

The vector data types offer flexible access to their components with character indices, which can be placed in any order and be repeated.

    float red = gl_Color.r;
    vec3 rgb = gl_Color.rgb;

    vec3 xyz = gl_Vertex.xyz;
    vec2 zx = gl_Vertex.zx;
    vec3 yyy = gl_Vertex.yyy;

Listing 3.1: Flexible Access of Vector Components

Note that .xyz is equivalent to .rgb, so the programmer is free to write gl_Vertex.rgb or gl_Color.xyz if for any reason he wants to.
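The swizzle rule can be mimicked in plain Python to make the selection semantics concrete. This is an illustrative sketch only; GLSL implements swizzling natively, and the helper name swizzle is hypothetical.

```python
# Illustrative sketch of GLSL-style swizzling: each letter selects one
# component; letters may repeat and appear in any order.

COMPONENTS = {"x": 0, "y": 1, "z": 2, "w": 3,
              "r": 0, "g": 1, "b": 2, "a": 3}

def swizzle(vec, pattern):
    """Return the components of vec named by pattern, e.g. 'zx' or 'yyy'."""
    return [vec[COMPONENTS[c]] for c in pattern]

v = [1.0, 2.0, 3.0, 4.0]
print(swizzle(v, "rgb"))  # [1.0, 2.0, 3.0]
print(swizzle(v, "zx"))   # [3.0, 1.0]
print(swizzle(v, "yyy"))  # [2.0, 2.0, 2.0]
```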

Functions

GLSL provides many mathematical functions, and their usage is encouraged instead of programming an equivalent function, because their implementation is highly optimized by the corresponding hardware driver of the graphics card. The functions are all designed to operate on generic data types, so that they can be applied to single values, vectors or matrices; this allows the graphics card to use its parallel computing capabilities. These are the different types of functions, with some examples in parentheses:

- Trigonometric functions (sin, cos, tan, asin, acos, atan)
- Exponential functions (pow, log, sqrt)
- Vector functions (length, distance, dot, cross, normalize, reflect)
- Matrix functions (transpose and component-wise functions such as multiplication and comparison)
- Other common functions (abs, round, mod, min, max, step, smoothstep)

Operators

Operators are used for elementary arithmetic such as addition (+), subtraction (-), multiplication (*), division (/) and assignment (=), equivalent to C. But they can also be used with vectors and matrices.

    vec2 a = vec2(1.0, 2.0);
    vec2 b = vec2(3.0, 4.0);
    vec2 c = a + b;

Listing 3.2: Vector Addition

    float factor = 3.1;
    vec3 color = gl_Color.rgb;
    color = color * factor;

Listing 3.3: Multiplication of a Vector by a Factor

    vec3 coord = gl_Vertex.xyz;
    mat3 matrix = mat3(2.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0);
    vec3 transformedCoord = matrix * coord;

Listing 3.4: Affine Transformation of a Vector
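The component-wise semantics of these operators can be spelled out in plain Python (an illustrative sketch; the helper names are hypothetical, and real GLSL evaluates these operators in hardware):

```python
# Sketch of GLSL operator semantics on vectors: +, -, * and / act
# component by component; a scalar operand is applied to every component.

def vadd(a, b):
    """Component-wise vector addition, like GLSL's a + b."""
    return [x + y for x, y in zip(a, b)]

def vscale(v, factor):
    """Vector times scalar, like GLSL's color * factor."""
    return [x * factor for x in v]

def mat3_mul_vec3(m, v):
    # m is a list of 3 rows; GLSL stores matrices column-major, but a
    # row-major convention is used here for readability.
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

print(vadd([1.0, 2.0, 3.0], [3.0, 4.0, 5.0]))   # [4.0, 6.0, 8.0]
print(vscale([0.2, 0.4, 0.5], 2.0))             # [0.4, 0.8, 1.0]
print(mat3_mul_vec3([[2.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0]], [1.0, 2.0, 3.0]))  # [2.0, 2.0, 3.0]
```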

User-defined Variables

There are two types of user-defined variables: uniform and varying.

A uniform variable is used to set a value from outside the shading language, through the OpenGL API. It can be used to toggle between different shading modes, to set shading parameters, or to avoid hardcoded assumptions. With this, a game designer can modify the behavior of the shader or simply change a specific color of an effect. The value of such a variable can be changed through the OpenGL API at any time, without reloading the shader or anything similar. Changing uniform values is also the recommended way to drive animated shader effects, which need an external clock for a time-based animation.

A varying variable is used to forward information from the vertex to the fragment shader. The value is interpolated over the triangle surface between the vertices, because it is only calculated for each vertex and not for each pixel, which improves the runtime performance.

Predefined Variables

In each shader there exist predefined variables, which can be used as input and output.

Vertex shader input:
- vec4 gl_Vertex: vertex coordinate in model space
- vec3 gl_Normal: vertex normal in model space
- vec4 gl_Color: vertex color
- gl_LightSource[i]: light information
- vec4 gl_MultiTexCoord0: texture coordinate
- mat4 gl_ModelViewProjectionMatrix: transformation matrix between model and projection space

Vertex shader output:
- vec4 gl_Position: vertex coordinate in screen space
- vec4 gl_FrontColor: front vertex color
- vec4 gl_BackColor: back vertex color
- vec4 gl_TexCoord[i]: texture coordinate

Fragment shader input:
- vec4 gl_Color: front or back vertex color
- vec4 gl_TexCoord[i]: texture coordinate

Fragment shader output:
- vec4 gl_FragColor: pixel color

Table 3.1: Predefined variables in GLSL
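The interpolation of a varying across a triangle can be sketched on the CPU with barycentric weights (an illustrative sketch; the function name and the sample values are hypothetical, and the real blending is done by the rasterizer):

```python
# Sketch of how a varying is interpolated: the rasterizer blends the
# per-vertex values with the barycentric weights of the pixel inside
# the triangle (weights are non-negative and sum to one).

def interpolate_varying(values, weights):
    """Blend one scalar varying given per-vertex values and weights."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(v * w for v, w in zip(values, weights))

# A light intensity computed only at the three vertices ...
vertex_intensity = (0.2, 0.8, 0.5)
# ... is blended for a pixel at the triangle's centroid:
print(interpolate_varying(vertex_intensity, (1/3, 1/3, 1/3)))
```

A pixel sitting exactly on a vertex gets that vertex's value back unchanged, since its weight is 1 and the others are 0.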

Global Constants

The OpenGL API has built-in constants whose values can be retrieved through the OpenGL API or inside a GLSL program. They provide the shader developer useful information about the capabilities of the graphics card, so that the shader can be adjusted to its limits. These are some examples:

- const int gl_MaxVertexUniformComponents
- const int gl_MaxFragmentUniformComponents
- const int gl_MaxVaryingFloats
- const int gl_MaxTextureUnits
- const int gl_MaxLights

3.2 Example

This is an example of a vertex shader and a fragment shader that reimplements a task of the fixed-function pipeline in GLSL code. The vertex shader first transforms the coordinate of the vertex from the model coordinate system to the view coordinate system and stores it in the predefined variable gl_Position (Line 5). The texture coordinate is copied from gl_MultiTexCoord0 to gl_TexCoord[0] (Line 6). If the uniform variable applyLighting is true (Line 7), the light intensity is calculated as the dot product of the surface normal and the vector from the vertex to the light. For this, some vector transformations from model to camera space are necessary. The result is stored in the varying variable LightIntensity (Lines 9-17).

    1  varying float LightIntensity;
    2  uniform bool applyLighting;
    3
    4  void main(void) {
    5      gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
    6      gl_TexCoord[0] = gl_MultiTexCoord0;
    7      if (applyLighting) {
    8          // transpose normal
    9          vec3 transNorm = normalize(vec3(gl_NormalMatrix * gl_Normal));
    10         // light position
    11         vec3 LightPos = gl_LightSource[0].position.xyz;
    12         // vertex position in model view space
    13         vec3 ECposition = vec3(gl_ModelViewMatrix * gl_Vertex);
    14         // calc light intensity
    15         LightIntensity = dot(normalize(LightPos - ECposition), transNorm);
    16         LightIntensity = 0.3 + abs(LightIntensity);
    17         LightIntensity = clamp(LightIntensity, 0.0, 1.0);
    18     }
    19 }

Listing 3.5: Example Vertex Shader

The fragment shader colorizes the inside of the triangles with the texture. The uniform variable texture must refer to a 2D texture. The uniform variable applyLighting indicates whether the texture color should be shaded by the lighting model or not (Line 7). The varying variable LightIntensity from the vertex shader is used to change the brightness of the color (Line 8).

    1  varying float LightIntensity;
    2  uniform bool applyLighting;
    3  uniform sampler2D texture;
    4
    5  void main(void) {
    6      vec4 color = texture2D(texture, gl_TexCoord[0].st);
    7      if (applyLighting) {
    8          color = LightIntensity * color;
    9      }
    10     gl_FragColor = vec4(color.rgb, 1.0);
    11 }

Listing 3.6: Example Fragment Shader

The following figure shows the result of the fragment shader. In the left image the uniform variable applyLighting is set to true and in the right image to false.

Figure 3.1: Comparison of Rendering without (left) and with Lighting Model (right)
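The intensity math in Lines 15-17 of Listing 3.5 can be reproduced on the CPU. The following Python sketch is illustrative only (the helper names and sample positions are hypothetical); it mirrors the dot product, the 0.3 ambient floor and the clamp from the shader.

```python
# CPU-side sketch of the light-intensity computation from the vertex
# shader: dot product of the normalized light direction with the surface
# normal, lifted by the 0.3 ambient floor and clamped to [0, 1].

def normalize(v):
    length = sum(x * x for x in v) ** 0.5
    return [x / length for x in v]

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def light_intensity(light_pos, vertex_pos, normal):
    light_dir = normalize([l - p for l, p in zip(light_pos, vertex_pos)])
    intensity = sum(d * n for d, n in zip(light_dir, normalize(normal)))
    return clamp(0.3 + abs(intensity), 0.0, 1.0)

# Light directly above a surface facing up: full intensity (1.3 clamps to 1).
print(light_intensity([0.0, 5.0, 0.0], [0.0, 0.0, 0.0], [0.0, 1.0, 0.0]))
# Grazing light perpendicular to the normal: only the 0.3 floor remains.
print(light_intensity([5.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 1.0, 0.0]))
```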

3.3 Differences to C

Programming on a graphics card has some similarities with programming on embedded devices, because of the existing limitations and the work with special data registers. But the way of programming differs considerably from procedural and object-oriented programming languages such as C++, Java or MATLAB, with which computer scientists are familiar today. The developer has to be aware of the environment in which the code is executed. The environment is a fixed framework, which is not as flexible as software systems in general, where the architecture can be changed rather quickly. Such changes in computer graphics must be supported and implemented in the hardware, which takes time, especially when the changes have to be made in the official standards of OpenGL or GLSL.

The following features of the C language are NOT supported by GLSL:

- Recursive functions are not allowed.
- The data types are kept much simpler, i.e. there is only one type of floating point and one type of integer number, so there are no double-precision floats, no short or long integers and no unsigned variables.
- Pointers do not exist, and there is also no data type for characters, because there is no use for them in graphics shaders.
- The data structures union and enumeration do not exist in GLSL.
- The language is not file based, therefore there is no need for #include preprocessing statements like in C.

Another notable difference is the mandatory use of constructors instead of implicit type conversion. The following expressions are forbidden:

    int i = 2.0;   // this would be an implicit conversion from float to int
    float f = i;   // and this from int to float

Instead, this would be correct:

    int i = 2;
    float f = float(i);

This keeps the language type safe.

3.4 Using several Shaders in the Pipeline

The execution of a vertex and a fragment shader is called a shader pass. The input to a shader pass is a 3D model. Besides this required input, the shader can use several textures as additional input. The output is a 2D image.

Figure 3.2: Input and Output of a Shader Pass (a 3D model and optional textures enter a shader pass, which outputs a 2D image)

Instead of rendering the output image of a single shader pass directly to the screen, the render target can be another texture, which can then be used as an input texture by another shader pass. The second shader pass can apply image processing techniques to this texture by accessing all pixels and interpolated subpixels. The difference to an image processing environment is that pixel values are not accessed by row and column using integer values, but by texture coordinates given as floating point numbers between 0.0 and 1.0. So the shader has to know how big the texture is in order to access specific pixels and not interpolated subpixels.

Example

The following data flow diagram shows an example with two shader passes.

Figure 3.3: Example with two Shader Passes (a 3D model enters the pass Render Normal Image, whose output texture tex, together with a flat quad, feeds the pass Blur Image, whose output goes to the display)
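The pixel-versus-texel addressing issue mentioned above can be sketched with a small Python helper (illustrative only; the function name is hypothetical): to hit a specific texel exactly rather than an interpolated subpixel, the shader samples at the texel center expressed in [0, 1] coordinates, which requires knowing the texture size.

```python
# Sketch of addressing a texture by floating point coordinates: texel
# (col, row) of a width x height texture is hit exactly by sampling at
# its center, offset half a texel from the integer grid.

def texel_center(col, row, width, height):
    """Texture coordinate of the center of texel (col, row)."""
    return ((col + 0.5) / width, (row + 0.5) / height)

# Center texel of a 4x4 texture:
print(texel_center(1, 1, 4, 4))  # (0.375, 0.375)
```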

First Shader Pass: Render Normal Image

The first shader pass renders the 3D model using the normal vectors as colors and stores the resulting image into a texture.

Shader pass name: Render Normal Image
Input 3D model: arbitrary 3D model
Render target: texture tex

The vertex shader normalizes the normal of the vertex and stores it in the varying variable Normal, such that the vector components lie within a value range of 0.0 to 1.0, by adding 1.0 to each component and dividing by 2.0.

    #version 120
    varying vec3 Normal;

    void main(void) {
        gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
        Normal = (normalize(gl_Normal).xyz + 1.0) / 2.0;
    }

Listing 3.7: Vertex Shader of Shader Pass Render Normal Image

The fragment shader simply colorizes the pixel by interpreting the slightly modified normal as an RGB value.

    #version 120
    varying vec3 Normal;

    void main(void) {
        gl_FragColor = vec4(Normal.rgb, 1.0);
    }

Listing 3.8: Fragment Shader of Shader Pass Render Normal Image
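The [-1, 1] to [0, 1] remapping used in Listing 3.7 is a pure arithmetic transformation and can be checked on the CPU. The Python sketch below is illustrative (the helper names are hypothetical); the decoder shows that the mapping is lossless.

```python
# Sketch of the normal-to-color mapping of the first pass: each
# component of a unit normal lies in [-1, 1] and is remapped to [0, 1]
# so it can be stored in an RGB channel.

def encode_normal(n):
    """Map normal components from [-1, 1] to [0, 1], as in Listing 3.7."""
    return [(c + 1.0) / 2.0 for c in n]

def decode_normal(rgb):
    """Inverse mapping, recovering the normal from the stored color."""
    return [c * 2.0 - 1.0 for c in rgb]

n = [0.0, 1.0, 0.0]                     # normal pointing straight up
print(encode_normal(n))                 # [0.5, 1.0, 0.5]
print(decode_normal(encode_normal(n)))  # round-trips to the normal
```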

Figure 3.4: Result of the first Shader Pass Render Normal Image

Second Shader Pass: Blur Image

The second shader pass takes the normal image and blurs it.

Shader pass name: Blur Image
Input 3D model: flat quad
Used texture: tex
Render target: display

The vertex shader has as input a simple square called a flat quad. It simply passes the vertex coordinate through as the screen coordinate. The texture coordinate has to be subtracted from 1.0 to avoid a point-reflected image.

    #version 120
    void main(void) {
        gl_Position = gl_Vertex;
        gl_TexCoord[0] = 1.0 - gl_MultiTexCoord0;
    }

Listing 3.9: Vertex Shader of Shader Pass Blur Image

The fragment shader applies a simple blur filter by averaging the colors of the surrounding pixels of the normal texture and storing the result in gl_FragColor. The uniform variables textureSizeX and textureSizeY must be set to the texture width and height so that the neighboring pixels can be accessed in this fragment shader.

    #version 120
    uniform sampler2D tex;
    uniform float textureSizeX;
    uniform float textureSizeY;

    void main(void) {
        vec4 c = vec4(0.0, 0.0, 0.0, 0.0);
        int times = 0;
        float incX = 1.0 / textureSizeX;
        float incY = 1.0 / textureSizeY;
        float boundX = incX * 2.0;
        float boundY = incY * 2.0;
        for (float s = -boundX; s <= boundX; s += incX) {
            for (float t = -boundY; t <= boundY; t += incY) {
                c += texture2D(tex, gl_TexCoord[0].st + vec2(s, t));
                times++;
            }
        }
        c = c / float(times);
        c.a = 1.0;
        gl_FragColor = c;
    }

Listing 3.10: Fragment Shader of Shader Pass Blur Image

The rendered image of the second shader pass is shown in the right image, in comparison to the sharp input texture, i.e. the result of the first shader pass, on the left.

Figure 3.5: Result of the first (left) and the second Shader Pass Blur Image (right)
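The averaging performed by Listing 3.10 is a 5x5 box filter and can be reproduced on a plain 2D array. The Python sketch below is illustrative only; it clamps at the image edges, whereas the shader's behavior at the border depends on the sampler's wrap mode.

```python
# CPU-side sketch of the 5x5 box blur from Listing 3.10: average the
# pixel with its neighbors up to two steps away in each direction
# (edges are clamped here; the shader relies on the sampler's wrap mode).

def box_blur(image, row, col, radius=2):
    height, width = len(image), len(image[0])
    total, count = 0.0, 0
    for dr in range(-radius, radius + 1):
        for dc in range(-radius, radius + 1):
            r = min(max(row + dr, 0), height - 1)   # clamp to the image
            c = min(max(col + dc, 0), width - 1)
            total += image[r][c]
            count += 1
    return total / count

# A single bright pixel in a dark 5x5 image is spread over the kernel:
image = [[0.0] * 5 for _ in range(5)]
image[2][2] = 1.0
print(box_blur(image, 2, 2))  # 1/25 = 0.04
```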

3.5 Tools

For developing graphics shaders, the use of tools is essential. Unfortunately there are only a few free tools for developing graphics shaders.

Apple OpenGL Shader Builder 2.1

The OpenGL Shader Builder is an application for developing OpenGL shaders under Mac OS X. It supports OpenGL up to version 2.1 and offers a geometry shader extension, which is not part of the OpenGL 2.1 standard. This program was very helpful at the beginning of this research study, because it allowed me to easily test basic shader functionality. But it is very limited, e.g. there is no way to combine several shader passes.

http://developer.apple.com/graphicsimaging/opengl/shader_image.html
http://developer.apple.com/graphicsimaging/opengl/capabilities/

RenderMonkey 1.81

RenderMonkey is a rich shader development environment which supports OpenGL 2.0, OpenGL ES and DirectX 9.0 shaders on Windows. I would recommend this software, because it offers a lot of flexibility and a high degree of freedom to explore the capabilities of graphics shaders, but an update to newer versions of OpenGL and DirectX is overdue.

http://developer.amd.com/gpu/rendermonkey

4 Artistic Rendering

This chapter presents techniques for interior shading and silhouette drawing and discusses their implementation. The interior shading techniques include color quantization to create a cartoonish look and intensity-based hatching to simulate a pencil-drawn, sketchy looking picture. Silhouette drawing is based on the depth and normal maps of the rendering.

4.1 Interior Shading: Color Quantization

Color quantization creates a cartoonish look, because the object is drawn with fewer colors in the final image. A comic artist draws the interior using only a relatively small number of pens with different colors, in contrast to the 16.7 million colors which a computer monitor can display. Before color quantization can be applied in a graphics shader, the intensity of a color must be calculated by a lighting model. Lighting models determine the pixel color from several partial components such as ambient, diffuse, specular and emissive light, whose intensity usually depends on the angle between the ray of light (or the line of sight) and the object surface. After the pixel color is calculated by the lighting model, the color quantization modifies the color in the fragment shader.

4.1.1 Implementation Details

The color quantization for the above rendering can be realized with the following GLSL function used in a fragment shader. The input of the function is the pixel color, determined by the lighting model. The uniform variable quantizationLevel determines how many unique color values exist for each color channel (Line 2).

The quantization process takes place in Line 8. First the color is multiplied by quantizationLevel, then rounded to the closest integer by the expression floor(0.5 + x), and finally divided by quantizationLevel. This reduces the total number of colors. The color of the pixel/fragment is set in the last line of the function.

    1  // quantization level
    2  uniform float quantizationLevel;
    3
    4  void quantizeColor(vec4 color) {
    5      // store previous alpha value
    6      float alpha = color.a;
    7      // quantize process: multiply by factor, round and divide by factor
    8      color = floor(0.5 + (quantizationLevel * color)) / quantizationLevel;
    9      // set fragment/pixel color
    10     gl_FragColor = vec4(color.rgb, alpha);
    11 }

Listing 4.1: Function for Color Quantization in a Fragment Shader

4.1.2 Results

This is a rendering comparison of a torus without and with color quantization.

Figure 4.1: Rendering Comparison of a Torus without and with Color Quantization
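The quantization formula from Line 8 of Listing 4.1 can be checked on the CPU with scalar values. The Python sketch below is illustrative only (the function name is hypothetical); with 4 quantization levels, every channel value snaps to one of 0, 0.25, 0.5, 0.75 or 1.

```python
# CPU-side sketch of the quantization formula from Listing 4.1:
# multiply by the level, round to the nearest integer, divide back.

import math

def quantize(value, level):
    return math.floor(0.5 + level * value) / level

# With 4 levels per channel only the values 0, 0.25, 0.5, 0.75, 1 remain:
print(quantize(0.3, 4.0))   # 0.25
print(quantize(0.7, 4.0))   # 0.75
print(quantize(0.95, 4.0))  # 1.0
```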

4.2 Interior Shading: Hatching

Hatching is used to show the lighting, the material properties and the shape of the 3D model. It consists of closely spaced parallel strokes, which are usually monochrome, i.e. in grayscale colors. The implemented shader is based on the approach of Praun et al. [PHWF01].

The approach needs several hatched textures, created in a preprocessing step before runtime. These hatched textures are called a tonal art map. Each hatched texture represents a different lighting intensity. The requirement on these textures is that a lighter texture must be a subset of the darker ones: each stroke in a lighter texture must also be drawn in every darker texture. This allows the rendering process to fade smoothly between them.

Figure 4.2: Example of Tonal Art Map with 6 Textures

The tonal art map can be either created automatically or hand-drawn. It does not need to consist of parallel and perpendicular strokes; cross-hatching at different angles, curved strokes or even dots are possible.

Figure 4.3: Rendering of three Models with different Tonal Art Maps [PHWF01]

This approach requires texture coordinates for each vertex and always places the tonal art map along them. So the result depends mainly on the texture coordinates and the design of the tonal art map.

The advantage of this approach is that it maintains frame-to-frame coherence among the strokes and so avoids a flickery, random look of the strokes while the position of the object or the camera perspective changes in 3D space.

4.2.1 Implementation Details

The vertex shader computes the diffuse lighting intensity for each vertex and stores it in the variable diffuse. The diffuse intensity determines which hatched textures are used. The weights of the hatched textures are first initialized to 0 (Line 11). Then the weights of the two selected textures are set to values between 0 and 1 (Lines 18-19).

1  varying float[6] hatchWeights;
2
3  void main(void) {
4      gl_TexCoord[0] = gl_MultiTexCoord0;
5      gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
6
7      vec3 lightDir = gl_LightSource[0].position.xyz;
8      vec3 normalW = normalize(gl_NormalMatrix * gl_Normal);
9      float diffuse = clamp(dot(lightDir.xyz, normalW), 0.0, 1.0);
10     float hatchFactor = 6.0 * diffuse * diffuse * diffuse * diffuse; // make shading darker
11     hatchWeights = float[6](0.0, 0.0, 0.0, 0.0, 0.0, 0.0);
12     hatchFactor = clamp(hatchFactor, 0.0, 5.0); // values between 0.0 and 5.0
13     int index = int(floor(5.0 - hatchFactor));
14     float blending = fract(hatchFactor); // blending value
15     if (hatchFactor == 5.0) {
16         hatchWeights[0] = 1.0;
17     } else {
18         hatchWeights[index] = blending;
19         hatchWeights[index + 1] = 1.0 - blending;
20     }
21 }

Listing 4.2: Vertex Shader of Render Hatched Model

In the fragment shader the color of each pixel is blended between the hatched textures. The color value of each hatched texture is multiplied with its corresponding weight; the sum of these products is the color of the fragment (Lines 7-12). This ensures a smooth transition from one hatched texture to another over the entire model surface.
1  uniform sampler2D Hatch0, Hatch1, Hatch2, Hatch3, Hatch4, Hatch5;
2  varying float[6] hatchWeights;
3
4  void main(void) {
5      vec2 st = gl_TexCoord[0].st; // texture coordinate
6      // compose hatched color by blending between the textures of the tonal art map using the hatch weights
7      gl_FragColor = texture2D(Hatch0, st) * hatchWeights[0] +
8                     texture2D(Hatch1, st) * hatchWeights[1] +
9                     texture2D(Hatch2, st) * hatchWeights[2] +
10                    texture2D(Hatch3, st) * hatchWeights[3] +
11                    texture2D(Hatch4, st) * hatchWeights[4] +
12                    texture2D(Hatch5, st) * hatchWeights[5];
13     gl_FragColor.a = 1.0;
14 }

Listing 4.3: Fragment Shader of Render Hatched Model

4.2.2 Results

Figure 4.4 shows the hatched rendering results. The tonal art map of figure 4.2 was used.

Figure 4.4: Rendering Results of three Models in three Perspectives

4.3 Silhouette Drawing

Different approaches to silhouette drawing are described by Isenberg et al. [IFH+03], Lee et al. [LKL06], and Nienhaus and Döllner [ND04]. Normal and depth images can be used to detect silhouettes in image space with image processing techniques: edge detection is applied to the normal and depth images to identify silhouette pixels.

4.3.1 Implementation Details

Figure 4.5 gives an overview of the implementation of silhouette drawing. The passes in the diagram are connected as follows:

Render Depth: 3D model → depth image
Render Depth Edge: depth image → depth edge image
Render Normal: 3D model → normal image
Render Normal Edge: normal image → normal edge image
Render Silhouette: depth edge image + normal edge image → silhouette image
Render Model: 3D model → interior image
Combine: silhouette image + interior image → Display

Figure 4.5: Data Flow Diagram of the Shader Passes for Silhouette Drawing

Two silhouette detection approaches are used simultaneously. In Render Depth the depth image of the scene is rendered and stored into a renderable texture, on which Render Depth Edge applies an image-processing edge detection technique and creates the depth edge image. The same concept is used to detect edges in the normal image in the two passes Render Normal and Render Normal Edge.

The pass Render Silhouette joins the two resulting edge images together and increases the thickness of the silhouette. Independently of the silhouette detection, the interior of the model is shaded in Render Model; for the silhouette drawing it is irrelevant which rendering technique is used there. The last shader pass, Combine, unifies the results of interior shading and silhouette drawing.

Shader Pass: Render Depth

The shader pass Render Depth calculates the distance from the camera to each pixel and stores this depth image in a renderable texture.

Shader pass name: Render Depth
Input 3D model: arbitrary
Render target: texture (depth image)

The vertex shader transforms the vertex by the model-view-projection matrix (Line 5) and stores the z component of the transformed position in the variable depth (Line 6).

1  #version 120
2  varying float depth;
3
4  void main(void) {
5      gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
6      depth = gl_Position.z;
7  }

Listing 4.4: Vertex Shader of Render Depth

The fragment shader can access the interpolated depth through the varying variable depth. Each pixel is color coded by its depth, i.e. the distance to the camera, where

pixels closer to the camera have a brighter yellow than pixels further away from the camera.

1  #version 120
2  varying float depth;
3
4  void main(void) {
5      float depthColor = 1.0 - depth / 900.0;
6      gl_FragColor = vec4(depthColor, depthColor, 0.0, 1.0);
7  }

Listing 4.5: Fragment Shader of Render Depth

Shader Pass: Render Depth Edge

The shader pass Render Depth Edge takes the texture depthImage and applies an image-based edge detection technique. If the edge strength is above a threshold, the pixel is colored white, otherwise black.

Shader pass name: Render Depth Edge
Input 3D model: flat quad
Used texture: depth image
Render target: depth edge image

The vertex shader is identical to the one in listing 3.9 on page 14, which uses a screen-aligned flat square as input model so that the fragment shader can apply image processing techniques to the input texture.

The fragment shader accesses the pixels surrounding the current pixel on the texture that stores the depth of each pixel (Lines 18-22) and applies the discrete Laplace operator to them (Line 25). The convolution kernel of the discrete Laplace operator is shown in equation 4.1.

            ( 0   1   0 )
    D²_st = ( 1  -4   1 )                                              (4.1)
            ( 0   1   0 )

If the absolute value of the convolution result exceeds the threshold depthEdgeThreshold, the pixel color is set to white, otherwise to black (Lines 27-30).

1  #version 120
2  uniform sampler2D depthImage;
3  uniform float textureSizeX;
4  uniform float textureSizeY;
5  uniform float depthEdgeThreshold;
6
7  float getDepth(vec2 st) {
8      vec2 texCoord = clamp(st, 0.001, 0.999);
9      return texture2D(depthImage, texCoord).r;
10 }
11
12 void main(void) {
13     float dxtex = 1.0 / textureSizeX;
14     float dytex = 1.0 / textureSizeY;
15
16     vec2 st = gl_TexCoord[0].st;
17     // access center pixel and the 4 surrounding pixels
18     float center = getDepth(st);
19     float left   = getDepth(st + vec2(-dxtex, 0.0));
20     float right  = getDepth(st + vec2( dxtex, 0.0));
21     float up     = getDepth(st + vec2(0.0,  dytex));
22     float down   = getDepth(st + vec2(0.0, -dytex));
23
24     // discrete Laplace operator
25     float laplace = abs(-4.0 * center + left + right + up + down);
26     // if the result of the convolution is over the threshold => there is an edge
27     if (laplace > depthEdgeThreshold) {
28         gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0); // => color the pixel white
29     } else {
30         gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0); // black
31     }
32 }

Listing 4.6: Fragment Shader of Render Depth Edge
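The Laplace edge test of Listing 4.6 is easy to verify on the CPU. The following plain C sketch applies the kernel of equation 4.1 to one pixel of a grayscale image; the helper names are illustrative, and border pixels are clamped, similar to the texture coordinate clamp in getDepth().

```c
#include <assert.h>
#include <math.h>

/* Read pixel (x, y) of a w*h grayscale image, clamping coordinates
   to the image borders. */
static double px(const double *img, int w, int h, int x, int y) {
    if (x < 0) x = 0;
    if (x >= w) x = w - 1;
    if (y < 0) y = 0;
    if (y >= h) y = h - 1;
    return img[y * w + x];
}

/* Apply the 3x3 Laplace kernel of equation 4.1 at (x, y) and threshold
   the absolute result: returns 1 (edge/white) or 0 (no edge/black). */
int laplace_edge(const double *img, int w, int h, int x, int y, double threshold) {
    double laplace = fabs(-4.0 * px(img, w, h, x, y)
                          + px(img, w, h, x - 1, y) + px(img, w, h, x + 1, y)
                          + px(img, w, h, x, y - 1) + px(img, w, h, x, y + 1));
    return laplace > threshold;
}
```

On a uniform region the kernel sums to zero, so only pixels next to a depth discontinuity are reported as edges.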

Shader Pass: Render Normal

The shader pass Render Normal renders the 3D model and colorizes the surface with the color-coded normal vectors.

Shader pass name: Render Normal
Input 3D model: arbitrary
Render target: normal image

The vertex shader transforms each vertex from object to view space and stores the normal vector in the varying variable normal.

1  #version 120
2  varying vec3 normal;
3
4  void main(void) {
5      normal = ((gl_NormalMatrix * gl_Normal).xyz + 1.0) / 2.0;
6      gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
7  }

Listing 4.7: Vertex Shader of Render Normal

The fragment shader accesses the interpolated values of the varying variable normal and writes them as color into gl_FragColor.

1  #version 120
2  varying vec3 normal;
3
4  void main(void) {
5      // colorize pixel with the normal vector used as color
6      gl_FragColor.rgb = normal.xyz;
7  }

Listing 4.8: Fragment Shader of Render Normal
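Line 5 of Listing 4.7 maps each normal component from [-1, 1] into the displayable color range [0, 1]. That mapping and its inverse can be checked in plain C; the helper names are illustrative.

```c
#include <assert.h>

/* Pack a normal component from [-1, 1] into a color value in [0, 1],
   as in line 5 of Listing 4.7, and unpack it again. */
double encode_normal(double n) { return (n + 1.0) / 2.0; }
double decode_normal(double c) { return 2.0 * c - 1.0; }
```

The round trip is lossless up to the precision of the render target, which is why the edge detector in the next pass can work directly on the color channels.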

Shader Pass: Render Normal Edge

Shader pass name: Render Normal Edge
Input 3D model: flat quad
Used texture: normal image
Render target: normal edge image

The vertex shader is identical to the one in listing 3.9 on page 14, which uses a screen-aligned flat square as input model so that the fragment shader can apply image processing techniques to the input texture.

The fragment shader of Render Normal Edge is essentially the same as the fragment shader of Render Depth Edge on page 25, but this time edges are detected in all three color channels, because they represent the normal vector components x, y and z.

1  #version 120
2  uniform sampler2D normalImage;
3  uniform float textureSizeX;
4  uniform float textureSizeY;
5  uniform float normalEdgeThreshold;
6
7  vec3 getNormal(vec2 st) {
8      vec2 texCoord = clamp(st, 0.001, 0.999);
9      return texture2D(normalImage, texCoord).rgb;
10 }
11
12 void main(void) {
13     float dxtex = 1.0 / textureSizeX;
14     float dytex = 1.0 / textureSizeY;
15
16     vec2 st = gl_TexCoord[0].st;
17     // access center pixel and the 4 surrounding pixels
18     vec3 center = getNormal(st).rgb;
19     vec3 left   = getNormal(st + vec2(-dxtex, 0.0)).rgb;
20     vec3 right  = getNormal(st + vec2( dxtex, 0.0)).rgb;
21     vec3 up     = getNormal(st + vec2(0.0,  dytex)).rgb;
22     vec3 down   = getNormal(st + vec2(0.0, -dytex)).rgb;
23
24     // discrete Laplace operator
25     vec3 laplace = abs(-4.0 * center + left + right + up + down);
26     // if one rgb component of the convolution result is over the threshold => edge
27     if (laplace.r > normalEdgeThreshold ||
28         laplace.g > normalEdgeThreshold ||
29         laplace.b > normalEdgeThreshold) {
30         gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0); // => color the pixel green
31     } else {
32         gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0); // black
33     }
34 }

Listing 4.9: Fragment Shader of Render Normal Edge

Shader Pass: Render Silhouette

The shader pass Render Silhouette takes the two textures depth edge image and normal edge image, which store the contour pixels detected by the two approaches.

Shader pass name: Render Silhouette
Input 3D model: flat quad
Used textures: depth edge image, normal edge image
Render target: silhouette image

The fragment shader uses dilation, a mathematical morphology operator, to expand the shape of the silhouette. It looks for pixels in the depth edge image and normal edge image that represent a silhouette (Lines 8-13). If one of the neighboring pixels is an edge (Line 26), the center

pixel on the render target is set to be part of the silhouette, i.e. colored blue (Line 28). Otherwise it is colored black (Line 34).

1  #version 120
2  uniform sampler2D depthEdgeImage;
3  uniform sampler2D normalEdgeImage;
4  uniform float textureSizeX;
5  uniform float textureSizeY;
6
7  // access maximum edge value of depth and normal image
8  float getMaximumEdgeValue(vec2 st) {
9      vec2 texCoord = clamp(st, 0.001, 0.999);
10     vec4 d = texture2D(depthEdgeImage, texCoord);
11     vec4 n = texture2D(normalEdgeImage, texCoord);
12     return max(d.r, n.g);
13 }
14
15 void main(void) {
16     float dxtex = 1.0 / textureSizeX;
17     float dytex = 1.0 / textureSizeY;
18
19     vec2 st = gl_TexCoord[0].st;
20     // define size of dilation
21     int scope = 2;
22     for (int s = -scope; s <= scope; s++) {
23         for (int t = -scope; t <= scope; t++) {
24             vec2 offset = vec2(s * dxtex, t * dytex);
25             // check for edge
26             if (getMaximumEdgeValue(st + offset) > 0.1) {
27                 // is silhouette => color pixel blue
28                 gl_FragColor = vec4(0.0, 0.0, 1.0, 1.0);
29                 return;
30             }
31         }
32     }
33     // no silhouette => color pixel black
34     gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);
35 }

Listing 4.10: Fragment Shader of Render Silhouette
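The dilation loop of Listing 4.10 can be verified on the CPU. This plain C sketch marks a pixel as silhouette if any pixel within the scope neighborhood is an edge pixel; the helper name is illustrative, and out-of-range neighbors are skipped here rather than clamped as in the shader's texture lookup.

```c
#include <assert.h>

/* Dilation as in Listing 4.10: pixel (x, y) belongs to the dilated
   silhouette if any pixel within a (2*scope+1)^2 neighborhood of the
   w*h binary edge mask `img` (1 = edge) is set. */
int dilate_pixel(const int *img, int w, int h, int x, int y, int scope) {
    for (int s = -scope; s <= scope; s++) {
        for (int t = -scope; t <= scope; t++) {
            int nx = x + s, ny = y + t;
            if (nx < 0 || nx >= w || ny < 0 || ny >= h) continue;
            if (img[ny * w + nx]) return 1;
        }
    }
    return 0;
}
```

With scope = 2 a single edge pixel grows into a 5x5 block, which is what makes the drawn silhouette visibly thicker than the raw edge images.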

Shader Pass: Render Model

This shader pass renders the interior of the model. The results of the silhouette drawing are independent of this pass. Here, the hatching implementation shown in chapter 4.2 is used.

Shader pass name: Render Model
Input 3D model: arbitrary
Render target: interior image

Shader Pass: Combine

The last shader pass, called Combine, simply unifies the silhouette image with the rendered interior image.

Shader pass name: Combine
Input 3D model: flat quad
Used textures: silhouette image, interior image
Render target: Display

The vertex shader is identical to the one in listing 3.9 on page 14, which uses a screen-aligned flat square as input model so that the fragment shader can apply image processing techniques to the input texture.

The fragment shader blends between the texture color of the interior rendering (Line 8) and the silhouette color defined by the uniform variable silhouetteColor (Line 9). The blue color channel of the silhouette image defines the blending factor (Line 10).

1  uniform sampler2D interiorImage;
2  uniform sampler2D silhouetteImage;
3  uniform vec4 silhouetteColor;
4  void main(void) {
5      vec2 st = gl_TexCoord[0].st;
6      // mix function for linear blending: x * (1.0 - a) + y * a
7      gl_FragColor = mix(
8          texture2D(interiorImage, st),     // x = base color
9          silhouetteColor,                  // y = color to blend into
10         texture2D(silhouetteImage, st).b  // a = factor of blending
11     );
12 }

Listing 4.11: Fragment Shader of Combine
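The mix() built-in used in line 7 is plain linear interpolation, x * (1.0 - a) + y * a, applied per component. A minimal CPU equivalent in plain C (the name mix1 is illustrative):

```c
#include <assert.h>
#include <math.h>

/* Scalar equivalent of GLSL's mix(x, y, a). */
double mix1(double x, double y, double a) {
    return x * (1.0 - a) + y * a;
}
```

With a = 0 the interior color passes through unchanged; with a = 1 the pixel takes the full silhouette color.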

4.3.2 Results

Figure 4.6 shows some rendering results of the silhouette drawing with different 3D models.

Figure 4.6: Silhouette Drawing Results

5 Summary

Before the techniques presented in chapter 4 could be applied, an intense study of books about graphics shaders in general, the OpenGL Shading Language, and finally papers about the specific effect techniques was necessary. The effects were implemented first with the tool OpenGL Shader Builder and finally with RenderMonkey for more serious GLSL programming (chapter 3.5).

The three non-photorealistic rendering effects that were set as the goal of this study were analyzed and implemented (chapter 4). The first two effects are the interior shading techniques color quantization and intensity hatching. Color quantization reduces the number of colors used in a rendering (chapter 4.1). Intensity hatching blends between resembling hatched textures to represent different surface intensities (chapter 4.2). Finally, silhouette drawing was realized by implementing several shader passes, which render the normal and depth images and analyze them with image processing techniques (chapter 4.3).

Bibliography

[BC09] Mike Bailey and Steve Cunningham. Graphics Shaders: Theory and Practice. A K Peters, Ltd., 2009.

[IFH+03] Tobias Isenberg, Bert Freudenberg, Nick Halper, Stefan Schlechtweg, and Thomas Strothotte. A developer's guide to silhouette algorithms for polygonal models. IEEE Computer Graphics and Applications, 23(4):28-37, 2003.

[LKL06] Hyunjun Lee, Sungtae Kwon, and Seungyong Lee. Real-time pencil rendering. In NPAR '06: Proceedings of the 4th International Symposium on Non-Photorealistic Animation and Rendering, pages 37-45, New York, NY, USA, 2006. ACM.

[ND04] Marc Nienhaus and Jürgen Döllner. Sketchy drawings. In AFRIGRAPH '04: Proceedings of the 3rd International Conference on Computer Graphics, Virtual Reality, Visualisation and Interaction in Africa, pages 73-81, New York, NY, USA, 2004. ACM.

[PHWF01] Emil Praun, Hugues Hoppe, Matthew Webb, and Adam Finkelstein. Real-time hatching. In SIGGRAPH '01: Proceedings of the 28th Annual Conference on Computer Graphics and Interactive Techniques, page 581, New York, NY, USA, 2001. ACM.

[Ros04] Randi J. Rost. OpenGL(R) Shading Language. Addison Wesley Longman Publishing Co., Inc., Redwood City, CA, USA, 2004.

[SWND07] Dave Shreiner, Mason Woo, Jackie Neider, and Tom Davis. OpenGL(R) Programming Guide: The Official Guide to Learning OpenGL(R), Version 2.1. Addison-Wesley Professional, 2007.

List of Figures

2.1 The Rendering Process
2.2 Graphics Pipeline with Shaders
3.1 Comparison of Rendering without (left) and with Lighting Model (right)
3.2 Input and Output of a Shader Pass
3.3 Example with two Shader Passes
3.4 Result of the first Shader Pass Render Normal Image
3.5 Result of the first (left) and the second Shader Pass Blur Image (right)
4.1 Rendering Comparison of a Torus without and with Color Quantization
4.2 Example of Tonal Art Map with 6 Textures
4.3 Rendering of three Models with different Tonal Art Maps [PHWF01]
4.4 Rendering Results of three Models in three Perspectives
4.5 Data Flow Diagram of the Shader Passes for Silhouette Drawing
4.6 Silhouette Drawing Results

Listings

3.1 Flexible Access of Vector Components
3.2 Vector Addition
3.3 Multiplication of a Vector by a Factor
3.4 Affine Transformation of a Vector
3.5 Example Vertex Shader
3.6 Example Fragment Shader
3.7 Vertex Shader of Shader Pass Render Normal Image
3.8 Fragment Shader of Shader Pass Render Normal Image
3.9 Vertex Shader of Shader Pass Blur Image
3.10 Fragment Shader of Shader Pass Blur Image
4.1 Function for Color Quantization in a Fragment Shader
4.2 Vertex Shader of Render Hatched Model
4.3 Fragment Shader of Render Hatched Model
4.4 Vertex Shader of Render Depth
4.5 Fragment Shader of Render Depth
4.6 Fragment Shader of Render Depth Edge
4.7 Vertex Shader of Render Normal
4.8 Fragment Shader of Render Normal
4.9 Fragment Shader of Render Normal Edge
4.10 Fragment Shader of Render Silhouette
4.11 Fragment Shader of Combine