Homework 3: Programmable Shaders

Introduction to Computer Graphics and Imaging (Summer 2012), Stanford University
Due Monday, July 23, 11:59pm

Warning: The coding portion of this homework involves features of OpenGL that are much newer than the ones we have used in previous assignments, so there is some chance they won't work on your machine. Even if you don't start coding the assignment early, do try compiling and running the skeleton code ASAP to make sure everything is functional. If there is an issue with your hardware, you may need to code this assignment in person on the Myth machines (forwarding may also fail).

In the written portion of this problem set, we'll explore properties of and computations involving barycentric coordinates, introduced in Lecture 4. These coordinates are used to interpolate normals, textures, depths, and other values across triangle faces, so it is important to understand how they behave.

To begin with, let's say we have a triangle with vertices a, b, and c in R^2. We'll define the signed area of triangle abc as follows:

$$A(abc) = \frac{1}{2}\begin{vmatrix} b_x - a_x & c_x - a_x \\ b_y - a_y & c_y - a_y \end{vmatrix}$$

Problem 1 (10 points).
(a) Use cross products to show that the absolute value of A(abc) is indeed the area of the triangle that we expect.
(b) Show that A(abc) is positive when a, b, and c are in counterclockwise order and negative otherwise.
Drawing pictures to demonstrate the solution for part (b) is good enough: no need for a formal proof.

Now, let's take a point p in triangle abc. Assume that a, b, and c are given in counterclockwise order. Then, we can apply the formula above to yield straightforward expressions for barycentric coordinates.

Problem 2 (10 points). Define the barycentric coordinates of p using the signed area formula above. Show that your coordinates always sum to 1 and that they are all positive exactly when p is inside the triangle abc.

Our definition of barycentric coordinates in terms of areas seems somewhat arbitrary! After all, we could take any three numbers as functions of vertex positions and normalize them to sum to 1. Thankfully, we won't make you prove it, but the following relationship holds uniquely:

$$p = \alpha_a a + \alpha_b b + \alpha_c c$$

where p has barycentric coordinates (α_a, α_b, α_c) summing to 1.
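As a quick sanity check on the signed-area formula above (this is illustration only, not a solution to Problems 1 or 2), here is how it might look as a small GLSL helper. The function is hypothetical and not part of the assignment skeleton.

```glsl
// Signed area of triangle abc, following the 2x2 determinant above.
// Positive when a, b, c are in counterclockwise order, negative otherwise.
float signedArea(vec2 a, vec2 b, vec2 c)
{
    return 0.5 * ((b.x - a.x) * (c.y - a.y) - (c.x - a.x) * (b.y - a.y));
}

// Example: signedArea(vec2(0,0), vec2(1,0), vec2(0,1)) == 0.5 (CCW order),
// while swapping b and c gives -0.5 (CW order).
```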

Problem 3 (10 points). Use this formula to suggest a 2x2 linear system that will yield two of the three barycentric coordinates of p. Provide a formula for the third. Hint: Do the second part of the problem first. Then plug the third coordinate into our expression for p and refactor into a linear system for α_a and α_b.

If you've done everything correctly so far, you should find that applying the standard formula for the inverse of a 2x2 matrix to your solution to Problem 3 yields your signed area formula from Problem 2. You can check this on scrap paper if you desire.

Now, let's say we want to interpolate a function f smoothly given its values at a, b, and c. We will label these values f_a, f_b, and f_c, respectively, and will use f(p) to denote the interpolated function with values everywhere on R^2 given by

$$f(p) = \alpha_a f_a + \alpha_b f_b + \alpha_c f_c$$

where the barycentric coordinates of p are (α_a, α_b, α_c). It's easy to see that f(a) = f_a, f(b) = f_b, and f(c) = f_c (make sure you understand why!). We also have one final important property:

Problem 4 (10 points). Take p_1, p_2 ∈ R^2. Show that f(t p_1 + (1 - t) p_2) = t f(p_1) + (1 - t) f(p_2) for any t ∈ R. Comprehensive hint: Use barycentric coordinates to write

$$p_1 = \alpha_a^1 a + \alpha_b^1 b + \alpha_c^1 c, \qquad p_2 = \alpha_a^2 a + \alpha_b^2 b + \alpha_c^2 c$$

Now, define p = t p_1 + (1 - t) p_2. Plug our barycentric expressions for p_1 and p_2 into this definition and refactor to read off the barycentric coordinates of p from the coefficients of a, b, and c. Use these barycentric coordinates to find f(p) as it is defined above the problem, and refactor once again to get the final result.

This final result actually shows us that barycentric coordinates are the right way to interpolate depths for the depth buffer (ignoring the nonlinearity we discussed in Lecture 4): if we restrict f to a line through p_1 and p_2, it looks linear. By the way, just as with Bresenham's algorithm, there are incremental tricks for computing barycentric coordinates inside triangles given their values on the edges. Modern GPUs make use of fragment-level parallelization, however, so such tricks might not be useful.

In the coding half of this assignment (adapted from a nearly identical assignment used in Pat Hanrahan's CS 148 course in Fall 2011), you will get some practice using GLSL to write shaders, which implicitly make use of barycentric coordinates and other tools to modify fragments, vertex positions, and other data. Lecture 6 covers the basics of GLSL as well as the philosophy behind writing shaders. Recall that the two most basic types of shaders in OpenGL are:

Vertex shaders: These shaders modify the vertices of a triangulated piece of geometry. The default OpenGL vertex shader multiplies vertex coordinates by the model-view and projection matrices and applies other aspects of the fixed-function pipeline, but we can write a shader of our own to modify shape.

Fragment shaders: These shaders let us modify the method used to compute pixel colors, potentially employing textures and other external data.
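To make the two roles concrete, here is a minimal pass-through pair in the legacy GLSL style this assignment uses. These are sketches, not the skeleton's actual default shaders; the built-ins they use are summarized in the list below.

```glsl
// minimal.vert -- reproduces the core fixed-function vertex transform:
// multiply the incoming vertex by the model-view and projection matrices.
void main()
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
```

```glsl
// minimal.frag -- colors every fragment a constant orange.
void main()
{
    gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0);
}
```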

OpenGL also allows for geometry shaders, but we won't make use of them in CS 148.

Some useful facts beyond what was covered in lecture to get you started writing shaders:

- The OpenGL Orange Book documents the details of GLSL and can help you look up advanced features and particular functionality. There is also some information in the Red Book, in the course textbook, and in online tutorials.
- The entry point of a GLSL shader is the function main(), as in C. A vertex shader's main job is to compute gl_Position, the position of the vertex, and a fragment shader's main job is to compute gl_FragColor, the color of the fragment.
- GLSL has all the simple C data types. It also has built-in vector types vec2, vec3, and vec4 (with constructors like vec2(x, y)). You can access fields of these vectors as v.x/v.y/v.z/v.w or as v.r/v.g/v.b/v.a, and you can extract the RGB part of an RGBA color by swizzling: c.rgb. Multiplying vectors does so element-wise.
- There are many useful helper functions, including dot(), cross(), length(), normalize(), and reflect().

The assignment this time comes with a makefile that should work on most systems, although it will be evaluated on the Myth machines. It makes use of the OpenGL Extension Wrangler Library (GLEW) to deal with shaders. Visual Studio and Xcode users shouldn't need any extra work to use this library. On Linux, you may need to install libglew-dev or download GLEW online (some systems may also need the GLEW linker flag in the makefile changed, e.g. -lGLEW versus -lglew).

Make sure you can compile and run the assignment ASAP, as GLSL support can be shaky from one computer to another. To do so, first compile the included libst library and then ProgrammableShading. You actually won't need to recompile these if you only change the shader files; shaders get compiled separately at runtime. To run the program, you will use the incantation:

./programmableshading vertshader fragshader lightprobeimg normalmapimg

Here, vertshader and fragshader are the shader files, and lightprobeimg and normalmapimg are images that we will use in later parts of this assignment. For example, to run the default skeleton shaders, type:

./programmableshading kernels/default.vert kernels/phong.frag images/lightprobe.jpg images/normalmap.png

Navigation in this simple viewer uses the left mouse button to rotate and the right mouse button to zoom. Pressing r resets the camera position, t toggles between displaying the plane and displaying the Utah teapot, and the arrow keys move the camera horizontally and vertically. Press s to take a screenshot and save it to screenshot.jpg.

Your first shader in GLSL will be a fragment shader implementing the Phong reflection model. Recall that Phong reflection separates specular, diffuse, and ambient reflection, yielding a final intensity of

$$I = M_a I_a + (L_m \cdot N)\, M_d I_d + (R_m \cdot V)^s\, M_s I_s$$

where:

- I_a, I_d, and I_s are the ambient, diffuse, and specular illuminations (we'll specify them as three-vectors of floats between 0 and 1, with one component for each color channel);
- M_a, M_d, and M_s are the material's ambient, diffuse, and specular colors (we'll take M_a = M_d);
- L_m is a unit vector from the surface point to the light source;
- N is the unit surface normal;
- R_m is the reflection of a vector pointing from the surface to the light source over the normal N;
- V is a unit vector pointing from the surface to the viewer;
- s controls the shininess of the material.

Note that multiplication like M_a I_a happens component-by-component. Also note that the dot products could be negative, so you'll have to clamp them to be at least 0.

For fun, the skeleton code already implements normal mapping, which reads normal vectors from a texture to yield more interesting effects on the plane. Figure 1 shows this effect applied to the plane using images/normalmap.png, as well as the teapot with Phong shading but no normal map.

[Figure 1: For Problem 5: (a) a normal-mapped plane; (b) a teapot shaded using the Phong model.]

Problem 5 (30 points). Modify kernels/phong.frag to implement this shading model.
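To help you get started, here is a minimal sketch of the core arithmetic. The uniform and varying names below are hypothetical, not the skeleton's actual ones, and the sketch omits the normal-mapping logic already present in kernels/phong.frag.

```glsl
// Hypothetical Phong fragment shader sketch -- adapt names to the skeleton.
uniform vec3 Ia, Id, Is;        // ambient/diffuse/specular illuminations
uniform vec3 Ma, Md, Ms;        // material colors (Ma == Md in this assignment)
uniform float shininess;        // the exponent s
uniform vec3 lightPos, eyePos;  // light and viewer positions

varying vec3 surfacePos;        // interpolated surface position
varying vec3 surfaceNormal;     // interpolated surface normal

void main()
{
    vec3 N = normalize(surfaceNormal);
    vec3 L = normalize(lightPos - surfacePos);   // surface -> light
    vec3 V = normalize(eyePos - surfacePos);     // surface -> viewer
    vec3 R = reflect(-L, N);                     // L reflected about N

    float diff = max(dot(L, N), 0.0);            // clamp negative dot products
    float spec = pow(max(dot(R, V), 0.0), shininess);

    // vec3 * vec3 multiplies component-by-component, as the model requires.
    vec3 I = Ma * Ia + diff * (Md * Id) + spec * (Ms * Is);
    gl_FragColor = vec4(I, 1.0);
}
```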

Figure 2: For problem 6. Since you just modified the geometry, however, you ll need to do one more task. The normal to the plane used to be the constant (0, 1, 0), but now our surface contains ripples and must be shaded as such! Thus, you must also: 3. Compute a new unit normal, using either a closed form or finite differencing (look it up or ask during office hours!). Figure 2 shows proper output for one potential ripple function. Problem 6 (30 points). Modify kernels/displacement.vert to implement your height field displacement shader. Be sure to output correct normals. Optional Extra Credit Shiny materials like metal reflect the surrounding environment. This is because these materials admit predominantly specular lighting. In class and in a future assignment, we ll talk about ray tracing as the proper solution to this problem, but it can be too slow for real-time applications. Instead, we d like to compromise, using OpenGL s fast rendering to obtain a plausible reflectional rendering. We will use a technique called environment mapping (also known as reflection mapping) to achieve these effects. The trick is to use a light probe image encoding the color and intensity of the light coming in from all directions. There are several ways to encode and decode such light information, but in this assignment we will use spherical mapping from a mirror ball image. Note that our math below assumes the light probe was photographed using an orthographic projection, an obviously nonphysical assumption but one that simplifies the math considerably. Let s say we take a photo of a chrome ball in our scene, as in Figure 4(a). Since our ball only reflects specularly, the color we see at each surface point must come from a distinct light direction, illustrated in Figure 3(a). In the diagram, for each point on the surface, the direction of the light source that is illuminating that point from our camera perspective is shown with a green arrow. Note that on the top and bottom of the sphere, the light we see from the eye position is coming from directly behind the ball. In fact, as suggested in Figure 3(b), photographing half the sphere tells us about light in all directions! To extract lighting information from the light probe, we will assume that the direction and magnitude of the vector from the camera to the light probe is the same as the vector from the direction of our OpenGL scene camera to the point on the surface we are shading. 5

Optional Extra Credit

Shiny materials like metal reflect the surrounding environment, because these materials exhibit predominantly specular reflection. In class and in a future assignment, we'll talk about ray tracing as the proper solution to this problem, but it can be too slow for real-time applications. Instead, we'd like to compromise, using OpenGL's fast rendering to obtain a plausible rendering of the reflections. We will use a technique called environment mapping (also known as reflection mapping) to achieve these effects.

The trick is to use a light probe image encoding the color and intensity of the light coming in from all directions. There are several ways to encode and decode such light information, but in this assignment we will use spherical mapping from a mirror-ball image. Note that our math below assumes the light probe was photographed using an orthographic projection, an obviously nonphysical assumption but one that simplifies the math considerably.

Let's say we take a photo of a chrome ball in our scene, as in Figure 4(a). Since our ball only reflects specularly, the color we see at each surface point must come from a distinct light direction, illustrated in Figure 3(a). In that diagram, for each point on the surface, the direction of the light source illuminating that point from our camera's perspective is shown with a green arrow. Note that at the top and bottom of the sphere, the light we see from the eye position is coming from directly behind the ball. In fact, as suggested in Figure 3(b), photographing half the sphere tells us about light in all directions! To extract lighting information from the light probe, we will assume that the direction and magnitude of the vector from the camera to the light probe are the same as those of the vector from our OpenGL scene camera to the point on the surface we are shading.

[Figure 3: Images to explain the algorithm for Problem 7.]
[Figure 4: For Problem 7.]

Recall that the source of the specular light reaching the viewer at V is the reflection of V across the normal vector N. We're assuming our probe and surface have approximately the same V, so we only need to find the color of the point on the sphere that shares the same N!

Let's say the center of the sphere is (0, 0, 0). Then, the point on the sphere with normal N is P = (0, 0, 0) + N (an interesting property of the unit sphere is that position and normal vectors are simply the same, as shown in Figure 3(c)). We're assuming an orthographic projection of the unit sphere, shown in Figure 3(d), so to find the right lookup on the probe image we simply remove one of the three coordinates! Finally, we map x, y ∈ [-1, 1] to x', y' ∈ [0, 1] in image texture coordinates. Figure 4(b) shows a sample of the final result.

Problem 7 (15 points extra credit). Modify kernels/lightprobe.frag to implement the strategy for environment mapping described above. It should look good on the teapot; the normal-mapped plane will look messy since its normals change with such high frequency.
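For reference, the lookup described above might be sketched as follows. The sampler and varying names are hypothetical, and we assume the normal has already been brought into the same (eye-space) frame as the probe camera; the skeleton's kernels/lightprobe.frag will organize this differently.

```glsl
// Hypothetical sphere-map lookup sketch for the strategy described above.
uniform sampler2D probe;        // the light probe (mirror-ball) image

varying vec3 surfaceNormal;     // normal at the shaded point, in eye space

void main()
{
    vec3 N = normalize(surfaceNormal);

    // Under the orthographic mirror-ball assumption, the probe point with
    // the same normal projects to (N.x, N.y): drop the coordinate along the
    // view axis, then remap x, y in [-1, 1] to [0, 1] texture coordinates.
    vec2 uv = 0.5 * N.xy + 0.5;
    gl_FragColor = texture2D(probe, uv);
}
```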