COMS 4160: Problems on Transformations and OpenGL

Ravi Ramamoorthi

1. Write the homogeneous 4x4 matrices for the following transforms:
   - Translate by +5 units in the X direction
   - Rotate by 30 degrees about the X axis
   - The rotation, followed by the translation above, followed by scaling by a factor of 2

2. In 3D, consider applying a rotation R followed by a translation T. Write the form of the combined transformation in homogeneous coordinates (i.e., supply a 4x4 matrix in terms of the elements of R and T). Now construct the inverse transformation, giving the corresponding 4x4 matrix in terms of R and T. You should simplify your answer (perhaps writing T as [Tx Ty Tz], and using appropriate notation for the elements of the rotation matrix, or appropriate matrix and vector notation for R and T). Verify by matrix multiplication that the inverse times the original transform does in fact give the identity.

3. Derive the homogeneous 4x4 matrices for gluLookAt and gluPerspective.

4. Assume that in OpenGL your near and far clipping planes are set at a distance of 1m and 100m respectively. Further assume your z-buffer has 9 bits of depth resolution. This means that after the gluPerspective transformation, the remapped z values [ranging from -1 to +1] are quantized into 512 discrete depths. How far apart are these discrete depth levels close to the near clipping plane? More concretely, what is the z range (i.e., 1m to what distance) of the first discrete depth? Now consider the case where all the interesting geometry lies further than 10m away. How far apart are the discrete depth levels at 10m? Compare your answer to the first part and explain the cause of this difference. How many discrete depth levels describe the region between 10m and 100m? What is the number of bits required for this number of depth levels? How many bits of precision have been lost? What would you recommend doing to increase precision?

5. Consider the following operations in the OpenGL pipeline: Scan conversion or Rasterization, Projection Matrix, Transformation of Points and Normals by the ModelView Matrix, Dehomogenization (perspective division), Clipping, Lighting calculations. Briefly explain what each of these operations is, in what order they are performed, and why.

6. History: What is the brief history of OpenGL? OpenGL is an interface to graphics hardware. Dedicated graphics hardware is itself a fairly new concept that is one of the cornerstones of the development of computer graphics. Which SIGGRAPH Computer Graphics Achievement Awards have been given for the development of graphics hardware?

Answers

1. Homogeneous Matrices

A general representation for a 4x4 matrix combining a 3x3 rotation R and a translation vector T is

    M = \begin{pmatrix} R & T \\ 0 & 1 \end{pmatrix}.    (1)

For the translation along the x axis by +5 units, T = (5, 0, 0)^T while R is the identity. Hence we have

    M_translate = \begin{pmatrix} 1 & 0 & 0 & 5 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}.

In the second case, where we are rotating about the X axis, the translation is just 0. We need to remember the formula for rotation about the x axis by an angle \theta; with \theta = 30 degrees, \cos\theta = \sqrt{3}/2 and \sin\theta = 1/2, so

    M_rotate = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta & 0 \\ 0 & \sin\theta & \cos\theta & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}
             = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & \sqrt{3}/2 & -1/2 & 0 \\ 0 & 1/2 & \sqrt{3}/2 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}.

Finally, when we combine these transformations as S T R, we apply the rotation first, followed by the translation. It is easy to verify by matrix multiplication that T R simply has the same form as equation (1) (but see the next problem for more on composing rotations and translations). The scale then multiplies everything by a factor of 2, giving

    S T R = \begin{pmatrix} 2 & 0 & 0 & 10 \\ 0 & \sqrt{3} & -1 & 0 \\ 0 & 1 & \sqrt{3} & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}.

It is also possible to obtain this result directly by matrix multiplication of S T R:

    \begin{pmatrix} 2 & 0 & 0 & 0 \\ 0 & 2 & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}
    \begin{pmatrix} 1 & 0 & 0 & 5 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}
    \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & \sqrt{3}/2 & -1/2 & 0 \\ 0 & 1/2 & \sqrt{3}/2 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}.

2. Rotations and Translations

Having a rotation R followed by a translation T is simply the product T R, which has the same form as equation (1). The inverse transform is more interesting. Essentially, we must undo the translation first and then the rotation, x = R^{-1}(x' - T) = R^T x' - R^T T, which in homogeneous coordinates is

    M^{-1} = \begin{pmatrix} R^T & -R^T T \\ 0 & 1 \end{pmatrix}.    (2)

Note that this has the same form as equation (1), using the rotation R^T and the translation -R^T T. Finally, we may verify that the product of the inverse and the original does in fact give the identity:

    M^{-1} M = \begin{pmatrix} R^T & -R^T T \\ 0 & 1 \end{pmatrix} \begin{pmatrix} R & T \\ 0 & 1 \end{pmatrix}
             = \begin{pmatrix} R^T R & R^T T - R^T T \\ 0 & 1 \end{pmatrix}
             = \begin{pmatrix} I & 0 \\ 0 & 1 \end{pmatrix}.
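Both answers are easy to check numerically. The following is a minimal C sketch (my own illustration, not part of the original assignment; the Mat4 type, the helper functions, and the row-major layout are choices made here, not anything from OpenGL) that builds the matrices of problem 1 and verifies the inverse of problem 2.

    /* Minimal sketch for problems 1 and 2: homogeneous 4x4 transforms.
     * Matrices are stored row-major as mat[row][col]; all names are illustrative. */
    #include <math.h>
    #include <stdio.h>
    #include <string.h>

    typedef double Mat4[4][4];

    static void identity(Mat4 m) {
        memset(m, 0, sizeof(Mat4));
        for (int i = 0; i < 4; i++) m[i][i] = 1.0;
    }

    /* out = a * b (aliasing-safe). */
    static void multiply(Mat4 out, Mat4 a, Mat4 b) {
        Mat4 tmp;
        for (int r = 0; r < 4; r++)
            for (int c = 0; c < 4; c++) {
                tmp[r][c] = 0.0;
                for (int k = 0; k < 4; k++) tmp[r][c] += a[r][k] * b[k][c];
            }
        memcpy(out, tmp, sizeof(Mat4));
    }

    int main(void) {
        const double th = 30.0 * 3.14159265358979323846 / 180.0;

        Mat4 T, R, S, TR, STR, Rinv, Tinv, Minv, I;
        identity(T); T[0][3] = 5.0;                       /* translate by +5 in x */
        identity(R);                                      /* rotate 30 degrees about x */
        R[1][1] = cos(th); R[1][2] = -sin(th);
        R[2][1] = sin(th); R[2][2] =  cos(th);
        identity(S); S[0][0] = S[1][1] = S[2][2] = 2.0;   /* uniform scale by 2 */

        multiply(TR, T, R);                               /* rotation first, then translation */
        multiply(STR, S, TR);                             /* then the scale: S T R */
        printf("STR translation in x: %g (expect 10)\n", STR[0][3]);

        /* Problem 2: the inverse of M = T R is [R^T | -R^T T] = R^T * Translate(-T). */
        identity(Rinv);
        for (int r = 0; r < 3; r++)
            for (int c = 0; c < 3; c++) Rinv[r][c] = R[c][r];   /* transpose of R */
        identity(Tinv); Tinv[0][3] = -5.0;
        multiply(Minv, Rinv, Tinv);

        multiply(I, Minv, TR);                            /* should give the identity */
        printf("Minv * M diagonal: %g %g %g %g (expect 1 1 1 1)\n",
               I[0][0], I[1][1], I[2][2], I[3][3]);
        printf("Minv * M translation: %g %g %g (expect 0 0 0)\n",
               I[0][3], I[1][3], I[2][3]);
        return 0;
    }

Compiling and running this should print 10 for the translated x component of S T R and a diagonal of 1 1 1 1 (with translation essentially 0) for the inverse check.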

3. gluLookAt and gluPerspective

gluLookAt defines the viewing transformation and is given by gluLookAt(eyex, eyey, eyez, centerx, centery, centerz, upx, upy, upz), corresponding to a camera at eye looking at center with up direction up. First we define the normalized viewing direction; the symbols used here are chosen to correspond to the definitions in the man page:

    f = (center - eye) / |center - eye|.

This direction will correspond to the -z direction, since the eye is mapped to the origin and the lookat point, or center, to the negative z axis. What remains now is to define the x and y directions. The y direction corresponds to the up vector. First we define UP' = up / |up| to normalize. However, this may not be perpendicular to the viewing direction, so we use vector cross products to define the x direction and a corrected y direction. In our notation this defines the auxiliary vectors

    s = f x UP',    u = s x f.

Note that this requires the up vector not to be parallel to the view direction. We now have a set of directions (s, u, -f) for the camera's x, y and z axes. We can therefore define a rotation matrix

    M = \begin{pmatrix} s_x & s_y & s_z & 0 \\ u_x & u_y & u_z & 0 \\ -f_x & -f_y & -f_z & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}    (3)

that rotates a point into the new coordinate frame. However, gluLookAt requires applying this rotation about the eye position, not the origin. It is equivalent to

    glMultMatrixf(M);
    glTranslated(-eyex, -eyey, -eyez);

This corresponds to a translation followed by a rotation. We know (using equation (1) and problem 2 as a guideline, for instance) that this is the same as the rotation followed by a modified translation. Written out in full, the matrix will then be

    gluLookAt = \begin{pmatrix} s_x & s_y & s_z & -s \cdot eye \\ u_x & u_y & u_z & -u \cdot eye \\ -f_x & -f_y & -f_z & f \cdot eye \\ 0 & 0 & 0 & 1 \end{pmatrix}.    (4)

gluPerspective defines a perspective transformation used to map 3D objects to the 2D screen and is given by gluPerspective(fovy, aspect, znear, zfar), where fovy specifies the field of view angle, in degrees, in the y direction, and aspect specifies the aspect ratio that determines the field of view in the x direction; the aspect ratio is the ratio of x (width) to y (height). znear and zfar represent the distances from the viewer to the near and far clipping planes and must always be positive.

First we define f = cot(fovy/2) = 1/tan(fovy/2), corresponding to the focal length or focal distance (this f, following the man page, is unrelated to the viewing direction f used for gluLookAt). A point on the top edge of the field of view satisfies y/(-z) = tan(fovy/2) in eye coordinates and should map to +1 after the perspective divide; this means we must multiply y by f, and correspondingly x by f/aspect. The matrix has the form

    gluPerspective = \begin{pmatrix} f/aspect & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & A & B \\ 0 & 0 & -1 & 0 \end{pmatrix}    (5)

for constants A and B to be determined.

We now explain the form of the above matrix. The terms f/aspect and f have already been explained. The -1 in the last row is needed to divide by the distance, as required for perspective, and the negative sign is because OpenGL conventions require us to look down the -z axis. It remains to find A and B. They are chosen so that the near and far clipping planes are taken to -1 and +1 respectively; indeed, the entire viewing volume or frustum is mapped to a cube between -1 and +1 along all axes. Using the matrix, we can easily formulate that the remapped depth is given by

    z' = (A z + B) / (-z) = -A - B/z,    (6)

where one must remember that points in front of the viewer have negative z, as per OpenGL conventions. Now the required conditions are z' = -1 at z = -znear and z' = +1 at z = -zfar. Solving this system gives

    A = (zfar + znear) / (znear - zfar),    B = 2 zfar znear / (znear - zfar),    (7)

and the final matrix

    gluPerspective = \begin{pmatrix} f/aspect & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & \frac{zfar + znear}{znear - zfar} & \frac{2\, zfar\, znear}{znear - zfar} \\ 0 & 0 & -1 & 0 \end{pmatrix}.    (8)

4. Z-buffer in OpenGL

The purpose of this question is to show how depth resolution degrades as one moves further away from the near clipping plane, since the remapped depth is nonlinear (in fact reciprocal) in the original depth values. Equation (6) gives us the formula for remapping; we just need A and B, which we obtain by plugging znear = 1 and zfar = 100 into equation (7): A ≈ -1.02 and B ≈ -2.02. Writing d = -z for the (positive) distance of a point in front of the viewer, the remapped value is then given by

    z' ≈ 1.02 - 2.02/d.    (9)

Note that for mathematical simplicity you might imagine the far plane at infinity, in which case the extra 0.02 terms disappear and z' = 1 - 2/d.

For the remaining parts of the question it is probably simplest to use differential techniques. We can obtain

    dz'/dd = 2.02/d^2 ≈ 2/d^2.    (10)

One depth bucket corresponds to Δz' = 2/512 = 1/256. Using the equation above with d = 1, we get Δd ≈ (1/256)/2 ≈ 0.002. In other words, the first depth bucket ranges from a depth of 1m to a depth of approximately 1.002m, and near the near plane we can resolve depths about 2mm apart.

Now consider d = 10 and plug in above. We have dz'/dd ≈ 2/100 = 0.02, so Δd ≈ (1/256)/0.02 ≈ 0.2, and the depth buckets around 10m come in roughly 0.2m increments. We lose resolving power quadratically with distance, with the danger that many different objects may fall into the same depth bucket. This brings us to the fundamental point of this problem: depth levels are packed closely near the near plane, and depth resolution decreases far away.

Finally, we consider the depth levels between 10m and 100m. Using equation (9), d = 10 maps to z' ≈ 0.82, while d = 100 maps to z' = 1. Thus only a range of about 0.18 (out of the total range of 2) remains, giving only about 0.18 x 256 ≈ 46 depth buckets, i.e. less than 6 bits of precision. We have lost more than 3 bits of precision. To increase precision, we should move the near clipping plane further out if all the interesting geometry lies in the range 10m to 100m.
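The arithmetic in this answer is compact enough to reproduce directly. Below is a small C sketch (my own illustration, not part of the original solution; znear, zfar, and the 512-level assumption are hard-coded from the problem statement) that evaluates the remapped depth of equation (9) and the bucket widths derived above.

    /* Sketch: depth quantization for gluPerspective with znear = 1m, zfar = 100m,
     * and a 9-bit (512-level) depth buffer. Illustrative only. */
    #include <stdio.h>

    static const double n = 1.0, f = 100.0;          /* znear, zfar */

    /* Remapped depth z' in [-1, +1] for a point at distance d in front of the eye. */
    static double remap(double d) {
        double A = (f + n) / (n - f);                /* third-row entries, equation (7) */
        double B = 2.0 * f * n / (n - f);
        return -A + B / d;                           /* eye-space z = -d, so z' = -A - B/z */
    }

    int main(void) {
        const int levels = 512;                      /* 9 bits */
        const double bucket = 2.0 / levels;          /* one bucket in remapped z */
        const double B = 2.0 * f * n / (n - f);

        /* Analytic bucket width in world depth, using dz'/dd = -B / d^2. */
        printf("bucket width at 1m  : %.4f m\n", bucket / (-B / (1.0 * 1.0)));    /* ~0.002 */
        printf("bucket width at 10m : %.4f m\n", bucket / (-B / (10.0 * 10.0)));  /* ~0.2   */

        /* Depth buckets covering the range 10m..100m. */
        printf("z'(10m) = %.3f, z'(100m) = %.3f\n", remap(10.0), remap(100.0));
        printf("buckets between 10m and 100m: %.1f (about 46, i.e. under 6 bits)\n",
               (remap(100.0) - remap(10.0)) / bucket);
        return 0;
    }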

5. Order of OpenGL Operations

The operations are performed in the following order. You might wish to peruse Appendix A of the OpenGL manual. (A minimal sketch of this per-vertex ordering in code is given at the end of this document.)

1. Modelview Matrix: Each vertex's spatial coordinates x are transformed by the modelview matrix M as x' = M x. Simultaneously, normals are transformed by the inverse transpose M^{-T} and renormalized if so specified.

2. Lighting Calculations: The lighting calculations are performed next, per vertex. They operate on the transformed coordinates, so they must occur after the modelview transform. They are performed before projection because lighting isn't affected by the projection matrix; remember that lighting is therefore performed in eye coordinates. Lighting must also be performed before clipping, because vertices that are clipped could still influence shading inside the region of interest (such as a triangle one of whose vertices is clipped). A user-defined culling algorithm can help avoid unnecessary computations.

3. Projection Matrix: The projection matrix is then applied to project objects onto the image plane.

4. Clipping: Clipping is then done in homogeneous coordinates against the standard viewing planes. Clipping is done before dehomogenization to avoid the perspective divide for vertices that are clipped anyway.

5. Dehomogenization or Perspective Divide: The perspective divide by the fourth, or homogeneous, coordinate then occurs.

6. Scan Conversion or Rasterization: Finally, the primitive is converted to fragments by scan conversion or rasterization.

6. History

OpenGL is a successor to IRIS GL (GL = Graphics Library), originally introduced by SGI as a hardware-independent API for SGI's graphics workstations. OpenGL is not backward-compatible with IRIS GL and removes many unnecessary problems (in particular, namespace issues). To date, 3 SIGGRAPH Computer Graphics Achievement Awards have been given for graphics hardware development. The 1984 award was given to Jim Clark, founder of SGI (and later Netscape), formerly an associate professor at Stanford University; the key technical publication is "The Geometry Engine: A VLSI Geometry System for Graphics," SIGGRAPH 82, pp. 127-133. Clark's former student Kurt Akeley, then CTO of SGI and responsible for the specification of extensions to the OpenGL API, won the 1995 Computer Graphics Achievement Award for his contributions to graphics hardware, including the RealityEngine ("RealityEngine Graphics," SIGGRAPH 93, pp. 109-116). Most recently, Dave Kirk, chief scientist of NVIDIA, was awarded the 2002 SIGGRAPH Computer Graphics Achievement Award for his key technical role in bringing high-performance computer graphics systems to the mass market, including the RIVA 128, RIVA TNT, GeForce, GeForce2, GeForce3, and GeForce4.
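As mentioned in the answer to problem 5, here is a minimal sketch of that per-vertex ordering. It is illustrative C only: there are no real OpenGL calls, lighting is reduced to a comment, and clipping is simplified to a per-vertex test (real clipping trims primitives); the point is simply the relative order of the stages.

    /* Sketch of the per-vertex ordering from answer 5 (illustrative, not real OpenGL). */
    #include <stdio.h>

    typedef struct { double x, y, z, w; } Vec4;

    /* v' = M * v for a row-major 4x4 matrix M. */
    static Vec4 xform(double M[4][4], Vec4 v) {
        Vec4 o;
        o.x = M[0][0]*v.x + M[0][1]*v.y + M[0][2]*v.z + M[0][3]*v.w;
        o.y = M[1][0]*v.x + M[1][1]*v.y + M[1][2]*v.z + M[1][3]*v.w;
        o.z = M[2][0]*v.x + M[2][1]*v.y + M[2][2]*v.z + M[2][3]*v.w;
        o.w = M[3][0]*v.x + M[3][1]*v.y + M[3][2]*v.z + M[3][3]*v.w;
        return o;
    }

    static void process_vertex(double modelview[4][4], double projection[4][4], Vec4 v) {
        /* 1. Modelview transform; normals would use the inverse transpose instead. */
        Vec4 eye = xform(modelview, v);

        /* 2. Lighting: evaluated per vertex here, in eye coordinates (placeholder). */
        /*    color = shade(eye, eye_space_normal, lights);                          */

        /* 3. Projection into clip space. */
        Vec4 clip = xform(projection, eye);

        /* 4. Clipping, still in homogeneous coordinates, against -w <= x, y, z <= w.
         *    (Discarding the vertex here is a simplification of true primitive clipping.) */
        if (clip.x < -clip.w || clip.x > clip.w ||
            clip.y < -clip.w || clip.y > clip.w ||
            clip.z < -clip.w || clip.z > clip.w)
            return;

        /* 5. Dehomogenization: the perspective divide by w. */
        Vec4 ndc = { clip.x / clip.w, clip.y / clip.w, clip.z / clip.w, 1.0 };

        /* 6. Scan conversion / rasterization of the assembled primitive would happen here. */
        printf("NDC vertex: %g %g %g\n", ndc.x, ndc.y, ndc.z);
    }

    int main(void) {
        double identity[4][4] = { {1,0,0,0}, {0,1,0,0}, {0,0,1,0}, {0,0,0,1} };
        Vec4 p = { 0.5, 0.5, -0.5, 1.0 };
        process_vertex(identity, identity, p);   /* trivial matrices, just to exercise the order */
        return 0;
    }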