Perspective transformations


Transformation pipeline
- Modelview: model (position objects) + view (position the camera)
- Projection: map the viewing volume to a standard cube
- Perspective division: project 3D to 2D
- Viewport: map the square [-1,1]x[-1,1] in normalized device coordinates to the screen
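A minimal legacy-OpenGL sketch of these stages in the order they are usually set up; the window size, field of view, and camera parameters are illustrative assumptions.

```c
#include <GL/gl.h>
#include <GL/glu.h>

void setup_pipeline(int width, int height)
{
    /* Viewport: map normalized device coordinates to the window. */
    glViewport(0, 0, width, height);

    /* Projection: map the viewing frustum to the standard cube. */
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(60.0, (double)width / height, 0.1, 100.0);

    /* Modelview: viewing transform first, then modeling transforms. */
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gluLookAt(0.0, 0.0, 5.0,    /* eye position     */
              0.0, 0.0, 0.0,    /* point to look at */
              0.0, 1.0, 0.0);   /* up vector        */
    /* ... modeling transformations and drawing go here ... */
}
```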

Coordinate systems
[Figure: world coordinate axes and eye coordinate axes]
- World coordinates - fixed initial coordinate system; everything is defined with respect to it
- Eye coordinates - coordinate system attached to the camera; in this system the camera looks down the negative z-axis

Positioning the camera
- Modeling transformation: reshape the object, orient it, and position it with respect to the world coordinate system
- Viewing transformation: transform world coordinates to eye coordinates
- The viewing transformation is the inverse of the camera positioning transformation
- The viewing transformation should be rigid: rotation + translation
- Steps to get the right transform: first orient the camera correctly, then translate it

Positioning the camera
The viewing transformation is the inverse of the camera positioning transformation.
[Figure: world axes (x_world, z_world) and eye axes (x_eye, z_eye)]
Example: camera positioning = translation by (t_x, t_z)
Viewing transformation (world to eye):
x_eye = x_world - t_x
z_eye = z_world - t_z
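A small worked sketch of this example in C, assuming the camera is positioned purely by a translation in the xz-plane; the struct and function names are illustrative.

```c
typedef struct { double x, z; } Point2;

/* Camera positioning: world = eye + (tx, tz). */
Point2 eye_to_world(Point2 p, double tx, double tz)
{
    Point2 w = { p.x + tx, p.z + tz };
    return w;
}

/* Viewing transformation: the inverse, eye = world - (tx, tz). */
Point2 world_to_eye(Point2 p, double tx, double tz)
{
    Point2 e = { p.x - tx, p.z - tz };
    return e;
}
/* Sanity check: world_to_eye(eye_to_world(p, tx, tz), tx, tz) returns p. */
```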

Look-at positioning
- Find the viewing transform given the eye (camera) position, a point to look at, and the up vector
- Need to specify two transforms: rotation and translation
- The translation is easy
- Natural rotation: define it implicitly using the point at which we want to look and a vector indicating the vertical direction in the image (the up vector)
- Can easily convert the eye point and the look-at point to the direction vector of the camera axis; can assume the up vector is perpendicular to the view vector

Look-at positioning
Problem: given two pairs of perpendicular unit vectors (the eye-coordinate axes and their desired world-coordinate directions), find the transformation mapping the first pair into the second.
The view direction is $v = \frac{l - c}{|l - c|}$, where c is the eye position, l is the point to look at, and u is the up vector.
[Figure: eye coordinates with vectors u and v at the camera position c, and the look-at point l in world coordinates]

Look-at positioning
Determine the rotation first by looking at how the coordinate vectors change:
$R\begin{bmatrix}1\\0\\0\end{bmatrix} = v \times u, \quad R\begin{bmatrix}0\\1\\0\end{bmatrix} = u, \quad R\begin{bmatrix}0\\0\\1\end{bmatrix} = -v$
so
$R = [\,v \times u \;\; u \;\; -v\,]$

Look-at positioning
Recall the matrix for translation by c:
$T = \begin{bmatrix}1&0&0&c_x\\0&1&0&c_y\\0&0&1&c_z\\0&0&0&1\end{bmatrix}$
Now we have the camera positioning matrix TR. To get the viewing transform, invert it:
$(TR)^{-1} = R^{-1}T^{-1}$
For a rotation, the inverse is the transpose:
$R^{-1} = R^T = \begin{bmatrix}(v \times u)^T\\u^T\\-v^T\end{bmatrix}$

Look-at viewing transformation
$T^{-1} = \begin{bmatrix}1&0&0&-c_x\\0&1&0&-c_y\\0&0&1&-c_z\\0&0&0&1\end{bmatrix} = [\,e_x \;\; e_y \;\; e_z \;\; -c\,]$
$V = R^{-1}T^{-1} = \begin{bmatrix}(v \times u)^T & -(v \times u)\cdot c\\u^T & -u\cdot c\\-v^T & v\cdot c\\0\;\;0\;\;0 & 1\end{bmatrix}$
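A minimal C sketch of the look-at viewing matrix V = R^{-1}T^{-1} derived above, written in OpenGL's column-major order; the helper and function names are illustrative, not a real library API.

```c
#include <math.h>

static void normalize3(double w[3]) {
    double len = sqrt(w[0]*w[0] + w[1]*w[1] + w[2]*w[2]);
    w[0] /= len; w[1] /= len; w[2] /= len;
}
static void cross3(const double a[3], const double b[3], double r[3]) {
    r[0] = a[1]*b[2] - a[2]*b[1];
    r[1] = a[2]*b[0] - a[0]*b[2];
    r[2] = a[0]*b[1] - a[1]*b[0];
}
static double dot3(const double a[3], const double b[3]) {
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

/* eye position c, look-at point l, up vector up; output V is column-major. */
void look_at(const double c[3], const double l[3], const double up[3],
             double V[16])
{
    double v[3] = { l[0]-c[0], l[1]-c[1], l[2]-c[2] };   /* view direction */
    double u[3] = { up[0], up[1], up[2] };
    double s[3];                                         /* s = v x u      */
    normalize3(v);
    cross3(v, u, s);
    normalize3(s);
    cross3(s, v, u);   /* recompute u so it is exactly perpendicular to v  */

    /* Rows of R^-1 = R^T are (v x u)^T, u^T, -v^T; translation is -R^T c. */
    V[0] = s[0];  V[4] = s[1];  V[8]  = s[2];  V[12] = -dot3(s, c);
    V[1] = u[0];  V[5] = u[1];  V[9]  = u[2];  V[13] = -dot3(u, c);
    V[2] = -v[0]; V[6] = -v[1]; V[10] = -v[2]; V[14] =  dot3(v, c);
    V[3] = 0.0;   V[7] = 0.0;   V[11] = 0.0;   V[15] = 1.0;
}
```

Loading this matrix with glLoadMatrixd(V) gives the same result as calling gluLookAt with the same eye position, look-at point, and up vector.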

Positioning the camera in OpenGL
- Imagine that the camera is an object and write a sequence of rotations and translations positioning it
- Change each transformation in the sequence to its opposite
- Reverse the sequence
- Camera positioning is done in the code before the modeling transformations
- OpenGL does not distinguish between viewing and modeling transformations and joins them into the modelview matrix
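A minimal legacy-OpenGL sketch of this recipe with an illustrative camera placement: suppose the camera positioning transform is C = T(0,1,5) * Ry(30 deg) (orient the camera, then place it at (0,1,5)). The viewing transform is then C^{-1} = Ry(-30 deg) * T(0,-1,-5): the opposite transformations, issued in reverse order, before any modeling transforms.

```c
#include <GL/gl.h>

void set_view(void)
{
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glRotatef(-30.0f, 0.0f, 1.0f, 0.0f);   /* opposite of the camera rotation    */
    glTranslatef(0.0f, -1.0f, -5.0f);      /* opposite of the camera translation */
    /* ... modeling transformations and drawing follow ... */
}
```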

Space to plane projection
In the eye coordinate system, project onto the image plane z = -1:
$\mathrm{Proj}(p) = \left(-\frac{p_x}{p_z},\; -\frac{p_y}{p_z},\; -1\right)$
In homogeneous coordinates:
$\mathrm{Proj}(p) = \begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&1&0\\0&0&-1&0\end{bmatrix}\begin{bmatrix}p_x\\p_y\\p_z\\1\end{bmatrix} = \begin{bmatrix}p_x\\p_y\\p_z\\-p_z\end{bmatrix}$
[Figure: center of projection c, view direction v, image plane z = -1, and a point p projected onto the plane]
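A small C sketch of this projection: apply the matrix and perform the homogeneous divide. It assumes p_z < 0, i.e. the point is in front of the camera; the type and function names are illustrative.

```c
typedef struct { double x, y, z; } Vec3;

Vec3 project_to_plane(Vec3 p)
{
    /* The matrix maps (p.x, p.y, p.z, 1) to (p.x, p.y, p.z, -p.z);
       dividing by the last component lands on the plane z = -1. */
    double w = -p.z;
    Vec3 q = { p.x / w, p.y / w, p.z / w };   /* q.z == -1 */
    return q;
}
```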

Visibility
- Objects that are closer to the camera occlude the objects that are further away
- Assume all objects are made of planar polygons; a polygon typically projects 1-to-1
- Idea: project the polygons in turn; for each pixel, record the distance to the projected polygon
- When writing pixels, replace the old color with the new one only if the new distance to the camera for this pixel is less than the recorded one

Z-buffering idea
- Problem: need to compare distances for each projected point
- Solution: convert all points to a coordinate system in which (x,y) are image plane coordinates and the distance to the image plane increases when the z coordinate increases
- In OpenGL, this is done by the projection matrix

Z buffer
Assumptions:
- each pixel has storage for a z-value, in addition to RGB
- all objects are scan-convertible (typically polygons, lines, or points)
Algorithm:
  initialize zbuf to the maximal value
  for each object
    for each pixel (i,j) obtained by scan conversion
      if znew(i,j) < zbuf(i,j)
        zbuf(i,j) = znew(i,j); write pixel (i,j)
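A minimal C sketch of this algorithm; the buffer sizes, the Fragment type, and the assumption that scan conversion delivers one fragment at a time are illustrative.

```c
#include <float.h>

#define WIDTH  640
#define HEIGHT 480

typedef struct { int i, j; float z; unsigned int rgb; } Fragment;

static float        zbuf[HEIGHT][WIDTH];
static unsigned int color[HEIGHT][WIDTH];

void clear_zbuffer(void)
{
    for (int j = 0; j < HEIGHT; j++)
        for (int i = 0; i < WIDTH; i++)
            zbuf[j][i] = FLT_MAX;            /* "maximal value" */
}

/* Called for every pixel (i,j) produced by scan-converting an object. */
void write_fragment(const Fragment *f)
{
    if (f->z < zbuf[f->j][f->i]) {           /* closer than what is stored? */
        zbuf[f->j][f->i]  = f->z;
        color[f->j][f->i] = f->rgb;
    }
}
```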

Z buffer
What are z values? Z values are obtained by applying the projection transform, that is, mapping the viewing frustum to the standard cube. The z value increases with the distance to the camera. Z values are computed for each pixel covered by a polygon using linear interpolation of the z values at the vertices. Typical z-buffer size: 24 bits (same as RGB combined).

Camera specification
Define the dimensions of the viewing volume (frustum). Most general: glFrustum(left, right, bottom, top, near, far)
[Figure: frustum with near-plane corners (l,b,-n), (r,b,-n), (l,t,-n), (r,t,-n) and far-plane corners (s*l, s*b, -f), (s*r, s*b, -f), (s*l, s*t, -f), (s*r, s*t, -f)]
In the picture: l = left, r = right, b = bottom, t = top, n = near, f = far, s = far/near

Camera specification
Less general but more convenient: gluPerspective(field_of_view, aspect_ratio, near, far)
In the picture: fov = vertical field of view, a = aspect ratio = w/h
Relationship to the frustum:
left = -a*near*tan(fov/2)
right = a*near*tan(fov/2)
bottom = -near*tan(fov/2)
top = near*tan(fov/2)
gluPerspective requires fov in degrees, not radians!
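A minimal sketch of this relationship: the same symmetric frustum set up either with gluPerspective or with the equivalent glFrustum call; the function name and parameter choices are illustrative.

```c
#include <math.h>
#include <GL/gl.h>
#include <GL/glu.h>

void set_projection(double fov_deg, double a, double z_near, double z_far,
                    int use_glu)
{
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    if (use_glu) {
        gluPerspective(fov_deg, a, z_near, z_far);        /* fov in degrees! */
    } else {
        const double deg2rad = 3.14159265358979323846 / 180.0;
        double t = z_near * tan(0.5 * fov_deg * deg2rad); /* near*tan(fov/2) */
        glFrustum(-a * t, a * t,     /* left, right */
                  -t,     t,         /* bottom, top */
                  z_near, z_far);
    }
}
```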

Viewing frustum
The volume in space that will be visible in the image. r is the aspect ratio of the image, width/height.
[Figure: viewing frustum in eye coordinates, between the near plane at distance n and the far plane at distance f along the negative z-axis]

Projection transformation
Maps the viewing frustum into a standard cube extending from -1 to 1 in each coordinate (normalized device coordinates). Steps:
- change the matrix of projection to keep z: the result is a parallelepiped
- translate: parallelepiped centered at 0
- scale in all directions: cube of size 2 centered at 0

Projection transformation
Change the projection matrix
$\begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&1&0\\0&0&-1&0\end{bmatrix}$
so that we keep z:
$\begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&0&-1\\0&0&-1&0\end{bmatrix}\begin{bmatrix}p_x\\p_y\\p_z\\1\end{bmatrix} = \begin{bmatrix}p_x\\p_y\\-1\\-p_z\end{bmatrix}$
The homogeneous result corresponds to $\left(-\frac{p_x}{p_z},\; -\frac{p_y}{p_z},\; \frac{1}{p_z}\right)$
The last component increases monotonically with the distance to the camera!

Projection transformation
$\begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&0&-1\\0&0&-1&0\end{bmatrix}$ maps the frustum to an axis-aligned parallelepiped: $|x| \le r\tan\theta$, $|y| \le \tan\theta$, and the third coordinate lies between $-\frac{1}{n}$ and $-\frac{1}{f}$ (here $\theta = \mathrm{fov}/2$).
The parallelepiped is already centered in (x,y); center it in the z-direction with a translation
$T = \begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&1&\frac{1}{2}\left(\frac{1}{f}+\frac{1}{n}\right)\\0&0&0&1\end{bmatrix}$
and scale:
$S = \begin{bmatrix}\frac{1}{r\tan\theta}&0&0&0\\0&\frac{1}{\tan\theta}&0&0\\0&0&\frac{2}{\frac{1}{n}-\frac{1}{f}}&0\\0&0&0&1\end{bmatrix}$

Projection transformation
Combined matrix, mapping the frustum to a cube:
$P = S\,T\begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&0&-1\\0&0&-1&0\end{bmatrix} = \begin{bmatrix}\frac{1}{r\tan\theta}&0&0&0\\0&\frac{1}{\tan\theta}&0&0\\0&0&\frac{f+n}{n-f}&\frac{2fn}{n-f}\\0&0&-1&0\end{bmatrix}$
To get normalized image plane coordinates (valid range [-1,1] for both), just drop z in the result and convert from homogeneous to regular coordinates. To get pixel coordinates, translate by 1 and scale x and y (the viewport transformation).
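A minimal C sketch that builds this combined matrix P in OpenGL's column-major order (element (i,j) stored at m[4*j+i]); the function name is illustrative. Up to the degrees/radians convention, this is the same matrix gluPerspective(fov, r, n, f) produces.

```c
#include <math.h>

/* fov: vertical field of view in radians, r: aspect ratio w/h,
   n, f: near and far plane distances (0 < n < f). */
void perspective_matrix(double fov, double r, double n, double f,
                        double m[16])
{
    double t = tan(fov / 2.0);
    for (int i = 0; i < 16; i++) m[i] = 0.0;
    m[0]  = 1.0 / (r * t);         /* x scale                             */
    m[5]  = 1.0 / t;               /* y scale                             */
    m[10] = (f + n) / (n - f);     /* depth row: maps z=-n to -1, z=-f to 1 */
    m[14] = 2.0 * f * n / (n - f);
    m[11] = -1.0;                  /* puts -p_z into the w component      */
}
```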