
Project 1, 467

Purpose: The purpose of this project is to learn everything you need to know for the next 9 weeks about graphics hardware.

What: Write a 3D graphics hardware simulator in your language of choice (C, C++, Java, Perl?) and render the teapot image. (Note: This is not a graphics class. It is OK if your rendering has some flaws, like those gaps in the teapot image above ;-)

You must write this simulator as you would build it in hardware, except for a few small differences:

- You do not need to model the pipelining at a fine grain; you can execute the pipeline stages serially.
- You do not need to use fixed-point math. Feel free to use floats or doubles.

Other than that, be careful not to write anything you don't actually plan on doing in hardware later. For example, do not use divide operations.

You will find the following links helpful:

http://www.cs.washington.edu/education/courses/cse457/
http://graphics.lcs.mit.edu/classes/6.837/f98/lecture5/homepage.html

Here is a brief primer on graphics and graphics hardware. Graphics cards render a sequence of objects. These objects are passed to the graphics card over a FIFO link, usually in a DMA-like fashion, and are stored in an object memory, essentially a polygon list. The CPU also controls the GPU and frame buffer, but this control is limited (reading and writing the frame buffer, and controlling the GPU).

The polygons are usually defined in three-space, in user coordinates. User coordinates are usually defined as the volume of space spanned by (-1,-1,-1) to (1,1,1), centered at (0,0,0). In this model a polygon is simplified to be a triangle and is defined by:

{ (x1, y1, z1), (x2, y2, z2), (x3, y3, z3), (nx, ny, nz), (r, g, b) }

In addition, under different lighting models either a whole triangle or each vertex of that triangle is given a color. For our class we will use the simpler shading model, where each triangle has a single color. The component (nx, ny, nz) is the normal vector of the triangle: a vector perpendicular to the plane of the polygon, normalized to a magnitude of 1. The graphics card will use it to determine the color of the polygon. In this class we will not be using textures, only shades of gray.

Rendering a frame of a game, say, begins with the CPU sending the object memory a sequence of triangle objects like the one above. For this assignment you will instead read in a .off file (more on this later) that defines a set of triangles. Set each triangle's color to (255,255,255), which is white, and compute its normal vector; one possible C representation is sketched below.
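For concreteness, here is one way the triangle record and the host-side normal computation might look in C. The type names and layout are illustrative, not part of the spec, and the square root and divides are fine here because this runs on the CPU while preparing the dataset, not in the simulated pipeline.

#include <math.h>

/* One possible in-memory form of the triangle record above
   (layout and names are illustrative, not part of the spec). */
typedef struct { float x, y, z; } Vec3;

typedef struct {
    Vec3 v[3];               /* (x1,y1,z1), (x2,y2,z2), (x3,y3,z3) */
    Vec3 n;                  /* (nx,ny,nz), unit length            */
    unsigned char r, g, b;   /* per-triangle color                 */
} Triangle;

/* Host-side preparation: cross two edge vectors, then normalize. */
static void compute_normal(Triangle *t)
{
    Vec3 a = { t->v[1].x - t->v[0].x, t->v[1].y - t->v[0].y, t->v[1].z - t->v[0].z };
    Vec3 b = { t->v[2].x - t->v[0].x, t->v[2].y - t->v[0].y, t->v[2].z - t->v[0].z };
    Vec3 n = { a.y * b.z - a.z * b.y,
               a.z * b.x - a.x * b.z,
               a.x * b.y - a.y * b.x };
    float len = sqrtf(n.x * n.x + n.y * n.y + n.z * n.z);
    if (len > 0.0f) { n.x /= len; n.y /= len; n.z /= len; }  /* skip degenerate triangles */
    t->n = n;
}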

This prepares the dataset for rendering. Form the triangles into a list and send them to your graphics pipeline simulator.

The GPU Hardware

The GPU does the following steps, in order:

- Transformation. This is essentially a rotation of all of the vertices of all of the polygons.
- Lighting.
- Z culling (removing polygons that are beyond the viewer's range or obscured).
- Projection (transforming from user coordinates to screen coordinates and adding a small amount of perspective to the image).
- Clipping (removing triangles that are off the screen and breaking up triangles that run off the edges of the screen).
- Rasterization.

In this class we will use a reduced set of steps:

- Transformation
- Lighting
- Projection
- Rasterization

Note that the two culling/clipping stages have been removed. Instead of clipping, we will render (rasterize) all polygons and do the clipping there. This is slower, since off-screen polygons are always processed, but it is a reasonable simplification for our purposes. We will discuss each of these steps now.

Transformation

Transformation is perhaps the simplest step. In this step all vertices are converted into vectors of length 4 (of the form [x, y, z, 0]). They are then multiplied by a matrix supplied by the host CPU. The normal vectors are extended (by adding a 1) and multiplied by this same matrix.

Lighting

Lighting is carried out by taking a normalized vector representing the light direction (supplied by the host CPU) and computing its dot product with each polygon's normal vector. This value is then multiplied by the existing color of the polygon. Note that there are several lighting models in graphics work and this is only the most basic; feel free to do something more advanced if you're in the mood. A sketch of these two stages appears below.
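Here is a rough sketch of these two stages in C, reusing the Triangle and Vec3 types from the earlier sketch. The row-major matrix layout, the function names, and the clamping of negative dot products to zero are assumptions of this sketch; the handout itself only says to multiply.

/* Transformation: multiply a length-4 vector by the 4x4 matrix
   supplied by the host CPU (row-major here; that choice is mine). */
static void mat4_mul_vec4(const float m[4][4], const float in[4], float out[4])
{
    for (int i = 0; i < 4; i++)
        out[i] = m[i][0] * in[0] + m[i][1] * in[1]
               + m[i][2] * in[2] + m[i][3] * in[3];
}

/* Lighting: dot the host-supplied, normalized light direction with the
   triangle's normal and scale the triangle's color by the result.     */
static void light_triangle(Triangle *t, Vec3 light)
{
    float d = light.x * t->n.x + light.y * t->n.y + light.z * t->n.z;
    if (d < 0.0f) d = 0.0f;  /* clamp: an assumption of this sketch */
    t->r = (unsigned char)(t->r * d);
    t->g = (unsigned char)(t->g * d);
    t->b = (unsigned char)(t->b * d);
}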

Projection

Projection is a process that transforms vertices from user coordinates to screen coordinates. Conceptually it is another matrix multiplication along with some normalization. In practice we will use a fixed projection and rely upon the prior transformation stage to handle any other work. Each vertex of the polygon is transformed as:

vx = (((vx / 4) + 1) / 2) * ScreenWidth
vy = (((-vy / 4) + 1) / 2) * ScreenHeight
vz = (((vz / 4) + 1) / 2) * ScreenDepth

It should be clear why we are using this fixed projection into screen space: the divides are all by powers of 2, which are easily implemented in hardware by shift operations. Conceptually, the above equations do the following: the divide by 4 shrinks the polygons; the addition of 1 (and the matching divide by 2) maps the user coordinates, which run from (-1,-1,-1) to (1,1,1), into the range [0,1]; and the multiplication by ScreenWidth and ScreenHeight transforms from user to screen coordinates.

One side note: for this class we will use ScreenWidth = ScreenHeight = 256. Furthermore, I suggest you use a byte for the depth field in the z-buffer, which makes ScreenDepth = 256 as well. Final note: feel free to play around with this projection if you like. A direct transcription into C follows.
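A minimal transcription of this projection into C might look as follows; in software we simply write the divides, while in hardware each divide by a power of two becomes a shift.

enum { ScreenWidth = 256, ScreenHeight = 256, ScreenDepth = 256 };

/* Fixed projection from user to screen coordinates, transcribed
   directly from the three equations above.                      */
static void project_vertex(float *vx, float *vy, float *vz)
{
    *vx = (((*vx / 4.0f) + 1.0f) / 2.0f) * ScreenWidth;
    *vy = ((((-*vy) / 4.0f) + 1.0f) / 2.0f) * ScreenHeight;
    *vz = (((*vz / 4.0f) + 1.0f) / 2.0f) * ScreenDepth;
}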
Rasterization

Rasterization is where the bulk of the work in this assignment lies. The early phases of the graphics pipeline are straightforward matrix or vector math; rasterization, on the other hand, is a complex control process. When the solution set for this assignment was written, half the time went into this one stage, so take this into account when you are working on it.

The process of rasterization involves filling in polygons in three dimensions. Pixels in the screen buffer in the X and Y dimensions are colored according to the lighting of the polygon (in a real graphics card a texture would be mapped as well). For the screen we will use a Z-buffer, which attaches to each pixel an additional value, Z: the depth of the currently stored pixel. At a high level the rasterization process is:

For each polygon p
    For each scanline y in p
        Compute extents (x1, z1) and (x2, z2)
        Color pixels from x1 to x2 with the color c of p,
            writing pixel xi only if its interpolated Z value is <=
            the Z-buffer's current value at that pixel

There are several ways to compute the extents of the scanlines and the Z values. The simplest involves computing a slope and performing a simple mx+b calculation. Unfortunately, computing the slope requires a divide operation, which is out of bounds for this assignment since in real hardware the divide would be quite expensive. Furthermore, the mx+b calculation requires very high precision math; otherwise the polygons turn into garbage. Therefore, we suggest you study Bresenham's algorithm closely. Nominally, Bresenham's algorithm is designed to render lines in two dimensions. However, it can be extended to n dimensions (in our case n = 3), and it can be simplified significantly because we are drawing filled triangles, not lines. A divide-free sketch of this edge-walking idea follows.
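To make the edge-walking idea concrete, here is a minimal, divide-free sketch of stepping one triangle edge per scanline in C, Bresenham-style. The function name and the printf stand-in for "emit this extent" are illustrative only; a full rasterizer would walk two edges of the triangle at once and step z with the same error-term trick.

#include <stdio.h>

/* Step one triangle edge from (x0,y0) down to (x1,y1), with y0 < y1,
   producing the x extent at each scanline with no divide: accumulate
   the fractional slope dx/dy in an error term and carry whole steps
   of x out of it.                                                    */
static void walk_edge(int x0, int y0, int x1, int y1)
{
    int dy = y1 - y0;
    int dx = (x1 > x0) ? x1 - x0 : x0 - x1;   /* |dx| */
    int xstep = (x1 > x0) ? 1 : -1;
    int x = x0, err = 0;

    if (dy <= 0) return;                      /* this sketch assumes y1 > y0 */
    for (int y = y0; y <= y1; y++) {
        printf("scanline %d: x = %d\n", y, x); /* (x,y) is this scanline's extent */
        err += dx;
        while (err >= dy) {                   /* carry: no divide needed */
            x += xstep;
            err -= dy;
        }
    }
}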
Getting started

To get started with this assignment, download the sample teapot.off file from the class web page and search the net for a description of the format of this file. You can do this assignment in any language you like, but there is one caveat: do not use any preexisting OpenGL libraries or headers, or pre-existing graphics libraries of any kind. The point of this exercise is for you to roll your own and learn about this stuff.