Level of Details in Computer Rendering


Level of Details in Computer Rendering. Ariel Shamir. Overview: 1. Photo-realism vs. non-photo-realism (NPR) 2. Object representations 3. Level of details

Photo Realism vs. Non-Photo Realism. Pixar Demonstrations

Sketching, Pen-and-Ink, Painterly Style

Enhanced Illustration Representations

Object Representation Schemes in 3D. Samples: point clouds, images. Boundary: implicit, parametric, meshes, subdivision. Volume: voxels, octree, BSP. High level: CSG, skeleton & sweep, scene graph, constraints, feature-based. Meshes

Entities & Relations. The entities in a boundary mesh are faces, edges, and vertices (in volumetric meshes also elements). The relations between these entities are incidence and adjacency: FF, FE, FV; EF, EE, EV; VF, VE, VV. Separation of Topology and Geometry. The relations between all the entities in the mesh are the topology (connectivity) of the mesh. The positions of the entities in space are the geometry of the mesh. A mesh is defined by its topology and geometry together, but they can be separated as different views of the object.

Polygonal Mesh Representation. The description of all relations is the topology (connectivity) of the mesh. THE representation issue: which of these relations do we store explicitly, and which (and how) implicitly? Most used in graphics: triangle soups and indexed face sets. In solid modeling: half-edge, winged-edge. Triangle Soup: (x00, y00, z00) (x01, y01, z01) (x02, y02, z02); (x10, y10, z10) (x11, y11, z11) (x12, y12, z12); ...; (xn0, yn0, zn0) (xn1, yn1, zn1) (xn2, yn2, zn2). Do we have any relations? Is this an efficient representation?

Indexed Face Set: V0 ... Vn, then F0 ... Fm, i.e. the vertex coordinates (x0, y0, z0) ... (xn, yn, zn) followed by the index triples (i0, j0, k0) ... (im, jm, km), and maybe other information such as normals N0 ... Nn. Which relation do we have? FV. Display Device. Display devices (screens, printers, etc.) are usually raster devices. This means they use discrete 2D color elements called pixels to display images. As a consequence, every type of virtual model must be turned into 2D pixels in order to be viewed!
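A minimal sketch of the storage difference between the two representations above, in Python (the coordinates and the helper name face_corners are made up for illustration):

    # Triangle soup: each triangle stores its three vertex positions explicitly.
    # Shared vertices are duplicated and no relations are available.
    triangle_soup = [
        ((0, 0, 0), (1, 0, 0), (1, 1, 0)),
        ((0, 0, 0), (1, 1, 0), (0, 1, 0)),
    ]

    # Indexed face set: one shared vertex list plus faces storing index triples
    # (i, j, k) into it.  This stores the FV relation and avoids duplication.
    vertices = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
    faces = [(0, 1, 2), (0, 2, 3)]

    def face_corners(face, verts):
        """Recover the corner positions of an indexed face."""
        return [verts[i] for i in face]

    print(face_corners(faces[0], vertices))  # same triangle as triangle_soup[0]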

Hardware Support. Most computers today include a video card. It is in fact a collection of hardware implementations of a rendering pipeline. The Rendering Pipeline

Visibility & Shading. In terms of converting a 3D scene to a 2D image, one can think of two basic rendering tasks: visibility determination (what is visible and what is occluded) and shading (what is the color of each visible 3D element, and hence of each 2D pixel). 3D Projection & 2D Rasterization. Hardware support for triangles! (Virtual world models are projected and then rasterized onto the raster screen.)

Visibility Using the Z-buffer. Any pixel rendered holds a color and a distance from the viewer (z). Pixels are accumulated in a buffer using the following algorithm: clear the Z-buffer (set every z to infinity and store the background color); then, for each new pixel P(i,j) rendered, if the distance z_P of P is smaller than Z(i,j), store z_P and the color of P at (i,j).
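A minimal sketch of this z-buffer update in Python, assuming rasterization has already produced fragments as (i, j, depth, color) tuples (all names and values here are illustrative, not part of any real API):

    import math

    WIDTH, HEIGHT = 4, 3
    BG_COLOR = (0, 0, 0)

    # Clear: every depth is infinity, every color is the background color.
    zbuffer = [[math.inf] * WIDTH for _ in range(HEIGHT)]
    colorbuf = [[BG_COLOR] * WIDTH for _ in range(HEIGHT)]

    def write_pixel(i, j, z_p, color):
        """Keep the fragment only if it is closer than what is stored at (i, j)."""
        if z_p < zbuffer[j][i]:
            zbuffer[j][i] = z_p
            colorbuf[j][i] = color

    # Two fragments at the same pixel; the farther green one is discarded.
    for i, j, z, c in [(1, 1, 2.0, (255, 0, 0)), (1, 1, 5.0, (0, 255, 0))]:
        write_pixel(i, j, z, c)

    print(colorbuf[1][1])  # (255, 0, 0): the nearer red fragment wins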

Illumination & Shading. The color of each point on a 3D object is calculated from the object's material properties and the lights in the scene. Object Properties. An object O's material properties include a diffuse component O_d and a specular component O_s for each color channel (e.g. {R,G,B}). They also include ambient, diffuse, and specular reflection coefficients k_a, k_d, k_s. Light Properties. There is a global ambient light intensity I_a. There are multiple point light sources; each light i has its own intensity I_pi. There is a light attenuation factor f_att,i for each light i.
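The formula itself did not survive the transcription; given the symbols above it is presumably the classic Phong illumination model (the shininess exponent n and the unit vectors N, L_i, R_i, V are standard parts of that model rather than quantities listed on the slide):

    I = k_a I_a O_d + \sum_i f_{att,i}\, I_{p_i} \big[ k_d O_d (\hat{N} \cdot \hat{L}_i) + k_s O_s (\hat{R}_i \cdot \hat{V})^n \big]

evaluated per color channel, where N is the surface normal, L_i the direction to light i, R_i the reflection of L_i about N, and V the direction to the viewer.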

Basic Display Primitives. The rendering pipeline supports two basic display primitives: 1. Pixels (e.g. from images), which are used directly or combined into the image. 2. Polygons (usually triangles), which are converted to pixels from their vertex information by rasterization and interpolation. Why Triangles? The simplest polygon, and always planar. Simplifies hardware design and implementation. Every polygon can be triangulated. Higher-degree surfaces can be approximated to any extent by triangles. Bilinear interpolation gives reasonable results (color, position, normal).

Vertex Information: position in space (depth), normal, color/shading, texture map coordinates. OpenGL demo. From Triangle to Pixels. Triangles are planar; this means a simple bilinear (barycentric) interpolation can be used to interpolate the attributes from the vertices to any pixel inside.
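A minimal sketch of this interpolation in Python: the slide calls it bilinear, and for a triangle it amounts to a barycentric weighting of the three vertex attributes (the triangle, pixel, and colors below are made-up example data):

    def barycentric(p, a, b, c):
        """Barycentric weights of screen-space point p in triangle (a, b, c)."""
        def area(u, v, w):
            return (v[0] - u[0]) * (w[1] - u[1]) - (w[0] - u[0]) * (v[1] - u[1])
        total = area(a, b, c)
        return (area(p, b, c) / total, area(p, c, a) / total, area(p, a, b) / total)

    def interpolate(weights, attr_a, attr_b, attr_c):
        """Weighted combination of three per-vertex attributes (color, normal, depth, ...)."""
        wa, wb, wc = weights
        return tuple(wa * x + wb * y + wc * z for x, y, z in zip(attr_a, attr_b, attr_c))

    # Example: interpolate RGB vertex colors to an interior pixel.
    a, b, c = (0, 0), (10, 0), (0, 10)
    w = barycentric((3, 3), a, b, c)                      # (0.4, 0.3, 0.3)
    print(interpolate(w, (255, 0, 0), (0, 255, 0), (0, 0, 255)))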

Gouraud vs. Phong Shading. Gouraud: calculate the color at the corners and interpolate. Phong: interpolate the normals and calculate the color at each pixel. (Comparison images: flat, Gouraud, Phong; normal bilinear smoothing.) SMF demo
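A compact sketch of the difference at a single interior point, using a simple diffuse shade() as a stand-in for the full illumination model (the light direction, vertex normals, and weights are made-up example values):

    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    def normalize(v):
        n = sum(x * x for x in v) ** 0.5
        return tuple(x / n for x in v)

    LIGHT_DIR = normalize((0.0, 0.0, 1.0))   # example directional light

    def shade(normal):
        """Stand-in diffuse shade: max(N . L, 0)."""
        return max(dot(normalize(normal), LIGHT_DIR), 0.0)

    def combine(w, a, b, c):
        """Barycentric combination of three scalars or tuples."""
        if isinstance(a, tuple):
            return tuple(w[0] * x + w[1] * y + w[2] * z for x, y, z in zip(a, b, c))
        return w[0] * a + w[1] * b + w[2] * c

    n0, n1, n2 = (0, 0, 1), (1, 0, 1), (0, 1, 1)   # vertex normals
    w = (1 / 3, 1 / 3, 1 / 3)                       # some pixel's weights

    gouraud = combine(w, shade(n0), shade(n1), shade(n2))  # shade corners, interpolate colors
    phong = shade(combine(w, n0, n1, n2))                  # interpolate normals, shade per pixel
    print(gouraud, phong)   # the values differ; per-pixel Phong captures highlights better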

From Model to Triangles to Pixels: model → triangulation → triangles → rasterization → pixels. The Other Option: Pixels (Points). The algorithm composes the image pixels by itself. No rasterization is used! Example: ray tracing is a technique to create more realistic images from modeled scenes.

Ray Tracing. Ray Tracing Idea: simulate the path that light rays take as they bounce around within the world; they are traced through the scene. When tracing rays from the light sources to the eye, many are wasted since they never reach the eye.

Reverse Tracing. Each pixel color is created by shooting a ray from the eye through the pixel into the scene and tracing its refractions and reflections up to the light source. How many rays?
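A minimal sketch of this reverse (eye-to-scene) loop in Python, casting one primary ray per pixel against a single hard-coded sphere; a real ray tracer would recurse on reflection and refraction rays and trace shadow rays to the lights (all scene data here is made up):

    import math

    WIDTH, HEIGHT = 8, 6
    EYE = (0.0, 0.0, 0.0)
    SPHERE_CENTER, SPHERE_RADIUS = (0.0, 0.0, -5.0), 2.0

    def ray_through_pixel(i, j):
        """Unit direction from the eye through pixel (i, j) of an image plane at z = -1."""
        x = (i + 0.5) / WIDTH * 2 - 1
        y = 1 - (j + 0.5) / HEIGHT * 2
        n = math.sqrt(x * x + y * y + 1)
        return (x / n, y / n, -1 / n)

    def hit_sphere(origin, direction):
        """Smallest positive ray parameter t at which the ray hits the sphere, or None."""
        oc = tuple(o - c for o, c in zip(origin, SPHERE_CENTER))
        b = 2 * sum(o * d for o, d in zip(oc, direction))
        c = sum(o * o for o in oc) - SPHERE_RADIUS ** 2
        disc = b * b - 4 * c
        if disc < 0:
            return None
        t = (-b - math.sqrt(disc)) / 2
        return t if t > 0 else None

    # Primary rays only: mark a pixel if its ray hits the sphere.
    for j in range(HEIGHT):
        print("".join("#" if hit_sphere(EYE, ray_through_pixel(i, j)) else "."
                      for i in range(WIDTH)))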

Hardware-Aware Efficient Rendering. In terms of rendering, the best model is a polygonal boundary representation: a triangle mesh. In fact, it is heavily used in computer graphics and animation. Disadvantages? Polygonal Model Deficiencies. Limited expressive power: it is only a linear model. No smoothness: only C0 continuity. No realism: surfaces in the real world are rarely polygonal. Rendering complexity depends on the scene complexity, not the image complexity. Triangle Rendering. Rendering using triangles is most efficient if each triangle covers a number of pixels (at least more than one). If we have many triangles for each pixel, then we do not need most of them. Rendering always depends on the number of triangles in the scene. Bad examples?

Complex Environments & Objects: 90,000 objects, 575 million triangles; 2,500 horses, 50 million triangles; one complex object, 376,436 triangles. Number of Pixels vs. Scene Complexity. The end result is an image with a specified number of pixels. Is there a way to make rendering of the order of the number of pixels instead of the scene complexity? This would be beneficial for large models, complex models, and large scenes.

Solutions. Multi-resolution models: use the appropriate resolution as a function of the number of pixels (next lecture). Image-based rendering: the model is a set of images taken from different directions; there is no geometric model, and it is very difficult to create new images. Others. Other Rendering Options? Maybe we do not need triangles! Maybe there are other primitives for rendering that will be faster than converting high-degree models to triangles and then rendering.

Point-Based Rendering. Any image is the result of sampling objects' interaction with light. In order to render, we can sample the object with respect to its geometric properties and its material properties. The simplest primitive needed is in fact a point and a normal. Image as Samples

Splats. Once we know how to calculate the color from the position and normal, we can create a splat (any geometric 2D shape) and send it to the z-buffer. Surface Splats
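A minimal sketch of sending such splats to a z-buffer in Python, drawing each shaded point as a small square splat that is depth-tested per pixel (buffer sizes, point data, and splat radius are made-up example values):

    import math

    WIDTH, HEIGHT = 16, 16
    zbuffer = [[math.inf] * WIDTH for _ in range(HEIGHT)]
    colorbuf = [[(0, 0, 0)] * WIDTH for _ in range(HEIGHT)]

    def splat(i, j, z, color, radius=1):
        """Write a small square splat centered at pixel (i, j), depth-tested per pixel."""
        for dj in range(-radius, radius + 1):
            for di in range(-radius, radius + 1):
                x, y = i + di, j + dj
                if 0 <= x < WIDTH and 0 <= y < HEIGHT and z < zbuffer[y][x]:
                    zbuffer[y][x] = z
                    colorbuf[y][x] = color

    # Two point samples whose splats overlap; the nearer (smaller z) one wins there.
    splat(8, 8, z=5.0, color=(0, 0, 255), radius=2)
    splat(9, 8, z=3.0, color=(255, 0, 0), radius=2)
    print(colorbuf[8][8])  # (255, 0, 0): the nearer red splat covers the overlap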

Iso-splat Projection. Sampling. QSplat demo

LOD for Photo-Realistic Rendering. One word: efficiency (time & space)! Bigger, more complex models with more details, rendered faster and better. Using LOD reduces details where they are not important, "not important" meaning they are not noticed and do not degrade visual realism or quality. Examples

Distance & Resolution. Another Example

View Dependent View Dependent (cont.)

Temporal LOD. LOD for NPR. NPR is strongly based on LOD as well, but for other efficiency measures: visual space or perception space (clutter). Reducing details where they are not important this time means that the image does not degrade perception, understanding, or insight!

Perceptual Based Enhanced Understanding

Summary. Either for efficiency (photo-realism) or for better insight and perception (NPR), LOD reduces details where they are not important and enhances details where they are!