Augmented Reality: projectors

by Michael Vögele

Contents

1. Introduction
   1.1. Brief overview on research areas
2. History of displays/projectors
3. Smart Projectors
   3.1. Description
   3.2. Areas of application
   3.3. Core technological realization
        Undistorted projection
        Color correction
        Lambert's Law
        Neutralized texture
   3.4. Applied technological tools
   3.5. Smart Projectors in habitual environments
        Calibration
        Multiple Viewers
   3.6. Multiple projectors
4. Enabling View-Dependent Stereoscopic Projection
   4.1. Prerequisites
   4.2. Innovation compared to smart projectors
   4.3. Core technological realization
   4.4. Areas of application
5. Dynamic Shader Lamps
   5.1. Description
   5.2. Shader Lamps
        Painting on Movable Objects
   5.3. Areas of application
   5.4. Technical aspect
        Projector arrangement
        Tracking + Calibration
        Choice of objects
        The brush
   5.5. Problems waiting to be resolved
6. Seamless integration of VR into habitual workplaces
   6.1. Brief Description
        Innovation + technology
   6.2. Areas of application
7. References

1. Introduction

Through the course of the 20th century, television has played an important role in the entertainment industry. TV sets were continuously improved: they became bigger, their refresh rates increased, their power consumption dropped and their prices fell. Today's flat panel displays, however, are reaching their limits. Projectors may be the next evolutionary step in displays. Not only could projectors push past the boundaries of flat panel displays, they could also be used to incorporate augmented reality into our everyday lives. The following document therefore presents these possibilities, using specific research projects for demonstration.

1.1. Brief overview on research areas

The core of this document is comprised of research projects aimed at taking video displays to the next level. The remaining projects demonstrate the possibilities beyond mere video displays. The following projects will be presented:

- Smart Projectors
- Stereoscopic Projection
- Shader Lamps
- Integration into workplaces

2. History of displays/projectors

The idea of projectors first arose in the 15th century. At the time, projectors consisted primarily of a lantern and a lens. In 1825 limelight projectors appeared; these used a piece of limestone as the light source (limestone glows when heated by an oxy-hydrogen flame). At the end of the 19th century limelight was replaced by electric light. With the arrival of digital systems, projectors were adapted to work with digital data. They first became useful for presentations, primarily in big companies. From this point on projectors were improved in much the same way computers were: they became smaller, easier to handle and better in quality. Today projectors are expanding into new areas such as television, entertainment and information. Furthermore, they are evolving to offer new possibilities such as three-dimensional projections.

3. Smart Projectors

3.1. Description

Smart projectors are video projectors capable of projecting onto any surface. They thereby eliminate the need for a large screen, solving the main problem with projectors in habitual workplaces. The task of projecting onto arbitrary surfaces requires sensors to scan the desired canvas. In theory all sorts of sensors can be used; smart projectors, however, concentrate on the use of cameras. At its core, a smart projector is any projector that analyzes its environment before projecting.

3.2. Areas of application

Smart projectors could be used in a wide variety of areas. They could, for instance, be used to apply information directly onto objects without altering them physically. This could be interesting for museums that want to project information onto ancient artefacts. Most interestingly, however, smart projectors may drastically enhance the use of projectors in everyday homes, since they can turn any surface into a large-screen experience. In the not too distant future, smart projectors may even become part of inconspicuous actions such as searching for a book or doing routine work at one's desk. Here they would circle the requested book in a shelf or project virtual objects onto the desk (see chapter 6). Unlike large television sets, smart projectors also have the benefit of being fairly small and therefore easy to transport.

3.3. Core technological realization

Undistorted projection

The projector realizes undistorted projection by processing incoming video signals according to parameters gained in a one-time calibration. The calibration happens only once for a specific screen and camera position; hence it is currently not possible to project onto a moving canvas. The projector is calibrated by emitting specially structured light onto the screen. The camera then transfers a snapshot of the lit area to the smart projector, which analyzes the data to extract the desired parameters. The system usually concentrates on a specific set of pixels. The projector emits these pixels as if projecting onto a flat screen. By comparing the pixel positions that were projected with those actually seen by the camera, a displacement map can be constructed that maps video pixels onto the desired screen. In order to achieve an accurate displacement map, multiple different patterns of light are emitted and analyzed, and the resulting data is combined. To synchronize projection with the camera, it is vital for the projector to know how long it takes until it receives the image taken by the camera. This latency is initially measured by emitting a pattern and recording the time until the image arrives through the camera. The entire process takes approximately 28 seconds. If the surface is very rough, more pixels must be analyzed, which results in longer calibration times or the need for more advanced algorithms.

The process summarized:
1. Positioning of camera and projector
2. Measuring camera latency
3. Mapping projector to camera for multiple types of structured light
4. Construction of a displacement map
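To make step 4 concrete, the following sketch shows how such a displacement map could be applied to an incoming frame. It is a minimal CPU illustration in Python/NumPy with invented names, not the actual implementation, which performs the equivalent lookup per pixel on the graphics card (see chapter 3.4.).

```python
import numpy as np

def warp_with_displacement(frame, displacement):
    """Return the image the projector must emit so that `frame` appears
    undistorted on the scanned surface.

    frame:        (H, W, 3) input video frame.
    displacement: (Hp, Wp, 2) displacement map from the calibration; for every
                  projector pixel it stores the (row, col) of the frame pixel
                  that should appear there.
    """
    rows = np.clip(displacement[..., 0].round().astype(int), 0, frame.shape[0] - 1)
    cols = np.clip(displacement[..., 1].round().astype(int), 0, frame.shape[1] - 1)
    return frame[rows, cols]  # nearest-neighbour lookup, one per projector pixel
```

In the real system this lookup table is stored as a displacement texture and evaluated anew for every video frame.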
Color correction

Up to this point we have only corrected the geometry of our screen. The screen's color, however, still interferes with the color provided by our projection. To solve this problem one has to distinguish between surface types: while the color of some surfaces is altered by specular reflection and similar effects, the color of others is defined solely by their diffuse color. So far, research on smart projectors focuses only on the latter because of their simplicity. These surfaces are known as Lambertian surfaces. If, however, a surface absorbs a crucial part of the light spectrum, color correction as well as geometry correction will fail.

Lambert's Law

In order to understand Lambert's Law it is crucial to understand Lambertian surfaces. The color of a Lambertian surface can be accurately defined by a few decisive parameters:
1. the surface's material color (M)
2. the light color
3. the intensity that leaves the source (I)
4. the distance to the surface (r)
5. the incidence angle of the light rays (a)

Lambertian surfaces are diffuse reflectors, which means that light hitting the surface is reflected in all directions with the same intensity. The reflected intensity depends on the angle between the light vector and the surface's normal. These attributes are captured in Lambert's Law:

R = (I * cos(a) / r^2) * M

R approximates the diffuse reflection.

Neutralized texture

In order to accurately display an image, the projector must neutralize undesired color and light intensity. This is done using Lambert's Law. We start by adjusting Lambert's Law so that it takes the environmental lighting into account:

R = (I * cos(a) / r^2) * M + E*M

E holds the intensity and color of the environmental light. Solving the equation for I gives us the desired projection intensity:

I = (R - E*M) / (M * cos(a) / r^2)

The last question remaining is how the parameters are acquired. The desired reflected intensity R is given by the original image. E*M can be approximated by turning off the projector and using the camera to measure the light intensity; the result is a value proportional to E*M. M * cos(a) / r^2 can be approximated in much the same way, except that this time the projector projects at full intensity (I = 1).
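As an illustration of the equations above, here is a minimal per-pixel compensation sketch (Python/NumPy, variable names invented). It assumes the camera response with the projector switched off (proportional to E*M) and the response at full projector power (proportional to M*cos(a)/r^2) have already been captured as described.

```python
import numpy as np

def neutralizing_intensity(target, env, full, eps=1e-4):
    """Per-pixel projector intensity I = (R - E*M) / (M*cos(a)/r^2).

    target: (H, W, 3) desired reflected color R (the original image).
    env:    (H, W, 3) camera measurement with the projector off, ~ E*M.
    full:   (H, W, 3) camera measurement at full projector power (I = 1),
            ~ M*cos(a)/r^2.
    """
    intensity = (target - env) / np.maximum(full, eps)  # avoid division by zero
    return np.clip(intensity, 0.0, 1.0)  # the projector cannot emit outside [0, 1]
```

Pixels whose required intensity falls outside [0, 1] cannot be fully neutralized; this is the failure case mentioned above for surfaces that absorb a crucial part of the light spectrum.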

3.4. Applied technological tools

So far, smart projectors use devices accessible to the public. The basic configuration consists of a projector, a camera, some form of microcontroller and a graphics card. All data is saved in textures in order to make use of the pixel shader technology found in modern graphics cards. The GPU does all the necessary calculations and then repeatedly uses a displacement texture to transform the input image into the desired projection on a pixel-by-pixel basis. The limited resolution of the camera and the projector is still a major weakness of smart projectors.

3.5. Smart Projectors in habitual environments

Calibration

Before usage, the user must calibrate the smart projector. This is done by placing the camera as close as possible to the optimal viewing position (the "sweet spot"), facing the screen. The projector then performs all the calculations discussed in chapter 3.3. This process takes about 28 seconds, which is very acceptable for everyday use. Once calibrated, the camera can be detached and is not needed again until the projector has to be recalibrated.

Multiple Viewers

The calibration process optimizes the projection for one specific viewing position, the sweet spot. This means that people sitting next to each other may see differing degrees of distortion in the screen. If, however, one refrains from using extreme surfaces such as a corner of the room, the difference in perception is rather small.

3.6. Multiple projectors

Up to this point we still face a serious problem: on some surfaces the projector may cast shadows, so that the emitted light never reaches these areas. This particular problem can be solved using more than one projector. Having multiple projectors means achieving the desired reflection by distributing the required light intensity evenly over the given projectors. This leads to the following adaptation of Lambert's Law:

R = E*M + I1*F1*M + I2*F2*M + ... + IN*FN*M

where Fi = cos(ai)/ri^2 denotes the geometric factor of projector i. Assuming every projector emits light with the same intensity, we get:

Ii = (R - E*M) / (F1*M + F2*M + ... + FN*M)

Multiple projectors also enable us to split the screen into tiles projected by different projectors. In this way we can create large-scale screens in high resolution.
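A sketch of this shared-intensity computation, under the same assumptions as the single-projector example and with the full-power measurement of each projector standing in for Fi*M (names invented):

```python
import numpy as np

def shared_intensity(target, env, responses, eps=1e-4):
    """Intensity each projector emits so that their combined light reflects `target`.

    Implements Ii = (R - E*M) / (F1*M + ... + FN*M), where `responses` is a list
    of the N full-power camera measurements Fi*M, one per projector.
    """
    total = np.maximum(sum(responses), eps)        # per-pixel sum of all contributions
    return np.clip((target - env) / total, 0.0, 1.0)
```

Because the denominator is evaluated per pixel, a region lying in the shadow of one projector (where its contribution is close to zero) is automatically covered by the remaining projectors.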
4. Enabling View-Dependent Stereoscopic Projection

4.1. Prerequisites

View-dependent stereoscopic projection builds on the possibilities given by smart projectors and takes the innovation one step further.

4.2. Innovation compared to smart projectors

The capabilities of smart projectors enable us to accurately view a projection on an arbitrary surface from one fixed location, the sweet spot. The next step is to allow the sweet spot to be moved in real time. Using this capability, the sweet spot can be moved according to the viewer's tracked location, allowing the user to accurately see the projected image from any angle. Having tracked the spectator's position, the projector can use this information to determine what to display on screen, creating the illusion of a three-dimensional display.

4.3. Core technological realization

To date, two methods have proven useful in achieving the desired effect. The first method is purely image based. Instead of having just one camera to calibrate the projector, multiple cameras are used to cover a variety of possible viewing positions. Using the viewer's tracked position, the projector can calculate the projection parameters by appropriately weighting the cameras' parameters. This approach retains the benefit smart projectors had in that the setup is completely independent of the chosen screen.

A second approach is more geometry based. It either tries to analyze the geometry of the screen and save it as a three-dimensional model, or it uses predesigned models. Based on the model and the viewer's position, the projection is then calculated in real time. This method is, however, strongly dependent on the quality of the model and is therefore difficult to use for arbitrary surfaces.

4.4. Areas of application

The additions given by view-dependent stereoscopic projections can be used in the same areas as smart projectors. Three-dimensional projectors could enhance everyday life at home or at work. A very likely area of application could also be a museum.

5. Dynamic Shader Lamps

5.1. Description

The aim of research on dynamic shader lamps is to create movable objects with virtual texture. The virtual texture is projected onto the given physical object using multiple projectors.

5.2. Shader Lamps

Shader lamps are projectors used to paint the surface of a physical object. The given object is white in reality; its texture is altered by projecting the desired images onto its surfaces. Basic shader lamps only work for stationary objects.

Painting on Movable Objects

An interesting aspect of shader lamps is that they allow the texture of an object to be manipulated easily. This opens up the possibility of painting on an object.

5.3. Areas of application

Shader lamps could become useful in video conferences. Since all texture data of the objects is held virtually, the data can be distributed among the participants: if one person alters an object's texture, exact replicas of the object at the other sites show the same texture. Further applications may include the cosmetics industry, art galleries, clothes shopping or children's toys.

5.4. Technical aspect

Projector arrangement

For shader lamps to function correctly it is sufficient to have two projectors. The quality of this setup will, however, depend highly on the geometry of the object used; some objects may cast shadows when only two projectors are used.

Tracking + Calibration

The object is tracked using a sensor attached to it. Using this sensor one can easily acquire the transformation matrix of the object. Multiplying each vertex of the object by this matrix yields the object's vertices in world space, which are then used to accurately project onto the object. The projectors themselves need to be calibrated only once. During this process a transformation matrix is created which transforms points in world space into projector space.

Choice of objects

In theory any object can be used. However, using objects with Lambertian surfaces offers the same advantages as with smart projectors. Keeping the geometry simple is also beneficial, both to prevent shadows and to keep the data size small. Complex models also require automated approaches for acquiring the geometry data; here the same techniques as with stereoscopic projection can be used (see chapter 4.3.).

The brush

The brush is used to paint on the object. How much the brush affects the object depends solely on the brush's distance from the object. This is defined in the brush function:

B(x) = 1 - (r/R)^2, with r the distance to the brush and R the brush radius

The color at a specific position x on the surface of the object is calculated using alpha blending:

Color(x) = Color(x) * (1 - B(x)) + BrushColor * B(x)
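Putting the tracking and painting steps together, a minimal sketch might look as follows (Python/NumPy, all names invented; the actual system evaluates this per frame on graphics hardware):

```python
import numpy as np

def to_world(vertices, tracker_matrix):
    """Transform object-space vertices (N, 3) into world space using the 4x4
    transformation matrix reported by the sensor attached to the object."""
    homogeneous = np.hstack([vertices, np.ones((len(vertices), 1))])
    return (homogeneous @ tracker_matrix.T)[:, :3]

def brush_weight(distance, radius):
    """Brush function B = 1 - (r/R)^2, clamped to zero outside the brush radius."""
    return np.clip(1.0 - (distance / radius) ** 2, 0.0, 1.0)

def paint(colors, world_vertices, brush_position, brush_radius, brush_color):
    """Alpha-blend the brush color into the object's per-vertex texture:
    Color(x) = Color(x)*(1 - B(x)) + BrushColor*B(x)."""
    r = np.linalg.norm(world_vertices - brush_position, axis=1)
    b = brush_weight(r, brush_radius)[:, None]
    return colors * (1.0 - b) + brush_color * b
```

The repainted colors are then rendered from each projector's point of view using the world-to-projector matrices obtained during calibration.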

5.5. Problems waiting to be resolved

There are still a couple of limitations for dynamic shader lamps:
- The sensors have fairly high latency, which disturbs the synchronization between moving the object and projecting the texture.
- The sensors have a fairly small range, which limits the size of the entire setup.
- As with smart projectors, shadows cause problems for shader lamps as well.

6. Seamless integration of VR into habitual workplaces

6.1. Brief Description

This research topic deals with incorporating virtual projections into our everyday lives. Emphasis is placed on the use of projectors to add virtual objects to the desk environment.

Innovation + technology

This approach combines virtual objects with real objects. Both are manipulated with one's hands: real objects directly and virtual objects by means of gestures. An important technological aspect not covered in the previously mentioned research areas is the challenge of tracking a user's hand motion.

6.2. Areas of application

A wide variety of applications are imaginable. Virtual notes could be made on one's desk and easily archived. The technology could be used as a search function in which projectors mark real books that one is looking for. Rooms could have virtual pictures which adapt to whoever is presently in the room. Visualizations could aid when trying to repair something.

7. References

Bimber, O., Emmerling, A., and Klemmer, T., "Embedded Entertainment with Smart Projectors".

Bimber, O., "Enabling View-Dependent Stereoscopic Projection in Real Environments", submitted to the International Symposium on Mixed and Augmented Reality (ISMAR '05), 2005.

Bandyopadhyay, D., Raskar, R., and Fuchs, H., "Dynamic Shader Lamps: Painting on Movable Objects", IEEE and ACM International Symposium on Augmented Reality, October 2001.

Bimber, O., Encarnação, L.M., and Stork, A., "Seamless integration of virtual reality in habitual workplaces", Journal for Industrial Science, Munich University of Technology, vol. 55, no. 2.
