Rendering Multi-Perspective Images with Trilinear Projection


Scott Vallance, Paul Calder
School of Informatics and Engineering, Flinders University of South Australia, PO Box 2100, Adelaide, South Australia 5001

Abstract

Non-linear projections of 3D graphical scenes can be used to compute reflections and refractions in curved surfaces, draw artistic images in the style of Escher or Picasso, and produce visualisations of complex data. Previously, most non-linear projections were rendered by ray tracing. This paper presents trilinear projection, a technique for rendering non-linear projections in a manner that achieves significant performance benefits by taking advantage of current rendering hardware and software. Trilinear projections are geometrically similar to Phong-shaded triangular patches, and like Phong patches they can be joined to represent more complicated shapes. The paper details how a single trilinear projection projects a scene point, how projections of scene triangles can be built up by considering the connectivity of projected points, and how multiple trilinear projections can be combined. Finally, it outlines a method for using trilinear projection to approximate reflections and refractions on curved surfaces.

Keywords: Multi-Perspective Images, Non-Linear Projection, Reflections, Refractions, Rendering Algorithms.

1 Introduction

Most computer graphics rendering of 3D scenes is linear. For example, a perspective projection simulates the physics of optics, mapping scene data back to a single point in space as if viewed through a camera lens. Nonlinear projections differ from linear projections in that straight lines in 3D may not project to straight lines. Such projections can be used to compute reflections and refractions in curved surfaces, draw artistic images in the style of Escher or Picasso, and produce visualisations of complex data. Previous techniques for nonlinear projection have used ray tracing or relied on distortion of the scene data. This paper presents a new technique, called trilinear projection, for rendering nonlinear projections in a manner analogous to rendering with a perspective transformation matrix. The technique is based on a trilinear interpolation similar to that used for Phong-shaded triangular patches. And like Phong patches, multiple trilinear projections can be joined together to represent complex projection surfaces while maintaining continuity across sub-projections.

1.1 Nonlinear Projections

Nonlinear projections occur naturally as reflections and refractions on curved objects. The strange and distorted images seen in an amusement park mirror are a familiar example of the distortion nonlinear projections generate. These images have also been examined in art, photography and computer graphics, where they have variously been named cubist images, multi-perspective images, multiple-centre-of-projection images and multi-perspective panoramas. Traditional Chinese landscape paintings frequently contain different foci, or sub-images, which are seamlessly joined. For example, Figure 1 shows a scene in which the perspective shifts from left to right, following the path of the stream. German artist M. C. Escher frequently depicted views with multiple vanishing points, or perspectives. For example, Figure 2 has five different vanishing points: top left and right, centre, and bottom left and right. While the automatic generation of images like these from 3D geometry may not be practical, they illustrate the concept and the aesthetic potential.
Copyright (c) 2006, Australian Computer Society, Inc. This paper appeared at the Twenty-Ninth Australasian Computer Science Conference (ACSC2006), Hobart, Tasmania, Australia, January 2006. Conferences in Research and Practice in Information Technology, Vol. 48. Vladimir Estivill-Castro and Gill Dobbie, Eds. Reproduction for academic, not-for-profit purposes permitted provided this text is included.

Figure 1: Fisherman's Evening Song by Xu Daoning, circa 11th Century

Strip cameras are widely used in surveillance and mapping. These cameras have a continuous roll of film that slides past a slit as a picture is being taken. The camera may be moved whilst shooting, providing a change in point of view from one section of the film to another. For example, if used from a moving aeroplane a strip camera can capture a long section of curved earth as if it were flat. The technique has also been used for artistic purposes, such as the image in Figure 3 (Davidhazy 2001).

1.2 A Trilinear Projection Surface

In geometric terms, a perspective projection can be defined by a set of rays that emanate from a single point in space, and an orthographic projection by a set of parallel rays that emanate from a plane. By extending this approach, a nonlinear projection can be defined as a projection created by a set of rays that emanate from an arbitrary surface, with the origin and direction of each ray a function of the surface. In computer graphics, complex curved surfaces are usually approximated as a mesh of triangles because triangle geometry is easy to compute and render.

Figure 2: High and Low by M. C. Escher, an example of a nonlinear projection

Figure 3: A strip camera photograph of a man's head

Figure 4: A scene of columns (a) rendered with conventional perspective (b) rendered from a torus surface

Figure 5: A nonlinear projection of an elephant

The shape of a projection surface can thus be approximated by a suitable triangle mesh, and the directions of projection rays can be approximated by specifying the vertex normals of the triangles. The directions of rays internal to the triangles are defined implicitly by linear interpolation of the vertex rays. Using this approach, a complex non-linear projection can be computed by tiling smaller projections (trilinear projections), each defined by a triangle with specified vertex positions and normals. Because in general the vertex normals of a projection triangle do not converge to a single point, a trilinear projection will be nonlinear. And because adjacent triangles share vertex normals, the combined projection will be continuous across triangle boundaries.

1.3 Related Work

Nonlinear projections can be rendered by ray tracing if a nonlinear projection can be expressed as a set of rays. For example, Löffelmann and Gröller (Löffelmann 1995) define an extended camera as a set of rays that start on a surface and point in the direction of the surface normal. The scene is then rendered with POV-Ray [ovpl04], a widely available ray tracing implementation. Figure 4 shows a scene rendered with standard perspective projection and with a toroidal extended camera.

Rademacher and Bishop (Rademacher & Bishop 1998) describe techniques for generating multiple-centre-of-projection (MCOP) images. The images are generated by moving a virtual camera through a scene and capturing a single line of pixels at regular intervals. The technique is effectively a virtual strip camera, and can generate images such as the one in Figure 5.

Yu and McMillan (Yu & McMillan 2004) introduce general linear cameras (GLC) as a mathematical description for a class of nonlinear projections defined by three rays passing through two parallel planes. The authors define and name various special cases of these cameras, and implement them with ray tracing and a light field rendering system. The technique is similar to trilinear projection but more constrained, in that the vectors must have equal magnitude in the direction normal to the view. In that sense, GLC projections are a subset of trilinear projections.

Various authors have considered techniques for computing reflections from curved surfaces. For example, Glaeser (Glaeser 1999) presents equations for calculating the reflection of a space point on a sphere or cylinder of revolution, and environment mapping (originally described by Blinn and Newell (Blinn & Newell 1976)) approximates curved reflections by sampling the projection of a scene as if drawn from a point behind the reflection surface. Variations on environment mapping, such as extended environment maps (Cho 2000) and parameterised environment maps (Hakura, Snyder & Lengyel 2001), can produce more accurate images but at greater computational cost. Ofek and Rappoport (Ofek & Rappoport 1999) describe an intriguing approximation for rendering reflections on curved surfaces that involves distorting objects based on the reflective surface so that they may then be rendered using standard perspective projection.
The technique requires an appropriate tessellation of both the reflective surface and the scene objects to approximate the curvature of the reflected lines. The correspondence between a point in space and the reflective surface is approximated by computing an explosion map (the projection of the reflective surface onto the surface of a surrounding sphere) and then computing where on the map a scene point falls. The performance of the technique is sufficient for real-time

rendering of moderate scenes, making it suited for visualisation tasks.

1.4 Organisation of the Paper

The remainder of this paper is organised as follows: Section 2 presents the algorithms and techniques for computing the trilinear projection of a single scene point. Section 3 describes how the projection for a scene triangle can be computed from the projections of its vertices. Since the projection is non-linear, and since each point can have up to 3 images, the projection for a single scene triangle can have up to 9 vertices and may consist of several disconnected shapes. Section 4 shows how multiple trilinear projections can be combined to represent projection from a complex surface. Section 5 outlines how trilinear projection can be applied to the task of computing reflections and refractions from curved surfaces. Finally, Section 6 briefly examines the performance characteristics of trilinear projection.

2 Projecting a Point

For the purposes of this paper, we define projecting a point as determining which location(s) on the projecting surface define a ray that intersects that point. A ray can be defined parametrically by a point p and a normal n:

    r(t) := p + t n    (1)

If p and n are defined using barycentric coordinates u and v, vertex positions p_1..3 and normals n_1..3, then a ray on the surface of the trilinear projection becomes:

    r(u, v, t) := p(u, v) + t n(u, v)
                = (1 - u - v) p_1 + u p_2 + v p_3 + t((1 - u - v) n_1 + u n_2 + v n_3)    (2)

So for a scene point p_s:

    p_s = r(u, v, t) = (1 - u - v) p_1 + u p_2 + v p_3 + t((1 - u - v) n_1 + u n_2 + v n_3)    (3)

Solving for u, v and t in 3D means a system of three equations in three unknowns. The t and u, v terms are multiplied together, making it a non-linear system of equations, and an analytical solution is not readily apparent.

2.1 Treating the Trilinear Projection as a Parametric Triangle

To solve this problem analytically, instead of representing the surface as a set of rays we represent it as a parametric triangle. Each vertex has a point and a normal associated with it. Treating these as rays, we can extend along them according to the parameter t, giving three new points which form the vertices of a triangle as shown in Figure 6. The three vertices of the parametric triangle are defined by the equations:

    r_1 := p_1 + t n_1,    r_2 := p_2 + t n_2,    r_3 := p_3 + t n_3    (4)

A barycentrically defined point on the parametric triangle is:

    p(u, v, t) := (1 - u - v) r_1 + u r_2 + v r_3
                = (1 - u - v)(p_1 + t n_1) + u(p_2 + t n_2) + v(p_3 + t n_3)    (5)

Figure 6: A parametric triangle shown at different values of t

The task of projecting a scene point now becomes that of finding a point p(u, v, t) that coincides with the scene point. The u, v and t satisfying this constraint are the same as those satisfying p_s = r(u, v, t) because the two equations are simply isomorphs. The advantage of the parametric triangle representation is that the parameter t can be determined independently of u and v: first, for the parametric vertices defined in Equation 4, find the values of t such that the vertices and the scene point p_s are coplanar; then solve for u and v in the plane of the parametric triangle.

2.2 Determining the Coplanarity of the Parametric Triangle and the Scene Point

Any four points can be considered a tetrahedron; four coplanar points form a tetrahedron whose volume is 0.
For a tetrahedron defined by four points A, B, C and D, the volume is:

    volume := (1/6) |AB · (AC × AD)|    (6)

This can equally be expressed as the magnitude of the determinant of the matrix whose rows are the three vectors:

    volume := (1/6) |det(AB, AC, AD)|    (7)

Composing the three vectors from the parametric vertices defined in Equation 4 and the scene point p_s, and substituting into Equation 7, gives the volume of the tetrahedron defined by those four points. When these points are coplanar this volume is 0. Removing the unnecessary constant and magnitude gives:

    det(p_1 + t n_1 - p_s,  p_2 + t n_2 - p_s,  p_3 + t n_3 - p_s) = 0    (8)

This can be expanded in three dimensions, giving a cubic polynomial in t. Either one or three real solutions for t exist, and each value of t defines a potentially different projection of the scene point.
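To make these steps concrete, the following Python/NumPy sketch projects a single scene point through one trilinear projection. It is an illustration only, not the authors' implementation: the function names are hypothetical, and the scene-point-independent parts of the coplanarity determinant are recovered numerically (by evaluating its cofactors at four values of t and fitting cubics) rather than from the symbolic expansion given in (Vallance 2005). This also anticipates the precalculation described in Section 2.3: once the 4x4 coefficient matrix is built, each scene point costs only a matrix-vector product, a cubic solve, and a small 2D solve for u and v.

import numpy as np

def precompute_cubic_matrix(p, n):
    # p, n: 3x3 arrays, one row per projection-triangle vertex (positions, normals).
    # Returns the 4x4 matrix M such that, for a scene point s, the coefficients of
    # the coplanarity cubic a t^3 + b t^2 + c t + d = 0 are M @ [s_x, s_y, s_z, 1].
    t_samples = np.array([0.0, 1.0, 2.0, 3.0])
    cof = np.zeros((4, 4))          # cof[k, j]: j-th first-row cofactor at t_samples[k]
    for k, t in enumerate(t_samples):
        rows = np.hstack([p + t * n, np.ones((3, 1))])   # the three parametric-vertex rows
        for j in range(4):
            cof[k, j] = (-1) ** j * np.linalg.det(np.delete(rows, j, axis=1))
    # Each cofactor is a polynomial in t of degree <= 3; four samples recover it exactly.
    return np.column_stack([np.polyfit(t_samples, cof[:, j], 3) for j in range(4)])

def project_point(p, n, M, s, eps=1e-9):
    # Returns the list of (u, v, t) images of scene point s under the projection (p, n).
    coeffs = M @ np.append(s, 1.0)          # [a, b, c, d] of the cubic in t
    images = []
    for t in np.roots(coeffs):
        if abs(t.imag) > eps:
            continue                        # only real roots give projections
        t = t.real
        r = p + t * n                       # parametric triangle at this value of t
        # Solve s = (1-u-v) r1 + u r2 + v r3 in the plane of the parametric triangle.
        A = np.column_stack([r[1] - r[0], r[2] - r[0]])
        (u, v), *_ = np.linalg.lstsq(A, s - r[0], rcond=None)
        images.append((u, v, t))
    return images

# Example: a gently diverging projection triangle and one scene point.
p = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
n = np.array([[0.0, 0.0, 1.0], [0.2, 0.0, 1.0], [0.0, 0.2, 1.0]])
M = precompute_cubic_matrix(p, n)
print(project_point(p, n, M, np.array([0.3, 0.3, 2.0])))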

2.3 Precalculating Partial Coefficient Values

Computing the coefficients of Equation 8 involves substantial calculation that does not depend directly on the scene point. These calculations depend only upon the parametric triangle itself and can therefore be reused across scene points. This is useful because, in a normal scene, many scene points need to be projected by each parametric triangle; the most common drawing primitive, the triangle, comprises three such scene points. The additional memory required to store these values is small in comparison to the reduction in computation.

In Equation 7 the volume of a tetrahedron is defined as the determinant of three vectors formed from the four points. This can equally be expressed as the four-by-four determinant shown in Equation 9:

    \begin{vmatrix}
    p_{sx} & p_{sy} & p_{sz} & 1 \\
    p_{1x} + t n_{1x} & p_{1y} + t n_{1y} & p_{1z} + t n_{1z} & 1 \\
    p_{2x} + t n_{2x} & p_{2y} + t n_{2y} & p_{2z} + t n_{2z} & 1 \\
    p_{3x} + t n_{3x} & p_{3y} + t n_{3y} & p_{3z} + t n_{3z} & 1
    \end{vmatrix} = 0    (9)

The determinant in Equation 9 can be expanded to Equation 10, where E_1..4 are three-by-three determinants:

    p_{sx} E_1 - p_{sy} E_2 + p_{sz} E_3 - E_4 = 0    (10)

Determinants E_1..3 are quadratics in t and E_4 is a cubic. This means the coefficient of t^3 depends only upon properties of the parametric triangle and not the scene point. Further, the other coefficients of the cubic defined by the full expansion of Equation 9 depend on the scene point in a useful way. Each E_i can be written in the general cubic form of Equation 11, where A_1..3 = 0:

    E_i = A_i t^3 + B_i t^2 + C_i t + D_i,    i = 1, 2, 3, 4    (11)

If the coefficients of the cubic equation defined by Equation 9 are represented by a, b, c and d, where a t^3 + b t^2 + c t + d = 0, then the relationship between the coefficients and the properties (A, B, C, D)_1..4 is:

    \begin{pmatrix} a \\ b \\ c \\ d \end{pmatrix} =
    \begin{pmatrix}
    A_1 & -A_2 & A_3 & -A_4 \\
    B_1 & -B_2 & B_3 & -B_4 \\
    C_1 & -C_2 & C_3 & -C_4 \\
    D_1 & -D_2 & D_3 & -D_4
    \end{pmatrix}
    \begin{pmatrix} p_{sx} \\ p_{sy} \\ p_{sz} \\ 1 \end{pmatrix}    (12)

All properties (A, B, C, D)_1..4 can be calculated independently of scene points. The full derivations of (A, B, C, D)_1..4 are provided elsewhere (Vallance 2005). Using these pre-calculated partial values allows faster calculation across multiple scene points.

3 Drawing a Scene Triangle

Each of the vertices of a scene triangle can be separately projected with the trilinear projection, generating either one or three solutions for each vertex. The manner in which the projected vertices are connected can be understood by considering the sweep of the parametric triangle intersected with the scene triangle.

3.1 Computing the Projected Shapes

At every value of t the parametric triangle is a standard triangle in scene space. The intersection of this triangle's plane with the scene triangle's plane forms a line. As the value of t changes, the intersection line traverses the plane of the scene triangle. The motion of this intersection line is continuous because t is continuous, except where the intersection line is undefined because the scene triangle and parametric triangle are parallel or coplanar.

A straight line intersects at most two sides of a triangle. Furthermore, the locus of a continuously defined line must intersect a vertex at the end of an edge before intersecting the edge itself. The order, from smallest to largest t value, in which projected scene triangle vertices are intersected determines the connectivity of those vertices. Initially the line of intersection crosses the two edges connected to the vertex projected by the smallest t value. Each vertex thereafter toggles which edges are being intersected by the parametric triangle. When all edges are toggled off, the line of intersection is no longer traversing the scene triangle and the projected shape is complete. For strings of vertices, in order of t value, toggling the edge states reveals which vertices group together to form a shape.
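The toggling rule can be expressed compactly in code. The sketch below (Python; the function name and data layout are illustrative assumptions rather than the paper's implementation) takes the projected vertices of one scene triangle sorted by t, toggles the two scene-triangle edges incident to each vertex, closes a shape whenever no edge remains active, and assembles each shape's vertex order using the left and right open edges described in the worked example that follows.

def group_projected_vertices(proj):
    # proj: list of (t, vertex_index) pairs, one per cubic solution of each scene
    # vertex (3 to 9 entries); vertex_index is 0, 1 or 2.  Returns a list of shapes,
    # each an ordered list of (t, vertex_index) forming one projected polygon.
    edges_of = {0: frozenset([(0, 1), (0, 2)]),
                1: frozenset([(0, 1), (1, 2)]),
                2: frozenset([(0, 2), (1, 2)])}
    shapes, active = [], set()
    left, right = [], []
    left_edge = right_edge = None
    for t, v in sorted(proj):
        if not active:                         # first vertex of a new shape
            left_edge, right_edge = sorted(edges_of[v])
            active = set(edges_of[v])
            left, right = [(t, v)], []
            continue
        if left_edge in edges_of[v]:           # vertex lies on the left open edge
            left.append((t, v))
            left_edge = next(iter(edges_of[v] - {left_edge}))
        else:                                  # otherwise it extends or closes the right side
            right.append((t, v))
            right_edge = next(iter(edges_of[v] - {right_edge}))
        active ^= edges_of[v]                  # toggle this vertex's two edges
        if not active:                         # both open edges closed: shape complete
            shapes.append(left + right[::-1])
    return shapes

# Example: the five-vertex shape discussed below, V = <v2, v3, v3, v1, v3>,
# using 0-based indices (v1 -> 0, v2 -> 1, v3 -> 2) and arbitrary t values.
V = [(0.1, 1), (0.3, 2), (0.5, 2), (0.7, 0), (0.9, 2)]
print(group_projected_vertices(V))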
Even though the scene triangle has three vertices, it does not necessarily project to shapes with three vertices. Moreover, a single scene triangle can produce up to four shapes when projected with a parametric triangle. This case arises when the three scene vertices produce three t solutions each, and the nine projected vertices form three two-vertex shapes and one three-vertex shape.

For example, consider a list of projected vertices V = <v_2, v_3, v_3, v_1, v_3>, in order of t value, which form a five-vertex shape. The following steps show how these vertices are assembled into a polygon. In the accompanying diagrams, each circle represents a processed vertex. An arrow between two circles represents the edge upon which those two vertices are connected. An unterminated arrow represents an open edge at that stage of the traversal. Only two arrows can be unterminated at any stage, and the current unterminated arrows are named left and right.

Step 1: Since the first vertex is v_2, the two edges v_1-v_2 and v_2-v_3 become active. If the next vertex were a v_1 it would be placed on left, because it must connect along the edge v_1-v_2. Similarly, if the next vertex were a v_3 it would be placed on right. Finally, a v_2 would connect to both edges and so would finish the shape; it could be placed on either left or right.

Step 2: The next vertex is in fact v_3, so it is placed on right. This toggles the active edges such that v_2-v_3 becomes inactive and v_1-v_3 active.

Step 3: The next vertex is also v_3, indicating a transition to the v_2-v_3 edge. Accordingly the vertex is placed on the right and the right's edge is changed.

Step 4: In a similar way, vertex v_1 is placed on the left and the left's edge is changed to v_1-v_3.

Step 5: The final vertex is v_3, which could be placed in either list; it is arbitrarily placed on left for convenience. The previously active edges v_1-v_3 and v_2-v_3 are toggled off and the shape is finished.

3.2 Drawing the Projected Polygons

The result of projecting each vertex of a triangle and then assembling the vertices is a set of polygons. One scene triangle may result in up to nine projected vertices, which can be arranged in up to four different polygons of two to nine vertices (though not all combinations are possible). Each projected vertex has u, v and t values associated with it, and these values correspond to post-projection x, y and z coordinates. Using these coordinates a polygon can be drawn with standard rasterising techniques, with visibility resolved by the depth coordinate. Rasterising assumes that straight lines connect the projected vertices, which is unlikely to be accurate. Figure 7 shows one scene triangle projected by a single trilinear projection, resulting in two shapes, one with four vertices and the other with five. The left hand figure is the correct projection as computed by a ray tracing algorithm. The right hand figure shows the approximation introduced by trilinear projection.

Figure 7: A projected triangle resulting in two shapes: (a) ray trace (b) trilinear projection

To represent more accurately the curves that connect projected vertices, the solution can be tessellated. This can be done by sampling the scene triangle at intermediate t values. In the non-tessellated case the triangle is sampled at one value of t for each projected vertex. Extra values of t between the start and end of a shape can be inserted to increase the projected polygon's accuracy. For each t value the parametric triangle is a triangle in scene space, and the intersection of this triangle and the scene triangle is a line. As t runs from the smallest value for a particular shape to the largest, the intersection line crosses the scene triangle. This line, clipped to the overlapping sections of the scene and parametric triangles (for that particular t value), corresponds directly to a line on the final image. Figure 8 shows a scene triangle rendered with 5 extra t samples per shape.

Figure 8: Scene triangle sampled at 5 extra t levels per shape approximating a (4,5) shape configuration

4 Combining Multiple Projections

A projection comprising a single trilinear projection has limited curvature. More complicated projections can be built with a mesh of trilinear triangles. In the same way that a triangular scene mesh can be made to appear smoothly curved by sharing surface normals, so can a projection mesh. When projecting with multiple trilinear triangles, issues of continuity arise that can be solved by clipping in scene space.

The simplest approach to the problem of clipping is to clip each projection in view space. This can result in discontinuity, because the lines between projected vertices are drawn as straight when they are really curved. Discontinuities arising from this simplistic approach can be addressed by clipping scene triangles to the volume swept out by projecting the parametric triangle into scene space. The result is equivalent to tessellating the projected scene triangle at the boundary of the next trilinear projection.

Each edge of a surface triangle forms a parametric line segment that traces out a surface through space, which may intersect with edges in the scene triangles. To clip correctly to a parametric surface triangle region, the three parametric line segments defining the triangle edges must be traced through all the scene triangles, and the triangles clipped according to the intersections. A line segment of the parametric triangle at t is defined parametrically by:

    e(s, t) := p_i + t n_i + s((p_j + t n_j) - (p_i + t n_i)),    i, j = 1, 2, 3,  i != j    (13)

where s varies from 0 to 1. Consider the intersection between a parametric edge, defined by the points p_i + t n_i and p_j + t n_j, and a scene triangle edge, defined by the points p_s1 and p_s2. The value of t at the intersection, as projected out of the trilinear projection, can be determined independently of s, because for the two line segments to intersect they must lie on the same plane. According to Equation 7 the four points are coplanar when:

    det(p_s1 - p_s2,  p_i + t n_i - p_s2,  p_j + t n_j - p_s2) = 0    (14)

This can be expanded to give a quadratic in t. The quadratic may have two real solutions or none. Each value of t defines a triangle, and the scene line can be intersected with that triangle to yield a clipping point.

5 Applications

Non-linear projections have been explored in art and visualisation because of their ability to represent 3D objects in unusual ways. Though not as readily interpreted as perspective projections, they may have the potential to illustrate complicated relations that are hidden in perspective projections. Finding nonlinear projections that genuinely improve the task of visualisation is an unsolved problem. This paper instead examines the use of trilinear projection as a way of approximating reflections and refractions on curved surfaces.

5.1 Reflection

The specular reflection of a ray is governed by the equation:

    θ_r = θ_i    (15)

where θ_r is the angle of reflection and θ_i is the angle of incidence. This leads to the following equation:

    r = 2(n · i) n - i    (16)

where r, n and i are unit vectors for the reflection, surface normal and incidence direction (with i directed away from the surface).

5.2 Refraction

Refraction is the bending of light due to the difference in refractive index between two materials. Simple refraction can be described by Snell's law, which relates the angles of incidence and refraction to the refractive indices of the materials:

    sin θ_i / sin θ_t = η_t / η_i    (17)

where θ_i is the angle of incidence, θ_t is the angle of transmission, η_t is the refractive index of the material the ray is entering and η_i is the refractive index of the material the ray is leaving. This leads to the following equation:

    q = ((cos θ_i)/η - cos θ_t) n - (1/η) i    (18)

where q, i and n are unit vectors of transmission, incidence and the surface normal (i again directed away from the surface), and η = η_t / η_i.

Approximating Reflections and Refractions

For a given scene surface, represented by a polygon mesh with shared normal vectors (a Phong-shaded mesh), the reflected or refracted image in each segment can be approximated by a trilinear projection.
The points p_1..3 and vectors r_1..3 in Figure 9 form a trilinear projection approximating the reflection seen from the eyepoint e in the triangle defined by the points p_1..3 and the normals n_1..3. The projection is exactly correct at the vertices, but the reflection vectors across the surface will in general only approximate the correct solution. This is because, instead of interpolating the surface normal and then calculating the reflection vector, the reflection vector is interpolated from the vertex reflection vectors. In Figure 9 the reflection at point p should be along the ray r, but in the trilinear projection it is along ray r'. Despite the inaccuracy, the computed reflection has several desirable characteristics: it is nonlinear, which means that the reflection is curved, as expected, and it is continuous across multiple reflective facets. Refractions, inter-reflections and inter-refractions can all be approximated by using the vector equations for reflection and refraction at the vertices of the face to generate a trilinear projection.

Figure 9: An approximation of a reflection with a trilinear projection
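As a rough illustration of this construction, the following Python/NumPy sketch computes the per-vertex projection rays for a reflective facet using Equation 16, together with a refraction variant consistent with Snell's law; the function names and the returned (points, rays) convention are assumptions for illustration, not the paper's interface. The resulting vertex positions and rays define the trilinear projection through which the scene is then imaged (for example with the point-projection sketch given earlier).

import numpy as np

def reflect(i, n):
    # Equation 16: i and n are unit vectors, with i pointing away from the surface.
    return 2.0 * np.dot(n, i) * n - i

def refract(i, n, eta_i, eta_t):
    # Transmission direction by Snell's law; i points away from the surface.
    # Returns None when total internal reflection occurs.
    cos_i = np.dot(n, i)
    mu = eta_i / eta_t
    k = 1.0 - mu * mu * (1.0 - cos_i * cos_i)   # squared cosine of the transmission angle
    if k < 0.0:
        return None
    return (mu * cos_i - np.sqrt(k)) * n - mu * i

def reflective_projection(eye, p, n):
    # p, n: 3x3 arrays of Phong-mesh vertex positions and unit normals.
    # Returns vertex positions and reflection rays defining the trilinear
    # projection that approximates the reflection seen from `eye`.
    rays = np.empty_like(p)
    for k in range(3):
        i = eye - p[k]                          # incidence vector towards the eye
        rays[k] = reflect(i / np.linalg.norm(i), n[k])
    return p, rays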

5.3 Example Projections

Figure 10 shows a scene with a reflective sphere and a cube, rendered from a viewpoint close to the sphere's surface. The rays at the four corners of the view intersect the sphere and are reflected off at various angles.

Figure 10: A cube reflected in a sphere

Figure 11 (a) shows a ray traced image similar to that which would be seen in the section of reflective sphere shown in Figure 10. Figures 11 (b) to (f) show a trilinear projection approximation of the image with increasing projection mesh resolution. In the 1x1 surface, Figure 11 (b), the trilinear projection is two coplanar triangles whose normals are the reflection vectors shown in Figure 10. Figures 11 (c) to (f) are rendered from projection surfaces that are increasingly accurate approximations of the surface of the sphere and the reflection vectors off the sphere's surface. The increasing mesh resolution means more trilinear projections are used to approximate the reflection, resulting in more accurate rendering.

Figures 12 (a) to (f) show a refraction of a cube scene through a plane whose normals are coincident with those of a sphere. This simulates the effect of viewing an object through a convex lens. Figure 12 (a) is the correct ray traced solution, and Figures 12 (b) through (f) show trilinear projections that approximate it using 1x1, 2x2, 3x3, 4x4 and 5x5 resolution meshes respectively.

6 Performance Characteristics

An implementation of the trilinear projection algorithms and a ray tracer was developed using OpenGL (Segal & Akeley 1998). The prototype can draw single and multiple trilinear projections, with or without clipping and tessellation. Reflective or refractive projection surfaces can be generated automatically, and the ray tracing implementation can render correct reflection and refraction solutions for comparison. The ray tracing code is naive, and performs no scene organisation optimisation to decrease rendering time.

The results here were obtained on an 800 MHz Athlon with a GeForce 2MX graphics card. Execution speeds were averaged across multiple runs. Table 1 shows the speed in milliseconds to render a single frame of the scene presented in Figure 11, all at the same pixel resolution; the scene consists of 686 untextured triangles. Table 2 shows the time taken to render the images in Figure 12, with the same scene rendered through a refractive sphere.

[Table 1: Execution time for rendering examples in Figure 11. Rows (a) to (f) list, for each image, the number of projections and the ray traced and trilinear rendering times in milliseconds.]

[Table 2: Execution time for rendering examples in Figure 12. Rows (a) to (f) list, for each image, the number of projections and the ray traced and trilinear rendering times in milliseconds.]

As noted before, ray tracing can be sped up by organising the scene so that for any particular ray only a subset of triangles need be intersected. Trilinear projection can also use scene organisation to improve speed, because any particular trilinear projection may image only a subset of the scene. For a single trilinear projection there is only a linear performance cost over scanline rendering. This cost relates to determining the coefficients of the cubic equation, solving the cubic and rejoining the projected vertices. The performance of rendering with multiple trilinear projections scales in a linear fashion.

7 Conclusion

The problem of rendering nonlinear projections with scanline-style algorithms has not previously been thoroughly examined.
This paper presents an overview of a trilinear projection algorithm that projects scene points and triangles in a manner compatible with scanline rendering algorithms. The algorithms have been implemented and used to demonstrate a range of applications. More details of the algorithms and performance benchmarks appear elsewhere (Vallance 2005). The performance characteristics of scanline rendering have made it very useful in interactive and animated computer graphics, and algorithms compatible with scanline rendering have a substantial base of hardware and software to draw upon. A limitation of scanline rendering has been its difficulty in modelling certain complicated optical interactions. The trilinear projection algorithm provides a new technique for rendering optical interactions, giving developers more tools to render unusual visualisations and optical phenomena with acceptable performance.

References

Blinn, J. F. & Newell, M. E. (1976), Texture and reflection in computer generated images, Communications of the ACM 19(10).

Cho, F. (2000), Towards Interactive Ray Tracing in Two- and Three-Dimensions, PhD thesis, University of California at Berkeley.

Davidhazy, A. (2001), Peripheral portraits and other strip camera photographs, retrieved October 2001 from the World Wide Web: rit.edu/~andpph/exhibit-6.html.

Glaeser, G. (1999), Reflections on spheres and cylinders of revolution, Journal for Geometry and Graphics 3(2).

Hakura, Z., Snyder, J. & Lengyel, J. (2001), Parameterized environment maps, in Proc. of the 2001 Symposium on Interactive 3D Graphics.

Löffelmann, H. (1995), Extended cameras for ray tracing, Master's thesis, Vienna Technical Institute.

Ofek, E. & Rappoport, A. (1999), Interactive reflections on curved objects, in Proc. of SIGGRAPH 99.

Rademacher, P. & Bishop, G. (1998), Multiple-center-of-projection images, in Proc. of SIGGRAPH 98.

Segal, M. & Akeley, K. (1998), The OpenGL Graphics System: A Specification (Version 1.2).

Vallance, S. (2005), Trilinear Projection, PhD thesis, Flinders University.

Yu, J. & McMillan, L. (2004), General linear cameras, in T. Pajdla & J. Matas, eds, Computer Vision - ECCV 2004, 8th European Conference on Computer Vision, Prague, Czech Republic, May 11-14, 2004, Proceedings, Part II, Lecture Notes in Computer Science, Springer.

Figure 11: A cube reflected on a sphere: (a) ray traced, (b) 1x1 surface, (c) 2x2 surface, (d) 3x3 surface, (e) 4x4 surface, (f) 5x5 surface

Figure 12: A cube refracted through a plane with spherical normals: (a) ray traced, (b) 1x1 surface, (c) 2x2 surface, (d) 3x3 surface, (e) 4x4 surface, (f) 5x5 surface


More information

Reading. Ray Tracing. Eye vs. light ray tracing. Geometric optics. Required:

Reading. Ray Tracing. Eye vs. light ray tracing. Geometric optics. Required: Reading Required: Watt, sections 1.3-1.4, 12.1-12.5.1 (handout) Triangle intersection handout Further reading: Ray Tracing Watt errata on syllabus page, needed if you work from his book instead of the

More information

Ray Tracing. CS334 Fall Daniel G. Aliaga Department of Computer Science Purdue University

Ray Tracing. CS334 Fall Daniel G. Aliaga Department of Computer Science Purdue University Ray Tracing CS334 Fall 2013 Daniel G. Aliaga Department of Computer Science Purdue University Ray Casting and Ray Tracing Ray Casting Arthur Appel, started around 1968 Ray Tracing Turner Whitted, started

More information

Reading. Texture Mapping. Non-parametric texture mapping. Texture mapping. Required. Angel, 8.6, 8.7, 8.9, Recommended

Reading. Texture Mapping. Non-parametric texture mapping. Texture mapping. Required. Angel, 8.6, 8.7, 8.9, Recommended Reading Required Angel, 8.6, 8.7, 8.9, 8.10 Recommended Texture Mapping Paul S. Heckbert. Survey of texture mapping. IEEE Computer Graphics and Applications 6(11): 56--67, November 1986. Optional Woo,

More information

Optics II. Reflection and Mirrors

Optics II. Reflection and Mirrors Optics II Reflection and Mirrors Geometric Optics Using a Ray Approximation Light travels in a straight-line path in a homogeneous medium until it encounters a boundary between two different media The

More information

Rendering. Converting a 3D scene to a 2D image. Camera. Light. Rendering. View Plane

Rendering. Converting a 3D scene to a 2D image. Camera. Light. Rendering. View Plane Rendering Pipeline Rendering Converting a 3D scene to a 2D image Rendering Light Camera 3D Model View Plane Rendering Converting a 3D scene to a 2D image Basic rendering tasks: Modeling: creating the world

More information

Ray Tracer Due date: April 27, 2011

Ray Tracer Due date: April 27, 2011 Computer graphics Assignment 4 1 Overview Ray Tracer Due date: April 27, 2011 In this assignment you will implement the camera and several primitive objects for a ray tracer, and a basic ray tracing algorithm.

More information

Spring 2012 Final. CS184 - Foundations of Computer Graphics. University of California at Berkeley

Spring 2012 Final. CS184 - Foundations of Computer Graphics. University of California at Berkeley Spring 2012 Final CS184 - Foundations of Computer Graphics University of California at Berkeley Write your name HERE: Write your login HERE: Closed book. You may not use any notes or printed/electronic

More information

Ray Tracing. Kjetil Babington

Ray Tracing. Kjetil Babington Ray Tracing Kjetil Babington 21.10.2011 1 Introduction What is Ray Tracing? Act of tracing a ray through some scene Not necessarily for rendering Rendering with Ray Tracing Ray Tracing is a global illumination

More information

Institutionen för systemteknik

Institutionen för systemteknik Code: Day: Lokal: M7002E 19 March E1026 Institutionen för systemteknik Examination in: M7002E, Computer Graphics and Virtual Environments Number of sections: 7 Max. score: 100 (normally 60 is required

More information

CS451Real-time Rendering Pipeline

CS451Real-time Rendering Pipeline 1 CS451Real-time Rendering Pipeline JYH-MING LIEN DEPARTMENT OF COMPUTER SCIENCE GEORGE MASON UNIVERSITY Based on Tomas Akenine-Möller s lecture note You say that you render a 3D 2 scene, but what does

More information

Models and Architectures. Ed Angel Professor of Computer Science, Electrical and Computer Engineering, and Media Arts University of New Mexico

Models and Architectures. Ed Angel Professor of Computer Science, Electrical and Computer Engineering, and Media Arts University of New Mexico Models and Architectures Ed Angel Professor of Computer Science, Electrical and Computer Engineering, and Media Arts University of New Mexico 1 Objectives Learn the basic design of a graphics system Introduce

More information

Ray Tracing. CPSC 453 Fall 2018 Sonny Chan

Ray Tracing. CPSC 453 Fall 2018 Sonny Chan Ray Tracing CPSC 453 Fall 2018 Sonny Chan Ray Tracing A method for synthesizing images of virtual 3D scenes. Image Capture Devices Which one shall we use? Goal: Simulate a Camera Obscura! Spheres & Checkerboard

More information

Path-planning by Tessellation of Obstacles

Path-planning by Tessellation of Obstacles Path-planning by Tessellation of Obstacles Tane Pendragon and Lyndon While School of Computer Science & Software Engineering, The University of Western Australia, Western Australia 6009 email: {pendrt01,

More information

Lecture 25 of 41. Spatial Sorting: Binary Space Partitioning Quadtrees & Octrees

Lecture 25 of 41. Spatial Sorting: Binary Space Partitioning Quadtrees & Octrees Spatial Sorting: Binary Space Partitioning Quadtrees & Octrees William H. Hsu Department of Computing and Information Sciences, KSU KSOL course pages: http://bit.ly/hgvxlh / http://bit.ly/evizre Public

More information

CS 465 Program 4: Modeller

CS 465 Program 4: Modeller CS 465 Program 4: Modeller out: 30 October 2004 due: 16 November 2004 1 Introduction In this assignment you will work on a simple 3D modelling system that uses simple primitives and curved surfaces organized

More information

Overview: Ray Tracing & The Perspective Projection Pipeline

Overview: Ray Tracing & The Perspective Projection Pipeline Overview: Ray Tracing & The Perspective Projection Pipeline Lecture #2 Thursday, August 28 2014 About this Lecture! This is an overview.! Think of it as a quick tour moving fast.! Some parts, e.g. math,

More information

Models and Architectures

Models and Architectures Models and Architectures Objectives Learn the basic design of a graphics system Introduce graphics pipeline architecture Examine software components for an interactive graphics system 1 Image Formation

More information

Reading. Texture Mapping. Non-parametric texture mapping. Texture mapping. Required. Angel, 8.6, 8.7, 8.9, 8.10,

Reading. Texture Mapping. Non-parametric texture mapping. Texture mapping. Required. Angel, 8.6, 8.7, 8.9, 8.10, Reading Required Angel, 8.6, 8.7, 8.9, 8.10, 9.13-9.13.2 Recommended Texture Mapping Paul S. Heckbert. Survey of texture mapping. IEEE Computer Graphics and Applications 6(11): 56--67, November 1986. Optional

More information

Lecture Outline Chapter 26. Physics, 4 th Edition James S. Walker. Copyright 2010 Pearson Education, Inc.

Lecture Outline Chapter 26. Physics, 4 th Edition James S. Walker. Copyright 2010 Pearson Education, Inc. Lecture Outline Chapter 26 Physics, 4 th Edition James S. Walker Chapter 26 Geometrical Optics Units of Chapter 26 The Reflection of Light Forming Images with a Plane Mirror Spherical Mirrors Ray Tracing

More information

Computer Graphics. Shading. Based on slides by Dianna Xu, Bryn Mawr College

Computer Graphics. Shading. Based on slides by Dianna Xu, Bryn Mawr College Computer Graphics Shading Based on slides by Dianna Xu, Bryn Mawr College Image Synthesis and Shading Perception of 3D Objects Displays almost always 2 dimensional. Depth cues needed to restore the third

More information

Nicholas J. Giordano. Chapter 24. Geometrical Optics. Marilyn Akins, PhD Broome Community College

Nicholas J. Giordano.   Chapter 24. Geometrical Optics. Marilyn Akins, PhD Broome Community College Nicholas J. Giordano www.cengage.com/physics/giordano Chapter 24 Geometrical Optics Marilyn Akins, PhD Broome Community College Optics The study of light is called optics Some highlights in the history

More information

Physics 1C Lecture 26A. Beginning of Chapter 26

Physics 1C Lecture 26A. Beginning of Chapter 26 Physics 1C Lecture 26A Beginning of Chapter 26 Mirrors and Lenses! As we have noted before, light rays can be diverted by optical systems to fool your eye into thinking an object is somewhere that it is

More information

Texture. Texture Mapping. Texture Mapping. CS 475 / CS 675 Computer Graphics. Lecture 11 : Texture

Texture. Texture Mapping. Texture Mapping. CS 475 / CS 675 Computer Graphics. Lecture 11 : Texture Texture CS 475 / CS 675 Computer Graphics Add surface detail Paste a photograph over a surface to provide detail. Texture can change surface colour or modulate surface colour. Lecture 11 : Texture http://en.wikipedia.org/wiki/uv_mapping

More information