Rasterization 1
Approaches to rendering

Rasterization: project each object onto the image (real-time rendering, GPUs)

    foreach object in scene:
        foreach pixel in image:
            if object affects pixel:
                do_something()

Raytracing: project each pixel onto the objects (offline rendering, realistic rendering)

    foreach pixel in image:
        foreach object in scene:
            if object affects pixel:
                do_something()
The graphics pipeline

The standard approach to object-order graphics
Implemented in hardware, e.g. graphics cards in PCs
  amazing performance: millions of triangles per frame
We'll focus on an abstract version of the hardware pipeline
"Pipeline" because of the many stages
Very parallelizable, leading to the remarkable performance of graphics cards (many times the flops of CPUs at ~1/5 the clock speed)
Minimal support for shapes: points, lines, triangles
  other shapes need to be translated into the above ones
Trend over the decades: toward minimal primitives
  simple, uniform, repetitive: good for parallelism
Pipeline

VERTS./INDICES → [VERTEX PROCESSING] → TRANSFORMED VERTICES → [PRIMITIVE ASSEMBLY] → ASSEMBLED PRIMITIVES → [RASTERIZATION] → PIXEL FRAGMENTS → [FRAGMENT PROCESSING] → SHADED FRAGMENTS → [FRAMEBUFFER PROCESSING] → FINAL IMAGE

  vertex processing: transform vertices and projection
  primitive assembly: assemble vertices into primitives
  rasterization: fragment generation
  fragment processing: lighting
  framebuffer processing: z-buffer, compositing
Programmable Pipeline

VERTS./INDICES → [VERTEX PROCESSING] → TRANSFORMED VERTICES → [PRIMITIVE ASSEMBLY] → ASSEMBLED PRIMITIVES → [RASTERIZATION] → PIXEL FRAGMENTS → [FRAGMENT PROCESSING] → SHADED FRAGMENTS → [FRAMEBUFFER PROCESSING] → FINAL IMAGE

Vertex processing is controlled by a VERTEX SHADER and fragment processing by a FRAGMENT SHADER; both can read CONSTANTS and TEXTURES. The vertex shader typically applies the projection matrix, e.g.

\[ M_{persp} = \begin{bmatrix} \frac{2n}{r-l} & 0 & \frac{l+r}{l-r} & 0 \\ 0 & \frac{2n}{t-b} & \frac{b+t}{b-t} & 0 \\ 0 & 0 & \frac{f+n}{n-f} & \frac{2fn}{f-n} \\ 0 & 0 & 1 & 0 \end{bmatrix} \]
Projection 6
Pipeline

VERTS./INDICES → [VERTEX PROCESSING] → TRANSFORMED VERTICES → [PRIMITIVE ASSEMBLY] → ASSEMBLED PRIMITIVES → [RASTERIZATION] → PIXEL FRAGMENTS → [FRAGMENT PROCESSING] → SHADED FRAGMENTS → [FRAMEBUFFER PROCESSING] → FINAL IMAGE

  vertex processing: transform vertices and projection
  primitive assembly: assemble vertices into primitives
  rasterization: fragment generation
  fragment processing: lighting
  framebuffer processing: z-buffer, compositing
Parallel projection

To render an image of a 3D scene, we project it onto a plane
The simplest kind of projection is parallel projection
[figure: scene, projection plane, image]
Orthographic projection projection plane parallel to a coordinate plane projection direction perpendicular to projection plane [Carlbom & Paciorek 78] 9
Off-axis parallel projection [Carlbom & Paciorek 78] axonometric: projection plane perpendicular to projection direction but not parallel to coordinate planes oblique: projection plane parallel to a coordinate plane but not perpendicular to projection direction. 10
Orthographic projection In graphics usually we lump axonometric with orthographic Projection plane perpendicular to projection direction Image height determines size of objects in image 11
Oblique projection View direction no longer coincides with projection plane normal Objects at different distances still same size Objects are shifted in the image depending on their depth 12
History of projection Ancient times: Greeks wrote about laws of perspective Renaissance: perspective is adopted by artists [Duccio c. 1308] 13
History of projection [da Vinci c. 1498] Later Renaissance: perspective formalized precisely 14
History of projection [Richard Zakia] 15
Perspective projection one-point: projection plane parallel to a coordinate plane two-point: projection plane parallel to one coordinate axis [Carlbom & Paciorek 78] three-point: projection plane not parallel to a coordinate axis 16
Perspective projection

Perspective is projection by lines through a point
Magnification determined by: image height h, object depth z, image plane distance d
Corresponds to common camera lenses

\[ y' = \frac{d\,y}{z} \]
Perspective distortions Lengths and length ratios are not preserved Angles are not preserved [Carlbom & Paciorek 78] 18
Shifted perspective projection Perspective with image plane not perpendicular to view direction Like cropping an off-center rectangle from normal perspective Corresponds to view cameras in photography 19
Why shifted perspective?

Control convergence of parallel lines
Standard example: architecture
  buildings are taller than you, so you look up
  the top of the building is farther away, so it looks smaller
Solution: make the projection plane parallel to the facade
  the top of the building is then the same distance from the projection plane
The same perspective effects can be achieved using post-processing (though not the focus effects)
  choice of which rays vs. arrangement of rays in the image
[Philip Greenspun] camera tilted up: converging vertical lines 21
[Philip Greenspun] lens shifted up: parallel vertical lines 22
Computing Projections 23
Pipeline of transformations

object space → [modeling transformation] → world space → [camera transformation] → camera space → [projection transformation] → canonical view volume → [viewport transformation] → screen space
Mathematics of projection

Projections are defined in eye coordinates: assume the eye point is at 0 and the plane is perpendicular to z
Orthographic case: just ignore z
Perspective case: scale diminishes with z and increases with d
Use matrix notation, eventually containing all transformations from local coordinates to pixel coordinates
Screen space defined in pixel units: [0, n_x] × [0, n_y]
Canonical view volume: [−1, 1]³
Homogeneous coordinates

Use a modified notation, indicating a point in 3D as a 4D vector with the last coordinate set to 1
Can allow an arbitrary 4th coordinate w as well
  when w is not zero, we can divide by w; therefore these points represent normal points
  when w is zero, it's a point at infinity, a.k.a. a direction
In graphics, the main reason for homogeneous coordinates is perspective projection

\[ p = \begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix} \sim \begin{bmatrix} wx \\ wy \\ wz \\ w \end{bmatrix} \]
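To make the divide-by-w rule concrete, here is a minimal sketch; the Vec4h/homogenize names are illustrative, not from the slides:

    #include <array>
    #include <cassert>

    using Vec4h = std::array<float, 4>;  // homogeneous point (x, y, z, w)
    using Vec3  = std::array<float, 3>;

    // Recover the 3D point a homogeneous vector represents.
    // Valid only for w != 0; w == 0 encodes a direction (point at infinity).
    Vec3 homogenize(const Vec4h& p) {
        assert(p[3] != 0.0f);
        return { p[0] / p[3], p[1] / p[3], p[2] / p[3] };
    }

    int main() {
        Vec4h a = {1, 2, 3, 1};
        Vec4h b = {2, 4, 6, 2};                  // same point, scaled by w = 2
        assert(homogenize(a) == homogenize(b));  // both represent (1, 2, 3)
    }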
Viewing transformation

Inverting the camera frame rewrites all coordinates in eye space
Viewing transform

Projection transforms are written in eye coordinates
Need to transform into that space before performing the projection
Use the matrix corresponding to the inverse of the camera frame
Remember that geometry would originally have been in the object's local coordinates; transform into world coordinates with the instance frame

\[ p_{eye} = M_{view}\, p_{world} \]
\[ F = \{x, y, z, o\} \;\Rightarrow\; M_{view} = \begin{bmatrix} x & y & z & o \\ 0 & 0 & 0 & 1 \end{bmatrix}^{-1} \]
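As a sketch of this step (Mat4, Vec3, fromColumns, and invert are assumed helper names, not part of any pipeline API):

    // Build the view matrix from a camera frame F = {x, y, z, o}:
    // the frame matrix has the basis vectors and origin as columns,
    // and inverting it maps world coordinates into eye coordinates.
    Mat4 makeView(Vec3 x, Vec3 y, Vec3 z, Vec3 o) {
        Mat4 frame = fromColumns(x, y, z, o);  // bottom row (0, 0, 0, 1)
        return invert(frame);                  // world -> eye
    }

Since the camera basis is orthonormal, the general invert can be replaced by the cheap rigid-body inverse (transpose the 3×3 part, then transform and negate the origin).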
Orthographic projection ex.

Example: assume height is 1, then transform by ignoring z
Keep z around for later

\[ p_{canonical} = M_{ortho}\, p_{eye} \]
\[ \begin{bmatrix} x_{canonical} \\ y_{canonical} \\ z_{canonical} \\ 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_{eye} \\ y_{eye} \\ z_{eye} \\ 1 \end{bmatrix} \]
Orthographic view volume 30
Orthographic view frustum 31
Orthographic projection

Specify the view by left, right, top, bottom
Add two clipping planes that constrain the view volume in z
  near plane: parallel to the view plane; things between it and the viewpoint will not be rendered
  far plane: also parallel; things behind it will not be rendered
  partly to remove unnecessary stuff (e.g. behind the camera)
  mostly required to constrain the range of depths (we'll see later)
Orthographic projection

Matrix that maps the view volume to the canonical view volume

\[ p_{canonical} = M_{ortho}\, p_{eye} \]
\[ M_{ortho} = \begin{bmatrix} \frac{2}{r-l} & 0 & 0 & -\frac{r+l}{r-l} \\ 0 & \frac{2}{t-b} & 0 & -\frac{t+b}{t-b} \\ 0 & 0 & \frac{2}{n-f} & -\frac{n+f}{n-f} \\ 0 & 0 & 0 & 1 \end{bmatrix} \]
Viewport transform

To draw in the image, we need coordinates in pixel units
Exactly the opposite of mapping (i, j) to (u, v) in ray generation
In 3D, carry z along by adding one row and one column

\[ p_{screen} = M_{viewport}\, p_{canonical} \]
\[ M_{viewport} = \begin{bmatrix} \frac{n_x}{2} & 0 & 0 & \frac{n_x - 1}{2} \\ 0 & \frac{n_y}{2} & 0 & \frac{n_y - 1}{2} \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \]
[Ray Verrier] 35
Perspective projection

By similar triangles: \( y'/d = y/z \), hence \( y' = dy/z \)
Perspective projection Example: to implement perspective, just move z to w 37
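One concrete choice (a minimal matrix, before any view-volume rescaling): a last row of (0, 0, 1, 0) copies z into w, and the homogeneous divide then performs the division by z, reproducing y′ = dy/z from the similar-triangles slide:

\[ \begin{bmatrix} d & 0 & 0 & 0 \\ 0 & d & 0 & 0 \\ 0 & 0 & d & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix} = \begin{bmatrix} dx \\ dy \\ dz \\ z \end{bmatrix} \sim \begin{bmatrix} dx/z \\ dy/z \\ d \\ 1 \end{bmatrix} \]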
Perspective view volume 38
Perspective view frustum 40
Projecting depth

Perspective has a varying denominator, so it can't preserve depth!
Compromise: preserve depth on the near and far planes
That is, choose a and b so that z′(n) = n and z′(f) = f

\[ \begin{bmatrix} n & 0 & 0 & 0 \\ 0 & n & 0 & 0 \\ 0 & 0 & n+f & -fn \\ 0 & 0 & 1 & 0 \end{bmatrix} \]
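Filling in the check: with third row (0, 0, a, b) the projected depth after the homogeneous divide is z′ = (az + b)/z, and requiring z′(n) = n and z′(f) = f gives

\[ \frac{an + b}{n} = n, \qquad \frac{af + b}{f} = f \;\;\Rightarrow\;\; a = n + f, \quad b = -fn. \]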
Perspective projection

Perspective divide from the equation before
Combine with rescaling to fit the view volume (orthographic)

\[ p_{canonical} = M_{persp}\, p_{eye} \]
\[ M_{persp} = \begin{bmatrix} \frac{2n}{r-l} & 0 & \frac{l+r}{l-r} & 0 \\ 0 & \frac{2n}{t-b} & \frac{b+t}{b-t} & 0 \\ 0 & 0 & \frac{f+n}{n-f} & \frac{2fn}{f-n} \\ 0 & 0 & 1 & 0 \end{bmatrix} \]
Pipeline of transformations

object space → [modeling transformation] → world space → [camera transformation] → camera space → [projection transformation] → canonical view volume → [viewport transformation] → screen space
Pipeline of transformations

Start with coordinates in the object's local coordinates: p_local
Transform into world coords:   \( p_{world} = M_{instance}\, p_{local} \)
Transform into eye coords:     \( p_{eye} = M_{view}\, p_{world} \)
Project into canonical coords: \( p_{canonical} = (M_{persp} \text{ or } M_{ortho})\, p_{eye} \)
Transform to pixel coords:     \( p_{screen} = M_{viewport}\, p_{canonical} \)
All together:                  \( p_{screen} = M_{viewport}\, M_{proj}\, M_{view}\, M_{instance}\, p_{local} \)

object space → [modeling transformation] → world space → [camera transformation] → camera space → [projection transformation] → canonical view volume → [viewport transformation] → screen space
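As a sketch in code (Mat4/Vec4/Vec3 are assumed helper types; operator* is 4×4 matrix multiplication):

    // Compose once per object; matrices apply right-to-left,
    // so the instance (modeling) transform acts first.
    Mat4 M = viewport * proj * view * instance;

    // Transform one vertex; because every stage is a matrix, the
    // homogeneous divide can wait until after the full composition.
    Vec4 p = M * Vec4(pLocal, 1.0f);
    Vec3 pScreen = { p.x / p.w, p.y / p.w, p.z / p.w };  // pixel coords + depth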
Rasterization 45
Pipeline

VERTS./INDICES → [VERTEX PROCESSING] → TRANSFORMED VERTICES → [PRIMITIVE ASSEMBLY] → ASSEMBLED PRIMITIVES → [RASTERIZATION] → PIXEL FRAGMENTS → [FRAGMENT PROCESSING] → SHADED FRAGMENTS → [FRAMEBUFFER PROCESSING] → FINAL IMAGE

  vertex processing: transform vertices and projection
  primitive assembly: assemble vertices into primitives
  rasterization: fragment generation
  fragment processing: lighting
  framebuffer processing: z-buffer, compositing
Rasterization

First job: enumerate the pixels covered by a primitive
  simple, aliased definition: pixels whose centers fall inside
Second job: interpolate values across the primitive
  e.g. colors computed at vertices
  e.g. normals at vertices
  e.g. texture coordinates
Rasterizing lines Define line as a rectangle Specify by two endpoints Ideal image: black inside, white outside 48
Point sampling Approximate rectangle by drawing all pixels whose centers fall within the line Problem: sometimes turns on adjacent pixels 49
Point sampling 50
Midpoint alg. (Bresenham)

Point sampling a unit-width rectangle leads to uneven line width
Instead, define line width parallel to the pixel grid
That is, turn on the single nearest pixel in each column
Note that 45° lines are now thinner
Midpoint alg. 52
Algorithms for drawing lines

Line equation: y = b + m·x
Simple algorithm: evaluate the line equation per column
Assume x0 < x1 and 0 ≤ m ≤ 1

    for (x : ceil(x0) .. floor(x1)) {
        auto y = b + m*x;
        output(x, round(y));
    }

Example: y = 1.91 + 0.37·x
Optimizing line drawing

Multiplying and rounding is slow
At each pixel the only options are E and NE
d = m(x + 1) + b − y
d > 0.5 decides between E and NE
Optimizing line drawing

d = m(x + 1) + b − y
Only need to update d for integer steps in x and y
Do that with addition
Known as DDA (digital differential analyzer)
Midpoint line algorithm

    x = ceil(x0);
    y = round(m*x + b);
    d = m*(x + 1) + b - y;
    while (x < floor(x1)) {
        if (d > 0.5) {
            y += 1;   // step NE: move up one row
            d -= 1;
        }
        x += 1;       // always step E
        d += m;
        output(x, y);
    }
Linear interpolation

We often attach attributes to vertices
  e.g. the computed diffuse color of a hair being drawn using lines
  want the color to vary smoothly along a chain of line segments
Recall the basic 1D definition:
  f(x) = (1 − α) y0 + α y1, where α = (x − x0) / (x1 − x0)
In the 2D case of a line segment, α is just the fraction of the distance from (x0, y0) to (x1, y1)
Linear interpolation Pixels are not exactly on the line Define 2D function by projection on line This is linear in 2D, therefore can use DDA to interpolate 58
Alternate interpretation We are updating d and α as we step from pixel to pixel d tells us how far from the line we are α tells us how far along the line we are So d and α are coordinates in a coordinate system oriented to the line 59
Alternate interpretation View loop as visiting all pixels the line passes through Interpolate d and α for each pixel Only output fragment if pixel is in band This makes linear interpolation the primary operation 60
Pixel-walk line rasterization

    x = ceil(x0);
    y = round(m*x + b);
    d = m*x + b - y;
    while (x < floor(x1)) {
        if (d > 0.5) {
            y += 1;   // move up: one row closer to the line
            d -= 1;
        } else {
            x += 1;   // move right
            d += m;
        }
        if (-0.5 < d && d <= 0.5)
            output(x, y);
    }
Rasterizing triangles

The most common case in most applications
  with good antialiasing it can be the only case
  some systems render a line as two skinny triangles
Triangle represented by three vertices
A simple way to think of the algorithm follows the pixel-walk interpretation of line rasterization
  walk from pixel to pixel over (at least) the polygon's area
  evaluate linear functions as you go
  use those functions to decide which pixels are inside
Rasterizing triangles

Input:
  three 2D points (the triangle's vertices in pixel space): (x0, y0); (x1, y1); (x2, y2)
  parameter values at each vertex: q00, …, q0n; q10, …, q1n; q20, …, q2n
Output: a list of fragments, each with
  the integer pixel coordinates (x, y)
  interpolated parameter values q0, …, qn
Rasterizing triangles 1. evaluation of linear functions on pixel grid 2. functions defined by parameter values at vertices 3. using extra parameters to determine fragment set 64
Incremental linear evaluation

A linear (affine, really) function on the plane is:
  q(x, y) = cx·x + cy·y + ck
Linear functions are efficient to evaluate on a grid:
  q(x+1, y) = q(x, y) + cx
  q(x, y+1) = q(x, y) + cy
Incremental linear evaluation

    lineval(xmin, xmax, ymin, ymax, cx, cy, ck) {
        // setup
        auto qrow = cx*xmin + cy*ymin + ck;
        // traversal
        for (auto y : range(ymin, ymax)) {
            auto qpix = qrow;
            for (auto x : range(xmin, xmax)) {
                output(x, y, qpix);
                qpix += cx;   // q(x+1, y) = q(x, y) + cx
            }
            qrow += cy;       // q(x, y+1) = q(x, y) + cy
        }
    }

Example settings: cx = .005; cy = .005; ck = 0 (image size 100×100)
Rasterizing triangles 1. evaluation of linear functions on pixel grid 2. functions defined by parameter values at vertices 3. using extra parameters to determine fragment set 67
Defining parameter functions

To interpolate parameters across a triangle we need to find the cx, cy, and ck that define the (unique) linear function that matches the given values at all 3 vertices
We can solve as a linear system of 3 equations for 3 unknowns (each states that the function agrees with the given value at one vertex):
  \( c_x x_i + c_y y_i + c_k = q_i, \quad i = 0, 1, 2 \)
Leading to a 3×3 matrix equation for the coefficients (singular iff the triangle is degenerate):
  \[ \begin{bmatrix} x_0 & y_0 & 1 \\ x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \end{bmatrix} \begin{bmatrix} c_x \\ c_y \\ c_k \end{bmatrix} = \begin{bmatrix} q_0 \\ q_1 \\ q_2 \end{bmatrix} \]
Defining parameter functions

More efficient version: shift the origin to (x0, y0)
  q(x, y) = cx(x − x0) + cy(y − y0) + q0
  q(x1, y1) = cx(x1 − x0) + cy(y1 − y0) + q0 = q1
  q(x2, y2) = cx(x2 − x0) + cy(y2 − y0) + q0 = q2
Now this is a 2×2 linear system (since q0 falls out):
  \[ \begin{bmatrix} x_1 - x_0 & y_1 - y_0 \\ x_2 - x_0 & y_2 - y_0 \end{bmatrix} \begin{bmatrix} c_x \\ c_y \end{bmatrix} = \begin{bmatrix} q_1 - q_0 \\ q_2 - q_0 \end{bmatrix} \]
Solve using Cramer's rule (see Shirley)
Defining parameter functions

    lininterp(xmin, xmax, ymin, ymax, x0, y0, q0,
              x1, y1, q1, x2, y2, q2) {
        // setup: solve the 2x2 system by Cramer's rule
        det = (x1 - x0)*(y2 - y0) - (x2 - x0)*(y1 - y0);
        cx = ((q1 - q0)*(y2 - y0) - (q2 - q0)*(y1 - y0)) / det;
        cy = ((q2 - q0)*(x1 - x0) - (q1 - q0)*(x2 - x0)) / det;
        qrow = cx*(xmin - x0) + cy*(ymin - y0) + q0;
        // traversal (same as before)
        for (auto y : range(ymin, ymax)) {
            auto qpix = qrow;
            for (auto x : range(xmin, xmax)) {
                output(x, y, qpix);
                qpix += cx;
            }
            qrow += cy;
        }
    }
Interpolating several parameters

    lininterp(xmin, xmax, ymin, ymax, n,
              x0, y0, q0[], x1, y1, q1[], x2, y2, q2[]) {
        // setup
        for (auto k : range(0, n))
            // compute cx[k], cy[k], qrow[k]
            // from q0[k], q1[k], q2[k]
        // traversal
        for (auto y : range(ymin, ymax)) {
            for (auto k : range(0, n)) qpix[k] = qrow[k];
            for (auto x : range(xmin, xmax)) {
                output(x, y, qpix);
                for (auto k : range(0, n)) qpix[k] += cx[k];
            }
            for (auto k : range(0, n)) qrow[k] += cy[k];
        }
    }
Rasterizing triangles 1. evaluation of linear functions on pixel grid 2. functions defined by parameter values at vertices 3. using extra parameters to determine fragment set 72
Clipping to the triangle

Interpolate three barycentric coordinates across the plane
  recall each barycentric coord is 1 at one vertex and 0 at the other two
Output fragments only when all three are > 0
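Putting the pieces together, a minimal sketch of a barycentric rasterizer (names are illustrative; for clarity it evaluates the linear functions directly rather than incrementally as in lineval):

    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    // Rasterize a triangle by testing barycentric coordinates at each
    // pixel center in the bounding box; beta is 1 at vertex 1 and 0 at
    // the others, gamma is 1 at vertex 2 (Cramer's rule as in lininterp).
    void rasterizeTriangle(float x0, float y0, float x1, float y1,
                           float x2, float y2) {
        float det = (x1 - x0)*(y2 - y0) - (x2 - x0)*(y1 - y0);
        if (det == 0) return;  // degenerate triangle
        int xmin = (int)std::floor(std::min({x0, x1, x2}));
        int xmax = (int)std::ceil (std::max({x0, x1, x2}));
        int ymin = (int)std::floor(std::min({y0, y1, y2}));
        int ymax = (int)std::ceil (std::max({y0, y1, y2}));
        for (int y = ymin; y <= ymax; y++)
            for (int x = xmin; x <= xmax; x++) {
                float px = x + 0.5f, py = y + 0.5f;  // pixel center
                float beta  = ((px - x0)*(y2 - y0) - (py - y0)*(x2 - x0)) / det;
                float gamma = ((py - y0)*(x1 - x0) - (px - x0)*(y1 - y0)) / det;
                float alpha = 1 - beta - gamma;
                if (alpha > 0 && beta > 0 && gamma > 0)
                    std::printf("fragment (%d, %d)\n", x, y);  // inside
            }
    }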
Pixel-walk (Pineda) rasterization Conservatively visit a superset of the pixels you want Interpolate linear functions Use those functions to determine when to emit a fragment 74
Rasterizing triangles

Exercise caution with rounding and arbitrary decisions
Need to visit these pixels once
But it's important not to visit them twice!
Clipping

The rasterizer tends to assume triangles are on screen
  particularly problematic to have triangles crossing the plane z = 0
After projection, before the perspective divide
  clip against the planes x, y, z = ±1 (6 planes)
  primitive operation: clip a triangle against an axis-aligned plane
Clipping a triangle against a plane

4 cases, based on sidedness of vertices:
  all in (keep)
  all out (discard)
  one in, two out (one clipped triangle)
  two in, one out (two clipped triangles)
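A sketch of that primitive operation (illustrative names; the plane is given as a signed-distance function, e.g. d(p) = p.w − p.x for the plane x = w in homogeneous coordinates): walk the edges, keep inside vertices, and insert the edge/plane intersection wherever sidedness changes. The "two in, one out" case produces a quad, i.e. two clipped triangles, matching the cases above.

    #include <vector>

    struct Vec4 { float x, y, z, w; };
    using PlaneFn = float (*)(const Vec4&);  // signed distance; >= 0 is inside

    Vec4 lerp(const Vec4& a, const Vec4& b, float t) {
        return { a.x + t*(b.x - a.x), a.y + t*(b.y - a.y),
                 a.z + t*(b.z - a.z), a.w + t*(b.w - a.w) };
    }

    // Clip a convex polygon (triangle in, up to 4 vertices out) against one plane.
    std::vector<Vec4> clipAgainstPlane(const std::vector<Vec4>& poly, PlaneFn d) {
        std::vector<Vec4> out;
        for (size_t i = 0; i < poly.size(); i++) {
            const Vec4& a = poly[i];
            const Vec4& b = poly[(i + 1) % poly.size()];
            float da = d(a), db = d(b);
            if (da >= 0) out.push_back(a);                  // keep inside vertex
            if ((da >= 0) != (db >= 0))                     // edge crosses plane
                out.push_back(lerp(a, b, da / (da - db)));  // intersection point
        }
        return out;  // 0, 3, or 4 vertices
    }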
Hidden Surface Removal 78
Pipeline

VERTS./INDICES → [VERTEX PROCESSING] → TRANSFORMED VERTICES → [PRIMITIVE ASSEMBLY] → ASSEMBLED PRIMITIVES → [RASTERIZATION] → PIXEL FRAGMENTS → [FRAGMENT PROCESSING] → SHADED FRAGMENTS → [FRAMEBUFFER PROCESSING] → FINAL IMAGE

  vertex processing: transform vertices and projection
  primitive assembly: assemble vertices into primitives
  rasterization: fragment generation
  fragment processing: lighting
  framebuffer processing: z-buffer, compositing
Hidden surface elimination

We have discussed how to map primitives to image space
  projection and perspective are depth cues
  occlusion is another very important cue
Painter's algorithm

Simplest way to do hidden surfaces
Draw from back to front, using overwriting in the framebuffer
[WikimediaCommons]
Painter's algorithm

Amounts to a topological sort of the graph of occlusions
But when the graph has cycles, no sort is valid
[WikimediaCommons]
The z buffer

In many (most) applications, maintaining a z sort is too expensive
  it changes all the time as the view changes
  many data structures exist, but they are complex
Solution: draw in any order, keep track of the closest object at each pixel
  allocate an extra channel per pixel to keep track of the closest depth
  when drawing, compare the object's depth to the current closest depth and discard the fragment if greater
  this works just like any other compositing operation
Another example of a memory-intensive brute-force approach that works and has become the standard
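A minimal sketch of the per-fragment test (buffer layout and names are illustrative):

    #include <limits>
    #include <vector>

    struct Color { float r, g, b; };
    const int nx = 640, ny = 480;  // example framebuffer size

    // One depth value per pixel, initialized to "infinitely far".
    std::vector<float> zbuf(nx * ny, std::numeric_limits<float>::infinity());
    std::vector<Color> colorBuf(nx * ny);

    // Called once per fragment, in any draw order.
    void writeFragment(int x, int y, float z, Color c) {
        int i = y * nx + x;
        if (z < zbuf[i]) {   // closer than anything drawn so far
            zbuf[i] = z;
            colorBuf[i] = c;
        }                    // else discard: a nearer surface is already there
    }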
The z buffer [WikimediaCommons] 85
Precision in z buffer

Z precision is distributed between the near and far clipping planes
  this is why these planes have to exist
  also why you can't always just set them to very small and very large distances
Generally use z′ (not world z) in the z buffer
Interpolating in projection

linear interp. in screen space ≠ linear interp. in world (eye) space
Minimal Pipeline Example 88
Pipeline for minimal operation

Vertex stage (input: position per vertex; color per triangle)
  transform position (object space to screen space)
  pass through color
Rasterizer
  pass through color
Fragment stage (output: color)
  write to color planes
Result of minimal pipeline 90
Pipeline for basic z buffer

Vertex stage (input: position per vertex; color per triangle)
  transform position (object space to screen space)
  pass through color
Rasterizer
  interpolated parameter: z′ (screen z)
  pass through color
Fragment stage (output: color, z′)
  write to color planes only if interpolated z′ < current z′
Result of z-buffer pipeline 92
Flat shading Shade using the real normal of the triangle Leads to constant shading and faceted appearance [Foley et al.] 93
Pipeline for flat shading

Vertex stage (input: position per vertex; color and normal per triangle)
  transform position and normal (object space to eye space)
  compute shaded color per triangle using the normal
  transform position (eye space to screen space)
Rasterizer
  interpolated parameter: z′ (screen z)
  pass through color
Fragment stage (output: color, z′)
  write to color planes only if interpolated z′ < current z′
Result of flat-shading pipeline 95
Transforming normal vectors

Differences of points (and therefore tangents) transform OK
Normals do not → use the inverse transpose matrix
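Filling in the reasoning: a normal is defined by \( n^{\mathsf T} t = 0 \) for every tangent t; if points (and tangents) transform by M, then choosing \( n' = (M^{-1})^{\mathsf T} n \) preserves this:

\[ n'^{\mathsf T}\,(M t) = \left((M^{-1})^{\mathsf T} n\right)^{\mathsf T} M t = n^{\mathsf T} M^{-1} M\, t = n^{\mathsf T} t = 0. \]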
Gouraud shading Compute colors at vertices using vertex normals Interpolate colors across triangles [Foley et al.] 97
Pipeline for Gouraud shading

Vertex stage (input: position, color, and normal per vertex)
  transform position and normal (object space to eye space)
  compute shaded color per vertex
  transform position (eye space to screen space)
Rasterizer
  interpolated parameters: z′ (screen z); r, g, b color
Fragment stage (output: color, z′)
  write to color planes only if interpolated z′ < current z′
Result of Gouraud shading 99
Local vs. infinite viewer, light Phong illumination requires geometric information: light vector (function of position) eye vector (function of position) surface normal (from application) Light and eye vectors change per vertex 100
Local vs. infinite viewer, light Look at case when eye or light is far away: distant light source: nearly parallel illumination distant eye point: nearly orthographic projection in both cases, eye or light vector changes very little Optimization: approximate eye and/or light as infinitely far away 101
Directional light Light vector always points in the same direction many pipelines are faster if you use directional lights 102
Infinite viewer

Orthographic camera: projection direction is constant
"Infinite viewer": even with perspective, we can approximate the eye vector using the image plane normal
Non-diffuse Gouraud shading Can apply Gouraud shading to any illumination model Results are not so good for specular shading [Foley et al.] 104
Per-pixel (Phong) shading

Get higher quality by interpolating the normal
  just as easy as interpolating the color
  but now we are evaluating the illumination model per pixel rather than per vertex (and normalizing the normal first)
In the pipeline, this means we are moving illumination from the vertex processing stage to the fragment processing stage
Per-pixel (Phong) shading Bottom line: produces much better highlights [Foley et al.] 106
Pipeline for per-pixel shading

Vertex stage (input: position, color, and normal per vertex)
  transform position and normal (object space to eye space)
  transform position (eye space to screen space)
  pass through color
Rasterizer
  interpolated parameters: z′ (screen z); r, g, b color; x, y, z normal
Fragment stage (output: color, z′)
  compute shading using the interpolated color and normal
  write to color planes only if interpolated z′ < current z′
Result of per-pixel shading 108
Antialiasing 109
Aliasing

[figures: a continuous image defined by a ray tracing procedure; a continuous image defined by a bunch of black rectangles]
Rasterizing lines Define line as a rectangle Specify by two endpoints Ideal image: black inside, white outside 111
Point sampling Approximate rectangle by drawing all pixels whose centers fall within the line Problem: all-or-nothing leads to jaggies This is sampling with no filter (aka. point sampling) 112
Point sampling 113
Aliasing

Point sampling is fast and simple
But the lines have stair steps and variations in width
This is an aliasing phenomenon:
  sharp edges of the line contain high frequencies
  it introduces features to the image that are not supposed to be there!
Antialiasing

Point sampling makes an all-or-nothing choice in each pixel
  therefore steps are inevitable when the choice changes
  yet another example where discontinuities are bad
On bitmap devices this is necessary
  hence the high resolutions required: 600+ dpi in laser printers to make aliasing invisible
On continuous-tone devices we can do better
Antialiasing

Basic idea: replace "is the image black at the pixel center?" with "how much is the pixel covered by black?"
Replace a yes/no question with a quantitative question
Box filtering Pixel intensity is proportional to area of overlap with square pixel area Also called unweighted area averaging 117
Box filtering by supersampling Compute coverage fraction by counting subpixels Simple and accurate But slow 118
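A sketch of this estimate (inside() stands in for the primitive's coverage test; names are illustrative): subdivide the pixel into an n×n grid and count covered sub-samples.

    // Fraction of pixel (px, py) covered by a primitive, estimated
    // from an n x n grid of subpixel samples (box filter).
    float coverage(int px, int py, int n, bool (*inside)(float, float)) {
        int hits = 0;
        for (int j = 0; j < n; j++)
            for (int i = 0; i < n; i++) {
                float x = px + (i + 0.5f) / n;  // subpixel sample position
                float y = py + (j + 0.5f) / n;
                if (inside(x, y)) hits++;
            }
        return float(hits) / float(n * n);  // use as the pixel's intensity
    }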
Box filtering 119
Weighted filtering

Box filtering problem: it treats area near the edge the same as area near the center
  results in the pixel turning on too abruptly
Alternative: weight the area by a smooth function
  unweighted averaging corresponds to using a box function
  a Gaussian is a popular choice of smooth filter
  important property: normalization (unit integral)
Weighted filtering Compute filtering integral by summing filter values for covered subpixels Simple, accurate But really slow 121
Gaussian filtering 122
Filter comparison Point sampling Box filtering Gaussian filtering 123
Supersampling vs. multisampling

Supersampling is terribly expensive
GPUs use an approximation called multisampling
  compute one shading value per pixel
  store it at many subpixel samples, each with its own depth
Multisample rasterization

Each fragment carries several (color, depth) samples
  shading is computed per fragment
  the depth test is resolved per sample
  the final color is the average of the sample colors
[figure: single-sample vs. multi-sample] [http://www.learnopengl.com]
Antialiasing textures

With multisampling, we evaluate textures once per fragment
Need to filter the texture somehow, since perspective produces high minification
Solution: render textures with one (or few) samples per pixel, but filter them first
[Akenine-Möller et al. 2008]
Solution: pixel filtering point sampling area averaging [Akenine-Möller et al. 2008] 127
Pixel footprints image space texture space [Akenine-Möller et al. 2008] 128
Pixels vs. texels

Optimal viewing distance: one-to-one mapping between pixel area and texel area
When closer, magnification: each pixel is a small part of a texel (upsampling)
When farther, minification: each pixel includes many texels (downsampling)
[Akenine-Möller et al. 2008]
Filter size by Jacobian

[figure: a unit pixel square in image space maps to a parallelogram in texture space spanned by (∂u/∂x, ∂v/∂x) and (∂u/∂y, ∂v/∂y)]
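Stated as a formula (a common convention; the slide only shows the picture): the footprint is governed by the Jacobian of the image-to-texture mapping, and a mipmap level λ can be chosen from the longer of its two column vectors:

\[ J = \begin{bmatrix} \partial u/\partial x & \partial u/\partial y \\ \partial v/\partial x & \partial v/\partial y \end{bmatrix}, \qquad \lambda = \log_2 \max\!\left( \left\lVert \left(\tfrac{\partial u}{\partial x}, \tfrac{\partial v}{\partial x}\right) \right\rVert,\; \left\lVert \left(\tfrac{\partial u}{\partial y}, \tfrac{\partial v}{\partial y}\right) \right\rVert \right) \]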
Mipmap pyramid

MIP maps (Multum in Parvo: "much in a small space")
Store a hierarchy of pre-filtered versions of the texture
While rendering, use the version with texel size closest to the pixel size
[Akenine-Möller et al. 2002]
Compute mipmaps

Average over 2×2 blocks of texels, or use Gaussian filtering
Storage increases by 33% (1/4 + 1/16 + … = 1/3)
[WikimediaCommons]
Point sampling 133
Gaussian filtering 134
Mipmap filtering 135
Lighting with Textures 136
Lighting

In rasterization there is no access to the geometry during shading
  significantly better efficiency, parallelism, memory locality
Shadows need occluder information
Reflections need recursive evaluation
Option 1: drop the pipeline
  done in offline rendering; not any time soon for real-time
Option 2: cheat the lighting
  use approximate, or even plainly wrong, algorithms
  run data multiple times through the pipeline
Shadow mapping

Pass 1: render from the light; store the z buffer as a depth texture
Pass 2: render from the eye; check whether the distance to the light computed in the fragment shader is less than the one saved in the depth texture
[NVIDIA Cg Tutorial]
Shadow mapping

Render from the light → store the z-buffer in the shadow map
Render from the eye → project the shadow map → test if the shadow map is closer than the light → multiply by color for the final image
[WikimediaCommons]
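A sketch of the pass-2 test per fragment, written as C++-style pseudocode (all names illustrative; lightMVP is the light's view-projection matrix from pass 1, shadowMap the stored depths):

    bool inShadow(Vec3 pWorld) {
        Vec4 pl = lightMVP * Vec4(pWorld, 1.0f);    // into the light's clip space
        Vec3 p  = { pl.x / pl.w, pl.y / pl.w, pl.z / pl.w };
        Vec2 uv = { 0.5f * p.x + 0.5f, 0.5f * p.y + 0.5f };  // [-1,1] -> [0,1]
        float stored = sampleDepth(shadowMap, uv);  // closest depth the light saw
        // subtracting a small bias is a standard tweak (not on the slide)
        // to avoid self-shadowing "acne"
        return p.z - bias > stored;                 // farther than stored -> occluded
    }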
Shadow mapping artifacts Pixelation due to mismatched resolution [Fernando et al.] 140
Reflection mapping [M. C. Escher] [Paul Debevec] 142
Reflection mapping

Passes 1-6: render from the object center into a 360° panorama using 6 cameras arranged as a cube
  often stored as a cube map
Pass 7: render from the eye; look up the texture in the mirror direction
[Paul Haeberli]
Reflection mapping

Passes 1-6: render from the object center into a 360° panorama using 6 cameras arranged as a cube
  often stored as a cube map
[WikimediaCommons]
Reflection mapping artifacts

Geometric distortions: works only at the center of projection
Visibility is missing: reflects what it should not
[WikimediaCommons]