Frequency domain depth filtering of integral imaging


Jae-Hyeung Park* and Kyeong-Min Jeong

School of Electrical & Computer Engineering, Chungbuk National University, 410 SungBong-Ro, Heungduk-Gu, Cheongju-Si, Chungbuk, Korea
*jh.park@cbnu.ac.kr

Abstract: A novel technique for depth filtering of integral imaging is proposed. Integral imaging captures the spatio-angular distribution of the light rays, which delivers three-dimensional information of the object scene. The proposed method performs the filtering operation in the frequency domain of the captured spatio-angular light ray distribution, achieving depth selective reconstruction. Grating projection further enhances the depth discrimination performance. The principle is verified experimentally. © 2011 Optical Society of America

OCIS codes: ( ) Three-dimensional image processing; ( ) Three-dimensional image acquisition; ( ) Image formation theory.

References and links
1. F. Okano, H. Hoshino, J. Arai, and I. Yuyama, "Real-time pickup method for a three-dimensional image based on integral photography," Appl. Opt. 36(7) (1997).
2. N. Davies, M. McCormick, and L. Yang, "Three-dimensional imaging systems: a new development," Appl. Opt. 27(21), 4520 (1988).
3. S.-W. Min, J. Kim, and B. Lee, "New characteristic equation of three-dimensional integral imaging system and its applications," Jpn. J. Appl. Phys. 44(2), L71-L74 (2005).
4. R. Martinez-Cuenca, A. Pons, G. Saavedra, M. Martinez-Corral, and B. Javidi, "Optically-corrected elemental images for undistorted integral image display," Opt. Express 14(21) (2006).
5. J. Arai, M. Okui, T. Yamashita, and F. Okano, "Integral three-dimensional television using a 2000-scanning-line video system," Appl. Opt. 45(8) (2006).
6. H. Liao, T. Dohi, and M. Iwahara, "Improved viewing resolution of integral videography by use of rotated prism sheets," Opt. Express 15(8) (2007).
7. J. Hahn, Y. Kim, and B. Lee, "Uniform angular resolution integral imaging display with boundary folding mirrors," Appl. Opt. 48(3) (2009).
8. J. Kim, S.-W. Min, and B. Lee, "Viewing window expansion of integral floating display," Appl. Opt. 48(5) (2009).
9. Y. Kim, H. Choi, J. Kim, S.-W. Cho, Y. Kim, G. Park, and B. Lee, "Depth-enhanced integral imaging display system with electrically variable image planes using polymer-dispersed liquid-crystal layers," Appl. Opt. 46(18) (2007).
10. J.-H. Jung, Y. Kim, Y. Kim, J. Kim, K. Hong, and B. Lee, "Integral imaging system using an electroluminescent film backlight for three-dimensional-two-dimensional convertibility and a curved structure," Appl. Opt. 48(5) (2009).
11. M. Shin, G. Baasantseren, K.-C. Kwon, N. Kim, and J.-H. Park, "Three-dimensional display system based on integral imaging with viewing direction control," Jpn. J. Appl. Phys. 49(7) (2010).
12. J.-Y. Son, B. Javidi, S. Yano, and K.-H. Choi, "Recent developments in 3-D imaging technologies," J. Display Technol. 6(10) (2010).
13. J.-H. Park, K. Hong, and B. Lee, "Recent progress in three-dimensional information processing based on integral imaging," Appl. Opt. 48(34), H77-H94 (2009).
14. M. Levoy, R. Ng, A. Adams, M. Footer, and M. Horowitz, "Light field microscopy," ACM Trans. on Graphics (Proc. SIGGRAPH) 25 (2006).
15. J.-H. Jung, K. Hong, G. Park, I. Chung, J.-H. Park, and B. Lee, "Reconstruction of three-dimensional occluded object using optical flow and triangular mesh reconstruction in integral imaging," Opt. Express 18(25) (2010).
16. B. Javidi, I. Moon, and S. Yeom, "Three-dimensional identification of biological microorganism using integral imaging," Opt. Express 14(25) (2006).
17. G. Passalis, N. Sgouros, S. Athineos, and T. Theoharis, "Enhanced reconstruction of three-dimensional shape and texture from integral photography images," Appl. Opt. 46(22) (2007).

18. J.-H. Park, G. Baasantseren, N. Kim, G. Park, J.-M. Kang, and B. Lee, "View image generation in perspective and orthographic projection geometry based on integral imaging," Opt. Express 16(12) (2008).
19. T. Mishina, M. Okui, and F. Okano, "Calculation of holograms from elemental images captured by integral photography," Appl. Opt. 45(17) (2006).
20. N. T. Shaked, J. Rosen, and A. Stern, "Integral holography: white-light single-shot hologram acquisition," Opt. Express 15(9) (2007).
21. J.-H. Park, M.-S. Kim, G. Baasantseren, and N. Kim, "Fresnel and Fourier hologram generation using orthographic projection images," Opt. Express 17(8) (2009).
22. S.-H. Hong, J.-S. Jang, and B. Javidi, "Three-dimensional volumetric object reconstruction using computational integral imaging," Opt. Express 12(3) (2004).
23. J.-B. Hyun, D.-C. Hwang, D.-H. Shin, and E.-S. Kim, "Curved computational integral imaging reconstruction technique for resolution-enhanced display of three-dimensional object images," Appl. Opt. 46(31) (2007).
24. B. Javidi and Y. S. Hwang, "Passive near-infrared 3D sensing and computational reconstruction with synthetic aperture integral imaging," J. Display Technol. 4(1), 3-5 (2008).
25. K.-J. Lee, D.-C. Hwang, S.-C. Kim, and E.-S. Kim, "Blur-metric-based resolution enhancement of computationally reconstructed integral images," Appl. Opt. 47(15) (2008).
26. G. Baasantseren, J.-H. Park, and N. Kim, "Depth discrimination enhanced computational integral imaging using random pattern illumination," Jpn. J. Appl. Phys. 48(2) (2009).
27. J.-X. Chai, S.-C. Chan, H.-Y. Shum, and X. Tong, "Plenoptic sampling," Proc. ACM SIGGRAPH (2000).
28. X. Chen, M. Gramaglia, and J. A. Yeazell, "Phase-shift calibration algorithm for phase-shifting interferometry," J. Opt. Soc. Am. A 17(11) (2000).

1. Introduction

Integral imaging has been considered one of the most prominent autostereoscopic three-dimensional (3D) display techniques [1-12]. Besides its benefits as a 3D display technique, integral imaging is also a versatile technology for 3D capturing and processing [13-16]. Unlike usual two-dimensional (2D) imaging, which records only the spatial distribution of the light rays, integral imaging captures the spatio-angular distribution of the light rays so that 3D information can be acquired [13,14]. Various techniques for processing the captured spatio-angular distribution of the light rays have been reported, including 3D mesh model reconstruction [15,17], arbitrary view reconstruction [18], hologram synthesis [19-21], and depth plane reconstruction [22-24].

Depth plane reconstruction in integral imaging is usually called computational integral imaging reconstruction (CIIR). By collecting and averaging the light rays corresponding to each point in a depth plane, an image refocused on that plane is obtained [13,22]. Repeating this process for successive depth planes gives a stack of refocused images so that the 3D structure of the object scene can be understood visually. Since CIIR is performed by simple pixel averaging without significant image processing, it is effective and free from image processing errors. However, CIIR has its limitations as well. One of the significant limitations is that the reconstructed image of CIIR is simply a refocused image in a certain depth plane without any further depth processing. The reconstructed image of CIIR consists of focused object points accompanied by object points blurred according to the difference between their actual depths and the reconstruction depth.
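For illustration, this refocusing step can be written as a simple shift-and-average over the ray angles. The following is a minimal NumPy sketch for a 2D light-field slice l(x, θ) under the pinhole ray model used later in Section 2; the function and variable names are ours and not part of the CIIR implementations of [13,22].

```python
import numpy as np

def ciir_refocus(light_field, thetas, dx, z):
    """Shift-and-average refocusing of a 2D light-field slice l(x, theta).

    light_field : (num_theta, num_x) array of ray intensities
    thetas      : (num_theta,) ray angles in radians
    dx          : spatial sampling pitch along x
    z           : reconstruction depth (same length unit as dx)
    """
    num_theta, num_x = light_field.shape
    refocused = np.zeros(num_x)
    for i, theta in enumerate(thetas):
        # a ray (x, theta) originates from the object point X = x + theta * z,
        # so shift each angular slice accordingly before averaging
        shift = int(round(theta * z / dx))
        refocused += np.roll(light_field[i], shift)  # circular shift; edges wrap
    return refocused / num_theta
```

Points actually lying at depth z add coherently and appear sharp, while points at other depths are averaged over shifted copies and appear blurred, which is exactly the limitation discussed next.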
Any additional processing such as depth sectioning or multi-plane refocusing is not possible in conventional CIIR. A few techniques have been reported to give additional functionality to CIIR. K.-J. Lee et al. applied a focus filter to extract the focused part of the object in an effort to realize tomographic imaging [25]. The focus filter, however, relies on the spatial variation in an image window and is generally prone to error. G. Baasantseren et al. used random pattern illumination to suppress the blurred part [26]. However, aside from a reduction of the effective depth of focus, more general depth filtering functionality is not possible.

In this paper, we propose a method to perform depth filtering on the spatio-angular light ray distribution captured by integral imaging. Using the depth dependency of the frequency distribution of the light rays [27], the proposed method enables various depth filtering operations including depth selection and multi-plane refocusing. It is also possible to reconstruct various views with the depth filtered data, which was not possible in conventional CIIR. In addition to the basic method, we also propose a method using grating projection in order to enhance the depth discrimination. In the following, we explain the principle and present experimental results for verification.

2. Theory

Fig. 1. Geometry of integral imaging capture.

2.1. Depth filtering using frequency distribution of light rays

Figure 1 shows the geometry of integral imaging capture. A 3D scene is captured through an array of identical small lenses. Since each lens captures a different perspective of the 3D scene, the resultant image is a set of perspectives, each of which is called an elemental image. Assuming a pinhole lens model and ray optics, each elemental image can be regarded as a representation of the angular ray distribution at the principal point of the corresponding lens. With an array of lenses, the captured set of elemental images therefore contains the spatio-angular distribution of the light rays in the lens array plane.

Fig. 2. Frequency spectrum of a single plane object.

In order to investigate the frequency characteristics of the captured spatio-angular light ray distribution, let us suppose a plane object at a specific distance z from the lens array as shown in Fig. 2. For a non-specular plane object f, the light ray l at a position x and an angle θ in the lens array plane is given under the paraxial approximation by

l(x, θ) = f(x + θz),   (1)

where the unit of θ is the radian. By Fourier-transforming Eq. (1), the spatio-angular frequency distribution of the captured light rays L is given by

L(f_x, f_θ) = ∬ f(x + θz) exp[−j2π(f_x x + f_θ θ)] dx dθ = F(f_x) δ(f_θ − z f_x),   (2)

where F is the Fourier transform of the plane object f, and f_x and f_θ are the spatial and angular frequencies measured in cycles/m and cycles/rad, respectively. Here the Fourier transform is performed only for x and θ, not for the object depth z.

Equation (2) reveals that a plane object at a distance z is represented by a single line in the frequency domain, with a slanting angle determined by the distance z, as shown in Fig. 2. Note that this can also be explained using the Fourier slice theorem. As can be observed in Fig. 2, the projection in the spatial domain onto the line θ = zx gives the plane object function f. According to the Fourier slice theorem, the corresponding slice f_θ = z f_x in the frequency domain represents the Fourier transform of the object function f. The other projections in the spatial domain give a constant function, assuming the extent of the spatio-angular distribution is sufficiently large in the spatial domain, and thus their corresponding slices give a delta function located at the origin. Consequently, the plane object is represented as a line f_θ = z f_x in the frequency domain. Also note that in the case of specular reflection, the frequency spectrum of a plane object cannot be represented as a single line due to the limited angular extent of the rays. Hence the discussion in this paper is valid only for objects with diffusive surfaces. For a volume object with an extended depth range, the frequency representation of the captured light rays becomes an area comprising the collection of such slanted lines.

Fig. 3. Procedure of the proposed depth filtering.

Based on this characteristic of the captured light rays, the proposed depth filtering is performed as shown in Fig. 3. The light ray distribution of a 3D scene is captured using a lens array following the integral imaging principle. The captured light ray distribution is Fourier transformed to yield the spatio-angular frequency domain representation. A desired filtering operation is performed on this frequency domain representation. As an example, depth pass filtering is illustrated in Fig. 3. Finally, an inverse Fourier transform gives the depth filtered light ray distribution. Any previously reported technique to visualize the 3D information embedded in the elemental images can be additionally applied to this depth filtered light ray distribution. Note that for color objects, the proposed method is performed for each color channel, i.e. red, green, and blue, independently. The filtered color channels are then merged to generate the color output. In the following, frequency spectra are plotted only for the red channel, while spatial domain representations are shown using all color channels for visibility.
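As a concrete illustration of the pipeline in Fig. 3, the sketch below applies a depth-pass filter to a 2D slice l(x, θ) with NumPy. The band definition via a slope range and the DC handling are our own simplifications rather than the authors' code; a full implementation would operate on the 4D (x, y, θ, φ) distribution for each color channel.

```python
import numpy as np

def depth_pass_filter(light_field, dx, dtheta, z_min, z_max):
    """Keep only frequency components near the lines f_theta = z * f_x
    for depths z_min <= z <= z_max (cf. Eq. (2))."""
    num_theta, num_x = light_field.shape
    spectrum = np.fft.fft2(light_field)              # spatio-angular spectrum L(f_x, f_theta)
    f_x = np.fft.fftfreq(num_x, d=dx)                # cycles per unit length
    f_theta = np.fft.fftfreq(num_theta, d=dtheta)    # cycles per radian
    FX, FTH = np.meshgrid(f_x, f_theta)              # same shape as spectrum

    mask = np.zeros_like(spectrum, dtype=bool)
    nonzero = FX != 0
    slope = np.zeros_like(FX)
    slope[nonzero] = FTH[nonzero] / FX[nonzero]      # equals z on a plane object's line
    mask[nonzero] = (slope[nonzero] >= z_min) & (slope[nonzero] <= z_max)
    mask[0, 0] = True                                # always keep the DC term

    return np.real(np.fft.ifft2(spectrum * mask))
```

Multi-plane refocusing then corresponds to combining several such masks, and depth rejection to inverting one, before the inverse transform.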

Fig. 4. 3D scene used in the simulation.

In order to verify the principle, a simulation is performed for a 3D scene comprised of three plane objects as shown in Fig. 4. Note that the three plane objects are located in both the left and right fields of the lens array plane in order to emphasize the depth dependency of the frequency spectrum of the spatio-angular ray distribution. In a real implementation of the integral imaging capture system, this condition is satisfied when the objects are not captured directly but their intermediate images, formed by an additional imaging lens, are captured by the lens array. In other cases, the objects are usually located in the right field, i.e. positive z, of the lens array. The depth filtering result is shown in Fig. 5. Depth filtering is performed to select one or two objects out of the three plane objects using the proposed method, and the filtered light ray distribution is further processed to synthesize a view at a central position for visualization purposes following the conventional method [13]. The depth range for filtering of each object is set to 10 mm around its actual depth.

Fig. 5. Simulation result of depth filtering.

In the second row of Fig. 5, labeled "original", it can be seen that the f_x-f_θ plot and the f_y-f_φ plot of the frequency spectrum reveal three lines with different slanting angles, as expected. Filtering is performed to select specific lines in this frequency spectrum, as shown in the f_x-f_θ and f_y-f_φ plot columns. The last column in Fig. 5 shows the synthesized central views for the filtered data. Note that the selection of two objects at distant depths, as shown in the last three rows of Fig. 5, could not be done by conventional CIIR. From Fig. 5, it can be confirmed that the proposed depth filtering operation performs as expected.
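For reference, the central-view synthesis used above can be done by collecting, from every elemental image, the pixel corresponding to the same viewing direction, in the spirit of the orthographic view generation of [13,18]. The sketch below assumes a rectified 4D array layout of our own choosing and a pinhole lens model.

```python
import numpy as np

def orthographic_view(elemental_images, v_idx, u_idx):
    """Extract one orthographic view from a set of elemental images.

    elemental_images : (ny_lens, nx_lens, v, u) array, one v-by-u elemental
                       image per lens (pinhole model, rectified lattice assumed)
    v_idx, u_idx     : pixel position inside each elemental image; the same
                       position in every elemental image corresponds to one
                       viewing direction.
    """
    return elemental_images[:, :, v_idx, u_idx]

# the central view takes the centre pixel of every elemental image:
# ny, nx, v, u = elemental_images.shape
# central = orthographic_view(elemental_images, v // 2, u // 2)
```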

Fig. 6. View reconstructions: (a) 9 examples and (b) movie (Media 1) using the original data, (c) 9 examples and (d) movie (Media 2) using the filtered data.

In order to verify that conventional processing can be additionally applied to the filtered data, various view reconstructions are performed for the filtered data as an example. Figure 6 shows the results for the original data and the filtered data for the apple and banana objects, together with two movies. From Fig. 6, it can be confirmed that the conventional view reconstruction algorithm works well with the filtered data.

2.2. Depth discrimination enhancement using grating projection

The basic method proposed in the previous section enables depth selective filtering of the captured 3D scene. The depth discrimination ratio, however, is not very good. As shown in the last column of Fig. 5, an unselected depth plane object is not completely removed in the reconstruction but remains as a blurry shape. This becomes especially severe when the depth separation between the objects is small. In this sub-section, we propose an additional method to enhance the depth discrimination ratio.

The low depth discrimination ratio is primarily due to the overlapped low frequency components of the depth planes. As shown in Fig. 7(a), the slanted lines corresponding to different depth planes intersect at the origin of the spatio-angular frequency domain. Note that for real 3D objects, most of the energy is concentrated in the low frequency range around the origin. Hence, any depth pass filtering is accompanied by significant energy from other depths, resulting in a low discrimination ratio.

Fig. 7. Frequency spectrum (a) without grating projection and (b) with grating projection.

In order to enhance the discrimination ratio, we propose a method using grating projection, inspired by the standard phase retrieval procedures used in applications such as phase-shifting interferometry [28]. Figure 7(b) illustrates the proposed method. Instead of the usual white light illumination, four sinusoidal amplitude grating patterns G_φ(x) = 1 + sin(πx/T_x + φ) are projected onto the 3D scene sequentially with φ = 0°, 90°, 180°, and 270°. The captured four sets of spatio-angular ray distributions, l_0, l_90, l_180, and l_270, are processed following

|l(x, θ)| = [l_0(x, θ) + l_180(x, θ)] / 2,   (3)

∠l(x, θ) = 2 tan⁻¹{[l_0(x, θ) − l_180(x, θ)] / [l_90(x, θ) − l_270(x, θ)]},   (4)

to yield the complex-valued ray distribution l(x, θ). For simplicity, let us consider a plane object f(x) at a distance z. Note that by the grating projection, the object function f(x) becomes f(x)G_φ(x). Hence, from Eq. (2), the captured ray distribution with grating phase shift φ is l_φ(x, θ) = f(x + θz)G_φ(x + θz). Substituting this into Eqs. (3) and (4) shows that the complex-valued ray distribution l(x, θ) is given by

l(x, θ) = f(x + θz) exp[j2π(x + θz)/T_x].   (5)

Equation (5) indicates that the complex-valued ray distribution synthesized by Eqs. (3) and (4) is equivalent to the ray distribution of a complex-valued 3D scene f_c given by

f_c(x) = f(x) exp(j2πx/T_x),   (6)

where T_x is the half period of the projected grating pattern along the x axis. Therefore, in the frequency domain representation of the complex-valued ray distribution l(x, θ), the spectrum of each depth plane is shifted by 1/T_x along the f_x axis as shown in Fig. 7(b). The slanted lines corresponding to different depth planes still intersect at the origin, but now the low frequency part where most of the energy is concentrated is moved to f_x = 1/T_x and is well separated without overlapping. Therefore, the depth discrimination ratio can be enhanced in the reconstruction. Note that since the effect of the grating projection is limited to the phase of the 3D scene, as revealed by Eq. (6), and what we usually care about in the reconstruction is the amplitude distribution in the object space, no additional processing is required to compensate the added phase term in the reconstruction.
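A minimal NumPy sketch of this four-step combination of Eqs. (3) and (4) is given below. We use arctan2 instead of a plain arctangent for numerical robustness; since the phase is doubled, the result differs from Eq. (4) only by multiples of 2π and leaves the complex exponential unchanged. The function name and interface are illustrative.

```python
import numpy as np

def complex_ray_distribution(l0, l90, l180, l270):
    """Combine four grating-shifted captures into the complex-valued
    ray distribution l(x, theta) of Eqs. (3)-(5)."""
    amplitude = 0.5 * (l0 + l180)                     # Eq. (3)
    # Eq. (4): doubling the arctangent restores the full 2*pi phase range
    phase = 2.0 * np.arctan2(l0 - l180, l90 - l270)
    return amplitude * np.exp(1j * phase)
```

The resulting complex array can then be depth filtered in the frequency domain exactly as before; only the pass band is shifted by 1/T_x along f_x.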

Also note that the grating period T_x does not need to be kept constant for all objects at different depths. When the grating period T_x is a function of the depth z, the spectra of the depth planes are shifted by different amounts. However, the low frequency parts of the depth planes are still separated by the spectrum shifts, and thus the depth discrimination is enhanced. Another noteworthy point is that, due to the shift of the spectrum, higher spatial frequencies can be captured for one side band. Considering that the other side band can also be recovered using the conjugate symmetry of the frequency spectrum of a real-valued object function, there is a possibility to enhance not only the depth discrimination but also the resolution.

Figures 8-10 show the simulation results. The simulation is performed for the same 3D scene shown in Fig. 4, but the locations of the three plane objects are changed to 30 mm, 20 mm, and 10 mm to the right of the lens array plane for the Lena, banana, and apple objects, respectively. Note that the spacing between the plane objects is reduced to emphasize the depth discrimination ratio. A diagonally slanted grating pattern is used in the simulation to give equal contributions to the x and y directions. Using Eqs. (3) and (4), the four-dimensional, i.e. x, y, θ, and φ, complex-valued light ray distribution is obtained. Figure 8 shows two slices, the x-θ plot and the y-φ plot. As shown in Fig. 8, the phase of the captured light ray distribution now has a periodic grating pattern due to the grating projection technique.

Fig. 8. A slice of the spatio-angular light ray distribution: (a) amplitude and (b) phase of the x-θ plot at y = φ = 0, (c) amplitude and (d) phase of the y-φ plot at x = θ = 0.

Figure 9 shows two slices, the f_x-f_θ plot and the f_y-f_φ plot, with and without grating projection. Although the three lines are not identified individually due to the small depth separation, it can be observed that the peak points, which represent the DC component, are moved from the coordinate origin, as shown in Fig. 9(a) and (b), to different positions, as shown in Fig. 9(c) and (d).

Fig. 9. A slice of the spatio-angular frequency spectrum: (a) f_x-f_θ plot at f_y = f_φ = 0 and (b) f_y-f_φ plot at f_x = f_θ = 0 without grating projection, (c) f_x-f_θ plot at f_y = f_φ = 0 and (d) f_y-f_φ plot at f_x = f_θ = 0 with grating projection.

The depth filtering result for one out of the three plane objects is shown in Fig. 10. The depth range for filtering is set to 4 mm around the object's actual depth. In Fig. 10(a), residual blurred images remain with significant energy due to the low depth discrimination. In Fig. 10(b), however, it can be confirmed that they are largely suppressed by the grating projection technique, leaving the desired object unchanged.

Fig. 10. Depth filtering results (a) without grating projection, (b) with grating projection.

3. Experimental result

We verified the proposed method experimentally. Two experiments were performed to verify the depth filtering operation and the depth discrimination ratio enhancement, respectively. For the first experiment, three plane objects J, K, and M shown in Fig. 11 are located at 2 cm, 7 cm, and 12 cm from a lens array, respectively. The lens array used in the experiment consists of identical elemental lenses of 1 mm lens pitch and 3.3 mm focal length. The number of elemental lenses in the array is about 110 (H) × 55 (V). Under uniform white illumination, the elemental images are captured through the lens array as shown in Fig. 12. The number of pixels per elemental image is 31 (H) × 31 (V).

Fig. 11. Experimental setup: (a) object, (b) configuration.
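With the parameters above (31 × 31 pixels behind each lens), a rectified raw capture can be rearranged into the 4D array used in the previous sketches. The layout below is an assumption for illustration; in practice, lens-lattice rectification and cropping are needed first.

```python
import numpy as np

def to_light_field(raw, px_per_lens=31):
    """Rearrange a rectified raw capture into a 4D array indexed as
    (lens row, lens column, pixel row, pixel column)."""
    ny = raw.shape[0] // px_per_lens
    nx = raw.shape[1] // px_per_lens
    cropped = raw[:ny * px_per_lens, :nx * px_per_lens]
    lf = cropped.reshape(ny, px_per_lens, nx, px_per_lens)
    return lf.transpose(0, 2, 1, 3)
```

For the array described above, this yields roughly 55 × 110 lens positions with 31 × 31 angular samples each.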

Fig. 12. Captured elemental images.

In the experiment, the depth range for filtering of each object was determined empirically, considering the resultant depth discrimination and the brightness of the reconstruction. The depth filtering result is shown in Fig. 13. Although the reconstruction of the object M is rather weak due to its large depth, it can be seen that the depth selection filtering for one or two objects out of the three is performed successfully.

Fig. 13. Experimental depth filtering result.

The original set of elemental images and the filtered one are further processed for the various view reconstruction, and the results are shown in Figs. 14 and 15. Figures 14 and 15 reveal that different views of the 3D object scene can be reconstructed from the filtered elemental images as well as from the original elemental images.

Fig. 14. Examples of the view synthesis: (a) original data, (b) filtered for J and M, (c) filtered for J and K, and (d) filtered for K and M.

Fig. 15. Movies of the view synthesis: (a) (Media 3) original data, (b) (Media 4) filtered for J and M, (c) (Media 5) filtered for J and K, and (d) (Media 6) filtered for K and M.

The experiment for verification of the depth discrimination enhancement using grating projection is performed with the setup shown in Fig. 16(a). Three plane objects J, K, and M are located at 4 cm, 6 cm, and 8 cm from the lens array. Instead of uniform illumination, four diagonal sinusoidal amplitude grating patterns with 90° phase shifts are projected onto the objects, and the corresponding four sets of elemental images are captured as shown in Fig. 16(b). The grating period projected on the object surfaces is approximately 9 mm along both the x and y axes. In the captured elemental images, this projected grating period is sampled with more than 14 pixels along both the x and y axes, without aliasing. The magnified view of the captured elemental images shown in Fig. 16(b) shows the intensity variation due to the grating projection.
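For completeness, the four projected patterns can be generated as phase-shifted copies of a diagonal sinusoidal amplitude grating. The sketch below assumes the 1D form G_φ of Section 2.2 extended along the diagonal and normalizes the output to [0, 1]; projector-to-object magnification is ignored.

```python
import numpy as np

def diagonal_gratings(height, width, period_px):
    """Return four phase-shifted diagonal sinusoidal amplitude gratings.

    period_px : full grating period in projector pixels measured along each
                axis (T_x in Section 2.2 is half of this period).
    """
    y, x = np.mgrid[0:height, 0:width]
    patterns = []
    for phi in (0.0, 0.5 * np.pi, np.pi, 1.5 * np.pi):   # 0, 90, 180, 270 degrees
        g = 1.0 + np.sin(2.0 * np.pi * (x + y) / period_px + phi)
        patterns.append(g / 2.0)                          # rescale 0..2 to 0..1
    return patterns
```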

Fig. 16. Experimental setup for grating projection: (a) configuration, (b) captured elemental images.

From these four sets of elemental images, the complex-valued elemental images are synthesized using Eqs. (3) and (4). Figure 17 shows the amplitude and phase distribution of the synthesized complex-valued elemental images. Figure 17(a) reveals that the intensity fluctuation in the raw images of Fig. 16 is now removed by Eq. (3). The phase distribution shown in Fig. 17(b) shows the periodic grating pattern, as desired.

Fig. 17. Synthesized set of elemental images: (a) amplitude, (b) phase.

Figure 18 shows the f_x-f_θ plot with and without the grating projection. The plot for the non-grating-projection case shown in Fig. 18(a) is calculated by ignoring the phase term of the complex-valued set of elemental images of Fig. 17. As expected, the high energy point representing the DC components of the objects is moved from the origin to a different location.

Fig. 18. A slice of the spatio-angular frequency spectrum (f_x-f_θ plot at f_y = f_φ = 0): (a) without grating projection, (b) with grating projection.

Finally, the depth filtering result is shown in Fig. 19. One of the three plane objects is selected for the reconstruction in the non-grating and grating cases. In the last three rows, it is observed that the residual objects remain in the reconstruction with significant intensity due to the small depth separation between the objects. By the grating projection method, however, they are successfully suppressed, leaving the desired object, as shown in the first four rows of Fig. 19. From the results in Fig. 19, it can be confirmed that the depth discrimination ratio is enhanced by the proposed grating projection method.

Fig. 19. Experimental result of depth discrimination enhancement using grating projection.

4. Conclusion

A novel method to perform depth filtering using integral imaging is proposed. By using the spatio-angular frequency characteristics of the captured light ray distribution, various depth filtering operations can be performed. Any conventional processing developed for integral imaging can be further applied after the proposed filtering. We also propose an additional method using grating projection. The grating projection method enhances the depth discrimination performance of the depth filtering operation by reducing the overlapped energy between depth planes. Experimental results are provided for the verification of the proposed method.

Acknowledgment

This work was supported by the research grant of Chungbuk National University.


More information

Numerical investigation on the viewing angle of a lenticular three-dimensional display with a triplet lens array

Numerical investigation on the viewing angle of a lenticular three-dimensional display with a triplet lens array Numerical investigation on the viewing angle of a lenticular three-dimensional display with a triplet lens array Hwi Kim,, * Joonku Hahn, and Hee-Jin Choi 3 Department of Electronics and Information Engineering,

More information

Image quality improvement of polygon computer generated holography

Image quality improvement of polygon computer generated holography Image quality improvement of polygon computer generated holography Xiao-Ning Pang, Ding-Chen Chen, Yi-Cong Ding, Yi-Gui Chen, Shao-Ji Jiang, and Jian-Wen Dong* School of Physics and Engineering, and State

More information

3D X-ray Laminography with CMOS Image Sensor Using a Projection Method for Reconstruction of Arbitrary Cross-sectional Images

3D X-ray Laminography with CMOS Image Sensor Using a Projection Method for Reconstruction of Arbitrary Cross-sectional Images Ke Engineering Materials Vols. 270-273 (2004) pp. 192-197 online at http://www.scientific.net (2004) Trans Tech Publications, Switzerland Online available since 2004/08/15 Citation & Copright (to be inserted

More information

Chapters 1 7: Overview

Chapters 1 7: Overview Chapters 1 7: Overview Chapter 1: Introduction Chapters 2 4: Data acquisition Chapters 5 7: Data manipulation Chapter 5: Vertical imagery Chapter 6: Image coordinate measurements and refinements Chapter

More information

Chapters 1-4: Summary

Chapters 1-4: Summary Chapters 1-4: Summary So far, we have been investigating the image acquisition process. Chapter 1: General introduction Chapter 2: Radiation source and properties Chapter 3: Radiation interaction with

More information

Light source estimation using feature points from specular highlights and cast shadows

Light source estimation using feature points from specular highlights and cast shadows Vol. 11(13), pp. 168-177, 16 July, 2016 DOI: 10.5897/IJPS2015.4274 Article Number: F492B6D59616 ISSN 1992-1950 Copyright 2016 Author(s) retain the copyright of this article http://www.academicjournals.org/ijps

More information

This paper is published in the open archive of Mid Sweden University DIVA by permission of the publisher

This paper is published in the open archive of Mid Sweden University DIVA   by permission of the publisher This paper is published in the open archive of Mid Sweden University DIVA http://miun.diva-portal.org by permission of the publisher Roger Olsson and Youzhi Xu. An interactive ray-tracing based simulation

More information

Accurate projector calibration method by using an optical coaxial camera

Accurate projector calibration method by using an optical coaxial camera Accurate projector calibration method by using an optical coaxial camera Shujun Huang, 1 Lili Xie, 1 Zhangying Wang, 1 Zonghua Zhang, 1,3, * Feng Gao, 2 and Xiangqian Jiang 2 1 School of Mechanical Engineering,

More information

Plane Wave Imaging Using Phased Array Arno Volker 1

Plane Wave Imaging Using Phased Array Arno Volker 1 11th European Conference on Non-Destructive Testing (ECNDT 2014), October 6-10, 2014, Prague, Czech Republic More Info at Open Access Database www.ndt.net/?id=16409 Plane Wave Imaging Using Phased Array

More information

Surface and thickness profile measurement of a transparent film by three-wavelength vertical scanning interferometry

Surface and thickness profile measurement of a transparent film by three-wavelength vertical scanning interferometry Surface and thickness profile measurement of a transparent film by three-wavelength vertical scanning interferometry Katsuichi Kitagawa Toray Engineering Co. Ltd., 1-1-45 Oe, Otsu 50-141, Japan Corresponding

More information

Conversion of evanescent waves into propagating waves by vibrating knife edge

Conversion of evanescent waves into propagating waves by vibrating knife edge 1 Conversion of evanescent waves into propagating waves by vibrating knife edge S. Samson, A. Korpel and H.S. Snyder Department of Electrical and Computer Engineering, 44 Engineering Bldg., The University

More information

Understanding Variability

Understanding Variability Understanding Variability Why so different? Light and Optics Pinhole camera model Perspective projection Thin lens model Fundamental equation Distortion: spherical & chromatic aberration, radial distortion

More information

Effect of fundamental depth resolution and cardboard effect to perceived depth resolution on multi-view display

Effect of fundamental depth resolution and cardboard effect to perceived depth resolution on multi-view display Effect of fundamental depth resolution and cardboard effect to perceived depth resolution on multi-view display Jae-Hyun Jung, 1 Jiwoon Yeom, 1 Jisoo Hong, 1 Keehoon Hong, 1 Sung-Wook Min, 2,* and Byoungho

More information

FRED Slit Diffraction Application Note

FRED Slit Diffraction Application Note FRED Slit Diffraction Application Note The classic problem of diffraction through a slit finds one of its chief applications in spectrometers. The wave nature of these phenomena can be modeled quite accurately

More information