Curved Projection Integral Imaging Using an Additional Large-Aperture Convex Lens for Viewing Angle Improvement


Joobong Hyun, Dong-Choon Hwang, Dong-Ha Shin, Byung-Goo Lee, and Eun-Soo Kim

In this paper, we propose a curved projection integral imaging system to improve the horizontal and vertical viewing angles. The proposed system can be easily implemented by the additional use of a large-aperture convex lens in conventional projection integral imaging. To obtain the simultaneous display of 3D images through the real and virtual image fields, we propose a computer-generated pickup method based on ray optics, and elemental images are synthesized for the proposed system. To show the feasibility of the proposed system, preliminary experiments are carried out. Experimental results indicate that our system improves the viewing angle and displays 3D images simultaneously in the real and virtual image fields.

Keywords: Three-dimensional (3D) imaging, integral imaging, lenslet array, viewing angle.

Manuscript received May 26, 2008; revised Dec. 30, 2008; accepted Feb. 18, 2009.

This research was supported by the Ministry of Information and Communication (MIC), Rep. of Korea, under the Information Technology Research Center (ITRC) support program supervised by the Institute of Information Technology Assessment (IITA) (IITA-2008-C1090-0801-0018).

Joobong Hyun (phone: +82 2 940 5118, email: joobong99@nate.com) was with the Department of Electronics, Kwangwoon University, Seoul, Rep. of Korea, and is now with the TV Circuit Design Team, LG Display Co., Ltd., Gyeonggi-do, Rep. of Korea.

Dong-Choon Hwang (email: aggape@kw.ac.kr) was with the Department of Electronics, Kwangwoon University, Seoul, Rep. of Korea, and is now with the Digital R&D Center, Samsung Electronics Co., Ltd., Gyeonggi-do, Rep. of Korea.

Dong-Ha Shin (phone: +82 51 320 2663, email: shindh2@dongseo.ac.kr) and Byung-Goo Lee (email: lbg@dongseo.ac.kr) are with the Department of Visual Contents, Dongseo University, Busan, Rep.
of Korea.

Eun-Soo Kim (email: eskim@kw.ac.kr) is with the Department of Electronics, Kwangwoon University, Seoul, Rep. of Korea.

ETRI Journal, Volume 31, Number 2, April 2009

I. Introduction

Since integral imaging was first proposed by Lippmann in 1908, it has been a promising three-dimensional (3D) imaging and display technique with full parallax under incoherent light [1]-[7]. In general, integral imaging consists of pickup and display parts. In the pickup part, rays coming from a 3D object are optically recorded as elemental images, each of which has its own perspective of the 3D object, by a lenslet array and a two-dimensional (2D) image sensor. The display part is the reverse of the pickup part: the recorded elemental images are displayed on a display panel, and a 3D image is reconstructed through the lenslet array in front of the display panel. Although integral imaging has many advantages, including full parallax, a continuous viewing point, and full-color images, it also has disadvantages, such as a narrow viewing angle and low resolution.

Recently, to improve the viewing angle, curved integral imaging systems using a curved lens array have been reported [8], [9]. However, their optical implementation has been limited to horizontal-only systems because it is difficult to fabricate a curved lens array and a curved display panel. To overcome this problem, a curved integral imaging system was proposed which additionally uses a large-aperture convex lens [10]. This system has a simple structure because it combines well-fabricated flat devices with the additional large-aperture convex lens.

Recently, several projection integral imaging (PII) systems with 2D image projectors have been reported [9], [11]-[13]. It is easy to implement the simple structure of an integral imaging display system using PII because of the simple mapping between the elemental images and the lenslet array. In 2004, Jang and Javidi proposed a PII system in which high-resolution elemental images were displayed using multiple projectors; thus, the resolution of the reconstructed 3D images was improved [11]. They also proposed a PII system in which the lenslet array is in contact with a large-aperture concave lens in order to pick up and display large 3D objects that are far away [12]. In 2004, Kim and others proposed a curved PII (CPII) system which uses a curved lenslet array to improve the viewing angle [9]. However, this system has also been limited to horizontal-only operation, and it requires complex modification of the elemental images.

In this paper, to improve the horizontal and vertical viewing angles, we propose a CPII system which adds a large-aperture convex lens to the conventional PII system. The proposed system has a simple structure due to the use of a well-fabricated flat lenslet array and a large-aperture convex lens. To obtain the simultaneous display of 3D images through the real and virtual image fields, we introduce a computer-generated pickup method, and elemental images are synthesized for the proposed system. To show the usefulness of the proposed system, preliminary experiments are carried out and some experimental results are presented.

II. Proposed CPII System

1. System Structure

The ideal configuration of the CPII system using a curved lenslet array, a curved screen, and a 2D image projector is shown in Fig. 1. Using a curved lenslet array and a curved screen provides a wide viewing angle; however, it is difficult to fabricate curved devices. In addition, the projected elemental images must be modified for projection on the curved screen.
To overcome this problem, we propose a CPII system which has the equivalent effect of a curved lenslet array: it uses a flat lenslet array and a large-aperture convex lens, as shown in Fig. 1. With a flat lenslet array, a flat screen can be used. The additional large-aperture convex lens provides a multidirectional curvature effect and a wide viewing angle.

Fig. 1. The ideal CPII system and the proposed CPII system.

2. Synthesis of Elemental Images for Real and Virtual Display of 3D Images

To obtain elemental images for the proposed CPII system, we can use a computer graphics (CG) pickup method. In this paper, the CG pickup technique based on an ABCD matrix [10] is extended to the synthesis of elemental images for the real and virtual display of 3D images. To simplify the illustration of the ray analysis, we consider a 1D analysis; the extension to 2D analysis is straightforward.

First, consider the elemental images of a 3D object located in the real image field. Figure 2 shows the scheme to synthesize the elemental images in the real image field of the proposed CPII system. To simplify the explanation, we suppose the use of a pinhole array and a conventional large-aperture thin lens. Let us define the distance between the lenslet array and the display panel as g and the focal length of the thin lens as f. As shown in Fig. 2, we consider a ray from the n-th pixel of the k-th elemental image. This starting ray goes toward the k-th lenslet and is given by

$$\begin{bmatrix} H_{nk}(z=0) \\ A_{nk}(z=0) \end{bmatrix} = \begin{bmatrix} kp + nd \\ -\dfrac{nd}{g} \end{bmatrix}, \qquad (1)$$

where H(z) and A(z) denote the height and the angle of the ray at position z, respectively; p is the pitch of the lenslet; and d is the size of a pixel in the elemental image. When z = g, the ray becomes

$$\begin{bmatrix} H_{nk}(g) \\ A_{nk}(g) \end{bmatrix} = \begin{bmatrix} kp \\ -\dfrac{nd}{g} \end{bmatrix}. \qquad (2)$$

For the rays that intersect at the corresponding pinhole, the combined transmission matrix from the pinhole plane z = g to an arbitrary distance z_r is calculated by

$$T_r = \begin{bmatrix} 1 & z_r - g \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 0 \\ -1/f & 1 \end{bmatrix}. \qquad (3)$$

From (2) and (3), we can calculate the height of each ray at the distance z_r. Then we obtain

$$H_{nk}(z_r) = kp\left(1 - \frac{z_r - g}{f}\right) - \frac{(z_r - g)\,nd}{g}. \qquad (4)$$

Fig. 2. Ray tracing for synthesizing elemental images in the real image field.

The analysis of rays in the virtual image field is somewhat different from that in the real image field. Next, we explain the elemental images of a 3D object in the virtual image field. Figure 3 shows the scheme to synthesize the elemental images in the virtual image field of the proposed CPII system. A ray from the n-th pixel of the k-th elemental image to the thin lens passes through the real image field; after the ray leaves the thin lens, however, it is traced in the -z direction. If the ray reaches the distance z_v in the virtual image field, the transmission matrix of the ray is calculated by

$$T_v = \begin{bmatrix} 1 & -(g + z_v) \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 0 \\ -1/f & 1 \end{bmatrix}. \qquad (5)$$

Using (2) and (5), the height of each ray at the distance z_v located in the virtual image field is given by

$$H_{nk}(z_v) = kp\left(1 + \frac{g + z_v}{f}\right) + \frac{(g + z_v)\,nd}{g}. \qquad (6)$$

Fig. 3. Ray tracing for synthesizing elemental images in the virtual image field.

To synthesize the final elemental images, we should calculate all n and k using (4) and (6) according to the distance. Note that when f is infinite, (6) reduces to the analysis of conventional PII without a large-aperture thin lens.

3. Calculation of Viewing Angle

In integral imaging, the viewing angle is a major factor in the display system because an observer wants to see the 3D images without restriction on viewing positions. Basically, the viewing angle of integral imaging is restricted by the limited f-number of the lenslet array. When an observer sees the 3D images outside the viewing angle, image flipping occurs. Therefore, it is important to increase the viewing angle of integral imaging.

Figure 4 is a diagram of the viewing angle in conventional integral imaging. The viewing angle depends on the f-number of the lenslet array and is given by

$$\theta_c = 2\arctan\left(\frac{p}{2g}\right). \qquad (7)$$

On the other hand, the CPII system we propose can enhance the viewing angle of the 3D object in all directions. This can be calculated by considering the number of lenslets.

Fig. 4. Viewing angle in the conventional system [8] and the proposed system.

The concept of our viewing angle is shown in Fig. 4.
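The ray analysis above can be checked numerically. The following sketch propagates the ray state of (2) through the ABCD matrices of (3) and (5) and verifies the closed forms (4) and (6); the pixel indices and distances used here are illustrative values, not parameters taken from the experiment.

```python
import numpy as np

def ray_height(n, k, p, d, g, f, z, virtual=False):
    """Height at depth z of the ray from pixel n of elemental image k,
    propagated with the ABCD matrices of (3) (real field) or (5) (virtual field)."""
    ray = np.array([k * p, -n * d / g])            # state [H; A] at z = g, eq. (2)
    lens = np.array([[1.0, 0.0], [-1.0 / f, 1.0]])  # thin lens of focal length f
    dist = -(g + z) if virtual else (z - g)         # signed propagation distance
    prop = np.array([[1.0, dist], [0.0, 1.0]])
    return (prop @ lens @ ray)[0]

# Illustrative values: pitch p, pixel size d = p/30, gap g, focal length f (mm).
p, d, g, f = 1.08, 1.08 / 30, 5.2, 200.0
n, k, z = 5, 3, 30.0

# The closed forms (4) and (6) must agree with the matrix propagation.
h_real = k * p * (1 - (z - g) / f) - (z - g) * n * d / g
h_virt = k * p * (1 + (g + z) / f) + (g + z) * n * d / g
assert np.isclose(ray_height(n, k, p, d, g, f, z), h_real)
assert np.isclose(ray_height(n, k, p, d, g, f, z, virtual=True), h_virt)
```

Because (4) and (6) are simply the first component of the corresponding matrix product applied to (2), the assertions hold for any choice of n, k, and z.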

When the number of lenslets K is odd, the viewing angle is determined by two rays starting from the -K/2-th and K/2-th lenslets. This is the maximum viewing angle in the proposed system. The viewing angle θ_max can be derived as

$$\theta_{\max} = 2\arctan\left(\frac{p}{2g} + \frac{Kp}{2f}\right). \qquad (8)$$

Figure 5 shows the viewing angle calculated using (8) when p = 1.08 mm, g = 5.2 mm, and f = 200 mm. A large K value provides a large viewing angle. However, the K value is limited by the size of the large-aperture lens due to the fabrication difficulty. This tradeoff should be considered in designing the CPII system.

Fig. 5. Maximal viewing angle of the proposed system according to the K number.

III. Experiments and Results

To demonstrate the proposed CPII system, preliminary experiments were performed using the optical setup shown in Fig. 6. A 2D LCD projector was used to display the elemental images. We used a lenslet array of 34 × 25 lenslets in which each lenslet is mapped to 30 × 30 pixels in the image projector. The diameter and focal length of the lenslets are p = 1.08 mm and g = 5.2 mm, respectively. The focal length of the large-aperture lens is f = 200 mm. The 3D object was composed of two character patterns as shown in Fig. 6: the Chinese character 光 was positioned at 30 mm from the lenslet array in the virtual image field, and the Chinese character 云 was positioned at 30 mm from the lenslet array in the real image field.

Fig. 6. Experimental setup.

First, we synthesized the elemental images for the CPII system by using (4) and (6). The synthesized elemental images are shown in Fig. 7. For comparison, we also present elemental images generated with the conventional PII in Fig. 7. In the proposed CPII system, fewer elemental images were sampled for the character 光 and more elemental images were sampled for the character 云 than in the conventional PII system. This is because the character 光 is picked up in the virtual image field and the character 云 is picked up in the real image field.

Fig. 7. Synthesized elemental images: conventional PII and proposed CPII.

Next, we displayed the synthesized elemental images with the image projector shown in Fig. 6. Figure 8 shows the reconstructed images captured by the CCD camera in the real and virtual image fields when using the proposed CPII system: one image is focused at the virtual image field and the other at the real image field. Figure 9 shows the reconstructed images from different viewing angles.
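As a numerical cross-check of the viewing-angle analysis, (7) and (8) can be evaluated directly under the experimental conditions of this section. The odd lenslet count K = 33 below is an assumption chosen to approximate the 34-lenslet width of the array.

```python
import math

def theta_conventional(p, g):
    """Viewing angle of conventional integral imaging, eq. (7), in degrees."""
    return math.degrees(2 * math.atan(p / (2 * g)))

def theta_proposed(p, g, f, K):
    """Maximum viewing angle of the proposed CPII system, eq. (8), in degrees."""
    return math.degrees(2 * math.atan(p / (2 * g) + K * p / (2 * f)))

p, g, f = 1.08, 5.2, 200.0   # mm, experimental conditions of section III
print(round(theta_conventional(p, g), 1))     # ~11.9 deg, consistent with the reported 12 deg
print(round(theta_proposed(p, g, f, 33), 1))  # ~21.8 deg for the assumed K = 33
```

These values match the theoretical angles of approximately 12° and 22° quoted for the conventional and proposed systems.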

Fig. 8. Focused image in the virtual image field and focused image in the real image field using the proposed CPII system.

We observed the full 光 image within 6° to the left and right in the conventional PII system, as shown in Fig. 9. However, in the proposed CPII system, we observed full images within 10° to the left and right. Therefore, the measured viewing angle of the conventional PII system was approximately 12°, and that of the proposed CPII system was approximately 20°. Under the experimental conditions shown in Fig. 6, the theoretical viewing angles of the conventional and proposed systems were calculated for comparison with the optical experiments. From (7), (8), and Fig. 5, the theoretical viewing angles of the conventional PII system and the proposed CPII system were approximately 12° and 22°, respectively. The calculated viewing angles agree well with the experimental results. From these results, we can see that the proposed CPII system improves the viewing angle and displays 3D images simultaneously through the real and virtual image fields.

Fig. 9. Experimental results from 10° left to 10° right: conventional PII system and proposed CPII system.

IV. Conclusion

We proposed a CPII system which provides a curvature effect by adding a large-aperture convex lens to the conventional PII system. The proposed system has a simple structure that uses well-fabricated flat devices without any modification. The experimental results demonstrate that the proposed system provides an improved viewing angle and can display 3D images simultaneously through the real and virtual image fields.

References

[1] S.A. Benton, Selected Papers on Three-Dimensional Displays, SPIE Optical Engineering Press, Bellingham, WA, 2001.

[2] G. Lippmann, La Photographie Intégrale, C.R. Acad. Sci., vol. 146, 1908, pp. 446-451.

[3] F.
Okano et al., Three-Dimensional Video System Based on Integral Photography, Opt. Eng., vol. 38, 1999, pp. 1072-1077.

[4] H. Arimoto and B. Javidi, Integral Three-Dimensional Imaging with Digital Reconstruction, Opt. Lett., vol. 26, 2001, pp. 157-159.

[5] B. Lee et al., Three-Dimensional Display by Use of Integral Photography with Dynamically Variable Image Planes, Opt. Lett., vol. 26, 2001, pp. 1481-1482.

[6] J.-S. Jang and B. Javidi, Improved Viewing Resolution of Three-Dimensional Integral Imaging by Use of Nonstationary Micro-optics, Opt. Lett., vol. 27, 2002, pp. 324-326.

[7] D.-H. Shin et al., Improved Viewing Quality of 3-D Images in Computational Integral Imaging Reconstruction Based on Round Mapping Model, ETRI J., vol. 29, no. 5, Oct. 2007, pp. 649-654.

[8] Y. Kim et al., Viewing-Angle-Enhanced Integral Imaging System Using a Curved Lens Array, Opt. Express, vol. 12, 2004, pp. 421-429.

[9] Y. Kim et al., Wide-Viewing-Angle Integral Three-Dimensional Imaging System by Curving a Screen and a Lens Array, Appl. Opt., vol. 44, 2005, pp. 546-552.

[10] D.-H. Shin, B. Lee, and E.-S. Kim, Multidirectional Curved Integral Imaging with Large Depth by Additional Use of a Large-Aperture Lens, Appl. Opt., vol. 45, 2006, pp. 7375-7381.

[11] J.-S. Jang, Y.-S. Oh, and B. Javidi, Spatiotemporally Multiplexed Integral Imaging Projector for Large-Scale High-Resolution Three-Dimensional Display, Opt. Express, vol. 12, 2004, pp. 557-563.

[12] J.-S. Jang and B. Javidi, Depth and Lateral Size Control of Three-Dimensional Images in Projection Integral Imaging, Opt. Express, vol. 12, 2004, pp. 3778-3790.

[13] J.-S. Jang and B. Javidi, Very Large-Scale Integral Imaging (VLSII) for 3-D Display, Opt. Eng., vol. 44, 2005, pp. 014001-1-014001-6.

Joobong Hyun received the BS and MS degrees in electronic engineering from Kwangwoon University, Seoul, Korea, in 2006 and 2008, respectively. Currently, he is an engineer with the TV Circuit Design Team of LG Display Co., Ltd., Gyeonggi-do, Korea.

Dong-Choon Hwang received the BS degree in electronic engineering from Sunchon National University, Korea, in 2003, and the MS and PhD degrees in electronic engineering from the graduate school of Kwangwoon University, Seoul, Korea, in 2005 and 2008, respectively. He is currently an engineer with the Digital R&D Center, Samsung Electronics Co., Ltd., Korea. He is interested in 3D displays, image processing, and optical information processing.
Dong-Ha Shin received the BS, MS, and PhD degrees in telecommunication and information engineering from Pukyong National University, Busan, Korea, in 1996, 1998, and 2001, respectively. From 2001 to 2004, he was a senior researcher with TS-Photon, established by Toyohashi University of Technology, Japan. From 2005 to 2006, he was with the 3D Display Research Center (3DRC-ITRC), Kwangwoon University, Seoul, Korea. He is currently a research professor with the Department of Visual Contents, Dongseo University. His research interests include 3D displays, optical information processing, and optical data storage.

Byung-Goo Lee received his BS degree in mathematics from Yonsei University, Korea, in 1987, and his MS and PhD degrees in applied mathematics from the Korea Advanced Institute of Science and Technology (KAIST) in 1989 and 1993, respectively. He worked at the DACOM Corp. R&D Center as a senior engineer from March 1993 to February 1995 and has been working at Dongseo University since 1995. He is currently an assistant professor with the Division of Computer Information Engineering at Dongseo University, Korea. His research interests include computer graphics, computer-aided geometric design, and image processing.

Eun-Soo Kim received the BS, MS, and PhD degrees in electronic engineering from Yonsei University, Seoul, Korea, in 1978, 1980, and 1984, respectively. In 1981, he joined the faculty of Kwangwoon University, where he is currently a professor with the Department of Electronic Engineering. He was a visiting research associate with the California Institute of Technology, Pasadena, from February 1987 to August 1988. He is now the director of the 3D Display Research Center (3DRC-ITRC) and the head professor of the National Research Lab of 3D Media. He has also been the acting president of the Society for 3D Broadcasting and Imaging since 2000.
His research interests include 3D imaging and display systems, 3DTV, holographic and stereoscopic image processing, and 3D camera systems.