
3rd Dimension
Veritas et Visus, September 2008, Vol 3 No 9/10
UCI, p9; FUJIFILM, p7; Pixelux, p23; Eldim, p57

Letter from the publisher: Big Sky Country, by Mark Fihn (p2)
News from around the world (p4)
3DTV Conference 2008, May 28-30, 2008, Istanbul, Turkey (p25)
SID Display Week 2008, May, Los Angeles, California (p34)
Stereoscopic Displays and Applications Conference, January 28-30, San Jose, California (p46)
Behold a Miracle: the rise and slide of the Nimslo camera, by Michael Mullen (p51)
Characterization of autostereoscopic 3D displays, by Pierre Boher (p57)
Eyes tire if focus cues are wrong, by Adrian Travis (p72)
Motion parallax and stereoscopic displays, by Robert Patterson (p74)
Interview with Andrew Fear from nvidia (p76)
The right horse, by Neil Schneider (p78)
3D game review, by Eli Fihn (p80)
Ray directions - where ray tracing is going, by Fluppeteer (p81)
3D photography commentaries, by Sheldon Aronowitz (p86)
3D HDTV watch, by Andrew Woods (p88)
Fly me to the Moon: a 3D evaluation, by Daniel Smith (p90)
Classification of plano-stereoscopic displays, by Lenny Lipton (p94)
Calendar of events (p97)

The 3rd Dimension is focused on bringing news and commentary about developments and trends related to the use of 3D displays and supportive components and software. The 3rd Dimension is published electronically 10 times annually by Veritas et Visus, 3305 Chelsea Place, Temple, Texas, USA. Phone:

Publisher & Editor-in-Chief: Mark Fihn, mark@veritasetvisus.com
Managing Editor: Phillip Hill, phill@veritasetvisus.com
Contributors: Sheldon Aronowitz, Pierre Boher, Eli Fihn, Fluppeteer, Lenny Lipton, Michael Mullen, Robert Patterson, Neil Schneider, Daniel Smith, Adrian Travis, Andrew Woods

Subscription rate: US$47.99 annually. Single issues are available for US$7.99 each. Hard copy subscriptions are available upon request, at a rate based on location and mailing method.

Copyright 2008 by Veritas et Visus. All rights reserved. Veritas et Visus disclaims any proprietary interest in the marks or names of others.

Characterization of autostereoscopic 3D displays using a Fourier optics viewing angle instrument

Pierre Boher earned an engineering degree at the ECP (Ecole Centrale des Arts et Manufactures). After earning his Ph.D. in materials science in 1984, he worked for nine years in the French Philips Laboratories on the deposition and characterization of very thin films and multilayers. As R&D manager at SOPRA between 1995 and 2002, he was involved in the development of different metrology tools for non-destructive characterization, mainly for microelectronics. He joined ELDIM in 2003 to work on the research and development of new metrology heads.

by Pierre Boher

Optical characterization of autostereoscopic 3D displays is mandatory to optimize their performance and to make meaningful comparisons between them. Up to now, only quite simple optical characterizations are found in the literature, such as a single detector moved horizontally at the optimal viewing distance [1]. More sophisticated techniques, such as goniometers or Fourier optics instruments, have also been used, but generally in a quite restricted way, analyzing only a single cross section of the observer space [2-5]. From this limited information, different parameters have been defined, such as 3D crosstalk, optimum viewing distance and viewing freedom, that give a first evaluation of the performance of a 3D display [4]. The case of multi-view displays is much more complex, even if the measurement procedures are very similar. In particular, since three or more views can contribute to the image seen by one eye, the perceived image quality, via the distribution of crosstalk, becomes difficult to analyze, even if different attempts have been made in this direction [5].

In principle, all the 3D display optical characteristics can be derived from the measurement of the output luminance and colors versus viewing angle. Nevertheless, an excellent angular resolution is a priori required, given the high angular resolution of the human eye, and a measurement of the entire angular aperture is needed if a full characterization is wanted. Fourier optics instruments are extremely well adapted for this task, since the full angular range of view is measured rapidly and easily. Nevertheless, the angular resolution of standard instruments is limited to about 0.5°, which is not sufficient for accurate 3D display characterization.

The purpose of the present paper is twofold: we first present the new ELDIM Fourier optics instrument dedicated to 3D display characterization and its performance; then we propose a new way to analyze viewing angle measurements on twin view and multi-view 3D displays to get a full characterization of their optical performance. We show that the best viewing conditions of a given display can be visualized easily in the observer space and that an optimum viewing region can be defined and quantified. From these optimum viewing positions, all other needed parameters can be computed, such as viewing freedom, total luminance, standard contrast or color shifts. By making viewing angle measurements at different locations on the display, 3D spatial corrections can be quantified and verified. The display aspect for each eye of the observer can also be predicted. All these tools offer a new, precise characterization of autostereoscopic 3D displays for better optimization and easier performance comparison.
Viewing angle measurements and Fourier optics

Fourier optics viewing angle instruments: Fourier optics has the capacity to transform the angular response of a sample into spatial information that can be imaged by a 2D sensor (cf. figure 1). Each light beam emitted from the sample surface at an angle θ with respect to the surface normal is focused in the Fourier plane at the same azimuth and at a position x = F tan(θ). The angular emission of the sample is then measured simply and quickly, without any mechanical movement. In practice, the Fourier optics is an achromatic combination of several lenses (6 to 9) that collects nearly all the light coming from the display and focuses each angle onto an intermediate Fourier plane (cf. figure 2). A field lens and an imaging lens are then used to re-image the first Fourier plane onto the CCD sensor.
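To make the Fourier-plane mapping concrete, the short sketch below converts a position on the sensor into incidence and azimuth angles using x = F tan(θ). The focal length and pixel pitch are hypothetical values chosen for illustration, not the instrument's specifications.

```python
import numpy as np

def pixel_to_angles(px, py, pixel_pitch_mm=0.01, focal_mm=20.0):
    """Convert a Fourier-plane pixel offset (from the optical axis) into an
    incidence angle theta and azimuth phi, using x = F*tan(theta).
    pixel_pitch_mm and focal_mm are illustrative values only."""
    x = px * pixel_pitch_mm                       # offsets in the Fourier plane (mm)
    y = py * pixel_pitch_mm
    r = np.hypot(x, y)                            # radial distance from the optical axis
    theta = np.degrees(np.arctan(r / focal_mm))   # incidence angle (deg)
    phi = np.degrees(np.arctan2(y, x)) % 360.0    # azimuth angle (deg)
    return theta, phi

# Example: a point 350 pixels from the axis maps to about 10 degrees incidence
print(pixel_to_angles(350, 0))
```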

In the optical mounting patented by ELDIM, there is a field iris before the sensor that is the complex conjugate of the display surface; it allows adjusting the measurement spot size independently of the angular aperture. The consequence, as represented in figure 2, is that the measurement spot size varies with the angle. This is what we call cosine compensation, which is mandatory to get good collection efficiency even at large incidence angles. The size of the measuring spot can be easily adapted with an iris. Results are generally reported in the Fourier plane, where each point corresponds to one incidence angle θ and one azimuth angle φ. The optical resolution mainly depends on the optical setup, the measuring spot size and the pixel size of the CCD detector. It is less than 0.5° for standard ELDIM systems.

Figure 1 on the left shows the concept of Fourier optics; Figure 2 on the right is the patented optical setup of ELDIM viewing angle instruments.

Description of the VCMaster-3D viewing angle instrument: For 3D display characterization, we need to be able to measure all the views of the display independently and with a maximum of angular resolution. The spot size must then be sufficiently large to include tens of pixels for each view of the display. For the VCMaster-3D, a maximum spot diameter of 4mm has been selected. The optimum working distance is fixed to 20mm. This distance ensures only that the measured spot is always centered at the same location for each incidence angle; it is not a critical parameter for the measurements. The main feature of the new instrument is its very high angular resolution. The instrument can measure the full luminance and color viewing angle emission up to 50° using 5 color filters adjusted to the spectral response of the CCD. The main system characteristics are summarized in Table I.

Figure 3 on the left is the VCMaster-3D for 3D display characterization; Table I on the right shows the main characteristics of the VCMaster-3D.

Measurement procedure: For a given stereoscopic display, luminance viewing angle measurements are made for the ON state, the OFF state, and for white applied on each view with black on all the others. The example of a twin view cell phone display is reported in figure 4. The series of measurements is made at the center of the display and optionally at other locations.

In case of color shifts, color measurements can be made instead of luminance measurements. Red, green and blue states can also be applied instead of the white state for more precise color analysis. All the examples provided below are based on the contrast between black and white states. In practice, it can also be of interest to consider the contrast between one low gray level and one high gray level; this would be closer to practical viewing conditions, but all the definitions and computations are similar.

Figure 4: Viewing angle measurements at one location of a lenticular 3D cell phone display.

Computation of 3D optical properties

Axis definition and observer position: Our target is to calculate the light arriving from one location on the display at an observer located in front of the display. The emissive properties of this particular location have of course been measured by the Fourier optics viewing angle instrument as discussed in the previous paragraph. The observer is defined by its coordinates in the XYZ coordinate system (cf. figure 5). The origin O is always the display center. The X, Y and Z axes define the transverse, sagittal and coronal planes respectively, as schematically reported in figure 6. The observer position is taken as the midpoint between the two eyes. The line joining the two eyes is always assumed parallel to the X axis. The interpupillary distance is fixed (generally 6.25cm). For each observer position we can calculate the two polar angles θ and φ from the observer and display-point coordinates.

Here (xo, yo, zo) are the coordinates of one eye of the observer in the XYZ coordinate system, and (xe, ye, 0) are the coordinates of the emissive point on the display in the same system. The orientation of the observer's head is also a parameter needed to deduce the polar angles, especially at high incidence angles. We have made the calculations in two limiting configurations (cf. figure 7): either the observer keeps his eyes always perpendicular to the display surface independently of his location, or he stares at the location of interest on the display. Of course the results are similar near normal incidence.

Figure 5 on the left shows the system of coordinates used for the observer and one point on the display surface; Figure 6 on the right shows the calculations presented in the sagittal, transverse and coronal planes. Figure 7: Observer head position gives different angles: we examine two situations, with the eyes always perpendicular to the display surface or always perpendicular to the direction of the observed point.

Problem of the angular resolution: Before going into the calculation, it is important to understand why the angular resolution of the measurement plays a key role for the observer. The angular resolution is the capacity to distinguish two different light beams coming from the same location but with very close directions. To be realistic, the accuracy of the calculation in the plane of the observer must be better than, or of the order of, the human eye pupil diameter. We can easily calculate the maximum position uncertainty Δx from the angular resolution Δθ and the observer distance D.
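As a concrete illustration of these geometric quantities, the sketch below computes the incidence and azimuth angles seen from one eye, and the lateral position uncertainty in the observer plane. It is a plain geometric reconstruction from the definitions above (assuming θ is measured from the display normal and Δx = D·tan(Δθ)), not the author's exact formulas.

```python
import numpy as np

def eye_angles(eye_xyz, emit_xy):
    """Polar angles (theta, phi) of the line from an emissive display point
    to one eye. theta is measured from the display normal (Z axis), phi is
    the azimuth in the display plane. Geometric sketch only."""
    xo, yo, zo = eye_xyz
    xe, ye = emit_xy
    dx, dy = xo - xe, yo - ye
    theta = np.degrees(np.arctan2(np.hypot(dx, dy), zo))
    phi = np.degrees(np.arctan2(dy, dx)) % 360.0
    return theta, phi

def position_uncertainty(distance_cm, ang_res_deg):
    """Lateral uncertainty in the observer plane, Delta_x = D * tan(Delta_theta)."""
    return distance_cm * np.tan(np.radians(ang_res_deg))

# Right eye of an observer centered on axis at 37 cm, interpupillary distance 6.25 cm
print(eye_angles((+3.125, 0.0, 37.0), (0.0, 0.0)))
# A 0.5 deg resolution at 37 cm gives about 0.32 cm, larger than the minimum
# human pupil diameter (about 2 mm)
print(position_uncertainty(37.0, 0.5))
```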

As shown in figure 8, the angular resolution of 1° to 2° generally found for spectrophotometers is unacceptable even for cell phone displays with short working distances. The angular resolution of the VCMaster-3D system presented previously can cover all the needs, and in particular the 3D TV applications, where the working distance is always large.

Figure 8: Position uncertainty versus angular resolution: the minimum pupil diameter of the human eye gives a criterion for the metrology requirements.

Computation for twin view 3D displays

a) Optimum viewing region (OVR): The quality of the 3D display for an observer is directly related to his capacity to see clearly the correct image with each of his right and left eyes. In the case of twin view displays, we first calculate the two contrasts associated with each eye. Here (θr, φr) and (θl, φl) are the right and left eye positions in polar coordinates, calculated as reported above; YWRBL is the luminance for the white view on the right eye and the black view on the left eye, YBRWL is the luminance for the black view on the right eye and the white view on the left eye, and YBRBL is the luminance for the black view on both the right and left eyes. These two contrasts are nothing less than the inverse of the 3D crosstalk of the right and left eyes, χr and χl, as introduced by Montgomery [6]. We prefer to work with contrasts because they are quantities used every day in the field of standard displays, and they must be maximized for optimum viewing conditions, which is always easier to represent graphically. The 3D quality is optimum only when the two previous contrasts are maximized simultaneously. That is why we have decided to combine the two contrasts into what we call the 3D contrast for the observer.
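The sketch below shows one plausible form of these quantities, assuming the eye contrasts are the black-subtracted inverse-crosstalk ratios and the combined 3D contrast is their geometric mean (consistent with the product and square root discussed next); the luminance values are illustrative, not measured data.

```python
import numpy as np

def eye_contrast(Y_wanted, Y_unwanted, Y_black):
    """Contrast seen by one eye: wanted-view luminance over unwanted-view
    luminance, with the all-black level subtracted. This is the inverse of
    the usual 3D crosstalk definition (assumed form)."""
    return (Y_wanted - Y_black) / (Y_unwanted - Y_black)

def contrast_3d(c_right, c_left):
    """Combined 3D contrast: geometric mean of the two eye contrasts, so a
    good value requires both eyes to be good and the result keeps the
    dimension of a contrast (assumed form)."""
    return np.sqrt(c_right * c_left)

# Example with luminance values in cd/m^2 sampled at the two eye directions
c_r = eye_contrast(Y_wanted=180.0, Y_unwanted=12.0, Y_black=2.0)   # right eye
c_l = eye_contrast(Y_wanted=175.0, Y_unwanted=15.0, Y_black=2.0)   # left eye
print(c_r, c_l, contrast_3d(c_r, c_l))
```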

We take the product and not the sum because good quality requires a good contrast for the left and right eyes simultaneously. The square root maintains the dimensionality of the quantity as a contrast, which can be compared with the standard contrast of displays.

Figure 9: Calculated contrast for the left and right eye of an observer in the transverse plane: the emissive point is at the center of the display (origin), and the extension of the plane is fixed to 50cm in both directions.

To illustrate the definition, we have calculated this combined contrast for the twin view cell phone display whose viewing angle measurements are reported in figure 4. The calculation is made over a surface of 50x50cm and the eyes are assumed to stare at the center of the display. The contrasts for the left and right eyes in the transverse plane are reported in figure 9. The beams in their different directions are clearly seen, and their intersections define the optimum viewing region (OVR) as shown in figure 10. In the transverse plane (cf. figure 10.a), the main OVR is an elongated area. The two additional lateral regions with high 3D contrast correspond to an inverted situation where the right eye sees the image intended for the left eye and vice versa. In the sagittal plane (cf. figure 10.b), the OVR is an elliptical shape with quite large spatial extension. This is not surprising with lenticular lenses. We have reported two calculations in coronal planes at 37 and 28cm from the display surface: 37cm corresponds to the distance of the central position of the OVR, which we can define as the Optimum Viewing Distance (OVD), and 28cm is better optimized for the lateral inverted regions. We see that the lateral position of the observer is always extremely critical, as expected for a twin view display.

b) Definition of additional parameters using OVR: The first step is to define a criterion for acceptable 3D contrast. Tolerance limits for crosstalk have been reported to be around 5-10% [2]. In terms of 3D contrast, we can fix a percentage of the maximum value above which we accept to be inside the OVR. In the case of our example we have made calculations for 90, 95 and 97%, as shown in figure 11 and Table II. For each criterion we can calculate the volume of the OVR and define different Optical Viewing Freedom (OVF) values along the three spatial directions. We then define:

Transverse OVF: maximum extension of the OVR along the horizontal X axis.
Sagittal OVF: maximum extension of the OVR along the vertical Y axis.
Coronal OVF: maximum extension of the OVR along the depth Z axis.

These values are reported in Table II for the three criteria that we have selected. In each case the transverse OVF is always very restricted.
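To make the OVR criterion and the viewing freedom definitions concrete, the sketch below thresholds a sampled 3D contrast volume at a fraction of its maximum and measures the extent of the selected region along the three axes. The grid and the synthetic contrast values are purely illustrative, not measured data.

```python
import numpy as np

def ovr_extents(c3d, xs, ys, zs, criterion=0.95):
    """Select the optimum viewing region (OVR) as the set of observer positions
    where the 3D contrast exceeds `criterion` times its maximum, then return the
    transverse (X), sagittal (Y) and coronal (Z) optical viewing freedoms as the
    extent of that region along each axis (same units as xs/ys/zs)."""
    mask = c3d >= criterion * c3d.max()
    ix, iy, iz = np.nonzero(mask)
    return {"transverse_OVF": xs[ix].max() - xs[ix].min(),
            "sagittal_OVF":   ys[iy].max() - ys[iy].min(),
            "coronal_OVF":    zs[iz].max() - zs[iz].min()}

# Illustrative example only: a smooth synthetic 3D contrast peak centered 37 cm
# from the display, much narrower horizontally than vertically or in depth.
xs = np.linspace(-25, 25, 101)
ys = np.linspace(-25, 25, 101)
zs = np.linspace(20, 60, 81)
X, Y, Z = np.meshgrid(xs, ys, zs, indexing="ij")
c3d = 100.0 * np.exp(-(X / 2.0)**2 - (Y / 8.0)**2 - ((Z - 37.0) / 5.0)**2)
print(ovr_extents(c3d, xs, ys, zs, criterion=0.90))
```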

The quality of the display depends not only on the 3D contrast, which takes care of the differences between the two eyes, but also on the amount of light arriving at the two eyes of the observer. It is in particular easy to calculate, using the same method as previously, the mean luminance value for the two eyes in the ON state and the OFF state and to evaluate the standard contrast. Results in the sagittal plane are reported in figure 12. Unsurprisingly, there is no correlation between the 3D and 2D properties, and the poor quality of the OFF state of this display outside normal incidence reduces the standard contrast. We have evaluated the variation of the standard parameters inside the OVR for the three previous criteria. They are also reported in Table II. Strong standard contrast variations are noticed, especially for the least restrictive criterion for the OVR.

Figure 10: Calculated 3D contrast for an observer in the transverse, sagittal and coronal planes: two positions of the coronal plane are calculated, at 37cm (central optimized position) and 28cm (lateral inverted positions).

c) Extension to color shifts using color viewing angle measurements: All the computation made above is based on the luminance emitted by the display. Color shifts are also an important source of imperfection for autostereoscopic displays [5]. In the case of lenticular barrier displays, for example, this effect is difficult to reduce. The twin view cell phone display analyzed previously shows this type of effect (cf. figure 13). The viewing angle color measurements reported in the figure clearly show different angular emissions of the blue, green and red components. The approach proposed previously, using a 3D contrast ratio based on luminance, can be extended using the X and Z CIE components in addition to the luminance. We first define the X and Z contrasts for the right and left eyes of the observer, and it is then straightforward to combine the contrasts for the right and left eyes to get a 3D contrast value for each primary component X and Z in addition to Y.
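Assuming the X and Z contrasts take the same black-subtracted form as the luminance contrast above, a per-channel 3D contrast can be sketched as follows; the tristimulus values are illustrative only.

```python
import numpy as np

def channel_3d_contrast(Yw_r, Yu_r, Yw_l, Yu_l, Yb):
    """3D contrast for one CIE channel (X, Y or Z): geometric mean of the
    black-subtracted right- and left-eye contrasts (assumed form)."""
    c_r = (Yw_r - Yb) / (Yu_r - Yb)
    c_l = (Yw_l - Yb) / (Yu_l - Yb)
    return np.sqrt(c_r * c_l)

# Illustrative tristimulus samples at the two eye directions (arbitrary units):
# the X (red-weighted) channel comes out lower, shrinking the color OVR.
print(channel_3d_contrast(60.0, 8.0, 58.0, 9.0, 1.0))      # X component
print(channel_3d_contrast(180.0, 12.0, 175.0, 15.0, 2.0))  # Y component
print(channel_3d_contrast(90.0, 7.0, 88.0, 8.0, 1.5))      # Z component
```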

In practice we calculate three different OVRs, one for each CIE component. The common spatial region can be considered as the best color OVR for the display. We show this type of calculation for the twin view cell phone display in figure 14. The color shift between the red X component and the green Y component is clearly seen in both the transverse and sagittal planes. The maximum 3D contrast is not the same for the two quantities; it is always higher for the green component, which is generally optimized, but we can understand that the color OVR can be quite reduced by this type of effect. Additional parameters such as the OVD and the transverse, sagittal and coronal OVF can be deduced for each CIE component. Some examples are reported in Table III. In this case the red component is the most critical.

d) Measurements at different locations and 3D display viewpoint corrections: All types of autostereoscopic displays are normally designed with a corrected viewpoint to ensure that pixels at the edge of the display are seen correctly by the observer.

Figure 11: Different shapes of the OVR in the sagittal plane depending on the criterion on the 3D contrast. Figure 12: OFF state, ON state, 3D contrast and standard contrast in the sagittal plane.

Table II: Main parameters of the twin view cell phone display depending on the criterion for the OVR.

For lenticular lenses the pitch is adapted so that the center of each pair of pixels is projected to the center of the viewing window [6]. Using viewing angle measurements at different locations on the display surface, we can calculate different OVRs that should be similar if the display is correctly fabricated. In the computation, we just need to take into account the position of the measurement with respect to the display center. One example of such a verification is reported in figure 15 for the twin view cell phone display previously analyzed. In addition to the central point, two measurements have been made with horizontal shifts of +2cm and -2cm.

Figure 13: Color viewing angle measurement of the twin view cell phone display. Table III: Main parameters of the twin view cell phone display for the X, Y and Z components: the criterion used to define the OVR is fixed at 95%.
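The need for this viewpoint correction follows from the geometry alone: the same eye is viewed under different angles from different points of the display surface. A minimal sketch, restricted to the transverse plane and using illustrative values:

```python
import numpy as np

def patch_to_eye_angle(eye_x_cm, eye_z_cm, patch_x_cm):
    """Incidence angle (deg) from a display patch at horizontal offset patch_x_cm
    to an eye at (eye_x_cm, eye_z_cm); transverse plane only, for brevity."""
    return np.degrees(np.arctan2(eye_x_cm - patch_x_cm, eye_z_cm))

# The same right eye (3.125 cm off axis, 37 cm away) seen from the central patch
# and from a patch shifted +2 cm: the design has to compensate this angle change.
print(patch_to_eye_angle(3.125, 37.0, 0.0), patch_to_eye_angle(3.125, 37.0, 2.0))
```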

Figure 14: 3D contrast for the X and Y components: the red component gives a reduced OVR. Figure 15: Computed OVR for three positions on the twin-view display: center, and +2cm and -2cm in the horizontal direction.

Computation for multi-view 3D displays

3D contrast: As reported previously, the optimum viewing region of a twin view autostereoscopic display is very limited, so the observer must be located at a very precise position in front of the display to see it properly. In multi-view systems the idea is to increase the OVR by generating multiple simultaneous viewing windows, of which an observer sees just two at any time. In this way multi-view systems can also support more than one observer. The OVR can be defined in the same way as previously, taking all the views into account at the same time. We first define the contrast of view i with respect to all the other views, where Yi(θ,φ) is the luminance at location (θ,φ) when white is applied to view i and black to the other views, YB(θ,φ) is the luminance at location (θ,φ) when black is applied to all the views simultaneously, and N is the total number of views of the display. Ci(θ,φ) is related to the inverse of the crosstalk of view i. The factor (N-1) is there to keep the scale of the contrast comparable for each type of display.

For an observer located at (θ,φ) in polar coordinates, with his right and left eyes at (θr,φr) and (θl,φl), the quality of the display is good if the contrast seen by each of his eyes is good. In addition, he needs to see different images in the two eyes simultaneously, and only one image must be seen clearly by each eye. We have decided to define a 3D contrast related to the views seen by the observer in the following way. First we calculate the 3D contrast when view i is seen by the right eye and view j by the left eye. This quantity defines one limited OVR where the quality of 3D vision is optimum for view i in the right eye and view j in the left eye.

b) Combined 3D contrasts: This OVR is of course restricted in space, and the procedure must be repeated for all the views if one wants an overall performance of the display. We have decided to classify the different OVRs in the following way. We first calculate the combined 3D contrast assuming that the left and right eyes see two successive views of the display. We call it C1 because we assume a step of 1 between the right and left eye. One example for a 14-view parallax barrier cell phone display is reported in figure 16. We have reported the 3D contrast between view 1 and view 2 in the transverse (top left) and sagittal (top right) planes, and the combined 3D contrast C1 in the same planes (bottom left and right). As expected, the OVR defined by view 1 and view 2 is very restricted in the observer space. It is in fact comparable to that of the twin view lenticular lens cell phone display presented previously. The extension in the sagittal plane is nevertheless better, because parallax barriers depend only on the viewing angle of the LCD and not on the focusing properties of lenticular lenses. The combined 3D contrast C1 defines a much more extended and continuous OVR, in particular in the transverse plane. It means that an observer at the OVD can move horizontally without losing 3D contrast; he just shifts slightly from one view to the other. In fact, in the present example there is an overlap between the different OVRs that ensures this property. From the computation of C1 we can define different parameters related to the combined OVR, such as the OVD and the OVF in the different directions. We can also define an overlap ratio between the different limited OVRs.
In the same way, other combined 3D contrasts Ck can be defined for larger steps k between the views seen by the right and left eyes.
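One plausible implementation of these multi-view quantities is sketched below, under the following assumptions: the view contrast is the (N-1)-normalized ratio of the wanted view to the sum of all other views (black-subtracted), the pairwise 3D contrast is the geometric mean of the two eye values, and Ck is taken here as the best pairwise value over views separated by step k. These forms follow the definitions in the text but are assumptions, not the published equations.

```python
import numpy as np

def view_contrast(Y_views, Y_black, i):
    """Contrast of view i against all other views at one observer direction:
    (N-1) * (Y_i - Y_B) / sum over j != i of (Y_j - Y_B). Assumed form."""
    N = len(Y_views)
    signal = Y_views[i] - Y_black
    leakage = sum(Y_views[j] - Y_black for j in range(N) if j != i)
    return (N - 1) * signal / leakage

def pair_contrast_3d(Y_right, Y_left, Yb_right, Yb_left, i, j):
    """3D contrast when view i is seen by the right eye and view j by the
    left eye: geometric mean of the two view contrasts (assumed form)."""
    return np.sqrt(view_contrast(Y_right, Yb_right, i) *
                   view_contrast(Y_left, Yb_left, j))

def combined_contrast(Y_right, Y_left, Yb_right, Yb_left, step=1):
    """C_k at one observer position: best pairwise 3D contrast over all pairs
    of views separated by `step` (right eye sees view i+step, left eye view i)."""
    N = len(Y_right)
    return max(pair_contrast_3d(Y_right, Y_left, Yb_right, Yb_left, i + step, i)
               for i in range(N - step))

# Illustrative 4-view luminance samples at the two eye directions (cd/m^2)
Y_right = [5.0, 120.0, 9.0, 4.0]   # right eye mostly sees view 2 (index 1)
Y_left  = [115.0, 8.0, 5.0, 4.0]   # left eye mostly sees view 1 (index 0)
print(combined_contrast(Y_right, Y_left, Yb_right=2.0, Yb_left=2.0, step=1))
```

For N = 2, the view contrast above reduces to the twin-view eye contrast used earlier.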

Figure 16: 3D contrast of the 14-view cell phone display: between view 1 and view 2 (top) and between each view i and view i+1 (bottom). The calculation is reported in the transverse (left) and sagittal (right) planes.

The meaning of these new combined 3D contrasts is clear. Computations of C1, C2, C3 and C4 in the case of the parallax barrier cell phone display of figure 16 are reported in figure 17. As expected, different new combined OVRs appear when the observer is a little more distant from the display than the OVD of the C1 parameter. These new OVRs allow defining new optimum viewing distances related to the type of view seen by the observer. Important overlaps also exist between these OVRs, which can be quantified and optimized.

Computation of viewing aspect: The approach presented above is useful to quantify different important parameters, such as the OVD and OVF, for a given 3D display using Fourier optics viewing angle measurements at one or several locations on the display. Nevertheless, once the optimum viewing locations have been determined, the quality of the 3D display cannot be reduced to a few parameters like the 3D contrast defined above. To judge the real 3D quality of a display, an overall evaluation of the display aspect must be made.

Computation of the display aspect from an optimized viewing position: Knowing the OVR, it is then possible to simulate the display aspect using the same viewing angle measurements. The method has been published by ELDIM in different papers [8-9]. For conventional LCDs we have shown that their viewing angle behavior plays a key role in the display aspect. A homogeneous white image can appear inhomogeneous to the observer solely because of geometrical factors related to his position in front of the display. The practical standard contrast can fall rapidly even if the normal incidence contrast is high. For autostereoscopic 3D displays the problem is a little more complex because of the difficulty of controlling the views of the two eyes simultaneously. In principle, even for small 3D displays it is necessary to change the emissive properties of the display from the center to the two lateral sides to ensure good 3D quality.

We have already seen an example of this viewpoint correction. Using the display aspect simulation software from ELDIM, it is very easy to check what will be seen by the left and right eyes of an observer at any location in front of a 3D display. In principle the computation is perfectly accurate only if we completely know the emissive properties of the display at every location on its surface (in practice we make viewing angle measurements at several locations on the display and assume that the variation versus position is continuous).

Figure 17: 3D contrasts of the 14-view cell phone display in the transverse plane: C1 (top left), C2 (top right), C3 (bottom left) and C4 (bottom right) are reported.

Even if we have only one measurement at the center, the display aspect simulation is interesting in the sense that it emphasizes the effect of the geometric parameters. Such a simulation is reported in figure 18 for the twin view cell phone display already studied previously. We assume the observer is at one optimized position in the OVR (here on axis at 37.2cm from the display), and we calculate the image seen by his left and right eyes for two situations: a white image on the left eye and a black image on the right eye, and the opposite. As shown in figure 18, we can predict that the geometric angular correction is not necessary for a very small lateral size of less than 2cm. In practice, cell phone displays are 3 to 4cm wide, so a viewpoint correction is generally applied, as illustrated in figure 15. The interest of display aspect simulation is not limited to luminance but extends to color, as shown in figure 19. The computation conditions are similar to those of figure 18, but this time the display color is predicted using a viewing angle color measurement instead of a luminance measurement. The effect of color shifts can be emphasized and quantified.
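As a rough sketch of this kind of geometric aspect simulation (not ELDIM's software; the measured viewing-angle dataset is replaced by a hypothetical luminance_map callable), the predicted luminance of each point on the display surface for a fixed eye position can be computed as follows:

```python
import numpy as np

def simulate_aspect(eye_xyz, patch_xs, patch_ys, luminance_map):
    """Predicted image of the display seen by one eye: for every surface patch,
    find the direction to the eye and look up the luminance emitted in that
    direction. luminance_map(theta_deg, phi_deg) stands in for a measured
    viewing-angle dataset (hypothetical callable)."""
    xo, yo, zo = eye_xyz
    image = np.zeros((len(patch_ys), len(patch_xs)))
    for iy, ye in enumerate(patch_ys):
        for ix, xe in enumerate(patch_xs):
            dx, dy = xo - xe, yo - ye
            theta = np.degrees(np.arctan2(np.hypot(dx, dy), zo))
            phi = np.degrees(np.arctan2(dy, dx)) % 360.0
            image[iy, ix] = luminance_map(theta, phi)
    return image

# Toy angular emission: bright lobe centered 5 degrees off normal toward phi = 0
toy_map = lambda th, ph: 200.0 * np.exp(-((th * np.cos(np.radians(ph)) - 5.0) ** 2) / 8.0)
xs = np.linspace(-2.5, 2.5, 11)          # 5 x 5 cm emissive surface
ys = np.linspace(-2.5, 2.5, 11)
img = simulate_aspect((3.125, 0.0, 37.2), xs, ys, toy_map)
print(img.round(1))                       # inhomogeneity caused by geometry alone
```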

Figure 18: Simulated aspect of the twin view display from the OVD (37.2cm) for the right and left eyes with White-Black or Black-White patterns: the simulated luminance is reported for an emissive 5x5cm surface; the emissive properties at each point of the display are assumed similar to those at the center.

Conclusion

Autostereoscopic displays always exhibit complex angular emission properties in order to provide different images to the eyes of the observer under the best viewing conditions. A complete characterization requires not only the measurement of the luminance and color in particular directions but a full characterization of the viewing cone with high angular resolution. In this context, Fourier optics instruments appear to be the most convenient instruments for this task. In this paper we have presented a new system, the VCMaster-3D, dedicated to the characterization of 3D displays, which exhibits unprecedented angular resolution. We have also introduced the 3D contrast, which is useful to locate the optimum viewing region for an observer in front of the display. Different parameters such as the optimum viewing distance and the viewing freedom can then be calculated, and the display aspect from a given location can be predicted for the right and left eyes of the observer. The approach can be extended to displays presenting color shifts by using color viewing angle measurements instead of luminance ones. For multi-view displays, the same approach can be applied and the overlap between the different views can be quantified. We hope that these new tools will be helpful to the engineers in charge of the development of such displays, and also that they will offer better quantified comparisons between displays for the customer.

Figure 19: Simulated aspect of the twin view display from the OVD (37.2cm) for the right and left eyes with White-Black or Black-White patterns: the simulated color is reported for an emissive 5x5cm surface; the emissive properties at each point of the display are assumed similar to those at the center.

References

1. H. Hong, Autostereoscopic 2D/3D switching display using electric field driven LC lens, 25.3, SID
2. N. Takanashi, S. Uehara, J. Ishii, H. Hayana, H. Asada, Dual lenticular lense based 2D/3D convertible autostereoscopic display, J. of SID, 335, 12
3. S. Uehara, N. Ikeda, N. Takanashi, M. Iriguchi, M. Sugimoto, T. Matsuzaki, H. Asada, A 470x235 ppi poly-Si TFT LCD for high resolution 2D and 3D autostereoscopic displays, J. of SID, 209, 13
4. T. Jarvenpaa, M. Salmimaa, Optical characterization methods for autostereoscopic 3D displays, Eurodisplay
5. M. Salmimaa, T. Jarvenpaa, Objective evaluation of multi-view autostereoscopic displays, 20.4, SID
6. J. Montgomery, Analysis of the performance of a flat panel display system convertible between 2D and 3D modes, Proc. SPIE, Vol. 4297
7. Y. Nojiri, Visual comfort/discomfort and visual fatigue caused by stereoscopic HDTV viewing, Proc. SPIE, Vol. 5291, 303
8. P. Boher, Study of LCD emission features yields clues to display aspect, Display Devices, Fall 05, N 40, 23
9. P. Boher, V. Gibour, T. Leroux, Can we predict LCD aspect using local viewing angle measurements, IDW05, 2005 Dec 6-9, Takamatsu, Japan
10. D. Sandin, T. Margolis, J. Ge, J. Girado, T. Peterka, T. DeFanti, The Varrier autostereoscopic virtual reality display, Proc. ACM SIGGRAPH, 894
11. N. Holliman, 3D display systems


TEAMS National Competition Middle School Version Photometry Solution Manual 25 Questions TEAMS National Competition Middle School Version Photometry Solution Manual 25 Questions Page 1 of 14 Photometry Questions 1. When an upright object is placed between the focal point of a lens and a converging

More information

3DPIXA: options and challenges with wirebond inspection. Whitepaper

3DPIXA: options and challenges with wirebond inspection. Whitepaper 3DPIXA: options and challenges with wirebond inspection Whitepaper Version Author(s) Date R01 Timo Eckhard, Maximilian Klammer 06.09.2017 R02 Timo Eckhard 18.10.2017 Executive Summary: Wirebond inspection

More information

Centre for Digital Image Measurement and Analysis, School of Engineering, City University, Northampton Square, London, ECIV OHB

Centre for Digital Image Measurement and Analysis, School of Engineering, City University, Northampton Square, London, ECIV OHB HIGH ACCURACY 3-D MEASUREMENT USING MULTIPLE CAMERA VIEWS T.A. Clarke, T.J. Ellis, & S. Robson. High accuracy measurement of industrially produced objects is becoming increasingly important. The techniques

More information

Imaging Sphere Measurement of Luminous Intensity, View Angle, and Scatter Distribution Functions

Imaging Sphere Measurement of Luminous Intensity, View Angle, and Scatter Distribution Functions Imaging Sphere Measurement of Luminous Intensity, View Angle, and Scatter Distribution Functions Hubert Kostal, Vice President of Sales and Marketing Radiant Imaging, Inc. 22908 NE Alder Crest Drive, Suite

More information

Lab 5: Diffraction and Interference

Lab 5: Diffraction and Interference Lab 5: Diffraction and Interference Light is a wave, an electromagnetic wave, and under the proper circumstances, it exhibits wave phenomena, such as constructive and destructive interference. The wavelength

More information

Chapter 34. Images. In this chapter we define and classify images, and then classify several basic ways in which they can be produced.

Chapter 34. Images. In this chapter we define and classify images, and then classify several basic ways in which they can be produced. Chapter 34 Images One of the most important uses of the basic laws governing light is the production of images. Images are critical to a variety of fields and industries ranging from entertainment, security,

More information

1 Laboratory #4: Division-of-Wavefront Interference

1 Laboratory #4: Division-of-Wavefront Interference 1051-455-0073, Physical Optics 1 Laboratory #4: Division-of-Wavefront Interference 1.1 Theory Recent labs on optical imaging systems have used the concept of light as a ray in goemetrical optics to model

More information

Plane Wave Imaging Using Phased Array Arno Volker 1

Plane Wave Imaging Using Phased Array Arno Volker 1 11th European Conference on Non-Destructive Testing (ECNDT 2014), October 6-10, 2014, Prague, Czech Republic More Info at Open Access Database www.ndt.net/?id=16409 Plane Wave Imaging Using Phased Array

More information

Numerical investigation on the viewing angle of a lenticular three-dimensional display with a triplet lens array

Numerical investigation on the viewing angle of a lenticular three-dimensional display with a triplet lens array Numerical investigation on the viewing angle of a lenticular three-dimensional display with a triplet lens array Hwi Kim,, * Joonku Hahn, and Hee-Jin Choi 3 Department of Electronics and Information Engineering,

More information

Multimedia Technology CHAPTER 4. Video and Animation

Multimedia Technology CHAPTER 4. Video and Animation CHAPTER 4 Video and Animation - Both video and animation give us a sense of motion. They exploit some properties of human eye s ability of viewing pictures. - Motion video is the element of multimedia

More information

Chapter 24. Wave Optics. Wave Optics. The wave nature of light is needed to explain various phenomena

Chapter 24. Wave Optics. Wave Optics. The wave nature of light is needed to explain various phenomena Chapter 24 Wave Optics Wave Optics The wave nature of light is needed to explain various phenomena Interference Diffraction Polarization The particle nature of light was the basis for ray (geometric) optics

More information

1. What is the law of reflection?

1. What is the law of reflection? Name: Skill Sheet 7.A The Law of Reflection The law of reflection works perfectly with light and the smooth surface of a mirror. However, you can apply this law to other situations. For example, how would

More information

Directional Backlights for Time-multiplexed 3D Displays. As the advancement of image display from monochromic to color, each

Directional Backlights for Time-multiplexed 3D Displays. As the advancement of image display from monochromic to color, each Chapter 6 Directional Backlights for Time-multiplexed 3D Displays As the advancement of image display from monochromic to color, each development is driven by pursuing ever more natural visions. Therefore,

More information

2/26/2016. Chapter 23 Ray Optics. Chapter 23 Preview. Chapter 23 Preview

2/26/2016. Chapter 23 Ray Optics. Chapter 23 Preview. Chapter 23 Preview Chapter 23 Ray Optics Chapter Goal: To understand and apply the ray model of light. Slide 23-2 Chapter 23 Preview Slide 23-3 Chapter 23 Preview Slide 23-4 1 Chapter 23 Preview Slide 23-5 Chapter 23 Preview

More information

Measuring Light: Radiometry and Cameras

Measuring Light: Radiometry and Cameras Lecture 11: Measuring Light: Radiometry and Cameras Computer Graphics CMU 15-462/15-662, Fall 2015 Slides credit: a majority of these slides were created by Matt Pharr and Pat Hanrahan Simulating a pinhole

More information

Supplementary Information

Supplementary Information Supplementary Information Interferometric scattering microscopy with polarization-selective dual detection scheme: Capturing the orientational information of anisotropic nanometric objects Il-Buem Lee

More information