Transparent Object Shape Measurement Based on Deflectometry

Proceedings

Transparent Object Shape Measurement Based on Deflectometry

Zhichao Hao and Yuankun Liu *

Opto-Electronics Department, Sichuan University, Chengdu 610065, China; 2016222055148@stu.scu.edu.cn
* Correspondence: lyk@scu.edu.cn; Tel.: +86-028-8546-3879

Presented at the 18th International Conference on Experimental Mechanics (ICEM18), Brussels, Belgium, 1–5 July 2018.

Published: 9 July 2018

Abstract: This paper proposes a method for obtaining the surface normal orientation and the 3-D shape of a plano-convex lens using refraction stereo. We show that two viewpoints are sufficient to solve this problem when the refractive index of the object is known, provided that (1) an accurate function maps each pixel to the refraction point on the object, and (2) each ray is refracted only once. In the simulation, the actual measurement process is simplified accordingly: light is refracted only once, and the one-to-one correspondence between incident and refracted rays is established from known object points. The deformed grating produced by the refraction is also constructed in the simulation. A plano-convex lens with a focal length of 242.8571 mm is used for stereo data acquisition, normal direction computation, and the normal-consistency test, and the three-dimensional shape of the lens is then restored by computer simulation. The simulation results suggest that the method is feasible. In the actual experiment, where light may be refracted more than once, calibration data acquisition based on phase measurement, phase shifting, and temporal phase unwrapping are combined to (1) calibrate the positional relationship between the monitor and the cameras and (2) match each incident ray with its refracted ray.

Keywords: deflectometry; stereo vision; fringe

1. Introduction

Phase Measuring Deflectometry (PMD), which stems from structured-illumination measurement technology, uses the deflection of light to determine the 3D shape of an object. In practice, PMD can measure two kinds of objects: mirrors or mirror-like objects, and transparent phase objects. For the former, the law of reflection is used: the mirror image of a standard fringe pattern on the tested object is recorded, and the three-dimensional surface shape of the mirror object is calculated from the phase change of the imaged fringes. For the latter, the equivalent wave-front distribution is determined by measuring the changes that the transparent object induces in the standard fringes. Petz et al. [1] put forward a method for measuring specular objects based on active PMD: the original ray and the ray deflected by the object are determined by ray tracing, their intersection point, i.e., the measured mirror surface point, is calculated from the two line equations, and the three-dimensional surface shape of the mirror is thus obtained. Knauer et al. [2] and Kaminski et al. [3] also proposed stereo PMD for full-field measurement of specular objects, using matching techniques similar to stereo vision.

Proceedings 2018, 2, 548; doi:10.3390/icem18-05428

Canabal et al. [4] tested the surface shape of a graded-refractive-index positive plano-convex lens using structured-illumination measurement. A computer monitor displays a gray-scale modulated grating pattern, and the deformed fringe and the standard fringe are photographed respectively. The light deflection caused by the phase object is calculated from the phase difference and converted into the wave-front slope, from which the wave-front distribution of the measured object is reconstructed. Liu [5] proposed an active PMD technique with simple operation and high flexibility: in a calibrated camera system, the direction of the ray deflected by the measured object is determined by actively moving the standard sinusoidal fringe plane (without any calibrated mechanical device). Compared with the original ray without the object, the deflection angle and the wave-front gradient can be calculated, and the wave-front is reconstructed by numerical integration. Researchers have also tried to recover the surface of transparent objects by ray tracing. Morris and Kutulakos [6] developed an optimized algorithm based on stereo vision for the instantaneous measurement of time-varying liquid surfaces. In their method, a standard checkerboard is placed under the liquid; for a ray from the camera, the deflection depends only on the liquid surface and the refractive index of the liquid, and each incident ray is refracted only once. Therefore, when the system parameters are known, the liquid surface shape can be determined by a stereo normal-vector consistency constraint.

In this paper, a method based on stereo vision is proposed to measure the shape of a plano-convex lens. Two orthogonal sinusoidal fringe patterns are displayed on a computer screen, which provides enough information to calculate world coordinates in a calibrated stereo-vision system. Once the refractive index is known and the one-to-one correspondences between the refraction points and the imaging points of the two cameras are found, the incident and refracted rays can be obtained. A height of the object is then assumed, and the corresponding normals for the two cameras are calculated by the law of refraction. Because the normal of the object depends only on the shape of the object itself, there is only one true normal direction: only when the two normals of a candidate object point computed from the two cameras agree can the candidate point be accepted as a real three-dimensional point of the object.

2. Principle

When the two-camera system is calibrated, the stereo normal-vector consistency constraint can be used to calculate 3D data for the tested transparent object. In this paper, the cameras are treated as pinhole models, the imaging process is described by perspective projection, and each incident ray is assumed to undergo only one refraction.

2.1. Laws of Refraction in Vector Form

According to the laws of refraction in vector form (Figure 1a), the incident ray, the refracted ray, and the normal are coplanar.
If the incident ray and the refracted ray are known, the normal can be expressed linearly in terms of them as

N = (r − βi) / |r − βi|   (1)

where N is the unit surface normal, i and r are the normalized incident and refracted rays, and β is the ratio n/n′, with n and n′ the refractive indices of air and of the plano-convex lens medium, respectively. As shown in Figure 1b, when calculating the normal direction, the incident ray must pass through the camera's optical center C1 in the pinhole camera model. Assuming that Q1 is the point of the corresponding refracted ray on the refraction plane and S is the object point, the incident and refracted rays can be expressed as [7]:

i = (S − C1) / |S − C1|,   r = (Q1 − S) / |Q1 − S|   (2)

Similarly, if the incident ray and the normal are known, the refracted ray can be obtained from

r = i + ΓN   (3)

where Γ = √(n′²/n² − 1 + (N·i)²) − N·i.

Figure 1. The sketch of refraction: (a) laws of refraction in vector form; (b) correspondence between the pixel point and the refraction point.
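As a sanity check on Equations (1)–(3), the following minimal NumPy sketch (an illustration, not the authors' code) implements both relations and verifies that refracting a ray and then applying Eq. (1) recovers the surface normal. It assumes the normal is oriented along the direction of propagation, so that N·i > 0.

```python
import numpy as np

def unit(v):
    return v / np.linalg.norm(v)

def normal_from_rays(i, r, beta):
    """Eq. (1): N = (r - beta*i)/|r - beta*i|, with unit i, r and beta = n/n'."""
    return unit(r - beta * i)

def refract(i, N, n, n_prime):
    """Eq. (3): r = i + Gamma*N with
    Gamma = sqrt(n'^2/n^2 - 1 + (N.i)^2) - N.i.
    Assumes N is oriented so that N.i > 0 (along propagation)."""
    c = float(np.dot(N, i))
    gamma = np.sqrt((n_prime / n) ** 2 - 1.0 + c * c) - c
    return unit(i + gamma * N)   # |i + Gamma*N| = n'/n before normalization

# Round trip: refract a ray, then recover the normal with Eq. (1).
n, n_prime = 1.0, 1.7                      # air -> lens (index used in Section 3)
i = unit(np.array([0.2, -0.3, -1.0]))      # incident ray (camera -> surface)
N_true = np.array([0.0, 0.0, -1.0])        # surface normal, along propagation
r = refract(i, N_true, n, n_prime)
N_est = normal_from_rays(i, r, n / n_prime)
assert np.allclose(N_est, N_true)
```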

2.2. Stereo Normal Vector Consistency Constraint

When the displayed fringes are imaged through a transparent object, the refracted ray follows the law of refraction and hits a point on the screen, i.e., a pixel on the CCD chip is matched with a point on the screen. This information alone is not enough to locate the transparent object exactly, because there are many candidate object points, as shown in Figure 2a. If a second camera is employed, the exact location can be obtained (Figure 2b).

Figure 2. (a) Ambiguities in a single view; (b) stereo restoration of the 3D shape of objects.

The stereo procedure for determining the position of an object point is as follows (Figure 2b). First, starting from CCD1, the incident ray C1P1 is determined by the camera optical center C1 and the image point P1, where P1 is the image of Q1, the screen point reached by the ray refracted at a point S of the object surface. Evidently, every point on the incident ray C1P1, e.g., S and S1, can satisfy the law of refraction, as shown in Figure 2a. Second, when S and S1 are projected onto the CCD2 image plane, the pixels P2 and P3 are obtained, and they are matched with the screen points Q2 and Q3, respectively. The incident and refracted rays corresponding to camera CCD2 for point S are then SP2 and SQ2, from which a new normal N′ is calculated; for the assumed point S1, another normal N1′ is obtained. The normal difference is

E = arccos(N · N′)   (4)

where N and N′ are the two normals from the left and right cameras, respectively. For each candidate point on the incident ray C1P1 there is a pair of normals. Only when E is zero, i.e., the two normals are collinear, is the point S considered the correct position of the object. In a real measurement E cannot be exactly zero, so a suitable threshold is used to accept the best candidate. With this method, the whole three-dimensional shape of the object can be obtained.
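Under these definitions, the depth search reduces to scanning candidate points along C1P1 and minimizing E. A minimal sketch follows, reusing unit() and normal_from_rays() from the previous sketch; Q2_of below is a hypothetical helper standing in for the phase-based pixel-to-screen matching (project S into camera 2, then look up the matched screen point from the unwrapped phase maps).

```python
import numpy as np
# reuses unit() and normal_from_rays() from the Section 2.1 sketch

def normal_angle(N1, N2):
    """Eq. (4): E = arccos(N1 . N2), clipped for numerical safety."""
    return np.arccos(np.clip(np.dot(N1, N2), -1.0, 1.0))

def search_along_ray(C1, d1, Q1, C2, Q2_of, beta, t_candidates):
    """Scan candidate object points S = C1 + t*d1 along the incident ray of
    camera 1 and keep the candidate whose two normals agree best.
    Q1 is the screen point matched to the camera-1 pixel; Q2_of(S) is a
    hypothetical lookup returning the screen point matched to S in camera 2."""
    best_E, best_S = np.inf, None
    for t in t_candidates:
        S = C1 + t * d1
        N1 = normal_from_rays(unit(S - C1), unit(Q1 - S), beta)
        N2 = normal_from_rays(unit(S - C2), unit(Q2_of(S) - S), beta)
        E = normal_angle(N1, N2)
        if E < best_E:
            best_E, best_S = E, S
    return best_S, best_E   # accept best_S only if best_E is below a threshold
```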

2.3. Pose Calculation

To calculate the monitor position, sinusoidal fringe patterns are displayed on the monitor [8]. A standard sinusoidal grating intensity distribution can be expressed as

Ix(x, y) = a + b cos(2πx/Px)   (5)

Iy(x, y) = a + b cos(2πy/Py)   (6)

Equations (5) and (6) are one-dimensional fringes along the x and y directions, respectively; a and b are constants, and Px and Py are the fringe periods. The well-known phase-shifting and temporal phase-unwrapping techniques can be used to extract the phase at each pixel. The 3D coordinates on the screen then follow from

x = (ϕu(u, v)/(2π) + kx0) Px
y = (ϕv(u, v)/(2π) + ky0) Py   (7)
z = 0

where kx0 and ky0 are constants fixed by the starting point of the phase unwrapping, and ϕu(u, v) and ϕv(u, v) are the unwrapped phase distributions of the two fringe directions. The monitor position in the camera coordinate system is thus obtained. When the tested object is placed above the monitor, the same pixel sees a different point of the monitor, and the corresponding world coordinates (x, y, z) are obtained from the deformed phase distributions via the known extrinsic parameters.
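For illustration, a minimal sketch of this step; the paper does not specify the number of phase-shifting steps, so a four-step (π/2) sequence is assumed here, and the temporal unwrapping stage is omitted.

```python
import numpy as np

def four_step_phase(I0, I1, I2, I3):
    """Wrapped phase from four fringe images shifted by pi/2:
    I_k = a + b*cos(phi + k*pi/2)  =>  phi = atan2(I3 - I1, I0 - I2).
    Temporal phase unwrapping (not shown) then removes the 2*pi jumps."""
    return np.arctan2(I3 - I1, I0 - I2)

def screen_coords(phi_u, phi_v, Px, Py, kx0=0, ky0=0):
    """Eq. (7): map unwrapped phases to monitor coordinates in the screen
    plane z = 0. Px, Py are the fringe periods (in screen units);
    kx0, ky0 fix the unwrapping origin."""
    x = (phi_u / (2 * np.pi) + kx0) * Px
    y = (phi_v / (2 * np.pi) + ky0) * Py
    z = np.zeros_like(np.asarray(x, dtype=float))
    return x, y, z
```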

3. Simulation

In the simulation, the resolution of both cameras is 1200 × 1600, and their focal lengths are 5905.1 and 5862.4 pixels. The refractive index is 1.7. The tested object is assumed to be a sphere with a radius of 100 mm; it is also assumed that there is only one refraction and that the refracted ray hits the fringe screen, which lies in the plane z = 0 mm. The tested object is shown in Figure 3a in the left camera coordinate system. Figure 3b shows the relationship between the angle and the candidate z coordinate for one pixel. The restored object after one calculation is shown in Figure 3c. The final relationship between the angle and the candidate z coordinates after the iterative process, and the final object shape, are shown in Figure 3d,e. The reconstruction error is shown in Figure 3f. The simulated original and deformed gratings are shown in Figure 4a,b.

Figure 3. The simulation results: (a) the tested object in the left camera coordinate system; (b) the relationship between the angle and the candidate z coordinate for one pixel; (c) the restored object; (d) the final relationship between the angle and the candidate z coordinates after the iterative process; (e) the final object shape; (f) the reconstruction error.
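The deformed grating in the simulation can be synthesized pixel by pixel with a single-refraction forward trace. The sketch below follows the stated assumptions (spherical surface, one refraction, screen at z = 0) and reuses unit() and refract() from the Section 2.1 sketch; it is an illustration of the forward model, not the authors' implementation.

```python
import numpy as np
# reuses unit() and refract() from the Section 2.1 sketch

def trace_pixel(C, d, center, R, n, n_prime):
    """Single-refraction forward trace: intersect the pixel ray C + s*d with
    the sphere (center, R), refract once, and intersect the refracted ray
    with the screen plane z = 0."""
    oc = C - center
    b = np.dot(d, oc)
    disc = b * b - (np.dot(oc, oc) - R * R)
    if disc < 0:
        return None                       # this pixel misses the object
    s = -b - np.sqrt(disc)                # nearest intersection
    S = C + s * d                         # refraction point on the sphere
    N = (S - center) / R                  # outward unit normal
    if np.dot(N, d) < 0:
        N = -N                            # the Gamma formula needs N.d > 0
    r = refract(d, N, n, n_prime)
    t = -S[2] / r[2]                      # march to the screen plane z = 0
    return S + t * r                      # screen point sampled by this pixel
```

The deformed fringe intensity at the pixel is then Eq. (5) or (6) evaluated at the returned screen point.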

Figure 4. The original grating and the deformed grating: (a) vertical; (b) horizontal.

4. Experiment

In the experiment, the setup and the tested plano-convex lens are shown in Figure 5a,b. The monitor is a Philips 170S with a resolution of 1280 × 1024, a lateral resolution of 0.264 mm/pixel, and a sinusoidal fringe period of 16 pixels. The resolution of the two cameras is 1200 × 1600, with a pixel size of 4.4 μm × 4.4 μm; their focal lengths are 5905.1 and 5862.4 pixels. The standard sinusoidal fringes displayed on the monitor are shown in Figure 5c,d. Note that there is no significant deformation in the captured fringe images when the tested plano-convex lens is placed directly on the monitor, as shown in Figure 5e,f. Therefore, the tested object is placed above the monitor to produce significant deformation, as shown in Figure 5g,h. However, this makes the points at which the refracted rays leave the plano-convex lens unknown. We therefore introduce another measurement, moving the screen to obtain a second point on each refracted ray, which gives the exact direction of the refracted ray. Assuming that the location of the back side of the plano-convex lens is known, all the exit points of the refracted rays on the backside surface can then be obtained. The process is shown in Figure 6, and Figure 7 shows the relationship between the candidate object position and the angle, together with the preliminary results.

Figure 5. (a) The experimental setup; (b) the tested plano-convex lens; (c,d) the captured original standard sinusoidal fringes in the vertical and horizontal directions, respectively; (e,f) the fringe images with the plano-convex lens placed directly on the monitor; (g,h) the fringe images with the plano-convex lens placed above the monitor.
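Recovering the refracted-ray direction from the two screen positions, and its exit point on the assumed backside plane, is then a small geometric step. A minimal sketch, assuming the screen displacement and the backside position z_back are known from calibration (function names are illustrative):

```python
import numpy as np

def refracted_ray_from_two_screens(Q_near, Q_far):
    """Direction of an outgoing refracted ray from its intersections with the
    screen at two calibrated positions (both points already expressed in the
    camera coordinate frame)."""
    d = Q_far - Q_near
    return d / np.linalg.norm(d)

def backside_exit_point(Q, d, z_back):
    """Intersection of the recovered refracted ray with the (assumed known)
    flat back side of the lens at z = z_back."""
    t = (z_back - Q[2]) / d[2]
    return Q + t * d
```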

Figure 6. The experiment process.

Figure 7. (a) The relationship between the candidate object position and the angle; (b) the preliminary results.

5. Conclusions

Based on the active PMD technique, a new method for restoring the three-dimensional shape of transparent phase objects is proposed in this paper. The tested object is placed between the calibrated CCD cameras and the monitor, the standard and deformed fringes are captured by the CCDs, and the monitor is moved to complete a second measurement. According to the stereo normal-vector consistency constraint, the 3D shape of the tested object can then be restored. The simulation shows that the method is reasonable and feasible. In the experiment, the minimum angle and the preliminary results are found according to the stereo normal-vector consistency constraint. There is no doubt that improving the guessed backside positions will lead to more accurate three-dimensional reconstruction of the tested object.

Acknowledgments: The authors acknowledge the support of the National Key Scientific Apparatus Development Project (2013YQ49087901).

References

1. Petz, M.; Tutsch, R. Measurement of optically effective surfaces by imaging of gratings. Proc. SPIE 2003, 5144, 288–294.
2. Knauer, M.C.; Häusler, G. Phase measuring deflectometry: A new approach to measure specular free-form surfaces. Proc. SPIE 2004, 5457, 366–376.
3. Kaminski, J.; Lowitzsch, S.; Knauer, M.C.; Häusler, G. Full-Field Shape Measurement of Specular Surfaces. In Fringe 2005; Springer: Berlin/Heidelberg, Germany, 2006; pp. 372–379.
4. Canabal, H.; Alonso, J. Automatic wavefront measurement technique using a computer display and a charge-coupled device camera. Opt. Eng. 2002, 41, 822–826.
5. Liu, Y. Research on Key Technology and Application of Phase Measurement Deflection. Ph.D. Thesis, Sichuan University, Chengdu, China, 2007.

6. Morris, N.J.W.; Kutulakos, K.N. Dynamic Refraction Stereo. IEEE Trans. Pattern Anal. Mach. Intell. 2011, 33, 1518–1531.
7. Gomit, G.; Chatellier, L.; Calluaud, D.; David, L. Free surface measurement by stereo-refraction. Exp. Fluids 2013, 54, 1540.
8. Liu, Y.; Su, X. Camera calibration method based on phase measurement. Opto-Electron. Eng. 2007, 34, 65–69.

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).