Optical 3D Sensors for Real Applications: Potentials and Limits / Applications pratiques de capteurs optiques tridimensionnels : potentiel et limites


Abstract

Optical 3D sensors measure local distances or the shape of surfaces, from the nanometer regime to the meter regime. Surprisingly, only three different physical mechanisms of signal formation are necessary to cover this range. These mechanisms determine different limits of the ultimate measuring uncertainty. We will discuss those limits of optical 3D sensors and give rules for selecting the proper sensors for different applications.

Les capteurs optiques tridimensionnels permettent de mesurer des distances locales ou la forme de surfaces, dans un domaine allant du nanomètre au mètre. Étonnamment, seulement trois principes physiques différents décrivant la formation du signal sont nécessaires pour couvrir ce domaine. Ces principes déterminent les limites de l'incertitude de mesure qu'il est possible d'obtenir. Nous discuterons les limites des capteurs optiques tridimensionnels et donnerons quelques règles pour choisir le(s) capteur(s) approprié(s) pour différentes applications.

Peter Klinger (1), peter.klinger@physik.uni-erlangen.de; Klaus Veit (1,2), veit@3d-shape.com; Gerd Häusler (1), haeusler@physik.uni-erlangen.de; Stefan Karbacher (2), karbacher@3d-shape.com; Xavier Laboureux (1,2), laboureux@3d-shape.com

(1) Chair for Optics, University of Erlangen-Nuremberg, Staudtstrasse 7/B2, D-91058 Erlangen, Germany, +49-9131-852-8372, kerr.physik.uni-erlangen.de
(2) 3D-SHAPE GmbH, Henkestr. 127, D-91052 Erlangen, Germany, +49-9131-977959-0, www.3d-shape.com

Introduction

Most problems of industrial inspection, reverse engineering and virtual reality require data about the geometrical shape of objects in 3D space. Such 3D data offer advantages over 2D data: shape data are invariant against changes of the illumination, soiling, and object motion. Unfortunately, those data are much more difficult to acquire than video data about the two-dimensional local reflectivity of objects. In our talk we will discuss the physics of 3D sensing and address the following subjects: different types of illumination (coherent or incoherent, structured or unstructured), the interaction of light with matter (coherent or incoherent, at rough or at smooth surfaces), and the consequences of Heisenberg's uncertainty relation. Knowledge of the physical limits of the measuring uncertainty enables the design of optimal sensors that work at those limits, and helps to judge available sensors. We will show that the vast number of known optical 3D sensors is based on only three different principles. The three principles differ in how the measuring uncertainty scales with the object distance. We will further see that with only two or three different sensors the great majority of problems of automatic inspection or virtual reality can be solved. We will not explain many sensors in detail; rather, we will discuss the potentials and limitations of the major sensor principles, for the physicist as well as for the benefit of the user of optical 3D sensors: laser triangulation, phase-measuring triangulation, and white-light interferometry on rough surfaces. As mentioned above, it turns out that with this set of sensors, 3D data of objects of different kinds and materials can be acquired. The measuring uncertainty ranges from about 1 nanometer to a few millimeters, depending on the principle and on the measuring range.
We will give examples of the potential of each sensor, by examples of measured objects and by a discussion of the physical and technological drawbacks. We will specifically address the interests of potential users of these sensors concerning the applicability to real problems. Here, we briefly explain the three principles. In laser triangulation systems we project a laser spot onto the surface under test from a certain direction of illumination, and we observe the spot with a video line array from a different direction of observation. The angle between the two directions is called the angle of triangulation, see Figure 1. If the object distance changes, the lateral position of the spot image changes as well. With simple geometric calculations, we can evaluate the distance of the spot from its lateral position. There is a straightforward improvement of laser-spot triangulation: projecting a line instead of a point. Figure 1: Principle of triangulation.
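The geometric calculation mentioned above can be sketched in a few lines. This is a minimal thin-lens approximation, not a calibrated sensor model; the parameter names (pixel pitch, magnification, triangulation angle) are illustrative assumptions.

```python
import math

def depth_from_spot_shift(pixel_shift, pixel_pitch, magnification,
                          triangulation_angle_deg):
    """Convert the lateral shift of the imaged laser spot on the line
    array into a change of object distance.

    A real sensor replaces this thin-lens approximation with a full
    calibration; here, delta_z = lateral_shift / sin(theta).
    """
    # Back-project the pixel shift into object space.
    lateral_shift = pixel_shift * pixel_pitch / magnification
    # Triangulation geometry relates lateral shift and depth change.
    return lateral_shift / math.sin(math.radians(triangulation_angle_deg))

# Example: 12-pixel shift on a 10 um-pitch line array, 0.5x magnification,
# 30 degree triangulation angle.
dz = depth_from_spot_shift(12, 10e-6, 0.5, 30.0)
```

The same relation applies per column when a line is projected instead of a spot ("laser sectioning"), which is why the line variant needs only a one-dimensional scan.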

This is sometimes called laser sectioning, because an observing video camera sees a profile ("cross section") of the surface under test. To acquire the entire surface, a one-dimensional scan of the laser line over the object is necessary. With phase-measuring triangulation (PMT) we can proceed further, from a line sensor to an area sensor that measures the shape z(x, y) of an entire surface patch without any scanning. The basic idea is to project a grid pattern onto the object. If the object surface is curved, the camera observes curved grid lines. If we project sinusoidal patterns with different phase shifts, it can be shown that from at least three exposures we can derive the local phase of the grid image and, hence, the distance of each object point (Figure 2). It is possible to project a perfect sinusoidal pattern with a binary mask, by using an astigmatic projection lens system (Figure 3). Figure 2: Sinusoidal fringes projected onto the surface of a half sphere, observed from the camera's viewpoint. Figure 3: Principle of astigmatic projection for phase-measuring triangulation.
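The phase evaluation from phase-shifted exposures can be illustrated for a single camera pixel. The text requires at least three exposures; the sketch below uses the common four-step variant (shifts of 90 degrees), which is one possible choice, not necessarily the algorithm used by the authors.

```python
import math

def phase_from_four_steps(i0, i1, i2, i3):
    """Recover the local fringe phase of one pixel from four exposures
    with phase shifts of 0, 90, 180 and 270 degrees.

    With I_k = A + B*cos(phi + k*pi/2):
      i3 - i1 = 2*B*sin(phi),  i0 - i2 = 2*B*cos(phi),
    so atan2 returns phi independently of bias A and modulation B.
    """
    return math.atan2(i3 - i1, i0 - i2)

# Synthetic pixel: bias 100, modulation 50, true phase 1.0 rad.
true_phase = 1.0
samples = [100 + 50 * math.cos(true_phase + k * math.pi / 2)
           for k in range(4)]
recovered = phase_from_four_steps(*samples)
```

Because bias and modulation cancel, the evaluation is robust against reflectivity variations across the object; the recovered phase map is then converted to distances via the triangulation geometry.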
We make use of the fact that the reference wave and the object wave display interference contrast only if the path-length difference is smaller than the coherence length of the source (Figure 4). This allows us to measure the shape of macroscopic objects with an uncertainty of only one micrometer. It should be noted that in interferometric sensors illumination and observation are coaxial; hence, we can look into narrow holes.
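The per-speckle evaluation can be sketched as follows: scan the object along z, record the correlogram of one pixel, and take the scan position of maximum interference contrast as the surface height. A real coherence radar demodulates or fits the coherence envelope; picking the sample of largest modulation is a deliberately minimal stand-in.

```python
import math

def surface_height_from_correlogram(z_positions, intensities, background):
    """Locate the surface height as the scan position where the
    interference modulation |I - background| of one speckle is largest,
    i.e. where the path-length difference to the reference is zero."""
    best = max(range(len(intensities)),
               key=lambda k: abs(intensities[k] - background))
    return z_positions[best]

# Synthetic correlogram: Gaussian coherence envelope (width ~ coherence
# length, here 1 um) around a surface at z = 3.0 um; fringe period 0.4 um.
surface_z, coherence_len = 3.0, 1.0
zs = [0.01 * k for k in range(600)]  # scan 0..6 um in 10 nm steps
corr = [100 + 50 * math.exp(-((z - surface_z) / coherence_len) ** 2)
        * math.cos(2 * math.pi * (z - surface_z) / 0.4) for z in zs]
z_est = surface_height_from_correlogram(zs, corr, 100)
```

Note that the fringe phase inside the speckle is arbitrary on rough surfaces; only the position of the contrast maximum carries distance information, which is exactly why the envelope, not the phase, is evaluated.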

Figure 4: Principle of the coherence radar (left). The correlogram (right) shows the (temporal) interference pattern in one single speckle while scanning the object along the z-axis.

The Physical Limits of Optical 3D Data Acquisition

There are several reasons that limit the optical acquisition of 3D data. We have developed a theory about the physical limits of optical 3D sensors [1]. According to this theory, there are only three different physical measuring principles for optical 3D sensors. They differ in the way the measuring uncertainty scales with the object distance. The best-known and most widely used sensors are based on triangulation (we call these sensors "type I"). Their performance is limited by coherent noise: the measuring uncertainty scales with the square of the object distance. There is another class of sensors ("type II"): these sensors are based on white-light interferometry on rough surfaces. The signal formation is quite complex and different from classical interferometry, which is why we gave this principle a new name, "coherence radar" [2]. The coherence radar is characterized by the surprising feature that the measuring uncertainty does not scale with the object distance at all, but only with the surface roughness. Classical interferometry at smooth surfaces shows a third type ("type III") of scaling behavior [3]. It features optical averaging over the microtopography: the measuring uncertainty is proportional to the inverse of the standoff.

Conclusion

It is quite useful to understand the physical limits of optical 3D sensors, because we can judge whether existing sensors already reach the physical limits or whether there is room for technical improvement (or whether the advertising of a sensor claims better performance than physics allows). In our group, our ambition is to build sensors that reach those physical limits.
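The three scaling laws can be condensed into a small comparison. The prefactors below are placeholders chosen for illustration, not calibrated sensor data; only the dependence on the standoff reflects the theory above.

```python
def measuring_uncertainty(sensor_type, standoff, reference_uncertainty,
                          reference_standoff=1.0, surface_roughness=1e-6):
    """Illustrative scaling of the measuring uncertainty with standoff
    for the three sensor types (all constants are placeholders)."""
    ratio = standoff / reference_standoff
    if sensor_type == "I":    # triangulation: limited by coherent noise,
        return reference_uncertainty * ratio ** 2   # scales with z^2
    if sensor_type == "II":   # coherence radar: distance-independent,
        return surface_roughness                    # set by roughness only
    if sensor_type == "III":  # classical interferometry on smooth
        return reference_uncertainty / ratio        # surfaces: scales as 1/z
    raise ValueError(sensor_type)

# Doubling the standoff quadruples the type-I uncertainty, halves the
# type-III uncertainty, and leaves the type-II uncertainty unchanged.
u1 = measuring_uncertainty("I", 2.0, 1e-5)
u2 = measuring_uncertainty("II", 2.0, 1e-5)
u3 = measuring_uncertainty("III", 2.0, 1e-5)
```

This comparison makes the selection rule concrete: at a large standoff a triangulation sensor degrades fastest, while the coherence radar keeps its roughness-limited uncertainty.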
Yet it is not sufficient to know the physical limits, because there are many non-physical boundary conditions that may keep us from reaching them. Such boundary conditions are, for example, specularly reflecting surfaces, volume scatterers, strongly tilted surfaces, a large dynamic range of reflectivity, and moving objects (living people). They require not only physical knowledge but also a careful choice and design of the technology. Our sensor based on phase-measuring triangulation [4] incorporates some measures to overcome the difficulties above, and is specifically fast for medical applications.

References

[1] G. Häusler, P. Ettl, M. Schenk, G. Bohn, I. Laszlo. Limits of Optical Range Sensors and How to Exploit Them. In T. Asakura, ed., Trends in Optics and Photonics, ICO IV, Springer Series in Optical Sciences, Vol. 74, pp. 328-342. Springer, Berlin/Heidelberg/New York, 1999.
[2] T. Dresel, G. Häusler, H. Venzke. Three-dimensional sensing of rough surfaces by coherence radar. Appl. Opt. 31, No. 7 (1992), pp. 919-925.
[3] G. Häusler, M. B. Hernanz, R. Lampalzer, H. Schönfeld. 3D Real Time Camera. In W. Jüptner and W. Osten, eds., Fringe '97, 3rd International Workshop on Automatic Processing of Fringe Patterns, 1997.
[4] M. Gruber, G. Häusler. Simple, robust and accurate phase-measuring triangulation. Optik 89, No. 3 (1992), pp. 118-122.