Information Theoretical Optimization for Optical Range Sensors


Christoph Wagner and Gerd Häusler
Department of Physics, Chair of Optics, Friedrich-Alexander-University Erlangen-Nuremberg, Staudtstr. 7/B2, Erlangen, Germany

Most of the known optical range sensors require a large amount of 2D raw data from which the 3D data is decoded, at considerable cost. The cost arises from expensive hardware as well as from the time necessary to acquire the images. We address the question, in terms of information theory, of how one can acquire a maximum of shape information with a minimum amount of image raw data. It is shown that the amount of raw data needed can be greatly reduced by proper optical redundancy reduction. From these considerations, a 3D sensor is introduced that needs only a single color (RGB) raw image and still delivers data with only about 2 µm longitudinal measurement uncertainty.

© 2003 Optical Society of America. This paper was published in Applied Optics 42(27), 20 Sept. 2003.

OCIS codes:

1. Introduction

An important question in three-dimensional (3D) metrology, as in many other fields of science, is the question of the fundamental limits. Why are limit questions important? Firstly, to satisfy our scientific curiosity; secondly, to define limits for technology to strive towards, since surpassing them is unrealistic. In this paper we will pose the question: how much two-dimensional (2D) raw data do we need in order to measure the shape of object surfaces in 3D space? This question is important because common optical range sensors require a large amount of image raw data in order to eventually calculate the 3D data. Since the acquisition of raw data is expensive, it would be of great value to find sensor principles which are less demanding in terms of raw data. We will discuss how the amount of raw data can be managed better and what an economical 3D sensor should look like.

Before we come back to this question we want to mention two other limits which have been discussed in the past literature. The first limiting factor concerns the fundamental measurement uncertainty of optical range sensors. It turns out that even with enormous technical effort, there is a physical limit to the measurement uncertainty which cannot be overcome. Ingelstam was one of the first authors who discussed the physical limits of classical interferometry. 1 The limits of triangulation were studied by Dorsch et al., 2 who showed that for all types of triangulation sensors this measurement uncertainty is due to speckle noise. Speckle noise does not only occur with laser illumination, but is also present in partially coherent illumination. Speckle noise is the limiting factor when localizing an optically rough object in space. It can be shown that this uncertainty can also be derived from the application of Heisenberg's uncertainty principle.

Besides classical interferometry and triangulation, there is one more physical principle for range sensors: time-of-flight measurement, which includes white light interferometry on optically rough surfaces. 3-5 It is remarkable that for these types of 3D sensors, the measuring uncertainty is not given by the apparatus but by the roughness of the object surface.

Now we turn to the question of localization (registration) accuracy, which is important when trying to achieve a full 360-degree 3D view of an object. The acquisition of 360-degree views requires the registration of several single 3D views. The registration is limited by the accuracy of the localization. This limit was investigated in a recent publication. 6 The investigations show that the localization accuracy is directly related to the surface curvature and the noise. This result is a 3D generalization of results achieved earlier by Yaroslavsky. 7

Now that the ultimate measurement uncertainty of 3D sensors and the ultimate localization accuracy are known, we return to the question posed in the beginning: how much 2D raw data do we have to acquire to find the shape of objects in 3D space? Obviously, different 3D sensors have different requirements for image raw data. Stereoscopic vision needs only two 2D images to achieve some kind of 3D data acquisition. Active triangulation by fringe projection 8 needs raw data from at least three 2D images. Active triangulation by laser sectioning needs a 2D video target to acquire only one single 1D profile. Other sensors are based on time-of-flight techniques as described for example by Ulich. 9 With an array of high-bandwidth photodetectors it is possible to measure the distance of many object points just by measuring the time of flight of a laser pulse that is scattered back by

the object. In principle, detecting one single pulse at each pixel of the array is sufficient to acquire shape data. Yet there is a very high price to pay in terms of temporal bandwidth. Even with expensive high-bandwidth detectors, only a depth uncertainty of about 10 mm can be achieved. With white light interferometry (on smooth and rough surfaces) we can obtain extremely high accuracy and large depth of field, however, at the cost of acquiring hundreds of raw images.

This collection of examples poses some questions.

1. What is the minimum amount of raw data needed to achieve a required performance?
2. Or, inversely: how can we exploit the raw data in the best way to achieve optimal accuracy and a large measuring range?
3. Yet another formulation would be: how can we build a 3D sensor which is able to acquire only the necessary minimum of data?

The tool to answer these questions is information theory. We will consider the process of optical 3D measurement as a communication system, and therefore we refer to information theory. Information theory, according to Shannon, states the maximum amount of information that can be transmitted from location A to location B without errors. This amount of information is described by the so-called channel capacity. In our model, the channel will be attributed to the observation branch of the sensor. 18 Since channel capacity is expensive, our aim is to provide only as much capacity as necessary to achieve a desired quality of 3D data (question 3 above). Hence, another way to pose the problem is the achievement of the highest possible channel capacity with given technology.
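The notion of channel capacity invoked here can be made concrete for the additive white Gaussian noise channel treated in section 3. The following minimal sketch (our own illustration, not part of the paper) evaluates Shannon's formula:

```python
import math

def awgn_capacity(bandwidth_hz: float, snr: float) -> float:
    """Shannon capacity (bit/s) of an additive white Gaussian noise
    channel: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1.0 + snr)

# Example: a 1 MHz channel with an SNR of 255 can carry
# 1e6 * log2(256) = 8 Mbit/s without errors.
rate = awgn_capacity(1e6, 255.0)
```

Below the capacity, an arbitrarily low error rate is achievable; above it, errors are inevitable, which is why the subchannel capacities derived in section 3 bound the sensor's performance.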

The insights deduced from the model will lead to answers to the above questions, and eventually to a range sensor that is extremely economical in terms of its raw-data requirement while displaying high accuracy.

In section 2 we will introduce a general communication model of optical 3D measurement. In section 3 we will focus on the channel of the communication system. The properties of the source and the source encoding will be considered in section 4. In section 5 the results will be applied in order to implement an advantageous sensor.

2. Optical 3D measurements modelled as communication system

The aim of this section is to formulate the process of 3D measurement as a communication system. 19,20 A communication system consists of a source, an encoder, a channel, a decoder and a sink (Fig. 1 a)). 21 The goal of such a system is to transmit information from the source to the sink, separated by a distance in space or time. This distance is referred to as the channel. The channel is a crucial component of the system, as it limits the amount of information that can be transmitted in the presence of noise. In many cases the source signal cannot be transmitted directly over the channel; it is necessary to encode the signal. At the receiving end of the channel, the signal then has to be decoded.

We have to translate this model into sensor language (Fig. 1 b)). The source of information is the physical shape z(x, y) of the object, where by z(x, y) we mean the object surface in three-dimensional space. Usually, we encode the shape signal by proper illumination. There are many kinds of illumination; the most important features, however, are the spatial and temporal structure and the degree of coherence. After the interaction of the illuminating

wave with the object matter, we consider the reflected or scattered object wave as being encoded.

The next component in Fig. 1 b) is the channel. It consists of free-space propagation, optical imaging, and optical-to-electronic conversion including the A/D conversion. 3D sensors comprise different kinds of optical systems, such as simple lenses, lenses with spatial filters (dark field, phase contrast, etc.) or interferometers. One major parameter of these optical systems, used to calculate the maximum amount of information that can be transmitted, is the observation aperture. This maximum amount of transmittable information is called the channel capacity. Note that noise is attributed to the channel in this model. For our 3D measuring process, the sources of noise are: coherent noise, photon noise, electronic noise and quantization noise.

The channel eventually supplies the image raw data to the decoder, which is the last component in the model. The decoder tries to reconstruct the 3D shape data z(x, y) using an appropriate algorithm. Since the measurement will not be perfect, we will call the measured shape z_m(x, y). These data are sent to the sink, which is usually the person interested in the measured shape.

3. The channel

As mentioned above, the channel is a crucial component of the 3D measuring process. In our context it is composed of three different subchannels: the optical channel including free-space propagation, the electronic channel including the O/E conversion, and the discrete channel (A/D conversion). Figure 2 displays a block diagram representing this structure. Any developer of 3D sensors will try to provide a high channel capacity, within the physical,

technical, and economical limits. Therefore we will discuss this property here. However, we will keep in mind not to provide more channel capacity than necessary, in accordance with question 1. This question will be discussed in section 4.

In all communication systems there is a maximum amount of information that can be transmitted over a channel in the presence of noise. This limit is called the channel capacity. If the information rate is below the channel capacity, an error-free transmission is possible. If the rate is higher than the capacity, errors are inevitable. A common model for noisy channels is the continuous-time additive white Gaussian noise (AWGN) channel. The capacity C_T of the continuous-time AWGN channel is known to be

    C_T = B \log_2 (1 + S_{input} / N)    (1)

where B is the one-sided temporal signal bandwidth, S_{input} the power of the channel's input signal, and N the average noise power. 10,22 The capacity is measured in bit/s. As the input signal of the channel is not known, but only the output signal, the following expression is used:

    C_T = B \log_2 ((N + S_{input}) / N) = B \log_2 (S_{output} / N)    (2)

with

    S_{output} = N + S_{input}    (3)

The equations above describe how much information can be transmitted per second. In our case, the total amount of information will be a more useful quantity than this time-normalized channel capacity. Therefore, we will use the term channel capacity to express the total amount of information. In this sense the capacity of the AWGN channel during the

time T is

    C = T B \log_2 (S / N)    (4)

S is written instead of S_{output}, keeping in mind that it is an output signal. After this general introduction we will now discuss the different subchannels.

A. Capacity of the optical channel

In most cases information theory deals with one-dimensional signals in the time domain. The situation in optics is different. The channel is not continuous in time, but continuous in 2D space. Therefore, the maximum amount of information that can be transmitted per image is

    C_{opt} = 2 X Y B_x B_y \log_2 (S_{opt} / N_{opt})    (5)

where X and Y are the lateral extensions of the detector and B_x B_y is the bandwidth in two dimensions. It is assumed that the optical system can be represented by a diffraction-limited lens system of focal length f and an aperture with diameter D, the average wavelength being λ (Fig. 3). It can be shown that the bandwidth is

    B_x B_y = π D^2 / (16 λ^2 f^2)    (6)

The noise observed in the optical channel is caused by the spatial and temporal coherence of light (speckle noise). 23 Photon noise will be attributed to the detector. For large signal-to-noise ratios (low coherence) the speckle noise can be assumed to be Gaussian, which conforms to the AWGN channel model chosen. In this case the SNR can be expressed with the help of the speckle contrast c_{Speckle}:

    S_{opt} / N_{opt} = \langle I \rangle^2 / σ_I^2 = 1 / c_{Speckle}^2    (7)

where \langle I \rangle represents the mean intensity and σ_I the standard deviation of the intensity noise. Combining Eqs. (5) and (7) we obtain Eq. (8), describing the information which can be transmitted through the optical channel:

    C_{opt} = -4 X Y B_x B_y \log_2 (c_{Speckle})    (8)

Using the definition of the optical conductivity

    Λ = A_P A_L / f^2    (9)

with A_P being the area of the pupil and A_L the area of the detector, the channel capacity can then be expressed as

    C_{opt} = -(Λ / λ^2) \log_2 (c_{Speckle})    (10)

From Eqs. (8) and (10) we can derive requirements for a good optical channel that transmits an image with a maximum amount of information about the object: we should use largely incoherent illumination (c_{Speckle} << 1), a large observation aperture, small wavelengths (B_x B_y large), and a large detector array. Figure 4 a) shows a coin that was illuminated with a laser and observed with a small aperture. Figure 4 b) displays the same object illuminated with an extended source with low spatial coherence. These figures illustrate why our research group no longer builds sensors with laser illumination. 23

B. Capacity of the electronic channel

We will now concentrate on the electronic channel. The procedure is analogous to textbook examples. The electronic channel describing the detector is in most cases a matrix of M_x × M_y pixels with an overall size of X × Y. We assume that the size X × Y of the detector and

the size of the optical image are the same. The signal-to-noise ratio of the electronic channel is S_{el} / N_{el}. The capacity of the electronic channel is then

    C_{el} = 2 X Y B_{x,el} B_{y,el} \log_2 (S_{el} / N_{el})    (11)

Since this channel is discrete in space (pixels), the sampling theorem has to be obeyed: 22

    B_{x,el} = M_x / (2 X)    (12)

    B_{y,el} = M_y / (2 Y)    (13)

so that

    C_{el} = (1/2) M_x M_y \log_2 (S_{el} / N_{el})    (14)

For good cameras and high average intensities the main source of noise in the electronic channel is photon noise, which depends only on the number of photons impinging on each detector element. For an average number of \bar{n} photons the standard deviation of the noise is σ_n = \sqrt{\bar{n}}, and the signal-to-noise ratio is

    S_{el} / N_{el} = \bar{n}^2 / σ_n^2 = \bar{n}    (15)

Therefore

    C_{el} = (1/2) M_x M_y \log_2 (\bar{n})    (16)

This result can be easily understood. The information that can be received with a camera grows linearly with the number of pixels and increases logarithmically with the number of photons collected by each pixel. For a good signal-to-noise ratio we need many photons, in conjunction with a high full-well capacity n_{max} (which ensures that the camera is not saturated).
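Eq. (16) is straightforward to evaluate numerically. A small sketch (our own; the camera numbers are illustrative assumptions, not from the paper):

```python
import math

def electronic_capacity(m_x: int, m_y: int, n_mean: float) -> float:
    """Photon-noise-limited detector capacity, Eq. (16):
    C_el = (1/2) * M_x * M_y * log2(n_mean). Poisson photon noise
    gives sigma_n = sqrt(n_mean), so the SNR equals n_mean (Eq. (15))."""
    return 0.5 * m_x * m_y * math.log2(n_mean)

# A 1024 x 1024 camera collecting 65536 photons per pixel on average
# delivers log2(65536)/2 = 8 bits of information per pixel.
c_el = electronic_capacity(1024, 1024, 65536.0)
```

Doubling the photon budget adds only half a bit per pixel, which is why a large full-well capacity is so costly relative to the information it buys.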

C. Capacity of the discrete channel

After the optical signal has been converted into an electronic signal, it is converted into a discrete signal in the A/D converter. The channel model that we use is discrete in intensity and space. With the number of quantization steps m, the capacity of the discrete channel is

    C_{dis} = M_x M_y \log_2 (m)    (17)

Having described the capacities of the three kinds of channels, the results can be combined and conclusions can be drawn. The overall channel capacity cannot be larger than any of the three capacities. The subchannel with the minimum capacity will define an upper bound for the performance of the entire channel. For example, with coherent illumination the main source of noise will probably be the speckle noise in the optical channel. So we should aim for an extended light source with low spatial coherence, until the overall capacity is restricted by either the electronic or the discrete channel. For high intensities, photon noise will be dominant and the electronic channel is the bottleneck of the total system. Of course, we will use an A/D converter that provides a better channel capacity than the electronic channel.

D. Conclusions about the total channel capacity

Optical systems can be built economically, with considerable channel capacity. In general, the bottleneck occurs at the electronic channel, if this is a video camera. Specifically, camera pixels are expensive, especially if they have to provide a large full-well capacity. Hence, we have to design our 3D sensor in a manner which is not too demanding with respect to the electronic channel.
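The three subchannel capacities of Eqs. (10), (16) and (17) can be compared directly; the smallest one is the bottleneck. A sketch with illustrative numbers of our own choosing (not taken from the paper):

```python
import math

def optical_capacity(opt_conductivity: float, wavelength: float,
                     c_speckle: float) -> float:
    """Eq. (10): C_opt = -(Lambda / lambda^2) * log2(c_speckle);
    positive, since c_speckle < 1."""
    return -(opt_conductivity / wavelength**2) * math.log2(c_speckle)

def electronic_capacity(m_x: int, m_y: int, n_mean: float) -> float:
    """Eq. (16): photon-noise-limited detector."""
    return 0.5 * m_x * m_y * math.log2(n_mean)

def discrete_capacity(m_x: int, m_y: int, m_steps: int) -> float:
    """Eq. (17): A/D converter with m quantization steps."""
    return m_x * m_y * math.log2(m_steps)

# Assumed setup: Lambda ~ 1.5e-6 m^2, lambda = 0.5 um, speckle
# contrast 0.05; a 1024 x 1024 camera with ~1e4 photons per pixel;
# an 8-bit A/D converter.
c_opt = optical_capacity(1.5e-6, 0.5e-6, 0.05)
c_el = electronic_capacity(1024, 1024, 1e4)
c_dis = discrete_capacity(1024, 1024, 256)

# The weakest subchannel bounds the whole system; with these numbers
# it is the electronic (photon-noise) channel.
bottleneck = min(c_opt, c_el, c_dis)
```

With a reasonably incoherent source the optical capacity comfortably exceeds both electronic numbers, reproducing the conclusion of the text that the camera, not the optics, is usually the bottleneck.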

4. Properties of the source and the encoder

So far the properties of the channel have been studied. Looking at existing optical 3D sensors, it appears that some developers mainly focus their attention on maximizing the electronic channel capacity. However, this approach is too shortsighted: it might lead to a fairly good sensor, but in general it will not lead to an economical sensor. We should still keep in mind our objective to keep channel-capacity costs to the necessary minimum. The tool which provides a solution to this problem is the encoder (see Fig. 1 b)). The encoding method is of great importance for a good and economical sensor. In order for the whole system to achieve maximum performance, the encoder has to eliminate the source's redundancy. This is especially important for 3D measuring problems, because common shape signals include a huge amount of redundant information.

What does a typical object look like? Is there something all kinds of objects, natural and technical, have in common? The local shape z(x, y) of the object is considered a random process. 24 We will not discuss its probability density function, which is generally not known. For our purposes we turn to second-order statistics: the autocorrelation function of the random process is φ_{zz}(x, y) and its power spectral density is Φ_{zz}(f_x, f_y). Objects typically display a power spectral density which rapidly decreases with higher spatial frequencies f_x and f_y (in x- and y-direction) (Fig. 5 a)). Why is this? Most objects can be represented by a nearly flat surface (low frequencies) with some high-frequency details of small amplitude superimposed on top. (We do not consider exceptions like hedgehog-type objects.) One more important observation is that the stand-off distance z_s is usually much bigger than the measuring range. So the power spectrum of the measured signal displays a high peak

at zero spatial frequency. It is obvious that if we transmit the stand-off distance with high accuracy we put a great load onto the channel. This might be illustrated by the following example. If we want to resolve 1 µm distance steps on a 1 mm high object, from a stand-off of 1 m, this requires an extremely good sensor and thus a high channel capacity. Although the object signal could be represented by just 10 bits/pixel, the stand-off adds another 10 bits. In many cases we are not really interested in measuring the stand-off.

In other words, the source signal z(x, y) is not a white but a colored random process. Such a source is called a source with memory, because its values are correlated (φ_{zz}(x, y) ≠ 0 for some x, y ≠ 0). If a signal with memory is transmitted over the channel, a high channel capacity is necessary, but the channel transmits a significant amount of redundant data. To exploit the channel capacity with high efficiency, we should send a white random process: 25

    Φ_{zz}(f_x, f_y) = const    (18)

and

    φ_{zz}(x, y) = const · δ(x, y)    (19)

Therefore source encoding is necessary. Source encoding removes redundancy from a source signal in order to reach a higher information rate, all the way up to the channel capacity.

What does that imply for 3D metrology? Most optical methods do not reduce the redundant information. Let us consider, as a first example, active triangulation methods. The basic principle is to project a structured light pattern onto the object. The object's surface scatters the light into the pupil of the observation system, and an image of this illuminated surface is generated. The shape z(x, y) can be calculated from the image by finding the local position of the projected pattern details. The distance of each pixel is determined independently of

the neighboring pixels. This independence can be considered an advantage in terms of lateral resolution; however, such a sensor does not make any use of the redundancy within the shape signal. A second example is white light interferometry of rough surfaces. This method gives separate absolute distance data for every pixel, and hence does not reduce the inherent redundancy. This is somewhat different for the third type of sensing, classical interferometry of smooth surfaces. Classical interferometry does not deliver absolute distance data, as the stand-off distance is eliminated. However, the spatial correlation is not used for redundancy reduction. Obviously the mentioned principles are not optimal in the sense of information theory.

In order to reduce the redundancy, an appropriate encoder needs to whiten the power spectrum. In the case of shape signals, high frequencies need to be amplified. Differentiation is a proper operation to achieve a certain whitening. However, it should be emphasized that this differentiation is not done in the channel, but at the very moment when the illuminating light strikes the object. Thus the noise of the channel is added after the differentiation. Differentiation has the property that it can be done very easily. For diffusely scattering objects (Lambert scatterers), the local slope is found directly from the local intensity if the object is illuminated with a parallel light bundle (shape from shading). The differentiation ∂z/∂x of the shape signal yields a partially whitened power spectrum Φ_{z'z'} (Fig. 5 b)). The spectrum is not white and the dc term (stand-off distance) is lost, yet a comparison with the original spectrum Φ_{zz} (Fig. 5 a)) shows the improvement.

We will demonstrate the considerations above by an example. Figure 6 a) shows an image of the object shape z(x, y), where the local height is intensity-encoded (the brighter details have

lower height). Fig. 6 b) displays the derivative ∂z/∂x (riding on a bias). The spectra of the object and of the derivative can be seen in Figs. 6 c) and d).

The reader might wonder whether the differentiation encoding will increase the noise. To check this, two cases are considered, where z(x, y) is assumed to be a perfect shape signal. In the first, sensor A directly uses the shape signal z(x, y), which is sent through the channel where noise n(x, y) is added. An a posteriori differentiation yields

    ∂(z + n)/∂x = ∂z/∂x + ∂n/∂x    (20)

For the second case, sensor B, the situation is different. Sensor B does not send z(x, y) but instead the derivative ∂z/∂x, which is a signal just as reliable as z(x, y). Therefore, the differentiation itself adds no noise. Only the channel is subjected to noise, and the measured signal is

    ∂z/∂x + n    (21)

Comparing the two results in Eqs. (20) and (21), it can be seen that for sensor A the noise has been amplified by the differentiation, whereas for B it has not. It is the sequence of operations that counts. The differentiation encoder proposed here is a type B sensor: the signal is differentiated optically before it is sent over the channel. This is different for most other sensors, where the shape z is measured and the data are rendered afterwards. This explains why even very little noise can be seen clearly in a shaded rendering of 3D data (shaded rendering performs a kind of differentiation). As mentioned before, the differentiation filter is not the perfect encoder, yet it was chosen because it can be implemented optically, 26 even with partial spatial coherence. Moreover, its implementation is very simple.
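The ordering argument of Eqs. (20) and (21) can be checked with a short numerical experiment (our own sketch; the signal, noise level and grid are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4096
x = np.linspace(0.0, 1.0, n)
z = np.sin(2 * np.pi * 3 * x)            # a perfect shape signal
noise = 0.01 * rng.standard_normal(n)    # channel noise
true_slope = np.gradient(z, x)

# Sensor A: transmit z, then differentiate after the channel (Eq. (20)).
slope_a = np.gradient(z + noise, x)      # = dz/dx + dn/dx

# Sensor B: differentiate optically before the channel (Eq. (21)).
slope_b = true_slope + noise             # = dz/dx + n

err_a = np.std(slope_a - true_slope)     # differentiation amplified the noise
err_b = np.std(slope_b - true_slope)     # noise unchanged: err_b << err_a
```

On this grid the a posteriori derivative of the noise is amplified by roughly the reciprocal of the sample spacing, so sensor A's slope error exceeds sensor B's by orders of magnitude.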

Interestingly, the measurement of slope data for optically rough objects has been known for a long time as shape from shading. 27 It has been known even before good optical 3D sensors were developed. Shape from shading is commonly not used for high-precision shape measurements. We have been able to deduce, using information theory, that shape from shading is advantageous in the sense that it encodes the shape signal in a way that exploits the channel capacity effectively. Therefore we will proceed to develop the method further to use it in the context of high-precision and low-cost metrology.

5. Example of application and measurement

We will now briefly explain the idea of shape from shading, or more precisely, photometric stereo, 30 and will discuss some further improvements. The basic idea is to take a series of images with different illumination directions, while the direction of observation remains constant. The local surface slope can be obtained from these images. A surface element normal to the incident light, for example, would show a higher brightness than a surface element with oblique illumination. The method is very effective and simple for Lambertian surfaces, where the object's brightness does not depend on the viewing direction. For the reconstruction of the surface slope, a minimum of three illumination directions is necessary in order to eliminate the usually unknown local reflectivity.

In the example shown here, an improved photometric method has been used. This method is able to cope with the specular reflections typical of industrially shaped metal surfaces. The idea is not to use a point source, but an extended source which provides a higher illumination aperture. In this way, disturbing hot spots, typical for small illumination apertures, can be avoided.
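The per-pixel reconstruction underlying photometric stereo can be sketched for the ideal Lambertian case, I_k = ρ (n · s_k), with K ≥ 3 known light directions. This is our own minimal implementation of Woodham's scheme, not the authors' improved method (which additionally handles specular reflections via extended sources):

```python
import numpy as np

def photometric_stereo(images, light_dirs):
    """Lambertian photometric stereo: `images` is a (K, H, W) stack taken
    under K known unit light directions `light_dirs` of shape (K, 3).
    Solves I_k = rho * (n . s_k) per pixel for b = rho * n, then splits
    the albedo rho from the unit normal n."""
    K, H, W = images.shape
    S = np.asarray(light_dirs, dtype=float)       # (K, 3)
    I = images.reshape(K, -1)                     # (K, H*W)
    b, *_ = np.linalg.lstsq(S, I, rcond=None)     # (3, H*W), least squares
    rho = np.linalg.norm(b, axis=0)               # local reflectivity (albedo)
    n = b / np.where(rho > 0, rho, 1.0)           # unit surface normals
    return rho.reshape(H, W), n.reshape(3, H, W)

# The slopes then follow from the recovered normal:
# dz/dx = -n_x / n_z,  dz/dy = -n_y / n_z.
```

With exactly three lights the least-squares solve reduces to inverting a 3x3 matrix per pixel, which is why three illumination directions suffice to eliminate the unknown reflectivity.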

These considerations are already included in the discussion above about the necessity to avoid spatially coherent illumination. As a result, low measurement uncertainties, down to the micron level, can be demonstrated.

Another improvement concerns the measurement time. The measurements shown in Figs. 7 a) and b) were done with successive images, one for each illumination direction. It is obvious that the object must not move between the exposures to ensure a correct measurement. However, in industry there is a need to integrate extremely fast quality-control systems into the production line. It is desirable to have a single-shot technique to meet these demanding requirements. One solution based on color encoding has been implemented by Häusler and Ritter. 28 Ulich et al. have implemented another one, the so-called Range and Reflectance Mapping technique, using two images which are taken at the same time. 29 Yet another, based on photometric stereo, has been suggested by Woodham, 30 where a color CCD camera is used in order to combine three different illumination situations into a single frame. The three illumination directions are encoded in red, green and blue, and separated again by the color camera. This approach offers the answer to the questions posed in the introduction: it is possible to grab all the necessary raw data for a complete high-quality 3D image within one single 2D frame (3 color channels). We have successfully implemented such a single-shot sensor using a 3-chip CCD camera, as will be shown later.

The method is restricted to objects with uniform color, since otherwise the object's color texture can no longer be separated from the shading effects. For metal surfaces, a uniform reflectivity in terms of color can be assumed. The single-shot method is thus specifically suitable for quality control of metal surfaces. Another restriction concerns the shape of the object. In order to obtain a valid derivative of the shape signal, the object's shape z(x, y) has

to be differentiable. The reader may argue that steps, edges and corners can pose a problem; however, typical objects with only a finite number of discontinuities can always be separated into a set of differentiable areas. For every single area, the above method gives a valid result. In the case of our coin object, the edges did not pose a problem, because they are not real discontinuities but areas with rather steep slopes, which can nevertheless be measured.

Photometric stereo has been used to measure the surface of a one-euro coin. The object was illuminated from different directions (encoding), pictures were taken (channel) and the surface slope was calculated. Finally, the slope was integrated (decoding), yielding the object shape z. With successively recorded images, the profile of the object could be measured with a resolution down to about one micron. Figure 7 a) shows a view of the measured 3D data, while Figure 7 b) shows a shape profile at a relatively flat area of the coin, with the depiction of the Atlantic Ocean. The standard deviation of the shape was 1.2 µm. It is surprising that even the smallest details, like the structure of the European countries as well as little scratches in the Atlantic Ocean, can be seen. This measurement was done according to the guidelines given above, namely: illumination with a high numerical aperture (low spatial coherence), resulting in low speckle contrast and a high optical SNR. This ensures a high channel capacity. Source coding and decoding were performed in a way which removed redundancy from the source.

Additionally, a single-shot sensor has been implemented. A measurement example is shown in Fig. 7 c) and a sample profile of the Atlantic Ocean in Fig. 7 d). The standard deviation of the surface data, in this relatively flat area of the coin, was 1.8 µm; hence the measurement uncertainty in this area must have been below this value.
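The decoding step mentioned above, integrating the measured slopes to recover z(x, y), can be sketched by naive path integration (our own illustration, not the authors' decoder; with noisy slopes a least-squares integration would be preferable):

```python
import numpy as np

def integrate_slopes(p, q, dx=1.0):
    """Recover z(x, y) from slope maps p = dz/dx and q = dz/dy by path
    integration: first column via q, then each row via p. The absolute
    stand-off distance (dc term) is lost, as discussed in section 4."""
    H, W = p.shape
    z = np.zeros((H, W))
    z[:, 0] = np.cumsum(q[:, 0]) * dx                        # down first column
    z[:, 1:] = z[:, [0]] + np.cumsum(p[:, 1:], axis=1) * dx  # along each row
    return z - z.mean()                                      # arbitrary offset removed

# For a tilted plane, e.g. z = 0.3*x + 0.7*y, the reconstruction is
# exact up to the unrecoverable constant offset.
```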
The measurement uncertainty is slightly higher than for the multiple-shot sensor, mainly due to a higher

camera noise of the color camera. Furthermore, the calibration of the sensor is slightly more complicated than the calibration of the multiple-shot sensor, because the gain of each color channel has to be adapted to the properties of the sources and to the uniform color of the object (this explains some remaining deformation in Fig. 7 c)). This is the price which has to be paid for a single-shot sensor, due to the limitations of technology. For a measurement uncertainty of about one micron and a measurement time of some hundred milliseconds, the multi-shot sensor is probably the best solution. For extremely fast measurements (down to some ten microseconds with a flash lamp) the single-shot sensor is preferred.

6. Conclusion and Discussion

In the introduction a question was posed: how much raw data, at minimum, is needed to achieve a required performance? Raw-data redundancy has to be eliminated in order to reduce the costs in hardware and time. The concept of information theory delivers a proper framework for discussing this question. A maximum amount of 3D information about the object is obtained if a high channel capacity is combined with an appropriate encoding and decoding of the shape signal z(x, y). Removing object-signal redundancy through optical means is the basic idea of the encoding scheme, and this idea works quite effectively. Spatial redundancy of typical 3D source signals can be removed by whitening the spectrum. Whitening means, in practice, amplifying high frequencies and removing the stand-off distance of z(x, y). This is a valid manipulation, as in many cases an absolute measurement of distance is not necessary, or it can be supplied by an additional sensor.

This idea leads to a further improvement of the basic photometric stereo method. Assuming there is a finite number of differentiable areas of the object, it is desirable to know

the height of the steps between them. To do this, photometric stereo can be combined with binocular stereo. Photometric stereo is very good at measuring high object frequencies and fine local shapes because of its derivative character. On the other hand, the global shape, including steps, can show a higher measurement uncertainty. This is where binocular stereo is of aid, as here the situation is reversed: binocular stereo is effective at handling the global shape of an object, whereas it gives poor results for measurements of the local shape. This is why photometric stereo and binocular stereo complement each other so well. 31 In addition, binocular stereo combines very well with the single shot concept, because both stereo images can be taken at the same time. Results of these considerations will be presented in a later contribution.

Whitening the spectrum by differentiation can be done optically using the concept of shape from shading. A great advantage of this method is that it works for diffusely reflecting objects, and even for partially specular objects, without amplifying noise. The above concept was extended by two additional ideas: reducing coherent noise by a larger illumination aperture, and reducing the necessary optical time-bandwidth product by using a color camera. Combining the theoretical ideas with the technical abilities, a single shot sensor was designed, needing only one single raw image exposure (3 color channels) and working even on metal surfaces, with a shape uncertainty of the measured data of less than 2 µm.

Further knowledge of the physical measurement uncertainty limit, the object localization limits, and the reduction of 3D signal redundancy may lead to a framework for a complete theory of optical 3D sensing, as well as to better sensor designs.
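The whitening argument from the conclusion can be checked numerically: a shape-like signal has most of its power at low spatial frequencies, whereas its derivative has a nearly flat (white) spectrum. The sketch below uses a synthetic profile with a red, roughly 1/f² spectrum as a stand-in for real shape data; the signal model and all names are our own illustration, not data from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096
x = np.arange(n)

# Synthetic stand-in for a shape signal z(x): integrated white noise has a
# red (roughly 1/f^2) spectrum, as typical object shapes do. Removing the
# fitted line mimics removing the stand-off distance.
z = np.cumsum(rng.standard_normal(n))
z -= np.polyval(np.polyfit(x, z, 1), x)

dz = np.gradient(z)  # differentiation, done optically in shape from shading

def band_power(signal, lo, hi):
    """Mean power in the fractional band [lo, hi] of the one-sided spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))**2
    m = len(spectrum)
    return spectrum[int(lo * m):int(hi * m)].mean()

# Low-frequency power dominates for z (high redundancy) ...
ratio_z = band_power(z, 0.0, 0.1) / band_power(z, 0.4, 0.5)
# ... while the derivative spectrum is roughly flat (whitened).
ratio_dz = band_power(dz, 0.0, 0.1) / band_power(dz, 0.4, 0.5)
```

Since almost all the power of z sits in a few low-frequency components, transmitting z directly wastes channel capacity; after differentiation, the information is spread much more evenly over the band.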

References

1. E. Ingelstam, An optical uncertainty principle and its application to the amount of information obtainable from multiple-beam interference, Arkiv för Fysik 7, no. 24 (1953).
2. R. Dorsch, G. Häusler, and J. Herrmann, Laser triangulation: fundamental uncertainty in distance measurement, Appl. Opt. 33 (1994).
3. T. Dresel, G. Häusler, and H. Venzke, 3-D-sensing of rough surfaces by coherence radar, Appl. Opt. 31 (1992).
4. G. Häusler and G. Leuchs, Physikalische Grenzen der optischen Formerfassung mit Licht, Physikalische Blätter 53, Nr. 5 (1997).
5. G. Häusler, P. Ettl, M. Schenk, G. Bohn, and I. Laszlo, Limits of optical range sensors and how to exploit them, in Trends in Optics and Photonics, ICO IV, T. Asakura, ed. (Springer-Verlag, Berlin, Heidelberg, New York, 1999), Vol. 74.
6. X. Laboureux and G. Häusler, Localization and registration of three-dimensional objects in space: where are the limits? Appl. Opt. 28 (2001).
7. L. Yaroslavsky, The theory of optimal methods for localization of objects in pictures, in Progress in Optics XXXII, E. Wolf, ed. (Elsevier Science Publishers B. V., 1993).
8. V. Srinivasan, H. Liu, and M. Halioua, Automated phase-measuring profilometry of 3-D diffuse objects, Appl. Opt. 23, 3105 (1984).
9. B. L. Ulich, Method and apparatus for three dimensional range resolving imaging, U.S. Patent 5,249,046 (1993).
10. C. Shannon and W. Weaver, The mathematical theory of communication (University of Illinois Press, 1949).
11. R. Fano, Transmission of information (MIT Press, Cambridge, Mass., 1961).
12. J. Wozencraft and I. Jacobs, Principles of communication engineering (John Wiley & Sons, New York, 1965).
13. R. Gallager, Information theory and reliable communications (John Wiley & Sons, New York, 1968).
14. A. Viterbi and J. Omura, Principles of digital communication and coding (McGraw-Hill, New York, 1979).
15. R. Blahut, Principles and practice of information theory (Addison-Wesley, Reading, Mass., 1987).
16. T. Cover and J. Thomas, Elements of information theory (John Wiley & Sons, New York, 1991).
17. R. Johannesson, Informationstheorie - Grundlagen der (Tele-)Kommunikation (Addison-Wesley, Studentlitteratur, 1992).
18. M. Neifeld, Information, resolution, and space-bandwidth product, Optics Letters 23 (1998).
19. R. Röhler, Informationstheorie in der Optik (Wissenschaftliche Verlagsgesellschaft, Stuttgart, 1967).
20. B. Frieden, Probability, statistical optics, and data testing (Springer-Verlag, Berlin, Heidelberg, New York, 2001).
21. R. Johannesson, Informationstheorie - Grundlagen der (Tele-)Kommunikation (Addison-Wesley, Studentlitteratur, 1992), p.
22. R. Johannesson, Informationstheorie - Grundlagen der (Tele-)Kommunikation (Addison-Wesley, Studentlitteratur, 1992), pp.
23. G. Häusler and J. Herrmann, Physical limits of 3D-sensing, in Optics, Illumination, and Image Sensing for Machine Vision VII, D. Svetkoff, ed., Proc. SPIE 1822 (1992).
24. B. Frieden, Probability, statistical optics, and data testing (Springer-Verlag, Berlin, Heidelberg, New York, 2001), Chap.
25. R. Röhler, Informationstheorie in der Optik (Wissenschaftliche Verlagsgesellschaft, Stuttgart, 1967), pp.
26. R. Röhler, Informationstheorie in der Optik (Wissenschaftliche Verlagsgesellschaft, Stuttgart, 1967), pp.
27. B. Horn and M. Brooks, Shape from shading (MIT Press, Cambridge, Massachusetts, London, 1989).
28. G. Häusler and D. Ritter, Parallel three-dimensional sensing by color-coded triangulation, Appl. Opt. 32 (1993).
29. B. L. Ulich, P. Lacovara, S. E. Moran, and M. J. DeWeert, Recent results in imaging lidar, in Advances in Laser Remote Sensing for Terrestrial and Oceanographic Applications, Proc. SPIE 3059 (1997).
30. R. Woodham, Photometric method for determining surface orientation from multiple images, in Shape from shading, B. Horn and M. Brooks, eds. (MIT Press, Cambridge, Massachusetts, London, 1989).
31. A. Blake, A. Zisserman, and G. Knowles, Surface description from stereo and shading, in Shape from shading, B. Horn and M. Brooks, eds. (MIT Press, Cambridge, Massachusetts, London, 1989).

List of Figures

Fig. 1. Block diagram of a) information system, b) 3D metrology system.
Fig. 2. Channel structure.
Fig. 3. Optical system; for large object distances the aperture ratio is approximately D/f.
Fig. 4. Image of a coin illuminated with a) a laser and b) incoherent illumination.
Fig. 5. a) Typical power spectrum Φzz of an object shape and b) typical power spectrum Φz'z' of a shape derivative.
Fig. 6. Intensity-encoded image of a) the shape z(x, y) and b) the shape derivative ∂z/∂x. Spectrum of c) the object height z(x, y) and d) the height derivative ∂z/∂x.
Fig. 7. a) Rendered 3D data and b) sample profile for successive exposures. c) Rendered 3D data and d) sample profile for the single shot sensor.

[Figure 1: block diagram. a) source, encoder, channel, decoder, sink; b) illumination/object, optics, electronics, algorithm, 3D data. C. Wagner and G. Häusler]

[Figure 2: channel structure; the channel consists of an optical channel, an electronic channel, and a discrete channel.]

[Figure 3: optical system with aperture D and focal length f.]

[Figure 4: image of a coin under a) laser and b) incoherent illumination.]

[Figure 5: power spectra Φzz and Φz'z' versus spatial frequency fx.]

[Figure 6: intensity-encoded images of the shape and its derivative, with their spectra.]

[Figure 7: rendered 3D data and sample profiles, z (mm) versus x (mm).]


More information

CS5670: Computer Vision

CS5670: Computer Vision CS5670: Computer Vision Noah Snavely Light & Perception Announcements Quiz on Tuesday Project 3 code due Monday, April 17, by 11:59pm artifact due Wednesday, April 19, by 11:59pm Can we determine shape

More information

Fringe modulation skewing effect in white-light vertical scanning interferometry

Fringe modulation skewing effect in white-light vertical scanning interferometry Fringe modulation skewing effect in white-light vertical scanning interferometry Akiko Harasaki and James C. Wyant An interference fringe modulation skewing effect in white-light vertical scanning interferometry

More information

Understanding Variability

Understanding Variability Understanding Variability Why so different? Light and Optics Pinhole camera model Perspective projection Thin lens model Fundamental equation Distortion: spherical & chromatic aberration, radial distortion

More information

Comparison between 3D Digital and Optical Microscopes for the Surface Measurement using Image Processing Techniques

Comparison between 3D Digital and Optical Microscopes for the Surface Measurement using Image Processing Techniques Comparison between 3D Digital and Optical Microscopes for the Surface Measurement using Image Processing Techniques Ismail Bogrekci, Pinar Demircioglu, Adnan Menderes University, TR; M. Numan Durakbasa,

More information

Image based 3D inspection of surfaces and objects

Image based 3D inspection of surfaces and objects Image Analysis for Agricultural Products and Processes 11 Image based 3D inspection of s and objects Michael Heizmann Fraunhofer Institute for Information and Data Processing, Fraunhoferstrasse 1, 76131

More information

Extensions of One-Dimensional Gray-level Nonlinear Image Processing Filters to Three-Dimensional Color Space

Extensions of One-Dimensional Gray-level Nonlinear Image Processing Filters to Three-Dimensional Color Space Extensions of One-Dimensional Gray-level Nonlinear Image Processing Filters to Three-Dimensional Color Space Orlando HERNANDEZ and Richard KNOWLES Department Electrical and Computer Engineering, The College

More information

Application of Photopolymer Holographic Gratings

Application of Photopolymer Holographic Gratings Dublin Institute of Technology ARROW@DIT Conference Papers Centre for Industrial and Engineering Optics 2004-2 Application of Photopolymer Holographic Gratings Emilia Mihaylova Dublin Institute of Technology,

More information

Lab Report: Optical Image Processing

Lab Report: Optical Image Processing Lab Report: Optical Image Processing Kevin P. Chen * Advanced Labs for Special Topics in Photonics (ECE 1640H) University of Toronto March 5, 1999 Abstract This report describes the experimental principle,

More information

Adaptive Waveform Inversion: Theory Mike Warner*, Imperial College London, and Lluís Guasch, Sub Salt Solutions Limited

Adaptive Waveform Inversion: Theory Mike Warner*, Imperial College London, and Lluís Guasch, Sub Salt Solutions Limited Adaptive Waveform Inversion: Theory Mike Warner*, Imperial College London, and Lluís Guasch, Sub Salt Solutions Limited Summary We present a new method for performing full-waveform inversion that appears

More information

Winter College on Optics in Environmental Science February Adaptive Optics: Introduction, and Wavefront Correction

Winter College on Optics in Environmental Science February Adaptive Optics: Introduction, and Wavefront Correction 2018-23 Winter College on Optics in Environmental Science 2-18 February 2009 Adaptive Optics: Introduction, and Wavefront Correction Love G. University of Durham U.K. Adaptive Optics: Gordon D. Love Durham

More information

And. Modal Analysis. Using. VIC-3D-HS, High Speed 3D Digital Image Correlation System. Indian Institute of Technology New Delhi

And. Modal Analysis. Using. VIC-3D-HS, High Speed 3D Digital Image Correlation System. Indian Institute of Technology New Delhi Full Field Displacement And Strain Measurement And Modal Analysis Using VIC-3D-HS, High Speed 3D Digital Image Correlation System At Indian Institute of Technology New Delhi VIC-3D, 3D Digital Image Correlation

More information

Adaptive Zoom Distance Measuring System of Camera Based on the Ranging of Binocular Vision

Adaptive Zoom Distance Measuring System of Camera Based on the Ranging of Binocular Vision Adaptive Zoom Distance Measuring System of Camera Based on the Ranging of Binocular Vision Zhiyan Zhang 1, Wei Qian 1, Lei Pan 1 & Yanjun Li 1 1 University of Shanghai for Science and Technology, China

More information

Metrology and Sensing

Metrology and Sensing Metrology and Sensing Lecture 4: Fringe projection 2017-11-09 Herbert Gross Winter term 2017 www.iap.uni-jena.de 2 Preliminary Schedule No Date Subject Detailed Content 1 19.10. Introduction Introduction,

More information

Full-field optical methods for mechanical engineering: essential concepts to find one way

Full-field optical methods for mechanical engineering: essential concepts to find one way Full-field optical methods for mechanical engineering: essential concepts to find one way Yves Surrel Techlab September 2004 1 Contents 1 Introduction 3 2 White light methods 4 2.1 Random encoding............................................

More information

Shading of a computer-generated hologram by zone plate modulation

Shading of a computer-generated hologram by zone plate modulation Shading of a computer-generated hologram by zone plate modulation Takayuki Kurihara * and Yasuhiro Takaki Institute of Engineering, Tokyo University of Agriculture and Technology, 2-24-16 Naka-cho, Koganei,Tokyo

More information

A. K. Srivastava, K.C. Sati, Satyander Kumar alaser Science and Technology Center, Metcalfe House, Civil Lines, Delhi , INDIA

A. K. Srivastava, K.C. Sati, Satyander Kumar alaser Science and Technology Center, Metcalfe House, Civil Lines, Delhi , INDIA International Journal of Scientific & Engineering Research Volume 8, Issue 7, July-2017 1752 Optical method for measurement of radius of curvature of large diameter mirrors A. K. Srivastava, K.C. Sati,

More information

Improving the 3D Scan Precision of Laser Triangulation

Improving the 3D Scan Precision of Laser Triangulation Improving the 3D Scan Precision of Laser Triangulation The Principle of Laser Triangulation Triangulation Geometry Example Z Y X Image of Target Object Sensor Image of Laser Line 3D Laser Triangulation

More information

DEVELOPMENT OF REAL TIME 3-D MEASUREMENT SYSTEM USING INTENSITY RATIO METHOD

DEVELOPMENT OF REAL TIME 3-D MEASUREMENT SYSTEM USING INTENSITY RATIO METHOD DEVELOPMENT OF REAL TIME 3-D MEASUREMENT SYSTEM USING INTENSITY RATIO METHOD Takeo MIYASAKA and Kazuo ARAKI Graduate School of Computer and Cognitive Sciences, Chukyo University, Japan miyasaka@grad.sccs.chukto-u.ac.jp,

More information

A Survey of Light Source Detection Methods

A Survey of Light Source Detection Methods A Survey of Light Source Detection Methods Nathan Funk University of Alberta Mini-Project for CMPUT 603 November 30, 2003 Abstract This paper provides an overview of the most prominent techniques for light

More information

Guo, Wenjiang; Zhao, Liping; Chen, I-Ming

Guo, Wenjiang; Zhao, Liping; Chen, I-Ming Title Dynamic focal spots registration algorithm for freeform surface measurement Author(s) Guo, Wenjiang; Zhao, Liping; Chen, I-Ming Citation Guo, W., Zhao, L., & Chen, I.-M. (2013). Dynamic focal spots

More information

Comparison of Beam Shapes and Transmission Powers of Two Prism Ducts

Comparison of Beam Shapes and Transmission Powers of Two Prism Ducts Australian Journal of Basic and Applied Sciences, 4(10): 4922-4929, 2010 ISSN 1991-8178 Comparison of Beam Shapes and Transmission Powers of Two Prism Ducts 1 Z. Emami, 2 H. Golnabi 1 Plasma physics Research

More information

Information page for written examinations at Linköping University TER2

Information page for written examinations at Linköping University TER2 Information page for written examinations at Linköping University Examination date 2016-08-19 Room (1) TER2 Time 8-12 Course code Exam code Course name Exam name Department Number of questions in the examination

More information

Highly Accurate Photorealistic Modeling of Cultural Heritage Assets

Highly Accurate Photorealistic Modeling of Cultural Heritage Assets Highly Accurate Photorealistic Modeling of Cultural Heritage Assets Peter DORNINGER 1 / Marco BRUNNER 2 1 TU Vienna, Institute of Photogrammetry and Remote Sensing / 2 a:xperience audiovisuelle Kommunikation

More information

Capturing, Modeling, Rendering 3D Structures

Capturing, Modeling, Rendering 3D Structures Computer Vision Approach Capturing, Modeling, Rendering 3D Structures Calculate pixel correspondences and extract geometry Not robust Difficult to acquire illumination effects, e.g. specular highlights

More information

E (sensor) is given by; Object Size

E (sensor) is given by; Object Size A P P L I C A T I O N N O T E S Practical Radiometry It is often necessary to estimate the response of a camera under given lighting conditions, or perhaps to estimate lighting requirements for a particular

More information