
Self-calibration of telecentric lenses: application to bubbly flow using a moving stereoscopic camera

S. COUDERT 1*, T. FOURNEL 1, J.-M. LAVEST 2, F. COLLANGE 2 and J.-P. SCHON 1

1 LTSI, Université Jean Monnet, 23 rue Michelon, 42023 Saint-Etienne Cedex, France
2 LASMEA, Université Blaise Pascal, 24 avenue des Landais, 63177 AUBIERE Cedex, France

* Corresponding author: Dr. Sébastien COUDERT, LTSI, UMR CNRS 5516 / Université Jean Monnet, 23 rue Michelon, 42023 Saint-Etienne Cedex, France; email: coudert@univ-st-etienne.fr

KEYWORDS: Main subject(s): advanced image processing, self-calibration. Fluid: hydrodynamics. Visualization method(s): shadow method, stereoscopic camera, moving camera. Other keywords: telecentric lenses, multi-phase flow, bubble, flow visualisation.

ABSTRACT: When stereoscopic imaging techniques are used to measure structures in turbulent flows, the cameras must be calibrated to relate image units to real-world units. The calibration procedure is generally lengthy and requires a large translation stage to be introduced into the test section. The method proposed here gets rid of that translation stage by using self-calibration techniques: the calibration target is positioned by hand, i.e. at unknown positions. The self-calibrated stereoscopic system uses telecentric lenses on both cameras. It is applied to measuring the successive positions of a bubble rising in a tank of still water, the entire stereoscopic system moving to follow the bubble.

1. Introduction

The stereoscopic system described here is used to follow a bubble in turbulent conditions and measure its position over time. The system moves, as it is mounted on a large translation stage, to follow the rise of the bubble. First, the cameras must be calibrated to relate image units to real-world units. The calibration procedure is generally lengthy: a precise translation stage has to be introduced into the test facility (Soloff et al. 1997, Oord 1997, Riou et al. 1998, Coudert et al. 2001, Coudert and Schon 2003), images of a calibration target are then recorded at several known positions, and finally the stage is removed. In many facilities, this introduction and removal of a large translation stage requires the test section to be dismounted and remounted. The first aim of this paper is to remove the need for the calibration translation stage by using a self-calibration algorithm. The method uses the well-known self-calibration techniques

developed in robotics research; here it is applied to stereoscopic image velocimetry. The second aim is to point out the problems encountered with a moving imaging system. The main advantage of the method is that the calibration target is positioned by hand (i.e. at random positions). The main drawback is that a model of the stereoscopic system has to be built; in return, the method allows non-linear effects to be included in the model, such as Scheimpflug imaging, optical distortions of the lenses and the transparent-wall geometry of the facility. The imaging system uses telecentric lenses on both cameras of the stereoscopic system (Coudert et al. 2001, Fournel et al. 2000 and 2003). It is applied to measuring the successive positions of a bubble rising in a tank of still water. The entire stereoscopic system moves to follow the bubble; the experimental setup allows a maximum camera speed of 1 m.s-1 over a distance of 0.5 m. In a first step, the stereoscopic imaging system is self-calibrated. The method consists in first placing a 2D calibration target in front of the stereoscopic cameras at a few different 3D positions; these unknown positions are then computed together with the camera parameters. The optimised parameters also include the positions of the dots on the 2D calibration target. For each 3D target position, the dot centres of the calibration target have to be located with high accuracy in the recorded images. In a second step, images of the rising bubble are recorded by the moving stereoscopic cameras, from which the absolute positions and the 3D velocity vectors of the bubble can be reconstructed. The experimental setup and its geometry are described in paragraph 2. The processing algorithms and their accuracy are presented in paragraph 3. The results are discussed in paragraph 4, and conclusions and prospects are given in the last paragraph.
In a few paragraphs, the text refers to another paper in these proceedings (Coudert et al., reference F4055), as the experimental setup is almost the same.

2. Experimental setup and geometry

The same experimental setup and geometry as in another paper in these proceedings (Coudert et al., reference F4055) have been used, except that this time two identical cameras and lenses are used instead of only one, in order to obtain a stereoscopic view of the rising bubbles. The angle between the two cameras is 90°. The white screen and spot are also doubled, so that both cameras obtain equivalent shadow images of the rising bubbles. The whole setup is shown in Fig. 1: on the left side, the large translation stage supports the moving cameras. The right camera is in the bottom right-hand corner of the image, and the lens of the left camera lies between the translation stage and the water tank.

Fig. 1 Global view of the experimental setup. Bubbles rise from the orifice in the middle of the water tank.

3. Processing

3-1 Self-calibration

Compared with standard calibration, self-calibration differs mainly in that the calibration parameters and the positions of the calibration target are determined at the same time (and, in a final stage, the positions of the dots on the target; cf. Lavest et al. 1999). As in standard calibration, the intrinsic camera parameters are computed (e.g. optical point, focal number, distortions; cf. Coudert et al. 2001, Fournel et al. 2000 and 2003). But, as the positions of the calibration target are unknown, they have to be computed from different views of the same target. The mechanical 3D position of the target constitutes the extrinsic parameters. At least 6 calibration images are needed to determine the 6 spatial parameters (extrinsic parameters: 3 translations and 3 rotations), and the set of calibration images must contain different views of the calibration target (i.e. different translations and rotations). A non-linear optimisation algorithm such as Levenberg-Marquardt (More 1977) is used to optimise the intrinsic and extrinsic parameters of the cameras. The optimisation is computed in the image plane (the CCD-chip plane): the minimised quantity is the difference between the recorded position of each dot projection and its simulated position. On the one hand, the recorded dot position is computed using a dedicated image-processing algorithm, which is much more precise than the centroid of the dot image (see Fig. 2 and Lavest et al. 1999). On the other hand, the simulated dot position is computed from the parameters optimised at the current step. This processing needs initial values for the parameters. In our case, the optimised parameters are 2 sets of intrinsic parameters for the 2 cameras of the setup, 12 sets of extrinsic parameters for the relative position between the two cameras and the 12 views of the calibration target (i.e. 24 calibration images; a sample is shown in Fig. 3), and 15 positions of dots in the calibration-target plane. The algorithm that locates the calibration-target dots (Fig. 2) reaches an accuracy of 0.03 pixel RMS in the image, which corresponds to an accuracy of about 5 µm in space for the 3D points.

Fig. 2 Computation of a recorded dot centre using a model of luminance. 2.1: a dot of the calibration target is assumed to be white on a black background. 2.2: the grey levels of the dot form a 3D shape that is reduced to a one-dimensional profile (from the ellipsoidal shape to a circular model, and finally from circular to 1D). 2.3: in this space, the model consists of a few geometrical parameters fitted to the luminance (the grey levels, denoted gl).

Fig. 3 Self-calibration of a stereoscopic system requires a few pairs of recorded images of the calibration target (sets 1 to 4: calibration_00_013_000.tif, calibration_00_014_000.tif, calibration_00_018_000.tif and calibration_00_019_000.tif). For each pair, the images recorded by the right and left cameras lie on the first line (labelled C1) and on the second line (labelled C2), respectively. Each initially unknown target position is determined using all sets, by optimising the parameters of the stereoscopic system with respect to the positions of the grid dots in the images.

3-2 Image processing

The same image processing as in another paper in these proceedings (Coudert et al., reference F4055) has been used. The main point to remember is that the detection accuracy of the bubble centre is estimated at around 1.5 pixel RMS in the image. The same image-processing algorithm is applied to both the left and right images of the bubble at every time step (e.g. Fig. 4).
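The bubble-detection algorithm itself is detailed in the companion paper (reference F4055). As an illustration only (the actual processing may differ), a minimal shadow-image binarisation and centroid extraction of the kind shown in Fig. 4 could look like this:

```python
import numpy as np
from scipy import ndimage

def bubble_centre(image, threshold):
    """Binarise a backlit (shadow) image and return the centroid of the
    largest dark blob, assumed to be the bubble."""
    dark = image < threshold                  # the bubble casts a shadow
    labels, n = ndimage.label(dark)           # connected dark regions
    if n == 0:
        return None                           # no bubble found
    sizes = ndimage.sum(dark, labels, index=range(1, n + 1))
    blob = labels == (1 + int(np.argmax(sizes)))
    rows, cols = np.nonzero(blob)
    return rows.mean(), cols.mean()           # sub-pixel centre (row, col)
```

A centroid like this is only as stable as the bubble outline itself, which is consistent with the roughly 1.5 pixel RMS accuracy quoted above for a deforming bubble.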

Fig. 4 Two stereoscopic image pairs (Bulle_000_072 and Bulle_000_074) recorded by the moving cameras, with the associated binary images produced by the image processing. The rising bubble shows a different shape when viewed from the two points of view.

3-3 Capturing system

The same capturing system as in another paper in these proceedings (Coudert et al., reference F4055) has been used. The main point to remember is that it gives a velocity accuracy of around 1 % RMS. However, large vibrations of the whole system degrade this accuracy when the system is moved, as the electronic controller tries to compensate for them.

3-4 Reconstruction

The bubble position is reconstructed from the bubble image positions on both cameras and the camera parameters obtained by the self-calibration processing (e.g. Fig. 5). The 3D position is triangulated using the two rays coming from the centre of the bubble image (including the optical distortions) and passing through the optical centre of each camera (Figure 5.1). The bubble's 3D position is the intersection of these two rays or, as they do not actually cross most of the time, the midpoint of the segment joining the closest points of the two lines. If this algorithm were fed image positions accurate to 0.1 pixel RMS, it would reach an accuracy of about 10 µm in space; as stated in paragraph 3-2, the position of the bubble projection is assumed to be about 20 times less accurate. The algorithm described above computes the successive relative positions of the bubble; these are converted to absolute positions by adding the absolute camera positions given by the capturing system (cf. paragraph 3-3). In figure 5.2, the absolute position of a bubble over time is represented from a set of 59 recorded image pairs.
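The midpoint construction of Figure 5.1 can be written down directly. The sketch below is an illustrative implementation, not the authors' code (it also omits the distortion correction applied beforehand): it finds the closest points of the two back-projected rays and returns the midpoint of the joining segment, together with the residual gap between the rays, which is a useful sanity check on the calibration.

```python
import numpy as np

def triangulate_midpoint(p1, d1, p2, d2):
    """Approximate intersection of two skew 3D rays: find the closest
    points on each ray, return their midpoint and the residual gap.

    p1, p2 : points on the left/right rays (e.g. the optical centres)
    d1, d2 : direction vectors of the rays (need not be unit length)
    Assumes the rays are not parallel.
    """
    p1, d1 = np.asarray(p1, float), np.asarray(d1, float)
    p2, d2 = np.asarray(p2, float), np.asarray(d2, float)
    # Minimise |(p1 + t1 d1) - (p2 + t2 d2)| over the ray parameters t1, t2.
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    denom = a * c - b * b               # zero only for parallel rays
    t1 = (b * (d2 @ w) - c * (d1 @ w)) / denom
    t2 = (a * (d2 @ w) - b * (d1 @ w)) / denom
    q1 = p1 + t1 * d1                   # closest point on ray 1
    q2 = p2 + t2 * d2                   # closest point on ray 2
    return 0.5 * (q1 + q2), float(np.linalg.norm(q1 - q2))
```

Adding the carriage position delivered by the capturing system (paragraph 3-3) to the returned point then gives the absolute bubble position.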

Fig. 5 3D reconstruction of a bubble position. 5.1: the 3D position of a bubble is reconstructed using two rays from the left and right cameras; each ray (continuous line) passes through the centre of the bubble image (on the CCD chip) and the centre of the lens. The intersection of these two lines in 3D space gives the position of the bubble; it is taken as the midpoint of the segment (denoted + in the figure) joining the closest points of the two lines. 5.2: the reconstructed 3D positions of the bubble over time (continuous line), from the 59 image pairs processed by the reconstruction algorithm described above.

4. Results and discussion

4-1 Processing

Most of the processing steps are accurate to within less than 1 %, except for the determination of the bubble position in the image. This low accuracy does not come from the algorithm itself, but from the fact that the shape of the bubble changes as it rises in a turbulent flow. The accuracy, which should be around 5 µm according to the calibration, degrades to about 0.5 mm.

4-2 Moving cameras

As stated above, the inaccuracy of the measurement system comes mainly from the turbulent phenomena, such as the distortion of the bubble shape over time. Moreover, the moving system that carries the cameras is sensitive to vibrations, which add large errors to the absolute and relative (i.e. image) positions. A further drawback is that the adjustments of the cameras and lenses had to be fixed very tightly. In the future, this kind of vibration should be suppressed with a damped mechanical system: short carrying arms, a steel cable to the counterweight and perhaps a different electronic controller on the translation stage.

5. Conclusion and prospects

Computer vision algorithms are well suited to this kind of application. The first original topic, self-calibration, was carried out for a stereoscopic system using telecentric lenses; the image and 3D reconstruction processing also give good results. The second original topic, the moving camera, raised many problems, such as the rigidity of the stereoscopic system and vibrations of parts of the system. The inaccuracy of the measurement system is due to the moving camera system, which will have to be improved in the future.

List of Movies

Movie 1: Reconstructed 3D positions of a bubble (3155 kb): the reconstructed path is rotated 360 times in steps of 2° around the Y axis.

References

Coudert S. and Schon J.-P.: Back-projection algorithm with misalignment corrections for 2D3C stereoscopic PIV. Measurement Science and Technology 12, pp. 1371-1381, 2001.

Coudert S., Fournier C., Bochard N., Fournel T. and Schon J.-P.: Corrections for misalignment between the laser sheet plane and the calibration plane: measurement in a turbulent round free jet using stereoscopic PIV with telecentric lenses. PIV'01, 4th International Symposium on Particle Image Velocimetry, Göttingen, 2001.

Fournel T., Coudert S. and Riou L.: Stereoscopic 2D3C DPIV with telecentric lenses: calibration and first results. Euromech 411 Colloquium, Rouen, 2000.

Fournel T., Coudert S., Fournier C. and Ducottet C.: Stereoscopic Particle Image Velocimetry using telecentric lenses. Measurement Science and Technology 14, pp. 494-499, 2003.

Lavest J.-M., Viala M. and Dhome M.: Quelle précision pour une mire d'étalonnage? [What accuracy for a calibration target?]. Traitement du signal 16(3), pp. 241-254, 1999.

More J. J.: The Levenberg-Marquardt Algorithm: Implementation and Theory. In: Numerical Analysis, ed. G. A. Watson, Lecture Notes in Mathematics 630, Springer Verlag, pp. 105-116, 1977.

Oord J. v.: The design of a stereoscopic DPIV system. Delft (Netherlands), Laboratory for Aero & Hydrodynamics, pp. 1-50, 1997.

Riou L., Fayolle J. and Fournel T.: PIV measurements using multiple cameras: the calibration method. 8th International Symposium on Flow Visualization, Sorrento, 95.1-95.11, 1998.

Soloff S. M., Adrian R. J. and Liu Z.-C.: Distortion compensation for generalized stereoscopic particle image velocimetry. Measurement Science and Technology 8, pp. 1441-1454, 1997.