NEW GEOMETRIES FOR 3D LASER SENSORS WITH PROJECTION DISCRIMINATION


BULETINUL INSTITUTULUI POLITEHNIC DIN IAŞI
Publicat de Universitatea Tehnică "Gheorghe Asachi" din Iaşi
Tomul LVI (LX), Fasc. 1, 2010
Secţia AUTOMATICĂ şi CALCULATOARE

NEW GEOMETRIES FOR 3D LASER SENSORS WITH PROJECTION DISCRIMINATION

BY MIHAI BULEA

Abstract. Laser sensors are widely used for the precision measurement of distances. They all include one or more laser sources and one or more cameras and, using triangulation, they provide a precise measurement of the distance between the sensor (laser(s) plus camera(s)) and the target. The principle is simple: the laser creates a light spot on the target, while the camera takes a snapshot of the spot [1], [2]. By measuring the position of the spot projection in the image, the distance to the target can be computed. This paper presents a set of new geometries for multi-planar laser sensors, such that the location of each laser plane projection is uniquely determined in the projection plane. We start with an analysis of the existing 2D sensors and 2D+ sensors (with camera offset), and continue with a set of new geometries for a 3D sensor. Finally, some remarks regarding multi-point and multi-planar sensors are presented. Simulation results for all the geometries are also included.

Key words: laser sensors, multi-point sensors, geometries, camera.

2010 Mathematics Subject Classification: 51M15, 7420.

1. Introduction

The basics of laser triangulation are simple and can be reduced, to begin with, to a simple 2D geometrical problem: the laser generates a single light beam perpendicular to the sensor, and the camera captures the trace (projection) of the beam on the target object. The camera has its focal point placed at a distance B (called the baseline) from the laser, and its optical axis is at an angle α relative to the baseline (Fig. 1). Of course, the laser beam and the camera focal axis lie in the same plane. The camera has a known viewing angle β, which defines two more important parameters of the laser sensor.

The first one, the standoff S, is the minimum distance between the object and the sensor; the laser trace is not visible to the camera for distances smaller than S. The second one, the depth of field (the D segment), is the active range of the sensor; the laser trace is not visible to the camera for distances greater than S + D.

Fig. 1 – The principle of laser sensors.

In this paper we assume the ideal case: the camera has no optical distortions and its resolution is infinite.
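To make the triangulation principle concrete, here is a minimal Python sketch (all numeric values for f, d_cam, S and D are illustrative assumptions): it maps a target distance z to the spot position x = f·d_cam/z in the ideal image and inverts the mapping.

```python
# Minimal single-beam triangulation sketch (illustrative values only).
# The laser beam is parallel to the optical axis at lateral offset d_cam;
# an ideal pinhole camera with focal length f images the spot.

f = 8.0        # focal length [mm] (assumed value)
d_cam = 50.0   # camera-to-beam distance [mm] (assumed value)
S = 100.0      # standoff [mm]
D = 200.0      # depth of field [mm]

def spot_position(z):
    """Image position of the laser spot for a target at distance z."""
    return f * d_cam / z

def distance_from_spot(x):
    """Invert the projection: recover the target distance from the spot position."""
    return f * d_cam / x

# The D segment [S, S+D] maps to the image interval [x_min, x_max]:
x_max = spot_position(S)        # nearest target -> largest deflection
x_min = spot_position(S + D)    # farthest target -> smallest deflection
print(f"projection interval: [{x_min:.3f}, {x_max:.3f}] mm on the sensor")
print(f"round trip at z = 250 mm: {distance_from_spot(spot_position(250.0)):.1f} mm")
```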

2. The Geometry of the 2D Sensor

The next drawing shows the geometry of a 2D sensor with five laser beams. The camera and the lasers are in the same plane and the laser beams are parallel. Because the camera and the lasers are in the same plane, the projections of all the D segments (in the camera) overlap significantly. The only possible solutions to avoid this overlapping are:

1. Keep the lasers and the camera in the same plane, but move the camera along the baseline, closer to the lasers, until the overlapping disappears. Of course, this imposes some restrictions in the form of a relationship between S, D, the number of laser beams, the laser spacing, and the position of the camera. An extended discussion of this type of solution will be presented in another chapter.

2. Move the camera away from the laser plane, so that the projections of the D segments no longer overlap.

Fig. 2 – The geometry of a parallel multi-beam laser sensor (beams L0, ..., L4).

Note that the closer the laser beam is to the camera, the smaller the overlap, until it eventually disappears; but also note that the closer the laser beam is to the camera, the smaller the projection of its D segment. From the geometry of the projection we can write (no optical distortions):

(1) $x_{max}^{(i)} = \frac{f\, d_{cam}^{(i)}}{S}, \qquad x_{min}^{(i)} = \frac{f\, d_{cam}^{(i)}}{S+D}$

In (1), $d_{cam}^{(i)}$ is the distance between the camera and the i-th laser beam and f is the focal length of the camera. From (1) we can extract the size of the projection of the D segment:

(2) $\Delta x^{(i)} = f\, d_{cam}^{(i)} \left( \frac{1}{S} - \frac{1}{S+D} \right)$

As expected, the size of the projection of the D segment depends linearly on the distance between the camera and the laser beam. Now let us check Figs. 3a and 3b for the simulated results. It can be seen that the projection regions of the laser beams are different; in this case the camera can be a linear one.
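The overlap can also be verified numerically. The sketch below evaluates each beam's projection interval with formula (1), for an assumed five-beam layout, and tests neighbouring intervals for overlap.

```python
# Projection intervals for a parallel multi-beam 2D sensor, from formula (1).
# Beam positions are illustrative assumptions, not values from the paper.

f, S, D = 8.0, 100.0, 200.0                 # focal length, standoff, depth of field [mm]
d_cam = [50.0, 80.0, 110.0, 140.0, 170.0]   # camera-to-beam distances for L0..L4 [mm]

def interval(d):
    """[x_min, x_max] of the D-segment projection for a beam at distance d."""
    return f * d / (S + D), f * d / S

intervals = [interval(d) for d in d_cam]
for i, (lo, hi) in enumerate(intervals):
    print(f"L{i}: [{lo:.3f}, {hi:.3f}] mm, width {hi - lo:.3f}")

# Neighbouring beams overlap when one interval starts before the previous one ends.
for i in range(len(intervals) - 1):
    overlap = intervals[i][1] > intervals[i + 1][0]
    print(f"L{i} vs L{i+1}: {'overlap' if overlap else 'disjoint'}")
```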

Fig. 3a – The geometry of a 2D sensor with parallel beams.
Fig. 3b – Projection regions for a 2D sensor with parallel beams.

All the laser beams are parallel in the presented geometry. The effect of changing this geometry to radial laser beams is discussed next.

Fig. 4 – The geometry of a radial multi-beam laser sensor (beams L0, ..., L4).

From the geometry of the projection we can write (no optical distortions):

(3) $x_{max} = \frac{f (d_{cam} + S \tan\alpha)}{S}, \qquad x_{min} = \frac{f (d_{cam} + (S+D) \tan\alpha)}{S+D}$

In (3), $d_{cam}$ is the distance between the camera and the laser's base, α is the orientation of the laser beam, and f is the focal length of the camera. From (3) we can extract the size of the projection of the D segment:

(4) $\Delta x = \frac{f (d_{cam} + S \tan\alpha)}{S} - \frac{f (d_{cam} + (S+D) \tan\alpha)}{S+D}$

That is:

(5) $\Delta x = f \left( \frac{d_{cam}}{S} + \tan\alpha - \frac{d_{cam}}{S+D} - \tan\alpha \right)$

And finally:

(6) $\Delta x = f\, d_{cam} \left( \frac{1}{S} - \frac{1}{S+D} \right)$

We obtain a formula quite similar to (2). As expected, the size of the projection of the D segment depends linearly on the distance between the camera and the laser base; at the same time, it does not depend on the orientation of the laser beam. This is mainly because the size of the projection would decrease with the laser beam angle while, at the same time, the D segment increases in size. Note in the simulated results (Figs. 5a, 5b) the constant width of the projection regions for each laser beam. In this case the camera can also be a linear one.

Fig. 5a – The geometry of a 2D sensor with radial beams.
Fig. 5b – Projection regions for a 2D sensor with radial beams.
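A quick numerical check of (6), with assumed values: the projection width is the same for every beam orientation α.

```python
# Numerical check of formula (6): for radial beams the projection width
# does not depend on the beam angle alpha. Values are illustrative assumptions.
import math

f, S, D, d_cam = 8.0, 100.0, 200.0, 50.0

def width(alpha_deg):
    t = math.tan(math.radians(alpha_deg))
    x_near = f * (d_cam + S * t) / S              # formula (3), z = S
    x_far = f * (d_cam + (S + D) * t) / (S + D)   # formula (3), z = S + D
    return x_near - x_far

for a in (0.0, 5.0, 10.0, 20.0):
    print(f"alpha = {a:5.1f} deg -> width = {width(a):.6f} mm")

# All widths equal f * d_cam * (1/S - 1/(S+D)), as formula (6) predicts.
print("formula (6):", f * d_cam * (1 / S - 1 / (S + D)))
```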

3. The Geometry of the 2D+ Sensor

If we move the camera away from the laser plane, we get the following geometry for parallel-beam lasers:

Fig. 6 – The geometry of a parallel multi-beam laser sensor with camera offset (beams L0, ..., L4).

Now the projections of the D segments no longer overlap, so the projection of a laser trace can be labeled immediately, based solely on its position in the projection plane.

Fig. 7 – The projection geometry for a parallel multi-beam laser sensor with camera offset (baseline $B_i$, offset d, focal length f).

From the geometry of the projection, in the Oxz axonometric projection we can write (no optical distortions):

(7) $x(S) = \frac{f B_i}{S}, \qquad x(S+D) = \frac{f B_i}{S+D}$

From the Oyz axonometric projection we have:

(8) $y(S) = \frac{f d}{S}, \qquad y(S+D) = \frac{f d}{S+D}$

In (7) and (8), d is the offset distance between the camera and the lasers' plane, $B_i$ is the baseline for the i-th laser beam, and f is the focal length of the camera. From (7) we can extract the x extension of the projection of the D segment:

(9) $\Delta x^{(i)} = f B_i \left( \frac{1}{S} - \frac{1}{S+D} \right)$

From (8) we can extract the y extension of the projection of the D segment:

(10) $\Delta y = f d \left( \frac{1}{S} - \frac{1}{S+D} \right)$

Again, we obtain formulas quite similar to (2). As expected, the size of the projection of the D segment depends linearly on the distance between the camera and the laser base. Because the x extension of the projection depends on $B_i$, the projections can be discriminated. Note in the simulated results (Figs. 8a, 8b) the sizes and positions of the projection regions for each laser beam: the sizes are different.
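One possible implementation of the labeling follows from (7) and (8): a spot produced by beam $L_i$ always lies on the ray $y/x = d/B_i$, so the observed slope identifies the beam and the x coordinate then gives the depth. A minimal sketch, with assumed values for $B_i$ and d:

```python
# Labeling a laser trace for the 2D+ sensor (parallel beams, camera offset).
# Each beam L_i projects onto its own ray y/x = d/B_i, so the slope of the
# observed spot identifies the beam. B_i and d are illustrative assumptions.

f, S, D, d = 8.0, 100.0, 200.0, 30.0
B = [50.0, 80.0, 110.0, 140.0, 170.0]   # baselines for beams L0..L4 [mm]

def label_and_depth(x, y):
    """Match the spot's slope y/x against d/B_i, then triangulate z = f*B_i/x."""
    slope = y / x
    i = min(range(len(B)), key=lambda k: abs(slope - d / B[k]))
    return i, f * B[i] / x

# Simulate a spot produced by beam L2 at z = 180 mm and recover both facts.
z_true = 180.0
x_obs, y_obs = f * B[2] / z_true, f * d / z_true
print(label_and_depth(x_obs, y_obs))   # -> (2, 180.0)
```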

Fig. 8a – The geometry of a 2D+ sensor with parallel beams and camera offset.
Fig. 8b – Projection regions for a 2D+ sensor with parallel beams and camera offset.

All the laser beams are parallel in the previous geometry. The effect of changing this geometry to radial laser beams is presented in the next drawings:

Fig. 9 – The geometry of a radial multi-beam laser sensor with camera offset (beams L0, ..., L4).

Fig. 10 – The projection geometry (3D) for a radial multi-beam laser sensor with camera offset.

Fig. 11 – The projection geometry (axonometric) for a radial multi-beam laser sensor with camera offset.

Now the projections of the D segments no longer overlap, so the projection of a laser trace can be labeled immediately, based solely on its position in the projection plane. From the geometry of the projection in Oxz we can write (no optical distortions):

(11) $x(S) = \frac{f (B + S \tan\alpha)}{S}, \qquad x(S+D) = \frac{f (B + (S+D) \tan\alpha)}{S+D}$

That is:

(12) $x(z) = \frac{f (B + z \tan\alpha)}{z}, \qquad z \in [S, S+D]$

In (11) and (12), B is the baseline (the distance between the camera and the laser's base), α is the orientation of the laser beam, and f is the focal length of the camera. From (11) we can extract the size of the projection of the D segment:

(13) $\Delta x = \frac{f (B + S \tan\alpha)}{S} - \frac{f (B + (S+D) \tan\alpha)}{S+D}$

That is:

(14) $\Delta x = f B \left( \frac{1}{S} - \frac{1}{S+D} \right)$

From the geometry of the projection in Oyz we can also write:

(15) $y(S) = x(S)\,\frac{d}{B + S \tan\alpha}, \qquad y(S+D) = x(S+D)\,\frac{d}{B + (S+D) \tan\alpha}$

Replacing (12) in (15) we get:

(16) $y(S) = \frac{f d (B + S \tan\alpha)}{S (B + S \tan\alpha)}, \qquad y(S+D) = \frac{f d (B + (S+D) \tan\alpha)}{(S+D) (B + (S+D) \tan\alpha)}$

That is:

(17) $y(S) = \frac{f d}{S}, \qquad y(S+D) = \frac{f d}{S+D}$

And finally:

(18) $\Delta y = f d \left( \frac{1}{S} - \frac{1}{S+D} \right)$

We obtained again two formulas, (14) and (18), quite similar to (2). The size of the projection of the D segment depends linearly on the baseline and on the camera offset, both of which are constant; at the same time, again, it does not depend on the orientation of the laser beam. This is mainly because the size of the projection would decrease with the laser beam angle while, at the same time, the D segment increases in size. Note in the simulated results (Figs. 12a, 12b) the sizes and positions of the projection regions for each laser beam: all the regions have the same size.

Fig. 12a – The geometry of a 2D+ sensor with radial beams and camera offset.
Fig. 12b – Projection regions for a 2D+ sensor with radial beams and camera offset.
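A numerical check of (14) and (18), with assumed values: the size of the projection region is independent of the beam angle α.

```python
# Numerical check of formulas (14) and (18): for the 2D+ sensor with radial
# beams the projection region size is independent of the beam angle alpha.
# All values are illustrative assumptions.
import math

f, S, D, B, d = 8.0, 100.0, 200.0, 60.0, 30.0

def region_size(alpha_deg):
    t = math.tan(math.radians(alpha_deg))
    x = lambda z: f * (B + z * t) / z    # formula (12)
    y = lambda z: f * d / z              # formula (17), written for a general z
    return x(S) - x(S + D), y(S) - y(S + D)

for a in (0.0, 5.0, 15.0, 25.0):
    dx, dy = region_size(a)
    print(f"alpha = {a:5.1f} deg -> dx = {dx:.6f}, dy = {dy:.6f}")

# Both extents match f*B*(1/S - 1/(S+D)) and f*d*(1/S - 1/(S+D)).
print(f * B * (1 / S - 1 / (S + D)), f * d * (1 / S - 1 / (S + D)))
```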

4. The Geometry of the 3D Sensor (Parallel Laser Planes)

We will now study the geometry of a 3D sensor based on a set of parallel laser planes and a camera. The geometry of such a sensor is presented in Fig. 13. The camera must be placed in such a position that the projections of the D segments do not overlap.

Fig. 13 – Geometry of a sensor with parallel laser planes.

Fig. 14 – Projection geometry of a sensor with parallel laser planes (planes L0, ..., L11; focus point; projection plane).

Because the projections of the D segments do not overlap, we are able to discriminate the source of each projection. Analyzing in Fig. 14 the appearance of the projection, note that each stripe corresponds to a single laser plane. The width of the stripe corresponds to the depth of the scene (the distance to the object), while the length of the stripe corresponds to the field of view along the y axis. The main problem seems to be the fact that, because the stripe width decreases for laser planes closer to the camera, the precision of the depth measurement will be quite low for those laser planes.

Fig. 15 – Projection geometry of a sensor with parallel laser planes (base plane, laser plane, projection plane, focus point).

The size of the projection of the D segment on the projection plane is:

(19) $\Delta x^{(i)} = \frac{f\, d_{cam}^{(i)}}{S} - \frac{f\, d_{cam}^{(i)}}{S+D}$

In (19), $d_{cam}^{(i)}$ is the distance in the base plane from the focus point (camera) to the laser plane. That is:

(20) $\Delta x^{(i)} = f\, d_{cam}^{(i)} \left( \frac{1}{S} - \frac{1}{S+D} \right)$

If the total opening angle of the laser plane is 2α, we also have:

(21) $y(z) = \frac{f\, z \tan\alpha}{z} = f \tan\alpha$

That is:

(22) $\Delta y = 2 f \tan\alpha$

This means that the length of all projection stripes is constant, while their width depends on the distance $d_{cam}^{(i)}$. Note in the simulated results (Figs. 16a and 16b) the sizes and positions of the projection regions for each laser plane: the regions have different sizes.

Fig. 16a – The geometry of a 3D sensor with parallel laser planes.
Fig. 16b – Projection regions for a 3D sensor with parallel laser planes.
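The stripe sizes given by (20) and (22) can be tabulated numerically; the sketch below uses assumed plane positions and an assumed opening angle.

```python
# Stripe sizes for the 3D sensor with parallel laser planes, from formulas
# (20) and (22). Plane positions and angles are illustrative assumptions.
import math

f, S, D = 8.0, 100.0, 200.0
alpha = math.radians(30.0)                    # laser plane half-opening angle
d_cam = [20.0 + 15.0 * i for i in range(12)]  # distances to planes L0..L11 [mm]

stripe_length = 2 * f * math.tan(alpha)       # formula (22): same for every plane
for i, d in enumerate(d_cam):
    width = f * d * (1 / S - 1 / (S + D))     # formula (20)
    print(f"L{i:2d}: width {width:.3f} mm, length {stripe_length:.3f} mm")
# Planes close to the camera get narrow stripes, hence poor depth resolution.
```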

5. The Geometry of the 3D Sensor (Radial Laser Planes)

Analyzing the geometry of a sensor with radial laser planes, we can first compute the field of view, knowing the camera angle 2β:

(23) $FOV = (S+D) \tan\beta$

If there are a total of N laser planes, the width of field available for one single laser plane is:

(24) $\Delta L = \frac{(S+D) \tan\beta}{N+1}$

Fig. 17 – Geometry of a sensor with radial laser planes (planes L1, L2; baseline B; camera angle β).

Now we can extract some relationships from the previous figure:

(25) $\frac{\Delta L}{D} = \frac{B}{S}$

Replacing (24) in (25) we get:

(26) $\frac{(S+D) \tan\beta}{N+1} = \frac{B D}{S}$

From the previous formula we can extract the baseline B value:

(27) $B = \frac{S (S+D) \tan\beta}{(N+1)\, D}$

The last formula says that, given the camera angle 2β, the standoff S, and the depth of field D, we can easily compute the baseline B.
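Formulas (24) and (27) translate directly into a small design helper; the numeric values below are illustrative assumptions.

```python
# Design helper for the radial-plane 3D sensor: compute the per-plane field
# width (24) and the baseline (27) from S, D, the camera half-angle beta and
# the number of laser planes N. Values are illustrative assumptions.
import math

def design(S, D, beta_deg, N):
    t = math.tan(math.radians(beta_deg))
    dL = (S + D) * t / (N + 1)            # formula (24)
    B = S * (S + D) * t / ((N + 1) * D)   # formula (27)
    return dL, B

S, D, beta, N = 100.0, 200.0, 20.0, 12
dL, B = design(S, D, beta, N)
print(f"per-plane field width = {dL:.2f} mm, baseline B = {B:.2f} mm")

# Increasing D with N and beta fixed shrinks B, making triangulation harder:
for D_test in (100.0, 200.0, 400.0, 800.0):
    print(D_test, design(S, D_test, beta, N)[1])
```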

Fig. 18 – The geometry of the 3D sensor with radial laser planes (planes L0, L1, ...; focal length f; 2D image at the bottom).

The lasers are planar and radial, and are placed on both sides of the camera at a distance B. The focal length is f, the standoff is S, and the depth of field is D.

The geometry was specially designed so that the projections of the D segments do not overlap. If D increases while keeping the same number of laser planes and the same camera angle, the baseline decreases, which makes the triangulation more difficult. Note also the way the projection (pixel) plane is used: each stripe corresponds to one laser plane; the length of the stripe gives the field of view, while the stripe width gives the depth information. Only the red laser planes are to be used; the rest are not needed in the recommended configuration (the most precise for depth measurement). For some applications we could instead drop entirely the left or the right set of laser planes, but use all the depicted planes (red and blue) in the remaining set. Or we could use one laser to generate the odd planes and the other for the even planes. Different versions of this geometry were simulated, as shown at the end of this chapter.

Fig. 19 – Projection geometry of a sensor with radial laser planes (3D representation).

The laser plane is in this case rotated around the Oy axis by an angle α. The laser plane has an opening angle of 2β. We will now use the axonometric projection for clarity.

Fig. 20 – Projection geometry of a sensor with radial laser planes (axonometric representation).

From the previous figure (projection in the Oxz plane) we can extract some relationships:

(28) $\tan\varphi(S) = \frac{B + S \tan\alpha}{S}, \qquad \tan\varphi(S+D) = \frac{B + (S+D) \tan\alpha}{S+D}$

From (28) we can extract the position of the D segment projection:

(29) $x(S) = \frac{f (B + S \tan\alpha)}{S}, \qquad x(S+D) = \frac{f (B + (S+D) \tan\alpha)}{S+D}$

That is, the extension along the x axis of the D segment projection is:

(30) $\Delta x = \frac{f (B + S \tan\alpha)}{S} - \frac{f (B + (S+D) \tan\alpha)}{S+D}$

That is:

(31) $\Delta x = f \left( \frac{B}{S} + \tan\alpha - \frac{B}{S+D} - \tan\alpha \right)$

And finally:

(32) $\Delta x = f B \left( \frac{1}{S} - \frac{1}{S+D} \right)$

This means that the extension along the x axis of the D segment projection does not depend on α. We get a simpler formula if we replace in (32) the baseline B we already have from formula (27):

(33) $\Delta x = \frac{f \tan\beta}{N+1}$

From the axonometric projection in the Oyz plane we can write:

(34) $\tan\theta(S) = \frac{S \tan\beta}{B + S \tan\alpha}, \qquad \tan\theta(S+D) = \frac{(S+D) \tan\beta}{B + (S+D) \tan\alpha}, \qquad y = x \tan\theta$

From formula (34) we can finally extract:

(35) $y(S) = y(S+D) = f \tan\beta$

Formula (35) shows that the height of the projection stripe does not depend on α, which gives the rectangular stripes in the projection image (seen at the bottom of Fig. 21):

Fig. 21 – Projection plane and discrimination procedure (stripes L0, ..., L11).

Each stripe corresponds to a laser plane. The width of a stripe (along x) corresponds to the depth of field D, so in order to get an accurate depth measurement we need a high resolution along the x coordinate in the projection/pixel plane. Note in the simulated results (Figs. 22a, 22b, 23a, 23b, 24a, 24b, 25a, 25b) the sizes and positions of the projection regions for each laser plane: the regions have identical sizes. Various (interleaved or not) positions of the laser planes are proposed.

Fig. 22a – The geometry of a 3D sensor with radial laser planes (one laser device).
Fig. 22b – Projection regions for a 3D sensor with radial laser planes (one laser device).
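A sketch of the discrimination procedure for one set of radial planes (a simplified single-side layout with assumed plane angles α_i): the x coordinate of a detected trace point selects the stripe, hence the laser plane, and the position inside the stripe gives the depth, since inverting (29) yields z = f·B / (x − f·tanα_i).

```python
# Discrimination sketch for the radial-plane 3D sensor: the x coordinate of a
# detected trace point selects the laser plane (its stripe), and the position
# inside the stripe gives the depth via z = f*B / (x - f*tan(alpha_i)).
# Plane angles alpha_i are illustrative assumptions.
import math

f, S, D, B = 8.0, 100.0, 200.0, 18.0
alphas = [math.radians(a) for a in (0.0, 8.0, 16.0, 24.0)]  # planes L0..L3

def stripe(i):
    """x interval [x(S+D), x(S)] occupied by plane i (formula (29))."""
    t = math.tan(alphas[i])
    return f * (B + (S + D) * t) / (S + D), f * (B + S * t) / S

def discriminate(x):
    """Return (plane index, depth) for a trace point at image coordinate x."""
    for i in range(len(alphas)):
        lo, hi = stripe(i)
        if lo <= x <= hi:
            return i, f * B / (x - f * math.tan(alphas[i]))
    return None, None

# Simulate a trace point from plane L2 at z = 150 mm, then recover both facts.
z_true, i_true = 150.0, 2
x_obs = f * (B + z_true * math.tan(alphas[i_true])) / z_true
print(discriminate(x_obs))   # -> (2, 150.0)
```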

Fig. 23a – The geometry of a 3D sensor with radial laser planes (two laser devices).
Fig. 23b – Projection regions for a 3D sensor with radial laser planes (two laser devices).
Fig. 24a – The geometry of a 3D sensor with radial laser planes (two laser devices).
Fig. 24b – Projection regions for a 3D sensor with radial laser planes (two laser devices).
Fig. 25a – The geometry of a 3D sensor with radial laser planes (two laser devices).
Fig. 25b – Projection regions for a 3D sensor with radial laser planes (two laser devices).

6. Conclusions

The results of this study are summarized next:

1. It is possible to design a sensor based on a multi-planar and radial laser.
2. A sensor based on multi-planar parallel lasers is by far not a better solution.
3. The discrimination of the laser plane in the projection plane is possible and easy to do (the projection is always contained inside a known rectangle in the captured image).
4. Strict design rules have to be followed:
a) the bigger D is, the smaller the baseline is and, consequently, the bigger the depth measurement errors are;
b) the bigger the number of laser planes is, the bigger the pixel resolution required for the same depth resolution.
5. For multi-point, multi-planar and radial laser sensors it is possible to extend (double) the depth measurement precision while keeping the discrimination possible.

An error analysis should be done next, to determine the correlations between the error level, the sensor geometry parameters, and the camera parameters (resolution, optical distortions, etc.).

Received: December 10, 2009

Synaptics Inc., Santa Clara, CA
e-mail: mihai.bulea@yahoo.com

REFERENCES

1. Blais F., A Review of 20 Years of Range Sensor Development. Journal of Electronic Imaging, 13, 1, 231-243 (2004).
2. Jähne B., Haußecker H., Geißler P., Handbook of Computer Vision and Applications. Academic Press, San Diego, Vol. 1: Sensors and Imaging, Chap. 17-21 (1999).
3. Nguyen H., Blackburn M., A Simple Method for Range Finding via Laser Triangulation. Technical Document 2734, NCCOSC, San Diego, CA, Jan. 1995.
4. Zhang B., Zhang J., Jiang C., Li Z., Zhang W., Precision Laser Triangulation Range Sensor with Double Detectors for Measurement on CMMs. Industrial Optical Sensors for Metrology and Inspection, Vol. 2349, Jan. 1995, 44-52.

NEW GEOMETRIES FOR 3D LASER SENSORS WITH PROJECTION DISCRIMINATION

(Abstract)

Laser sensors are used for the precise measurement of distances and rely on determining, in the image provided by a video camera, the position of the laser trace on the target object (triangulation). For complex target objects, multiple laser beams (point-like or linear) are used, but labeling them becomes difficult, even impossible, for complex objects (with concavities, holes, etc.), because the corresponding laser traces may be missing or may show discontinuities. Alternative solutions, such as using lasers of different colors, significantly increase the cost of the sensors. The present paper introduces a set of new geometries for laser sensors, with the property that each laser trace is constrained to a unique region of the projection (the video image); since the regions do not intersect, labeling the traces is extremely simple. All the presented geometries are accompanied by the corresponding design criteria and are validated by simulation.