NEW GEOMETRIES FOR 3D LASER SENSORS WITH PROJECTION DISCRIMINATION
BULETINUL INSTITUTULUI POLITEHNIC DIN IAŞI
Publicat de Universitatea Tehnică "Gheorghe Asachi" din Iaşi
Tomul LVI (LX), Fasc. 1, 2010
Secţia AUTOMATICĂ şi CALCULATOARE

NEW GEOMETRIES FOR 3D LASER SENSORS WITH PROJECTION DISCRIMINATION

BY MIHAI BULEA

Abstract. Laser sensors are widely used for precision measurement of distances. They all include one or more laser sources and one or more cameras and, using triangulation, they provide a precise measurement of the distance between the sensor (laser(s) plus camera(s)) and the target. The principle is simple: the laser creates a light spot on the target, while the camera takes a snapshot of the spot [1], [2]. By measuring the position of the spot projection in the image, the distance to the target can be computed. This paper presents a set of new geometries for multi-planar laser sensors, such that the location of each laser plane projection is uniquely determined in the projection plane. We start with an analysis of the existing 2D sensors and 2D+ sensors (with camera offset), and continue with a set of new geometries for a 3D sensor. Finally, some considerations regarding multi-point and multi-planar sensors are presented. Simulation results for all the geometries are also presented here.

Key words: laser sensors, multi-point sensors, geometries, camera.

Mathematics Subject Classification: 51M15.

1. Introduction

The basics of laser triangulation are simple and can be reduced, to begin with, to a simple 2D geometrical problem: the laser generates a single light beam perpendicular to the sensor and the camera captures the trace (projection) of the beam on the target object. The camera has its focal point placed at a distance B (called the baseline) from the laser, and its optical axis is at an angle α relative to the baseline (Fig. 1). Of course, the laser beam and the camera focal axis lie in the same plane. The camera has a known viewing angle β, which defines two more important parameters of the laser
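The triangulation principle described above can be sketched numerically. This is a minimal sketch of the simplified case (beam perpendicular to the baseline, camera tilt ignored), in which the spot's image coordinate is x = f·B/z; the focal length and baseline values are illustrative, not taken from the paper.

```python
# Minimal triangulation sketch: a laser spot at distance z projects to
# image coordinate x = f*B/z, so measuring x recovers z = f*B/x.
# f and B are illustrative values in arbitrary consistent units.

def spot_position(z, f, B):
    """Image coordinate of the laser spot for a target at distance z."""
    return f * B / z

def distance_from_spot(x, f, B):
    """Invert the projection to recover the target distance."""
    return f * B / x

f, B = 8.0, 60.0                     # focal length and baseline (assumed)
x = spot_position(300.0, f, B)       # forward model
assert abs(distance_from_spot(x, f, B) - 300.0) < 1e-9
```

Note how the sensitivity of x to z drops as z grows, which is why the standoff and depth of field bound the usable range.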
sensor. The first one is the standoff s₀: the minimum distance between the object and the sensor; the laser trace is not visible for the camera at distances smaller than s₀. The second one is the depth of field D: the active range of the sensor; the laser trace is not visible by the camera at distances larger than s₀ + D.

Fig. 1 The principle of laser sensors (laser beam, baseline, camera at angle α, standoff s₀, depth of field D).

In this paper we will assume the ideal case: the camera has no optical distortions and its resolution is infinite.

2. The Geometry of the 2D Sensor

The next drawing shows the geometry of a 2D sensor with 5 laser beams. The camera and the lasers are in the same plane and the laser beams are parallel. Because the camera and the lasers are in the same plane, the projections of all segments (in the camera) overlap significantly. The only possible solutions to avoid this overlapping are:

1. Keep the lasers and the camera in the same plane, but move the camera along the baseline, closer to the lasers, until the overlapping disappears. Of course, this involves some restrictions in the form of a relationship between s₀, D, the number of laser beams, the laser spacing, and the position of the camera. An extended discussion of this type of solution will be presented in another chapter.

2. Move the camera away from the laser plane, so that the projections of the segments don't overlap anymore.
Fig. 2 The geometry of a parallel multi-beam laser sensor (beams L0 … L4).

Note that the closer the laser beam is to the camera, the smaller the overlap; it may even disappear. But note also that the closer the laser beam is to the camera, the smaller the projection of its segment is. From the geometry of the projection we can write (no optical distortions):

(1) x_max(i) = f·d_cam(i)/s₀ ,  x_min(i) = f·d_cam(i)/(s₀ + D)

In (1), d_cam(i) is the distance between the camera and the i-th laser beam and f is the focal length of the camera. From (1) we can extract the size of the projection of the segment:

(2) Δx(i) = f·d_cam(i)·(1/s₀ − 1/(s₀ + D))

As expected, the size of the projection of the segment depends linearly on the distance between the camera and the laser beam. Now let us check Figs. 3 a and 3 b for the simulated results. It can be seen that the projection regions of the laser beams are different. In this case
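A short numerical sketch of Eqs. (1) and (2) as reconstructed above; the symbols (f, s₀, D, d_cam) follow that reconstruction and the numeric values are illustrative only.

```python
# Projection extent of each beam's visible segment for the parallel-beam
# 2D sensor, per the reconstructed Eq. (1)-(2). Units are arbitrary but
# consistent; all values are made-up examples.

def projection_interval(f, d_cam, s0, D):
    """Image-coordinate interval covered by one beam's segment, Eq. (1)."""
    x_max = f * d_cam / s0           # trace at the standoff (closest point)
    x_min = f * d_cam / (s0 + D)     # trace at the far end of the depth of field
    return x_min, x_max

f, s0, D = 8.0, 200.0, 100.0         # focal length, standoff, depth of field
for i, d_cam in enumerate([30.0, 60.0, 90.0]):   # camera-to-beam distances
    x_min, x_max = projection_interval(f, d_cam, s0, D)
    delta = f * d_cam * (1.0 / s0 - 1.0 / (s0 + D))   # Eq. (2)
    assert abs((x_max - x_min) - delta) < 1e-9
    print(f"beam {i}: [{x_min:.3f}, {x_max:.3f}], width {delta:.3f}")
```

Printing the intervals for several beams makes the overlap problem easy to inspect: two beams are discriminable only if their intervals are disjoint.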
the camera can be a linear one.

Fig. 3 a The geometry of a 2D sensor with parallel beams. Fig. 3 b Projection regions for a 2D sensor with parallel beams.

All the laser beams are parallel in the presented geometry. The effect of changing this geometry to radial laser beams is discussed next.

Fig. 4 The geometry of a radial multi-beam laser sensor (beams L0 … L4).

From the geometry of the projection we can write (no optical distortions):

(3) x_max = f·(d_cam + s₀·tan α)/s₀ ,  x_min = f·(d_cam + (s₀ + D)·tan α)/(s₀ + D)
In (3), d_cam is the distance between the camera and the lasers' base, α is the orientation of the laser beam, and f is the focal length of the camera. From (3) we can extract the size of the projection of the segment:

(4) Δx = f·(d_cam + s₀·tan α)/s₀ − f·(d_cam + (s₀ + D)·tan α)/(s₀ + D)

That is:

(5) Δx = f·(d_cam/s₀ + tan α) − f·(d_cam/(s₀ + D) + tan α)

And finally:

(6) Δx = f·d_cam·(1/s₀ − 1/(s₀ + D))

We obtain a formula quite similar to (2). As expected, the size of the projection of the segment depends linearly on the distance between the camera and the lasers' base. At the same time, it does not depend on the orientation of the laser beam. This is mainly because, while the size of the projection would decrease with the laser beam angle, the segment itself increases in size. Check in the simulated results (Figs. 5 a, 5 b) the constant width of the projection regions for each laser beam. In this case the camera can also be a linear one.

Fig. 5 a The geometry of a 2D sensor with radial beams. Fig. 5 b Projection regions for a 2D sensor with radial beams.

3. The Geometry of the 2D+ Sensor

If we move the camera away from the laser plane, we get the next
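The claim that the projected width is independent of the beam angle can be checked numerically. A small sketch, using the reconstructed Eqs. (4) and (6) with illustrative values:

```python
# Numerical check of Eq. (4)-(6): for radial beams, the projected segment
# width does not depend on the beam angle alpha. Symbols (f, d_cam, s0, D)
# follow the reconstruction in the text; values are illustrative.
import math

def seg_width(f, d_cam, s0, D, alpha):
    """Projected width via Eq. (4), before simplification."""
    x_near = f * (d_cam + s0 * math.tan(alpha)) / s0
    x_far = f * (d_cam + (s0 + D) * math.tan(alpha)) / (s0 + D)
    return x_near - x_far

f, d_cam, s0, D = 8.0, 60.0, 200.0, 100.0
widths = [seg_width(f, d_cam, s0, D, math.radians(a)) for a in (0, 10, 25)]
closed_form = f * d_cam * (1.0 / s0 - 1.0 / (s0 + D))   # Eq. (6)
assert all(abs(w - closed_form) < 1e-9 for w in widths)
```

The tan α terms cancel exactly in the subtraction, which is the algebraic content of the step from (4) to (6).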
geometry for parallel-beam lasers:

Fig. 6 The geometry of a parallel multi-beam laser sensor with camera offset (beams L0 … L4).

Now the projections of the segments no longer overlap, so the projection of a laser trace can be labeled immediately, based solely on its position in the projection plane.

Fig. 7 The projection geometry for a parallel multi-beam laser sensor with camera offset (baseline B_i, offset d, focal length f).
From the geometry of the projection, in the Oxz axonometric projection, we can write (no optical distortions):

(7) x_max(i) = f·B_i/s₀ ,  x_min(i) = f·B_i/(s₀ + D)

From the Oyz axonometric projection we have:

(8) y_max = f·d/s₀ ,  y_min = f·d/(s₀ + D)

In (7) and (8), d is the offset distance between the camera and the lasers' plane, B_i is the baseline for the i-th laser beam, and f is the focal length of the camera. From (7) we can extract the x extension of the projection of the segment:

(9) Δx(i) = f·B_i·(1/s₀ − 1/(s₀ + D))

From (8) we can extract the y extension of the projection of the segment:

(10) Δy = f·d·(1/s₀ − 1/(s₀ + D))

Again, we obtain formulas quite similar to (2). As expected, the size of the projection of the segment depends linearly on the distance between the camera and the lasers' base. Because the x extension of the projection depends on B_i, the projections can be discriminated. Check in the simulated results (Figs. 8 a, 8 b) the sizes and positions of the projection regions for each laser beam. The sizes are different.
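The discrimination step can be sketched as an interval lookup: each beam's trace is confined to the interval given by the reconstructed Eq. (7), so a detected spot is labeled by the stripe that contains it. The baselines below are made-up values chosen so the stripes are disjoint.

```python
# Discrimination sketch for the 2D+ sensor with parallel beams: each beam's
# trace stays inside [f*B_i/(s0+D), f*B_i/s0] (Eq. (7)), so a spot can be
# labeled by interval lookup. All numeric values are illustrative.

def stripe(f, B_i, s0, D):
    return (f * B_i / (s0 + D), f * B_i / s0)

def label_spot(x, stripes):
    """Return the beam index whose stripe contains image coordinate x."""
    for i, (lo, hi) in enumerate(stripes):
        if lo <= x <= hi:
            return i
    return None   # spot outside every stripe: noise or occlusion

f, s0, D = 8.0, 200.0, 100.0
baselines = [40.0, 70.0, 110.0, 170.0]      # chosen so stripes don't overlap
stripes = [stripe(f, b, s0, D) for b in baselines]
assert label_spot(2.0, stripes) == 1        # lies inside beam 1's stripe
```

The disjointness condition here is B_{i+1}/B_i > (s₀ + D)/s₀, which the chosen baselines satisfy; this is one concrete form of the design restrictions mentioned in Section 2.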
Fig. 8 a The geometry of a 2D+ sensor with parallel beams and camera offset. Fig. 8 b Projection regions for a 2D+ sensor with parallel beams and camera offset.

All the laser beams are parallel in the previous geometry. The effect of changing this geometry to radial laser beams is presented in the next drawings:

Fig. 9 The geometry of a radial multi-beam laser sensor with camera offset (beams L0 … L4).
Fig. 10 The projection geometry (3D) for a radial multi-beam laser sensor with camera offset.

Fig. 11 The projection geometry (axonometric) for a radial multi-beam laser sensor with camera offset (baseline B, offset d, focal length f, beam angle α).
Now the projections of the segments no longer overlap, so the projection of a laser trace can be labeled immediately, based solely on its position in the projection plane. From the geometry of the projection in Oxz we can write (no optical distortions):

(11) x_max = f·(B + s₀·tan α)/s₀ ,  x_min = f·(B + (s₀ + D)·tan α)/(s₀ + D)

That is:

(12) x(z) = f·(B + z·tan α)/z

In (11) and (12), B is the baseline (the distance between the camera and the lasers' base), α is the orientation of the laser beam, and f is the focal length of the camera. From (11) we can extract the size of the projection of the segment:

(13) Δx = f·(B + s₀·tan α)/s₀ − f·(B + (s₀ + D)·tan α)/(s₀ + D)

That is:

(14) Δx = f·B·(1/s₀ − 1/(s₀ + D))

From the geometry of the projection in Oyz we can also write:

(15) y(z) = x(z)·d/(B + z·tan α)

Replacing (12) in (15) we get:
(16) y_max = f·d·(B + s₀·tan α)/(s₀·(B + s₀·tan α)) ,  y_min = f·d·(B + (s₀ + D)·tan α)/((s₀ + D)·(B + (s₀ + D)·tan α))

That is:

(17) y_max = f·d/s₀ ,  y_min = f·d/(s₀ + D)

And finally:

(18) Δy = f·d·(1/s₀ − 1/(s₀ + D))

We obtain again two formulas ((14) and (18)) quite similar to (2). The size of the projection of the segment depends linearly on the baseline and on the camera offset, which are constant. At the same time, again, it does not depend on the orientation of the laser beam. This is mainly because, while the size of the projection would decrease with the laser beam angle, the segment itself increases in size. Check in the simulated results (Figs. 12 a, 12 b) the sizes and positions of the projection regions for each laser beam. All the regions have the same size.

Fig. 12 a The geometry of a 2D+ sensor with radial beams and camera offset. Fig. 12 b Projection regions for a 2D+ sensor with radial beams and camera offset.
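Once a trace is labeled, its x position inside the stripe yields the depth. A minimal sketch, assuming the reconstructed Eq. (12): inverting x = f·(B + z·tan α)/z gives z = f·B/(x − f·tan α). Values are illustrative.

```python
# Depth recovery sketch for the radial 2D+ geometry, inverting the
# reconstructed Eq. (12). All numeric values are illustrative.
import math

def depth_from_x(x, f, B, alpha):
    """Target distance z from the image coordinate x of the laser trace."""
    return f * B / (x - f * math.tan(alpha))

f, B, alpha = 8.0, 60.0, math.radians(10.0)
z_true = 250.0
x = f * (B + z_true * math.tan(alpha)) / z_true   # forward model, Eq. (12)
assert abs(depth_from_x(x, f, B, alpha) - z_true) < 1e-9
```

The round trip is exact algebraically, so any residual error in practice comes from pixel quantization and lens distortion, which the paper's ideal-camera assumption sets aside.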
4. The Geometry of the 3D Sensor (Parallel Laser Planes)

We will now study the geometry of a 3D sensor based on a set of parallel laser planes and a camera. The geometry of such a sensor is presented in Fig. 13. The camera must be placed in such a position that the projections of the segments don't overlap.

Fig. 13 Geometry of a sensor with parallel laser planes (planes L0 … L11, standoff s₀).

Fig. 14 Projection geometry of a sensor with parallel laser planes (planes L0 … L11, focus point, projection plane).
Because the projections of the segments don't overlap, we will be able to discriminate the source of each projection. Analyzing in Fig. 14 the appearance of the projection, we note that each stripe corresponds to a single laser plane. The width of the stripe corresponds to the depth of the scene (distance to the object), while the length of the stripe corresponds to the field of view along the y axis. The main problem seems to be the fact that, because the stripe width decreases for laser planes closer to the camera, the precision of the depth measurement will be rather low for those laser planes.

Fig. 15 Projection geometry of a sensor with parallel laser planes (base plane, laser plane at angle α, camera at distance d_cam, focus point, projection plane, focal length f).

The size of the projection of the segment on the projection plane is:

(19) Δx = f·d_cam/s₀ − f·d_cam/(s₀ + D)

In (19), d_cam is the distance in the base plane from the focus point (camera) to the laser plane.

(20) Δx = f·d_cam·(1/s₀ − 1/(s₀ + D))

If the total angle of the laser plane is 2α, we also have:
(21) y(z) = f·(z·tan α)/z

That is:

(22) y_max = f·tan α

This means that the length of all projection stripes is constant, while their width depends on the distance d_cam, per (20). Check in the simulated results (Figs. 16 a and 16 b) the sizes and positions of the projection regions for each laser plane. The regions have different sizes.

Fig. 16 a The geometry of a 3D sensor with parallel laser planes. Fig. 16 b Projection regions for a 3D sensor with parallel laser planes.

5. The Geometry of the 3D Sensor (Radial Laser Planes)

Analyzing the geometry of a sensor with radial laser planes, we can first compute the field of view, knowing the camera angle 2β:

(23) FOV = (s₀ + D)·tan β .

If there are a total of N laser planes, the width of field available for one single laser plane is:

(24) ΔL = (s₀ + D)·tan β/(N + 1)
Fig. 17 Geometry of a sensor with radial laser planes (lasers L1, L2 at baseline B on each side of the camera, camera angle 2β).

Now we can extract some relationships from the previous figure, by similar triangles:

(25) ΔL/D = B/s₀

Replacing (24) in (25) we get:

(26) (s₀ + D)·tan β/((N + 1)·D) = B/s₀

From the previous formula we can extract the baseline B:

(27) B = s₀·(s₀ + D)·tan β/((N + 1)·D)

The last formula says that, given the camera angle 2β, the standoff s₀, and the depth of field D, we can easily compute the baseline B.
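The design chain (23)-(25) can be sketched as a small calculator. The formulas follow the reconstruction above, and the numeric inputs are illustrative; the final assertion reproduces the paper's later observation that a larger depth of field (same N, same camera angle) forces a smaller baseline.

```python
# Design-rule sketch for the radial-plane 3D sensor, per the reconstructed
# Eqs. (23)-(25): field of view, per-plane width, and baseline.
# All numeric values are illustrative.
import math

def design(s0, D, beta, N):
    fov = (s0 + D) * math.tan(beta)      # Eq. (23)
    dL = fov / (N + 1)                   # Eq. (24): width per laser plane
    B = s0 * dL / D                      # Eq. (25): dL/D = B/s0
    return fov, dL, B

s0, D, beta, N = 200.0, 100.0, math.radians(30.0), 11
fov, dL, B = design(s0, D, beta, N)

# Larger D with the same N and beta gives a smaller baseline.
_, _, B_deeper = design(s0, 300.0, beta, N)
assert B_deeper < B
```

This makes the trade-off explicit: depth of field is bought at the price of baseline, and a smaller baseline degrades triangulation accuracy.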
Fig. 18 The geometry of the 3D sensor with radial laser planes (planes L0, L1, …; lasers at distance B on each side of the camera; focal length f; 2D image).

The lasers are planar and radial, and are placed on both sides of the camera at a distance B. The focal length is f, the standoff is s₀, the depth
of field is D. The geometry was specially designed so that the projections of the segments don't overlap. If D increases while keeping the same number of laser planes and the same camera angle, the baseline decreases, which makes the triangulation more difficult. Note also the way the projection (pixel) plane is used: each stripe corresponds to one laser plane; the length of the stripe gives the field of view, while the stripe width gives the depth information. Only the red laser planes are to be used; the rest are not needed in the recommended configuration (the most precise for depth measurement). For some applications, however, we could choose to drop entirely the left or the right set of laser planes, but use all the depicted planes (red and blue) in the remaining set. Or we could use one laser to generate the odd planes and the other the even planes. Different versions of this geometry were simulated, as shown at the end of this chapter.

Fig. 19 Projection geometry of a sensor with radial laser planes (3D representation; baseline B, focal length f).

The laser plane is in this case rotated around the Oy axis by an angle α. The laser plane has an opening angle of 2β. We will now use the axonometric projection for clarity.
Fig. 20 Projection geometry of a sensor with radial laser planes (axonometric representation; laser plane at angle α with opening 2β, baseline B, focal length f, projection angles φ and θ).

From the previous figure (projection in the Oxz plane) we can extract some relationships:

(28) x₋ = f·tan φ₋ = f·(B + s₀·tan α)/s₀ ,  x₊ = f·tan φ₊ = f·(B + (s₀ + D)·tan α)/(s₀ + D)

From (28) we can extract the position of the segment projection:
(29) x₋ = f·B/s₀ + f·tan α ,  x₊ = f·B/(s₀ + D) + f·tan α

That is, the extension along the x axis of the segment projection is:

(30) Δx = f·(B + s₀·tan α)/s₀ − f·(B + (s₀ + D)·tan α)/(s₀ + D)

That is:

(31) Δx = f·(B/s₀ + tan α) − f·(B/(s₀ + D) + tan α)

And finally:

(32) Δx = f·B·(1/s₀ − 1/(s₀ + D))

This means that the extension along the x axis of the segment projection doesn't depend on α. We get a simpler formula if we replace in (32) the baseline B we already have from formula (27):

(33) Δx = f·tan β/(N + 1)

From the axonometric projection in the Oyz plane we can write:

(34) y₋ = f·tan θ₋ = f·(s₀·tan β)/s₀ ,  y₊ = f·tan θ₊ = f·((s₀ + D)·tan β)/(s₀ + D)

From formula (34) we can finally extract:
(35) y₋ = y₊ = f·tan β

Formula (34) says that the height of the projection stripe doesn't depend on α, which gives the rectangular stripes in the projection image (seen at the bottom of Fig. 21 and depicted again below):

Fig. 21 Projection plane and discrimination procedure (stripes L0 … L11).

Each stripe corresponds to a laser plane. The width of a stripe (along x) corresponds to f·tan β/(N + 1). So, in order to get an accurate depth measurement, we need a high resolution along the x coordinate in the projection/pixel plane. Check in the simulated results (Figs. 22 a, 22 b, 23 a, 23 b, 24 a, 24 b, 25 a, 25 b) the sizes and positions of the projection regions for each laser plane. The regions have identical sizes. Various (interleaved or not) positions of the laser planes are proposed.

Fig. 22 a The geometry of a 3D sensor with radial laser planes (one laser device). Fig. 22 b Projection regions for a 3D sensor with radial laser planes (one laser device).
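The discrimination procedure of Fig. 21 can be sketched end to end: the image is split into equal stripes of the width given by the reconstructed Eq. (33), the stripe index labels the laser plane, and the x position inside the stripe carries the depth. The stripe origin x0 and all numeric values are illustrative assumptions.

```python
# Stripe-based labeling sketch for the radial-plane 3D sensor, per the
# reconstructed Eq. (33). x0 is an assumed image coordinate where the
# first stripe begins; all values are illustrative.
import math

def stripe_width(f, beta, N):
    return f * math.tan(beta) / (N + 1)      # Eq. (33)

def plane_index(x, x0, width):
    """Label a detected trace point by the stripe that contains it."""
    return int((x - x0) // width)

f, beta, N = 8.0, math.radians(30.0), 11
w = stripe_width(f, beta, N)
x0 = 0.0
# A point 2.5 stripe-widths from x0 belongs to plane 2.
assert plane_index(x0 + 2.5 * w, x0, w) == 2
```

Because every stripe has the same width, the per-plane depth resolution is uniform, which is the practical advantage of the radial geometry over the parallel-plane one.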
Fig. 23 a The geometry of a 3D sensor with radial laser planes (two laser devices). Fig. 23 b Projection regions for a 3D sensor with radial laser planes (two laser devices).

Fig. 24 a The geometry of a 3D sensor with radial laser planes (two laser devices). Fig. 24 b Projection regions for a 3D sensor with radial laser planes (two laser devices).

Fig. 25 a The geometry of a 3D sensor with radial laser planes (two laser devices). Fig. 25 b Projection regions for a 3D sensor with radial laser planes (two laser devices).
6. Conclusions

The results of this study are presented next:

1. It is possible to design a sensor based on a multi-planar and radial laser.
2. A sensor based on multi-planar parallel lasers is by far not a better solution.
3. The discrimination of the laser plane in the projection plane is possible and easy to do (the projection is always contained inside a known rectangle in the captured image).
4. Strict design rules have to be followed:
a) The bigger the depth of field D, the smaller the baseline, and consequently the bigger the depth measurement errors.
b) The bigger the number of laser planes, the bigger the required pixel resolution for the same depth resolution.
5. For multi-point, multi-planar, radial laser sensors it is possible to extend (double) the depth measurement precision while keeping the discrimination possible.

Error analysis should be done next, to determine the correlations between the error level, the sensor geometry parameters, and the camera parameters (resolution, optical distortions, etc.).

Received: December 10, 2009
Synaptics Inc., Santa Clara, CA
mihai.bulea@yahoo.com

R E F E R E N C E S

1. Blais F., A Review of 20 Years of Range Sensor Development. Journal of Electronic Imaging, 13, 1 (2004).
2. Jähne B., Haußecker H., Geißler P., Handbook of Computer Vision and Applications. Academic Press, San Diego, Vol. 1: Sensors and Imaging, Chap. (1999).
3. Nguyen H., Blackburn M., A Simple Method for Range Finding via Laser Triangulation. Technical Document 2734, NCCOSC, San Diego, CA, Jan. 1995.
4. Zhang B., Zhang J., Jiang C., Li Z., Zhang W., Precision Laser Triangulation Range Sensor with Double Detectors for Measurement on CMMs. Industrial Optical Sensors for Metrology and Inspection, Vol. 2349, Jan. 1995.
NOI GEOMETRII PENTRU SENZORI LASER 3D CU DISCRIMINAREA PROIECŢIEI

(Rezumat, translated from Romanian)

Laser sensors are used for the precise measurement of distances and rely on determining, in the image supplied by a video camera, the position of the laser trace on the target object (triangulation). For complex targets, multiple laser beams are used, point-like or linear, but labeling them becomes difficult, even impossible for complex objects (with concavities, holes, etc.), because the corresponding laser traces may be missing or discontinuous. Alternative solutions, such as using lasers of different colors, significantly increase the cost of the sensors. The present paper introduces a set of new geometries for laser sensors, geometries with the property that each laser trace is constrained to a unique region of the projection (the video image); since the regions do not intersect, labeling the traces becomes extremely simple. All the presented geometries are accompanied by the corresponding design criteria and are validated by simulation.
More informationThe liquid s index of refraction is. v liquid = nm = = 460 nm 1.38
HMWK 5 Ch 17: P 6, 11, 30, 31, 34, 42, 50, 56, 58, 60 Ch 18: P 7, 16, 22, 27, 28, 30, 51, 52, 59, 61 Ch. 17 P17.6. Prepare: The laser beam is an electromagnetic wave that travels with the speed of light.
More informationAnnouncements. Equation of Perspective Projection. Image Formation and Cameras
Announcements Image ormation and Cameras Introduction to Computer Vision CSE 52 Lecture 4 Read Trucco & Verri: pp. 22-4 Irfanview: http://www.irfanview.com/ is a good Windows utilit for manipulating images.
More informationSTEREOSCOPIC ROBOT VISION SYSTEM
Palinko Oskar, ipl. eng. Facult of Technical Sciences, Department of Inustrial Sstems Engineering an Management, 21 000 Novi Sa, Dositej Obraovic Square 6, Serbia & Montenegro STEREOSCOPIC ROBOT VISION
More information12.4 The Ellipse. Standard Form of an Ellipse Centered at (0, 0) (0, b) (0, -b) center
. The Ellipse The net one of our conic sections we would like to discuss is the ellipse. We will start b looking at the ellipse centered at the origin and then move it awa from the origin. Standard Form
More informationChapter 26 Geometrical Optics
Chapter 26 Geometrical Optics 26.1 The Reflection of Light 26.2 Forming Images With a Plane Mirror 26.3 Spherical Mirrors 26.4 Ray Tracing and the Mirror Equation 26.5 The Refraction of Light 26.6 Ray
More informationComputer Graphics. Jeng-Sheng Yeh 葉正聖 Ming Chuan University (modified from Bing-Yu Chen s slides)
Computer Graphics Jeng-Sheng Yeh 葉正聖 Ming Chuan Universit (modified from Bing-Yu Chen s slides) Viewing in 3D 3D Viewing Process Specification of an Arbitrar 3D View Orthographic Parallel Projection Perspective
More informationTEAMS National Competition Middle School Version Photometry Solution Manual 25 Questions
TEAMS National Competition Middle School Version Photometry Solution Manual 25 Questions Page 1 of 14 Photometry Questions 1. When an upright object is placed between the focal point of a lens and a converging
More informationCV: 3D to 2D mathematics. Perspective transformation; camera calibration; stereo computation; and more
CV: 3D to 2D mathematics Perspective transformation; camera calibration; stereo computation; and more Roadmap of topics n Review perspective transformation n Camera calibration n Stereo methods n Structured
More informationBasic commands using the "Insert" menu: To insert a two-dimensional (2D) graph, use: To insert a three-dimensional (3D) graph, use: Insert > Plot > 3D
Oct 7::3 - GraphsBasics5_ForPrinting.sm Eamples of two- and three-dimensional graphics in Smath Studio --------------------------------------------------------------- B Gilberto E. Urro, October Basic
More information10/5/09 1. d = 2. Range Sensors (time of flight) (2) Ultrasonic Sensor (time of flight, sound) (1) Ultrasonic Sensor (time of flight, sound) (2) 4.1.
Range Sensors (time of flight) (1) Range Sensors (time of flight) (2) arge range distance measurement -> called range sensors Range information: key element for localization and environment modeling Ultrasonic
More informationA FAULT PRIMITIVE BASED MODEL OF ALL STATIC FOUR- CELL COUPLING FAULTS IN RANDOM-ACCESS MEMORIES
BULETINUL INSTITUTULUI POLITEHNIC DIN IAŞI Publicat de Universitatea Tehnică Gheorghe Asachi din Iaşi Tomul LIV (LVIII), Fasc. 1, 2008 Secţia AUTOMATICĂ şi CALCULATOARE A FAULT PRIMITIVE BASED MODEL OF
More informationWhat and Why Transformations?
2D transformations What and Wh Transformations? What? : The geometrical changes of an object from a current state to modified state. Changing an object s position (translation), orientation (rotation)
More informationBARCODE READER MANAGEMENT WITH THE ATMEL MICROCONTROLLER (I)
BULETINUL INSTITUTULUI POLITEHNIC DIN IAŞI Publicat de Universitatea Tehnică Gheorghe Asachi din Iaşi Volumul 63 (67), Numărul 1, 2017 Secţia ELECTROTEHNICĂ. ENERGETICĂ. ELECTRONICĂ BARCODE READER MANAGEMENT
More informationThree-Dimensional Coordinates
CHAPTER Three-Dimensional Coordinates Three-dimensional movies superimpose two slightl different images, letting viewers with polaried eeglasses perceive depth (the third dimension) on a two-dimensional
More informationMachine vision. Summary # 11: Stereo vision and epipolar geometry. u l = λx. v l = λy
1 Machine vision Summary # 11: Stereo vision and epipolar geometry STEREO VISION The goal of stereo vision is to use two cameras to capture 3D scenes. There are two important problems in stereo vision:
More informationA STUDY ON CLASSIFIERS ACCURACY FOR HAND POSE RECOGNITION
BULETINUL INSTITUTULUI POLITEHNIC DIN IAŞI Publicat de Universitatea Tehnică Gheorghe Asachi din Iaşi Tomul LIX (LXIII), Fasc. 2, 2013 SecŃia AUTOMATICĂ şi CALCULATOARE A STUDY ON CLASSIFIERS ACCURACY
More informationACTIVITY: Frieze Patterns and Reflections. a. Is the frieze pattern a reflection of itself when folded horizontally? Explain.
. Reflections frieze pattern? How can ou use reflections to classif a Reflection When ou look at a mountain b a lake, ou can see the reflection, or mirror image, of the mountain in the lake. If ou fold
More informationComputer Vision cmput 428/615
Computer Vision cmput 428/615 Basic 2D and 3D geometry and Camera models Martin Jagersand The equation of projection Intuitively: How do we develop a consistent mathematical framework for projection calculations?
More informationTEAMS National Competition High School Version Photometry Solution Manual 25 Questions
TEAMS National Competition High School Version Photometry Solution Manual 25 Questions Page 1 of 15 Photometry Questions 1. When an upright object is placed between the focal point of a lens and a converging
More informationApproaches to Simulate the Operation of the Bending Machine
Approaches to Simulate the Operation of the Bending Machine Ornwadee Ratanapinunchai Chanunchida Theerasilp Teerut Suksawai Project Advisors: Dr. Krit Chongsrid and Dr. Waree Kongprawechnon Department
More informationLaser sensors. Transmitter. Receiver. Basilio Bona ROBOTICA 03CFIOR
Mobile & Service Robotics Sensors for Robotics 3 Laser sensors Rays are transmitted and received coaxially The target is illuminated by collimated rays The receiver measures the time of flight (back and
More informationImage Transformations & Camera Calibration. Mašinska vizija, 2018.
Image Transformations & Camera Calibration Mašinska vizija, 2018. Image transformations What ve we learnt so far? Example 1 resize and rotate Open warp_affine_template.cpp Perform simple resize
More informationChap 7, 2008 Spring Yeong Gil Shin
Three-Dimensional i Viewingi Chap 7, 28 Spring Yeong Gil Shin Viewing i Pipeline H d fi i d? How to define a window? How to project onto the window? Rendering "Create a picture (in a synthetic camera)
More informationDispersion (23.5) Neil Alberding (SFU Physics) Physics 121: Optics, Electricity & Magnetism Spring / 17
Neil Alberding (SFU Physics) Physics 121: Optics, Electricity & Magnetism Spring 2010 1 / 17 Dispersion (23.5) The speed of light in a material depends on its wavelength White light is a mixture of wavelengths
More informationLaser Trianglation Displacement Measurement Method Using Prism-Based Optical Structure
Laser Trianglation Displacement Measurement Method Using Prism-Based Optical Structure bstract This paper presented a novel method using prismbased optical structure to correct the nonlinear problem of
More informationThree-Dimensional Scanning and Graphic Processing System
Volume 55, Number 1-2, 2014 83 Three-Dimensional Scanning and Graphic Processing System Valentin Dan ZAHARIA, Rodica HOLONEC, Septimiu CRIŞAN, Călin MUREŞAN, Radu MUNTEANU jr. Faculty of Electrical Engineering,
More information2.2 Differential Forms
2.2 Differential Forms With vector analsis, there are some operations such as the curl derivative that are difficult to understand phsicall. We will introduce a notation called the calculus of differential
More informationResearch Article Scene Semantics Recognition Based on Target Detection and Fuzzy Reasoning
Research Journal of Applied Sciences, Engineering and Technolog 7(5): 970-974, 04 DOI:0.906/rjaset.7.343 ISSN: 040-7459; e-issn: 040-7467 04 Mawell Scientific Publication Corp. Submitted: Januar 9, 03
More informationABOUT MANUFACTURING PROCESSES CAPABILITY ANALYSIS
BULETINUL INSTITUTULUI POLITEHNIC DIN IAŞI Publicat de Universitatea Tehnică Gheorghe Asachi din Iaşi Tomul LIX (LXIII), Fasc. 4, 013 Secţia CONSTRUCŢII DE MAŞINI ABOUT MANUFACTURING PROCESSES CAPABILITY
More informationA NOVEL SYSTOLIC ALGORITHM FOR 2-D DISCRETE SINE TRANSFORM
BULETINUL INSTITUTULUI POLITEHNIC DIN IAŞI Publicat de Universitatea Tehnică Gheorghe Asachi din Iaşi Tomul LIX (LXIII), Fasc. 3, 2013 Secţia ELECTROTEHNICĂ. ENERGETICĂ. ELECTRONICĂ A NOVEL SYSTOLIC ALGORITHM
More informationTo Do. Motivation. Demo (Projection Tutorial) What we ve seen so far. Computer Graphics. Summary: The Whole Viewing Pipeline
Computer Graphics CSE 67 [Win 9], Lecture 5: Viewing Ravi Ramamoorthi http://viscomp.ucsd.edu/classes/cse67/wi9 To Do Questions/concerns about assignment? Remember it is due tomorrow! (Jan 6). Ask me or
More informationAnnouncements. Tutorial this week Life of the polygon A1 theory questions
Announcements Assignment programming (due Frida) submission directories are ied use (submit -N Ab cscd88 a_solution.tgz) theor will be returned (Wednesda) Midterm Will cover all o the materials so ar including
More informationPractical Robotics (PRAC)
Practical Robotics (PRAC) A Mobile Robot Navigation System (1) - Sensor and Kinematic Modelling Nick Pears University of York, Department of Computer Science December 17, 2014 nep (UoY CS) PRAC Practical
More informationUser Interface for Optical Multi-Sensorial Measurements at Extruded Profiles
Etruder v User Interface for Optical Multi-Sensorial Measurements at Etruded Profiles Place of Installation A. Weckenmann, J. Bernstein IMEKO XIX WCC 2009, 10.09.2009, Lisbon? v 1 Introduction 2 Bi-Sensorial
More informationMeasurement and Precision Analysis of Exterior Orientation Element Based on Landmark Point Auxiliary Orientation
2016 rd International Conference on Engineering Technology and Application (ICETA 2016) ISBN: 978-1-60595-8-0 Measurement and Precision Analysis of Exterior Orientation Element Based on Landmark Point
More informationRobot Vision: Camera calibration
Robot Vision: Camera calibration Ass.Prof. Friedrich Fraundorfer SS 201 1 Outline Camera calibration Cameras with lenses Properties of real lenses (distortions, focal length, field-of-view) Calibration
More informationPattern Feature Detection for Camera Calibration Using Circular Sample
Pattern Feature Detection for Camera Calibration Using Circular Sample Dong-Won Shin and Yo-Sung Ho (&) Gwangju Institute of Science and Technology (GIST), 13 Cheomdan-gwagiro, Buk-gu, Gwangju 500-71,
More information3D data merging using Holoimage
Iowa State University From the SelectedWorks of Song Zhang September, 27 3D data merging using Holoimage Song Zhang, Harvard University Shing-Tung Yau, Harvard University Available at: https://works.bepress.com/song_zhang/34/
More informationAvailable online at Procedia Engineering 7 (2010) Procedia Engineering 00 (2010)
Available online at www.sciencedirect.com Procedia Engineering 7 (2010) 290 296 Procedia Engineering 00 (2010) 000 000 Procedia Engineering www.elsevier.com/locate/procedia www.elsevier.com/locate/procedia
More informationComputer Vision. Coordinates. Prof. Flávio Cardeal DECOM / CEFET- MG.
Computer Vision Coordinates Prof. Flávio Cardeal DECOM / CEFET- MG cardeal@decom.cefetmg.br Abstract This lecture discusses world coordinates and homogeneous coordinates, as well as provides an overview
More information3B SCIENTIFIC PHYSICS
3B SCIENTIFIC PHYSICS Instruction sheet 06/18 ALF Laser Optics Demonstration Set Laser Optics Supplement Set Page 1 2 3 3 3 4 4 4 5 5 5 6 6 6 7 7 7 8 8 8 9 9 9 10 10 10 11 11 11 12 12 12 13 13 13 14 14
More informationHomogeneous Coordinates. Lecture18: Camera Models. Representation of Line and Point in 2D. Cross Product. Overall scaling is NOT important.
Homogeneous Coordinates Overall scaling is NOT important. CSED44:Introduction to Computer Vision (207F) Lecture8: Camera Models Bohyung Han CSE, POSTECH bhhan@postech.ac.kr (",, ) ()", ), )) ) 0 It is
More informationAutomatic Facial Expression Recognition Using Neural Network
Automatic Facial Epression Recognition Using Neural Network Behrang Yousef Asr Langeroodi, Kaveh Kia Kojouri Electrical Engineering Department, Guilan Universit, Rasht, Guilan, IRAN Electronic Engineering
More informationRefraction at a single curved spherical surface
Refraction at a single curved spherical surface This is the beginning of a sequence of classes which will introduce simple and complex lens systems We will start with some terminology which will become
More informationRectification and Disparity
Rectification and Disparity Nassir Navab Slides prepared by Christian Unger What is Stereo Vision? Introduction A technique aimed at inferring dense depth measurements efficiently using two cameras. Wide
More informationCS F-07 Objects in 2D 1
CS420-2010F-07 Objects in 2D 1 07-0: Representing Polgons We want to represent a simple polgon Triangle, rectangle, square, etc Assume for the moment our game onl uses these simple shapes No curves for
More informationPrecision Peg-in-Hole Assembly Strategy Using Force-Guided Robot
3rd International Conference on Machiner, Materials and Information Technolog Applications (ICMMITA 2015) Precision Peg-in-Hole Assembl Strateg Using Force-Guided Robot Yin u a, Yue Hu b, Lei Hu c BeiHang
More informationTo Do. Demo (Projection Tutorial) Motivation. What we ve seen so far. Outline. Foundations of Computer Graphics (Fall 2012) CS 184, Lecture 5: Viewing
Foundations of Computer Graphics (Fall 0) CS 84, Lecture 5: Viewing http://inst.eecs.berkele.edu/~cs84 To Do Questions/concerns about assignment? Remember it is due Sep. Ask me or TAs re problems Motivation
More informationTransformations in the Plane - Activity 1 Reflections in axes and an oblique line.
Name: Class: p 5 Maths Helper Plus Resource Set. Copyright 00 Bruce A. Vaughan, Teachers Choice Software Transformations in the Plane - Activity Reflections in axes and an oblique line. ) On the diagram
More information[ ] [ ] Orthogonal Transformation of Cartesian Coordinates in 2D & 3D. φ = cos 1 1/ φ = tan 1 [ 2 /1]
Orthogonal Transformation of Cartesian Coordinates in 2D & 3D A vector is specified b its coordinates, so it is defined relative to a reference frame. The same vector will have different coordinates in
More information