Automatic camera to laser calibration for high accuracy mobile mapping systems using INS


Werner Goeman* a,b, Koen Douterloigne a, Sidharta Gautama a

a Department of Telecommunications and Information Processing (TELIN), Ghent University, St.-Pietersnieuwstraat 41, B-9000 Ghent, Belgium; b Geodata & Mobile Mapping group, Grontmij Belgium NV, Meersstraat 138, B-9000 Ghent, Belgium

ABSTRACT

A mobile mapping system (MMS) is a mobile multi-sensor platform developed by the geoinformation community to support the acquisition of huge amounts of geodata in the form of georeferenced high resolution images and dense laser clouds. Since data fusion and data integration techniques are increasingly able to combine the complementary strengths of different sensor types, the external calibration of a camera to a laser scanner is a common prerequisite on today's mobile platforms. The calibration methods, however, are often poorly documented, almost always time-consuming, demand expert knowledge, and often require a carefully constructed calibration environment. A new methodology is presented that provides a high quality external calibration of a pinhole camera to a laser scanner and is automatic, easy to perform, robust and foolproof. The method uses a portable, standard ranging pole which needs to be positioned on a known ground control point. For calibration, a well-studied absolute orientation problem needs to be solved. In many cases, the camera and the laser sensor are each calibrated in relation to the INS system, so the transformation from camera to laser contains the accumulated error of each sensor in relation to the INS. Here, the calibration of the camera is performed in relation to the laser frame, using the time synchronization between the sensors for data association. In this study, the use of the relative inertial movement is explored to collect more useful calibration data.
This results in a better intersensor calibration, allowing better coloring of the point clouds and a more accurate depth mask for images, especially at the edges of objects in the scene.

Keywords: mobile mapping, indirect calibration, laser 3D monoplotting, external calibration, relative calibration, camera calibration.

1. INTRODUCTION

A land-based mobile mapping system (MMS) is a state-of-the-art system to collect visual and non-visual information from the environment on a large scale, as fast and completely as possible. In order to accomplish this challenging task, the mobile platform is equipped with several complementary sensors to capture as much information as possible about the surrounding scene. The current technology makes it possible to drive the platform at high velocities through the requested regions, capturing a huge amount of data to be processed offline for a variety of purposes such as asset management, road inspection, traffic sign detection, etc. A modern MMS generates highly redundant sensor data, which is exploited by customized data fusion algorithms with near real-time capabilities [1]. This gives the end user more and more feedback about completeness and quality assurance. However, the quality of the extracted data depends strongly on the quality of the calibration of the sensors. Two sets of parameters can be identified for each sensor: intrinsic and extrinsic. Both parameter sets can be estimated separately and have already been studied extensively. In many experiments, extrinsic calibration parameters are treated as already known through pre-calibration, while few details can be found on the methodology used. It is, however, common knowledge that the process of extrinsic calibration is labor intensive. For this reason, off-the-shelf platforms such as [4][5][6] try to eliminate the calibration step for the end user by providing an all-in-one sensor package which can be mounted on a mobile platform.
This eliminates the flexibility of the sensor configuration, preventing sensor adaptations customized for more specialized surveying and computer vision tasks, such as road inspection, surveys on rails, offshore operations, infrared heat inspection, visual odometry, etc.

*Werner.goeman@grontmij.be

To fix these problems we present an extrinsic camera to laser calibration methodology which is automatic, robust, portable and easy to perform. This way, the calibration can be performed without time-consuming procedures. There is no need for a target site or impractical checkerboards under controlled lighting conditions, as presented in [2][9][10]. Only a one-dimensional reference model such as a standard ranging pole is needed, on which some reflectivity patches are added. The ranging pole can easily be set up to 5 m in length while staying practical and portable compared to the checkerboards used in [2][11]. The focus of this work is on extrinsic camera calibration relative to the laser scanner, i.e. calculating the relative rotation and translation between the laser scanner and the pinhole camera. The INS is used to collect more useful data, thus improving the intersensor calibration. This methodology is preferred when fusing the color information of the camera with the laser range data, resulting in a colored laser point cloud and image depth masks with better accuracy. The procedure is automated and can be performed by a non-expert, reducing the operational cost. In previous work [8], a methodology was presented showing how direct external camera calibration can be performed for a MMS using automated extraction of a one-dimensional reference object in the field of view of the camera. Here, automated relative camera calibration is considered, based on the same feature extraction presented in [8]. The experiments show how the relative process performs better for coloring laser point clouds and 3D monoplotting.

2. CALIBRATION METHODOLOGY

Laser scanner and camera direct georeferencing

Geometric camera calibration can be seen as the combination of intrinsic and extrinsic calibration, resulting in a set of intrinsic and a set of extrinsic parameters respectively.
The intrinsic parameters describe the characteristics inherent to the sensor itself, such as focal length and principal point, and do not change over time as long as the internal state of the sensor remains unchanged. Determining the camera's intrinsic parameters and the lens distortion can be done offline, separately from the external calibration, using e.g. the techniques in [3][9]. In this work, the internal calibration parameter set is considered known. This is a reasonable requirement, since experiments show that the internal parameter set does not degenerate over a period of several years when used intensively on a MMS. The extrinsic parameters, on the other hand, describe the relative rotation and translation between different sensor coordinate systems, and need to be recalculated every time the orientation or lever arm of the sensor in relation to the reference frame is changed. Direct georeferencing is defined as the direct measurement of the extrinsic orientation parameters using positioning and orientation sensors, such as the Global Navigation Satellite System (GNSS) and the inertial navigation system (INS). It is the preferred and more recent approach to determine the extrinsic parameters of a camera in relation to the GPS/INS positioning system, i.e. lever arm and boresight calibration. This way, the parameters of each image are determined directly by a combination of satellite and inertial navigation system measurements. For line and laser scanner systems, on the other hand, direct georeferencing is indispensable, because extrinsic orientation information is needed for every single measurement. For camera direct georeferencing, a pinhole model is used. This way the projection from 3D space to the image plane can be described by:

s p = A [R | T] P_W (1)

where P_W = (X_W, Y_W, Z_W, 1)^T are the homogeneous coordinates of a 3D point in the world coordinate space W, p are the homogeneous coordinates of the projection point in pixels, and s is a scale factor.
A is called the camera matrix, or matrix of intrinsic parameters, and [R | T] is the matrix gathering the extrinsic parameters (rotation and translation) of the camera. In (1), p is not the actual observed image point, since virtually all imaging devices introduce a certain amount of nonlinear distortion. Among the nonlinear distortions, radial distortion, which increases along the radial direction from the center of distortion, has been recognized to be the most severe. In this paper it is assumed that the intrinsic parameter set of the camera is known from an offline calibration process such as the camera calibration algorithm presented in [10]. The intrinsic parameters define the projection from a point in the camera frame to the pixel coordinates in the image plane.
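As an illustrative sketch of this model (not the exact implementation used on the platform; the function name, the example intrinsic values and the simple two-coefficient radial distortion polynomial are assumptions), the projection of equation (1) followed by radial distortion can be written as:

```python
import numpy as np

def project_point(P_w, A, R, t, k1=0.0, k2=0.0):
    """Project a 3D world point to pixel coordinates with a pinhole model.

    A    : 3x3 intrinsic camera matrix [[fx, 0, cx], [0, fy, cy], [0, 0, 1]]
    R, t : extrinsic rotation (3x3) and translation (3,) from world to camera
    k1,k2: radial distortion coefficients (illustrative polynomial model)
    """
    P_c = R @ P_w + t                        # world frame -> camera frame
    x, y = P_c[0] / P_c[2], P_c[1] / P_c[2]  # normalized image coordinates
    r2 = x * x + y * y
    d = 1.0 + k1 * r2 + k2 * r2 * r2         # radial distortion factor
    u = A[0, 0] * x * d + A[0, 2]            # apply the intrinsic parameters
    v = A[1, 1] * y * d + A[1, 2]
    return np.array([u, v])

# Example with assumed intrinsics and an undistorted, camera-aligned world frame
A = np.array([[800.0, 0.0, 640.0], [0.0, 800.0, 480.0], [0.0, 0.0, 1.0]])
p = project_point(np.array([1.0, 0.5, 10.0]), A, np.eye(3), np.zeros(3))
```

With the identity extrinsics above, the point (1, 0.5, 10) lands at pixel (720, 520), i.e. one tenth of a focal length away from the principal point in each direction.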

Figure 1. Left: camera frame O_C in relation to INS frame O_INS during direct georeferencing. Right: camera frame O_C at time t_2 capturing point P. The laser scanner O_L hits point P at time t_1.

The transformation from the world coordinate frame to the camera frame is defined by the following equations:

P_INS = R_W^INS P_W + T_W^INS (2)

P_C = R_INS^C P_INS + T_INS^C (3)

R(α, β, γ) = R_z(γ) R_y(β) R_x(α) (4)

T = [t_x, t_y, t_z]^T (5)

P_C, P_INS and P_W are the coordinates of the same point in, respectively, the camera frame, the GPS/INS frame and the world frame. Each rotation is defined by three parameters as shown in (4), and each translation is defined by three parameters as shown in (5). R_W^INS is the rotation from the world frame to the GPS/INS frame. This rotation is known from the positioning system, as is the position of the world reference frame origin in the INS frame, represented by the translation T_W^INS. Thus, the position and orientation of the camera frame are defined by the six remaining parameters (t_x, t_y, t_z, α, β, γ) of translation and rotation. These parameters can be estimated using 2D-3D correspondences. For this, a set of object points, their corresponding image projections, the camera matrix and the distortion parameters of the camera are required [12]. The extrinsic camera parameter set is calculated by minimizing the reprojection error, i.e. the sum of the squared distances between the observed image projections and the projected object points. The laser ranger direct georeferencing problem is less complex and reduces to an absolute orientation problem, since 3D measurements are acquired directly by this active sensor in the sensor frame. Here, the extrinsic parameter set can be calculated using 3D-3D correspondences. A comparison of four major algorithms can be found in [13].

The mobile platform

The platform used here is the MMS constructed by Grontmij Belgium NV. The main advantage of this platform is its flexibility for use in a wide range of applications. The navigation sensor is an Applanix POS LV 420 tightly coupled GPS/INS system.
The INS makes it possible to maintain the positional accuracy during GPS outages due to low satellite visibility, occurring in urban canyons, tunnels, etc.
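The absolute orientation problem mentioned above, in which the laser extrinsics are estimated from 3D-3D correspondences, admits closed-form least-squares solutions. A minimal sketch, using an SVD-based (Kabsch/Horn-style) estimator rather than necessarily one of the four algorithms compared in [13]:

```python
import numpy as np

def absolute_orientation(P_src, P_dst):
    """Closed-form least-squares estimate of R, t with P_dst ~ R @ P_src + t.

    P_src, P_dst : (N, 3) arrays of corresponding 3D points in the two frames.
    """
    c_src = P_src.mean(axis=0)                 # centroids of both point sets
    c_dst = P_dst.mean(axis=0)
    H = (P_src - c_src).T @ (P_dst - c_dst)    # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Correction term to guarantee a proper rotation (no reflection)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = c_dst - R @ c_src
    return R, t
```

In a calibration context, P_src would hold the marker positions in the sensor frame and P_dst the known world coordinates of the same markers.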

Figure 2. The mobile platform from Grontmij Belgium NV mounted on a car, a boat and a train.

The platform can handle up to 6 synchronized cameras and 4 laser scanners, a ground penetrating radar, an infrared scanner, a multibeam, and various other types of sensors. The cameras and laser scanners can be oriented according to the application requirements. The navigation data is processed offline together with the best available RINEX data. All sensors are synchronized with the navigation sensor for georeferencing. In the standard configuration setup, direct camera calibration is used following the methodology presented in [8]. The accuracy of absolute measurements using triangulation is given in Table 1.

Table 1. Performance of the MMS, showing the average error and standard deviation on absolute photogrammetric measurements per distance range. Columns: distance (m), average error (cm), standard deviation (cm).

The laser scanner mounted on the platform is a Riegl LMS-Q120. It measures points per second in a view angle of 80 degrees with an accuracy of 2.5 cm. In the standard configuration, the laser sensor was also calibrated using direct georeferencing. This results in absolute accuracies better than 4 cm up to 30 m distance from the scanner.

Relative orientation parameter set between pinhole camera and laser scanner

When both laser scanner and pinhole camera are calibrated using direct georeferencing, each sensor's extrinsic calibration parameter set incorporates the error introduced by the inevitably limited accuracy and noise of the positioning sensors. Each point measurement is an extrapolation of the position of the sensor centers, which makes each measurement highly sensitive to any type of error or inaccuracy during calibration. In some applications, a dedicated camera-laser sensor couple is used to augment the captured data. By integrating the sensor data, a depth mask can be added to the images or the intensity-based grayscale laser point clouds can be converted to full RGB point clouds.
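The coloring step can be sketched as follows. This is an illustrative, hypothetical implementation assuming known world-to-camera extrinsics, with nearest-pixel lookup and no occlusion handling, not the production pipeline:

```python
import numpy as np

def colorize_cloud(points_w, image, A, R_cw, t_cw):
    """Assign RGB values from an image to laser points, given the camera pose.

    points_w     : (N, 3) laser points in the world frame
    image        : (H, W, 3) RGB image
    A            : 3x3 intrinsic matrix; (R_cw, t_cw): world-to-camera extrinsics
    Returns (N, 3) colors; points projecting outside the image stay [0, 0, 0].
    """
    colors = np.zeros_like(points_w)
    H, W = image.shape[:2]
    for i, P in enumerate(points_w):
        P_c = R_cw @ P + t_cw                # world frame -> camera frame
        if P_c[2] <= 0:                      # skip points behind the camera
            continue
        u = int(round(A[0, 0] * P_c[0] / P_c[2] + A[0, 2]))
        v = int(round(A[1, 1] * P_c[1] / P_c[2] + A[1, 2]))
        if 0 <= u < W and 0 <= v < H:        # inside the image bounds
            colors[i] = image[v, u]
    return colors
```

The same projection, run in the opposite direction, yields the per-pixel depth mask mentioned above.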
In this case it is not desirable to use direct georeferencing for both sensors, since the data fusion process would then be subject to the noise on the positioning data twice. To tackle this problem, the calibration of a pinhole camera in relation to a 2D laser scanner is studied here. The method explained above for calibrating the camera in relation to the INS, Figure 1 (left), can be repeated, but now the object points are collected in the laser sensor frame. Equation (3) is then replaced by:

P_C = R_L^C P_L + T_L^C (6)

with P_L a marker reflection laser point in the laser scanner frame, and (R_L^C, T_L^C) the unknown extrinsic camera parameter set. Together with equation (1), the correspondence between a 3D laser point and a 2D image pixel is made. However, there is a practical problem, since the detection of the laser marker by the laser scanner and the capturing of the image of the ranging pole by the camera have to be synchronized. Both sensors must see the marker at the exact same moment in time in order to use the 2D-3D correspondence for calibration purposes. When using a 2D laser scanner, this is a very impractical condition. Not only does the camera have to look in the same direction as the laser scanner, but it is also almost impossible to make sure the camera was not moving between the time when the laser scanner hits the marker, t = t_1 in Figure 1 (right), and the time a frame was captured by the camera, t = t_2.
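Combining equations (6) and (1) gives the reprojection residual that the relative calibration minimizes over all marker correspondences. A minimal sketch, with hypothetical names and the distortion terms omitted for brevity:

```python
import numpy as np

def reprojection_residual(P_L, p_obs, R_LC, T_LC, A):
    """Residual between an observed marker pixel p_obs and the projection of
    the corresponding laser point P_L (laser frame), per equations (6) and (1).

    R_LC, T_LC : candidate laser-to-camera rotation (3x3) and translation (3,)
    A          : 3x3 intrinsic camera matrix (distortion omitted here)
    """
    P_C = R_LC @ P_L + T_LC                      # laser frame -> camera frame
    u = A[0, 0] * P_C[0] / P_C[2] + A[0, 2]      # pinhole projection
    v = A[1, 1] * P_C[1] / P_C[2] + A[1, 2]
    return np.array([u, v]) - p_obs

# A marker on the optical axis of a perfectly aligned camera gives zero residual
A = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
r = reprojection_residual(np.array([0.0, 0.0, 5.0]), np.array([320.0, 240.0]),
                          np.eye(3), np.zeros(3), A)
```

Stacking these residuals for all correspondences and feeding them to a nonlinear least-squares solver yields the six parameters of (R_L^C, T_L^C).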

To solve this problem, the inertial navigation system is used to compensate for the time difference. Since the laser scanner is already calibrated accurately using direct georeferencing, the relative motion of the laser scanner can be calculated. In Figure 1 (right), the marker is hit by the laser scanner at time t_1. The position of the marker in the laser scanner frame is known at that time, but the image is taken at time t_2, when the marker has a different position relative to the camera. To use a 2D-3D correspondence at time t_2, the marker position in the laser frame at t_2 is needed, which can easily be calculated using the relative inertial movement measured by the INS between times t_1 and t_2 together with the laser calibration.

Data acquisition

The procedure to collect data for calibration has to be easy to perform and practical. Therefore, a standard ranging pole is used as a reference. The pole is set up and vertically leveled at a known ground control point. With the mobile platform a small survey around the pole is performed. This results in a set of georeferenced images and a 3D laser point cloud in which the pole is visible from different viewing angles and at different distances from the camera and the laser sensor. The ranging pole can be seen as a one-dimensional sequence of colored segments in the image, as shown in Figure 3. Each color transition on the ranging pole can be used as a reference point of which the world coordinates are known. On the ranging pole some markers can be added which can be used as high reflectivity reference points in the laser cloud.

Figure 3: Left: one-dimensional reference model. Middle: top-down view of a typical calibration track, logged by an Applanix POS LV 420, where the MMS drives around the central ranging pole, marked by a cross. The pattern of overlapping ellipses enables the MMS to see the ranging pole from all angles. The overlaid squares measure about 5 x 5 meter each. The ranging pole is viewed from a distance of about 3 to 15 meter.
Right: the configuration used for the experiment, with camera C and laser L.

3. EXPERIMENT

For the experiment, a MMS with the configuration shown in Figure 3 was constructed. A 2 MP Point Grey Scorpion color camera with a focal length of 4.8 mm and a Riegl LMS-Q120 laser scanner were mounted on the platform. The ranging pole was leveled on a known GCP and a survey capturing and scanning the pole from different viewing angles was performed. The extrinsic parameter sets of both sensors were calculated using direct georeferencing. It is important to note that, in order to calibrate the camera in relation to the laser scanner, it is not necessary to calibrate the camera using direct georeferencing; it is calculated here only to be able to compare the results. To calibrate the camera relative to the laser scanner, a set of 2D-3D laser point to image pixel correspondences is needed, as explained above. The automatic ranging pole extraction tool described in [8] was used to collect useful frames from which the positions of the reflecting markers in the image can be automatically detected, based on the invariance of the cross ratio under perspective transformations. Each of these 2D image coordinates of the markers is linked to the nearest (in time) laser point reflection in the laser point cloud, which is a 3D point in the laser frame. This set of 2D and 3D corresponding coordinates was used to calibrate the camera relative to the laser scanner. The maximum allowed time difference between laser marker reflection and frame capture was set to 2 s. The data set consists of frames selected from the survey, making sure the ranging pole is in clear view of the camera. The time tag of each frame needs to be within a 2 second interval of the time tag of at least 1 marker laser

reflection point. Using the INS inertial data, a 2D to 3D correspondence can then be made using data with different time tags. Based on viewing angle, a set of 36 marker reflection to image correspondences was collected and used for the camera to laser calibration.

Figure 4. An example of the calibration result. The white plus mark shows the ground truth, representing the centre of the marker on the ranging pole. The black cross is the back projection of the 3D marker detected in the laser frame using the relative camera to laser calibration. The black o-mark is the back projection of the 3D marker using direct georeferencing for the camera calibration.

To analyze the results, the relative camera calibration was compared to the camera calibration using direct georeferencing in Table 2. From the survey, 331 frames were available showing the ranging pole. A set of 178 validation frames was selected with a time tag falling within the 2 second interval of the nearest marker reflection. These frames show the ranging pole from different viewing angles, but were not used in the calibration process. For each of these frames, the ground truth was created by manually selecting the image coordinates of the markers on the ranging pole. The back projection of the 3D coordinates of the markers in these frames can now be compared to the ground truth using both the relative camera calibration and the calibration using direct georeferencing. A result of this procedure is given in Figure 4.

Table 2. Results of the back projection error of the relative camera to laser calibration and of the calibration using direct georeferencing. Rows: direct georeferencing and relative camera to laser calibration using the INS, each for the calibration set and the validation set. Columns: number of frames, number of points, average error (pixels), maximum error (pixels).

When using direct georeferencing, an average pixel error of 5.4 was observed for the back projected markers in the images.
The maximum pixel error was 11.8 pixels for the validation set. The method presented here reduces the maximum back projection error to 7.4 pixels, and the mean error to 4.0 pixels. The remaining error is mainly explained by the accuracy of the laser and the fact that the marker size is bigger than one pixel.

4. CONCLUSIONS

In this paper, a new methodology was presented to improve and automate the extrinsic camera calibration in relation to a 2D laser range sensor. The calibration process is easy to perform, fast and automatic, and uses only a standard ranging pole on a known ground control point. This eliminates the need for the calibration grids or target sites used in other calibration methods. The use of inertial data makes it possible to collect more data by eliminating the need to capture the image and the marker reflection at exactly the same moment in time. A validation experiment shows a significant improvement in the maximum and mean back projection error of 3D marker points in the image when comparing an extrinsic camera calibration using the presented method to a camera calibration using direct georeferencing.

5. ACKNOWLEDGMENTS

This work was made possible with the financial support of iMinds. The authors would also like to thank Grontmij Belgium NV for the use of their mobile mapping data.

REFERENCES

[1] Brzezinska, D., "Tutorial: Mobile mapping technology: paradigm shift and future trends," 7th International Symposium on Mobile Mapping Technology (2011)
[2] Zhang, Q., "Extrinsic calibration of a camera and laser rangefinder," IEEE IROS (2004)
[3] Heikkilä, J., Silvén, O., "A four-step camera calibration procedure with implicit image correction," IEEE Conference on Computer Vision and Pattern Recognition (1997)
[4] MDL, Dynascan 3D mobile mapper, (11 July 2012)
[5] TOPCON, IP-S2 HD Mobile Mapping System, (11 July 2012)
[6] GeoVISAT, Data acquisition, (11 July 2012)
[7] Hassan, T., El-Sheimy, N., "System calibration of land-based mobile mapping systems," ISPRS (2008)
[8] Goeman, W., Douterloigne, K., Bogaert, P., Pires, R., Gautama, S., "Automatic and robust external camera calibration for high accuracy mobile mapping," Proc. SPIE Optics and Photonics, San Diego, USA (2012)
[9] Zhang, Z., "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence 22 (2000)
[10] Scheller, S., Westfeld, P., Ebersbach, D., "Calibration of a mobile mapping camera system with photogrammetric methods," Proc. ISPRS Vol. XXXVI, 5th International Conference on Mobile Mapping Technology (2007)
[11] Geiger, A., Moosmann, F., Car, O., Schuster, B., "Automatic camera and range sensor calibration using a single shot," IEEE International Conference on Robotics and Automation (2012)
[12] OpenCV, opencv.willowgarage.com/documentation/camera_calibration_and_3d_reconstruction.html
[13] Eggert, D. W., Lorusso, A., Fisher, R. B., "Estimating 3-D rigid body transformations: a comparison of four major algorithms," Machine Vision and Applications Vol. 9, Issue 5-6 (March 1997)


More information

REGISTRATION OF AIRBORNE LASER DATA TO SURFACES GENERATED BY PHOTOGRAMMETRIC MEANS. Y. Postolov, A. Krupnik, K. McIntosh

REGISTRATION OF AIRBORNE LASER DATA TO SURFACES GENERATED BY PHOTOGRAMMETRIC MEANS. Y. Postolov, A. Krupnik, K. McIntosh REGISTRATION OF AIRBORNE LASER DATA TO SURFACES GENERATED BY PHOTOGRAMMETRIC MEANS Y. Postolov, A. Krupnik, K. McIntosh Department of Civil Engineering, Technion Israel Institute of Technology, Haifa,

More information

POINT CLOUD ANALYSIS FOR ROAD PAVEMENTS IN BAD CONDITIONS INTRODUCTION

POINT CLOUD ANALYSIS FOR ROAD PAVEMENTS IN BAD CONDITIONS INTRODUCTION POINT CLOUD ANALYSIS FOR ROAD PAVEMENTS IN BAD CONDITIONS Yoshiyuki Yamamoto, Associate Professor Yasuhiro Shimizu, Doctoral Student Eiji Nakamura, Professor Masayuki Okugawa, Associate Professor Aichi

More information

Dealing with Scale. Stephan Weiss Computer Vision Group NASA-JPL / CalTech

Dealing with Scale. Stephan Weiss Computer Vision Group NASA-JPL / CalTech Dealing with Scale Stephan Weiss Computer Vision Group NASA-JPL / CalTech Stephan.Weiss@ieee.org (c) 2013. Government sponsorship acknowledged. Outline Why care about size? The IMU as scale provider: The

More information

Sensor Fusion: Potential, Challenges and Applications. Presented by KVH Industries and Geodetics, Inc. December 2016

Sensor Fusion: Potential, Challenges and Applications. Presented by KVH Industries and Geodetics, Inc. December 2016 Sensor Fusion: Potential, Challenges and Applications Presented by KVH Industries and Geodetics, Inc. December 2016 1 KVH Industries Overview Innovative technology company 600 employees worldwide Focused

More information

Graph based INS-camera calibration

Graph based INS-camera calibration Graph based INS-camera calibration D. Bender a,b, M. Schikora a,b, J. Sturm b, D. Cremers b a Dept. Sensor Data and Information Fusion, Fraunhofer FKIE, Wachtberg, Germany b Computer Vision Group, Technical

More information

A Comparison of Laser Scanners for Mobile Mapping Applications

A Comparison of Laser Scanners for Mobile Mapping Applications A Comparison of Laser Scanners for Mobile Mapping Applications Craig Glennie 1, Jerry Dueitt 2 1 Department of Civil & Environmental Engineering The University of Houston 3605 Cullen Boulevard, Room 2008

More information

All human beings desire to know. [...] sight, more than any other senses, gives us knowledge of things and clarifies many differences among them.

All human beings desire to know. [...] sight, more than any other senses, gives us knowledge of things and clarifies many differences among them. All human beings desire to know. [...] sight, more than any other senses, gives us knowledge of things and clarifies many differences among them. - Aristotle University of Texas at Arlington Introduction

More information

Fully Automatic Endoscope Calibration for Intraoperative Use

Fully Automatic Endoscope Calibration for Intraoperative Use Fully Automatic Endoscope Calibration for Intraoperative Use Christian Wengert, Mireille Reeff, Philippe C. Cattin, Gábor Székely Computer Vision Laboratory, ETH Zurich, 8092 Zurich, Switzerland {wengert,

More information

3D Sensing. 3D Shape from X. Perspective Geometry. Camera Model. Camera Calibration. General Stereo Triangulation.

3D Sensing. 3D Shape from X. Perspective Geometry. Camera Model. Camera Calibration. General Stereo Triangulation. 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction 3D Shape from X shading silhouette texture stereo light striping motion mainly

More information

Rigid Body Motion and Image Formation. Jana Kosecka, CS 482

Rigid Body Motion and Image Formation. Jana Kosecka, CS 482 Rigid Body Motion and Image Formation Jana Kosecka, CS 482 A free vector is defined by a pair of points : Coordinates of the vector : 1 3D Rotation of Points Euler angles Rotation Matrices in 3D 3 by 3

More information

Camera Models and Image Formation. Srikumar Ramalingam School of Computing University of Utah

Camera Models and Image Formation. Srikumar Ramalingam School of Computing University of Utah Camera Models and Image Formation Srikumar Ramalingam School of Computing University of Utah srikumar@cs.utah.edu VisualFunHouse.com 3D Street Art Image courtesy: Julian Beaver (VisualFunHouse.com) 3D

More information

Camera Models and Image Formation. Srikumar Ramalingam School of Computing University of Utah

Camera Models and Image Formation. Srikumar Ramalingam School of Computing University of Utah Camera Models and Image Formation Srikumar Ramalingam School of Computing University of Utah srikumar@cs.utah.edu Reference Most slides are adapted from the following notes: Some lecture notes on geometric

More information

Handheld Augmented Reality. Reto Lindegger

Handheld Augmented Reality. Reto Lindegger Handheld Augmented Reality Reto Lindegger lreto@ethz.ch 1 AUGMENTED REALITY 2 A Definition Three important characteristics: Combines real and virtual environment Interactive in real-time Registered in

More information

POSITIONING A PIXEL IN A COORDINATE SYSTEM

POSITIONING A PIXEL IN A COORDINATE SYSTEM GEOREFERENCING AND GEOCODING EARTH OBSERVATION IMAGES GABRIEL PARODI STUDY MATERIAL: PRINCIPLES OF REMOTE SENSING AN INTRODUCTORY TEXTBOOK CHAPTER 6 POSITIONING A PIXEL IN A COORDINATE SYSTEM The essential

More information

An Automatic Method for Adjustment of a Camera Calibration Room

An Automatic Method for Adjustment of a Camera Calibration Room An Automatic Method for Adjustment of a Camera Calibration Room Presented at the FIG Working Week 2017, May 29 - June 2, 2017 in Helsinki, Finland Theory, algorithms, implementation, and two advanced applications.

More information

MERGING POINT CLOUDS FROM MULTIPLE KINECTS. Nishant Rai 13th July, 2016 CARIS Lab University of British Columbia

MERGING POINT CLOUDS FROM MULTIPLE KINECTS. Nishant Rai 13th July, 2016 CARIS Lab University of British Columbia MERGING POINT CLOUDS FROM MULTIPLE KINECTS Nishant Rai 13th July, 2016 CARIS Lab University of British Columbia Introduction What do we want to do? : Use information (point clouds) from multiple (2+) Kinects

More information

Creating a distortion characterisation dataset for visual band cameras using fiducial markers.

Creating a distortion characterisation dataset for visual band cameras using fiducial markers. Creating a distortion characterisation dataset for visual band cameras using fiducial markers. Robert Jermy Council for Scientific and Industrial Research Email: rjermy@csir.co.za Jason de Villiers Council

More information

University of Technology Building & Construction Department / Remote Sensing & GIS lecture

University of Technology Building & Construction Department / Remote Sensing & GIS lecture 5. Corrections 5.1 Introduction 5.2 Radiometric Correction 5.3 Geometric corrections 5.3.1 Systematic distortions 5.3.2 Nonsystematic distortions 5.4 Image Rectification 5.5 Ground Control Points (GCPs)

More information

APPLANIX PRODUCTS AND SOLUTIONS FOR MOBILE MAPPING AND POSITIONING CAPTURE EVERYTHING. POSPac MMS HYDRO 08 November 2008

APPLANIX PRODUCTS AND SOLUTIONS FOR MOBILE MAPPING AND POSITIONING CAPTURE EVERYTHING. POSPac MMS HYDRO 08 November 2008 APPLANIX CAPTURE EVERYTHING POSPac MMS HYDRO 08 November 2008 Accurate Post Processed Position & Orientation For Modern Port Survey Operations. Increasing use of sonar & laser survey equipment with sub-centimetre

More information

Depth Range Accuracy for Plenoptic Cameras

Depth Range Accuracy for Plenoptic Cameras Depth Range Accuracy for Plenoptic Cameras Nuno Barroso Monteiro Institute for Systems and Robotics, University of Lisbon, Portugal Institute for Systems and Robotics, University of Coimbra, Portugal Simão

More information

Vehicle Localization. Hannah Rae Kerner 21 April 2015

Vehicle Localization. Hannah Rae Kerner 21 April 2015 Vehicle Localization Hannah Rae Kerner 21 April 2015 Spotted in Mtn View: Google Car Why precision localization? in order for a robot to follow a road, it needs to know where the road is to stay in a particular

More information

Chapter 1: Overview. Photogrammetry: Introduction & Applications Photogrammetric tools:

Chapter 1: Overview. Photogrammetry: Introduction & Applications Photogrammetric tools: Chapter 1: Overview Photogrammetry: Introduction & Applications Photogrammetric tools: Rotation matrices Photogrammetric point positioning Photogrammetric bundle adjustment This chapter will cover the

More information

GABRIELE GUIDI, PHD POLITECNICO DI MILANO, ITALY VISITING SCHOLAR AT INDIANA UNIVERSITY NOV OCT D IMAGE FUSION

GABRIELE GUIDI, PHD POLITECNICO DI MILANO, ITALY VISITING SCHOLAR AT INDIANA UNIVERSITY NOV OCT D IMAGE FUSION GABRIELE GUIDI, PHD POLITECNICO DI MILANO, ITALY VISITING SCHOLAR AT INDIANA UNIVERSITY NOV 2017 - OCT 2018 3D IMAGE FUSION 3D IMAGE FUSION WHAT A 3D IMAGE IS? A cloud of 3D points collected from a 3D

More information

Centre for Digital Image Measurement and Analysis, School of Engineering, City University, Northampton Square, London, ECIV OHB

Centre for Digital Image Measurement and Analysis, School of Engineering, City University, Northampton Square, London, ECIV OHB HIGH ACCURACY 3-D MEASUREMENT USING MULTIPLE CAMERA VIEWS T.A. Clarke, T.J. Ellis, & S. Robson. High accuracy measurement of industrially produced objects is becoming increasingly important. The techniques

More information

ACCURACY ANALYSIS FOR NEW CLOSE-RANGE PHOTOGRAMMETRIC SYSTEMS

ACCURACY ANALYSIS FOR NEW CLOSE-RANGE PHOTOGRAMMETRIC SYSTEMS ACCURACY ANALYSIS FOR NEW CLOSE-RANGE PHOTOGRAMMETRIC SYSTEMS Dr. Mahmoud El-Nokrashy O. ALI Prof. of Photogrammetry, Civil Eng. Al Azhar University, Cairo, Egypt m_ali@starnet.com.eg Dr. Mohamed Ashraf

More information

Towards a visual perception system for LNG pipe inspection

Towards a visual perception system for LNG pipe inspection Towards a visual perception system for LNG pipe inspection LPV Project Team: Brett Browning (PI), Peter Rander (co PI), Peter Hansen Hatem Alismail, Mohamed Mustafa, Joey Gannon Qri8 Lab A Brief Overview

More information

Photometric Laser Scanner to Camera Calibration for Low Resolution Sensors

Photometric Laser Scanner to Camera Calibration for Low Resolution Sensors Photometric Laser Scanner to Camera Calibration for Low Resolution Sensors Johannes Gräter 1, Tobias Strauss, Martin Lauer August 1, 2017 Abstract In modern automated systems the usage of sensors with

More information

ROAD SURFACE STRUCTURE MONITORING AND ANALYSIS USING HIGH PRECISION GPS MOBILE MEASUREMENT SYSTEMS (MMS)

ROAD SURFACE STRUCTURE MONITORING AND ANALYSIS USING HIGH PRECISION GPS MOBILE MEASUREMENT SYSTEMS (MMS) ROAD SURFACE STRUCTURE MONITORING AND ANALYSIS USING HIGH PRECISION GPS MOBILE MEASUREMENT SYSTEMS (MMS) Bonifacio R. Prieto PASCO Philippines Corporation, Pasig City, 1605, Philippines Email: bonifacio_prieto@pascoph.com

More information

Jo-Car2 Autonomous Mode. Path Planning (Cost Matrix Algorithm)

Jo-Car2 Autonomous Mode. Path Planning (Cost Matrix Algorithm) Chapter 8.2 Jo-Car2 Autonomous Mode Path Planning (Cost Matrix Algorithm) Introduction: In order to achieve its mission and reach the GPS goal safely; without crashing into obstacles or leaving the lane,

More information

Stereo. 11/02/2012 CS129, Brown James Hays. Slides by Kristen Grauman

Stereo. 11/02/2012 CS129, Brown James Hays. Slides by Kristen Grauman Stereo 11/02/2012 CS129, Brown James Hays Slides by Kristen Grauman Multiple views Multi-view geometry, matching, invariant features, stereo vision Lowe Hartley and Zisserman Why multiple views? Structure

More information

(Subsea) Keith Vickery Zupt LLC

(Subsea) Keith Vickery Zupt LLC (Subsea) Keith Vickery Zupt LLC kv@zupt.com Offshore subsea infrastructure surveys (pipeline inspection, well, XT, and manifold inspections) are required to ensure compliance with both internal operator,

More information

Camera Calibration with a Simulated Three Dimensional Calibration Object

Camera Calibration with a Simulated Three Dimensional Calibration Object Czech Pattern Recognition Workshop, Tomáš Svoboda (Ed.) Peršlák, Czech Republic, February 4, Czech Pattern Recognition Society Camera Calibration with a Simulated Three Dimensional Calibration Object Hynek

More information

arxiv: v1 [cs.cv] 28 Sep 2018

arxiv: v1 [cs.cv] 28 Sep 2018 Camera Pose Estimation from Sequence of Calibrated Images arxiv:1809.11066v1 [cs.cv] 28 Sep 2018 Jacek Komorowski 1 and Przemyslaw Rokita 2 1 Maria Curie-Sklodowska University, Institute of Computer Science,

More information

Real Time Multi-Sensor Data Acquisition and Processing for a Road Mapping System

Real Time Multi-Sensor Data Acquisition and Processing for a Road Mapping System Real Time Multi-Sensor Data Acquisition and Processing for a Road Mapping System by Xiang Luo A thesis submitted for the degree of Master of Engineering (Research) Faculty of Engineering and Information

More information

Jeffrey A. Schepers P.S. EIT Geospatial Services Holland Engineering Inc. 220 Hoover Blvd, Suite 2, Holland, MI Desk

Jeffrey A. Schepers P.S. EIT Geospatial Services Holland Engineering Inc. 220 Hoover Blvd, Suite 2, Holland, MI Desk Jeffrey A. Schepers P.S. EIT Geospatial Services Holland Engineering Inc. 220 Hoover Blvd, Suite 2, Holland, MI 49423 616-594-5127 Desk 616-322-1724 Cell 616-392-5938 Office Mobile LiDAR - Laser Scanning

More information

Geometric camera models and calibration

Geometric camera models and calibration Geometric camera models and calibration http://graphics.cs.cmu.edu/courses/15-463 15-463, 15-663, 15-862 Computational Photography Fall 2018, Lecture 13 Course announcements Homework 3 is out. - Due October

More information

STEREOVISION MOBILE MAPPING: SYSTEM DESIGN AND PERFORMANCE EVALUATION

STEREOVISION MOBILE MAPPING: SYSTEM DESIGN AND PERFORMANCE EVALUATION STEREOVISION MOBILE MAPPING: SYSTEM DESIGN AND PERFORMANCE EVALUATION J. Burkhard, S. Cavegn, A. Barmettler, S. Nebiker Institute of Geomatics Engineering, FHNW University of Applied Sciences and Arts

More information

3D object recognition used by team robotto

3D object recognition used by team robotto 3D object recognition used by team robotto Workshop Juliane Hoebel February 1, 2016 Faculty of Computer Science, Otto-von-Guericke University Magdeburg Content 1. Introduction 2. Depth sensor 3. 3D object

More information

Outline. ETN-FPI Training School on Plenoptic Sensing

Outline. ETN-FPI Training School on Plenoptic Sensing Outline Introduction Part I: Basics of Mathematical Optimization Linear Least Squares Nonlinear Optimization Part II: Basics of Computer Vision Camera Model Multi-Camera Model Multi-Camera Calibration

More information

Efficient SLAM Scheme Based ICP Matching Algorithm Using Image and Laser Scan Information

Efficient SLAM Scheme Based ICP Matching Algorithm Using Image and Laser Scan Information Proceedings of the World Congress on Electrical Engineering and Computer Systems and Science (EECSS 2015) Barcelona, Spain July 13-14, 2015 Paper No. 335 Efficient SLAM Scheme Based ICP Matching Algorithm

More information

Product information. Hi-Tech Electronics Pte Ltd

Product information. Hi-Tech Electronics Pte Ltd Product information Introduction TEMA Motion is the world leading software for advanced motion analysis. Starting with digital image sequences the operator uses TEMA Motion to track objects in images,

More information

The Applanix Approach to GPS/INS Integration

The Applanix Approach to GPS/INS Integration Lithopoulos 53 The Applanix Approach to GPS/INS Integration ERIK LITHOPOULOS, Markham ABSTRACT The Position and Orientation System for Direct Georeferencing (POS/DG) is an off-the-shelf integrated GPS/inertial

More information

Real-Time Detection of Road Markings for Driving Assistance Applications

Real-Time Detection of Road Markings for Driving Assistance Applications Real-Time Detection of Road Markings for Driving Assistance Applications Ioana Maria Chira, Ancuta Chibulcutean Students, Faculty of Automation and Computer Science Technical University of Cluj-Napoca

More information

3D Sensing and Reconstruction Readings: Ch 12: , Ch 13: ,

3D Sensing and Reconstruction Readings: Ch 12: , Ch 13: , 3D Sensing and Reconstruction Readings: Ch 12: 12.5-6, Ch 13: 13.1-3, 13.9.4 Perspective Geometry Camera Model Stereo Triangulation 3D Reconstruction by Space Carving 3D Shape from X means getting 3D coordinates

More information

TERRESTRIAL LASER SCANNER DATA PROCESSING

TERRESTRIAL LASER SCANNER DATA PROCESSING TERRESTRIAL LASER SCANNER DATA PROCESSING L. Bornaz (*), F. Rinaudo (*) (*) Politecnico di Torino - Dipartimento di Georisorse e Territorio C.so Duca degli Abruzzi, 24 10129 Torino Tel. +39.011.564.7687

More information

Integrating the Generations, FIG Working Week 2008,Stockholm, Sweden June 2008

Integrating the Generations, FIG Working Week 2008,Stockholm, Sweden June 2008 H. Murat Yilmaz, Aksaray University,Turkey Omer Mutluoglu, Selçuk University, Turkey Murat Yakar, Selçuk University,Turkey Cutting and filling volume calculation are important issues in many engineering

More information

Targetless Calibration of a Lidar - Perspective Camera Pair. Levente Tamás, Zoltán Kató

Targetless Calibration of a Lidar - Perspective Camera Pair. Levente Tamás, Zoltán Kató Targetless Calibration of a Lidar - Perspective Camera Pair Levente Tamás, Zoltán Kató Content Introduction Region-based calibration framework Evaluation on synthetic data Real data experiments Conclusions

More information

Project Title: Welding Machine Monitoring System Phase II. Name of PI: Prof. Kenneth K.M. LAM (EIE) Progress / Achievement: (with photos, if any)

Project Title: Welding Machine Monitoring System Phase II. Name of PI: Prof. Kenneth K.M. LAM (EIE) Progress / Achievement: (with photos, if any) Address: Hong Kong Polytechnic University, Phase 8, Hung Hom, Kowloon, Hong Kong. Telephone: (852) 3400 8441 Email: cnerc.steel@polyu.edu.hk Website: https://www.polyu.edu.hk/cnerc-steel/ Project Title:

More information

Vision-based endoscope tracking for 3D ultrasound image-guided surgical navigation [Yang et al. 2014, Comp Med Imaging and Graphics]

Vision-based endoscope tracking for 3D ultrasound image-guided surgical navigation [Yang et al. 2014, Comp Med Imaging and Graphics] Vision-based endoscope tracking for 3D ultrasound image-guided surgical navigation [Yang et al. 2014, Comp Med Imaging and Graphics] Gustavo Sato dos Santos IGI Journal Club 23.10.2014 Motivation Goal:

More information

Today. Stereo (two view) reconstruction. Multiview geometry. Today. Multiview geometry. Computational Photography

Today. Stereo (two view) reconstruction. Multiview geometry. Today. Multiview geometry. Computational Photography Computational Photography Matthias Zwicker University of Bern Fall 2009 Today From 2D to 3D using multiple views Introduction Geometry of two views Stereo matching Other applications Multiview geometry

More information

Hand-Eye Calibration from Image Derivatives

Hand-Eye Calibration from Image Derivatives Hand-Eye Calibration from Image Derivatives Abstract In this paper it is shown how to perform hand-eye calibration using only the normal flow field and knowledge about the motion of the hand. The proposed

More information

Computer and Machine Vision

Computer and Machine Vision Computer and Machine Vision Lecture Week 12 Part-2 Additional 3D Scene Considerations March 29, 2014 Sam Siewert Outline of Week 12 Computer Vision APIs and Languages Alternatives to C++ and OpenCV API

More information

#65 MONITORING AND PREDICTING PEDESTRIAN BEHAVIOR AT TRAFFIC INTERSECTIONS

#65 MONITORING AND PREDICTING PEDESTRIAN BEHAVIOR AT TRAFFIC INTERSECTIONS #65 MONITORING AND PREDICTING PEDESTRIAN BEHAVIOR AT TRAFFIC INTERSECTIONS Final Research Report Luis E. Navarro-Serment, Ph.D. The Robotics Institute Carnegie Mellon University Disclaimer The contents

More information

Introduction to 3D Machine Vision

Introduction to 3D Machine Vision Introduction to 3D Machine Vision 1 Many methods for 3D machine vision Use Triangulation (Geometry) to Determine the Depth of an Object By Different Methods: Single Line Laser Scan Stereo Triangulation

More information

Pin Hole Cameras & Warp Functions

Pin Hole Cameras & Warp Functions Pin Hole Cameras & Warp Functions Instructor - Simon Lucey 16-423 - Designing Computer Vision Apps Today Pinhole Camera. Homogenous Coordinates. Planar Warp Functions. Example of SLAM for AR Taken from:

More information

CS5670: Computer Vision

CS5670: Computer Vision CS5670: Computer Vision Noah Snavely, Zhengqi Li Stereo Single image stereogram, by Niklas Een Mark Twain at Pool Table", no date, UCR Museum of Photography Stereo Given two images from different viewpoints

More information

A 100Hz Real-time Sensing System of Textured Range Images

A 100Hz Real-time Sensing System of Textured Range Images A 100Hz Real-time Sensing System of Textured Range Images Hidetoshi Ishiyama Course of Precision Engineering School of Science and Engineering Chuo University 1-13-27 Kasuga, Bunkyo-ku, Tokyo 112-8551,

More information

3D Fusion of Infrared Images with Dense RGB Reconstruction from Multiple Views - with Application to Fire-fighting Robots

3D Fusion of Infrared Images with Dense RGB Reconstruction from Multiple Views - with Application to Fire-fighting Robots 3D Fusion of Infrared Images with Dense RGB Reconstruction from Multiple Views - with Application to Fire-fighting Robots Yuncong Chen 1 and Will Warren 2 1 Department of Computer Science and Engineering,

More information

Pin Hole Cameras & Warp Functions

Pin Hole Cameras & Warp Functions Pin Hole Cameras & Warp Functions Instructor - Simon Lucey 16-423 - Designing Computer Vision Apps Today Pinhole Camera. Homogenous Coordinates. Planar Warp Functions. Motivation Taken from: http://img.gawkerassets.com/img/18w7i1umpzoa9jpg/original.jpg

More information

PHOTOGRAMMETRIC SOLUTIONS OF NON-STANDARD PHOTOGRAMMETRIC BLOCKS INTRODUCTION

PHOTOGRAMMETRIC SOLUTIONS OF NON-STANDARD PHOTOGRAMMETRIC BLOCKS INTRODUCTION PHOTOGRAMMETRIC SOLUTIONS OF NON-STANDARD PHOTOGRAMMETRIC BLOCKS Dor Yalon Co-Founder & CTO Icaros, Inc. ABSTRACT The use of small and medium format sensors for traditional photogrammetry presents a number

More information

Structured Light. Tobias Nöll Thanks to Marc Pollefeys, David Nister and David Lowe

Structured Light. Tobias Nöll Thanks to Marc Pollefeys, David Nister and David Lowe Structured Light Tobias Nöll tobias.noell@dfki.de Thanks to Marc Pollefeys, David Nister and David Lowe Introduction Previous lecture: Dense reconstruction Dense matching of non-feature pixels Patch-based

More information

LaserFleX New Generation of High Performance Infrastructure Measurement System

LaserFleX New Generation of High Performance Infrastructure Measurement System Balfour Beatty Rail LaserFleX New Generation of High Performance Infrastructure Measurement System LaserFleX TM Introduction LaserFleX TM is the modular railway infrastructure and track measurement system

More information

AN INTEGRATED SENSOR ORIENTATION SYSTEM FOR AIRBORNE PHOTOGRAMMETRIC APPLICATIONS

AN INTEGRATED SENSOR ORIENTATION SYSTEM FOR AIRBORNE PHOTOGRAMMETRIC APPLICATIONS AN INTEGRATED SENSOR ORIENTATION SYSTEM FOR AIRBORNE PHOTOGRAMMETRIC APPLICATIONS M. J. Smith a, *, N. Kokkas a, D.W.G. Park b a Faculty of Engineering, The University of Nottingham, Innovation Park, Triumph

More information

CAMERA CALIBRATION FOR VISUAL ODOMETRY SYSTEM

CAMERA CALIBRATION FOR VISUAL ODOMETRY SYSTEM SCIENTIFIC RESEARCH AND EDUCATION IN THE AIR FORCE-AFASES 2016 CAMERA CALIBRATION FOR VISUAL ODOMETRY SYSTEM Titus CIOCOIU, Florin MOLDOVEANU, Caius SULIMAN Transilvania University, Braşov, Romania (ciocoiutitus@yahoo.com,

More information

Computer Vision Projective Geometry and Calibration. Pinhole cameras

Computer Vision Projective Geometry and Calibration. Pinhole cameras Computer Vision Projective Geometry and Calibration Professor Hager http://www.cs.jhu.edu/~hager Jason Corso http://www.cs.jhu.edu/~jcorso. Pinhole cameras Abstract camera model - box with a small hole

More information

Miniature faking. In close-up photo, the depth of field is limited.

Miniature faking. In close-up photo, the depth of field is limited. Miniature faking In close-up photo, the depth of field is limited. http://en.wikipedia.org/wiki/file:jodhpur_tilt_shift.jpg Miniature faking Miniature faking http://en.wikipedia.org/wiki/file:oregon_state_beavers_tilt-shift_miniature_greg_keene.jpg

More information

Geometric Accuracy Evaluation, DEM Generation and Validation for SPOT-5 Level 1B Stereo Scene

Geometric Accuracy Evaluation, DEM Generation and Validation for SPOT-5 Level 1B Stereo Scene Geometric Accuracy Evaluation, DEM Generation and Validation for SPOT-5 Level 1B Stereo Scene Buyuksalih, G.*, Oruc, M.*, Topan, H.*,.*, Jacobsen, K.** * Karaelmas University Zonguldak, Turkey **University

More information

Agenda. Rotations. Camera models. Camera calibration. Homographies

Agenda. Rotations. Camera models. Camera calibration. Homographies Agenda Rotations Camera models Camera calibration Homographies D Rotations R Y = Z r r r r r r r r r Y Z Think of as change of basis where ri = r(i,:) are orthonormal basis vectors r rotated coordinate

More information

Construction and Calibration of a Low-Cost 3D Laser Scanner with 360º Field of View for Mobile Robots

Construction and Calibration of a Low-Cost 3D Laser Scanner with 360º Field of View for Mobile Robots Construction and Calibration of a Low-Cost 3D Laser Scanner with 360º Field of View for Mobile Robots Jorge L. Martínez, Jesús Morales, Antonio, J. Reina, Anthony Mandow, Alejandro Pequeño-Boter*, and

More information

A Robust Two Feature Points Based Depth Estimation Method 1)

A Robust Two Feature Points Based Depth Estimation Method 1) Vol.31, No.5 ACTA AUTOMATICA SINICA September, 2005 A Robust Two Feature Points Based Depth Estimation Method 1) ZHONG Zhi-Guang YI Jian-Qiang ZHAO Dong-Bin (Laboratory of Complex Systems and Intelligence

More information

Recap from Previous Lecture

Recap from Previous Lecture Recap from Previous Lecture Tone Mapping Preserve local contrast or detail at the expense of large scale contrast. Changing the brightness within objects or surfaces unequally leads to halos. We are now

More information