
Vol. 21 No. 6, pp. 690–696, 2003

Map Generation of a Mobile Robot by Integrating Omnidirectional Stereo and Laser Range Finder

Yoshiro Negishi, Jun Miura and Yoshiaki Shirai
Graduate School of Engineering, Osaka University

This paper describes a map generation method using an omnidirectional stereo and a laser range finder. Omnidirectional stereo has the advantage of 3D range acquisition, while it may suffer from low reliability and accuracy of range data. Laser range finders have the advantage of reliable data acquisition, while they usually obtain only 2D range information. By integrating these two sensors, a reliable map can be generated. Since the two sensors may detect different parts of an object, a separate probabilistic grid map is first generated by temporal integration of data from each sensor. The resultant two maps are then integrated using a logical integration rule. An ego-motion estimation method is also described, which is necessary for integration of sensor data obtained at different positions. Experimental results on autonomous navigation in unknown environments show the feasibility of the method.

Key Words: Mobile robot, Omnidirectional stereo, Laser range finder, Sensor fusion, Probabilistic model of uncertainty

1. Introduction

Real-time obstacle detection using omnidirectional stereo [1]–[3] and using laser range finders (LRFs) [4][5] has been studied for mobile robot navigation, and map generation by integrating multiple sensors has also been investigated [6]–[9]. Our robot (Fig. 1) carries both an omnidirectional stereo and an LRF, and generates a reliable map by integrating the two sensors.

Fig. 1 Our mobile robot, with omnidirectional stereo cameras and a laser range finder.

2. Real-time Omnidirectional Stereo

2.1 Omnidirectional stereo system

The omnidirectional stereo system [1] produces a 360-degree panoramic disparity image; one frame is processed in about 0.18 [s] on a Pentium III 850 MHz PC.

Fig. 2 Omnidirectional stereo generates a panoramic disparity image: (a) input omnidirectional image (lower camera); (b) panoramic image converted from (a); (c) panoramic disparity image obtained from (b).
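The conversion from the omnidirectional input image to the panoramic image of Fig. 2 (a) to (b) amounts to resampling the input along rays from the mirror center. The paper's camera and mirror calibration is not given here, so the following is only a rough sketch with assumed center, radii, and output size:

```python
# Rough polar-unwarping sketch for Fig. 2 (a) -> (b); cx, cy, r_in, r_out and
# the output width are assumed values, not the paper's calibration.
import numpy as np

def unwarp_to_panorama(omni, cx, cy, r_in, r_out, width=720):
    """Resample an omnidirectional image into a (r_out - r_in) x width panorama."""
    thetas = np.linspace(0.0, 2.0 * np.pi, width, endpoint=False)
    radii = np.arange(r_in, r_out)
    # Nearest-neighbour sampling along rays from the image center (cx, cy).
    xs = (cx + radii[:, None] * np.cos(thetas)[None, :]).astype(int)
    ys = (cy + radii[:, None] * np.sin(thetas)[None, :]).astype(int)
    return omni[ys.clip(0, omni.shape[0] - 1), xs.clip(0, omni.shape[1] - 1)]

omni = np.zeros((480, 640), dtype=np.uint8)      # stand-in for a camera frame
pano = unwarp_to_panorama(omni, cx=320, cy=240, r_in=60, r_out=220)
print(pano.shape)                                # (160, 720)
```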

2.2 Laser range finder

The robot also carries a SICK LRF (see Fig. 1), mounted at a height of 35 [cm]; it reliably measures 2D range data on a horizontal plane with an angular resolution of 1 [deg]. Fig. 3 shows an example measurement.

Fig. 3 An example LRF measurement. The black triangle indicates the position and the direction of the robot.

3. Map Generation by Integrating the Two Sensors

Following the occupancy grid approach [6], a probabilistic grid map is generated for each sensor by temporally integrating its range data. Since the stereo and the LRF may detect different parts of an object, the two maps are kept separate and are then merged by a logical integration rule (see 3.4).
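In outline, each cycle updates one probabilistic grid per sensor and then merges the two grids into a single navigation map. A minimal sketch of this flow, with illustrative names and sizes; the placeholder merge below simply takes the more pessimistic cell value, whereas the paper's actual cell-wise rule is Table 1 in 3.4:

```python
# Minimal sketch of the two-map architecture (names and sizes are illustrative).
import numpy as np

CELLS = 100                                  # e.g. 5 m x 5 m at 5 cm grid size

stereo_map = np.full((CELLS, CELLS), 0.5)    # per-sensor P(occupied), prior 0.5
lrf_map = np.full((CELLS, CELLS), 0.5)

def integrate(p_stereo, p_lrf):
    """Merge the per-sensor maps into one map.

    Placeholder: take the more pessimistic (more occupied) value per cell;
    the paper's actual attribute-based rule is given in Table 1 (Sec. 3.4).
    """
    return np.maximum(p_stereo, p_lrf)

combined = integrate(stereo_map, lrf_map)
print(combined.shape)                        # (100, 100)
```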

3.1 Occupancy grid mapping

Elfes [6] proposed the occupancy grid, in which each cell holds the probability that it contains an obstacle. Thrun [11] learns occupancy grids with a forward sensor model using the EM algorithm, while simpler inverse sensor models [6][12] update each cell independently; the latter, per-cell approach is taken here for both the stereo and the LRF maps.

3.2 Probabilistic update of the grid map

For each measured range R, an uncertainty interval [R_min, R_max] is determined from the range uncertainty of the sensor [3]. As Fig. 4 shows, the cells on the measurement ray nearer than R_min are regarded as free, the cells between R_min and R_max as occupied, and the cells beyond R_max as unknown.

Fig. 4 Determination of grid attributes (free, occupied, and unknown regions along a ray from the robot, defined by R_min, R, and R_max).

Let E be the event that a cell contains an obstacle and O the event that the cell is observed as occupied. When the cell is observed as occupied, P(E) is updated by Bayes' rule:

$$P(E \mid O) = \frac{P(O \mid E)\,P(E)}{P(O \mid E)\,P(E) + P(O \mid \bar{E})\,P(\bar{E})}$$

and when it is observed as free:

$$P(E \mid \bar{O}) = \frac{P(\bar{O} \mid E)\,P(E)}{P(\bar{O} \mid E)\,P(E) + P(\bar{O} \mid \bar{E})\,P(\bar{E})}$$

where $P(\bar{O} \mid E) = 1 - P(O \mid E)$, $P(O \mid \bar{E}) = 1 - P(\bar{O} \mid \bar{E})$, and $P(\bar{E}) = 1 - P(E)$. The prior P(E) of every cell is initialized to 0.5. The map covers a region of about 5 [m] around the robot.
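Written out per cell, the update is a direct application of the two equations above; a minimal sketch (the function name is illustrative, and the example uses the LRF values given later in 3.3.2):

```python
# Per-cell Bayesian occupancy update of Sec. 3.2.
# p_o_given_e = P(O|E); p_noto_given_note = P(~O|~E).
def update_occupancy(p_e, observed_occupied, p_o_given_e, p_noto_given_note):
    """Return the posterior P(E) of one cell after one observation."""
    p_note = 1.0 - p_e                                # P(~E)
    if observed_occupied:                             # observation O
        p_o_given_note = 1.0 - p_noto_given_note      # P(O|~E)
        num = p_o_given_e * p_e
        return num / (num + p_o_given_note * p_note)
    else:                                             # observation ~O
        p_noto_given_e = 1.0 - p_o_given_e            # P(~O|E)
        num = p_noto_given_e * p_e
        return num / (num + p_noto_given_note * p_note)

# One LRF "occupied" observation (P(O|E)=0.9, P(~O|~E)=0.5, cf. 3.3.2)
# raises a cell from the 0.5 prior to 0.45/0.70 = 0.643.
print(update_occupancy(0.5, True, 0.9, 0.5))
```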

3.3 Determining the conditional probabilities

The two conditional probabilities $P(O \mid E)$ and $P(\bar{O} \mid \bar{E})$ are determined for each sensor as follows.

3.3.1 Stereo

Since the accuracy of stereo range data degrades with distance, $P(O \mid E)$ is modeled as a function of the distance to the cell: it is 0.8 up to 1.3 [m] and decreases at longer distances, as shown in Fig. 5. $P(\bar{O} \mid \bar{E})$ is set to 0.5.

Fig. 5 Stereo uncertainty model, P(O|E) (probability as a function of distance [m]).

3.3.2 LRF

Since LRF data are reliable, $P(O \mid E)$ is set to 0.9 and $P(\bar{O} \mid \bar{E})$ to 0.5.

3.4 Integration of the two maps

Each probabilistic map (grid size 5 [cm]) is first converted into an attribute map: a cell whose probability exceeds 0.7 is labeled obstacle, a cell whose probability is below 0.2 is labeled free space, and the remaining cells are undecided. Undecided cells are further divided into undecided with observation and undecided without observation, where a cell is regarded as observed once it has been observed N times (N = 5 for stereo, N = 1 for the LRF). The two attribute maps are then merged cell by cell using the rule of Table 1.

Table 1 The integration rule. OB: obstacle, FS: free space, UD_wt: undecided with observation, UD_wo: undecided without observation.

  LRF \ stereo |  OB   UD_wo  UD_wt  FS
  -------------+------------------------
  OB           |  OB   OB     OB     OB
  UD_wo        |  OB   OB     OB     OB
  UD_wt        |  OB   OB     OB     FS
  FS           |  OB   OB     FS     FS
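The attribute conversion and the Table 1 lookup translate directly into code; a minimal sketch in which the thresholds and the rule follow the text above, while the function name and the observation-count bookkeeping are illustrative:

```python
# Attribute conversion (Sec. 3.4) and the logical integration rule (Table 1).
OB, FS, UD_WT, UD_WO = "OB", "FS", "UD_wt", "UD_wo"

def attribute(p_occupied, n_observations, n_required):
    """Classify one cell of a per-sensor map (n_required: 5 stereo, 1 LRF)."""
    if n_observations < n_required:
        return UD_WO                  # undecided without observation
    if p_occupied > 0.7:
        return OB                     # obstacle
    if p_occupied < 0.2:
        return FS                     # free space
    return UD_WT                      # undecided with observation

# Table 1: RULE[lrf_attribute][stereo_attribute] -> integrated attribute.
RULE = {
    OB:    {OB: OB, UD_WO: OB, UD_WT: OB, FS: OB},
    UD_WO: {OB: OB, UD_WO: OB, UD_WT: OB, FS: OB},
    UD_WT: {OB: OB, UD_WO: OB, UD_WT: OB, FS: FS},
    FS:    {OB: OB, UD_WO: OB, UD_WT: FS, FS: FS},
}

# Example: LRF sees free space, stereo is undecided (with observation) -> FS.
print(RULE[FS][UD_WT])
```

The rule is conservative: a cell becomes free space only when at least one sensor has confirmed free space and the other does not contradict it; everything else, including wholly unobserved cells, is treated as an obstacle for safe navigation.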

4. Ego-motion Estimation

To integrate sensor data obtained at different robot positions, the motion between observations is estimated from the LRF data.

4.1 Feature extraction from LRF data

Scan matching methods such as those of Cox [13] and Lu and Milios [14] align entire range scans. Here, feature points are instead extracted from each LRF scan (cf. Fig. 3) and matched between consecutive scans. Fig. 6 shows an example.

Fig. 6 Extracted feature points in LRF data.

4.2 Estimating the motion

Let $(X_i, Y_i)$ and $(X'_i, Y'_i)$, $i = 1, \ldots, N$, be the positions of matched feature points before and after a motion $(\Delta x, \Delta y, \Delta\theta)$ (Fig. 7). The two positions are related by

$$\begin{pmatrix} \cos\Delta\theta & -\sin\Delta\theta \\ \sin\Delta\theta & \cos\Delta\theta \end{pmatrix} \begin{pmatrix} X_i \\ Y_i \end{pmatrix} + \begin{pmatrix} \Delta x \\ \Delta y \end{pmatrix} = \begin{pmatrix} X'_i \\ Y'_i \end{pmatrix}$$

Fig. 7 Ego-motion estimation using point features.

The motion is estimated by minimizing the sum of squared residuals

$$S = \sum_{i=1}^{N} \Big\{ \big( X'_i - (X_i\cos\Delta\theta - Y_i\sin\Delta\theta + \Delta x) \big)^2 + \big( Y'_i - (X_i\sin\Delta\theta + Y_i\cos\Delta\theta + \Delta y) \big)^2 \Big\} \quad (1)$$

Setting $\partial S / \partial \Delta x = 0$, $\partial S / \partial \Delta y = 0$, and $\partial S / \partial \Delta\theta = 0$ yields the closed-form solution

$$\Delta\theta = \tan^{-1} \frac{N[X_iY'_i] - N[X'_iY_i] - [X_i][Y'_i] + [Y_i][X'_i]}{N[X_iX'_i] + N[Y_iY'_i] - [X_i][X'_i] - [Y_i][Y'_i]}$$

$$\Delta x = \frac{[X'_i] - [X_i]\cos\Delta\theta + [Y_i]\sin\Delta\theta}{N}, \qquad \Delta y = \frac{[Y'_i] - [X_i]\sin\Delta\theta - [Y_i]\cos\Delta\theta}{N} \quad (2)$$

where $[\cdot]$ denotes summation over $i = 1, \ldots, N$.
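The closed-form solution (2) can be checked numerically; a minimal sketch (function name and test data are illustrative, and arctan2 is used in place of tan^{-1} to resolve the quadrant):

```python
# Least-squares ego-motion (dx, dy, dtheta) from matched points, eq. (2).
import numpy as np

def estimate_ego_motion(P, Q):
    """P: (N,2) feature points before motion; Q: (N,2) the same points after."""
    N = len(P)
    X, Y = P[:, 0], P[:, 1]
    Xp, Yp = Q[:, 0], Q[:, 1]
    num = (N * np.sum(X * Yp) - N * np.sum(Xp * Y)
           - np.sum(X) * np.sum(Yp) + np.sum(Y) * np.sum(Xp))
    den = (N * np.sum(X * Xp) + N * np.sum(Y * Yp)
           - np.sum(X) * np.sum(Xp) - np.sum(Y) * np.sum(Yp))
    dth = np.arctan2(num, den)
    dx = (np.sum(Xp) - np.sum(X) * np.cos(dth) + np.sum(Y) * np.sin(dth)) / N
    dy = (np.sum(Yp) - np.sum(X) * np.sin(dth) - np.sum(Y) * np.cos(dth)) / N
    return dx, dy, dth

# Self-check: recover a known motion (10 deg rotation, translation (0.3, -0.1)).
rng = np.random.default_rng(0)
P = rng.uniform(-2.0, 2.0, (20, 2))
th = np.radians(10.0)
R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
Q = P @ R.T + np.array([0.3, -0.1])
print(estimate_ego_motion(P, Q))    # approx. (0.3, -0.1, 0.1745)
```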

5. Experiments

The method was tested in autonomous navigation of the robot (Fig. 1) in an unknown indoor environment; the map is updated about every 0.5 [s], and the robot repeatedly plans a safe motion on the integrated map [15]. Fig. 8 shows a navigation result: maps (a)–(d) were obtained at the correspondingly labeled positions, and obstacles detected by stereo, obstacles detected by the LRF, and free space are distinguished. The detected objects include chairs, tables, desks, a white board, a copy machine, a cabinet, and a partition.

Fig. 8 A navigation result. Black and white triangles indicate the robot position and orientation.

6. Conclusion

References

[1] (in Japanese), Vol. 18, No. 6, pp. 896–901, 2000.
[2] D. Murray and J. Little: "Using Real-Time Stereo Vision for Mobile Robot Navigation," Autonomous Robots, Vol. 8, No. 2, pp. 161–171, 2000.
[3] H. Koyasu, J. Miura and Y. Shirai: "Recognizing Moving Obstacles for Robot Navigation Using Real-time Omnidirectional Stereo Vision," J. of Robotics and Mechatronics, Vol. 14, No. 2, pp. 147–156, 2002.
[4] M. Lindstrom and J.-O. Eklundh: "Detecting and Tracking Moving Objects from a Mobile Platform using a Laser Range Scanner," Proc. of IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 1364–1369, 2001.
[5] E. Prassler and J. Scholz: "Tracking Multiple Moving Objects for Real-Time Navigation," Autonomous Robots, Vol. 8, No. 2, pp. 105–116, 2000.
[6] A. Elfes: "Sonar-Based Real-World Mapping and Navigation," IEEE J. of Robotics and Automation, Vol. 3, No. 3, pp. 249–265, 1987.
[7] (in Japanese), Vol. 31, No. 12, pp. 1743–1754, 1990.

[8] N. Ayache and O.D. Faugeras: "Maintaining Representations of the Environment of a Mobile Robot," IEEE Trans. on Robotics and Automation, Vol. RA-5, No. 6, pp. 804–819, 1989.
[9] T. Yata, A. Ohya and S. Yuta: "Fusion of Omni-directional Sonar and Omni-directional Vision for Environment Recognition of Mobile Robots," Proc. of the 2001 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 3926–3931, 2001.
[10] J. Gluckman, S.K. Nayar and K.J. Thoresz: "Real-Time Omnidirectional and Panoramic Stereo," Proc. of Image Understanding Workshop, Vol. 1, pp. 299–303, 1998.
[11] S. Thrun: "Learning Occupancy Grids with Forward Models," Proc. of 2001 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 1676–1681, 2001.
[12] Y. Sawano, J. Miura and Y. Shirai: "Man Chasing Robot by an Environment Recognition Using Stereo Vision," Proc. of the 2000 Int. Conf. on Machine Automation, pp. 389–394, 2000.
[13] I.J. Cox: "Blanche: Position Estimation for an Autonomous Robot Vehicle," Proc. of IEEE/RSJ Int. Workshop on Intelligent Robots and Systems, pp. 432–439, 1989.
[14] F. Lu and E.E. Milios: "Robot Pose Estimation in Unknown Environments by Matching 2D Range Scans," Proc. of IEEE Conf. on Computer Vision and Pattern Recognition, 1994.
[15] (in Japanese), No. 2, pp. 115–116, 2002.