JRSJ (J. of the Robotics Society of Japan), Vol. 21, No. 6, pp. 690–696, Sept. 2003

Map Generation of a Mobile Robot by Integrating Omnidirectional Stereo and Laser Range Finder

Yoshiro Negishi*, Jun Miura* and Yoshiaki Shirai*
*Graduate School of Engineering, Osaka University

This paper describes a map generation method using omnidirectional stereo and a laser range finder. Omnidirectional stereo has the advantage of 3D range acquisition, while it may suffer from low reliability and accuracy in range data. Laser range finders have the advantage of reliable data acquisition, while they usually obtain only 2D range information. By integrating these two sensors, a reliable map can be generated. Since the two sensors may detect different parts of an object, a separate probabilistic grid map is first generated by temporal integration of data from each sensor. The resultant two maps are then integrated using a logical integration rule. An ego-motion estimation method is also described, which is necessary for integration of sensor data obtained at different positions. Experimental results on autonomous navigation in unknown environments show the feasibility of the method.

Key Words: Mobile robot, Omnidirectional stereo, Laser range finder, Sensor fusion, Probabilistic model of uncertainty.

1. [Introduction]

[Japanese body text not recovered in this extraction. The introduction cites [1]–[3] on stereo-based range sensing, [4], [5] on LRF-based detection, and [6]–[9] on map building, and refers to the robot of Fig. 1.]

2. [Sensors]

2.1 [Omnidirectional stereo]
Fig. 1: Our mobile robot (equipped with omnidirectional stereo cameras and a laser range finder).

Fig. 2: Omnidirectional stereo generates a panoramic disparity image. (a) Input omnidirectional image (lower camera). (b) Panoramic image converted from (a). (c) Panoramic disparity image obtained from (b).

Fig. 3: An example LRF measurement. The black triangle indicates the position and the direction of the robot.

[Japanese text not recovered. Surviving details: the omnidirectional stereo of [1]; a 360 deg field of view; processing on a Pentium III 850 MHz(?) PC at about 0.18 s per frame; disparity images as in Fig. 2; an observable range of about 5 m.]

2.2 [Laser range finder]

[Japanese text not recovered. Surviving details: a SICK LRF mounted on the robot (Fig. 1) at a height of 35 cm, with an angular resolution of 1 deg; an example measurement is shown in Fig. 3.]

3. [Map generation]

[Japanese overview not recovered; following [6], a probabilistic grid map is built for each of the two sensors and the two maps are then integrated.]

JRSJ Vol. 21, No. 6, Sept. 2003
Fig. 4: Determination of grid attributes. [Figure labels: occupied, unknown, free, robot, θ, R_min, R, R_max.]

3.1 [Related work]

[Japanese text not recovered. Surviving citations: the occupancy grid of Elfes [6]; Thrun's forward sensor model [11] as opposed to the usual inverse model [6], [12]; an EM algorithm [11]; application to LRF data.]

3.2 [Grid attributes and Bayesian updating]

[Japanese text not recovered. From Fig. 4: for a range measurement R in direction θ with uncertainty bounds R_min and R_max [3], cells nearer than R_min are observed as free, cells between R_min and R_max as occupied, and cells beyond R_max are left unknown.]

Let E denote the event that a cell is occupied, and O the event that a sensor observes the cell as occupied. When the cell is observed as occupied and as free, respectively, its occupancy probability is updated by Bayes' rule:

$$P(E|O) = \frac{P(O|E)P(E)}{P(O|E)P(E) + P(O|\bar{E})P(\bar{E})}$$

$$P(E|\bar{O}) = \frac{P(\bar{O}|E)P(E)}{P(\bar{O}|E)P(E) + P(\bar{O}|\bar{E})P(\bar{E})}$$

where $P(\bar{O}|E) = 1 - P(O|E)$, $P(\bar{O}|\bar{E}) = 1 - P(O|\bar{E})$, and $P(\bar{E}) = 1 - P(E)$. The prior probability $P(E)$ of an unobserved cell is set to 0.5. [A 5 m limit, presumably the maximum mapped range, appears in the unrecovered text.]
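For illustration, the Bayesian update above can be sketched as follows. This is not the authors' implementation; the function names are ours, and the sensor-model values 0.8 and 0.2 are illustrative stand-ins rather than the paper's calibrated models.

```python
def bayes_update(p_e, p_o_given_e, p_o_given_not_e):
    """One Bayesian update of a cell's occupancy probability P(E),
    given that the sensor reported the cell as occupied (event O)."""
    num = p_o_given_e * p_e
    return num / (num + p_o_given_not_e * (1.0 - p_e))

def bayes_update_free(p_e, p_o_given_e, p_o_given_not_e):
    """Update when the sensor reported the cell as free (event not-O),
    using P(not-O|E) = 1 - P(O|E) and P(not-O|not-E) = 1 - P(O|not-E)."""
    num = (1.0 - p_o_given_e) * p_e
    return num / (num + (1.0 - p_o_given_not_e) * (1.0 - p_e))

# Prior P(E) = 0.5 for an unobserved cell, as in the text.
p = 0.5
for _ in range(3):                 # three consecutive "occupied" observations
    p = bayes_update(p, 0.8, 0.2)
# p rises toward 1 under repeated consistent observations,
# and symmetrically falls toward 0 under repeated "free" observations.
```

Because each sensor reading only multiplies likelihood ratios into the posterior, the update is order-independent, which is what makes the temporal integration of many scans straightforward.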
3.3 [Sensor uncertainty models]

3.3.1 [Stereo uncertainty model]

[Japanese text not recovered. Surviving values: P(O|E) is 0.8 for ranges up to about 1.3 m and decreases with distance as shown in Fig. 5; P(O|Ē) is set to 0.5(?).]

Fig. 5: Stereo uncertainty model, P(O|E). [Figure residue: probability ticks 0.2, 0.5, 0.8; distance ticks 1.3, 5.2 m.]

3.3.2 [LRF uncertainty model]

[Japanese text not recovered. Surviving values: for the LRF, P(O|E) = 0.9 and P(O|Ē) = 0.5(?); leading digits were lost in extraction.]

3.4 [Map integration]

[Japanese text not recovered. Surviving details: a cell is classified as an obstacle when P(E) ≥ 0.7 and as free space when P(E) ≤ 0.2; otherwise it is undecided, subdivided into "undecided with observation" and "undecided without observation". The grid cell size is 5 cm; parameters N = 5 (stereo) and N = 1 (LRF) appear in the unrecovered text. The two per-sensor maps are combined cell by cell using the rule of Table 1.]

Table 1: The integration rule. OB: obstacle, FS: free space, UD_wt: undecided with observation, UD_wo: undecided without observation.

                      stereo
  LRF     |  OB    UD_wo   UD_wt   FS
  --------+---------------------------
  OB      |  OB     OB      OB     OB
  UD_wo   |  OB     OB      OB     OB
  UD_wt   |  OB     OB      OB     FS
  FS      |  OB     OB      FS     FS
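Table 1 can be implemented directly as a small per-cell rule. The sketch below is ours (the function and constant names are hypothetical), but its input/output behaviour follows the table: a cell becomes free space only when one sensor says free and the other says free or undecided-with-observation; every other combination is treated as an obstacle.

```python
# Cell attributes used in Table 1 (constant names are ours).
OB = "obstacle"                   # OB: obstacle
FS = "free"                       # FS: free space
UD_WT = "undecided_with_obs"      # UD_wt: undecided, with observation
UD_WO = "undecided_without_obs"   # UD_wo: undecided, without observation

def integrate(stereo, lrf):
    """Combine the per-cell attributes of the stereo map and the
    LRF map following Table 1."""
    if stereo == FS and lrf in (FS, UD_WT):
        return FS
    if lrf == FS and stereo in (FS, UD_WT):
        return FS
    return OB
```

Note the conservative bias of the rule: an "undecided without observation" cell (e.g. one a sensor could not see) blocks a free-space classification even when the other sensor reports free, which keeps occluded regions from being planned through.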
4. [Ego-motion estimation]

[Japanese introductory text not recovered; the ego-motion between consecutive LRF scans is estimated so that sensor data obtained at different positions can be integrated.]

4.1 [Feature extraction]

[Japanese text not recovered; feature points are extracted from LRF scans (cf. Cox [13] and Lu and Milios [14]), as in Fig. 6. Several distance thresholds on the order of a few cm appear in the unrecovered text.]

Fig. 6: Extracted feature points in LRF data.

4.2 [Least-squares motion estimation]

Given $N$ matched feature points $(X_i, Y_i)$ in the previous scan and $(X'_i, Y'_i)$ in the current scan ($i = 1, \ldots, N$; Fig. 7), the motion $(\Delta x, \Delta y, \Delta\theta)$ satisfies

$$\begin{pmatrix} \cos\Delta\theta & \sin\Delta\theta \\ -\sin\Delta\theta & \cos\Delta\theta \end{pmatrix} \begin{pmatrix} X'_i \\ Y'_i \end{pmatrix} + \begin{pmatrix} \Delta x \\ \Delta y \end{pmatrix} = \begin{pmatrix} X_i \\ Y_i \end{pmatrix}$$

and is estimated by minimizing the sum of squared residuals

$$S = \sum_{i=1}^{N} \left\{ \left[ X_i - (X'_i \cos\Delta\theta + Y'_i \sin\Delta\theta + \Delta x) \right]^2 + \left[ Y_i - (-X'_i \sin\Delta\theta + Y'_i \cos\Delta\theta + \Delta y) \right]^2 \right\}. \quad (1)$$

Setting $\partial S / \partial \Delta x = 0$, $\partial S / \partial \Delta y = 0$, and $\partial S / \partial \Delta\theta = 0$ gives the closed-form solution (2):

$$\Delta\theta = \tan^{-1} \frac{ N[X_i Y'_i] - N[X'_i Y_i] - [X_i][Y'_i] + [X'_i][Y_i] }{ N[X_i X'_i] + N[Y_i Y'_i] - [X_i][X'_i] - [Y_i][Y'_i] }$$

$$\Delta x = \frac{ [X_i] - [X'_i]\cos\Delta\theta - [Y'_i]\sin\Delta\theta }{N}, \qquad \Delta y = \frac{ [Y_i] + [X'_i]\sin\Delta\theta - [Y'_i]\cos\Delta\theta }{N}$$

where $[\,\cdot\,]$ denotes summation over $i = 1, \ldots, N$.

Fig. 7: Ego-motion estimation using point features.

5. [Experiments]

[Japanese text not recovered. Surviving details: the robot of Fig. 1; an LRF scan cycle of about 0.5 s; the navigation result of Fig. 8 with panels (a)–(d); corridor widths around 1 m; path planning as in [15].]
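The closed-form least-squares solution above can be sketched as follows. This is an illustrative reimplementation with variable names of our choosing, written in the centered (mean-subtracted) form of the same formulas; the sign conventions match the rotation matrix and residual S given above.

```python
import math

def estimate_motion(prev_pts, curr_pts):
    """Least-squares fit of (dx, dy, dtheta) mapping matched feature
    points (X'_i, Y'_i) of the current scan onto (X_i, Y_i) of the
    previous scan:
        X_i =  X'_i cos(dt) + Y'_i sin(dt) + dx
        Y_i = -X'_i sin(dt) + Y'_i cos(dt) + dy
    """
    n = len(prev_pts)
    mx = sum(p[0] for p in prev_pts) / n    # means = [X]/N, [Y]/N, ...
    my = sum(p[1] for p in prev_pts) / n
    mxp = sum(p[0] for p in curr_pts) / n
    myp = sum(p[1] for p in curr_pts) / n
    num = den = 0.0
    for (x, y), (xp, yp) in zip(prev_pts, curr_pts):
        cx, cy, cxp, cyp = x - mx, y - my, xp - mxp, yp - myp
        num += cx * cyp - cy * cxp          # centered N[XY'] - N[X'Y] terms
        den += cx * cxp + cy * cyp          # centered N[XX'] + N[YY'] terms
    dt = math.atan2(num, den)               # quadrant-correct arctangent
    dx = mx - (mxp * math.cos(dt) + myp * math.sin(dt))
    dy = my - (-mxp * math.sin(dt) + myp * math.cos(dt))
    return dx, dy, dt
```

Using atan2 instead of a bare arctangent keeps the estimate correct for rotations beyond ±90 deg, where the ratio form of the formula loses the quadrant.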
Fig. 8: A navigation result. Black and white triangles indicate the robot position and orientation. [Scene annotations in the figure: white board, copy machine, chairs, tables, cabinet, desks, partition; legend: stereo, LRF, free space.]

[Japanese discussion of the result, keyed to panels (a)–(d) of Fig. 8, not recovered; it refers back to the stereo uncertainty model of 3.3.1.]

6. [Conclusion]

[Japanese text not recovered.]

References
[1] (In Japanese; authors and title not recovered.) Vol. 18, No. 6, pp. 896–901, 2000.
[2] D. Murray and J. Little: "Using Real-Time Stereo Vision for Mobile Robot Navigation," Autonomous Robots, Vol. 8, No. 2, pp. 161–171, 2000.
[3] H. Koyasu, J. Miura, and Y. Shirai: "Recognizing Moving Obstacles for Robot Navigation Using Real-time Omnidirectional Stereo Vision," J. of Robotics and Mechatronics, Vol. 14, No. 2, pp. 147–156, 2002.
[4] M. Lindstrom and J.-O. Eklundh: "Detecting and Tracking Moving Objects from a Mobile Platform using a Laser Range Scanner," Proc. of IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 1364–1369, 2001.
[5] E. Prassler and J. Scholz: "Tracking Multiple Moving Objects for Real-Time Navigation," Autonomous Robots, Vol. 8, No. 2, pp. 105–116, 2000.
[6] A. Elfes: "Sonar-Based Real-World Mapping and Navigation," IEEE J. of Robotics and Automation, Vol. RA-3, No. 3, pp. 249–265, 1987.
[7] (In Japanese; authors and title not recovered.) Vol. 31, No. 12, pp. 1743–1754, 1990.
[8] N. Ayache and O.D. Faugeras: "Maintaining Representations of the Environment of a Mobile Robot," IEEE Trans. on Robotics and Automation, Vol. RA-5, No. 6, pp. 804–819, 1989.
[9] T. Yata, A. Ohya, and S. Yuta: "Fusion of Omni-directional Sonar and Omni-directional Vision for Environment Recognition of Mobile Robots," Proc. of the 2001 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 3926–3931, 2001.
[10] J. Gluckman, S.K. Nayar, and K.J. Thoresz: "Real-Time Omnidirectional and Panoramic Stereo," Proc. Image Understanding Workshop, Vol. 1, pp. 299–303, 1998.
[11] S. Thrun: "Learning Occupancy Grids with Forward Models," Proc. of 2001 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 1676–1681, 2001.
[12] Y. Sawano, J. Miura, and Y. Shirai: "Man Chasing Robot by an Environment Recognition Using Stereo Vision," Proc. of the 2000 Int. Conf. on Machine Automation, pp. 389–394, 2000.
[13] I.J. Cox: "Blanche: Position Estimation for an Autonomous Robot Vehicle," Proc. of IEEE/RSJ Int. Workshop on Intelligent Robots and Systems, pp. 432–439, 1989.
[14] F. Lu and E.E. Milios: "Robot Pose Estimation in Unknown Environments by Matching 2D Range Scans," Proc. of IEEE Conf. on Computer Vision and Pattern Recognition, 1994.
[15] (In Japanese; authors and title not recovered.) No. 2, pp. 115–116, 2002.

Yoshiro Negishi [biography in Japanese not recovered]
Jun Miura [biography in Japanese not recovered]
Yoshiaki Shirai [biography in Japanese not recovered]