NAVIGATION SYSTEM OF AN OUTDOOR SERVICE ROBOT WITH HYBRID LOCOMOTION SYSTEM
Jorma Selkäinaho, Aarne Halme and Janne Paanajärvi
Automation Technology Laboratory, Helsinki University of Technology, Espoo, Finland

Abstract: The paper deals with the navigation system of a hybrid service robot intended for outdoor applications. The system uses different navigation sensors depending on the current situation and the task being done. The main sensors are a laser range finder, GPS and a heading gyro.

Keywords: autonomous navigation, hybrid locomotion, outdoor robotics

1 INTRODUCTION

Service robots intended to work outdoors must travel relatively long distances over variable terrain. The navigation requirements are demanding, because the robot needs accurate navigation close to the work positions and long-distance navigation when traveling between them. The navigation system must be reliable and easy to use, both by the user directly and by other tasks in the software system. WorkPartner, shown in Fig. 1, is a service robot presently under development at the HUT Automation Technology Laboratory [Halme et al., 2001]. It is a centaur-like general-purpose service robot intended to work in urban environments. The working area consists mainly of outdoor places, like yards and parks, but also covered areas like large halls or storages. GPS is a natural positioning system when available, but because availability is not always guaranteed and its accuracy is not sufficient for all needs, other means have been considered as well: a laser range finder, a heading gyro and odometry. Because of the relatively complex kinematics of the robot mobility system, with its different locomotion modes, odometry is not a trivial thing. Variable situations are managed by fusing data from the different sensors and by switching logically between different configurations of the navigation system.
Although the navigation requirements may vary during a mission, we start from the fact that 3D pose estimation is mostly needed. Apart from radio beacon systems like GPS, or GSM-network-based systems in the future, other artificial-beacon-based systems cannot be considered, because they would restrict the use of the robot too much. Therefore, only natural landmarks can be utilized. The robot has a stereo camera system available, but it was decided to use a scanning laser range finder, because the distances to suitable structures can vary greatly, as can the illumination conditions, from dark to sunshine. Cameras are, however, used when controlling the tasks of the two-hand manipulator system. The accuracy requirements depend strongly on the tasks included in the mission, but a typical accuracy needed for successful task execution with the two-hand manipulator is a few centimeters, and half a meter when traveling.

2 NAVIGATION UNIT

WorkPartner must know its attitude in order to be able to control its stability. The pitch and roll angles of the front body are measured by accelerometers.

Fig. 1 The WorkPartner robot.

In the beginning of the project, a three-axial magnetometer was chosen as the primary non-drifting heading sensor. It had been used successfully inside a passenger car after iron calibration. However, it turned out to be too sensitive to disturbances originating from the electric motor currents when installed on the robot. Odometry is commonly used for navigation on flat surfaces. However, WorkPartner will operate on difficult surfaces where conventional odometry is not reliable. Changes in locomotion mode also make accurate odometry difficult. Therefore,
navigation is based on matching successive laser scans in the vicinity of buildings and trees, where GPS navigation does not work. In open areas GPS navigation can be used, whereas the laser scanner does not work there because of the lack of targets. GPS and the laser scanner are complementary methods that support each other. Odometry is considered an optional method. Velocity and heading are measured while moving by using a GPS receiver; the required minimum velocity is approximately 0.1 m/s. The velocity and heading measurements are based on the Doppler effect of the signals from at least three GPS satellites. The velocity measurement accuracy is about 0.1 m/s. When fewer than three satellites are available for a position fix, dead reckoning based on the heading gyro and laser mapping is used. This is especially the case when the robot is operating indoors. Natural landmarks are detected by using a 2D laser scanner installed in the manipulator. The other navigation sensors are installed in a navigation unit, which includes a 400 MHz Pentium computer in PC104 size. Additionally, a 12-bit A/D converter card in PC104 size is used for reading the Murata ENV piezogyro and the two accelerometers. The laser scanner and the GPS receiver communicate through two serial ports on the motherboard. The operating system used is QNX. Operation commands to the computer controlling the articulation angle, leg movements and wheel rpms are sent through an Ethernet cable.

3 SENSOR FUSION

Indoors, or when fewer than three GPS satellites are in sight, navigation is based on the heading gyro and laser scan matching. In structured environments wall lines can be identified and used for navigation [Jensfelt, 1999], [Pears, 2000]. In our case the maps are made from laser range measurements of unstructured objects like bushes and tree trunks. Matching of successive laser scans is used to measure the change in position and heading.
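The switching between the complementary pose sources described above can be summarized in a short sketch. This is an illustration only, not the authors' software; the function and mode names are hypothetical:

```python
def select_pose_source(num_satellites, scan_has_targets):
    """Choose a pose update method, following the fusion logic in the text.

    GPS Doppler velocity/heading needs at least three satellites; near
    buildings and trees the laser scanner takes over; the heading gyro
    with odometry remains a fallback. All names here are illustrative.
    """
    if num_satellites >= 3:
        return "gps"              # open areas: GPS velocity + heading
    if scan_has_targets:
        return "scan_matching"    # near structures: match successive scans
    return "gyro_odometry"        # fallback: heading gyro + odometry

# Example: indoors, no satellites, but structures visible to the scanner
assert select_pose_source(0, True) == "scan_matching"
```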
The assumption behind this approach is that the time interval, and hence the displacement of the robot, between successive scans is small. This reduces the ambiguity associated with this type of approach, because the scenery changes very little between successive scans, ensuring high overlap and limiting the effects of occlusion. Further, for small time intervals the displacement can be predicted with high accuracy using the vehicle model and GPS-based odometry. This information can be used to constrain the search for matches, which also decreases ambiguity and the amount of computation needed. The laser range finder measures distance and bearing to obstacles up to 30 meters away. On every scan a total of 361 measurements at half-degree intervals are obtained. The laser range finder measures the distance and bearing in a coordinate frame fixed to the front part of the robot. When the robot is moving, the same obstacles are seen from different viewing points. In order to build a map, a static map coordinate system must be chosen. A natural choice for the map coordinate system is the position and heading of the robot front body when it takes the first laser scan. The 361 ranges and bearings of a laser scan are transformed from polar to Cartesian coordinates, and a map consists of these 361 x,y pairs. The next laser scan is taken in the front-body coordinates of the moving robot. The translation and rotation of the robot body coordinate system between two successive viewing points can be found by matching the corresponding laser maps in Cartesian coordinates. The matching is based on rotating and translating the most recent map and then measuring the goodness of fit between the two successive maps. A hit is obtained if the distance between a point in the previous map and any point in the latest map is less than 0.2 meters. This value comes from the fact that the resolution of the laser range finder is 0.26 meters at a distance of 30 meters.
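The polar-to-Cartesian transformation and the hit-counting fitness measure can be sketched as follows. This is an illustrative reading of the text, not the authors' code; the beam-angle convention relative to the heading is an assumption:

```python
import math

def scan_to_cartesian(ranges, x, y, heading):
    """Transform a 361-beam scan (0..180 deg at 0.5 deg steps, laser
    polar coordinates) into map-frame x,y points, given the scanner
    pose (x, y, heading in radians) in the map frame. The alignment of
    beam 0 with the heading is an assumed convention."""
    points = []
    for i, r in enumerate(ranges):
        a = heading + math.radians(0.5 * i)   # beam direction in map frame
        points.append((x + r * math.cos(a), y + r * math.sin(a)))
    return points

def count_hits(prev_map, new_map, threshold=0.2):
    """Goodness of fit between two maps: a point of the previous map
    scores a hit if any point of the new map lies within `threshold`
    meters (0.2 m in the text, reflecting the 0.26 m beam spacing at
    the 30 m maximum range)."""
    hits = 0
    for px, py in prev_map:
        if any(math.hypot(px - qx, py - qy) < threshold for qx, qy in new_map):
            hits += 1
    return hits
```

The candidate rotation and translation that maximize `count_hits` over the search window give the incremental pose change.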
The number of hits is maximized by a direct search algorithm. The searched area in the x and y directions extends from -0.4 meters to 0.4 meters in 0.2 meter steps around the predicted position. The searched heading extends from -6 degrees to 6 degrees in 1 degree steps around the predicted heading. The predicted position and heading are obtained from the GPS velocity and heading message. When GPS measurements are not available, the heading gyro is used for predicting the heading. Odometry can be used for predicting the travelled distance. The robot is able to initialize its position and heading relative to a map made earlier in the same area. The local initialization map consists of 361 pairs of x and y coordinates; the x-axis points east and the y-axis points north. Since the piezogyro drifts, continuous calibration of the reference voltage corresponding to zero angular velocity is needed. In indoor use the heading increment obtained from map matching is used to update the reference voltage. In outdoor environments the GPS heading value is used for the reference voltage calibration. The reference voltage update
is restricted to cases where the robot is not making any fast heading changes. The reference voltage is initialized in the rest state, before WorkPartner has moved. Odometry can be used for computing the predicted position. In outdoor environments the NMEA VTG message is used to compute the predicted position and heading. The position estimate drifts slowly, and hence the NMEA GGA message is used to make small corrections to the position estimate. The correction gain is equal to 0.02/HDOP, where HDOP is the horizontal dilution of precision obtained from the GPS receiver. When NMEA GGA is used, the GPS coordinates of the starting point should be known with some accuracy. The route points to the goal are picked from a map (Fig. 5) showing the current working environment. This map has been joined from consecutive laser scans taken as the robot has moved in the working area. When navigating, the robot orients towards the current route point until the distance from the current position to the next route point is less than the distance between the current and next route points. The laser scanner is also used for reactive obstacle avoidance. When an obstacle appears at a distance of less than 0.2 m from the side of the planned trajectory up to 9 s forward in time, a turn to the other direction is commanded. When an obstacle appears at a distance of less than 3 m in front of the robot, a zero velocity command is sent to the main computer. The simple navigation model that describes the movements projected on a horizontal plane fixed at the center of the front body coordinates is given by (1):

    dx1/dt = x4 * sin(x3)
    dx2/dt = x4 * cos(x3)    (1)

where x1 is the position to the east, x2 is the position to the north, x3 is the heading and x4 is the velocity. The laser range finder defines the origin of the coordinates.

4 NAVIGATION EXPERIMENTS

Some experiments have been performed in the vicinity of a three-story university office building [Selkäinaho et al., 2000]. The WorkPartner robot was driven manually along a 6 meter wide road.
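Model (1) is integrated from the GPS velocity and heading samples with Euler's method, as in the experiments below. A minimal sketch of that integration; the one-second sample period and the function name are assumptions, not the authors' implementation:

```python
import math

def dead_reckon(samples, dt=1.0, east0=0.0, north0=0.0):
    """Euler-integrate model (1): dx1/dt = x4*sin(x3), dx2/dt = x4*cos(x3),
    with heading x3 in radians measured from north (so east uses sin).
    `samples` is a list of (velocity m/s, heading rad) pairs, e.g. parsed
    from NMEA VTG messages; the parsing itself is omitted here."""
    east, north = east0, north0
    for v, h in samples:
        east += v * math.sin(h) * dt
        north += v * math.cos(h) * dt
    return east, north

# Example: 10 s straight north at 1 m/s, then 10 s straight east
route = [(1.0, 0.0)] * 10 + [(1.0, math.pi / 2)] * 10
print(dead_reckon(route))   # roughly (10.0, 10.0)
```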
Figure 2 shows the route computed by integrating the velocity and heading obtained from the GPS receiver. The origin of the local coordinates is N and E. After the 308 meter route, the position drift caused by the Euler integration is equal to -0.5 meters north and 1.7 meters east. Figure 3 shows the GPS position fixes obtained on the same route. There are position jumps of over 10 meters. Because metal plates cover the university building, some multipath fixes did occur. The route computed from the velocity and heading measurements is much more accurate than that obtained from the position message (NMEA GGA).

Fig. 2 WorkPartner route in meters, integrated from GPS velocity and heading.

Tests were also made to evaluate the odometry information obtained from matching successive laser scans. The search area used in finding the match extended from -0.5 m to 0.5 m sideways, from -0.6 m to 0.9 m in the heading direction, and from -20 degrees to 20 degrees in heading displacement. The position was searched in 0.1 m steps and the heading in steps of one degree.

Fig. 3 WorkPartner route in meters, plotted from GPS position messages.

After the best estimate for the displacement was found, it was fine-tuned by minimizing the sum of squared angle and range displacements of the matched points with a quasi-Newton method. Matching two 361-point scans took under half a second on a 533 MHz Pentium III. The laser data was gathered on a route along the side of the university building, corresponding roughly to the
lower portion of the route in Figures 2 and 3. A total of 220 laser scans were gathered and integrated together. Figure 4 shows the route obtained from the displacement estimates. A drift of some meters can be seen, as the starting and ending positions differ. The environment included bushes, groups of young trees and the wall of the university building.

Fig. 4 A route computed by matching successive laser scans. The place is the same as the left wing of the route in Fig. 2.

In the third experiment, autonomous navigation was tested. The robot first initialized its position and heading by comparing the laser scanner readings to a map made by the laser scanner earlier in the same area. The navigation unit read a route map given in local laser map (Fig. 5) coordinates. The navigation task was to autonomously move to a trailer placed about 50 meters away in the yard, pick up a cardboard box and return it to the start position. The task was started without any GPS readings, because the building blocked many satellites, so successive laser scan matching was used. After about one minute the robot was able to see enough satellites, and it then predicted new poses based on GPS while matching successive laser scans all the way to the trailer.

5 LASER MAPPING

A mobile robot can build a map of its environment by taking laser range measurements of natural beacons, such as bushes and tree trunks, along its route. Map building requires that the range and bearing measurements taken from different viewpoints are rotated and translated into the same coordinates. These coordinate transformations require an accurate position and heading of the laser range finder; any cumulative error in the pose estimate causes distortion in the map. [Lu and Milios, 1995] have shown that this problem can be avoided by taking every pose where a laser scan has been taken as an estimated state.
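Aligning scans taken from different viewpoints amounts to estimating a rigid transform between matched point sets. The paper fine-tunes the match with a quasi-Newton minimization of squared angle and range displacements; a common closed-form alternative for matched 2-D point pairs, shown here purely for illustration, is the least-squares rigid (Procrustes) fit:

```python
import math

def rigid_fit_2d(src, dst):
    """Least-squares rigid alignment in 2-D: find heading theta and
    translation (tx, ty) minimizing sum |R(theta)*p + t - q|^2 over
    matched pairs (p in src, q in dst). Closed form via centroids."""
    n = len(src)
    sx = sum(p[0] for p in src) / n; sy = sum(p[1] for p in src) / n
    dx = sum(q[0] for q in dst) / n; dy = sum(q[1] for q in dst) / n
    num = den = 0.0
    for (px, py), (qx, qy) in zip(src, dst):
        ax, ay = px - sx, py - sy          # centered source point
        bx, by = qx - dx, qy - dy          # centered target point
        num += ax * by - ay * bx           # cross terms -> sin component
        den += ax * bx + ay * by           # dot terms  -> cos component
    theta = math.atan2(num, den)
    c, s = math.cos(theta), math.sin(theta)
    tx = dx - (c * sx - s * sy)            # t = dst centroid - R * src centroid
    ty = dy - (s * sx + c * sy)
    return theta, tx, ty
```

With noise-free correspondences this recovers the exact rotation and translation; with the real matched laser points it gives the least-squares estimate.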
In the following, real-time computation is required; the old pose estimates are not updated with new laser scan data, so some drift error will accumulate in the pose. It would be possible to correct old pose estimates in real time as well, but this requires some development work. An exact laser-based map could be made by taking only a single laser scan with a long-range model such as the Riegl 3D laser range finder, which reaches up to 350 m with 0.17 degree angular resolution [Forsman, 2001]. A Sick laser range finder installed in the manipulator was used to form a map of the university building's neighborhood. The map built from range measurements from several viewpoints is called the global map, and a map built from range measurements from one viewpoint is called a local map. A laser scan consists of 361 range measurements with corresponding bearing values from 0 degrees to 180 degrees in laser coordinates. Matching of successive local maps was used to compute the incremental change in the position and heading of the robot. The latest local map was then transformed into global map coordinates and compared to the global map. If the number of common points in the two successive local maps was over a predefined threshold, then those points of the latest local map which are at least 0.2 m away from the old global map points were added to the updated global map. Spurious points that exist in only one laser scan are not filtered out of the map. The map (Fig. 5) was used to define a route to the goal of the working task. The map was in the form of coordinates of occupancy grid points: the occupancy grid size used in map building was 0.2 m, and only the coordinates of occupied cells were stored. When the map is matched accurately from several consecutive scans, it can be used for position and heading estimation.

Fig. 5 A 2D map built from laser range measurements at successive viewpoints.
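The global-map update step above can be sketched with an occupancy-grid index, matching the 0.2 m cell size in the text. This is an illustrative approximation, not the authors' code: the 3x3-neighbourhood test is conservative (it catches every existing point within 0.2 m, but may also reject some points slightly farther away), and the helper names are hypothetical:

```python
def update_global_map(global_pts, local_pts, cell=0.2):
    """Add local-map points (already transformed to global coordinates)
    that lie at least ~0.2 m from every existing global-map point.
    A set of occupied 0.2 m grid cells gives a cheap proximity test:
    any existing point within 0.2 m must fall in the same cell or one
    of its 8 neighbours."""
    occupied = {(round(x / cell), round(y / cell)) for x, y in global_pts}
    out = list(global_pts)
    for x, y in local_pts:
        i, j = round(x / cell), round(y / cell)
        near = any((i + di, j + dj) in occupied
                   for di in (-1, 0, 1) for dj in (-1, 0, 1))
        if not near:
            out.append((x, y))        # genuinely new point: keep it
            occupied.add((i, j))
    return out
```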
The laser scanner operates in a horizontal plane fixed to the robot manipulator. Because the robot moves in pitch and roll, and the manipulator pitches, the resulting depth picture is partially three-dimensional.

6 CONCLUSION AND FUTURE WORK

The navigation sensors used in the WorkPartner robot have been presented in this paper, together with some early test results. The ordinary GPS receiver became an accurate velocity and heading sensor on May 1st, 2000, when the US government stopped intentionally degrading the positioning signal. The NMEA VTG message of the receiver gives non-drifting heading information better than most optical gyros, and the velocity information is comparable or better in accuracy than ground speed radar. Since the GPS receiver needs at least three satellites in sight, other sensors must also be used. A piezogyro was used as a secondary heading sensor. The WorkPartner articulation angle and velocity commands can be used to compute the velocity when the GPS satellites are blocked. A laser scanner offers an alternative method for position and heading measurement relative to natural landmarks. It also carries out the task of detecting obstacles in the environment. By combining pitch and roll information with a 2-dimensional laser scanner, a partially 3-dimensional view can be obtained. The above presentation shows some early tests that will be followed by new ones in the near future. Methods like iterative closest point [Madhavan et al., 1998] or particle filters will be used. Particle filters have been used successfully in indoor environments [Thrun et al., 2000].

ACKNOWLEDGEMENTS

This work has been supported by a grant from Sandvik/Tamrock Ltd.

REFERENCES

[Forsman, 2001] Pekka Forsman. Feature Based Registration of 3D Perception Data for Indoor and Outdoor Map Building. Proceedings of the 3rd International Conference on Field and Service Robotics, Espoo, Finland.
[Halme et al., 2001] Aarne Halme, Ilkka Leppänen, Sami Salmi, and Sami Ylönen. Hybrid Locomotion of the WorkPartner Service Robot. Proceedings of the 3rd International Conference on Field and Service Robotics, Espoo, Finland.
[Jensfelt, 1999] Patric Jensfelt. Localization using Laser Scanning and Minimalistic Environmental Models. Licentiate thesis, Royal Institute of Technology, Stockholm. 129 p.
[Lu and Milios, 1995] Feng Lu and Evangelos Milios. Optimal Global Pose Estimation for Consistent Sensor Data Registration. IEEE International Conference on Robotics and Automation.
[Madhavan et al., 1998] Ray Madhavan, M. Dissanayake and Hugh Durrant-Whyte. Autonomous Underground Navigation of an LHD using a Combined ICP-EKF Approach. Proceedings of the 1998 IEEE International Conference on Robotics & Automation, Leuven.
[Pears, 2000] Nick Pears. Feature extraction and tracking for scanning range sensors. Robotics and Autonomous Systems 33.
[Selkäinaho et al., 2000] Jorma Selkäinaho and Janne Paanajärvi. On Tuning of Kalman Filter. International Symposium on Robotics and Automation, Monterrey.
[Thrun et al., 2000] Sebastian Thrun, Dieter Fox, Wolfram Burgard and Frank Dellaert. Robust Monte Carlo Localization for Mobile Robots. Artificial Intelligence.
Autonomous Mobile Robot Design Topic: EKF-based SLAM Dr. Kostas Alexis (CSE) These slides have partially relied on the course of C. Stachniss, Robot Mapping - WS 2013/14 Autonomous Robot Challenges Where
More informationCamera Drones Lecture 2 Control and Sensors
Camera Drones Lecture 2 Control and Sensors Ass.Prof. Friedrich Fraundorfer WS 2017 1 Outline Quadrotor control principles Sensors 2 Quadrotor control - Hovering Hovering means quadrotor needs to hold
More informationRobot Mapping for Rescue Robots
Robot Mapping for Rescue Robots Nagesh Adluru Temple University Philadelphia, PA, USA nagesh@temple.edu Longin Jan Latecki Temple University Philadelphia, PA, USA latecki@temple.edu Rolf Lakaemper Temple
More informationRobot Mapping. SLAM Front-Ends. Cyrill Stachniss. Partial image courtesy: Edwin Olson 1
Robot Mapping SLAM Front-Ends Cyrill Stachniss Partial image courtesy: Edwin Olson 1 Graph-Based SLAM Constraints connect the nodes through odometry and observations Robot pose Constraint 2 Graph-Based
More informationRobotic Mapping. Outline. Introduction (Tom)
Outline Robotic Mapping 6.834 Student Lecture Itamar Kahn, Thomas Lin, Yuval Mazor Introduction (Tom) Kalman Filtering (Itamar) J.J. Leonard and H.J.S. Feder. A computationally efficient method for large-scale
More informationWhere s the Boss? : Monte Carlo Localization for an Autonomous Ground Vehicle using an Aerial Lidar Map
Where s the Boss? : Monte Carlo Localization for an Autonomous Ground Vehicle using an Aerial Lidar Map Sebastian Scherer, Young-Woo Seo, and Prasanna Velagapudi October 16, 2007 Robotics Institute Carnegie
More informationUAV Autonomous Navigation in a GPS-limited Urban Environment
UAV Autonomous Navigation in a GPS-limited Urban Environment Yoko Watanabe DCSD/CDIN JSO-Aerial Robotics 2014/10/02-03 Introduction 2 Global objective Development of a UAV onboard system to maintain flight
More informationProbabilistic Matching for 3D Scan Registration
Probabilistic Matching for 3D Scan Registration Dirk Hähnel Wolfram Burgard Department of Computer Science, University of Freiburg, 79110 Freiburg, Germany Abstract In this paper we consider the problem
More informationSimulation of a mobile robot with a LRF in a 2D environment and map building
Simulation of a mobile robot with a LRF in a 2D environment and map building Teslić L. 1, Klančar G. 2, and Škrjanc I. 3 1 Faculty of Electrical Engineering, University of Ljubljana, Tržaška 25, 1000 Ljubljana,
More informationA Stochastic Environment Modeling Method for Mobile Robot by using 2-D Laser scanner Young D. Kwon,Jin.S Lee Department of Electrical Engineering, Poh
A Stochastic Environment Modeling Method for Mobile Robot by using -D Laser scanner Young D. Kwon,Jin.S Lee Department of Electrical Engineering, Pohang University of Science and Technology, e-mail: jsoo@vision.postech.ac.kr
More informationAN INCREMENTAL SLAM ALGORITHM FOR INDOOR AUTONOMOUS NAVIGATION
20th IMEKO TC4 International Symposium and 18th International Workshop on ADC Modelling and Testing Research on Electric and Electronic Measurement for the Economic Upturn Benevento, Italy, September 15-17,
More informationFinal project: 45% of the grade, 10% presentation, 35% write-up. Presentations: in lecture Dec 1 and schedule:
Announcements PS2: due Friday 23:59pm. Final project: 45% of the grade, 10% presentation, 35% write-up Presentations: in lecture Dec 1 and 3 --- schedule: CS 287: Advanced Robotics Fall 2009 Lecture 24:
More informationToday MAPS AND MAPPING. Features. process of creating maps. More likely features are things that can be extracted from images:
MAPS AND MAPPING Features In the first class we said that navigation begins with what the robot can see. There are several forms this might take, but it will depend on: What sensors the robot has What
More informationAnnouncements. Recap Landmark based SLAM. Types of SLAM-Problems. Occupancy Grid Maps. Grid-based SLAM. Page 1. CS 287: Advanced Robotics Fall 2009
Announcements PS2: due Friday 23:59pm. Final project: 45% of the grade, 10% presentation, 35% write-up Presentations: in lecture Dec 1 and 3 --- schedule: CS 287: Advanced Robotics Fall 2009 Lecture 24:
More information3D Terrain Sensing System using Laser Range Finder with Arm-Type Movable Unit
3D Terrain Sensing System using Laser Range Finder with Arm-Type Movable Unit 9 Toyomi Fujita and Yuya Kondo Tohoku Institute of Technology Japan 1. Introduction A 3D configuration and terrain sensing
More informationBuild and Test Plan: IGV Team
Build and Test Plan: IGV Team 2/6/2008 William Burke Donaldson Diego Gonzales David Mustain Ray Laser Range Finder Week 3 Jan 29 The laser range finder will be set-up in the lab and connected to the computer
More informationSpring Localization II. Roland Siegwart, Margarita Chli, Martin Rufli. ASL Autonomous Systems Lab. Autonomous Mobile Robots
Spring 2016 Localization II Localization I 25.04.2016 1 knowledge, data base mission commands Localization Map Building environment model local map position global map Cognition Path Planning path Perception
More informationEvaluating the Performance of a Vehicle Pose Measurement System
Evaluating the Performance of a Vehicle Pose Measurement System Harry Scott Sandor Szabo National Institute of Standards and Technology Abstract A method is presented for evaluating the performance of
More informationRobotics and Autonomous Systems
Robotics and Autonomous Systems Lecture 6: Perception/Odometry Terry Payne Department of Computer Science University of Liverpool 1 / 47 Today We ll talk about perception and motor control. 2 / 47 Perception
More informationRobotics and Autonomous Systems
Robotics and Autonomous Systems Lecture 6: Perception/Odometry Simon Parsons Department of Computer Science University of Liverpool 1 / 47 Today We ll talk about perception and motor control. 2 / 47 Perception
More informationIntroduction to Mobile Robotics
Introduction to Mobile Robotics Gaussian Processes Wolfram Burgard Cyrill Stachniss Giorgio Grisetti Maren Bennewitz Christian Plagemann SS08, University of Freiburg, Department for Computer Science Announcement
More informationPATENT LIABILITY ANALYSIS. Daniel Barrett Sebastian Hening Sandunmalee Abeyratne Anthony Myers
PATENT LIABILITY ANALYSIS Autonomous Targeting Vehicle (ATV) Daniel Barrett Sebastian Hening Sandunmalee Abeyratne Anthony Myers Autonomous wheeled vehicle with obstacle avoidance Two infrared range finder
More informationInteractive SLAM using Laser and Advanced Sonar
Interactive SLAM using Laser and Advanced Sonar Albert Diosi, Geoffrey Taylor and Lindsay Kleeman ARC Centre for Perceptive and Intelligent Machines in Complex Environments Department of Electrical and
More information9th Intelligent Ground Vehicle Competition. Design Competition Written Report. Design Change Report AMIGO
9th Intelligent Ground Vehicle Competition Design Competition Written Report Design Change Report AMIGO AMIGO means the friends who will join to the IGV Competition. Watanabe Laboratory Team System Control
More informationA Genetic Algorithm for Simultaneous Localization and Mapping
A Genetic Algorithm for Simultaneous Localization and Mapping Tom Duckett Centre for Applied Autonomous Sensor Systems Dept. of Technology, Örebro University SE-70182 Örebro, Sweden http://www.aass.oru.se
More informationRobotics. Haslum COMP3620/6320
Robotics P@trik Haslum COMP3620/6320 Introduction Robotics Industrial Automation * Repetitive manipulation tasks (assembly, etc). * Well-known, controlled environment. * High-power, high-precision, very
More informationFusion Between Laser and Stereo Vision Data For Moving Objects Tracking In Intersection Like Scenario
Fusion Between Laser and Stereo Vision Data For Moving Objects Tracking In Intersection Like Scenario Qadeer Baig, Olivier Aycard, Trung Dung Vu and Thierry Fraichard Abstract Using multiple sensors in
More informationVINet: Visual-Inertial Odometry as a Sequence-to-Sequence Learning Problem
VINet: Visual-Inertial Odometry as a Sequence-to-Sequence Learning Problem Presented by: Justin Gorgen Yen-ting Chen Hao-en Sung Haifeng Huang University of California, San Diego May 23, 2017 Original
More informationContinuous Localization Using Evidence Grids *
Proceedings of the 1998 IEEE International Conference on Robotics & Automation Leuven, Belgium May 1998 Continuous Localization Using Evidence Grids * Alan C. Schultz and William Adams Navy Center for
More informationRobust Laser Scan Matching in Dynamic Environments
Proceedings of the 2009 IEEE International Conference on Robotics and Biomimetics December 19-23, 2009, Guilin, China Robust Laser Scan Matching in Dynamic Environments Hee-Young Kim, Sung-On Lee and Bum-Jae
More informationEE565:Mobile Robotics Lecture 2
EE565:Mobile Robotics Lecture 2 Welcome Dr. Ing. Ahmad Kamal Nasir Organization Lab Course Lab grading policy (40%) Attendance = 10 % In-Lab tasks = 30 % Lab assignment + viva = 60 % Make a group Either
More informationF1/10 th Autonomous Racing. Localization. Nischal K N
F1/10 th Autonomous Racing Localization Nischal K N System Overview Mapping Hector Mapping Localization Path Planning Control System Overview Mapping Hector Mapping Localization Adaptive Monte Carlo Localization
More informationCSE 527: Introduction to Computer Vision
CSE 527: Introduction to Computer Vision Week 10 Class 2: Visual Odometry November 2nd, 2017 Today Visual Odometry Intro Algorithm SLAM Visual Odometry Input Output Images, Video Camera trajectory, motion
More informationHigh Accuracy Navigation Using Laser Range Sensors in Outdoor Applications
Proceedings of the 2000 IEEE International Conference on Robotics & Automation San Francisco, CA April 2000 High Accuracy Navigation Using Laser Range Sensors in Outdoor Applications Jose Guivant, Eduardo
More informationLaser-based Geometric Modeling using Cooperative Multiple Mobile Robots
009 IEEE International Conference on Robotics and Automation Kobe International Conference Center Kobe, Japan, May -7, 009 Laser-based Geometric Modeling using Cooperative Multiple Mobile Robots Ryo Kurazume,
More informationVisual Bearing-Only Simultaneous Localization and Mapping with Improved Feature Matching
Visual Bearing-Only Simultaneous Localization and Mapping with Improved Feature Matching Hauke Strasdat, Cyrill Stachniss, Maren Bennewitz, and Wolfram Burgard Computer Science Institute, University of
More informationA MOBILE ROBOT MAPPING SYSTEM WITH AN INFORMATION-BASED EXPLORATION STRATEGY
A MOBILE ROBOT MAPPING SYSTEM WITH AN INFORMATION-BASED EXPLORATION STRATEGY Francesco Amigoni, Vincenzo Caglioti, Umberto Galtarossa Dipartimento di Elettronica e Informazione, Politecnico di Milano Piazza
More informationProbabilistic Robotics
Probabilistic Robotics Sebastian Thrun Wolfram Burgard Dieter Fox The MIT Press Cambridge, Massachusetts London, England Preface xvii Acknowledgments xix I Basics 1 1 Introduction 3 1.1 Uncertainty in
More informationExperiments in Free-Space Triangulation Using Cooperative Localization
Experiments in Free-Space Triangulation Using Cooperative Localization Ioannis Rekleitis 1, Gregory Dudek 2 and Evangelos Milios 3 1 Mechanical Engineering, Carnegie Mellon University, Pittsburgh, PA,
More informationUsing LMS-100 Laser Rangefinder for Indoor Metric Map Building
Using LMS-100 Laser Rangefinder for Indoor Metric Map Building János Rudan 1, Zoltán Tuza 1 and Gábor Szederkényi 1,2 1 Faculty of Information Technology, Pázmány Péter Catholic University H-1083 Budapest,
More informationFinal Exam Practice Fall Semester, 2012
COS 495 - Autonomous Robot Navigation Final Exam Practice Fall Semester, 2012 Duration: Total Marks: 70 Closed Book 2 hours Start Time: End Time: By signing this exam, I agree to the honor code Name: Signature:
More informationROBOTICS AND AUTONOMOUS SYSTEMS
ROBOTICS AND AUTONOMOUS SYSTEMS Simon Parsons Department of Computer Science University of Liverpool LECTURE 6 PERCEPTION/ODOMETRY comp329-2013-parsons-lect06 2/43 Today We ll talk about perception and
More informationSpring Localization II. Roland Siegwart, Margarita Chli, Juan Nieto, Nick Lawrance. ASL Autonomous Systems Lab. Autonomous Mobile Robots
Spring 2018 Localization II Localization I 16.04.2018 1 knowledge, data base mission commands Localization Map Building environment model local map position global map Cognition Path Planning path Perception
More informationPacSLAM Arunkumar Byravan, Tanner Schmidt, Erzhuo Wang
PacSLAM Arunkumar Byravan, Tanner Schmidt, Erzhuo Wang Project Goals The goal of this project was to tackle the simultaneous localization and mapping (SLAM) problem for mobile robots. Essentially, the
More informationControl of a quadrotor manipulating a beam (2 projects available)
Control of a quadrotor manipulating a beam (2 projects available) Supervisor: Emanuele Garone (egarone@ulb.ac.be), Tam Nguyen, Laurent Catoire General Goal: The goal of this project is to complete from
More informationW4. Perception & Situation Awareness & Decision making
W4. Perception & Situation Awareness & Decision making Robot Perception for Dynamic environments: Outline & DP-Grids concept Dynamic Probabilistic Grids Bayesian Occupancy Filter concept Dynamic Probabilistic
More informationSLAM: Robotic Simultaneous Location and Mapping
SLAM: Robotic Simultaneous Location and Mapping William Regli Department of Computer Science (and Departments of ECE and MEM) Drexel University Acknowledgments to Sebastian Thrun & others SLAM Lecture
More informationIndoor navigation using smartphones. Chris Hide IESSG, University of Nottingham, UK
Indoor navigation using smartphones Chris Hide IESSG, University of Nottingham, UK Overview Smartphones Available sensors Current positioning methods Positioning research at IESSG 1. Wi-Fi fingerprinting
More informationRobot Localization: Historical Context
Robot Localization: Historical Context Initially, roboticists thought the world could be modeled exactly Path planning and control assumed perfect, exact, deterministic world Reactive robotics (behavior
More informationTesting the Possibilities of Using IMUs with Different Types of Movements
137 Testing the Possibilities of Using IMUs with Different Types of Movements Kajánek, P. and Kopáčik A. Slovak University of Technology, Faculty of Civil Engineering, Radlinského 11, 81368 Bratislava,
More information