SLAM With Sparse Sensing

To appear in the 2006 IEEE International Conference on Robotics & Automation (ICRA 2006)

SLAM With Sparse Sensing

Kristopher R. Beevers    Wesley H. Huang
Rensselaer Polytechnic Institute, Department of Computer Science
110 8th Street, Troy, NY 12180, U.S.A.

Abstract: Most work on the simultaneous localization and mapping (SLAM) problem assumes the frequent availability of dense information about the environment, such as that provided by a laser rangefinder. However, for implementing SLAM in consumer-oriented products such as toys or cleaning robots, it is infeasible to use expensive sensing. In this work we examine the SLAM problem for robots with very sparse sensing that provides too little data to extract features of the environment from a single scan. We modify SLAM to group several scans taken as the robot moves into multiscans, achieving higher data density in exchange for greater measurement uncertainty due to odometry error. We formulate a full system model for this approach, and then introduce simplifications that enable an efficient implementation using a Rao-Blackwellized particle filter. Finally, we describe simple algorithms for feature extraction and data association of line and line segment features from multiscans, and present experimental results using real data from several environments.

I. INTRODUCTION

Simultaneous localization and mapping (SLAM) is a fundamental capability for autonomous mobile robots that must navigate within their environment. The SLAM problem has received much attention in the robotics community, and a number of approaches have been demonstrated. However, nearly all the prevalent work on SLAM demands the use of costly, high-fidelity sensors such as laser rangefinders.

As consumer-oriented robotic devices become more widely available, the demand for mapping capabilities in these robots also grows. Robust mapping and navigation will enable many robotic applications such as robotic assistants for the elderly and disabled, security monitoring, and even smart toys. However, few consumers would pay the cost that a laser rangefinder would entail for such robots. Furthermore, a laser rangefinder imposes power, size, and computation requirements that may be infeasible for consumer applications.

In this work we present an efficient approach to enable SLAM in indoor environments on robots with much more limited sensing. We eventually plan to implement SLAM using an array of infrared rangefinders that costs less than US$40, several orders of magnitude less than a laser rangefinder.

Using such a limited sensing array demands an approach that is somewhat different from the traditional SLAM sequence of moving, sensing, extracting features, finding correspondences, and then updating the map and pose estimates. To illustrate this, Figure 1a depicts the data density of a single scan of a 180-degree laser rangefinder. A single such scan can be used to extract meaningful features such as lines or corners, or can be matched directly with the robot's map. In contrast, Figure 1b shows the data density of a scan using only five radially-spaced range sensors, the arrangement we use in our experiments. Clearly, it is much more difficult to extract features from such sparse data.

[Fig. 1. (a) A typical laser rangefinder scan arrangement, providing 180 degrees of dense coverage in one-degree increments; (b) a sparse sensor, which gives only five range readings in a single scan.]
Our approach is to group consecutive sparse scans into multiscans so that data from multiple frames can be used to extract good features; the tradeoff is that uncertainty in the robot's motion contributes to noise in feature extraction. This approach requires the pose history to be kept in order to process the multiscan. While others (e.g., [8]) have explicitly stored the pose history in the system state, we instead separate the pose history from the system state so that feature extraction is performed only once per multiscan. In conjunction with a particle filter implementation, our approach yields a reasonably efficient SLAM algorithm for robots with sparse sensing.

The paper proceeds as follows: after a brief discussion of related research, in Section II we discuss a previous approach for applying SLAM using a pose history. We then formulate the multiscan approach, examine its properties, and discuss a particle filtering version in Section III. In Section IV we describe simple methods for extracting line and line segment features from multiscans and finding correspondences. Finally, in Section V, we discuss our implementation of sparse sensing SLAM and the results of experiments on data from the Radish [5] repository.

A. Related work

Most recent SLAM work has focused on creating accurate maps using laser rangefinders, either to extract features from the environment [12] or to perform scan matching against the robot's map [4]. The particle filtering approach to SLAM has recently gained wide acceptance, and both landmark-based and scan-matching-based techniques have drawn on the strategy of Rao-Blackwellized particle filtering, first introduced in the SLAM literature by Murphy [9].

Relatively little work has examined the SLAM problem under sensing constraints such as sparseness or sensor range limitations. In the topological mapping domain, Huang and Beevers [7], [6] have dealt with both of these constraints and have developed a complete algorithm for tracing a version of the Voronoi diagram of a rectilinear environment under the L distance metric, using eight radially spaced short-range sensors. It is not immediately clear how the approach can be extended to more general environments.

In work based on techniques similar to our multiscan approach, Zunino and Christensen [16] and Wijk and Christensen [14] have used SONAR data taken from multiple locations to extract features and perform SLAM. Similarly, Leonard et al. [8] and Tardós et al. [11] have used a ring of 24 SONAR sensors for mapping with an approach we discuss in Section II. In their experiments with feature extraction based on the Hough transform and an accurate SONAR sensor model, they closed loops in several real-world environments using an Extended Kalman Filter (EKF) based mapper.

II. APPLYING SLAM WITH A POSE HISTORY

When features can only be detected from multiple poses, one approach to SLAM is to incorporate the pose history into the state vector. This is the method used by Leonard et al. [8], who applied an EKF to the resulting system. In this section, we give a brief description of this approach and how one might apply a particle filter to this system. This sets the context for our simplified approach to the problem, described in Section III.

Suppose we keep the most recent m robot poses in the system state vector. Then $x(k) = [x_r(k)\ x_f(k)]$, where $x_r(k) = [x_{t_k}\ x_{t_{k-1}}\ \ldots\ x_{t_{k-m+1}}]$ (with $x_{t_k}$ being the robot pose at time k), and $x_f(k)$ is a vector of n landmarks $[x_{f_1}\ \ldots\ x_{f_n}]$. This fits into the standard system model:

$$x(k) = f(x(k-1), u(k-1)) + v(k) \quad (1)$$
$$z(k) = h(x(k)) + w(k) \quad (2)$$

where $u(k-1)$ is the input, $z(k)$ the measurement, and $v(k)$ and $w(k)$ are the system and measurement noise.

A. Using the EKF

Leonard et al. [8] apply the EKF to this system, so they maintain an estimate of the state vector as well as its full covariance:

$$P_x = \begin{bmatrix} P_{x_r} & P_{x_r x_f} \\ P_{x_r x_f}^T & P_{x_f} \end{bmatrix} \quad (3)$$

where $P_{x_r}$, $P_{x_r x_f}$, and $P_{x_f}$ are covariance matrices for the robot pose history and the landmark locations. The key points in applying the EKF to this system are:

- When the state is projected forward, the new robot pose $x_{t_k}$ is inserted in $x_r(k)$ and the oldest pose is discarded.
- Feature extraction is done using the last m pose estimates and the corresponding m measurements.

Since the full covariance (and cross-covariance) of the past m robot poses is maintained, feature extraction can account for the uncertainty of (and correlation between) the poses from which the sensor measurements were taken. However, computing the full covariance matrix is very expensive (at least $O((m+n)^2)$ complexity), and it must be performed at each time step. Furthermore, EKF-SLAM is unable to represent multiple data association hypotheses. Recently, particle filtering techniques have been used to overcome these limitations in traditional SLAM problems.
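To make the pose-history bookkeeping concrete, the following is a minimal, hypothetical sketch (an illustration under assumed conventions, not code from [8] or from our implementation) of a state vector that keeps the m most recent poses ahead of the landmark block, and of the window shift performed when the state is projected forward. A full EKF would also propagate the covariance of equation (3).

```python
import numpy as np

# Hypothetical sketch: sliding window of the m most recent poses, followed by the
# landmark block.  Poses are (x, y, theta); landmarks are 2-D points for simplicity.

def make_state(poses, landmarks):
    """Stack the m newest poses (newest first) and the n landmarks into one vector."""
    return np.concatenate([np.asarray(poses, float).ravel(),
                           np.asarray(landmarks, float).ravel()])

def shift_pose_window(state, new_pose, m, pose_dim=3):
    """Insert the new pose and discard the oldest; the landmark block is untouched."""
    poses = state[:m * pose_dim].reshape(m, pose_dim)
    poses = np.vstack([new_pose, poses[:-1]])      # newest first, oldest dropped
    return np.concatenate([poses.ravel(), state[m * pose_dim:]])

if __name__ == "__main__":
    poses = [[0.2, 0.0, 0.05], [0.1, 0.0, 0.02], [0.0, 0.0, 0.0]]   # x_{t_k}, x_{t_{k-1}}, ...
    landmarks = [[2.0, 1.0], [3.5, -0.5]]
    x = make_state(poses, landmarks)
    x = shift_pose_window(x, np.array([0.3, 0.01, 0.08]), m=3)
    print(x)
```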
B. Using a particle filter

A reasonably straightforward adaptation of the above model to a standard Rao-Blackwellized particle filter is:

1) For each particle $p_i = \{x^i(k-1), \omega^i(k-1)\}$, $i = 1 \ldots N$:
   a) Project the state forward by drawing a new robot pose $x^i_{t_k}$ from the distribution of $v(k)$ centered at $f(x^i_{t_{k-1}}, u(k-1))$; insert $x^i_{t_k}$ into $x^i_r(k)$ and discard the oldest robot pose.
   b) Extract features from the measurements using the last m poses and perform data association.
   c) Update the map and initialize new features.
   d) Compute a new weight $\omega^i(k)$ equal to the likelihood of the data association.
2) If necessary, resample the particles with probabilities proportional to $\omega^i(k)$.

Note that each particle contains its own pose history, so collectively the particles sample the space of the previous m pose histories. This approach avoids the expensive computation of covariance matrices, but potentially many particles would be required to adequately sample the pose history space. Furthermore, feature extraction is required for every particle because of their unique pose histories, and this computation is also fairly expensive.

III. SLAM USING MULTISCANS

Our approach is based on grouping sparse scans from m consecutive robot poses into a multiscan $\bar{z}(k) = [z(k)\ z(k-1)\ \ldots\ z(k-m+1)]$. We formulate a system model in which a SLAM update is performed only after every m steps, reducing the required computation. A further simplification enables a particle filter implementation where features are extracted only once per multiscan (instead of once per particle per multiscan).
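As a concrete illustration of the multiscan grouping (a sketch under assumed interfaces, not the implementation used for our experiments), the buffer below accumulates m consecutive odometry/scan pairs and signals when a complete multiscan is ready for a SLAM update.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Scan = Tuple[float, ...]                 # e.g. five range readings
Odometry = Tuple[float, float, float]    # (dx, dy, dtheta) in the robot frame

@dataclass
class MultiscanBuffer:
    m: int                                        # scans per multiscan
    controls: List[Odometry] = field(default_factory=list)
    scans: List[Scan] = field(default_factory=list)

    def add(self, u: Odometry, z: Scan) -> bool:
        """Store one motion/scan pair; return True when a full multiscan is ready."""
        self.controls.append(u)
        self.scans.append(z)
        return len(self.scans) >= self.m

    def pop(self) -> Tuple[List[Odometry], List[Scan]]:
        """Hand the buffered multiscan to the SLAM update and start a new one."""
        u, z = self.controls, self.scans
        self.controls, self.scans = [], []
        return u, z

if __name__ == "__main__":
    buf = MultiscanBuffer(m=3)
    for k in range(7):
        if buf.add((0.1, 0.0, 0.02), (1.2, 1.5, 2.0, 1.4, 1.1)):
            controls, scans = buf.pop()
            print(f"step {k}: SLAM update with {len(scans)} scans")
```

As noted in Section III-E, m need not be a constant; the same buffer could instead be flushed whenever the pose uncertainty accumulated since the last update exceeds a threshold.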

A. System model

As before, the state vector is $x(k) = [x_r(k)\ x_f(k)]$, but now $x_r(k)$ contains only the single pose $x_{t_k}$. Our system model is:

$$x(k) = F(x(k-m), \bar{u}(k-m)) \quad (4)$$
$$Z(k) = g(\bar{z}(k), \bar{x}(k)) \quad (5)$$

where $\bar{u}(k-m)$ is the vector of inputs $[u(k-1)\ \ldots\ u(k-m)]$. The system function F is defined as:

$$F(x(k-m), \bar{u}(k-m)) = [x_{t_k}\ x_f(k-m)]^T \quad (6)$$

where the pose $x_{t_k}$ is modeled recursively as:

$$x_{t_k} = f(x_{t_{k-1}}, u(k-1)) + v(k) \quad (7)$$

The pose $x_{t_{k-m}}$ is taken from the state vector $x(k-m)$ after the previous SLAM update. The function g computes a feature vector $Z(k)$ containing the parameters of features extracted from the multiscan $\bar{z}(k)$, which is acquired from the intermediate robot poses $\bar{x}(k) = [x_{t_k}\ \ldots\ x_{t_{k-m+1}}]$. Each scan $z(k)$ in a multiscan is modeled by:

$$z(k) = h(x_{t_k}, x_f) + w(k) \quad (8)$$

where $w(k)$ is the measurement noise and $x_f$ is the feature vector from the most recent SLAM update.

B. Approach

We apply a Rao-Blackwellized particle filter to this system, assuming the landmarks are independent when conditioned on the robot's trajectory:

1) For m time steps: move and collect sparse scans.
2) Extract features by considering the data from all m scans simultaneously as a single multiscan.
3) For each particle $p_i = \{x^i(k-m), \omega^i(k-m)\}$, $i = 1 \ldots N$:
   a) Project the pose forward for each motion by drawing samples for the intermediate poses: for $j = (k-m+1)$ to $k$, draw $v \sim V(j)$ and let $x^i_{t_j} = f(x^i_{t_{j-1}}, u(j-1)) + v$.
   b) Find correspondences between the extracted features and the landmark locations $x^i_f(k-m)$.
   c) Update the map and initialize new features.
   d) Compute a new weight $\omega^i(k)$ equal to the likelihood of the data association.
4) If necessary, resample the particles with probabilities proportional to $\omega^i(k)$.

C. Feature extraction

Because our features are extracted from sensor readings taken from different poses, both the measurement noise and the odometry error contribute to uncertainty in the extracted features. In our particle filter implementation, however, we do not maintain a pose history of the intermediate states for each particle. Instead, we use the expected intermediate pose history $\hat{\bar{x}}(k) = [\hat{x}_{t_k}\ \ldots\ \hat{x}_{t_{k-m+1}}]$, calculated from the odometry as:

$$\hat{x}_{t_k} = f(\hat{x}_{t_{k-1}}, u(k-1)) \quad (9)$$

Without loss of generality, we assume $x_{t_{k-m}}$ is the origin. We perform feature extraction using this expected pose history, and then transform the features for each particle $p_i$ so that the pose $\hat{x}_{t_k}$ coincides with $x^i_{t_k}$ in order to find correspondences. While this approach does not precisely account for the correlation between robot movement and feature parameters, and it increases feature extraction uncertainty, it avoids performing feature extraction on an intermediate pose history for every particle.
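The following is a minimal sketch of this dead-reckoning step, and of placing the multiscan's range readings into the frame of $x_{t_{k-m}}$ as Cartesian points (the input to the feature extraction of Section IV). The simple relative-motion odometry model and the interface are assumptions for illustration; the beam angles correspond to the five-sensor arrangement used in our experiments (Section V).

```python
import math
from typing import List, Tuple

# Beam angles of the assumed five-sensor arrangement, relative to the robot heading.
BEAM_ANGLES = [math.radians(a) for a in (0.0, 45.0, 90.0, 135.0, 180.0)]

def compose(pose, u):
    """Apply a relative motion u = (dx, dy, dtheta) expressed in the robot frame."""
    x, y, th = pose
    dx, dy, dth = u
    return (x + dx * math.cos(th) - dy * math.sin(th),
            y + dx * math.sin(th) + dy * math.cos(th),
            th + dth)

def expected_pose_history(controls) -> List[Tuple[float, float, float]]:
    """Dead-reckoned poses, with the pose at the start of the multiscan as the origin."""
    poses = [(0.0, 0.0, 0.0)]
    for u in controls:
        poses.append(compose(poses[-1], u))
    return poses

def multiscan_points(controls, scans, max_range=5.0):
    """Convert each valid range reading to a Cartesian point in the multiscan frame."""
    points = []
    for pose, ranges in zip(expected_pose_history(controls)[1:], scans):
        x, y, th = pose
        for r, a in zip(ranges, BEAM_ANGLES):
            if 0.0 < r < max_range:
                points.append((x + r * math.cos(th + a), y + r * math.sin(th + a)))
    return points

if __name__ == "__main__":
    controls = [(0.1, 0.0, 0.05)] * 3
    scans = [(1.0, 1.4, 2.0, 1.4, 1.0)] * 3
    print(multiscan_points(controls, scans))
```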
D. Innovation covariance

We consider Z to be the measurement for SLAM updates, and thus the innovation for a measurement lies in the feature parameter space. The innovation covariance S is required both for maximum likelihood data association and to update landmark locations. To simplify our explanation, assume that all features in Z are associated with landmarks whose parameters are in $x_l(k) = M x(k)$, where M is a matrix that simply selects the appropriate landmarks from $x(k)$. The innovation is then $\nu = Z(k) - x_l(k)$, and its covariance is:

$$S = J_g P_{(\bar{z},\bar{x})} J_g^T + M P_{x(k)} M^T \quad (10)$$

The covariance of the landmark parameters from $P_{x(k)}$ is readily available, but the covariance of the multiscan and pose history is more complicated:

$$P_{(\bar{z},\bar{x})} = \begin{bmatrix} P_{\bar{z}} & P_{\bar{z}\bar{x}} \\ P_{\bar{z}\bar{x}}^T & P_{\bar{x}} \end{bmatrix} \quad (11)$$

Feature extraction can be performed using the full $P_{(\bar{z},\bar{x})}$. A further approximation that yields acceptable results is to represent $P_{(\bar{z},\bar{x})}$ using only the block-diagonal portions of $P_{\bar{z}}$ and $P_{\bar{x}}$ (i.e., assuming that measurements are independent and that, although pose uncertainty compounds, the multiscan poses are independent).

For complicated feature extraction methods, $J_g$, the Jacobian of the feature extraction with respect to the multiscan, is difficult to compute analytically. However, it is well known that maximum likelihood estimation gives good estimates for the covariance of the parameters being estimated, even for an approximately specified model such as the block-diagonal version of $P_{(\bar{z},\bar{x})}$ [13]. Thus, by using maximum likelihood estimation as a feature extractor, we can obtain a good approximation to the measurement covariance S without computing $J_g$. We use this approach to extract lines from multiscan data with a procedure described in Section IV.
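As an illustration of how S enters maximum likelihood data association, the sketch below gates candidate matches for (r, θ) line features by Mahalanobis distance, forming the innovation covariance as the extracted-feature covariance plus the landmark covariance. The structure and the gate value are illustrative assumptions, not the exact implementation evaluated in Section V.

```python
import numpy as np

def mahalanobis2(innovation: np.ndarray, S: np.ndarray) -> float:
    """Squared Mahalanobis distance of the innovation under covariance S."""
    return float(innovation @ np.linalg.solve(S, innovation))

def associate_line(feature, feature_cov, landmarks, landmark_covs, gate=9.21):
    """Return the index of the best-matching landmark, or None for a new landmark.

    feature, landmarks[i]: np.array([r, theta]); gate ~ chi-square 99% bound for 2 dof.
    """
    best, best_d2 = None, gate
    for i, (lm, lm_cov) in enumerate(zip(landmarks, landmark_covs)):
        nu = feature - lm
        nu[1] = (nu[1] + np.pi) % (2.0 * np.pi) - np.pi   # wrap the angle difference
        S = feature_cov + lm_cov                          # innovation covariance
        d2 = mahalanobis2(nu, S)
        if d2 < best_d2:
            best, best_d2 = i, d2
    return best

if __name__ == "__main__":
    lms = [np.array([2.0, 0.1]), np.array([4.0, 1.5])]
    covs = [np.diag([0.05, 0.01])] * 2
    z = np.array([2.1, 0.12])
    print(associate_line(z, np.diag([0.08, 0.02]), lms, covs))
```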

E. Co-dependence on odometry

Notice that there is an issue of independence in the multiscan formulation: the robot's odometry is used twice, first to update the pose estimate, and second to extract features. The multiscan approach makes an implicit assumption that this co-dependence of the robot's pose and the measurements can be ignored. For this to hold, m, the number of readings per multiscan, must be kept small enough that the odometry error experienced over the course of the multiscan is small. One strategy is to make m a function of the pose uncertainty.

IV. FEATURE EXTRACTION AND DATA ASSOCIATION

We have implemented sparse sensing SLAM with two types of features: lines, parameterized by the distance r and angle θ to the origin; and line segments, which explicitly include extent information in the representation. Line features have been used for SLAM on a number of occasions, e.g. [15], and methods for performing line extraction and data association with lines are readily available [10]. Fewer attempts have been made to perform line segment SLAM. Brunskill and Roy [2] recently described the use of probabilistic PCA to extract line segment parameters and compute covariance information for use in data association. Since our focus is on SLAM with very little data and minimal computation, we have employed simpler techniques.

A. Line features

Before line parameters can be estimated, the data from a multiscan must be segmented. Since our data are sparse and come from different poses, the simplest segmentation methods cannot be employed, because they assume ordered measurements from a single pose. We instead take all the Cartesian points from the multiscan and apply an agglomerative clustering algorithm with a threshold distance computed adaptively based on measurement range. This is similar to the technique used in an adaptive breakpoint detector [1] for a radial range scan. The resulting clusters are further split using an iterative endpoint filter (IEPF) [3]. Maximum likelihood line parameters are then estimated for each cluster using weights based on the covariance of each point. Lines with too few supporting points are discarded. Algorithm 1 shows the pseudocode for the line extraction procedure.

Algorithm 1 EXTRACT-LINES(x̄, z̄, P_(z̄,x̄))
 1: Compute Cartesian points from x̄ and z̄, and the covariance of the points from P_(z̄,x̄)
 2: Cluster the points using a threshold based on the range of each reading
 3: for each cluster C do
 4:   Compute argmax_{p_i, p_j in C} ||p_i - p_j||
 5:   Perform IEPF using p_i, p_j as temporary segment endpoints to split C into subclusters
 6: end for
 7: for each cluster C with enough supporting points do
 8:   l = [r θ] ← maximum likelihood line parameters for the points in C
 9:   P_l ← covariance of l returned by the ML estimator
10:   Add line l with covariance P_l to L
11: end for
12: return L

Maximum likelihood data association and map updates for line features are straightforward, since the parameters of lines can be directly compared and merged.
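For reference, here is a small self-contained sketch of the weighted fit used in step 8 (as noted in Section V, our implementation substitutes a weighted least-squares estimator for a full ML fit). It minimizes the weighted orthogonal distances to a line in (r, θ) normal form; the interface and weighting convention are illustrative assumptions.

```python
import math
from typing import List, Tuple

def fit_line(points: List[Tuple[float, float]], weights: List[float]) -> Tuple[float, float]:
    """Weighted orthogonal-distance line fit, returned as (r, theta) with
    r = x*cos(theta) + y*sin(theta).  Weights can be inverse point variances."""
    wsum = sum(weights)
    xm = sum(w * x for (x, _), w in zip(points, weights)) / wsum
    ym = sum(w * y for (_, y), w in zip(points, weights)) / wsum
    sxx = sum(w * (x - xm) ** 2 for (x, _), w in zip(points, weights))
    syy = sum(w * (y - ym) ** 2 for (_, y), w in zip(points, weights))
    sxy = sum(w * (x - xm) * (y - ym) for (x, y), w in zip(points, weights))
    theta = 0.5 * math.atan2(-2.0 * sxy, syy - sxx)    # minimizes weighted orthogonal error
    r = xm * math.cos(theta) + ym * math.sin(theta)
    if r < 0.0:                                        # keep r >= 0 by flipping the normal
        r, theta = -r, theta + math.pi
    return r, theta

if __name__ == "__main__":
    pts = [(1.0, 0.1), (2.0, -0.05), (3.0, 0.02), (4.0, 0.0)]
    w = [1.0, 0.5, 1.0, 2.0]
    print(fit_line(pts, w))   # roughly the x-axis: r near 0, theta near +/- pi/2
```

The covariance P_l of step 9 can then be taken from the estimator itself, as discussed in Section III-D.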
B. Line segment features

We have also implemented SLAM with line segment features. An advantage of segment features is that they explicitly encode extent information, which can be useful in data association. To extract segments from multiscan data, we employ the same procedure as for line features, and then project the clustered points onto their associated lines. The two extremal projected points $p_i$ and $p_j$ are used as the segment parameters, and the parameter covariance is computed from the covariances of the line parameters and of the projected endpoints, i.e.:

$$P_s = J_l P_l J_l^T + J_{p_i} P_{p_i} J_{p_i}^T + J_{p_j} P_{p_j} J_{p_j}^T$$

A complication of line segment features is that it is impossible to directly compare the parameters of two segments for data association, since they may actually represent different portions of the same feature. A simple solution is to extend the segments so that their endpoints can be directly compared. We extend two segments so that their projections onto their angular bisector are the same. Care must be taken to update the endpoint covariances accordingly. Unfortunately, the extension procedure represents a complicated function with convoluted Jacobians, so updating the covariance exactly is hard. A simple approximation is to assume that the lengths by which each endpoint is extended, $d_0$ and $d_1$, are known parameters of a function that transforms a single segment, i.e. $E(s, d_0, d_1)$, which has a simpler Jacobian $J_s = \partial E / \partial s$ that can be used to transform the segment covariance. In practice, this simplification works reasonably well and is much easier to compute.
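As a sketch of the endpoint-extraction step described above (illustrative names and interface; covariance propagation for $P_s$ is omitted), the clustered points are projected orthogonally onto the fitted line and the two extremal projections are kept as the segment endpoints.

```python
import math
from typing import List, Tuple

def project_onto_line(p: Tuple[float, float], r: float, theta: float) -> Tuple[float, float]:
    """Orthogonal projection of p onto the line x*cos(theta) + y*sin(theta) = r."""
    nx, ny = math.cos(theta), math.sin(theta)          # unit normal of the line
    d = p[0] * nx + p[1] * ny - r                      # signed distance to the line
    return (p[0] - d * nx, p[1] - d * ny)

def segment_endpoints(points: List[Tuple[float, float]], r: float, theta: float):
    """Return the two extremal projected points along the line direction."""
    dx, dy = -math.sin(theta), math.cos(theta)         # unit direction along the line
    projected = [project_onto_line(p, r, theta) for p in points]
    projected.sort(key=lambda q: q[0] * dx + q[1] * dy)
    return projected[0], projected[-1]

if __name__ == "__main__":
    pts = [(1.0, 0.1), (2.5, -0.05), (4.0, 0.02)]
    print(segment_endpoints(pts, r=0.0, theta=math.pi / 2))
```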

V. RESULTS

We have implemented our sparse sensing SLAM approach and tested it on a variety of datasets. Our Rao-Blackwellized particle filter is based mainly on FastSLAM 1.0 [12]; we also use the adaptive sampling approach described by Grisetti, Stachniss, and Burgard [4]. Most aspects of our implementation have been discussed in previous sections. In our feature extraction, a simple weighted least squares estimator was used instead of a full maximum likelihood approach, for efficiency. Also, the experiments presented here with line segment features estimated covariance using only the two segment endpoints rather than the full data.

Our experiments used data from Radish [5], an online repository of SLAM datasets. Most of the available datasets use scanning laser rangefinders with one-degree spacing. In order to test SLAM with sparse sensing, we simply discarded most of the data in each scan, using only the five range measurements at 0, 45, 90, 135, and 180 degrees. Additionally, we restricted the maximum sensor range to at most 5 m, and in some cases less.
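A small sketch of this decimation step (the one-reading-per-degree indexing and the handling of out-of-range returns are assumptions about the log format, for illustration only):

```python
def sparsify_scan(dense_ranges, max_range=5.0, keep_degrees=(0, 45, 90, 135, 180)):
    """Emulate a five-beam sensor from a 181-reading, one-degree-resolution scan.

    dense_ranges: readings at 0..180 degrees, one per degree.
    Readings beyond max_range are treated as 'no return'.
    """
    sparse = []
    for deg in keep_degrees:
        r = dense_ranges[deg]                              # index by angle in degrees
        sparse.append(r if r <= max_range else float('inf'))
    return sparse

if __name__ == "__main__":
    dense = [3.0 + 0.01 * i for i in range(181)]
    print(sparsify_scan(dense))
```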
A. Experiments

Figures 2, 3, and 4 show the results of sparse sensing SLAM on several datasets. (The occupancy grid images were generated using corrected trajectories from sparse sensing SLAM, but with the original dense laser data, for better clarity.) Detailed statistics for these datasets are shown in Table I.

TABLE I. EXPERIMENT STATISTICS

                        USC SAL       CMU NSH       Stanford
  Dimensions            39m x 20m     25m x 25m     64m x 56m
  Particles
  Sensing range         5 m           3 m           5 m
  Path length           122 m         114 m         517 m
  Path rotation         450 rad       133 rad       495 rad
  Scans per multiscan
  Total multiscans
  Avg. MS translation   1.37 m        0.97 m        0.45 m
  Avg. MS rotation      3.80 rad      1.12 rad      0.43 rad
  Feature type          Lines         Segments      Segments
  Num. landmarks

1) USC SAL Building: (Figure 2) This dataset consists of a primary loop and several small excursions. The robot closed the loop properly in its line-based landmark map despite only a small overlap between the loop's start and end. Notice that some slight aliasing exists in the occupancy grid map: this was a common occurrence, since most features have high uncertainty due to the use of odometry to augment the sparse sensing.

[Fig. 2. The USC SAL Building, second floor. Dataset courtesy of Andrew Howard. (a) Occupancy data for the corrected map. (b) The corrected line landmark map (black) and trajectory (gray). The landmark map shows only the original observed extent of the line features.]

2) CMU Newell-Simon Hall: (Figure 3) Because of the maximum sensing range of 3 m for this experiment, the fairly large initial loop (bottom) could not be closed until after the robot finished exploring the upper hallway.

[Fig. 3. A partial map of Newell-Simon Hall Level A at CMU. Dataset courtesy of Nicholas Roy. (a) Occupancy data for the corrected map. (b) The corrected line segment map (black) and trajectory (gray).]

3) Stanford Gates Building: (Figure 4) This long dataset has three large and several small loops. Figure 4a shows the uncorrected data, which exhibit significant error, particularly with respect to the robot's orientation. At several points the robot spins in place, a difficult situation for sparse sensing because rotation dramatically increases the pose uncertainty and decreases the scan density due to high rotational velocities. Note that some landmarks are spurious, a result of poor data association due to large uncertainty. No detection of spurious landmarks was implemented, so these landmarks remained in the map. Again, although there is some aliasing in the results, the environment's structure and the robot's trajectory were properly recovered.

[Fig. 4. The Gates Building at Stanford, first floor. Dataset courtesy of Brian Gerkey. (a) Raw uncorrected data (not to scale with the corrected data). (b) Occupancy data for the corrected map. (c) The corrected line segment landmark map (black) and trajectory (gray).]

B. Discussion

These results show that performing SLAM in large indoor environments is feasible even with minimal sensing. All of our tests used only five sensors with restricted range, but even large loops were closed correctly. However, there are tradeoffs to using such limited sensing:

- More particles are required, since the parameters of landmarks are more uncertain due to the use of odometry to augment sensing.
- The success of SLAM is sensitive to the amount of pose uncertainty accumulated during a multiscan.
- The size of multiscans (m) is a parameter that must be determined, either by selecting a constant or by computing a value based on pose uncertainty. Choosing m is a complex problem given all of the factors (error models, environment complexity, robot behaviors, etc.) that affect SLAM.
- Computationally-motivated approximations (such as those made in extracting features and computing the innovation covariance) can lead to poor data association, as exhibited by the existence of spurious landmarks.

VI. CONCLUSIONS

In this paper, we have presented a strategy for implementing particle filtering SLAM on a robot with very sparse sensing. Rather than performing feature extraction on every frame of scan data, we group the data into multiscans consisting of several consecutive frames of data. Feature extraction, data association, and state updates are then performed on the multiscan data as if it were data from a single scan. We formally specified a system model for this approach that maintains multiscan information separately from the system state, allowing the efficient application of particle filtering methods for SLAM. We then discussed properties of the innovation covariance for features extracted from multiscans, and presented a simplified representation for which the computation costs are significantly reduced. Finally, we described simple approaches for extracting line and line segment features from multiscan data and performing data association.

In our experiments using measurements from only five one-dimensional range sensors, the robot was able to close loops and recover the correct map and trajectory in several large real environments. While the experiments uncovered several tradeoffs to using limited sensing, the success of our approach with real data shows that it is possible to implement SLAM with sparse sensors.

VII. ACKNOWLEDGMENTS

This work was supported in part by the NSF through an IIS award.

REFERENCES

[1] G. Borges and M.-J. Aldon. Line extraction in 2D range images for mobile robotics. Journal of Intelligent and Robotic Systems, 40.
[2] E. Brunskill and N. Roy. SLAM using incremental probabilistic PCA and dimensionality reduction. In Proc. of the IEEE Intl. Conf. on Robotics and Automation.
[3] R. Duda and P. Hart. Pattern Classification and Scene Analysis. Wiley, New York, USA.
[4] G. Grisetti, C. Stachniss, and W. Burgard. Improving grid-based SLAM with Rao-Blackwellized particle filters by adaptive proposals and selective resampling. In Proc. of the IEEE Intl. Conf. on Robotics and Automation.
[5] A. Howard and N. Roy. The Robotics Data Set Repository (Radish).
[6] W. H. Huang and K. R. Beevers. Complete topological mapping with sparse sensing. Technical Report 05-06, Rensselaer Polytechnic Institute, Troy, NY, March.
[7] W. H. Huang and K. R. Beevers. Topological mapping with sensing-limited robots. In M. Erdmann et al., editors, Algorithmic Foundations of Robotics VI. Springer.
[8] J. Leonard, R. Rikoski, P. Newman, and M. Bosse. Mapping partially observable features from multiple uncertain vantage points. Intl. Journal of Robotics Research, 21(10), October.
[9] K. Murphy. Bayesian map learning in dynamic environments. In Advances in Neural Information Processing Systems, volume 12. MIT Press.
[10] S. Pfister, S. Roumeliotis, and J. Burdick. Weighted line fitting algorithms for mobile robot map building and efficient data representation. In Proc. of the IEEE Intl. Conf. on Robotics and Automation, Taipei, Taiwan.
[11] J. Tardós, J. Neira, P. Newman, and J. Leonard. Robust mapping and localization in indoor environments using SONAR data. Intl. Journal of Robotics Research, 21(4), April.
[12] S. Thrun, M. Montemerlo, D. Koller, B. Wegbreit, J. Nieto, and E. Nebot. FastSLAM: An efficient solution to the simultaneous localization and mapping problem with unknown data association. Journal of Machine Learning Research, to appear.
[13] H. White. Maximum likelihood estimation of misspecified models. Econometrica, 50(1):1-26, January.
[14] O. Wijk and H. Christensen. Triangulation based fusion of sonar data with application in robot pose tracking. IEEE Transactions on Robotics and Automation, 16(6).
[15] D. Yuen and B. MacDonald. Line-base SMC SLAM method in environments with polygonal obstacles. In Proc. of the Australasian Conf. on Robotics and Automation.
[16] G. Zunino and H. Christensen. Navigation in realistic environments. In M. Devy, editor, 9th Intl. Symp. on Intelligent Robotic Systems, Toulouse, France, 2001.


A scalable hybrid multi-robot SLAM method for highly detailed maps Pfingsthorn, M.; Slamet, B.; Visser, A. UvA-DARE (Digital Academic Repository) A scalable hybrid multi-robot SLAM method for highly detailed maps Pfingsthorn, M.; Slamet, B.; Visser, A. Published in: Lecture Notes in Computer Science DOI: 10.1007/978-3-540-68847-1_48

More information

Evaluation of Pose Only SLAM

Evaluation of Pose Only SLAM The 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems October 18-22, 2010, Taipei, Taiwan Evaluation of Pose Only SLAM Gibson Hu, Shoudong Huang and Gamini Dissanayake Abstract In

More information

TORO - Efficient Constraint Network Optimization for SLAM

TORO - Efficient Constraint Network Optimization for SLAM TORO - Efficient Constraint Network Optimization for SLAM Cyrill Stachniss Joint work with Giorgio Grisetti and Wolfram Burgard Special thanks to and partial slide courtesy of: Diego Tipaldi, Edwin Olson,

More information

Improvements for an appearance-based SLAM-Approach for large-scale environments

Improvements for an appearance-based SLAM-Approach for large-scale environments 1 Improvements for an appearance-based SLAM-Approach for large-scale environments Alexander Koenig Jens Kessler Horst-Michael Gross Neuroinformatics and Cognitive Robotics Lab, Ilmenau University of Technology,

More information

Gaussian Processes, SLAM, Fast SLAM and Rao-Blackwellization

Gaussian Processes, SLAM, Fast SLAM and Rao-Blackwellization Statistical Techniques in Robotics (16-831, F11) Lecture#20 (November 21, 2011) Gaussian Processes, SLAM, Fast SLAM and Rao-Blackwellization Lecturer: Drew Bagnell Scribes: Junier Oliva 1 1 Comments on

More information

Localization, Where am I?

Localization, Where am I? 5.1 Localization, Where am I?? position Position Update (Estimation?) Encoder Prediction of Position (e.g. odometry) YES matched observations Map data base predicted position Matching Odometry, Dead Reckoning

More information

UNIVERSITÀ DEGLI STUDI DI GENOVA MASTER S THESIS

UNIVERSITÀ DEGLI STUDI DI GENOVA MASTER S THESIS UNIVERSITÀ DEGLI STUDI DI GENOVA MASTER S THESIS Integrated Cooperative SLAM with Visual Odometry within teams of autonomous planetary exploration rovers Author: Ekaterina Peshkova Supervisors: Giuseppe

More information

Introduction to Mobile Robotics

Introduction to Mobile Robotics Introduction to Mobile Robotics Gaussian Processes Wolfram Burgard Cyrill Stachniss Giorgio Grisetti Maren Bennewitz Christian Plagemann SS08, University of Freiburg, Department for Computer Science Announcement

More information

An Iterative Approach for Building Feature Maps in Cyclic Environments

An Iterative Approach for Building Feature Maps in Cyclic Environments An Iterative Approach for Building Feature Maps in Cyclic Environments Haris Baltzakis and Panos Trahanias Institute of Computer Science, Foundation for Research and Technology Hellas (FORTH), P.O.Box

More information

Monocular SLAM for a Small-Size Humanoid Robot

Monocular SLAM for a Small-Size Humanoid Robot Tamkang Journal of Science and Engineering, Vol. 14, No. 2, pp. 123 129 (2011) 123 Monocular SLAM for a Small-Size Humanoid Robot Yin-Tien Wang*, Duen-Yan Hung and Sheng-Hsien Cheng Department of Mechanical

More information

Orthogonal SLAM: a Step toward Lightweight Indoor Autonomous Navigation

Orthogonal SLAM: a Step toward Lightweight Indoor Autonomous Navigation Orthogonal SLAM: a Step toward Lightweight Indoor Autonomous Navigation Viet Nguyen, Ahad Harati, Agostino Martinelli, Roland Siegwart Autonomous Systems Laboratory Swiss Federal Institute of Technology

More information

Graphical SLAM using Vision and the Measurement Subspace

Graphical SLAM using Vision and the Measurement Subspace Graphical SLAM using Vision and the Measurement Subspace John Folkesson, Patric Jensfelt and Henrik I. Christensen Centre for Autonomous Systems Royal Institute Technology SE-100 44 Stockholm [johnf,patric,hic]@nada.kth.se

More information

EE565:Mobile Robotics Lecture 3

EE565:Mobile Robotics Lecture 3 EE565:Mobile Robotics Lecture 3 Welcome Dr. Ahmad Kamal Nasir Today s Objectives Motion Models Velocity based model (Dead-Reckoning) Odometry based model (Wheel Encoders) Sensor Models Beam model of range

More information

Efficient particle filter algorithm for ultrasonic sensor based 2D range-only SLAM application

Efficient particle filter algorithm for ultrasonic sensor based 2D range-only SLAM application Efficient particle filter algorithm for ultrasonic sensor based 2D range-only SLAM application Po Yang School of Computing, Science & Engineering, University of Salford, Greater Manchester, United Kingdom

More information