Mobile robot localisation and navigation using multi-sensor fusion via interval analysis and UKF


Immanuel Ashokaraj, Antonios Tsourdos, Peter Silson and Brian White
Department of Aerospace, Power and Sensors, RMCS, Cranfield University, Shrivenham, Swindon, UK - SN6 8LA
I.A.Ashokaraj@rmcs.cranfield.ac.uk

Abstract

Multiple sensor fusion for robot localisation and navigation has attracted a lot of interest in recent years. This paper describes a sensor-based navigation approach using an interval analysis (IA) based adaptive mechanism for an Unscented Kalman Filter (UKF). The robot is equipped with inertial sensors (INS), encoders and ultrasonic sensors. A UKF is used to estimate the robot position from the inertial sensors and encoders. Since the UKF estimates may be affected by bias, drift, etc., an adaptive mechanism using IA to correct these defects in the estimates is proposed. In the presence of landmarks, the complementary robot position information from the IA algorithm using ultrasonic sensors is used to estimate and bound the errors in the UKF robot position estimate.

1. INTRODUCTION

Robot navigation is primarily about guiding a mobile robot to a desired destination or along a pre-specified path, where the robot's environment consists of landmarks and obstacles. To achieve this objective the robot needs to be equipped with sensors suitable for localising it throughout the path it is to follow. These sensors may give overlapping or complementary information and may sometimes be redundant, and there are many possible architectures for fusing this information. Mobile robots generally carry dead-reckoning sensors such as wheel encoders and inertial sensors (INS), namely accelerometers and gyroscopes measuring acceleration and angular rate respectively, as well as landmark and obstacle detecting and map-making sensors such as time-of-flight (TOF) ultrasonic sensors. All these sensor measurements can be fused to estimate the robot's position using a sensor fusion algorithm. Sensor fusion in this case is the method of integrating data from distinctly different sensors to estimate the robot's position.

Classical data fusion algorithms use stochastic filters such as Kalman filters for robot position estimation [Jetto et al., 1999]. However, the disadvantages of using Kalman filters for robot localisation are that the data association step is complex with ultrasonic sensors and that, in the case of INS, the filters are often affected by bias and drift. Moreover, an accurate model of the robot system and accurate statistics of the sensor noises are needed, which are not available in many cases.

This paper is organised as follows. This introductory section continues by presenting a background for the problem of autonomous robot localisation in section 1.1, followed by a summary of previous work in robot localisation using interval analysis (IA) in section 1.2. The novelty of the proposed approach is presented in section 1.3. Section 2 explains the implementation of the UKF with INS and encoders for this problem. Section 3 gives a brief explanation of the IA algorithm for robot localisation and describes how the sensor range limitation is incorporated. In section 4 the implementation of the adaptive mechanism for the UKF robot position estimation using IA with ultrasonic sensors is described and the results are shown, and finally in section 5 the conclusions are given.
1.1 Background

The problem considered here is that of robot navigation and localisation using multiple low-cost sensors such as INS, encoders and ultrasonic sensors. Conventionally, stochastic filters such as the Extended Kalman Filter (EKF) or the Unscented Kalman Filter (UKF) are used for robot localisation [Clark et al., 2001]. One of the main prerequisites for using Kalman filters with INS and encoders is an accurate model of the robot together with accurate sensor noise statistics. In practice it is difficult to know these parameters accurately, especially the drifts of the accelerometers and gyroscopes. This affects the outcome of the UKF, thereby contributing to errors in the estimated position of the robot over a period of time. Moreover, TOF sensors such as ultrasonic sensors are used to measure the distance of landmarks from the robot and to detect the presence of any obstacles in the robot's path. When the 2-D map of the environment in which the robot travels is known a priori, the distance measurements from the ultrasonic sensors can be used independently to estimate the position of the robot in the map. Kalman filters can be used for this purpose as well [Castellanos and Tardos, 1999], but a major limitation of this approach is data association, which in Kalman filters is extremely complex and of third order, O(n^3). There are ways in which this can be simplified to O(n^2), but these solutions may be sub-optimal.

In order to get the best estimate of the robot's position, we can use different types of sensors with different algorithms that, in turn, have different sources of error. In this case we use an Unscented Kalman Filter (UKF) instead of the EKF for fusing the data from the accelerometers, gyroscopes and encoders. This is because the UKF captures the statistics of the non-linear models at every instant up to the third order of the Taylor series expansion, thereby reducing the linearisation errors, whereas in the EKF the non-linear models are linearised only to first order. In the case of the ultrasonic sensors we use an interval analysis (IA) algorithm for estimating the robot's position. The IA algorithm for ultrasonic sensors bypasses the complex data association step and handles the problem in a non-linear way while remaining robust to outliers. Thus we have two independent sets of measurements for the robot position. The estimated robot position obtained with the UKF from the INS and encoders, which may be affected by bias and drift, is then fused with the estimated interval robot position obtained with IA from the ultrasonic sensors. The fused robot position estimate is much better than either one of them alone, since the errors in the UKF estimated position are identified and corrected using the IA algorithm.

1.2 Prior work: Robot localisation with IA using range measurements

Interval analysis is basically about guaranteed numerical methods for approximating sets. Guaranteed in this context means that outer, and sometimes inner, approximations of the sets of interest are obtained, which can, at least in principle, be made as precise as desired. Thus interval computation is a special case of computation on sets, and set theory provides the foundations for interval analysis. The localisation of an autonomous robot navigating in a known or partially known environment is an important problem in mobile robotics. In this section an approach for localising the robot using IA [Kieffer et al., 2000] with readings from ultrasonic sensors is described briefly. The main advantage of this method is that it bypasses the data-association step, which is very complex in other stochastic methods such as Kalman filters, handles the problem in a non-linear way without any linearisation, and is very robust to outliers.

Figure 1: Robot and sensor model.

The robot is assumed to move in a known 2D environment and its motion is planned with respect to a set of landmarks. The task is to estimate the robot's position in the map using the distance measurements d_m provided by a belt of sensors around the robot. Moreover, it is assumed that the bounds on the distance measurement errors are known, thus resulting in interval distance measurements.
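
To illustrate what interval distance measurements mean in practice, the minimal C++ sketch below (not from the paper; the names Interval and consistent are hypothetical) tests whether a set of model-predicted sensor distances is consistent with bounded-error measurements. This containment check is the basic feasibility test that the IA algorithm applies to candidate robot configurations.

```cpp
// Minimal interval type illustrating the bounded-error measurement test.
// Names (Interval, consistent) are illustrative only.
#include <algorithm>
#include <iostream>

struct Interval {
    double lo, hi;                                   // lower and upper bound
    bool contains(double x) const { return x >= lo && x <= hi; }
};

// A configuration p is kept only if every predicted distance d_m(p)
// falls inside the corresponding measured interval [d].
bool consistent(const double* predicted, const Interval* measured, int n) {
    for (int i = 0; i < n; ++i)
        if (!measured[i].contains(predicted[i])) return false;
    return true;
}

int main() {
    double d_pred[2]   = {1.2, 2.7};                 // model-predicted ranges (m)
    Interval d_meas[2] = {{1.0, 1.4}, {2.5, 3.0}};   // measured ranges with error bounds
    std::cout << (consistent(d_pred, d_meas, 2) ? "consistent" : "inconsistent") << "\n";
}
```
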
If a model of the ultrasonic sensor interval distance measurements [d] is available, the robot localisation problem becomes a bounded-error parameter estimation problem, namely that of characterising the set containing all the configuration vectors that are consistent with the given map and the measurements. The task is to find the set of configuration vectors p = (x_c, y_c, θ)^T within the initial configuration box [p₀] given by

    P = { p ∈ [p₀] : d_m(p) ∈ [d] }     (1)

i.e. for a given configuration vector p the robot evaluates the measurements that its sensors would return and compares them with the actual measurements to check whether they are consistent. The problem described by Equation 1 can then be solved using either of two approaches, namely SIVIA (Set Inversion Via Interval Analysis) [Jaulin and Walter, 1993] and ImageSP [Kieffer et al., 1998]. Both approaches are described in detail in the book by Jaulin et al. [Jaulin et al., 2001].
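
The following is a minimal, self-contained sketch of the SIVIA idea behind Equation 1, not the authors' implementation: it assumes a single landmark at the origin, a 2-D position box and one interval range measurement [d], and recursively bisects the initial box, keeping the boxes whose predicted range interval is consistent with [d].

```cpp
// Sketch of SIVIA (Set Inversion Via Interval Analysis) on a 2-D position box,
// assuming a single landmark at the origin and one interval range measurement.
// The inclusion test and the value of eps are illustrative.
#include <cmath>
#include <iostream>
#include <vector>

struct Interval {
    double lo, hi;
    double width() const { return hi - lo; }
    double mid()   const { return 0.5 * (lo + hi); }
};
struct Box { Interval x, y; };

// Interval square root for an interval of non-negative values.
Interval isqrt(const Interval& a) { return { std::sqrt(a.lo), std::sqrt(a.hi) }; }

// Inclusion function for the distance from any point in the box to the origin.
Interval rangeToOrigin(const Box& b) {
    auto sqLo = [](const Interval& v) {              // lower bound of v^2
        if (v.lo <= 0.0 && v.hi >= 0.0) return 0.0;
        double m = std::min(std::fabs(v.lo), std::fabs(v.hi));
        return m * m;
    };
    auto sqHi = [](const Interval& v) {              // upper bound of v^2
        double m = std::max(std::fabs(v.lo), std::fabs(v.hi));
        return m * m;
    };
    return isqrt({ sqLo(b.x) + sqLo(b.y), sqHi(b.x) + sqHi(b.y) });
}

// Recursively classify boxes; the result is an outer approximation of the
// feasible set (undetermined boxes smaller than eps are kept).
void sivia(const Box& b, const Interval& d, double eps, std::vector<Box>& feasible) {
    Interval r = rangeToOrigin(b);
    if (r.lo > d.hi || r.hi < d.lo) return;                               // discard
    if (r.lo >= d.lo && r.hi <= d.hi) { feasible.push_back(b); return; }  // inside
    if (std::max(b.x.width(), b.y.width()) < eps) { feasible.push_back(b); return; }
    Box left = b, right = b;                          // bisect the widest dimension
    if (b.x.width() >= b.y.width()) { left.x.hi = right.x.lo = b.x.mid(); }
    else                            { left.y.hi = right.y.lo = b.y.mid(); }
    sivia(left, d, eps, feasible);
    sivia(right, d, eps, feasible);
}

int main() {
    std::vector<Box> feasible;
    sivia({{-5, 5}, {-5, 5}}, {2.8, 3.2}, 0.1, feasible);   // initial box, [d], eps
    std::cout << feasible.size() << " boxes in the outer approximation\n";
}
```
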

1.3 Novelty

This paper differs from the work in section 1.2 in three main aspects. Firstly, in [Jaulin et al., 2001] it was assumed that the robot's TOF sensors had an unlimited range. The algorithm described in this paper takes account of the sensor range limitation found in many real-world applications, in which case there may be instances during which no landmarks are visible within the robot's ultrasonic sensor range. In this paper it is proposed to use the inertial sensors (INS) and encoders when there are no landmark data from the TOF sensors. Secondly, instead of using interval values in the kinematic model while tracking the robot, as in [Jaulin et al., 2001], we use the physical constraints of the robot to predict its next position. This approach uses the kinematics of the robot rather than a kinematic model and obviates the need for an accurate kinematic model, which is not always available. It also makes the IA algorithm independent of the robot model, thereby reducing sources of uncertainty in the robot's estimated position; however, we do pay a price in terms of computation time. Finally, in the presence of landmarks, the complementary robot position estimates are fused using the UKF by converting the intervals into covariance matrix values to represent the uncertainty in the interval robot position, thereby resulting in a much improved robot position estimate from the Unscented Kalman Filter (UKF).

2. Robot localisation using UKF with inertial sensors and encoders

By fusing the measurement data from the sensors in the mobile robot - wheel encoders, gyroscopes and accelerometers - a reliable estimate of the position and heading of the robot can be obtained. There are many stochastic filters available for this purpose in the literature, such as the EKF [Alessandri et al., 1997]. In this work, because of the model non-linearity, a UKF is used instead of an EKF. The main reason is that, because the EKF is based on first-order linearisation using the Taylor series expansion, errors in the estimated parameters may be introduced, which may result in sub-optimal performance and sometimes divergence of the filter. This problem can be overcome to a certain extent by using a method first described by Julier and Uhlmann [Julier and Uhlmann, 1997], known as the unscented transform, in place of the linearisation process in the Kalman filter. This formulation of the Kalman filter for a non-linear system is called the Unscented Kalman Filter (UKF). The unscented transform is a deterministic sampling approach in which the state distribution is approximated by a Gaussian random variable, described by a minimal set of carefully chosen sample points. These sample points completely describe the true mean and covariance of the Gaussian random variable, and when they are propagated through the true non-linear system they capture the posterior mean and covariance accurately up to the third order of the Taylor series expansion, whereas with the EKF we can achieve only first-order accuracy. It should also be noted that the computational complexity of the UKF is of the same order as that of the EKF. The basic equations of the UKF are given in detail in [Wan and van der Merwe, 2001]. All the process noises are assumed to be zero-mean, uncorrelated white random noises.

Figure 2: Model for calculating the distance expected from an ultrasonic sensor when the configuration is p.

Figure 3: Inclusion function for the measurement model.

In the measurement model for this robot, four sources of observations are considered: 1. velocity measurements from the wheel encoders, 2. acceleration measurements from the accelerometers, 3. heading angle measurements from the encoders and 4. angular velocity measurements from the rate gyroscope. These measurements are fused together using the UKF to obtain the speed and heading angle of the robot.
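
To make the unscented transform concrete, the sketch below generates the 2n+1 sigma points and their mean weights for a given state mean and covariance. It is a generic, textbook-style construction rather than the filter used in the paper; the three-component state layout and the scaling parameter lambda are illustrative assumptions.

```cpp
// Minimal sketch of the unscented transform used inside a UKF: 2n+1 sigma
// points are drawn from the state mean and covariance via a Cholesky factor.
// The state layout [x, y, heading] and lambda below are assumptions.
#include <cmath>
#include <iostream>
#include <vector>

using Vec = std::vector<double>;
using Mat = std::vector<Vec>;

// Cholesky factorisation P = L * L^T (P must be symmetric positive definite).
Mat cholesky(const Mat& P) {
    size_t n = P.size();
    Mat L(n, Vec(n, 0.0));
    for (size_t i = 0; i < n; ++i)
        for (size_t j = 0; j <= i; ++j) {
            double s = P[i][j];
            for (size_t k = 0; k < j; ++k) s -= L[i][k] * L[j][k];
            L[i][j] = (i == j) ? std::sqrt(s) : s / L[j][j];
        }
    return L;
}

// Generate 2n+1 sigma points and their weights for mean x and covariance P.
void sigmaPoints(const Vec& x, const Mat& P, double lambda,
                 std::vector<Vec>& pts, Vec& w) {
    size_t n = x.size();
    Mat L = cholesky(P);
    pts.assign(2 * n + 1, x);                        // point 0 is the mean itself
    w.assign(2 * n + 1, 1.0 / (2.0 * (n + lambda)));
    w[0] = lambda / (n + lambda);
    for (size_t i = 0; i < n; ++i)
        for (size_t j = 0; j < n; ++j) {
            double step = std::sqrt(n + lambda) * L[j][i];
            pts[1 + i][j]     += step;               // x + sqrt((n+lambda)P), column i
            pts[1 + n + i][j] -= step;               // x - sqrt((n+lambda)P), column i
        }
}

int main() {
    Vec x = {0.0, 0.0, 0.1};                         // e.g. [x, y, heading]
    Mat P = {{0.04, 0, 0}, {0, 0.04, 0}, {0, 0, 0.01}};
    std::vector<Vec> pts; Vec w;
    sigmaPoints(x, P, /*lambda=*/0.5, pts, w);
    for (size_t i = 0; i < pts.size(); ++i)
        std::cout << "w=" << w[i] << "  (" << pts[i][0] << ", "
                  << pts[i][1] << ", " << pts[i][2] << ")\n";
}
```
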

3. Robot localisation with Interval Analysis using range measurements

In this section the robot localisation using IA, described briefly in section 1.2, is elaborated upon further.

Figure 4: Inclusion function for the remoteness of the cone from a segment.

A brief overview of the algorithm for a single measurement is given in Figures 2, 3 and 4. The main improvement in this version of the algorithm is that the range of the ultrasonic sensor has been limited to 3 meters, whereas in [Kieffer et al., 2000] the range was unlimited. This is implemented by identifying the n_i sensors that give readings of less than 3 meters, which is done by setting all interval ranges greater than 3 meters to infinity in Figure 4 and substituting n_i for n_s in the inclusion function of Figure 3, which was previously evaluated for all n_s sensors. It is also assumed that the landmarks are spaced sparsely, i.e. there is a distance of at least 3 meters between them. This is because, when the landmarks are close to one another, some of them may not be visible to the robot sensor model in the initial stages of the SIVIA algorithm due to the interval sensor position uncertainty, which results in some feasible robot configurations being eliminated prematurely. Additionally, only the landmarks visible to the robot within the 3 meter radius are passed to the inclusion function in Figure 3, instead of all n_w segments, thereby saving computational time. The problem as given in Equation 1 is then solved using either of the two approaches, namely SIVIA (Set Inversion Via Interval Analysis) [Jaulin and Walter, 1993] and ImageSP. Figure 1 gives a brief description of the sensor model and the measurement process. A detailed description of the above inclusion functions is provided by [Kieffer et al., 2000].

A better version of the above algorithm in terms of computation time, incorporating elementary interval tests to eliminate some of the infeasible configurations, is also described in [Kieffer et al., 2000], in which the problem is reformulated as P = { p ∈ [p₀] : t(p) holds true }, i.e. P = { p ∈ [p₀] : t(p) = 1 }, where t(p) is a global test. The global test t(p) consists of three elementary tests [Kieffer et al., 2000], which are also robust to outliers. The purpose of the first two tests is to eliminate some infeasible configurations, thereby saving computational time. However, if all three tests are used when the range of the sensor is limited, some feasible configurations may be discarded prematurely. Therefore, in our case only two tests were used: the first (in-room) test and the third (data) test of [Kieffer et al., 2000]. The main consequence of not using the second (leg-in) test when the sensor range is limited is that it may increase the computation time. The interval robot position obtained using the ultrasonic sensors is then fused with the robot position estimated from the inertial sensors.
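
A rough sketch of how the 3 meter range limitation described above could be applied in code is given below: out-of-range readings are discarded (only the n_i in-range sensors are kept) and only map segments that can lie within range of the current interval position box are passed on to the inclusion function. The data structures and the endpoint-based visibility test are simplified assumptions, not the paper's implementation.

```cpp
// Sketch of the sensor-range limitation: readings beyond 3 m are dropped and
// only nearby map segments are retained. Structure names are illustrative.
#include <algorithm>
#include <cmath>
#include <iostream>
#include <vector>

constexpr double kMaxRange = 3.0;                    // ultrasonic sensor range (m)

struct Reading { int sensorId; double range; };
struct Point   { double x, y; };
struct Segment { Point a, b; };
struct Box     { double xlo, xhi, ylo, yhi; };       // interval robot position

// Keep only the n_i sensors whose readings are within range.
std::vector<Reading> usableReadings(const std::vector<Reading>& all) {
    std::vector<Reading> out;
    for (const Reading& r : all)
        if (r.range <= kMaxRange) out.push_back(r);
    return out;
}

// Lower bound on the distance from the position box to a point.
double distLowerBound(const Box& b, const Point& p) {
    double dx = std::max({b.xlo - p.x, p.x - b.xhi, 0.0});
    double dy = std::max({b.ylo - p.y, p.y - b.yhi, 0.0});
    return std::hypot(dx, dy);
}

// Simplified visibility test: keep a segment if either endpoint can be in range.
std::vector<Segment> visibleSegments(const std::vector<Segment>& walls, const Box& b) {
    std::vector<Segment> out;
    for (const Segment& s : walls)
        if (distLowerBound(b, s.a) <= kMaxRange || distLowerBound(b, s.b) <= kMaxRange)
            out.push_back(s);
    return out;
}

int main() {
    std::vector<Reading> readings = {{0, 1.7}, {1, 4.2}};        // second is out of range
    std::vector<Segment> walls = {{{0, 0}, {2, 0}}, {{10, 10}, {12, 10}}};
    Box box = {-0.5, 0.5, -0.5, 0.5};
    std::cout << usableReadings(readings).size() << " usable readings, "
              << visibleSegments(walls, box).size() << " visible segments\n";
}
```
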
When the robot is moving, its position needs to be tracked, which requires predicting the robot's position at the next instant. This is done using the physical limitations of the robot, based on its maximum possible speed and heading angle rate, instead of using a dynamic robot model with interval values [Jaulin et al., 2001] or position information from other sensors (in this case the INS and encoders). Thus we obtain an independent interval position of the robot from the ultrasonic sensors.
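
The model-free prediction step can be sketched as follows: between two ultrasonic updates, the interval position box is simply inflated by the maximum distance the robot can physically travel, and the heading interval by the maximum possible heading change. The speed and heading-rate limits in the example are assumed values, not the robot's actual specifications.

```cpp
// Sketch of the prediction step based on physical limits only: the interval
// pose is inflated by the maximum possible motion over dt. Limits are assumed.
#include <iostream>

struct Interval { double lo, hi; };
struct Pose     { Interval x, y, theta; };           // interval robot configuration

Interval inflate(const Interval& a, double r) { return { a.lo - r, a.hi + r }; }

// Predict the interval pose after dt seconds using only physical limits.
Pose predict(const Pose& p, double dt, double vMax, double wMax) {
    double dPos = vMax * dt;                         // max translation in any direction
    double dAng = wMax * dt;                         // max heading change
    return { inflate(p.x, dPos), inflate(p.y, dPos), inflate(p.theta, dAng) };
}

int main() {
    Pose p = {{1.0, 1.2}, {2.0, 2.1}, {0.0, 0.1}};
    Pose q = predict(p, /*dt=*/1.0, /*vMax=*/0.5, /*wMax=*/0.3);
    std::cout << "x in [" << q.x.lo << ", " << q.x.hi << "]\n";
}
```
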

4. Interval Analysis based adaptive mechanism for UKF

Sensor fusion is an important and keenly researched topic in mobile robotics. Stochastic filters such as Kalman filters are commonly used for sensor fusion, but they suffer from the disadvantages described before, i.e. an accurate model of the system and accurate noise statistics are needed. In order to overcome these difficulties while using the UKF, a new approach is introduced in this paper. As described above, the robot's position is estimated from two independent sources, namely the robot position from the inertial sensors and encoders and the interval robot position from the ultrasonic sensors. The position obtained using IA is updated only once per second because of its computation time, whereas the position from the inertial sensors and encoders is updated 100 times per second. Moreover, the interval robot position is known to be guaranteed. The interval robot position obtained using the ultrasonic sensors is a rectangular box, which means that the robot is equally likely to be anywhere inside that interval robot position. The position estimated using the INS and encoders is then checked to see whether it lies inside this rectangle. If it does, it is trusted to be accurate and used. If it does not, the two estimates need to be fused.

Figure 5: Block diagram of the IA based adaptive mechanism for the UKF.

Figure 6: Fusion of the UKF estimated robot position and the covariance ellipse enclosing the interval position.

We know that the interval robot position is guaranteed to contain the robot's position within that interval (a rectangle), and that the probability of the robot being anywhere inside that rectangle is uniform. The error in the robot position estimation therefore lies with the UKF estimated robot position only. In order to correct the UKF estimated robot positions we need to fuse them with the interval robot position. Thus there is a need to combine the intervals with a real number from the UKF, and we also need to know the covariance, i.e. a measure of uncertainty, of that real number. One method of obtaining a real number and a covariance from the intervals is to enclose the interval position by an ellipse, which diagrammatically represents the covariance in 2D [Uhlmann, 1996], with the mid point of the intervals as the centre of the ellipse. The major and minor axis radii of the ellipse are taken as the covariance matrix values. There are two options for selecting the centre of the ellipse within the interval robot position. One option is to select the point on the rectangular box (i.e. the interval robot position) that is geometrically closest to the robot position estimated using the UKF with inertial sensors, thereby bounding the error in the UKF estimates; the interval uncertainty is converted into a covariance for the UKF based on this error ellipse. A disadvantage of this method is that when the geometrically nearest point is chosen in the interval box, it introduces more uncertainty in the form of large covariance matrix values, as one edge of the box becomes the centre of the ellipse. The alternative is to choose the mid point of the interval positions, convert the intervals into covariance matrix values as described before, and then fuse them with the UKF estimated position using the UKF, as illustrated in Figure 6.
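
A compact sketch of the adaptive step of Figures 5 and 6 is given below, under simplifying assumptions (diagonal covariances and the mid-point option for the ellipse centre): the UKF (x, y) estimate is accepted if it falls inside the interval box; otherwise the box is converted into a pseudo-measurement whose mean is the box mid point and whose variances come from the half-widths, and the two estimates are combined with a Kalman-style update. This is an illustration, not the authors' exact fusion code.

```cpp
// Sketch of the IA-based adaptive step: accept the UKF estimate if it lies in
// the interval box, otherwise fuse it with the box treated as a measurement
// (mid point as mean, half-widths as standard deviations). Diagonal
// covariances are assumed for brevity.
#include <iostream>

struct Interval {
    double lo, hi;
    double mid()  const { return 0.5 * (lo + hi); }
    double half() const { return 0.5 * (hi - lo); }
    bool contains(double v) const { return v >= lo && v <= hi; }
};
struct Box      { Interval x, y; };
struct Estimate { double x, y, varX, varY; };        // mean and diagonal covariance

// One-dimensional Kalman-style fusion of two independent estimates.
double fuse1d(double a, double va, double b, double vb, double& vOut) {
    double k = va / (va + vb);
    vOut = (1.0 - k) * va;
    return a + k * (b - a);
}

Estimate adapt(const Estimate& ukf, const Box& ia) {
    if (ia.x.contains(ukf.x) && ia.y.contains(ukf.y))
        return ukf;                                  // UKF estimate already consistent
    // Enclosing ellipse of the box: mid point as mean, half-widths as std. dev.
    Estimate meas = { ia.x.mid(), ia.y.mid(),
                      ia.x.half() * ia.x.half(), ia.y.half() * ia.y.half() };
    Estimate out = ukf;
    out.x = fuse1d(ukf.x, ukf.varX, meas.x, meas.varX, out.varX);
    out.y = fuse1d(ukf.y, ukf.varY, meas.y, meas.varY, out.varY);
    return out;
}

int main() {
    Estimate ukf = {4.6, 2.0, 0.25, 0.25};           // drifted UKF estimate
    Box ia = {{3.0, 3.6}, {1.8, 2.2}};               // guaranteed interval position
    Estimate fused = adapt(ukf, ia);
    std::cout << "fused x=" << fused.x << " y=" << fused.y << "\n";
}
```
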
Figure 7: UKF only estimated robot position.

The algorithm described above has been successfully implemented in simulation using C++ and MATLAB software. Figure 7 shows the robot position estimate using the UKF only, which is affected by sensor bias, drift, etc. Figure 8 shows the robot position after using the adaptive UKF position estimate with the SIVIA interval robot position for adaptation. Similarly, Figure 9 shows the UKF robot position estimate using the IMAGESP interval algorithm for the adaptive mechanism.

Figure 8: Fused robot position using the SIVIA algorithm as the adaptive mechanism in the UKF.

Figure 9: Fused robot position using the IMAGESP algorithm as the adaptive mechanism in the UKF.

In both cases it can be observed that the UKF estimate with the interval adaptive mechanism is much better than the UKF position estimate alone. It should be noted, however, that the SIVIA interval position uncertainty is greater than that of the IMAGESP algorithm. The reduction in uncertainty of the estimated interval robot position using the IMAGESP algorithm is attained at the cost of more computation time compared with the SIVIA algorithm. This is because, in the IMAGESP algorithm, the whole initial subpaving is divided into many boxes of identical width less than or equal to ε (0.01 in our case) and the global test t(p) is performed on all of these boxes. It can be seen in Figure 9 that when the position estimate of the UKF is fused with the IA it results in a very small position error. This is because of the small interval of the position estimate obtained using IMAGESP. Hence, even though the IMAGESP interval estimate comes at the cost of more computation time compared with the SIVIA algorithm, it results in a much more accurate fused robot position estimate. Moreover, in Figure 8, it can be seen that the interval position uncertainty increases, as expected, when there are no landmarks visible to the robot's ultrasonic sensors. This can be seen in Figure 8 when the robot begins to move and also at instances between landmarks, when the number of ultrasonic sensors for which landmarks are visible is small.

5. Conclusion

An Unscented Kalman Filter (UKF) using an interval analysis (IA) based adaptive mechanism has been described. Previous work on robot localisation and navigation using IA has been extended to incorporate the sensor range limitation. Moreover, instead of using a dynamic interval model of the robot to predict its next position, the physical limitations of the robot are used for prediction in the IA algorithm. A method for fusing the intervals with the UKF has been described, based on the concept of representing the intervals by an error ellipse in 2D and using the major and minor axis radii of the ellipse as the covariance matrix values. Simulation results show that the UKF robot position estimate with the IA based adaptive mechanism is more accurate than the robot position estimate obtained with the UKF alone.

References

Alessandri, A., Bartolini, G., Pavanati, P., Punta, E., and Vinci, A. (1997). An application of the extended Kalman filter for integrated navigation in mobile robotics. American Control Conference.

Castellanos, J. and Tardos, J. (1999). Mobile Robot Localisation and Map Building: A Multisensor Fusion Approach. Kluwer Academic Publishers, Boston.

Clark, S., Dissanayake, G., Newman, P., and Durrant-Whyte, H. (2001). A solution to the simultaneous localisation and map building (SLAM) problem. IEEE Journal of Robotics and Automation, 17(3).

Jaulin, L., Kieffer, M., Didrit, O., and Walter, E. (2001). Applied Interval Analysis: With Examples in Parameter and State Estimation, Robust Control and Robotics. Springer-Verlag, London.

Jaulin, L. and Walter, E. (1993). Set inversion via interval analysis for nonlinear bounded-error estimation. Automatica, 29(4):1053-1064.

Jetto, L., Longhi, S., and Venturini, G. (1999). Development and experimental validation of an adaptive extended Kalman filter for the localization of mobile robots. IEEE Transactions on Robotics and Automation, 15(2).

Julier, S. and Uhlmann, J. (1997). A new extension of the Kalman filter to nonlinear systems. In Proc. of AeroSense: The 11th Int. Symp. on Aerospace/Defence Sensing, Simulation and Controls.

Kieffer, M., Jaulin, L., Didrit, O., and Walter, E. (2000). Robust autonomous robot localization using interval analysis. Reliable Computing, 6(3):337-362.

Kieffer, M., Jaulin, L., and Walter, E. (1998). Guaranteed recursive nonlinear state estimation using interval analysis. Internal Report, Laboratoire des Signaux et Systèmes.

Uhlmann, J. K. (1996). General data fusion for estimates with unknown cross covariances. In Proceedings of the SPIE Aerosense Conference.

Wan, E. and van der Merwe, R. (2001). Kalman Filtering and Neural Networks, Chapter 7: The Unscented Kalman Filter. Wiley Publishing.


More information

Inner and outer approximation of capture basin using interval analysis

Inner and outer approximation of capture basin using interval analysis Inner and outer approximation of capture basin using interval analysis M. Lhommeau 1 L. Jaulin 2 L. Hardouin 1 1 Laboratoire d'ingénierie des Systèmes Automatisés ISTIA - Université d'angers 62, av. Notre

More information

On-line Convex Optimization based Solution for Mapping in VSLAM

On-line Convex Optimization based Solution for Mapping in VSLAM On-line Convex Optimization based Solution for Mapping in VSLAM by Abdul Hafeez, Shivudu Bhuvanagiri, Madhava Krishna, C.V.Jawahar in IROS-2008 (Intelligent Robots and Systems) Report No: IIIT/TR/2008/178

More information

GT "Calcul Ensembliste"

GT Calcul Ensembliste GT "Calcul Ensembliste" Beyond the bounded error framework for non linear state estimation Fahed Abdallah Université de Technologie de Compiègne 9 Décembre 2010 Fahed Abdallah GT "Calcul Ensembliste" 9

More information

Computationally Efficient Visual-inertial Sensor Fusion for GPS-denied Navigation on a Small Quadrotor

Computationally Efficient Visual-inertial Sensor Fusion for GPS-denied Navigation on a Small Quadrotor Computationally Efficient Visual-inertial Sensor Fusion for GPS-denied Navigation on a Small Quadrotor Chang Liu & Stephen D. Prior Faculty of Engineering and the Environment, University of Southampton,

More information

13. Learning Ballistic Movementsof a Robot Arm 212

13. Learning Ballistic Movementsof a Robot Arm 212 13. Learning Ballistic Movementsof a Robot Arm 212 13. LEARNING BALLISTIC MOVEMENTS OF A ROBOT ARM 13.1 Problem and Model Approach After a sufficiently long training phase, the network described in the

More information

Machine learning based automatic extrinsic calibration of an onboard monocular camera for driving assistance applications on smart mobile devices

Machine learning based automatic extrinsic calibration of an onboard monocular camera for driving assistance applications on smart mobile devices Technical University of Cluj-Napoca Image Processing and Pattern Recognition Research Center www.cv.utcluj.ro Machine learning based automatic extrinsic calibration of an onboard monocular camera for driving

More information

Intelligent Sensor Positioning and Orientation Through Constructive Neural Network-Embedded INS/GPS Integration Algorithms

Intelligent Sensor Positioning and Orientation Through Constructive Neural Network-Embedded INS/GPS Integration Algorithms Sensors 21, 1, 9252-9285; doi:1.339/s119252 OPEN ACCESS sensors ISSN 1424-822 www.mdpi.com/journal/sensors Article Intelligent Sensor Positioning and Orientation Through Constructive Neural Network-Embedded

More information

State-space models for 3D visual tracking

State-space models for 3D visual tracking SA-1 State-space models for 3D visual tracking Doz. G. Bleser Prof. Stricker Computer Vision: Object and People Tracking Example applications Head tracking, e.g. for mobile seating buck Object tracking,

More information

Calibration of Inertial Measurement Units Using Pendulum Motion

Calibration of Inertial Measurement Units Using Pendulum Motion Technical Paper Int l J. of Aeronautical & Space Sci. 11(3), 234 239 (2010) DOI:10.5139/IJASS.2010.11.3.234 Calibration of Inertial Measurement Units Using Pendulum Motion Keeyoung Choi* and Se-ah Jang**

More information

Optimization of the Simultaneous Localization and Map-Building Algorithm for Real-Time Implementation

Optimization of the Simultaneous Localization and Map-Building Algorithm for Real-Time Implementation 242 IEEE TRANSACTIONS ON ROBOTICS AND AUTOMATION, VOL. 17, NO. 3, JUNE 2001 Optimization of the Simultaneous Localization and Map-Building Algorithm for Real-Time Implementation José E. Guivant and Eduardo

More information