Evaluation of a Coupled Laser Inertial Navigation System for Pedestrian Tracking


Christoph Degen, Hichem El Mokni, Felix Govaers
Sensor Data and Information Fusion, Fraunhofer-Institut für Kommunikation, Informationsverarbeitung und Ergonomie FKIE, Wachtberg, Germany

Abstract
This paper evaluates a previously presented method for indoor pedestrian tracking using inertial sensing and a laser scanner (Light Detection and Ranging, LIDAR). The zero velocity updating technique [2], which is used to enhance the performance of an inertial sensor mounted on the foot, cannot observe heading, which results in a horizontal position drift. A LIDAR mounted on the head is used as a complementary technique to correct the heading. The well-known Iterative Closest Point (ICP) algorithm [5] is adapted to process laser scans captured at instants that we call middles of foot stance phases. The detection of these instants is presented, followed by the LIDAR-inertial coupling: the corrected position delivered by the ICP algorithm is forwarded as a position fix to the extended Kalman filter that processes the foot-mounted inertial sensor data, thus compensating its drift. After presenting the tracking algorithm and the system description, a visual and numerical evaluation is carried out to assess the presented tracking system with regard to stability and accuracy.

Index Terms
inertial navigation, zero velocity updating, pedestrian tracking, laser scanner, Iterative Closest Point (ICP), Kalman filtering, simultaneous localization and mapping (SLAM), numerical evaluation.

I. INTRODUCTION

The production of pedestrian tracks in indoor environments (operations of the fire brigade, police, maintenance work, etc.) is a challenging task. Such a technology improves situation awareness during an operation: the team leader is able to understand the state (position) of each team member and to provide the needed support (e.g. how to reach a team member, or how to guide him through a smoky or dusty environment). The sensor technology, which necessarily has to be carried on the person, must be lightweight and must not limit the freedom of movement. Moreover, places of action are frequently inside buildings or in urban areas, where GPS availability is usually reduced. An Inertial Measurement Unit (IMU, sometimes confused with an Inertial Navigation System, INS) [3] is not disturbed by external influences. An IMU combines acceleration and orientation sensors, so that the position of its carrier can be extrapolated from a known initial position by double integration. It thus guarantees a constantly available, complete navigation solution; however, this solution is consistent only for a short time and suffers from a navigational error that grows with time, especially heading drift. The stability of an inertial navigation system can be extended if the positioning procedure is enriched by external information. That is the aim of this work: to enhance the IMU navigation using a laser scanner mounted on the head of the tracked person. A laser scanner belongs to the class of range finders, widely used in robotics to locate a robot in its environment and to extract a map; in the following it will be referred to as LIDAR (Light Detection and Ranging). Recent advances have reduced the size and cost of LIDARs: the LIDAR used here weighs only 260 g and has a range of about 30 meters. This was the main motivation for our work. After introducing related work in section II, section III gives an overview of the tracking system. Section IV presents the tracking algorithm in detail. The evaluation is presented in section V.

II. RELATED WORK

Navigation in an unknown environment leads to what the robotics literature calls a SLAM problem (Simultaneous Localization And Mapping) [5]: it consists of modeling the environment and determining the tracked vehicle's location (that is, its position and orientation) from data acquired during its movement. Both operations are carried out simultaneously. The map is completely unknown at first (in particular, no topological information is supplied). Many probabilistic formulations of the SLAM problem have been introduced: Extended Kalman Filters [11], Unscented Kalman Filters [12], Sparse Extended Information Filters [13] and Rao-Blackwellized Particle Filters [14]. Another approach is multi-view range image registration, i.e., integrating and aggregating range measurements taken at different positions and orientations in the environment into a common coordinate frame that forms the environment model [5]. Using a LIDAR, the problem can be formulated as follows: given a laser scan S and a map model M, find the translation t and rotation r which best match S into M. The ICP algorithm by Besl and McKay (1992) is a widely used solution to this registration problem. Cinaz and Kenn [1] used a head-mounted laser scanner for pedestrian tracking, but one of short range, and without an IMU on the foot. Another work within our team [6] used ultrasonic sensors mounted on the shoulders to detect

surrounding walls and to correct the track produced by an IMU mounted on the foot. Kessler, Ascher and Trommer [7] exploited the orthogonality constraints of indoor walls, extracted by an Adaptive Line Extraction algorithm. They used a graph representation containing all trajectory positions and observed features.

III. SYSTEM DESCRIPTION

For our work we used the Xsens MTi [8] sensor for inertial sensing on the foot, providing 3D orientation as well as 3D acceleration, 3D rate of turn and 3D earth-magnetic field data. The Hokuyo UTM-30LX laser range finder [10] was used as the LIDAR; it is mounted on the same platform as a second Xsens sensor, an MTi-G [9]. The latter is enhanced with a GPS receiver and a barometer, which can be used to extend our work in the future. It delivers the orientation of the moving LIDAR, which is necessary to correctly interpret the laser scans. Figure 1 shows how the sensors are mounted on the pedestrian. All sensors are connected to a laptop via USB cables.

Figure 1: sensors mounted on the pedestrian

IV. TRACKING ALGORITHM

The tracking algorithm starts by buffering all sensor data for one second before processing them: this is necessary for the step detection process, as will be explained next.

A. Pre-processing: Step detection

The main task in this phase is to detect the stance phases of the foot (foot on the ground). For that purpose the sensor data buffer is traversed. For each instant t corresponding to a foot IMU observation, we compute the sum of absolute accelerations in a surrounding window of 300 ms (an empirical value). If this sum is less than a given threshold, instant t is marked as a stance phase (foot on the ground). The threshold is set dynamically, relative to the maximum acceleration measured in a bigger window of 1 second: this takes the displacement speed of the pedestrian into account.

Figure 2: stance phase between T1 and T2 (taken from [4])

The step detection process has two aims. Within a stance phase (the interval between T1 and T2 in figure 2) we apply the zero velocity update (ZUPT) technique, which stabilizes the tracking solution as presented in the next paragraph. Besides, the middle instant of each stance phase (which we call the position fix instant) is used to couple the LIDAR on the head with the IMU on the foot: the position deduced by the LIDAR is passed as a position fix to the IMU on the foot. The choice of this instant is based on the assumption that at this instant the LIDAR and the foot IMU have nearly the same 2D coordinates (they differ only in elevation).

B. Zero velocity update technique

Figure 3: inertial navigation algorithm

The error characteristic of inertial sensing makes it impossible to rely solely on this technique to develop a navigation solution. Even a small drift rate in the gyros can lead to a position drift of more than 1 meter in only 10 seconds. As depicted in figure 3, rate-gyroscope signals are integrated to obtain the orientation. To track position, the three accelerometer signals are projected into global frame coordinates using the gyroscope orientation. After compensation of the gravity acceleration, the global acceleration is integrated to obtain velocity (using the initial velocity), which in turn is integrated to obtain position (using the initial position). The horizontal acceleration error is linearly related to the tilt error (9.8 m/s² times the tilt error in radians), and the double integration results in a cubically growing position error. The zero velocity updating technique tries to overcome this drift problem by breaking open navigation into periods. A person's walk is an alternation of two phases of approximately 0.5 seconds each (this can vary if the person moves quicker or slower): a stance phase, in which the foot rests motionless on the ground, and a moving phase.
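As an illustration, the windowed stance detector described above can be sketched as follows. This is our own minimal sketch, not the authors' implementation: the window lengths, the relative-threshold factor and all function names are illustrative assumptions, and the mean of the window is used in place of the raw sum so that edge-clipped windows compare fairly.

```python
import numpy as np

def detect_stance_phases(acc_norm, fs, win_s=0.3, ref_s=1.0, rel_thresh=0.1):
    """Mark a sample as stance phase when the mean absolute acceleration
    (gravity-compensated magnitude) in a short surrounding window stays below
    a threshold set relative to the peak acceleration in a 1 s reference
    window, so the detector adapts to the pedestrian's speed."""
    n = len(acc_norm)
    win, ref = max(1, int(win_s * fs)), max(1, int(ref_s * fs))
    stance = np.zeros(n, dtype=bool)
    for t in range(n):
        lo, hi = max(0, t - win // 2), min(n, t + win // 2 + 1)
        level = np.abs(acc_norm[lo:hi]).mean()                 # short-window activity
        rlo, rhi = max(0, t - ref // 2), min(n, t + ref // 2 + 1)
        thresh = rel_thresh * np.abs(acc_norm[rlo:rhi]).max()  # dynamic threshold
        stance[t] = level < thresh
    return stance

def position_fix_instants(stance):
    """Middle sample of each contiguous stance interval (the 'position fix'
    instants at which the LIDAR is coupled with the foot IMU)."""
    fixes, start = [], None
    for i, s in enumerate(stance):
        if s and start is None:
            start = i
        elif not s and start is not None:
            fixes.append((start + i - 1) // 2)
            start = None
    if start is not None:
        fixes.append((start + len(stance) - 1) // 2)
    return fixes
```

On a synthetic signal with a quiet interval between two swing phases, the detector marks the quiet interval as stance and returns one position fix near its middle.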
The idea is to feed zero velocity updates (ZUPTs) into the extended Kalman filter (EKF) navigation error corrector at stance phases, i.e. to apply a virtual measurement of zero velocity [2]. A stance phase is detected when the sum of absolute accelerations is below a given threshold. Thus, when a stance phase is detected, the EKF is able to correct the velocity error, breaking the cubically growing error into a linear accumulation in the number of steps.

Figure 4: coupling a laser scan with the foot IMU at a position fix instant. (1) Foot IMU observation; (2) IMU observation forwarded to the foot IMU EKF; (3) predict new position and apply ZUPT update; if position fix: (4) IMU position to the ICP SLAM module; (5) get latest laser scan and orientation from the buffer; (6) laser scan and orientation; (7) run one ICP iteration; (8) position fix back to the EKF.

The big advantage of introducing ZUPTs as measurements into the EKF, instead of just resetting the velocity to zero, is that it lets the EKF retroactively correct position, velocity, accelerometer biases, pitch, roll, and the pitch and roll gyro biases. This is possible thanks to the strong correlation between acceleration, velocity, position, roll and pitch errors. What we retain is that a correct velocity (ZUPT) or a correct position (a GPS position if available, or a LIDAR position as presented next) used as a measurement allows a retroactive correction of many navigation parameters. It is important to mention that zero velocity updating is unable to correct the yaw (heading) and the yaw bias. The coupled LIDAR technique of this work tries to solve this problem.

C. Processing of buffered sensor data

After the pre-processing step, the buffer is traversed a second time to process all kinds of sensor observations. For a head IMU observation, we simply save the registered orientation. Once we find a LIDAR observation, we take the last registered orientation and associate it with the laser scan. To process a foot IMU observation, we have to distinguish between three cases: moving phase, stance phase and position fix instant (middle of a stance phase). If we are in a moving phase (foot not on the ground), we simply process the foot IMU observation with the EKF, which updates its state and delivers a new position prediction.
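The ZUPT step can be illustrated as a standard Kalman measurement update with a zero-velocity pseudo-measurement. This is our own minimal sketch under assumed names and a simplified state layout; the paper's filter is an error-state EKF that additionally carries attitude and bias states, which is what allows roll, pitch and bias corrections.

```python
import numpy as np

def zupt_update(x, P, sigma_zupt=0.01):
    """Pseudo-measurement update: observed velocity = 0 during a stance phase.
    State x = [px, py, pz, vx, vy, vz]; the cross-covariances stored in P let
    the filter correct position too (and, in a full error-state EKF, also
    roll, pitch and sensor biases) from the velocity observation alone."""
    H = np.zeros((3, x.size))
    H[:, 3:6] = np.eye(3)                 # observe the velocity block
    R = (sigma_zupt ** 2) * np.eye(3)     # confidence in the zero-velocity claim
    z = np.zeros(3)                       # the foot is assumed motionless
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(x.size) - K @ H) @ P
    return x, P
```

With a position-velocity cross-covariance in P, applying the update not only zeroes the velocity estimate but also pulls the correlated position component back, which is exactly the retroactive correction described above.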
At a stance phase we apply, next to the new position prediction, a ZUPT (see section IV.B), as depicted in steps (1), (2) and (3) of figure 4, which illustrates the more specific and last case, the position fix instant. At the position fix instant (middle of a stance phase) the latest captured laser scan is coupled with the IMU-filtered position by running an iteration of the ICP algorithm [5]. The ICP algorithm starts by obtaining the IMU-filtered position, which is the starting position P (step (4) in figure 4). Afterwards it processes the latest stored laser scan S together with its associated orientation (steps (5) and (6) in figure 4). Now, under the assumption that S was captured at position P (we consider only the 2D position here, assuming that the LIDAR on the head and the foot IMU have the same 2D position at this instant; the altitude is reduced to the floor level), the ICP algorithm iteratively tries to find a position P1, within a surrounding search range R, which best matches S to a map formed incrementally from all previous scans. The computational power needed by the ICP algorithm is proportional to the search range R. The position P1 is fed back to the IMU EKF as a position fix to correct its navigation parameters, as depicted in step (8) of figure 4.

V. EXPERIMENTAL EVALUATION OF THE TRACKING SYSTEM

A. Introduction to the Evaluation Methodology

In the following, the experimental studies and the corresponding results of the tracking system are presented. Several individual and representative scenarios are considered, in which a test person walks a predetermined path both inside and outside a building. In experiments which take place inside a building, the coordinates of the starting points are fixed by the user in advance, since no exploitable GPS signal is available. In scenarios where the run starts outside, the system determines the position of the test person autonomously by using the GPS antenna in the head sensor.
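The scan-matching step of section IV can be illustrated with a toy point-to-point ICP in 2D. This is our own sketch under assumed names: the paper's version matches into an incrementally built map and restricts the search to a range R around P, both of which are omitted here, and real scans would first be rotated by the head-IMU orientation.

```python
import numpy as np

def icp_2d(scan, map_pts, x0, iters=20):
    """Align a 2D laser scan (points in the sensor frame) to the accumulated
    map, starting from the predicted sensor position x0, and return the
    corrected 2D position fix. Classic point-to-point ICP: nearest-neighbour
    association followed by a closed-form (Kabsch/SVD) rigid fit."""
    src = scan + x0                       # place the scan at the predicted position
    pos = x0.astype(float).copy()         # sensor position, dragged along
    for _ in range(iters):
        # 1. associate: nearest map point for every scan point
        d2 = ((src[:, None, :] - map_pts[None, :, :]) ** 2).sum(-1)
        nn = map_pts[d2.argmin(axis=1)]
        # 2. best rigid transform src -> nn via SVD of the cross-covariance
        mu_s, mu_m = src.mean(0), nn.mean(0)
        U, _, Vt = np.linalg.svd((src - mu_s).T @ (nn - mu_m))
        Rm = Vt.T @ U.T
        if np.linalg.det(Rm) < 0:         # guard against reflections
            Vt[-1] *= -1
            Rm = Vt.T @ U.T
        t = mu_m - Rm @ mu_s
        src = src @ Rm.T + t              # 3. apply and iterate
        pos = Rm @ pos + t
    return pos
```

Starting from a slightly wrong position against an L-shaped wall map, the iteration snaps the scan onto the map and returns the true sensor position, which would then be fed back to the EKF as the position fix.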
Besides the visual evaluation, a statistical analysis is carried out to assess the experimental system with respect to stability and accuracy. The visual evaluation is based on the trajectories (in terms of GPS coordinates) produced by the tracking system and a corresponding visualization in Google Earth.

GPS reference points are the basis of the statistical evaluation.

B. Scenario 1: "Corner"

In the first scenario, which is located completely inside a building, the test person starts at a predetermined reference point and walks a route in L-form (see figure 5, black dashed track). The GPS coordinates of the starting point are estimated beforehand and handed over to the tracking system as initialization. The blue solid line corresponds to the track recorded by the tracking system, the black dashed path represents the ground truth, the black circle indicates the starting point of the scenario and the black arrow shows the running direction. The visualizations presented in this paper are generated in Google Earth from the GPS coordinates of the recorded paths produced by the tracking system and an estimated ground truth. From figure 5 it can easily be seen that the recorded trajectory almost matches the actually walked route in the considered test run.

Figure 5. Recorded track (blue, solid) and ground truth (black, dashed).

Figure 6. The internal map of the tracking system.

Figure 6 shows the corresponding map of the tracking system, which is generated in parallel with the position estimation and which saves the characteristics of the environment to support the INS. Due to the LIDAR measurements, even parts of the building which are not entered become visible. Figure 7 visualizes the GPS coordinates of 20 test runs of the "Corner" scenario. It is remarkable that outliers occur occasionally, but on average the true trajectory is maintained.

Figure 7. The (visual) result of 20 test runs.

This visual impression is also evaluated numerically over the 20 test runs, using five estimated reference points along the walked path. By means of an algorithm which computes the minimal distance of the walked path to each reference point, the deviations between the recorded route and the (estimated) ground truth are determined. Furthermore, the mean value and the variance are calculated for each reference point over all test runs. Table I contains the respective distances of the test runs to the reference points P1-P5 in meters.

TABLE I. NUMERICAL RESULTS OF THE "CORNER" SCENARIO (distances of the 20 test runs to the reference points P1-P5, in meters).

When calculating the mean value and the variance of the deviations at each reference point, the following table is obtained.
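The deviation measure used for the tables can be sketched as a minimal point-to-polyline distance routine. This is our own sketch; the function names are assumptions, and the per-reference-point mean and variance then follow directly with numpy.

```python
import numpy as np

def point_to_segment(p, a, b):
    """Minimal distance from point p to the line segment a-b."""
    ab, ap = b - a, p - a
    denom = ab @ ab
    t = np.clip(ap @ ab / denom, 0.0, 1.0) if denom > 0 else 0.0
    return np.linalg.norm(p - (a + t * ab))

def track_deviations(track, ref_points):
    """Minimal distance of the recorded track (a polyline of 2D positions)
    to each GPS reference point, i.e. one deviation value per reference
    point and test run."""
    return [min(point_to_segment(p, track[i], track[i + 1])
                for i in range(len(track) - 1))
            for p in ref_points]
```

Collecting `track_deviations` over all runs into a matrix, `np.mean(..., axis=0)` and `np.var(..., axis=0)` yield exactly the per-reference-point statistics reported in the tables.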

TABLE II. STATISTICAL QUANTITIES AT EACH REFERENCE POINT (mean value and variance of the deviations at P1-P5).

A considerable increase of the mean value and the variance between P4 and P5 can be seen. This rise stems from the fact that the critical point of the track is its corner. Since the reference points P4 and P5 are located behind the corner, the mean values and the variances of the deviations increase significantly there. Moreover, it can be seen that both the mean value and the variance remain at a low level between P1 and P3, and that, due to the accumulation of the deviations over time, both values increase from reference point P1 to P5.

C. Scenario 2: "Zig-Zag"

This scenario considers a zig-zag path which, analogously to the first scenario, is located completely inside a building. As in the "Corner" scenario, a starting point is estimated in GPS coordinates and handed over to the tracking system as initialization. The blue solid path again corresponds to the recorded path; the black dashed track connects the reference points and therefore does not correspond to the walked track but to the path through the middle of the considered hallway. From this scenario, conclusions about the system's stability with regard to changes of the running direction can be drawn. The biggest difficulty for the tracking system in this situation is to assign the right orientation to the LIDAR measurements.

Figure 8. Recorded track (blue, solid) and path through the middle of the hallway (black, dashed).

As can be seen in figure 8, the tracking system succeeds in assigning the right orientation to the LIDAR measurements in this test run. The map shown in figure 9, which corresponds to the walked-through hallway, is generated from the LIDAR measurements of the considered test run.

Figure 9. The internal map of the tracking system.

It can be seen that there are LIDAR detections of walls located in the middle of the hallway (black arrows). These false detections of walls originate in situations where the test person moves towards a wall and then changes direction abruptly: the orientation of the head IMU is then associated incorrectly with the LIDAR measurements.

Figure 10 visualizes the recorded tracks of 20 test runs. It is remarkable that on average the direction of the tracks is stable with regard to the direction of the ground truth. Due to the accumulation of errors, the highest deviations occur at the end of the respective test runs. For the quantitative evaluation of the test runs, 3 GPS reference points are fixed and, corresponding to the procedure presented in the first scenario, the deviations between the estimated tracks and the ground truth are calculated (see table III). As mentioned before, the accumulation of the deviations is also present in this scenario: a significant increase between reference points P2 and P3 is conspicuous.

Figure 10. Test runs of the "Zig-Zag" course.

TABLE III. NUMERICAL RESULTS OF THE "ZIG-ZAG" SCENARIO (distances of the test runs to the reference points P1-P3, in meters).

Analogously to the first scenario, the mean value and the variance are calculated with respect to each reference point.

TABLE IV. STATISTICAL QUANTITIES AT EACH REFERENCE POINT (mean value and variance of the deviations at P1-P3).

While the deviations at the second reference point are on a moderate level (comparable to the deviations at the second reference point of the "Corner" scenario), the errors at reference point P3 increase significantly (see table III). This is mainly due to the false detections of walls in some test runs (see figure 9): if at some point of the run the orientation from the head sensor is wrongly associated with the LIDAR measurements, the recorded track drifts away from the ground truth. Therefore, even if the association works quite well afterwards in such test runs, high deviations occur at reference point P3. For this reason the considerable increase of the mean value and the variance between the second and the last reference point is evident (see table IV).

D. Scenario 3: "Outdoor-Indoor"

In the third scenario the test person starts outside a building with an available GPS signal, walks through the building, and finally terminates the test run outside, again with an available GPS signal. Figure 11 shows the recorded track (blue, solid) and the estimate of the actually walked path (black, dashed).

Figure 11. "Outdoor-Indoor" scenario.

At the beginning of the test run, the tracking system determines the GPS coordinates of the test person. At the transition from outdoor to indoor, the indoor environment mode is started by an automatic detection of walls, and the system uses LIDAR measurements instead of GPS data. To this end, the threshold on the invalid LIDAR measurements has to be adapted. The challenge is to choose the threshold such that the system does not switch into the indoor environment mode too soon, which would discard GPS measurements that are still available and correct. On the other hand, a threshold which is chosen too low leads to a system which relies on the GPS data too long instead of using the LIDAR measurements; the tracking system then risks using the incorrect GPS data that arise at the transition from the outside to the inside at the walls of a building. For the considered test run, a tradeoff between both yields the best results.

E. Scenario 4: "Closed Loop"

In the fourth scenario the test person walks a closed loop counterclockwise, completely inside a building (see figure 12, black dashed path). Analogously to the "Outdoor-Indoor" scenario, only one test run is considered and analyzed. As can be seen in figure 12, the position determination is quite exact except for the last corner, where two walls consist entirely of glass (see figure 14, bottom left). Due to the localization error, the walked track is not recorded as a closed loop. Figure 13 shows the corresponding map, generated from the LIDAR measurements, and illustrates the critical point (black arrow) in this scenario. The error which occurs at the last corner is clearly visible: because of the glass surfaces, points outside the building are also detected by the laser. The ICP algorithm then processes the saved LIDAR measurements and tries to match the transition. Because of the high number of LIDAR returns from outside the building, the estimated position of the wall is moved towards the glass wall, with the effect that the recorded track is not closed.
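The paper does not spell out the switching rule in detail; a plausible sketch, assuming the indicator is the fraction of invalid (out-of-range or absorbed) laser returns per scan, might look as follows. The threshold value and all names are our own assumptions, not the authors' implementation.

```python
def select_mode(gps_valid, n_invalid_returns, n_returns, indoor_frac=0.5):
    """Heuristic mode switch at building transitions. Indoors, most beams hit
    nearby walls and return valid ranges; outdoors, many beams exceed the
    scanner's range and come back invalid. The LIDAR position fix is trusted
    as soon as enough valid wall reflections are seen, so that corrupted GPS
    fixes near the building facade are not used."""
    invalid_frac = n_invalid_returns / max(1, n_returns)
    if invalid_frac < indoor_frac:
        return "indoor"                   # enough wall reflections: use LIDAR fixes
    return "outdoor" if gps_valid else "indoor"
```

Raising `indoor_frac` makes the system switch to indoor mode earlier (risking the loss of still-valid GPS data), while lowering it keeps the system on GPS longer, which is exactly the tradeoff discussed above.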

Figure 12. "Closed Loop" scenario: ground truth (black, dashed) and recorded track (blue).

Figure 13. Internal map of the tracking system.

Figure 14. Glass surfaces which influence the tracking results.

F. Challenges

The key challenge that occurs in all considered scenarios is glass surfaces: due to the high transparency of window glass, the laser beams penetrate these surfaces nearly without reflection. Hence the system cannot identify such walls as a boundary of the hallway. In the first scenario it happens that, for single measurements, the corner is not detected due to a glass door (see figure 14, top right), which causes higher deviations at the reference points P4 and P5. Furthermore, the right wall (seen from the running direction) of the second hallway in the first scenario consists almost entirely of glass (see figure 14, top left), with the result that in some test runs this wall is shifted to the right (seen from the running direction) and the recorded track does not fit the actually walked track. The reason for this problem is the detection of objects outside the building by the laser scanner, which leads to a wrong interpretation of these LIDAR measurements by the ICP algorithm. In the second scenario, deviations occur due to the glass surface at the end of the hallway (see figure 14, bottom left). In the "Outdoor-Indoor" scenario, glass doors have to be passed (see figure 14, bottom right), which is a challenge for exact person tracking due to the deflection of the laser beams.

Opened doors in the hallways sometimes cause the tracking system to associate the corresponding doors with walls, so that the whole hallway is shifted, as can be seen in the left picture of figure 15.

Figure 15. Passage doors: internal map of the tracking system (left) and passage door in the hallway (right).

Due to the small detection area of the laser scanner, the test person has to keep the head in an appropriate position to avoid a loss of LIDAR measurements. Leaving this area of detection means a loss of LIDAR measurements; the tracking of the person then continues, but without laser-supported updates.

VI. CONCLUSIONS AND FUTURE WORK

We have shown that quasi-horizontal scans of a LIDAR mounted on a pedestrian can be used in combination with inertial navigation for indoor localization and mapping. Moreover, it could be shown numerically that the presented tracking system delivers stable and accurate results for simple scenarios like straight tracks. Even in more complicated situations, like the "Corner" and "Zig-Zag" scenarios, the system was able to track the pedestrian quite stably on average. However, the visual and numerical evaluation also reveals a major problem: glass surfaces that do not reflect the laser beams lead to mismatches between previously detected indoor features and objects detected outside. These wrong assignments between the LIDAR measurement data lead to high deviations in the respective test runs. A solution to the described challenges could be the use of a 3D laser scanner, which would enable the system to exploit environment information that is not available with horizontal LIDAR measurements alone: for example, the floor and the ceiling of a

hallway are incorporated as usually reflecting areas into the ICP algorithm when a 3D laser scanner is used. Glass surfaces at the walls and opened passage doors would then no longer influence the tracking system with regard to stability and accuracy. Furthermore, the problem of the irregularity of human movements could be mitigated by using a 3D laser scanner.

REFERENCES

[1] B. Cinaz and H. Kenn. HeadSLAM - Simultaneous Localization and Mapping with Head-Mounted Inertial and Laser Range Sensors. Proc. 12th IEEE Int. Symp. on Wearable Computers, IEEE CS Press, 2008.
[2] E. Foxlin. Pedestrian Tracking with Shoe-Mounted Inertial Sensors. IEEE Computer Graphics and Applications, IEEE Computer Society Press, 2005.
[3] Oliver J. Woodman. An Introduction to Inertial Navigation. Technical Report UCAM-CL-TR-696, University of Cambridge, August 2007.
[4] L. Ojeda and J. Borenstein. Non-GPS Navigation with the Personal Dead-Reckoning System. SPIE Defense and Security Conference, Unmanned Systems Technology IX, Orlando, Florida, April 9-13, 2007.
[5] D. Holz, D. Droeschel, S. Behnke, S. May, and H. Surmann. Fast 3D Perception for Collision Avoidance and SLAM in Domestic Environments. In Mobile Robots Navigation, Alejandra Barrera (Ed.), IN-TECH Education and Publishing, Vienna, Austria, 2010.
[6] H. El Mokni, L. Broetje, F. Govaers, M. Wieneke. Coupled Sonar Inertial Navigation System for Pedestrian Tracking. International Conference on Information Fusion, 2010.
[7] C. Kessler, C. Ascher, G. F. Trommer. Multi-Sensor Indoor Navigation System with Vision- and Laser-based Localization and Mapping Capabilities. European Journal of Navigation, Vol. 9, No. 3, December 2011.
[8] Xsens MTi product page.
[9] Xsens MTi-G product page.
[10] Hokuyo UTM-30LX product page, LASER4.html
[11] J. J. Leonard and H. J. S. Feder. A Computationally Efficient Method for Large-Scale Concurrent Mapping and Localization. Ninth Int. Symp. on Robotics Research, Salt Lake City, Utah, USA, 1999.
[12] D. Chekhlov, M. Pupilli, W. Mayol-Cuevas, and A. Calway. Real-Time and Robust Monocular SLAM Using Predictive Multi-Resolution Descriptors. In Proc. 2nd Int. Symp. on Visual Computing, 2006.
[13] S. Thrun, Y. Liu, D. Koller, A. Y. Ng, Z. Ghahramani, H. Durrant-Whyte. Simultaneous Localization and Mapping with Sparse Extended Information Filters. International Journal of Robotics Research, Vol. 23, No. 7-8, 2004.
[14] G. Grisetti et al. Improved Techniques for Grid Mapping with Rao-Blackwellized Particle Filters. IEEE Transactions on Robotics, pages 34-46, 2007.


More information

Vehicle Localization. Hannah Rae Kerner 21 April 2015

Vehicle Localization. Hannah Rae Kerner 21 April 2015 Vehicle Localization Hannah Rae Kerner 21 April 2015 Spotted in Mtn View: Google Car Why precision localization? in order for a robot to follow a road, it needs to know where the road is to stay in a particular

More information

Exploitation of GPS-Control Points in low-contrast IR-imagery for homography estimation

Exploitation of GPS-Control Points in low-contrast IR-imagery for homography estimation Exploitation of GPS-Control Points in low-contrast IR-imagery for homography estimation Patrick Dunau 1 Fraunhofer-Institute, of Optronics, Image Exploitation and System Technologies (IOSB), Gutleuthausstr.

More information

W4. Perception & Situation Awareness & Decision making

W4. Perception & Situation Awareness & Decision making W4. Perception & Situation Awareness & Decision making Robot Perception for Dynamic environments: Outline & DP-Grids concept Dynamic Probabilistic Grids Bayesian Occupancy Filter concept Dynamic Probabilistic

More information

3D LIDAR Point Cloud based Intersection Recognition for Autonomous Driving

3D LIDAR Point Cloud based Intersection Recognition for Autonomous Driving 3D LIDAR Point Cloud based Intersection Recognition for Autonomous Driving Quanwen Zhu, Long Chen, Qingquan Li, Ming Li, Andreas Nüchter and Jian Wang Abstract Finding road intersections in advance is

More information

Aided-inertial for Long-term, Self-contained GPS-denied Navigation and Mapping

Aided-inertial for Long-term, Self-contained GPS-denied Navigation and Mapping Aided-inertial for Long-term, Self-contained GPS-denied Navigation and Mapping Erik Lithopoulos, Louis Lalumiere, Ron Beyeler Applanix Corporation Greg Spurlock, LTC Bruce Williams Defense Threat Reduction

More information

Intelligent Outdoor Navigation of a Mobile Robot Platform Using a Low Cost High Precision RTK-GPS and Obstacle Avoidance System

Intelligent Outdoor Navigation of a Mobile Robot Platform Using a Low Cost High Precision RTK-GPS and Obstacle Avoidance System Intelligent Outdoor Navigation of a Mobile Robot Platform Using a Low Cost High Precision RTK-GPS and Obstacle Avoidance System Under supervision of: Prof. Dr. -Ing. Klaus-Dieter Kuhnert Dipl.-Inform.

More information

Overview. EECS 124, UC Berkeley, Spring 2008 Lecture 23: Localization and Mapping. Statistical Models

Overview. EECS 124, UC Berkeley, Spring 2008 Lecture 23: Localization and Mapping. Statistical Models Introduction ti to Embedded dsystems EECS 124, UC Berkeley, Spring 2008 Lecture 23: Localization and Mapping Gabe Hoffmann Ph.D. Candidate, Aero/Astro Engineering Stanford University Statistical Models

More information

MEMS technology quality requirements as applied to multibeam echosounder. Jerzy DEMKOWICZ, Krzysztof BIKONIS

MEMS technology quality requirements as applied to multibeam echosounder. Jerzy DEMKOWICZ, Krzysztof BIKONIS MEMS technology quality requirements as applied to multibeam echosounder Jerzy DEMKOWICZ, Krzysztof BIKONIS Gdansk University of Technology Gdansk, Narutowicza str. 11/12, Poland demjot@eti.pg.gda.pl Small,

More information

Efficient SLAM Scheme Based ICP Matching Algorithm Using Image and Laser Scan Information

Efficient SLAM Scheme Based ICP Matching Algorithm Using Image and Laser Scan Information Proceedings of the World Congress on Electrical Engineering and Computer Systems and Science (EECSS 2015) Barcelona, Spain July 13-14, 2015 Paper No. 335 Efficient SLAM Scheme Based ICP Matching Algorithm

More information

Camera Drones Lecture 2 Control and Sensors

Camera Drones Lecture 2 Control and Sensors Camera Drones Lecture 2 Control and Sensors Ass.Prof. Friedrich Fraundorfer WS 2017 1 Outline Quadrotor control principles Sensors 2 Quadrotor control - Hovering Hovering means quadrotor needs to hold

More information

AN INCREMENTAL SLAM ALGORITHM FOR INDOOR AUTONOMOUS NAVIGATION

AN INCREMENTAL SLAM ALGORITHM FOR INDOOR AUTONOMOUS NAVIGATION 20th IMEKO TC4 International Symposium and 18th International Workshop on ADC Modelling and Testing Research on Electric and Electronic Measurement for the Economic Upturn Benevento, Italy, September 15-17,

More information

Probabilistic Robotics

Probabilistic Robotics Probabilistic Robotics FastSLAM Sebastian Thrun (abridged and adapted by Rodrigo Ventura in Oct-2008) The SLAM Problem SLAM stands for simultaneous localization and mapping The task of building a map while

More information

Inertial Navigation Systems

Inertial Navigation Systems Inertial Navigation Systems Kiril Alexiev University of Pavia March 2017 1 /89 Navigation Estimate the position and orientation. Inertial navigation one of possible instruments. Newton law is used: F =

More information

CAMERA POSE ESTIMATION OF RGB-D SENSORS USING PARTICLE FILTERING

CAMERA POSE ESTIMATION OF RGB-D SENSORS USING PARTICLE FILTERING CAMERA POSE ESTIMATION OF RGB-D SENSORS USING PARTICLE FILTERING By Michael Lowney Senior Thesis in Electrical Engineering University of Illinois at Urbana-Champaign Advisor: Professor Minh Do May 2015

More information

Low Cost solution for Pose Estimation of Quadrotor

Low Cost solution for Pose Estimation of Quadrotor Low Cost solution for Pose Estimation of Quadrotor mangal@iitk.ac.in https://www.iitk.ac.in/aero/mangal/ Intelligent Guidance and Control Laboratory Indian Institute of Technology, Kanpur Mangal Kothari

More information

Collaboration is encouraged among small groups (e.g., 2-3 students).

Collaboration is encouraged among small groups (e.g., 2-3 students). Assignments Policies You must typeset, choices: Word (very easy to type math expressions) Latex (very easy to type math expressions) Google doc Plain text + math formula Your favorite text/doc editor Submit

More information

6D SLAM with Kurt3D. Andreas Nüchter, Kai Lingemann, Joachim Hertzberg

6D SLAM with Kurt3D. Andreas Nüchter, Kai Lingemann, Joachim Hertzberg 6D SLAM with Kurt3D Andreas Nüchter, Kai Lingemann, Joachim Hertzberg University of Osnabrück, Institute of Computer Science Knowledge Based Systems Research Group Albrechtstr. 28, D-4969 Osnabrück, Germany

More information

CS 4758 Robot Navigation Through Exit Sign Detection

CS 4758 Robot Navigation Through Exit Sign Detection CS 4758 Robot Navigation Through Exit Sign Detection Aaron Sarna Michael Oleske Andrew Hoelscher Abstract We designed a set of algorithms that utilize the existing corridor navigation code initially created

More information

Optimization of the Simultaneous Localization and Map-Building Algorithm for Real-Time Implementation

Optimization of the Simultaneous Localization and Map-Building Algorithm for Real-Time Implementation 242 IEEE TRANSACTIONS ON ROBOTICS AND AUTOMATION, VOL. 17, NO. 3, JUNE 2001 Optimization of the Simultaneous Localization and Map-Building Algorithm for Real-Time Implementation José E. Guivant and Eduardo

More information

Robot Localization based on Geo-referenced Images and G raphic Methods

Robot Localization based on Geo-referenced Images and G raphic Methods Robot Localization based on Geo-referenced Images and G raphic Methods Sid Ahmed Berrabah Mechanical Department, Royal Military School, Belgium, sidahmed.berrabah@rma.ac.be Janusz Bedkowski, Łukasz Lubasiński,

More information

Camera and Inertial Sensor Fusion

Camera and Inertial Sensor Fusion January 6, 2018 For First Robotics 2018 Camera and Inertial Sensor Fusion David Zhang david.chao.zhang@gmail.com Version 4.1 1 My Background Ph.D. of Physics - Penn State Univ. Research scientist at SRI

More information

CS4758: Moving Person Avoider

CS4758: Moving Person Avoider CS4758: Moving Person Avoider Yi Heng Lee, Sze Kiat Sim Abstract We attempt to have a quadrotor autonomously avoid people while moving through an indoor environment. Our algorithm for detecting people

More information

5. Tests and results Scan Matching Optimization Parameters Influence

5. Tests and results Scan Matching Optimization Parameters Influence 126 5. Tests and results This chapter presents results obtained using the proposed method on simulated and real data. First, it is analyzed the scan matching optimization; after that, the Scan Matching

More information

Selection and Integration of Sensors Alex Spitzer 11/23/14

Selection and Integration of Sensors Alex Spitzer 11/23/14 Selection and Integration of Sensors Alex Spitzer aes368@cornell.edu 11/23/14 Sensors Perception of the outside world Cameras, DVL, Sonar, Pressure Accelerometers, Gyroscopes, Magnetometers Position vs

More information

Perspective Sensing for Inertial Stabilization

Perspective Sensing for Inertial Stabilization Perspective Sensing for Inertial Stabilization Dr. Bernard A. Schnaufer Jeremy Nadke Advanced Technology Center Rockwell Collins, Inc. Cedar Rapids, IA Agenda Rockwell Collins & the Advanced Technology

More information

LOAM: LiDAR Odometry and Mapping in Real Time

LOAM: LiDAR Odometry and Mapping in Real Time LOAM: LiDAR Odometry and Mapping in Real Time Aayush Dwivedi (14006), Akshay Sharma (14062), Mandeep Singh (14363) Indian Institute of Technology Kanpur 1 Abstract This project deals with online simultaneous

More information

Localization, Where am I?

Localization, Where am I? 5.1 Localization, Where am I?? position Position Update (Estimation?) Encoder Prediction of Position (e.g. odometry) YES matched observations Map data base predicted position Matching Odometry, Dead Reckoning

More information

E80. Experimental Engineering. Lecture 9 Inertial Measurement

E80. Experimental Engineering. Lecture 9 Inertial Measurement Lecture 9 Inertial Measurement http://www.volker-doormann.org/physics.htm Feb. 19, 2013 Christopher M. Clark Where is the rocket? Outline Sensors People Accelerometers Gyroscopes Representations State

More information

Improving Door Detection for Mobile Robots by fusing Camera and Laser-Based Sensor Data

Improving Door Detection for Mobile Robots by fusing Camera and Laser-Based Sensor Data Improving Door Detection for Mobile Robots by fusing Camera and Laser-Based Sensor Data Jens Hensler, Michael Blaich, and Oliver Bittel University of Applied Sciences Brauneggerstr. 55, 78462 Konstanz,

More information

Personal Navigation and Indoor Mapping: Performance Characterization of Kinect Sensor-based Trajectory Recovery

Personal Navigation and Indoor Mapping: Performance Characterization of Kinect Sensor-based Trajectory Recovery Personal Navigation and Indoor Mapping: Performance Characterization of Kinect Sensor-based Trajectory Recovery 1 Charles TOTH, 1 Dorota BRZEZINSKA, USA 2 Allison KEALY, Australia, 3 Guenther RETSCHER,

More information

Localization and Map Building

Localization and Map Building Localization and Map Building Noise and aliasing; odometric position estimation To localize or not to localize Belief representation Map representation Probabilistic map-based localization Other examples

More information

Geometrical Feature Extraction Using 2D Range Scanner

Geometrical Feature Extraction Using 2D Range Scanner Geometrical Feature Extraction Using 2D Range Scanner Sen Zhang Lihua Xie Martin Adams Fan Tang BLK S2, School of Electrical and Electronic Engineering Nanyang Technological University, Singapore 639798

More information

Indoor positioning based on foot-mounted IMU

Indoor positioning based on foot-mounted IMU BULLETIN OF THE POLISH ACADEMY OF SCIENCES TECHNICAL SCIENCES, Vol. 63, No. 3, 2015 DOI: 10.1515/bpasts-2015-0074 Indoor positioning based on foot-mounted IMU H. GUO 1, M. URADZINSKI 2, H. YIN 1, and M.

More information

(1) and s k ωk. p k vk q

(1) and s k ωk. p k vk q Sensing and Perception: Localization and positioning Isaac Sog Project Assignment: GNSS aided INS In this project assignment you will wor with a type of navigation system referred to as a global navigation

More information

Aided-inertial for GPS-denied Navigation and Mapping

Aided-inertial for GPS-denied Navigation and Mapping Aided-inertial for GPS-denied Navigation and Mapping Erik Lithopoulos Applanix Corporation 85 Leek Crescent, Richmond Ontario, Canada L4B 3B3 elithopoulos@applanix.com ABSTRACT This paper describes the

More information

MAPPING ALGORITHM FOR AUTONOMOUS NAVIGATION OF LAWN MOWER USING SICK LASER

MAPPING ALGORITHM FOR AUTONOMOUS NAVIGATION OF LAWN MOWER USING SICK LASER MAPPING ALGORITHM FOR AUTONOMOUS NAVIGATION OF LAWN MOWER USING SICK LASER A thesis submitted in partial fulfillment of the requirements for the degree of Master of Science in Engineering By SHASHIDHAR

More information

NAVIGATION SYSTEM OF AN OUTDOOR SERVICE ROBOT WITH HYBRID LOCOMOTION SYSTEM

NAVIGATION SYSTEM OF AN OUTDOOR SERVICE ROBOT WITH HYBRID LOCOMOTION SYSTEM NAVIGATION SYSTEM OF AN OUTDOOR SERVICE ROBOT WITH HYBRID LOCOMOTION SYSTEM Jorma Selkäinaho, Aarne Halme and Janne Paanajärvi Automation Technology Laboratory, Helsinki University of Technology, Espoo,

More information

This chapter explains two techniques which are frequently used throughout

This chapter explains two techniques which are frequently used throughout Chapter 2 Basic Techniques This chapter explains two techniques which are frequently used throughout this thesis. First, we will introduce the concept of particle filters. A particle filter is a recursive

More information

3D Terrain Sensing System using Laser Range Finder with Arm-Type Movable Unit

3D Terrain Sensing System using Laser Range Finder with Arm-Type Movable Unit 3D Terrain Sensing System using Laser Range Finder with Arm-Type Movable Unit 9 Toyomi Fujita and Yuya Kondo Tohoku Institute of Technology Japan 1. Introduction A 3D configuration and terrain sensing

More information

Implementation of Odometry with EKF for Localization of Hector SLAM Method

Implementation of Odometry with EKF for Localization of Hector SLAM Method Implementation of Odometry with EKF for Localization of Hector SLAM Method Kao-Shing Hwang 1 Wei-Cheng Jiang 2 Zuo-Syuan Wang 3 Department of Electrical Engineering, National Sun Yat-sen University, Kaohsiung,

More information

Programming-By-Example Gesture Recognition Kevin Gabayan, Steven Lansel December 15, 2006

Programming-By-Example Gesture Recognition Kevin Gabayan, Steven Lansel December 15, 2006 Programming-By-Example Gesture Recognition Kevin Gabayan, Steven Lansel December 15, 6 Abstract Machine learning and hardware improvements to a programming-by-example rapid prototyping system are proposed.

More information

Probabilistic Robotics. FastSLAM

Probabilistic Robotics. FastSLAM Probabilistic Robotics FastSLAM The SLAM Problem SLAM stands for simultaneous localization and mapping The task of building a map while estimating the pose of the robot relative to this map Why is SLAM

More information

Autonomous Landing of an Unmanned Aerial Vehicle

Autonomous Landing of an Unmanned Aerial Vehicle Autonomous Landing of an Unmanned Aerial Vehicle Joel Hermansson, Andreas Gising Cybaero AB SE-581 12 Linköping, Sweden Email: {joel.hermansson, andreas.gising}@cybaero.se Martin Skoglund and Thomas B.

More information

UAV Autonomous Navigation in a GPS-limited Urban Environment

UAV Autonomous Navigation in a GPS-limited Urban Environment UAV Autonomous Navigation in a GPS-limited Urban Environment Yoko Watanabe DCSD/CDIN JSO-Aerial Robotics 2014/10/02-03 Introduction 2 Global objective Development of a UAV onboard system to maintain flight

More information

Estimation of Altitude and Vertical Velocity for Multirotor Aerial Vehicle using Kalman Filter

Estimation of Altitude and Vertical Velocity for Multirotor Aerial Vehicle using Kalman Filter Estimation of Altitude and Vertical Velocity for Multirotor Aerial Vehicle using Kalman Filter Przemys law G asior, Stanis law Gardecki, Jaros law Gośliński and Wojciech Giernacki Poznan University of

More information

DriftLess Technology to improve inertial sensors

DriftLess Technology to improve inertial sensors Slide 1 of 19 DriftLess Technology to improve inertial sensors Marcel Ruizenaar, TNO marcel.ruizenaar@tno.nl Slide 2 of 19 Topics Problem, Drift in INS due to bias DriftLess technology What is it How it

More information

A METHOD OF MAP MATCHING FOR PERSONAL POSITIONING SYSTEMS

A METHOD OF MAP MATCHING FOR PERSONAL POSITIONING SYSTEMS The 21 st Asian Conference on Remote Sensing December 4-8, 2000 Taipei, TAIWA A METHOD OF MAP MATCHIG FOR PERSOAL POSITIOIG SSTEMS Kay KITAZAWA, usuke KOISHI, Ryosuke SHIBASAKI Ms., Center for Spatial

More information

Test Report iµvru. (excerpt) Commercial-in-Confidence. imar Navigation GmbH Im Reihersbruch 3 D St. Ingbert Germany.

Test Report iµvru. (excerpt) Commercial-in-Confidence. imar Navigation GmbH Im Reihersbruch 3 D St. Ingbert Germany. 1 of 11 (excerpt) Commercial-in-Confidence imar Navigation GmbH Im Reihersbruch 3 D-66386 St. Ingbert Germany www.imar-navigation.de sales@imar-navigation.de 2 of 11 CHANGE RECORD Date Issue Paragraph

More information

Humanoid Robotics. Monte Carlo Localization. Maren Bennewitz

Humanoid Robotics. Monte Carlo Localization. Maren Bennewitz Humanoid Robotics Monte Carlo Localization Maren Bennewitz 1 Basis Probability Rules (1) If x and y are independent: Bayes rule: Often written as: The denominator is a normalizing constant that ensures

More information

Matching Evaluation of 2D Laser Scan Points using Observed Probability in Unstable Measurement Environment

Matching Evaluation of 2D Laser Scan Points using Observed Probability in Unstable Measurement Environment Matching Evaluation of D Laser Scan Points using Observed Probability in Unstable Measurement Environment Taichi Yamada, and Akihisa Ohya Abstract In the real environment such as urban areas sidewalk,

More information

Motion estimation of unmanned marine vehicles Massimo Caccia

Motion estimation of unmanned marine vehicles Massimo Caccia Motion estimation of unmanned marine vehicles Massimo Caccia Consiglio Nazionale delle Ricerche Istituto di Studi sui Sistemi Intelligenti per l Automazione Via Amendola 122 D/O, 70126, Bari, Italy massimo.caccia@ge.issia.cnr.it

More information

Rigorous Scan Data Adjustment for kinematic LIDAR systems

Rigorous Scan Data Adjustment for kinematic LIDAR systems Rigorous Scan Data Adjustment for kinematic LIDAR systems Paul Swatschina Riegl Laser Measurement Systems ELMF Amsterdam, The Netherlands 13 November 2013 www.riegl.com Contents why kinematic scan data

More information

ADVANTAGES OF INS CONTROL SYSTEMS

ADVANTAGES OF INS CONTROL SYSTEMS ADVANTAGES OF INS CONTROL SYSTEMS Pavol BOŽEK A, Aleksander I. KORŠUNOV B A Institute of Applied Informatics, Automation and Mathematics, Faculty of Material Science and Technology, Slovak University of

More information

Simultaneous Localization and Mapping (SLAM)

Simultaneous Localization and Mapping (SLAM) Simultaneous Localization and Mapping (SLAM) RSS Lecture 16 April 8, 2013 Prof. Teller Text: Siegwart and Nourbakhsh S. 5.8 SLAM Problem Statement Inputs: No external coordinate reference Time series of

More information

Lecture 13 Visual Inertial Fusion

Lecture 13 Visual Inertial Fusion Lecture 13 Visual Inertial Fusion Davide Scaramuzza Course Evaluation Please fill the evaluation form you received by email! Provide feedback on Exercises: good and bad Course: good and bad How to improve

More information

A High Precision Reference Data Set for Pedestrian Navigation using Foot-Mounted Inertial Sensors

A High Precision Reference Data Set for Pedestrian Navigation using Foot-Mounted Inertial Sensors 200 International Conference on Indoor Positioning and Indoor Navigation (IPIN), 5-7 September 200, Zürich, Switzerland A High Precision Reference Data Set for Pedestrian Navigation using Foot-Mounted

More information

Inertial Navigation Static Calibration

Inertial Navigation Static Calibration INTL JOURNAL OF ELECTRONICS AND TELECOMMUNICATIONS, 2018, VOL. 64, NO. 2, PP. 243 248 Manuscript received December 2, 2017; revised April, 2018. DOI: 10.24425/119518 Inertial Navigation Static Calibration

More information

3-D MAP GENERATION BY A MOBILE ROBOT EQUIPPED WITH A LASER RANGE FINDER. Takumi Nakamoto, Atsushi Yamashita, and Toru Kaneko

3-D MAP GENERATION BY A MOBILE ROBOT EQUIPPED WITH A LASER RANGE FINDER. Takumi Nakamoto, Atsushi Yamashita, and Toru Kaneko 3-D AP GENERATION BY A OBILE ROBOT EQUIPPED WITH A LAER RANGE FINDER Takumi Nakamoto, Atsushi Yamashita, and Toru Kaneko Department of echanical Engineering, hizuoka Univerty 3-5-1 Johoku, Hamamatsu-shi,

More information

Outline Sensors. EE Sensors. H.I. Bozma. Electric Electronic Engineering Bogazici University. December 13, 2017

Outline Sensors. EE Sensors. H.I. Bozma. Electric Electronic Engineering Bogazici University. December 13, 2017 Electric Electronic Engineering Bogazici University December 13, 2017 Absolute position measurement Outline Motion Odometry Inertial systems Environmental Tactile Proximity Sensing Ground-Based RF Beacons

More information

EE368 Project: Visual Code Marker Detection

EE368 Project: Visual Code Marker Detection EE368 Project: Visual Code Marker Detection Kahye Song Group Number: 42 Email: kahye@stanford.edu Abstract A visual marker detection algorithm has been implemented and tested with twelve training images.

More information

Evaluation and Comparison of Performance Analysis of Indoor Inertial Navigation system Based on Foot Mounted IMU

Evaluation and Comparison of Performance Analysis of Indoor Inertial Navigation system Based on Foot Mounted IMU Evaluation and Comparison of Performance Analysis of Indoor Inertial Navigation system Based on Foot Mounted IMU Feyissa Woyano ab, Soyeon Lee b, Sangjoon Park ab a Department of Computer Software and,

More information

Basics of Localization, Mapping and SLAM. Jari Saarinen Aalto University Department of Automation and systems Technology

Basics of Localization, Mapping and SLAM. Jari Saarinen Aalto University Department of Automation and systems Technology Basics of Localization, Mapping and SLAM Jari Saarinen Aalto University Department of Automation and systems Technology Content Introduction to Problem (s) Localization A few basic equations Dead Reckoning

More information

Stable Vision-Aided Navigation for Large-Area Augmented Reality

Stable Vision-Aided Navigation for Large-Area Augmented Reality Stable Vision-Aided Navigation for Large-Area Augmented Reality Taragay Oskiper, Han-Pang Chiu, Zhiwei Zhu Supun Samarasekera, Rakesh Teddy Kumar Vision and Robotics Laboratory SRI-International Sarnoff,

More information

Error Simulation and Multi-Sensor Data Fusion

Error Simulation and Multi-Sensor Data Fusion Error Simulation and Multi-Sensor Data Fusion AERO4701 Space Engineering 3 Week 6 Last Week Looked at the problem of attitude determination for satellites Examined several common methods such as inertial

More information

Proceedings of the 2016 Winter Simulation Conference T. M. K. Roeder, P. I. Frazier, R. Szechtman, E. Zhou, T. Huschka, and S. E. Chick, eds.

Proceedings of the 2016 Winter Simulation Conference T. M. K. Roeder, P. I. Frazier, R. Szechtman, E. Zhou, T. Huschka, and S. E. Chick, eds. Proceedings of the 2016 Winter Simulation Conference T. M. K. Roeder, P. I. Frazier, R. Szechtman, E. Zhou, T. Huschka, and S. E. Chick, eds. SIMULATION RESULTS FOR LOCALIZATION AND MAPPING ALGORITHMS

More information

GPS denied Navigation Solutions

GPS denied Navigation Solutions GPS denied Navigation Solutions Krishnraj Singh Gaur and Mangal Kothari ksgaur@iitk.ac.in, mangal@iitk.ac.in https://www.iitk.ac.in/aero/mangal/ Intelligent Guidance and Control Laboratory Indian Institute

More information

Sensory Augmentation for Increased Awareness of Driving Environment

Sensory Augmentation for Increased Awareness of Driving Environment Sensory Augmentation for Increased Awareness of Driving Environment Pranay Agrawal John M. Dolan Dec. 12, 2014 Technologies for Safe and Efficient Transportation (T-SET) UTC The Robotics Institute Carnegie

More information

Evaluating the Performance of a Vehicle Pose Measurement System

Evaluating the Performance of a Vehicle Pose Measurement System Evaluating the Performance of a Vehicle Pose Measurement System Harry Scott Sandor Szabo National Institute of Standards and Technology Abstract A method is presented for evaluating the performance of

More information

Accurate Motion Estimation and High-Precision 3D Reconstruction by Sensor Fusion

Accurate Motion Estimation and High-Precision 3D Reconstruction by Sensor Fusion 007 IEEE International Conference on Robotics and Automation Roma, Italy, 0-4 April 007 FrE5. Accurate Motion Estimation and High-Precision D Reconstruction by Sensor Fusion Yunsu Bok, Youngbae Hwang,

More information

Satellite/Inertial Navigation and Positioning System (SINAPS)

Satellite/Inertial Navigation and Positioning System (SINAPS) Satellite/Inertial Navigation and Positioning System (SINAPS) Functional Requirements List and Performance Specifications by Daniel Monroe, Luke Pfister Advised By Drs. In Soo Ahn and Yufeng Lu ECE Department

More information

Calibration of a rotating multi-beam Lidar

Calibration of a rotating multi-beam Lidar The 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems October 18-22, 2010, Taipei, Taiwan Calibration of a rotating multi-beam Lidar Naveed Muhammad 1,2 and Simon Lacroix 1,2 Abstract

More information

An actor-critic reinforcement learning controller for a 2-DOF ball-balancer

An actor-critic reinforcement learning controller for a 2-DOF ball-balancer An actor-critic reinforcement learning controller for a 2-DOF ball-balancer Andreas Stückl Michael Meyer Sebastian Schelle Projektpraktikum: Computational Neuro Engineering 2 Empty side for praktikums

More information

Towards Optimal 3D Point Clouds

Towards Optimal 3D Point Clouds By Andreas Nüchter, Jan Elseberg and Dorit Borrmann, Germany feature Automation in 3D Mobile Laser Scanning Towards Optimal 3D Point Clouds Motivated by the increasing need for rapid characterisation of

More information

Dealing with Scale. Stephan Weiss Computer Vision Group NASA-JPL / CalTech

Dealing with Scale. Stephan Weiss Computer Vision Group NASA-JPL / CalTech Dealing with Scale Stephan Weiss Computer Vision Group NASA-JPL / CalTech Stephan.Weiss@ieee.org (c) 2013. Government sponsorship acknowledged. Outline Why care about size? The IMU as scale provider: The

More information

Sensor Modalities. Sensor modality: Different modalities:

Sensor Modalities. Sensor modality: Different modalities: Sensor Modalities Sensor modality: Sensors which measure same form of energy and process it in similar ways Modality refers to the raw input used by the sensors Different modalities: Sound Pressure Temperature

More information

Performance Evaluation of INS Based MEMES Inertial Measurement Unit

Performance Evaluation of INS Based MEMES Inertial Measurement Unit Int'l Journal of Computing, Communications & Instrumentation Engg. (IJCCIE) Vol. 2, Issue 1 (215) ISSN 2349-1469 EISSN 2349-1477 Performance Evaluation of Based MEMES Inertial Measurement Unit Othman Maklouf

More information

Detection and Tracking of Moving Objects Using 2.5D Motion Grids

Detection and Tracking of Moving Objects Using 2.5D Motion Grids Detection and Tracking of Moving Objects Using 2.5D Motion Grids Alireza Asvadi, Paulo Peixoto and Urbano Nunes Institute of Systems and Robotics, University of Coimbra September 2015 1 Outline: Introduction

More information

CS 4758: Automated Semantic Mapping of Environment

CS 4758: Automated Semantic Mapping of Environment CS 4758: Automated Semantic Mapping of Environment Dongsu Lee, ECE, M.Eng., dl624@cornell.edu Aperahama Parangi, CS, 2013, alp75@cornell.edu Abstract The purpose of this project is to program an Erratic

More information

The Performance Evaluation of the Integration of Inertial Navigation System and Global Navigation Satellite System with Analytic Constraints

The Performance Evaluation of the Integration of Inertial Navigation System and Global Navigation Satellite System with Analytic Constraints Journal of Environmental Science and Engineering A 6 (2017) 313-319 doi:10.17265/2162-5298/2017.06.005 D DAVID PUBLISHING The Performance Evaluation of the Integration of Inertial Navigation System and

More information

Exterior Orientation Parameters

Exterior Orientation Parameters Exterior Orientation Parameters PERS 12/2001 pp 1321-1332 Karsten Jacobsen, Institute for Photogrammetry and GeoInformation, University of Hannover, Germany The georeference of any photogrammetric product

More information

Simultaneous Localization

Simultaneous Localization Simultaneous Localization and Mapping (SLAM) RSS Technical Lecture 16 April 9, 2012 Prof. Teller Text: Siegwart and Nourbakhsh S. 5.8 Navigation Overview Where am I? Where am I going? Localization Assumed

More information

SLAM: Robotic Simultaneous Location and Mapping

SLAM: Robotic Simultaneous Location and Mapping SLAM: Robotic Simultaneous Location and Mapping William Regli Department of Computer Science (and Departments of ECE and MEM) Drexel University Acknowledgments to Sebastian Thrun & others SLAM Lecture

More information

Graph-based SLAM (Simultaneous Localization And Mapping) for Bridge Inspection Using UAV (Unmanned Aerial Vehicle)

Graph-based SLAM (Simultaneous Localization And Mapping) for Bridge Inspection Using UAV (Unmanned Aerial Vehicle) Graph-based SLAM (Simultaneous Localization And Mapping) for Bridge Inspection Using UAV (Unmanned Aerial Vehicle) Taekjun Oh 1), Sungwook Jung 2), Seungwon Song 3), and Hyun Myung 4) 1), 2), 3), 4) Urban

More information

TEST EXAM PART 2 INTERMEDIATE LAND NAVIGATION

TEST EXAM PART 2 INTERMEDIATE LAND NAVIGATION NAME DATE TEST EXAM PART 2 INTERMEDIATE LAND NAVIGATION 1. Knowing these four basic skills, it is impossible to be totally lost; what are they? a. Track Present Location / Determine Distance / Sense of

More information

3D-Reconstruction of Indoor Environments from Human Activity

3D-Reconstruction of Indoor Environments from Human Activity 3D-Reconstruction of Indoor Environments from Human Activity Barbara Frank Michael Ruhnke Maxim Tatarchenko Wolfram Burgard Abstract Observing human activities can reveal a lot about the structure of the

More information

Evaluation of a laser-based reference system for ADAS

Evaluation of a laser-based reference system for ADAS 23 rd ITS World Congress, Melbourne, Australia, 10 14 October 2016 Paper number ITS- EU-TP0045 Evaluation of a laser-based reference system for ADAS N. Steinhardt 1*, S. Kaufmann 2, S. Rebhan 1, U. Lages

More information