Outdoor Exploration and SLAM using a Compressed Filter

John Folkesson and Henrik Christensen
Centre for Autonomous Systems, Numerical Analysis and Computer Science, Royal Institute of Technology, Stockholm, Sweden, johnf@nada.kth.se

Abstract — In this paper we describe the use of automatic exploration for autonomous mapping of outdoor scenes. We describe a real-time SLAM implementation along with an autonomous exploration algorithm. We have implemented SLAM with a compressed extended Kalman filter (CEKF) on an outdoor robot. Our implementation uses walls of buildings as features. The state predictions are made by using a combination of odometry and inertial data. The system was tested on a 200 x 200 m site with 18 buildings on variable terrain. The paper helps explain some of the implementation details of the compressed filter, such as how to organize the map, as well as more general issues such as how to include the effects of pitch and roll and how to detect features efficiently.

I. INTRODUCTION

Outdoor environments present many challenges for an autonomous vehicle. Compared to indoor environments, distances and speeds are higher, the ground surface is not level, there are more movements and changes in the surroundings, and generally there is a larger variety of structures. The robot can encounter everything from the smooth walls of buildings to rocky hills covered with vegetation. Robot autonomy requires the availability of location information to enable task achievement. The exact precision requirements for localisation will in general depend upon the specific task. GPS can in some cases solve the localization problem, but one is often more interested in relative positions, so it may be more useful to know that the robot is at an intersection or in front of a building than to know its exact latitude and longitude. A robot can localize itself on a suitable map of landmarks if such a map is available, [1], [2] and [3]. If no such map is available, a map can be created by the robot while it simultaneously tracks its position on the map. This is known as SLAM, simultaneous localization and mapping. Good SLAM is considered to be essential for mobile robots to become autonomous. SLAM is just beginning to be used in real-time to control robot behaviour. Newman et al. [4] have used real-time SLAM while controlling a robot remotely during the explore stage and then having the robot return home by retracing its path.

Much recent progress has been made on the SLAM problem, and a variety of tools are now available to solve it. Particle filters and Kalman filters are the two main subdivisions of these methods. We focus on the Kalman filter approach in this paper. The Kalman filter works well as long as errors are small and data association is never wrong. The idea is to make small adjustments to the positions of all the features of the map, along with the position of the robot, based on the difference between the current observation and the value expected for that observation as given by the map and dead-reckoning. As the errors must be kept small, non-linear dependencies can be linearized, resulting in an extended Kalman filter, EKF, [5]. The EKF is a recursive algorithm in which the update equations for the state are all linear. The state vector consists of the position and orientation of the robot, the pose, along with the positions of all the features on the map. The heart of the EKF approach is the covariance matrix of the state vector. This covariance matrix is the glue which holds the map together.
The correlations between coordinates tell us how much a change in one will pull the other with it. Thus one can discover that a wall should be turned 1 degree, but the correlation to the angle of another wall might be high, so that it too must turn 1 degree. Unfortunately, keeping track of all these correlations is computationally expensive. This limits the size of maps that can be calculated in real-time using a Kalman filter. One solution to this is the compressed extended Kalman filter, CEKF, [6]. The idea with the CEKF is that one need not calculate the whole map on each iteration; one need only update the part of the map one can see. The update of the rest is postponed. The information to do the eventual update must be saved, and Guivant and Nebot [6] show how to save the needed information in a compact form. After each iteration one needs to update a few matrices that are of order the size of the part of the map that one can see. These matrices are much smaller than the full map, so the calculation is only a problem when a full update is done, typically after several hundred iterations of the compressed filter. An important point is that the CEKF is entirely equivalent to the EKF: the final results are the same but the calculations are reduced.
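To make the bookkeeping concrete, the following is a minimal sketch of the compressed-filter mechanics as we read them from [6]. The class name, the variable names and the exact form of the recursions are our reconstruction for illustration, not code from either paper; Φ, Ψ and θ are the small accumulator matrices mentioned above, and the unobserved part of the map (x_b, P_ab, P_bb) stays frozen at "time 0" until a full update.

    import numpy as np

    class CompressedEKF:
        """Sketch of compressed-filter bookkeeping (our reconstruction of [6]).
        x_a, P_aa: the visible (active) part, updated every iteration.
        x_b0, P_ab0, P_bb0: the rest of the map, frozen at time 0."""

        def __init__(self, x_a, P_aa, x_b, P_ab, P_bb):
            self.x_a, self.P_aa = x_a, P_aa
            self.x_b0, self.P_ab0, self.P_bb0 = x_b, P_ab, P_bb
            a = len(x_a)
            self.Phi = np.eye(a)           # accumulates the effect on P_ab
            self.Psi = np.zeros((a, a))    # accumulates the effect on P_bb
            self.theta = np.zeros(a)       # accumulates the effect on x_b

        def update(self, H, R, nu):
            """Measurement update touching only the visible map. H: Jacobian
            with respect to the active states, R: measurement noise covariance,
            nu: innovation. (The prediction step likewise multiplies Phi by
            the active-state transition Jacobian.)"""
            S = H @ self.P_aa @ H.T + R
            Si = np.linalg.inv(S)
            K = self.P_aa @ H.T @ Si
            # Accumulate the postponed effect on the unobserved part; note
            # that these matrices stay of order a, the size of the visible map.
            self.theta = self.theta + self.Phi.T @ H.T @ Si @ nu
            self.Psi = self.Psi + self.Phi.T @ H.T @ Si @ H @ self.Phi
            self.Phi = (np.eye(len(self.x_a)) - K @ H) @ self.Phi
            # Ordinary EKF update on the active part.
            self.x_a = self.x_a + K @ nu
            self.P_aa = self.P_aa - K @ H @ self.P_aa

        def full_update(self):
            """Bring the whole map to the current time; this costs as much as
            one ordinary EKF iteration on the full state."""
            self.x_b0 = self.x_b0 + self.P_ab0.T @ self.theta
            self.P_bb0 = self.P_bb0 - self.P_ab0.T @ self.Psi @ self.P_ab0
            self.P_ab0 = self.Phi @ self.P_ab0
            a = len(self.x_a)
            self.Phi, self.Psi, self.theta = np.eye(a), np.zeros((a, a)), np.zeros(a)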

During the map-building phase, some parts of the map will not reflect the recent observations. This is not usually a problem, as the map does not change dramatically in areas far from the current visible region. At any time a full update can be done with the full computational complexity of a single EKF iteration.

We have implemented the CEKF on an outdoor robot from iRobot, the ATRV2. The robot is equipped with a dual Pentium computer, a SICK laser scanner, a Crossbow DMU-FOG inertial sensor and an array of 12 sonar sensors. One goal was to be able to bring the robot someplace, designate an area to map, and tell the robot to explore. The robot would then go out, drive around the area and return with a map. There is thus a need to complement the traditional SLAM methods with techniques for autonomous exploration of a region to ensure a complete and consistent map. For this we need to provide a SLAM method operating in real-time, and an exploration method. We chose to use the CEKF as a basis for the SLAM to ensure bounded computational complexity.

II. THE COMPRESSED FILTER

As we stated, the main idea of the compressed EKF is that the map can be separated into two parts, the part being observed and the unobserved part. We will refer to the part being observed as the visible map rather than the local map, to emphasize that it only contains recently seen features. The key insight is that while one makes updates to the visible map using the observations, one can accumulate the effect of these updates on the rest of the map in a set of matrices of order the size of the visible map. Thus, the calculation burden of each iteration becomes nearly constant, until one needs to update the whole map. The equations needed to do this are given in [6]. What was not so clearly stated there, but is quite important in a real implementation, is that as one observes a feature from the part of the map not being updated, one can selectively update only that feature. In other words, one can move a single feature up to the visible map without the need for a full global update. This is important since a full update could take several seconds or even minutes, which would force the robot to stop and think often. By only updating the features as needed, the robot can operate non-stop. A full mathematical description of how a single point can be moved from a local map up to the visible map would take too much space here, but we can outline the basic idea using the notation from [6]. The update matrices can be extended by adding row i = a+1 thus:

    Φ^(a+1,a+1)_k = [ Φ^(aa)_k              0 ]
                    [ −P_ia(0) Ψ^(aa)_k     1 ]        (1)

    Ψ^(a+1,a+1)_k = [ Ψ^(aa)_k   0 ]
                    [ 0          0 ]                   (2)

    θ^(a+1)_k = [ θ^a_k ]
                [ 0     ]                              (3)

Here by P_ia(0) we mean the ith row of the covariance matrix at iteration 0, assuming the visible map was formed at iteration 0. The subscript a indicates the subspace of the state space corresponding to the visible map. The covariance and the position of the added point will be calculated in the normal way for the compressed filter, only we just calculate the ith row. A careful examination of the update equations will show that equations (1) to (3) are what one would get if one had included the extra row from the formation of the visible map.

In our implementation a visible map begins with just the three pose coordinates as the state vector, x_r(k). As features are actually observed, they are put on the visible map. Once the visible map becomes larger than a specified size, a new visible map is begun with only the pose on it. The old visible map becomes a local map from which the current visible map can draw features.
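As an illustration, a sketch of the row extension of Eqs. (1)-(3) is given below. The function name and the sign conventions are our reading of the equations (they depend on how Ψ and θ are defined), so treat this as illustrative rather than definitive; updating the feature's own position and covariance row then proceeds as in the full update, restricted to row i.

    import numpy as np

    def extend_update_matrices(Phi, Psi, theta, P_ia0):
        """Extend the accumulators when a feature is promoted to the visible
        map, following Eqs. (1)-(3). P_ia0: the feature's rows of the
        covariance at iteration 0, restricted to the old active columns."""
        a = Phi.shape[0]
        d = P_ia0.shape[0]                 # 2 coordinates for a point feature
        Phi_x = np.zeros((a + d, a + d))
        Phi_x[:a, :a] = Phi
        Phi_x[a:, :a] = -P_ia0 @ Psi       # how past updates couple to row i
        Phi_x[a:, a:] = np.eye(d)
        Psi_x = np.zeros((a + d, a + d))
        Psi_x[:a, :a] = Psi                # new rows and columns start at zero
        theta_x = np.concatenate([theta, np.zeros(d)])
        return Phi_x, Psi_x, theta_x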
One can think of the visible map as being at the current time, while the local map is at an earlier time. From that point on, as features from the older map are observed again, they will have to be brought up to the current time using the update matrices for the current visible map. Then, as this new visible map becomes too large, the process is repeated. There is now a chain of three maps: two local maps at earlier times and the current visible map. Observed features not on the visible map can now be of one of three types: 1) a new feature not on any local map, which will simply be added to the current visible map; 2) a feature from the previous (second) local map, which will need to be brought up to date using the current update matrices; 3) a feature from the first local map, which will need to first be brought up to the time of the second local map and then up to the current time. This process continues, opening new visible maps and chaining them together, so that each map in this chain has a father and a son. When a feature is observed, the visible map is asked for the feature. If the feature is not on the visible map, and thus not up to date, the visible map asks its father for the feature. This call then travels down the chain until it gets to the map that has the feature. Then the updates are done for that feature back up the chain, with each map updating the feature and then turning it over to its son, who does the same, and so on. In figure 1 we have illustrated the idea. There we show three maps chained together by the update matrices ψ_i, φ_i and θ_i. The first map, LM1, is at a time t1 and the second, LM2, at t2, while the visible map, VM, is at the current time, t3. As the robot continues down the street, wall A will come into view. At that point the feature matcher will ask VM for wall A. As VM does not have such a wall, it will ask its father, LM2, for the wall. LM2 will similarly ask LM1. LM1 will update the wall with ψ1, φ1 and θ1 before giving the wall to LM2, and so on.
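The father/son look-up can be written as a small recursive structure. The sketch below is ours, with the actual bring-up-to-date step (the application of each map's φ, ψ and θ as in Eqs. (1)-(3)) left as a stub:

    class MapLink:
        """One link in the chain of maps; `father` is the next older map."""

        def __init__(self, father=None):
            self.father = father
            self.features = {}   # feature id -> feature at this map's time

        def bring_up_to_date(self, feature):
            # Apply this map's accumulated update matrices (phi, psi, theta)
            # to move the feature from the father's time to this map's time.
            return feature       # stub

        def get(self, fid):
            if fid not in self.features:
                if self.father is None:
                    raise KeyError(fid)   # genuinely new feature
                # Ask the father; the call travels down the chain to whichever
                # map has the feature, and the updates are applied back up.
                self.features[fid] = self.bring_up_to_date(self.father.get(fid))
            return self.features[fid]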

Fig. 1. Here we illustrate the operation of the compressed filter. The dotted lines enclose each of three maps, LM1, LM2 and VM, chained together by the update matrices (Ψ1, Φ1, θ1) and (Ψ2, Φ2, θ2). VM is the visible map at the current time; LM1 and LM2 are local maps at earlier times. The robot moves in the direction shown and wall A is observed. Since it is on LM1, it will need to be first added to LM2 and then to VM before it can be updated with the observation.

It is this feature update that limits the size of the map, or rather of the chain. The size of the global map does not actually ever affect the complexity of the calculation unless a global update is required for some reason. Global updates are not required for pose estimation or local navigation. However, if this chain of maps gets to be too long, the observation of features from a map far down or at the bottom of the chain could lead to significantly longer update times. Thus it may be necessary to stop the robot and do a global update from time to time in order to collapse the chain.

III. FEATURE DETECTION

Our features were walls, which appear as straight lines in the laser scan data. We use two clues to help us find these lines. First, we expect that the lines found in the last scan will be found again in about the same location. We can then use the predicted motion of the laser to narrow our search for these lines. Of course, the lines may be a bit longer or shorter. We refer to these lines as tracked walls. Second, we use the existing map and the predicted pose to do an imaginary scan on the map from the pose. This imaginary scan will then give us a list of walls that we should be able to see, along with the minimum and maximum range and bearing values in the laser scan. We refer to these lines as map walls. The search for both tracked walls and map walls is done using a range-weighted Hough transform (RWHT) [7], where we only use those scan points that have angles and ranges consistent with the line we are looking for. We similarly limit the accumulator space to the region of the line we are looking for. One can think of a 4-D search space formed from the scan angle, scan length, wall distance and wall angle. By restricting the search in this space we can save considerable computation. We begin the search with the tracked walls, then look for the map walls. As the walls are found, the scan segments assigned to each wall are removed, so that the scan data becomes smaller after each wall is found. Finally, the scan points that were not assigned to any wall are used to fill up an RWHT accumulator. The single maximum value from the accumulator is selected as a new line. By only extracting one new line from each scan we avoid having to look through the accumulator after filling it; one simply keeps track of the maximum as one fills the accumulator. This makes the Hough transform much faster. As the new wall will then become a tracked wall, we can find a different wall in the next scan. What this means is that if we start with 5 walls in view, it will take 5 scans to find the shortest one. Once the robot is in motion, new walls generally appear one at a time, so this is not a problem in practice. As previously unseen walls are detected, they will be tested to see if they should be added to the map.
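A sketch of such a restricted, range-weighted Hough search is given below (our illustration; the bin counts, window parameters and names are placeholders). The maximum is tracked while the accumulator is filled, as described above:

    import numpy as np

    def rwht_peak(ranges, bearings, d_window, phi_window, n_d=64, n_phi=64):
        """Range-weighted Hough transform restricted to a (distance, angle)
        window. Each scan point votes for the lines through it, weighted
        by its range."""
        d_lo, d_hi = d_window
        acc = np.zeros((n_phi, n_d))
        best_vote, best_line = 0.0, None
        x = ranges * np.cos(bearings)
        y = ranges * np.sin(bearings)
        for i, phi in enumerate(np.linspace(*phi_window, n_phi)):
            # Perpendicular distance of each point from a line whose normal
            # has angle phi.
            d = x * np.cos(phi) + y * np.sin(phi)
            j = np.floor((d - d_lo) / (d_hi - d_lo) * n_d).astype(int)
            for jj, w in zip(j, ranges):
                if 0 <= jj < n_d:
                    acc[i, jj] += w                  # range-weighted vote
                    if acc[i, jj] > best_vote:       # track max while filling
                        best_vote = acc[i, jj]
                        best_line = (phi, d_lo + (jj + 0.5) * (d_hi - d_lo) / n_d)
        return best_line, best_vote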
In this test, use is made of the previous scan data for the tracked wall. The test checks the variance of the points from a straight line, the length of the continuous line (with no gaps of more than 30 cm), and the number of points per meter of length, the point density. Based on thresholds on these, the wall might be added to the map. Adding the wall could be done in several ways. One choice is to add the wall at the position given by the accumulated data but with a very large variance, and then let the subsequent measurements bring the covariance matrix to a smaller value. This leads to a consistent map. It also throws out all the information gathered up to the time the wall was added. When we tried this, it proved to indeed give nicely behaved covariance matrices; however, the resulting maps were poor. We therefore use an inconsistent method to add walls. We add the wall with the variance given by the variance in the scan data plus the variance in the pose at the time we add the wall; the correlation to the pose is also estimated. This leads to overly optimistic covariances, as the correlations to the other walls are not included. The maps turn out much better, which makes the consistent choice above harder to choose.

IV. DEAD-RECKONING

Our robot uses skid steering to turn. This type of steering will not give accurate odometry data while turning, because the turning angle will depend on the road surface, turning radius, incline of the ground and so on. One only need see the robot turning on a sandy slope while simultaneously sliding downhill to realize the uselessness of the odometry data while turning. For this reason we augmented the odometry with an inertial sensor.

We use a Crossbow DMU-FOG sensor, which is a combination of 3 micro-machined linear accelerometers and 3 fiber-optic gyros. The linear accelerometers are used to correct for drift in the sensor's estimate of the vertical direction. The average linear acceleration is assumed to be due to gravity, and the pitch and roll angles are adjusted to agree with this fact over long times. The relative weight given to the gyros with respect to the accelerometers is adjustable internally by setting a parameter called the erection rate; we set this to 10 degrees per minute. We use the inertial sensor to keep track of the vertical axis, which we simply accept as adequate for our application, and use the gyros along with the vertical axis to estimate incremental rotations in the horizontal plane. These pitch and roll angles are also used to adjust the measurements that go into the Kalman filter update phase. We take readings of these adjusted pitch and roll angles, along with the unadjusted angular velocities about the three axes, at 35 msec intervals.

In order to remove some of the noise, we low-pass filter the angular velocities. The low-pass filter is implemented by a recursive formula:

    ω̄_k = a ω̄_(k−1) + (1 − a) ω_k        (4)

Here ω_k is the angular velocity vector in 3 dimensions and ω̄_k is the filtered angular velocity. This filter introduces a delay. To see the effect of the delay, we can imagine a unit pulse of angular velocity occurring at time k = 0. The resulting angle θ_k will then be given by

    θ_k = 1 − a^(k+1),  k ≥ 0        (5)

Here we face a trade-off: by making a small we can hold the delay small, but we will have noisier angle readings. We chose a = 0.6, which means that the unit pulse takes 3.5 sample times to reach 90% of the correct angle. This would lead to a problem with data registration. The time stamps of the angular velocities need to be adjusted before combining them with pitch and roll and with the odometry and laser data. We tried a few values and settled on subtracting 2 sample times from the unfiltered inertial time stamp to get the filtered timestamp. At this point it may be good to point out that in order to do the SLAM update step, all sensor data will be interpolated to the time of the laser scan data, which is 200 msec old when we get it. Thus delaying the inertial timestamp by 70 msec does not cause us to have to extrapolate angular velocities. It also should be said that we chose an integer number of sample times for the delay because we need to include the pitch and roll at the delayed time with the filtered angular velocities. By saving these numbers from two samples back, we can align the inertial data set at the delayed time. We use these pitch and roll angles from two samples earlier in order to rotate the filtered angular velocities into the direction of the true vertical (this is not just to take the projection of ω in the vertical direction!). We then integrate the result to get an angle θ, which is the orientation within a horizontal x-y plane.

The reading of θ is then used in the prediction phase of the CEKF. There we use an angle defined with respect to the robot frame, so that the θ, pitch and roll angles are used to calculate a rotation about the robot's own z-axis. The pose coordinate x4 is then defined as the rotation about the x-axis, followed by x3 about the y-axis and then x2 about the z-axis. This is the opposite order of rotations from the sensor itself. The advantage of the change is that the laser scanner has the same axis as x2.
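As a small check of Eqs. (4) and (5), the following sketch (our own, with a = 0.6 as in the text) integrates the filtered response to a unit impulse of angular velocity and reproduces θ_k = 1 − a^(k+1):

    a = 0.6  # filter constant chosen in the text

    def low_pass(w_bar_prev, w_new):
        """One step of Eq. (4): filtered = a*previous + (1 - a)*new."""
        return a * w_bar_prev + (1.0 - a) * w_new

    # Unit impulse of angular velocity at k = 0; integrate the filtered velocity.
    w_bar, theta = 0.0, 0.0
    for k in range(8):
        w_bar = low_pass(w_bar, 1.0 if k == 0 else 0.0)
        theta += w_bar
        print(k, round(theta, 4), round(1 - a ** (k + 1), 4))  # columns agree
    # theta passes 0.9 between k = 3 and k = 4, i.e. after about 3.5 samples.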
This makes the implementation of the CEKF slightly easier, as one need only transform the map points by a projection of the 3-D rotation onto a tilted 2-D plane. The problem then looks like a 2-D SLAM problem with a strange rotation matrix:

    [ cos x2 sec x3 + A sin x2     −sin x2 sec x4 ]
    [ sin x2 sec x3 − A cos x2      cos x2 sec x4 ]      where A = tan x3 tan x4.

Changes in the angle x2 will be the same as the rotations in the odometry model; thus, the change in x2 can be directly measured by odometry. The assumption is that the tilt angle from the inertial sensor is good enough. Although the tilt is not completely accurate, estimating it based on its small coupling to wall distance and angle would likely make the Kalman filter behave badly; one could easily see the filter generating steep tilt angles to try to explain the measurement errors. On the other hand, incorporating the tilt as directly given by the sensor will tend to make the errors smaller and the filter better behaved in environments with significant slopes. The variance in the incremental change in robot orientation, Δθ, as measured over an interval Δt, is taken to grow linearly with Δt. This is consistent with attempts to measure the error directly, tempered by the CEKF tuning.
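In code, the projected rotation might look as follows (our reconstruction of the matrix above; the signs depend on conventions the extraction of the paper does not fully pin down). For x3 = x4 = 0 it reduces to an ordinary 2-D rotation:

    import numpy as np

    def tilted_plane_rotation(x2, x3, x4):
        """2-D rotation-like matrix for map points in the tilted laser plane.
        x2: rotation about the robot z-axis, x3: about y, x4: about x."""
        A = np.tan(x3) * np.tan(x4)
        c2, s2 = np.cos(x2), np.sin(x2)
        sec3, sec4 = 1.0 / np.cos(x3), 1.0 / np.cos(x4)
        return np.array([[c2 * sec3 + A * s2, -s2 * sec4],
                         [s2 * sec3 - A * c2,  c2 * sec4]])

    # Sanity check: with zero pitch and roll this is a plain rotation by x2.
    assert np.allclose(tilted_plane_rotation(0.3, 0.0, 0.0),
                       [[np.cos(0.3), -np.sin(0.3)], [np.sin(0.3), np.cos(0.3)]])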

The variance in the inertial sensor's x2 is then:

    σ²_inertial(x2) = σ²_post(θ) + σ²_inertial(Δθ)        (6)

    σ²_inertial(Δθ) = k1 Δt + k2  rad²                     (7)

where k1 and k2 are constants set by tuning, σ²_post(θ) is the error in x2 after the last update step, and Δt is the time interval since that update in seconds. (The constant term k2 may bother some, but the behaviour at Δt = 0 is not important; this and the odometry error are adequately approximated by the formulas in the regions that matter.) This assumes that the error is uncorrelated over time. This can be compared with the error in the odometry angle, which is taken as:

    σ²_odometry(x2) = σ²_post(θ) + Δs f(r) |Δθ|            (8)

    f(r) = 0.5           for r < 1/3
         = 1/(40 r²)     for 1/3 ≤ r ≤ 5                   (9)
         = 0.001         for r > 5

where r = Δs/Δθ is the turning radius and Δs is the linear distance traveled. This model is empirical. For small values of r the formula gives a far too conservative estimate of the error, but this is used to effectively turn off the odometry estimate and use only the inertial readings in this case. Thus the poor error estimate does not materially affect the final turning error, as shown below. One sees that the error from the inertial sensor is proportional to the time interval (a constant 200 msec in our case), while the odometry error varies like the change in angle. Therefore, when standing still or moving straight, the odometry error will be much smaller than the inertial error. When turning, on the other hand, the inertial error will be smaller. We use a weighted average of these two measurements based on this observation:

    Δx2 = w Δθ_odometry + (1 − w) Δθ_inertial              (10)

    w = (σ²_inertial(θ) − σ²_post(θ)) / A                  (11)

    A = σ²_inertial(θ) + σ²_odometry(θ) − 2 σ²_post(θ)     (12)

This gives us a smooth switching between odometry and inertial sensor estimates. When the robot is moving straight, w will be close to 1; while turning rapidly it will be close to 0. Now note that the estimates of the incremental changes in x2 are independent, but the post values are the same (100% correlated). Thus the variance in x2 is given by:

    σ²_22 = w² σ²_odometry(θ) + (1 − w)² σ²_inertial(θ) + 2w(1 − w) σ²_post(θ)   (13)

Finally, the variance in the odometry estimate of the distance traveled is given by

    σ²_odometry(Δs) = k_s Δs                               (14)

with k_s a constant. This rather simple model worked well. The geometry of the robot path between updates then determines how the variances in the incremental changes in distance and angle generate the full 3 x 3 covariance matrix and the other matrices needed for the CEKF update, in the usual way.

Fig. 2. Here is an illustration of the benefit of adding inertial sensor data to our odometry data. The robot begins at the origin, and the true path can be taken as the SLAM path, the lowest in the figure. The path obtained by fusing inertial data with the odometer data, shown in the middle curve, is clearly much closer to the true path than the odometer data alone, shown in the top curve.

In figure 2 we show the comparison of the dead-reckoning with and without the inertial data being used. The dotted line is the path using the SLAM algorithm and can be used as a true path for this test. The path is clearly much improved by using the inertial sensor. Besides the benefit to dead-reckoning, the inertial sensor gives us valuable pitch and roll information with which to correct the laser scans. Without those corrections, outdoor SLAM cannot be done, at least if one is not restricted to level outdoor areas.
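A compact sketch of this blending step (ours; Eqs. (10)-(13), with the weight clamped for safety) could look like:

    def fuse_heading(dth_odo, dth_inert, var_odo, var_inert, var_post):
        """Blend odometric and inertial heading increments, Eqs. (10)-(13).
        var_post is the shared, 100%-correlated part of both variances."""
        A = var_inert + var_odo - 2.0 * var_post           # Eq. (12)
        w = (var_inert - var_post) / A                     # Eq. (11)
        w = min(max(w, 0.0), 1.0)                          # guard (our addition)
        dx2 = w * dth_odo + (1.0 - w) * dth_inert          # Eq. (10)
        var22 = (w**2 * var_odo + (1.0 - w)**2 * var_inert
                 + 2.0 * w * (1.0 - w) * var_post)         # Eq. (13)
        return dx2, var22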
V. MAP REPRESENTATION

We chose to represent the map as a collection of points; a wall is then given by its two endpoints. This representation has a few drawbacks, which turn out to be outweighed by its advantages. A popular alternative way to represent walls in the EKF is as an angle and a distance from the origin. This at first seems better, as the number of dimensions is cut in half, two per wall versus four. Also, the measurements of walls consist of the angle and distance to the wall; there is no need to represent dimensions that are not observed. The main advantage of our representation of walls is that the last sentence above does not apply. We do extract the start or end angle of the wall from the scan when possible and use it in the Kalman filter update. Although most of the lines do not allow positive identification of the endpoint, the few that do add valuable information to the map. We were able to run through our data samples with and without endpoint detection, and in each case the endpoints made a significant improvement.
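For the endpoint representation, the (distance, angle) of a wall, which is what a laser measurement of it constrains, must be derived from the two stored points. A minimal sketch of this conversion (a hypothetical helper, not the paper's code):

    import numpy as np

    def wall_to_range_bearing(p1, p2):
        """Predicted (distance, normal angle) of the line through endpoints
        p1 and p2, expressed in the same frame as the points."""
        d = p2 - p1
        phi = np.arctan2(d[1], d[0]) + np.pi / 2.0       # wall normal direction
        rho = p1 @ np.array([np.cos(phi), np.sin(phi)])  # signed distance to origin
        if rho < 0.0:                                    # keep rho >= 0: flip normal
            rho, phi = -rho, phi + np.pi
        return rho, phi

Because both endpoints enter this prediction, a two-value (rho, phi) measurement updates four map coordinates, as noted in the text.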

One could of course add corners and door openings as separate features, but one then needs to somehow force the corners to stay on the walls. With our representation the corners are just two walls sharing an endpoint. An observation of one of the walls automatically changes the other wall through the movements of the common point; everything happens inside the Kalman filter. We do end up with a calculation penalty, as most of our measurements end up being two values which then change four map coordinates. Thanks to the time saving of the compressed filter we can afford a bit of extra work, and we still maintain a real-time SLAM implementation.

VI. EXPLORE

Having SLAM working, we began to work on the autonomous explore behavior. We first needed a reliable obstacle avoidance behavior. Obstacles were detected and avoided using the laser scanner and the sonar. Each of the points detected by the sonar and the laser became a source for a repulsive vector field. As the laser can only see 180 degrees in front of the robot, we needed to fill in the backside by remembering prior scan data adjusted by odometry data. We thus store 360 scan points, with the actual last scan as the first 181 and the remembered scans filling out the back. We do the same for the sonar, trying to maintain a 360-degree scan that we think is clear of obstacles. We then calculate a vector force from each scan element pointing away from the scan point, with a magnitude that grows as the scan distance r (in cm) shrinks. One should note that as an object gets closer, the number of scan points on it increases like 1/r, so the total force from an object grows correspondingly faster. This effectively sets a 90 cm radius shell about the robot. The summed forces from the laser and sonar are added, with the laser given a 5 times higher weight. This, added to a constant vector pointing to the goal point, gives the direction command to the motors. The explore module then need only provide suitable goals that will keep the robot out of trouble while covering the region.

Our algorithm for doing this was simple and robust. The final explore path will not be optimal in the sense of covering the region in the shortest time, or even of covering the whole region, but it works fairly well and is a good start for further work. We start by doing a fat scan in all directions; that is, we use the laser scan data to search for a 3 m wide open path. We then choose the longest such fat scan direction and place an endpoint there. We then form an edge as a line between our current position and the endpoint. All subsequent scans and pose data will be transformed to the edge coordinate system, which has its origin at the starting point and its y-axis towards the endpoint. This gives us an easy way to tell if a scan is to the right of our path (x > 0) and how far along the path it is (the y value). This is illustrated in figure 3.

Fig. 3. This is an illustration of our autonomous exploration algorithm. The dotted lines represent the width of the fat scan taken at A and used to set up the first edge AB of the explore. Upon reaching point C the robot detects an opening on its right. A new fat scan is done, and the longest forward, back and right scans are used to form the choices for continued exploration.
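A sketch of this vector-field steering is given below. The per-point magnitude used here (shell/r) is our stand-in, since the exact fall-off did not survive extraction; the 5x laser weighting and the constant goal vector follow the text:

    import numpy as np

    def steering_direction(laser_pts, sonar_pts, goal_dir, w_laser=5.0, shell=90.0):
        """Sum repulsive forces from scan points (obstacle positions relative
        to the robot, in cm) with a constant attraction towards the goal."""
        def repulsion(points):
            f = np.zeros(2)
            for p in points:
                r = np.linalg.norm(p)
                if r > 1e-6:
                    f -= (p / r) * (shell / r)  # away from the point, growing near it
            return f
        force = w_laser * repulsion(laser_pts) + repulsion(sonar_pts) + goal_dir
        return force / max(np.linalg.norm(force), 1e-9)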
enough the explore stops the robot for a decision This is what has happened at point C where the robot was unable to continue the wall on its right any further The robot then must enumerate the choices from the current position This is done by rotating the robot 360 degrees and doing a fat scan at each angle The longest fat scan in each of 6 60 sectors is saved These are then further reduced to 4 representing the choices forward, backward, right and left There may be less than four choices available as very short fat scans are not considered In the figure the robot has found three choices, straight, right and back The robot then must decide among these choices and then set a new edge to travel along The problem then becomes that of making the decision We do not have the space here to describe in details our decision algorithm but we will give the main outline The explore has a state that determines the decision So that he may be in a right turn state in which case he will turn right if he can otherwise go straight an so on There are actually 9 states After making a decision the state changes at random according to a probability distribution that depends on the current state The probabilities were chosen to favor making complete circuits of buildings So that the probability of making four lefts in a row is higher than alternating left then right By adding a random element to the decision we avoid most of the trapped in a loop type problems Since we maintain no global explore graph or grid we have no way of knowing if we are going over the same area again The explore is reactive behavior with almost no state information at all One could easily imagine saving the coordinates of points

One could easily imagine saving the coordinates of points such as A and C in the figure, then forming a graph with these decision points as nodes and the robot path as edges. One could then store the unexplored choices at each node after making a decision. This could be formed into a structure that could be used to find unexplored edges and to eventually plan paths through the environment.

VII. EXPERIMENTS

We were fortunate to have access to a military urban warfare training facility outside of Stockholm. There we have a mock town with 18 mock buildings arranged along several streets. There are various vehicles and debris lying in the streets as well. The road surfaces are unpaved, complete with large ruts, holes and rocks. The road is nowhere level, and the slopes can be steep enough for the robot to start to slide sideways. To tune and test our SLAM program we used a few large data sets taken by driving the robot by hand. We drove the robot through a large, nearly closed loop in the area, and the results are shown in Figure 4. This is a typical result from our SLAM tests in the area. The dotted lines show the survey maps of the area and the solid lines are the result of the SLAM.

Fig. 4. This is an example of a typical SLAM test at the mock town. The terrain behind the buildings where we start is very rough, with the pitch and roll angles reaching maximums of 30 and 10 degrees, respectively. The results show that we can build quite accurate maps over the difficult terrain.

Fig. 5. This is an example of a small explore test at the mock town. This is the real-time data saved at the time of the explore.

We did some smaller tests of our explore behavior as well, shown in Figure 5. Here the robot was given a rectangle to explore, which consisted of a length of the street, and told to explore. The result was a nice path and a good map. We also drove the robot outside of our lab, Figure 6. This building is about 70 m square with fences around most of it. The robot was taken out to a spot in the parking lot between the fence and the building. The robot was then told to explore a large rectangular region chosen to enclose the entire building. The robot then drove the path shown, while simultaneously calculating its path and the map as shown.

Fig. 6. This is an example of autonomous exploration and map making. The dotted lines represent the actual building. The solid lines are the walls, fences and cars that the robot built into a map. The robot path is shown as the continuous curve encircling the building. This is also the real-time data saved at the time of the explore. Despite the errors, the robot maintained a good enough idea of its pose to carry out a complete exploration. Distances are in meters.

At two points human intervention was required. Once, the robot was heading for a patch of grass that would be torn up by its wheels; a human stood in its way to prevent that. At another point the robot had trouble seeing a low bench with its sonar and drove too close to get away. It was driven by hand a few feet and told to resume exploring. As one can see, the map has two points where errors occurred.

These are shown by the arrows starting from the robot and pointing to the feature that caused the trouble. The first point was just before turning the first corner; there, the position estimate slid backwards by about 2 m. This turned out to be due to a fence that the robot could see through and which was partially hidden by vegetation. This was sometimes seen as being closer and sometimes further away. We could eliminate the poor observations by making the feature threshold a bit tighter, but this caused some good measurements to be lost, resulting in a worse map. The second place is similar and occurred before the last turn. There the robot was traveling down a slope and the laser scanner was seeing the ground, which looked much like a wall. Once the ground was put on the map as a wall, the robot tried to localize with it. As the robot moved down the slope, the wall seemed to move as well, which was interpreted as pose errors. These problems are being investigated further to see how best to eliminate them. Nevertheless, the test was not disappointing, as it simply shows the limitations of our SLAM algorithm. The explore behaviour worked well despite the errors in the localization. Our SLAM worked very well in the rather artificial environment of the mock town but could not fully cope with the variation in features in the more real-world environment outside our lab.

CONCLUSIONS

We implemented one of the first systems that actually uses real-time SLAM in the execution of an autonomous behaviour on a mobile robot: explore. The special problems associated with doing SLAM outdoors were largely overcome by the inclusion of the pitch and roll angles in the estimates. The SLAM implementation uses a compressed filter and walls as features. This favored a novel structure for the map, as a chain of local maps at different times. The explore algorithm was simple and reactive, which has the advantage of being robust and the disadvantage of not having any sort of global knowledge that could guarantee complete coverage. The results show that large maps in difficult terrain can be made accurately in some cases when the robot is driven by hand. Also, smaller explore regions in clean environments were handled nicely. The limit was reached with a large explore in a complex, real outdoor environment. There the explore worked well, but the extraction and matching of walls had some errors which prevented closing the large loop. This demonstrates the biggest problem with our SLAM method: data association. We have been able to filter out some of the mistakes in data association since this experiment was done; however, in the real world there will always be situations that cause bad features to be used. We have seen problems with laser reflections from fences appearing to come from behind the fence. We have seen buses added to the map, only to cause confusion when they began to slowly drive away. People can look like trees, hills like walls, and so on. The weakness of the method is that once such an error occurs, the map is distorted in ways that do not fit the error model and cannot be corrected without going back in time and rerunning the algorithm without the mistake. Even that would require that one could locate the problem feature. What is needed is a method that automatically detects these problems and corrects them. This is an area for future work.

ACKNOWLEDGMENT

This research has been sponsored by the Swedish Defence Materiel Administration and the Swedish Foundation for Strategic Research through the Centre for Autonomous Systems. The funding is gratefully acknowledged.
VIII. REFERENCES

[1] P. Jensfelt, "Approaches to mobile robot localization in indoor environments," Ph.D. thesis, Royal Institute of Technology, Sweden, 2001.
[2] J. Leonard and H.F. Durrant-Whyte, "Mobile robot localization by tracking geometric beacons," IEEE Transactions on Robotics and Automation, vol. 7, no. 3, June 1991.
[3] R. Chatila and J.-P. Laumond, "Position referencing and consistent world modeling for mobile robots," in Proc. of the IEEE International Conference on Robotics and Automation (ICRA), 1985, vol. 1.
[4] P. Newman, J. Leonard, J.D. Tardós, and J. Neira, "Explore and return: Experimental validation of real-time concurrent mapping and localization," in Proc. of the IEEE International Conference on Robotics and Automation (ICRA02), 2002, vol. 1.
[5] M.W.M. Gamini Dissanayake, P. Newman, S. Clark, H.F. Durrant-Whyte, and M. Csorba, "A solution to the simultaneous localization and map building (SLAM) problem," IEEE Transactions on Robotics and Automation, vol. 17, no. 3, June 2001.
[6] J.E. Guivant and E.M. Nebot, "Optimization of the simultaneous localization and map-building algorithm for real-time implementation," IEEE Transactions on Robotics and Automation, vol. 17, no. 3, June 2001.
[7] J. Forsberg, P. Åhman, and Å. Wernersson, "The Hough transform inside the feedback loop of a mobile robot," in Proc. of the IEEE International Conference on Robotics and Automation (ICRA93), 1993, vol. 1.


Matrices. Chapter Matrix A Mathematical Definition Matrix Dimensions and Notation Chapter 7 Introduction to Matrices This chapter introduces the theory and application of matrices. It is divided into two main sections. Section 7.1 discusses some of the basic properties and operations

More information

FACOLTÀ DI INGEGNERIA DELL INFORMAZIONE ELECTIVE IN ROBOTICS. Quadrotor. Motion Planning Algorithms. Academic Year

FACOLTÀ DI INGEGNERIA DELL INFORMAZIONE ELECTIVE IN ROBOTICS. Quadrotor. Motion Planning Algorithms. Academic Year FACOLTÀ DI INGEGNERIA DELL INFORMAZIONE ELECTIVE IN ROBOTICS Quadrotor Motion Planning Algorithms Prof. Marilena Vendittelli Prof. Jean-Paul Laumond Jacopo Capolicchio Riccardo Spica Academic Year 2010-2011

More information

EKF Localization and EKF SLAM incorporating prior information

EKF Localization and EKF SLAM incorporating prior information EKF Localization and EKF SLAM incorporating prior information Final Report ME- Samuel Castaneda ID: 113155 1. Abstract In the context of mobile robotics, before any motion planning or navigation algorithm

More information

Orientation Capture of a Walker s Leg Using Inexpensive Inertial Sensors with Optimized Complementary Filter Design

Orientation Capture of a Walker s Leg Using Inexpensive Inertial Sensors with Optimized Complementary Filter Design Orientation Capture of a Walker s Leg Using Inexpensive Inertial Sensors with Optimized Complementary Filter Design Sebastian Andersson School of Software Engineering Tongji University Shanghai, China

More information

Chapter 9 Object Tracking an Overview

Chapter 9 Object Tracking an Overview Chapter 9 Object Tracking an Overview The output of the background subtraction algorithm, described in the previous chapter, is a classification (segmentation) of pixels into foreground pixels (those belonging

More information

A Relative Mapping Algorithm

A Relative Mapping Algorithm A Relative Mapping Algorithm Jay Kraut Abstract This paper introduces a Relative Mapping Algorithm. This algorithm presents a new way of looking at the SLAM problem that does not use Probability, Iterative

More information

2D and 3D Transformations AUI Course Denbigh Starkey

2D and 3D Transformations AUI Course Denbigh Starkey 2D and 3D Transformations AUI Course Denbigh Starkey. Introduction 2 2. 2D transformations using Cartesian coordinates 3 2. Translation 3 2.2 Rotation 4 2.3 Scaling 6 3. Introduction to homogeneous coordinates

More information

Orthogonal SLAM: a Step toward Lightweight Indoor Autonomous Navigation

Orthogonal SLAM: a Step toward Lightweight Indoor Autonomous Navigation Orthogonal SLAM: a Step toward Lightweight Indoor Autonomous Navigation Viet Nguyen, Ahad Harati, Agostino Martinelli, Roland Siegwart Autonomous Systems Laboratory Swiss Federal Institute of Technology

More information

Improving autonomous orchard vehicle trajectory tracking performance via slippage compensation

Improving autonomous orchard vehicle trajectory tracking performance via slippage compensation Improving autonomous orchard vehicle trajectory tracking performance via slippage compensation Dr. Gokhan BAYAR Mechanical Engineering Department of Bulent Ecevit University Zonguldak, Turkey This study

More information

Graphics and Interaction Transformation geometry and homogeneous coordinates

Graphics and Interaction Transformation geometry and homogeneous coordinates 433-324 Graphics and Interaction Transformation geometry and homogeneous coordinates Department of Computer Science and Software Engineering The Lecture outline Introduction Vectors and matrices Translation

More information

Robotics. Lecture 7: Simultaneous Localisation and Mapping (SLAM)

Robotics. Lecture 7: Simultaneous Localisation and Mapping (SLAM) Robotics Lecture 7: Simultaneous Localisation and Mapping (SLAM) See course website http://www.doc.ic.ac.uk/~ajd/robotics/ for up to date information. Andrew Davison Department of Computing Imperial College

More information

COMP30019 Graphics and Interaction Transformation geometry and homogeneous coordinates

COMP30019 Graphics and Interaction Transformation geometry and homogeneous coordinates COMP30019 Graphics and Interaction Transformation geometry and homogeneous coordinates Department of Computer Science and Software Engineering The Lecture outline Introduction Vectors and matrices Translation

More information

Correcting INS Drift in Terrain Surface Measurements. Heather Chemistruck Ph.D. Student Mechanical Engineering Vehicle Terrain Performance Lab

Correcting INS Drift in Terrain Surface Measurements. Heather Chemistruck Ph.D. Student Mechanical Engineering Vehicle Terrain Performance Lab Correcting INS Drift in Terrain Surface Measurements Ph.D. Student Mechanical Engineering Vehicle Terrain Performance Lab October 25, 2010 Outline Laboratory Overview Vehicle Terrain Measurement System

More information

Integrated Laser-Camera Sensor for the Detection and Localization of Landmarks for Robotic Applications

Integrated Laser-Camera Sensor for the Detection and Localization of Landmarks for Robotic Applications 28 IEEE International Conference on Robotics and Automation Pasadena, CA, USA, May 19-23, 28 Integrated Laser-Camera Sensor for the Detection and Localization of Landmarks for Robotic Applications Dilan

More information

(1) and s k ωk. p k vk q

(1) and s k ωk. p k vk q Sensing and Perception: Localization and positioning Isaac Sog Project Assignment: GNSS aided INS In this project assignment you will wor with a type of navigation system referred to as a global navigation

More information

Linear algebra deals with matrixes: two-dimensional arrays of values. Here s a matrix: [ x + 5y + 7z 9x + 3y + 11z

Linear algebra deals with matrixes: two-dimensional arrays of values. Here s a matrix: [ x + 5y + 7z 9x + 3y + 11z Basic Linear Algebra Linear algebra deals with matrixes: two-dimensional arrays of values. Here s a matrix: [ 1 5 ] 7 9 3 11 Often matrices are used to describe in a simpler way a series of linear equations.

More information

Lab 10 - GEOMETRICAL OPTICS

Lab 10 - GEOMETRICAL OPTICS L10-1 Name Date Partners OBJECTIVES OVERVIEW Lab 10 - GEOMETRICAL OPTICS To examine Snell s Law. To observe total internal reflection. To understand and use the lens equations. To find the focal length

More information

International Symposium Non-Destructive Testing in Civil Engineering (NDT-CE) September 15-17, 2015, Berlin, Germany

International Symposium Non-Destructive Testing in Civil Engineering (NDT-CE) September 15-17, 2015, Berlin, Germany More Info at Open Access Database www.ndt.net/?id=18355 Effect of Surface Unevenness on In Situ Measurements and Theoretical Simulation in Non-Contact Surface Wave Measurements Using a Rolling Microphone

More information

L17. OCCUPANCY MAPS. NA568 Mobile Robotics: Methods & Algorithms

L17. OCCUPANCY MAPS. NA568 Mobile Robotics: Methods & Algorithms L17. OCCUPANCY MAPS NA568 Mobile Robotics: Methods & Algorithms Today s Topic Why Occupancy Maps? Bayes Binary Filters Log-odds Occupancy Maps Inverse sensor model Learning inverse sensor model ML map

More information

Uncertainties: Representation and Propagation & Line Extraction from Range data

Uncertainties: Representation and Propagation & Line Extraction from Range data 41 Uncertainties: Representation and Propagation & Line Extraction from Range data 42 Uncertainty Representation Section 4.1.3 of the book Sensing in the real world is always uncertain How can uncertainty

More information

Integrated Sensing Framework for 3D Mapping in Outdoor Navigation

Integrated Sensing Framework for 3D Mapping in Outdoor Navigation Integrated Sensing Framework for 3D Mapping in Outdoor Navigation R. Katz, N. Melkumyan, J. Guivant, T. Bailey, J. Nieto and E. Nebot ARC Centre of Excellence for Autonomous Systems Australian Centre for

More information

University of Technology Building & Construction Department / Remote Sensing & GIS lecture

University of Technology Building & Construction Department / Remote Sensing & GIS lecture 5. Corrections 5.1 Introduction 5.2 Radiometric Correction 5.3 Geometric corrections 5.3.1 Systematic distortions 5.3.2 Nonsystematic distortions 5.4 Image Rectification 5.5 Ground Control Points (GCPs)

More information

EE565:Mobile Robotics Lecture 2

EE565:Mobile Robotics Lecture 2 EE565:Mobile Robotics Lecture 2 Welcome Dr. Ing. Ahmad Kamal Nasir Organization Lab Course Lab grading policy (40%) Attendance = 10 % In-Lab tasks = 30 % Lab assignment + viva = 60 % Make a group Either

More information

Real-time Obstacle Avoidance and Mapping for AUVs Operating in Complex Environments

Real-time Obstacle Avoidance and Mapping for AUVs Operating in Complex Environments Real-time Obstacle Avoidance and Mapping for AUVs Operating in Complex Environments Jacques C. Leedekerken, John J. Leonard, Michael C. Bosse, and Arjuna Balasuriya Massachusetts Institute of Technology

More information

Sensor Modalities. Sensor modality: Different modalities:

Sensor Modalities. Sensor modality: Different modalities: Sensor Modalities Sensor modality: Sensors which measure same form of energy and process it in similar ways Modality refers to the raw input used by the sensors Different modalities: Sound Pressure Temperature

More information

Simulation of a mobile robot with a LRF in a 2D environment and map building

Simulation of a mobile robot with a LRF in a 2D environment and map building Simulation of a mobile robot with a LRF in a 2D environment and map building Teslić L. 1, Klančar G. 2, and Škrjanc I. 3 1 Faculty of Electrical Engineering, University of Ljubljana, Tržaška 25, 1000 Ljubljana,

More information

CS4758: Rovio Augmented Vision Mapping Project

CS4758: Rovio Augmented Vision Mapping Project CS4758: Rovio Augmented Vision Mapping Project Sam Fladung, James Mwaura Abstract The goal of this project is to use the Rovio to create a 2D map of its environment using a camera and a fixed laser pointer

More information

Localization, Mapping and Exploration with Multiple Robots. Dr. Daisy Tang

Localization, Mapping and Exploration with Multiple Robots. Dr. Daisy Tang Localization, Mapping and Exploration with Multiple Robots Dr. Daisy Tang Two Presentations A real-time algorithm for mobile robot mapping with applications to multi-robot and 3D mapping, by Thrun, Burgard

More information

QUASI-3D SCANNING WITH LASERSCANNERS

QUASI-3D SCANNING WITH LASERSCANNERS QUASI-3D SCANNING WITH LASERSCANNERS V. Willhoeft, K. Ch. Fuerstenberg, IBEO Automobile Sensor GmbH, vwi@ibeo.de INTRODUCTION: FROM 2D TO 3D Laserscanners are laser-based range-finding devices. They create

More information

Coupled sonar inertial navigation system for pedestrian tracking

Coupled sonar inertial navigation system for pedestrian tracking Coupled sonar inertial navigation system for pedestrian tracking Hichem El Mokni*, Lars Broetje*, Felix Govaers*, Monika Wieneke* * Sensor Data- and Information Fusion Fraunhofer-Institut für Kommunikation,

More information

CS4758: Moving Person Avoider

CS4758: Moving Person Avoider CS4758: Moving Person Avoider Yi Heng Lee, Sze Kiat Sim Abstract We attempt to have a quadrotor autonomously avoid people while moving through an indoor environment. Our algorithm for detecting people

More information

Humanoid Robotics. Monte Carlo Localization. Maren Bennewitz

Humanoid Robotics. Monte Carlo Localization. Maren Bennewitz Humanoid Robotics Monte Carlo Localization Maren Bennewitz 1 Basis Probability Rules (1) If x and y are independent: Bayes rule: Often written as: The denominator is a normalizing constant that ensures

More information

Kinematics of Machines Prof. A. K. Mallik Department of Mechanical Engineering Indian Institute of Technology, Kanpur. Module 10 Lecture 1

Kinematics of Machines Prof. A. K. Mallik Department of Mechanical Engineering Indian Institute of Technology, Kanpur. Module 10 Lecture 1 Kinematics of Machines Prof. A. K. Mallik Department of Mechanical Engineering Indian Institute of Technology, Kanpur Module 10 Lecture 1 So far, in this course we have discussed planar linkages, which

More information

Unit 1, Lesson 1: Moving in the Plane

Unit 1, Lesson 1: Moving in the Plane Unit 1, Lesson 1: Moving in the Plane Let s describe ways figures can move in the plane. 1.1: Which One Doesn t Belong: Diagrams Which one doesn t belong? 1.2: Triangle Square Dance m.openup.org/1/8-1-1-2

More information

DEALING WITH SENSOR ERRORS IN SCAN MATCHING FOR SIMULTANEOUS LOCALIZATION AND MAPPING

DEALING WITH SENSOR ERRORS IN SCAN MATCHING FOR SIMULTANEOUS LOCALIZATION AND MAPPING Inženýrská MECHANIKA, roč. 15, 2008, č. 5, s. 337 344 337 DEALING WITH SENSOR ERRORS IN SCAN MATCHING FOR SIMULTANEOUS LOCALIZATION AND MAPPING Jiří Krejsa, Stanislav Věchet* The paper presents Potential-Based

More information

CMSC 425: Lecture 9 Geometric Data Structures for Games: Geometric Graphs Thursday, Feb 21, 2013

CMSC 425: Lecture 9 Geometric Data Structures for Games: Geometric Graphs Thursday, Feb 21, 2013 CMSC 425: Lecture 9 Geometric Data Structures for Games: Geometric Graphs Thursday, Feb 21, 2013 Reading: Today s materials is presented in part in Computational Geometry: Algorithms and Applications (3rd

More information

An object in 3D space

An object in 3D space An object in 3D space An object's viewpoint Every Alice object has a viewpoint. The viewpoint of an object is determined by: The position of the object in 3D space. The orientation of the object relative

More information