Sensor-Landmark Succession for Motion Planning along a Planned Trajectory

Abed C. Malti, Florent Lamiraux and Michel Taïx
LAAS/CNRS, 7 av. du Colonel Roche, F-31077 Toulouse Cedex 4, France
first-name.lastname@laas.fr

Abstract. In this paper we introduce a strategy to process continuous sensor-landmark sequences along a planned geometric trajectory. At each robot configuration of this planned motion, the robot is able to select the best landmarks, which allows it to cope with the inaccuracy of the reference model used to plan the geometric trajectory. In this way, the executed configuration can be corrected so that it respects the planned trajectory relative to the reference model. Our contribution is to state the problem in a generic formalism and to propose simulations that show the relevance of our resolution approach.

I. INTRODUCTION

Planning a geometric trajectory in a reference map for a robot produces a collision-free continuous path in the robot configuration space [7], [6]. However, the execution of this trajectory in a real environment remains difficult for two main reasons. The first resides in the inaccuracy of the map of the environment used to plan the trajectory; the second is that navigation in cluttered environments requires precise localization. Many approaches have been proposed to solve these two problems. Some authors proposed to localize the robot along the trajectory with respect to local landmarks [14], [12], [11]; others proposed reactive methods to avoid unexpected obstacles [9], [10], [5], [1]. Path planning with uncertainty for mobile robots has been addressed in [4], [2], [3] to produce safe path planners.

In our work, we aim at introducing sensor-landmark constraints along a geometric trajectory. In previous work [8] we introduced a generic approach to correct a planned geometric trajectory. At a given planned configuration, we described how the robot computes a relative local localization configuration in its map model, based on preselected sensor-landmark pairs. Once this configuration is computed, we defined a map reference configuration in the real environment where the difference between the robot's sensor perception in the model and in the real world is minimized. At that step, we assumed the set of sensor-landmark pairs to be known. In this paper, we propose a generic approach to include the choice of landmarks and sensors along a planned trajectory. The idea is to plan sensor-landmark primitives in order to perform sensor-based motion along a trajectory. Instead of planning a trajectory in a first step and following it in a second step, we produce a sequence of sensor-landmark based motions.

In section II we introduce the notations and definitions for the sensor-landmark planned motion. In section III the definition of the cost function for a sensor-landmark pair is presented. In the next section, we use this cost function to automatically compute a sequence of continuous sensor-landmark pairs along a geometric trajectory. Finally, the last section provides simulation results that show the relevance of the proposed approach.

II. SENSOR-LANDMARK PLANNED MOTION

We assume we have a robot with a set of n sensors s_i, i = 1, ..., n. This robot moves in an environment with a set L = {l_1, l_2, ...} of landmarks. To lighten the notations, s represents both the sensor and its configuration; similarly, l represents both the landmark and its configuration. Our reasoning is based on two different spaces. The first one is related to a reference map: the model.
In this model, we have a collision-free planned geometric trajectory; the sensor-landmark based motion is planned in this reference map. The notations related to this space carry a superscript m. The second space is related to the real environment, in which the planned motion is executed. The notations related to this space carry a superscript r. In this section we extend the definitions established in the previous paper to take into account the weight that is assigned to a sensor-landmark pair.

Definition 1: A sensor-landmark planned motion is a continuous trajectory in the robot configuration space, defined as:

$$ \mathcal{M} : \mathcal{C} \rightarrow (S \times L) \times \mathbb{R}^{+}, \qquad q \mapsto ((S, L), W) $$

where the planned trajectory is defined as:

$$ \gamma : [0, T] \rightarrow \mathcal{C}, \qquad \tau \mapsto q = \gamma(\tau) $$

with:

$$ W = (w(s_i, l_1), w(s_j, l_2), \ldots, w(s_k, l_p))^t, \qquad i, j, k \in [1, n], $$

and T the maximum range parameter of the trajectory.
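As a reading aid, the following sketch shows one way Definition 1 could be represented in code. It is our own illustration in Python; all names (WeightedPair, SensorLandmarkMotion, pairs_at) are hypothetical and are not taken from the paper's software.

```python
# A possible in-code representation of Definition 1 (illustrative only).
from dataclasses import dataclass
from typing import Callable, List, Tuple

Configuration = Tuple[float, ...]  # a configuration q in the space C

@dataclass
class WeightedPair:
    sensor: str      # identifier of s_i
    landmark: str    # identifier of l_k
    weight: float    # w(s_i, l_k) >= 0, continuous along the trajectory

@dataclass
class SensorLandmarkMotion:
    gamma: Callable[[float], Configuration]          # planned trajectory: [0, T] -> C
    pairs_at: Callable[[float], List[WeightedPair]]  # tau -> ((S, L), W) at q = gamma(tau)
    T: float                                         # maximum range parameter
```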

At each configuration q along the geometric planned trajectory, L = {l_1, ..., l_p}, p landmarks are selected and associated to the robot's sensors. Each of the p sensor-landmark pairs has to be weighed. Along the trajectory, the function that gives the weight of a sensor-landmark pair is positive and continuous. The weight attached to a sensor-landmark pair describes the relevance of this primitive given the current robot configuration. It allows us to put emphasis on the most relevant sensor-landmark pairs among the others: for example, when a mobile robot crosses a door, the sensor-door pair is the most relevant one; similarly, in a grasping task, the pair made of the sensor and the object to grasp is the most relevant. The function that gives the values of the weights is presented later, in section III.

In [8], we described a configuration called the localization configuration q_loc^m. Given a set of p sensor-landmark pairs SL = {(s_i, l_1), ..., (s_j, l_p)}, this configuration allows the robot to localize itself in the map model. This localization is carried out by minimizing the squared distance between the sensor images of the preselected p pairs in the real environment and the perception in the model map. A second configuration, q_MapRef^r, allows the robot to correct its configuration in the real environment so that it satisfies the perception in the map model. Before introducing the weighing matrix into the equation, we recall some important notations:

- Y = Ψ(L, q) is the function that gives the images of the p landmarks viewed by the sensors when the robot is in a given configuration.
- Y_e^r is the measured images of the p landmarks in the real environment when the robot is in the executed configuration q_e^r. The dimension of Y_e^r is p·v, where v is the dimension of the image of a landmark l when it is viewed by a sensor s.
- q^m is the desired configuration located in the model map.
- q_e^r is the executed configuration of q^m in the real environment.
- Y_0^m = Ψ(L^m, q^m).

A. Localization

Weighing the sensor-landmark pairs slightly alters the solution, which becomes:

$$ q_{loc}^m(Y^m, Y_e^r) - q^m = \left( W \left. \frac{\partial \Psi}{\partial q} \right|_{(L^m, q^m)} \right)^{+} W \left( Y_e^r - \Psi(L^m, q^m) \right) \qquad (1) $$

where:

$$ W = \begin{pmatrix} w(s_i, l_1) & & \\ & \ddots & \\ & & w(s_k, l_p) \end{pmatrix} $$

with w(s_i, l_k) a block diagonal matrix of values w(s_i, l_k); the dimension of this block is v, the dimension of the image space of the sensor-landmark pair. This equation allows us to compute q_loc^m with respect to Y^m, Y_e^r and q^m.

B. Correction

The goal of the previous paper was to find an expression for q_MapRef^r, the configuration where the robot minimizes the distance between the perception in the map model and the perception in the real world, given a finite set of preselected sensor-landmark pairs. The solution is given as:

$$ q_{MapRef}^r(Y^m, Y_e^r) - q_e^r = -\left( q_{loc}^m(Y^m, Y_e^r) - q^m \right) \qquad (2) $$

This expression means that the correction rate in the real world is the opposite of the localization rate in the map model. It is now possible to modify a geometric trajectory by taking into account sensor-landmark perceptions. The problem that arises is how to compute the weights and how to select sensor-landmark pairs along a planned trajectory.
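To make equations (1) and (2) concrete, here is a minimal numerical sketch, assuming a planar robot q = (x, y, θ) and point landmarks observed in the robot frame (so v = 2). It is our own illustration: Ψ, the numerical Jacobian and the weight values are hypothetical, and the pseudo-inverse (denoted ^+ above) is taken with numpy.linalg.pinv.

```python
# Numerical sketch of equations (1) and (2). Hypothetical setting: a planar
# robot q = (x, y, theta) observing point landmarks in its own frame (v = 2).
import numpy as np

def psi(landmarks, q):
    """Y = Psi(L, q): stacked images of the landmarks seen from configuration q."""
    x, y, th = q
    R = np.array([[np.cos(th), np.sin(th)],
                  [-np.sin(th), np.cos(th)]])      # world -> robot frame
    return np.concatenate([R @ (l - np.array([x, y])) for l in landmarks])

def jacobian(landmarks, q, eps=1e-6):
    """Numerical dPsi/dq evaluated at (L^m, q^m)."""
    J = np.zeros((2 * len(landmarks), 3))
    for j in range(3):
        dq = np.zeros(3); dq[j] = eps
        J[:, j] = (psi(landmarks, q + dq) - psi(landmarks, q - dq)) / (2 * eps)
    return J

def localization_offset(landmarks_m, q_m, Y_e_r, weights):
    """Equation (1): (W dPsi/dq)^+ W (Y_e^r - Psi(L^m, q^m))."""
    W = np.kron(np.diag(weights), np.eye(2))       # block-diagonal weighing matrix (v = 2)
    J = jacobian(landmarks_m, q_m)
    return np.linalg.pinv(W @ J) @ (W @ (Y_e_r - psi(landmarks_m, q_m)))

landmarks_m = [np.array([2.0, 1.0]), np.array([4.0, -1.0]), np.array([3.0, 3.0])]
q_m = np.array([0.5, 0.2, 0.1])
Y_e_r = psi(landmarks_m, q_m + np.array([0.05, -0.03, 0.02]))  # simulated measurement
dq_loc = localization_offset(landmarks_m, q_m, Y_e_r, weights=[1.0, 0.8, 0.5])
dq_map_ref = -dq_loc   # equation (2): the correction is the opposite of the localization offset
```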
III. COST FUNCTION OF A SENSOR-LANDMARK PAIR

In keeping with our goal, which is to define a generic formalism for sensor-landmark based motion, we have to define a generic way to compute the weight attached to the sensor-landmark pairs. This weight has to be positive and continuous along a planned trajectory for each sensor-landmark pair. In our work, we are concerned with the geometric aspect of the landmarks. Moreover, along a geometric planned trajectory we are only interested in the landmarks visible from the robot's sensors in their current configuration.

Definition 2: We call sensor space visibility LV, for a sensor s and a landmark l, the set defined as:

$$ LV = \{\, l_v \;|\; l_v \subseteq l \text{ if } l \text{ is visible from } s,\; l_v = \emptyset \text{ otherwise} \,\} $$

Definition 3: For a sensor-landmark pair (s, l), we call configuration map CMap of a landmark l mapped by the sensor s the mapping defined as:

$$ CMap : L \times S \rightarrow LV, \qquad (l, s) \mapsto l_v = CMap(l, s) $$

The mapping function CMap captures the geometric aspect of a landmark l when it is viewed by a sensor s: l_v is the visible part of l given a configuration of the sensor. The space LV may be different from L, for example when the landmark is defined in a 3D space whereas the sensor maps the environment in a 2D space.
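For the 2D laser case discussed next, CMap can be approximated by clipping a segment to the sensor's scanning sector and carrying range. The sketch below is our own simplification (occlusions between landmarks are ignored, and the visible part is sampled as a point set); the function name and parameters are hypothetical.

```python
# Simplified CMap for a 2D laser and a 2D segment (illustrative only). The
# visible part l_v is approximated by the sample points of l lying inside the
# pi-radian scanning sector and the carrying range; occlusions are ignored.
import numpy as np

def cmap_segment(seg_a, seg_b, sensor_pose, carrying_range=2.0, n_samples=50):
    x, y, th = sensor_pose
    samples = [seg_a + t * (seg_b - seg_a) for t in np.linspace(0.0, 1.0, n_samples)]
    visible = []
    for p in samples:
        d = p - np.array([x, y])
        dist = float(np.hypot(d[0], d[1]))
        bearing = (np.arctan2(d[1], d[0]) - th) % (2 * np.pi)
        if dist < carrying_range and 0.0 < bearing < np.pi:
            visible.append(p)
    return visible   # l_v = CMap(l, s); the empty list plays the role of the empty set
```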

Definition 4: For a sensor-landmark pair (s, l), where l_v = CMap(l, s), the weighing function W_sl is defined as:

$$ W_{sl} : LV \rightarrow [0, 1], \qquad l_v \mapsto w = W_{sl}(l_v) $$

The computation of W_sl takes into account the geometric aspect of the mapped landmark l_v. For a given set of sensor-landmark pairs SL = {(l_1, s_i), (l_2, s_j), ..., (l_p, s_k)}, i, j, k ∈ [1, n], a vector of p weights can be computed and assigned to those pairs. This vector is denoted by:

$$ W = \begin{pmatrix} W_{sl}(CMap(l_1, s_i)) \\ W_{sl}(CMap(l_2, s_j)) \\ \vdots \\ W_{sl}(CMap(l_p, s_k)) \end{pmatrix}, \qquad i, j, k \in [1, n]. $$

The weighing function W_sl must satisfy the following property:

Property 1: W_sl is continuous along a continuous trajectory; W_sl is null out of the sensor range.

The null condition allows the planner to skip landmarks that are out of the sensor range. The continuity condition is imposed to obtain continuous trajectories of the localization configuration q_loc^m and of the correction configuration q_MapRef^r. Finding a generic way to compute W_sl so that, at each point of a geometric trajectory, it ranks the sensor-landmark pairs from the most to the least relevant seems to be a difficult issue, since there are several criteria to take into account. As a first step, we consider only the aspect related to the sensor-landmark pair itself. This aspect is represented by the geometric visibility and the geometric detectability of a landmark when it is viewed by a sensor.

A. Cost of the Geometric Visibility

Definition 5: The cost of the geometric visibility W_visi is defined as:

$$ W_{visi} : LV \rightarrow [0, 1] $$

In order to satisfy Property 1, W_visi must satisfy the following property:

Property 2: W_visi is continuous along a continuous trajectory; W_visi is null out of the sensor range.

For a sensor-landmark pair (s, l), W_visi represents the occupancy of l_v in the sensor range manifold. To clarify this concept, consider the following example. Let us take a 2D laser sensor and a 2D segment as landmark. It clearly appears that the visible part of such a landmark for this kind of sensor is also a 2D segment. Figure 1 shows the visible parts of two segments viewed by the laser sensor. The considered sensor has a carrying distance range of 2 meters and an angular scanning range of π radians.

Fig. 1. The laser sensor considered here has an angular scanning range of π. Two segments l_1 and l_2 are captured by the laser sensor. In this sensor configuration, a part of l_1 is hidden by l_2; l_v1 is the segment part of l_1 which is not hidden by l_2. l_2 is not entirely contained in the sensor range, since the sensor has a scanning range of π radians; l_v2 is the segment part of l_2 contained in the sensor scanning range.

Now, let us compute the expression of the W_visi function for this example. We saw previously that W_visi represents the geometric occupancy of the visible part of the landmark in the sensor map. Since, while sensing the environment, the laser broadcasts rays to get the points of the landmarks from the sensor frame (figure 2), we first define a weighing function related to a point.
Figure 2 presents the product of these two functions for a point varying through the sensor range. Defining those two functions, we give the expression of W visi for a laser2d-segment2d pair as : W visi (l v ) = f θ (p v ) f d (p v ) (3) p v l v The points p v are extracted from l v by considering the intersections between rays scanning from the origin of the sensor frame and the visible segment part (see figure 2). It clearly appears that W sl (l v ) is null out of the sensor range. The prove of the continuously of this function along a continuous geometric planned trajectory is straightforward. If we suppose s = γ(τ) the function that gives the configuration of the sensor given the trajectory parameter τ. l v = CMap(s = γ(τ), l) is continuous since it is the composition of γ(τ) and CMap which are continuous. Then W visi defined in equation (3) is continuous along a continuous planned - -5 5 "weight.dat" 5 2

Fig. 3. (a) The angle made up at the origin of the sensor coordinate frame by the end points o_v^1 and o_v^2 of the visible segment part l_v is used to compute the W_detect function values. The maximum value this angle can take is approximately π; the minimum value it can take is 0 (when the segment landmark is superposed on the broadcast laser ray). (b) The shape of W_detect: the wider the angle β, the better the segment can be recognized by the sensor, and the higher the value of W_detect. In theory the maximum value of β is π, but in practice this value is never reached.

B. Cost of the Geometric Detectability

Definition 6: The cost of the geometric detectability W_detect is defined as:

$$ W_{detect} : LV \rightarrow [0, 1] $$

In order to satisfy Property 1, W_detect must satisfy the following property:

Property 3: W_detect is continuous along a continuous trajectory.

For a sensor-landmark pair (s, l), W_detect evaluates how easily l can be recognized by the sensor. Let us compute W_detect for the previous example. From a geometric point of view, this value depends on the angle β = (o_v^1, o_v^2) made up at the origin of the sensor coordinate frame by the end points o_v^1 and o_v^2 of the visible segment part l_v (see figure 3). The expression of this weight function is given by:

$$ W_{detect}(l_v) = \begin{cases} 0 & \text{if } \beta = 0 \\ w_{detect}^{max} & \text{if } \beta = \pi \\ w_{detect} \in \,]0, w_{detect}^{max}[ & \text{otherwise} \end{cases} $$

Formally, W_detect can be any function that follows the above conditions and satisfies Property 3. Figure 3 shows the variation of W_detect with respect to the angle: the more the angle increases, the more detectable the landmark and the greater the value of W_detect. Finally, W_sl is computed by the following expression:

$$ W_{sl} = W_{visi} \cdot W_{detect} \qquad (4) $$

Up to now, we have seen that, given a configuration of a sensor, we are able to compute the weight of all the landmarks belonging to the sensor range (according to equation (4)). Even if this computation is sub-optimal, because it takes into account only the sensor-landmark aspect, it gives valuable information about the way the landmarks are distributed in the sensor range area. The extension of the computation criteria towards a more optimal W_sl function is future work. Further in this paper, we will see the results obtained with this first implementation of W_sl.
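A matching sketch for W_detect and the combination of equation (4): the linear shape β/π is our own illustrative choice among the functions allowed by Property 3, and the stand-in value for W_visi(l_v) is arbitrary.

```python
# Sketch of the detectability cost and of equation (4). The linear shape
# beta / pi is illustrative only; w_detect^max = 1 here.
import numpy as np

def w_detect(o1_v, o2_v):
    """W_detect(l_v) from the angle beta subtended at the sensor frame origin
    by the end points o1_v and o2_v of the visible segment part l_v."""
    beta = abs(float(np.arctan2(o1_v[1], o1_v[0]) - np.arctan2(o2_v[1], o2_v[0])))
    beta = min(beta, 2.0 * np.pi - beta)        # keep beta in [0, pi]
    return beta / np.pi                          # 0 when beta = 0, maximal as beta -> pi

o1_v, o2_v = np.array([0.5, 0.8]), np.array([1.0, 0.8])
W_visi_value = 0.9                               # stands in for W_visi(l_v) of equation (3)
W_sl = W_visi_value * w_detect(o1_v, o2_v)       # equation (4): W_sl = W_visi * W_detect
```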
IV. SEQUENCING OF SENSOR-LANDMARK PAIRS

As seen before, given a sensor configuration (or robot configuration), we obtain the associated sensor-landmark pairs constructed from the visible landmarks, and we are thus able to compute the corresponding weights. For a robot with n sensors, we have to define the continuous sequencing of the sensor-landmark pairs along a planned trajectory, such that the robot's sensors switch smoothly through the landmarks of the environment. At each configuration of the planned trajectory, each sensor has to select the best p landmarks to use in sensor-landmark pairs as primitives to correct the robot trajectory. A first implementation of the planning algorithm producing a sensor-landmark based motion is given below:

1: {Initialization}
2: r <- Robot.
3: L <- list of the landmarks of the model of the environment.
4: γ <- the geometric planned trajectory of the robot.
5: rangeParam <- the parameter length of γ.
6: p <- the maximum number of sensor-landmark pairs per sequence.
7: for each sensor s of r do
8:   τ <- 0 {initialization}
9:   distParam <- 0
10:  repeat
11:    q_s <- SEN_CONF_AT_PARAM(s, γ, τ).
12:    sl <- LIST_SL(s, q_s, L).
13:    W(sl) <- WEIGHT(sl).
14:    sl <- SELECT_SL(sl, p).
15:    distParam <- MAX_DIST_MOVE(sl, γ, τ).
16:    slp <- CONSTRUCT_SEQUENCE(sl, τ, distParam).
17:    ADD_SLP(s, slp).
18:    τ <- τ + distParam
19:    distParam <- 0
20:  until τ > rangeParam
21: end for

In line 6, the maximum number of sensor-landmark pairs to take into account while executing the sensor-landmark motion (notably for localization and correction) is fixed. For each of the robot's sensors, this algorithm computes a sensor-landmark motion. SEN_CONF_AT_PARAM (line 11) returns the sensor configuration when r is on γ at parameter τ. LIST_SL (line 12) computes the sensor-landmark pairs from all the visible landmarks of L for the current q_s. WEIGHT (line 13) assesses the weights of the sl sensor-landmark list according to equation (4). SELECT_SL (line 14) selects the p pairs of sl that have the greatest weights, or all the pairs of sl if their number is less than p. In line 15, MAX_DIST_MOVE returns the maximum distance by which r can move along γ such that sl remains the set of pairs with the greatest weights, and such that these weights keep the same monotonicity. After this step we obtain a sequence of sensor-landmark pairs along γ from τ to τ + distParam, which is then added to the sensor-landmark planned motion of the current sensor by the function ADD_SLP (line 17). These steps are repeated until the end of the planned geometric trajectory, and the same operation is done for all the sensors of the robot. Finally, we obtain a sensor-landmark based motion that, for each configuration of the robot on the planned trajectory, gives the selected sequence of sensor-landmark pairs for each of the robot's sensors, with their corresponding weights.
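The control flow of this algorithm can be sketched as follows. The helper functions mirror the pseudocode primitives (SEN_CONF_AT_PARAM, LIST_SL, WEIGHT, SELECT_SL, MAX_DIST_MOVE) but are passed in as parameters, since their bodies are application-specific; this is a sketch of the loop structure, not the authors' implementation.

```python
# Sketch of the sequencing loop (lines 7-21 of the pseudocode above).
def plan_sensor_landmark_motion(robot_sensors, landmarks, gamma, range_param, p,
                                sensor_conf_at_param, list_sl, weight, max_dist_move):
    motion = {s: [] for s in robot_sensors}             # one pair sequence per sensor
    for s in robot_sensors:                             # line 7
        tau = 0.0                                       # line 8
        while True:                                     # repeat ... until (lines 10-20)
            q_s = sensor_conf_at_param(s, gamma, tau)         # line 11
            sl = list_sl(s, q_s, landmarks)                   # line 12: visible pairs
            sl = sorted(sl, key=weight, reverse=True)[:p]     # lines 13-14: best p pairs
            dist = max_dist_move(sl, gamma, tau)              # line 15 (assumed > 0)
            motion[s].append((tau, tau + dist, sl))           # lines 16-17: add sequence slp
            tau += dist                                       # line 18
            if tau > range_param:                             # line 20
                break
    return motion
```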

V. SIMULATION RESULTS

This first implementation of the sensor-landmark motion planner is integrated in our software platform, within a generic object-oriented architecture. The simulations are applied to the Hilare-trailer nonholonomic mobile robot, which has four degrees of freedom. A SICK laser sensor is mounted on the front part of the robot; a second, similar sensor is carried by the trailer platform. A library named segkit, developed in our laboratory, is used to simulate the laser sensors of the robot in the real world. The simulation environment is constructed beforehand by the laser sensor from a real environment; the constructed model represents the Space City located in Toulouse, France. The obtained map model is shown in figure 4. In this model, a collision-free geometric trajectory is planned from a start to a goal configuration (see figure 4), using Move3D, a collision-free geometric trajectory planner developed in our laboratory [13]. By selecting for each sensor the three best landmarks along the trajectory, we plan the sensor-landmark motion.

We then introduce some perturbations (in translation and in rotation) into the map model, to obtain a simulated real environment different from the model environment. The execution of the planned trajectory in the real world is not correct (see figure 5): it is in collision while the robot crosses doors, and it does not have the desired shape in relation to the environment. The execution of the sensor-landmark motion (figure 5), planned with the three best landmarks for each sensor, allows the robot to execute a robust motion. During this motion execution, the real sensor perceptions of the environment are simulated thanks to segkit. The comparison of the planned trajectory in figure 4(a) and the executed simulation of the sensor-landmark motion in the real environment in figure 4(b) shows that the robot accommodates the perturbations introduced in the real environment, and the trajectory keeps the same shape in both environments.

Figure 6 shows an enlarged view of the trajectory correction. In this configuration of the planned trajectory, the three best segments associated to the head laser are l_36, l_35, l_43; the back sensor is associated to the segments l_63, l_97, l_65. The corresponding weights for this configuration are presented in figure 7. Figure 8 shows the weights of the sensor-landmark pairs built from l_36, l_43, l_76, l_23 and associated to the head laser, along the whole trajectory. The continuity property is perfectly respected along the trajectory.

Fig. 4. (a) The map model contains 234 segments; the collision-free geometric trajectory is shown in wireframe. (b) The executed sensor-landmark motion (in green) accommodates the perturbations of the environment and keeps the same shape as the planned trajectory in (a).

Fig. 5. The two environments are superposed; the simulated real environment is represented in light color. The execution of the planned trajectory does not respect the reference model constraints, whereas the execution of the planned motion (in green) is more robust.
Fig. 6. Enlarged view of a point of the trajectory. The two environments are superposed. At this point, the head laser is associated to the segments l_36, l_35, l_43, and the back laser to the segments l_63, l_97, l_65.

"head.dat" "back.dat" Wsl Wsl seg36 seg35 seg43 seg63 seg97 seg65 Fig. 7. 7 Shows the instantaneous weights of l 36, l 35, l 43 associated to the head laser for the configuration shown in 6. 7 Shows the instantaneous weights of l 63, l 97, l 65 associated to the back laser for the same configuration. seg36 seg23 seg76 seg43 Wsl (c) (d).5.5 2 2.5 3 parameter Fig. 8. The weights of the sensor-landmark pairs constructed of l 36, l 43, l 76, l 23 associated to the head laser. The case where the weights doesn t begin or end with like for three pair (head laser, segment36) is because the sensor-landmark motion planner skips this landmark and switch to another that has higher weight. the weights doesn t begin or end with like for the pair (head laser, segment36) is because the sensor-landmark motion planner skips this landmark and switch to another that has higher weight. For each sensor the weights are normalized to one. We plan sensor-landmark motion for various number of best selected landmarks. The algorithmic complexity depends on the number of sensors, the number of selected landmarks and the number of landmarks in the environment (in this model, we have 234 segments). The table I shows the time processing for the corresponded number of sensorlandmark pairs. p 2 3 4 5 time(sec) 7 29 36 42 TABLE I PLANNING DURATION WITH RESPECT TO p. In figure 9, we show that for p > 3 the corrected trajectory remains the same as for p = 3 (figure 5). However for p = 2 (figures 9(c) and 9(d)) the trajectory is well corrected only for small changes on the real environment. This happens because while the execution of the planned motion one or more landmarks that a sensor expected to view can be non-visible since the real environment is different Fig. 9. 9sensor-landmark planned motion with p = 4. 9sensorlandmark planned motion with p = 5. 9(c)sensor-landmark planned motion with p = 2. 9(d)sensor-landmark planned motion with p = 2 and smaller perturbations in the real environment. from the map model. This issue open the perspective to the possibility of instead of planning motion with a constant p along the trajectory it will be better to make it variable. This is future work. VI. CONCLUSION In this paper, we have presented a strategy to weigh sensor-landmark pairs in a given robot configuration. Along a geometric planned trajectory we proposed an algorithm to select best sensor-landmark pairs according to their weights and to switch smoothly across them. This enables to transform the geometric trajectory in a continuous sequence of sensor-landmark pairs. This selection approach is generic, indeed, any sensor on any robot and any landmark can be taken into account in our framework. The simulation results show the relevance of the sensorlandmark motion planner we have proposed. To confirm these results, we intend to apply this framework on a real robot, hilare-trailer non-holonomic mobile robot. Further, we project to define criteria that enables to compute the optimal number of sensor-landmark pairs,p, according to the robotic task. REFERENCES [] Kai O. Arras, Jan Persson, Nicola Tomatis, and Roland Siegwart. Real-time obstacle avoidance for polygonal robots with a reduced dyna;ic window. In International Conference on Robotics and Automation, pages 35 355, Washington D.C, May 22. IEEE. [2] T. Siméon B. Bouilly and R. Alami. A numerical technique for planning motion strategies of a mobile robot in presence of uncertainty. In International Conference on Robotics and Automation, pages 327 332, Nagoya, 995. IEEE.

[3] A. Lambert and N. Le Fort-Piat. Safe task planning integrating uncertainties and local maps federations. International Journal of Robotics Research, 19:597-611, 2000.

[4] A. Lambert and Th. Fraichard. Landmark-based safe path planning for car-like robots. In International Conference on Robotics and Automation, San Francisco, CA, April 2000. IEEE.

[5] F. Lamiraux and D. Bonnafous. Reactive trajectory deformation for nonholonomic systems: Application to mobile robots. In International Conference on Robotics and Automation, Washington D.C., May 2002. IEEE.

[6] J.-C. Latombe. Robot Motion Planning. Kluwer Academic Publishers, 1991.

[7] J.-P. Laumond, editor. Robot Motion Planning and Control. Lecture Notes in Control and Information Sciences 229. Springer, 1998.

[8] A. C. Malti, F. Lamiraux, and M. Taïx. Sensor landmark motion planning in mobile robots. In International Conference on Intelligent Robots and Systems (to appear), Sendai, Japan, September 2004. IEEE/RSJ. See also Report LAAS/CNRS N 3567, December 2003.

[9] Javier Minguez, Luis Montano, Thierry Siméon, and Rachid Alami. Global nearness diagram navigation (GND). In International Conference on Robotics and Automation, Seoul, KR, May 2001. IEEE.

[10] S. Quinlan and O. Khatib. Elastic bands: Connecting path planning and control. In International Conference on Robotics and Automation, Atlanta, GA, 1993. IEEE.

[11] José Santos-Victor, Raquel Vassallo, and Hans Schneebeli. Topological maps for visual navigation. In Int. Conf. on Vision Systems, 1999.

[12] Erik Schulenburg, Thilo Weigel, and Alexander Kleiner. Self-localization in dynamic environments based on laser and vision data. In International Conference on Intelligent Robots and Systems, Las Vegas, Nevada, October 2003. IEEE.

[13] T. Siméon, J.-P. Laumond, C. Van Geem, and J. Cortés. Computer aided motion: Move3D within MOLOG. In International Conference on Robotics and Automation, pages 494-499, Seoul, KR, May 2001. IEEE.

[14] Alessandro C. Victorino, Patrick Rives, and Jean-Jacques Borrelly. A relative motion estimation by combining laser measurement and sensor based control. In International Conference on Robotics and Automation, pages 3924-3929, Washington D.C., May 2002. IEEE.