Inertial Navigation for Wheeled Robots in Outdoor Terrain
Jan Koch, Carsten Hillenbrand and Karsten Berns
Department of Computer Science, Robotics Group, University of Kaiserslautern, Gottlieb-Daimler-Straße, Kaiserslautern, Germany
{koch, cahillen,

Abstract — The implementation of an inertial measurement system used within the behavior-based control of an autonomous outdoor vehicle is presented in this paper. Autonomous dead-reckoning navigation can be substantially improved by an inertial sensor system. The determined orientation information is also mandatory for the higher-level behavior-based control to allow path-finding adaptive to the environment.

Index Terms — mobile robot, inertial measurement system, navigation, behavior

I. INTRODUCTION

Autonomous navigation has been a major task since the beginning of robotics. Though taken for granted in fictional applications of robots and strongly required in current projects, mobile robots still lack reliable local and global self-localization, path-finding and forward-looking navigation. Ambitious enterprises like the Grand Challenge 2004 revealed that even with the best technology available, it remains very hard to reach these goals. A behavior-based system constitutes a more reactive approach than conventional control systems. This paper presents the theory and implementation of an inertial sensor system and its combination with a behavior-based offroad navigation system. The data from this sensor system is merged with the odometry information in order to improve the available knowledge. Calculations, implementation details and the assembly of the inertial system are illustrated, in addition to the utilized behavior network and the data fusion of vehicle and inertial sensor information. At the end of this work, performance tests and results are shown.

II. OFFROAD NAVIGATION

A. Scenario

The application scenario includes unstructured natural or wooded terrain where GPS reception cannot be guaranteed. Especially in unstructured natural terrain the robot requires extensive information in addition to navigational data. The measurement of slope, the detection of obstacles and consequently the avoidance of accidents have an important influence on path-finding. For all these purposes a suitable assembly of sensor systems and navigation software has to be developed on a robust outdoor platform. The errors of the sensors have to be taken into account and data fusion has to be implemented.

B. Navigation Information

Localization is based on two sources. Global data can be obtained from landmarks, map matching or GPS information. Dead-reckoning data is acquired by measuring and calculating the movement of the robot itself. For navigation and behavior-based control one can identify five state vector variables and their respective benefits:

- acceleration: slip detection at translation
- velocity: navigation control
- angular velocity: slip detection in curves
- position: course correction
- attitude: safety

All of them can be determined by dead-reckoning techniques. But as shown later, the long-term stability of this state information cannot be guaranteed. Furthermore, depending on the system implementation, some of these state vectors are necessary for others.

III. INERTIAL MEASUREMENT SYSTEM

One possibility for dead-reckoning navigation is based on inertial forces. In theory, acceleration forces and angular velocity information provide a perfect way to determine the trajectory of a system with a known initial state of motion. Our specific requirements made a self-development necessary (see Fig. 1): low power consumption, light-weight parts, small dimensions, access to raw data, high data rates, adjustable parameters, low costs, and adequate precision.

A. Recent Research
Inertial navigation in the field of robotics is continuously investigated. Several groups work on approaches to make inertial measurement systems usable on mobile robots [1], [2]. In most cases only 2D scenarios have been considered; promising results were achieved in this area because, as shown later, orientation errors do not have as severe an influence in 2D as in 3D. Calibration, statistical improvement and sensor data fusion have already been studied [3], [4]. Especially the limits of low-cost systems were evaluated, while statistical methods, first of all Kalman-based procedures, could increase the effectiveness. Error estimation and evaluation of sensors, assembly and application of inertial systems are presented in [5] and [6].
Fig. 1. Inertial measurement unit

B. Calculation Methods

The given hardware provides discrete information about angular velocity and acceleration forces. The angular velocity information is mandatory to calculate the attitude of the robot or vehicle on which the inertial system is mounted. In addition, the attitude is required to interpret the measured acceleration, which has to be known for the global view in world coordinates. The following notations are used:

ω: measured angular velocities
σ: angles from integration of angular velocity
φ: roll angle around x-axis
θ: pitch angle around y-axis
ψ: yaw angle around z-axis
a: accelerations
g: gravity force
v: velocities
s: position
u: rotation axis for attitude correction heuristic
α: rotation angle for attitude correction heuristic
Δt: time interval
W: matrix with unit vectors of world frame
B: matrix with unit vectors of body frame
T: rotation tensor, transforms from body to world frame
ε: tolerance interval
lower index b: vector/matrix referred to body frame
lower index w: vector/matrix referred to world frame
double lower index α, β: tensor rotates around axis α by angle β
upper index k: vector/matrix/tensor in the kth time step

For the investigated application it is not necessary to consider earth rotation, earth shape, Coriolis forces or stellar influences. The world model shall be an unaccelerated plane surface only affected by the gravity force. The calculation is performed on a standard PC, connected to the sensor system via a CAN bus. The software components are visualized in Fig. 2. B_w always contains the requested attitude information of the system. Here the common strap-down approach [7], in which the attitude information is directly stored in Euler angles, is modified: using the unit matrix of the body frame seen from the world frame has the advantage that the Euler angles only have to be calculated when needed [8]. The matrix, however, is updated in every calculation step.
W_w = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}  (1)

B_w = \begin{pmatrix} b_{11} & b_{12} & b_{13} \\ b_{21} & b_{22} & b_{23} \\ b_{31} & b_{32} & b_{33} \end{pmatrix}  (2)

The angle change per time step is deduced by trapezoid integration of the current and last angular velocity measurements. This angle then has to be transformed to the world frame.

\sigma_b^k = \Delta t^k \, (\omega_b^k + \omega_b^{k-1})/2  (3)

\sigma_w^k = B_w^k \, \sigma_b^k  (4)

The attitude information is updated by multiplication with a rotation tensor. The rotation axis represented by this tensor has the current angular rates as components; the rotation angle is derived as the length of the current angular velocity vector. In every processing loop, only two of the three unit vectors (rows of the orientation matrix) are actually rotated. The third one is always calculated as the cross product of the other two and normalized afterwards. On the one hand this saves computation time, and on the other hand it guarantees the orthogonality and normalization of the unit vectors in the transformation matrix B_w.

B_w^{k+1} = T_{\sigma^k, |\sigma^k|} \, B_w^k  (5)

The matrix which transforms from the body coordinate frame to the world frame is conveniently the attitude representation itself, as the following argument shows. Let D be the unknown transformation matrix:

B_w = D \, W_w  (6)
W_w = I  (7)
B_w = D  (8)

For some applications the attitude needs to be represented in Euler angles. The combined Euler matrices represent three sequential rotations by the angles φ, θ and ψ around the world axes x, y and z, or around the body axes in opposite order (c: cos, s: sin):

D = \begin{pmatrix}
c\psi c\theta & -s\psi c\varphi + c\psi s\theta s\varphi & s\psi s\varphi + c\psi s\theta c\varphi \\
s\psi c\theta & c\psi c\varphi + s\psi s\theta s\varphi & -c\psi s\varphi + s\psi s\theta c\varphi \\
-s\theta & c\theta s\varphi & c\theta c\varphi
\end{pmatrix}  (9)

These angles can be determined by coefficient comparison with B_w:

\varphi = \arctan(b_{32}/b_{33})  (10)
\theta = \arcsin(-b_{31})  (11)
\psi = \arctan(b_{21}/b_{11})  (12)
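As a concrete illustration, the attitude update of Eqs. 3-5 and the Euler-angle extraction of Eqs. 10-12 can be sketched in Python with NumPy. This is a minimal sketch, not the authors' implementation: function names are our own, and the body axes are taken as the columns of B_w (convention a_w = B_w a_b), whereas the paper stores them as rows (transposed storage).

```python
import numpy as np

def rodrigues(axis, angle):
    """Rotation matrix rotating by `angle` around the vector `axis`."""
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def update_attitude(B_w, omega_prev, omega_curr, dt):
    """One strap-down step: trapezoid integration of the body-frame
    angular rate (Eq. 3), transformation to the world frame (Eq. 4)
    and rotation of the orientation matrix (Eq. 5)."""
    sigma_b = dt * (omega_curr + omega_prev) / 2.0      # Eq. 3
    sigma_w = B_w @ sigma_b                             # Eq. 4
    angle = np.linalg.norm(sigma_w)
    if angle < 1e-12:                                   # no rotation this step
        return B_w
    T = rodrigues(sigma_w, angle)                       # tensor of Eq. 5
    # Rotate only two unit vectors; the third follows as their cross
    # product, which keeps B_w orthonormal without re-orthogonalization.
    c0 = T @ B_w[:, 0]
    c1 = T @ B_w[:, 1]
    c2 = np.cross(c0, c1)
    c2 /= np.linalg.norm(c2)
    return np.column_stack([c0, c1, c2])

def euler_angles(B_w):
    """Roll, pitch, yaw by coefficient comparison (Eqs. 10-12)."""
    roll = np.arctan2(B_w[2, 1], B_w[2, 2])
    pitch = np.arcsin(-B_w[2, 0])
    yaw = np.arctan2(B_w[1, 0], B_w[0, 0])
    return roll, pitch, yaw
```

Rotating only two unit vectors and restoring the third via the cross product mirrors the time-saving trick described above.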
Fig. 2. Steps of the inertial calculations

With knowledge of the orientation, the measured acceleration forces can be handled. By multiplication with the matrix B_w they are transformed to the world frame and can be used to continuously integrate the acceleration of the system to obtain velocity and position. The force of gravity has to be subtracted in every time step. Here lies the largest source of positioning errors: a wrong attitude causes a wrong difference, so an incorrect acceleration vector is processed.

g_w = (0, 0, 9.80665\,\mathrm{m/s^2})^T  (13)

a_w = B_w a_b - g_w  (14)

v_w^k = v_w^{k-1} + \Delta t^k \, (a_w^k + a_w^{k-1})/2  (15)

s_w^k = s_w^{k-1} + \Delta t^k \, (v_w^k + v_w^{k-1})/2  (16)

C. Calibration

To calculate the position and attitude, the sensor data has to be integrated. Unfortunately, all errors are accumulated by the integration.

Assembly errors:
- sensor alignment on board
- rectangularity of board axes

Electrical errors:
- voltage fluctuations
- system noise
- digital jitter

Sensor errors:
- characteristic curve errors
- package alignment on chip
- temperature dependence
- cross-axis influence
- acceleration influence (angular rate sensors)

Naturally, the long-term stability of dead reckoning systems cannot be maintained. To get the best possible results within the capabilities of the sensors, the system has to be calibrated as accurately as possible. The measured value \tilde{x} can be described as a polynomial function of the true value x and error coefficients e_i:

\tilde{x} = e_n x^n + e_{n-1} x^{n-1} + \dots + e_1 x + e_0 = \sum_n e_n x^n  (17)

The coefficients of a suitable linear error model can be determined from representative measurements. The model consists of offsets, linear scale factors and a cross-correlation matrix. These are 12 unknown variables for the set of angular velocity sensors and another 12 for the set of acceleration sensors, if the scale factors are moved to the diagonal elements of the correlation matrix.
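The gravity subtraction and double trapezoid integration of Eqs. 13-16 above can be sketched as follows. This is a minimal NumPy sketch under the paper's world model; the state layout and function name are our own.

```python
import numpy as np

G_W = np.array([0.0, 0.0, 9.80665])  # gravity in the world frame (Eq. 13)

def integrate_motion(B_w, a_b, state, dt):
    """One step of Eqs. 14-16: transform the body-frame acceleration to
    the world frame, remove gravity and integrate twice (trapezoid rule).

    `state` holds the previous world-frame acceleration `a`,
    velocity `v` and position `s`."""
    a_w = B_w @ a_b - G_W                            # Eq. 14
    v_w = state["v"] + dt * (a_w + state["a"]) / 2   # Eq. 15
    s_w = state["s"] + dt * (v_w + state["v"]) / 2   # Eq. 16
    return {"a": a_w, "v": v_w, "s": s_w}
```

Note how any attitude error tilts B_w a_b, so a fraction of gravity leaks into a_w and is then integrated twice, which is exactly why the paper identifies this subtraction as the largest source of positioning errors.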
The unknown coefficients could be determined by taking a set of linearly independent measurements and solving a system of equations, but a gradient descent procedure shows better results. The test measurements taken are compared to their target results; an error sum serves as a simple fitness function. The gradient descent starts with plausible initial values for the unknown variables and makes random changes, trying to minimize the differences between the measured test vectors and their corresponding targets. If the error decreases in one loop, the achieved set of values becomes the basis for the next step. This strategy leads to an optimal solution, as the model is linear.

\begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} =
\begin{pmatrix} k_{11} & k_{12} & k_{13} \\ k_{21} & k_{22} & k_{23} \\ k_{31} & k_{32} & k_{33} \end{pmatrix}
\begin{pmatrix} a_1 \\ a_2 \\ a_3 \end{pmatrix} +
\begin{pmatrix} \bar{x}_1 \\ \bar{x}_2 \\ \bar{x}_3 \end{pmatrix}  (18)

The output of the angular velocity sensors depends on their temperature. Therefore at least the offset has to be captured at different temperatures to make linear interpolation possible.

D. Attitude Heuristic

As discussed in III-C, there are many problems with long-term stability, called drift of position and attitude. One possibility to minimize this drift is to adjust the attitude with respect to the gravity force. From the attitude, an expected gravity vector can be created. In each calculation step, the current attitude is determined from the accumulated angular velocities. If the magnitude of the measured acceleration equals the standard gravity force, the first criterion is fulfilled.

\left| \, |a_w| - |g| \, \right| \le \varepsilon  (19)

If the expected gravity vector differs in direction from the actual acceleration vector, a correction of the attitude representation is done by
rotating it in order to make the two vectors point in the same direction. The rotation is done around the axis built from the cross product of the two vectors, while the rotation angle arises from their difference in direction.

\alpha = \arccos\left( \frac{a_w \cdot g}{|a_w| \, |g|} \right)  (20)

u = a_w \times g  (21)

B_w = T_{u,\alpha} \, B_w  (22)

Since gravity carries no information about the rotation around the vertical axis, only the roll and pitch values can be corrected by this method. This attitude correction is only a heuristic and is therefore only applied if the deviation reaches a certain amount. Additionally, the correction is suppressed while the acceleration data is fluctuating, because the acceleration vector might have the same length as the gravity vector in cases other than standstill. Nevertheless, in all experiments the correction helped greatly to improve the overall performance. To reduce the drift of the yaw rate, another source of information, such as a magnetic compass, would be necessary.

E. Sensor Data Fusion

As a matter of fact, data provided by the inertial measurement system lacks long-term stability [7]. Therefore additional information can improve the knowledge of the system state.

1) Fusion Principle: RAVON is equipped with odometry sensors providing the wheel speeds and absolute encoder values of the steering angles. Using the attitude information from the inertial measurement system, a velocity vector can be generated from this scalar odometry information. Merging this velocity vector with the one provided by the inertial measurement system improves the overall velocity information of the robot. The sensor data fusion is done by calculating the weighted mean of all available sensor values x_i of one type of data x:

\bar{x} = \frac{\sum_i w_i x_i}{\sum_i w_i}  (23)

This gives a more direct control option than a statistical method with a fixed system model. The weights w_i are adjustable for the particular circumstances; nevertheless, the use of Kalman or particle filters will be part of future work.
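The gravity-alignment heuristic of Eqs. 19-22 can be sketched as follows. The threshold values eps_mag and eps_ang are illustrative assumptions; the paper only states that the correction is applied once the deviation reaches a certain amount.

```python
import numpy as np

G = 9.80665
g_w = np.array([0.0, 0.0, G])  # expected gravity vector in the world frame

def correct_attitude(B_w, a_w, eps_mag=0.05, eps_ang=0.01):
    """Rotate the attitude so the measured world-frame acceleration a_w
    aligns with the expected gravity vector (Eqs. 19-22). Applied only
    when a_w has roughly gravity magnitude, i.e. the vehicle is almost
    unaccelerated; otherwise the attitude is left untouched."""
    if abs(np.linalg.norm(a_w) - G) > eps_mag:           # Eq. 19 violated
        return B_w
    cosa = np.dot(a_w, g_w) / (np.linalg.norm(a_w) * G)
    alpha = np.arccos(np.clip(cosa, -1.0, 1.0))          # Eq. 20
    if alpha < eps_ang:                                  # deviation too small
        return B_w
    u = np.cross(a_w, g_w)                               # Eq. 21, rotation axis
    u /= np.linalg.norm(u)
    # Rodrigues formula for the rotation tensor T_{u,alpha} of Eq. 22.
    K = np.array([[0.0, -u[2], u[1]],
                  [u[2], 0.0, -u[0]],
                  [-u[1], u[0], 0.0]])
    T = np.eye(3) + np.sin(alpha) * K + (1.0 - np.cos(alpha)) * (K @ K)
    return T @ B_w                                       # Eq. 22
```

As the text notes, this corrects only roll and pitch: rotations around the gravity direction leave a_w unchanged, so yaw drift is unobservable here.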
For example, the yaw angle is determined by the inertial measurement system and additionally by the steering angle. If the vehicle speed increases, the weight of the steering angle for the yaw calculation is decreased, as slipping rises with increasing velocity.

2) Vehicle Description: The test platform used to examine the motion control is our wheel-driven outdoor vehicle RAVON (Robust Autonomous Vehicle for Offroad Navigation, see Fig. 6 and Fig. 7). It has a size of about 1.4 m in width and 2.4 m in length and wheels of 73 cm in diameter. Each wheel has an individual DC motor; the front and rear steering are independent. The robot is capable of driving at up to 3 m/s and of climbing slopes with 40% inclination.

Fig. 3. Architecture of the inertial system

F. Architecture

1) Sensors: For application on mobile robots, as stated in III, the sensors have to be small, cheap and energy-saving. Although optical fiber gyros are the state of the art concerning precision, they cannot be used because of their weight and size. Though micromechanical sensors are less precise, they comply with all the other requirements. Some important characteristics of the chosen sensors are given below.

Angular rate sensor ADXRS150:
- Measurement range: ±150°/s or ±2.62 rad/s
- Non-linearity: 0.1%
- Acceleration influence: 0.23°/s/g
- Temperature influence: 15% or 21.7°/s (−40°C to +85°C), 0.17°/s/°C

Acceleration sensor ADXL203:
- Measurement range: ±1.7 g or ±16.7 m/s²
- Non-linearity: 0.5%
- Temperature influence over full scale: 0.3%
- Cross-axis influence: 2%
- Alignment error on chip: ±1°

2) System Design: The system consists of three sensor boards, each equipped with high-rate analog-digital converters. The digitized data is fetched by an Atmel CPLD triggering an interrupt in a DSP where basic filtering is done. Subsequently the position and orientation data is sent via the CAN bus to a PC, see Fig. 3. While sensor data is sent to the higher levels of the application, control information can be passed downwards.
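The weighted mean of Eq. 23 combined with the speed-dependent weighting described above for the yaw angle might look as follows. The linear weight schedule is an illustrative assumption, not taken from the paper; only v_max = 3 m/s matches the vehicle's stated top speed.

```python
def fuse(values, weights):
    """Weighted mean of redundant sensor values for one quantity (Eq. 23)."""
    assert len(values) == len(weights) and sum(weights) > 0
    return sum(w * x for w, x in zip(weights, values)) / sum(weights)

def yaw_rate_fused(yaw_rate_imu, yaw_rate_steering, speed, v_max=3.0):
    """Speed-dependent weighting: at low speed the steering geometry is
    trusted, at higher speed slip makes it less reliable (the linear
    fade-out is an illustrative assumption)."""
    w_steer = max(0.0, 1.0 - speed / v_max)  # fades out toward top speed
    w_imu = 1.0                              # gyro weight kept constant
    return fuse([yaw_rate_imu, yaw_rate_steering], [w_imu, w_steer])
```

At standstill both sources count equally; at top speed the steering-derived yaw rate is ignored entirely, reflecting the slip argument in the text.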
The DSP program, in turn, adjusts the sampling rate by setting special registers of the CPLD. System state
updates from the sensor data fusion modules also have to be transferred from the PC to the DSP.

Fig. 4. Drift of position and attitude at zero movement

While running the program, several data output modes can be selected for the DSP. The PC can request the bit, raw, filtered or fully processed data from the DSP. In this context, processed means fully calculated attitude and position information; the other modes are for debugging only.

3) Data Rates: The analog sensor data is low-pass filtered at about 100 Hz by a simple capacitor. The CPLD reads the A/D converters at 6.4 kHz. This oversampling is used in the DSP for better filtering. The algorithms mentioned in III-B run on the DSP with a loop time of about 2 ms. The final state information is sent via CAN to the PC, where a sense and control loop runs at about 10 ms.

4) Characteristics: The inertial measurement unit (see Fig. 1) is constructed as an aluminum cube with 4 cm edge length. Its mass is about 200 g and it consumes 2 W.

IV. EXPERIMENTS

Several tests have been performed to determine the quality of the information provided by the inertial system and its capability to work together with the behavior-based control.

A. Evaluating the Inertial Measurement System

As expected, orientation and position accumulated from angular velocity and acceleration show immense drift even after thorough calibration (see Fig. 4). The drift of attitude and position is significantly reduced (as shown in Fig. 5) by using the attitude correction heuristic described in III-D. As mentioned before, a precise position integration depends significantly on a proper attitude. With the given system, the best possible location estimate therefore has an error of 1 m after one minute. Another influencing effect is the temperature dependence of the angular velocity sensors.
Careful capturing of the offset over the whole expected temperature range is therefore necessary.

Fig. 5. Drift of position and attitude at zero movement with correction heuristic

B. Outdoor Test of the Combined System

The inertial measurement system has been mounted centered on top of the robot. At the beginning of each test, its pose is defined as the origin of the navigation coordinate frame, which is only a displacement in angle and position with respect to the world frame. The motion control of RAVON is carried out using a behavior-based approach; details about the architecture and about single behaviors can be found in [9]. For the tests concerning the approach of goal positions, the following behaviors were activated:

- Velocity behavior: Intended vehicle velocities generated by higher behaviors are transformed and supervised here. The output sets the vehicle velocity.
- Steering behavior: Front and rear steering are actuated by instances of this low-level behavior. It transforms intended translatory and rotatory movements into a steering angle.
- Translatory movement to a goal position behavior: The target of this behavior is to reach a given position by a translational movement. Depending on the current pose of the robot, derived from odometry and the inertial measurement system, the translatory motion to the goal is generated. Additionally, the velocity is set depending on the distance to the target.
- Orientate to a goal direction behavior: While the previous behavior handles the translatory movement to a target point, this behavior rotates the vehicle so that it points to the given position. The rotational movement and the velocity are set depending on the deviation between the vehicle orientation and the target direction.

With these behaviors, four destinations located in a square were to be approached in a testing scenario. After the test the vehicle was supposed to be located at the start position again.
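The temperature compensation called for above, offsets captured at a few reference temperatures and linearly interpolated in between, can be sketched as follows. The support points and offset values are made up for illustration; only the method (linear interpolation of the zero-rate offset) comes from the text.

```python
import numpy as np

# Zero-rate offsets captured during calibration at a few reference
# temperatures (all values hypothetical, for illustration only).
TEMPS = np.array([-40.0, 0.0, 25.0, 85.0])           # deg C
OFFSETS = np.array([0.012, 0.004, 0.000, -0.009])    # rad/s

def gyro_offset(temperature):
    """Linearly interpolated zero-rate offset at the given temperature."""
    return float(np.interp(temperature, TEMPS, OFFSETS))

def compensate(omega_raw, temperature):
    """Subtract the temperature-dependent offset from a raw gyro reading."""
    return omega_raw - gyro_offset(temperature)
```

Outside the captured range, np.interp clamps to the boundary values, which is a conservative choice when the sensor leaves its characterized temperature window.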
Fig. 6. RAVON in the testing area

Fig. 7. Robosoft outdoor vehicle

Fig. 8. Roll and pitch angle in an experiment with the vehicle following isolines

Without the assistance of the inertial measurement system and its orientation determination, path-finding is based only on odometry. Because of slipping, the steering angle is quite imprecise: after 80 m a difference of 7 m was determined, mainly induced by slipping in curves on the snowy surface. With additional data from the angular rate sensors, the error was reduced to about 1 m. The odometry information was used to correct the velocity information of the system state. At a speed of 1 m/s this error was within the one from the static drift measurements.

In further experiments, behaviors for minimizing the pitch angle and for following isolines were activated. As a result, the vehicle remained oriented along a slope and moved in the direction of minimal change in height. Fig. 8 shows the roll and pitch angles for such an experiment. The robot responded quickly to disturbances caused by the terrain and remained oriented along the isolines. The roll angle, which is not considered in the pose correction, changed distinctly, while the pitch angle remained at a safe absolute value.

V. CONCLUSION AND OUTLOOK

The inertial sensor system outperforms odometry in the precision of rotation estimation because of wheel slipping in curves. As the experiments showed, navigation could be improved by the combination of inertial and odometry sensors. Especially the attitude determination is an indispensable feature, not only for capturing the trajectory in a 3D world: risks indicated by the gradient of the environment can be detected and avoided. Further work in our group addresses sensor data fusion, GPS integration, obstacle detection and collision avoidance with stereo vision and laser scanners.

REFERENCES

[1] A. Rudolph, A. Siegel, and J. Adamy, "Ein integriertes Navigationssystem für einen mobilen Roboter," Thema Forschung, vol. 1, 2002.
[2] E. Nebot and H. Durrant-Whyte, "Initial calibration and alignment of low cost inertial navigation units for land vehicle applications," Department of Mechanical and Mechatronic Engineering, University of Sydney, Tech. Rep.
[3] H. Janocha and J. Fox, Eds., Statische Kalibrierung von Inertialsensoren mit Hilfe eines Industrieroboters, 2004.
[4] E. Kiriy and M. Buehler, "Three-state extended Kalman filter for mobile robot localization," McGill Center for Intelligent Machines, Montreal, Canada, Tech. Rep., 2002.
[5] M. Hashimoto, H. Kawashima, T. Nakagami, and F. Oba, "Sensor fault detection and identification in dead-reckoning system of mobile robot: Interacting multiple model approach," Department of Mechanical System Engineering, Hiroshima University, Tech. Rep., 2001.
[6] J. Borenstein and L. Feng, "A method for measuring, comparing and correcting dead-reckoning errors in mobile robots," University of Michigan, Tech. Rep.
[7] D. H. Titterton and J. L. Weston, Strapdown Inertial Navigation Technology. Peter Peregrinus Ltd., London.
[8] J. Koch, "Lage- und Positionsbestimmung mobiler Roboter auf Basis eines inertialen Messsystems und ergänzender Sensorik," Master's thesis, University of Kaiserslautern, unpublished, October 2004.
[9] M. Proetzsch, T. Luksch, and K. Berns, "Fault-tolerant behavior-based motion control for offroad navigation," in IEEE International Conference on Robotics and Automation (ICRA), Barcelona, Spain, April 2005.
Dealing with Scale Stephan Weiss Computer Vision Group NASA-JPL / CalTech Stephan.Weiss@ieee.org (c) 2013. Government sponsorship acknowledged. Outline Why care about size? The IMU as scale provider: The
More informationnavigation Isaac Skog
Foot-mounted zerovelocity aided inertial navigation Isaac Skog skog@kth.se Course Outline 1. Foot-mounted inertial navigation a. Basic idea b. Pros and cons 2. Inertial navigation a. The inertial sensors
More informationMajor project components: Sensors Robot hardware/software integration Kinematic model generation High-level control
Status update: Path planning/following for a snake Major project components: Sensors Robot hardware/software integration Kinematic model generation High-level control 2. Optical mouse Optical mouse technology
More information13. Learning Ballistic Movementsof a Robot Arm 212
13. Learning Ballistic Movementsof a Robot Arm 212 13. LEARNING BALLISTIC MOVEMENTS OF A ROBOT ARM 13.1 Problem and Model Approach After a sufficiently long training phase, the network described in the
More informationLecture 13 Visual Inertial Fusion
Lecture 13 Visual Inertial Fusion Davide Scaramuzza Course Evaluation Please fill the evaluation form you received by email! Provide feedback on Exercises: good and bad Course: good and bad How to improve
More informationRobots are built to accomplish complex and difficult tasks that require highly non-linear motions.
Path and Trajectory specification Robots are built to accomplish complex and difficult tasks that require highly non-linear motions. Specifying the desired motion to achieve a specified goal is often a
More informationJ. Roberts and P. Corke CRC for Mining technology and Equipment PO Box 883, Kenmore, Q
Experiments in Autonomous Underground Guidance S. Scheding, E. M. Nebot, M. Stevens and H. Durrant-Whyte Department of Mechanical Engineering The University of Sydney, NSW 6, Australia. e-mail: scheding/nebot/michael/hugh@tiny.me.su.oz.au
More informationMe 3-Axis Accelerometer and Gyro Sensor
Me 3-Axis Accelerometer and Gyro Sensor SKU: 11012 Weight: 20.00 Gram Description: Me 3-Axis Accelerometer and Gyro Sensor is a motion processing module. It can use to measure the angular rate and the
More informationROBOTICS AND AUTONOMOUS SYSTEMS
ROBOTICS AND AUTONOMOUS SYSTEMS Simon Parsons Department of Computer Science University of Liverpool LECTURE 6 PERCEPTION/ODOMETRY comp329-2013-parsons-lect06 2/43 Today We ll talk about perception and
More informationINTEGRATED TECH FOR INDUSTRIAL POSITIONING
INTEGRATED TECH FOR INDUSTRIAL POSITIONING Integrated Tech for Industrial Positioning aerospace.honeywell.com 1 Introduction We are the world leader in precision IMU technology and have built the majority
More informationRobotics and Autonomous Systems
Robotics and Autonomous Systems Lecture 6: Perception/Odometry Simon Parsons Department of Computer Science University of Liverpool 1 / 47 Today We ll talk about perception and motor control. 2 / 47 Perception
More informationThis was written by a designer of inertial guidance machines, & is correct. **********************************************************************
EXPLANATORY NOTES ON THE SIMPLE INERTIAL NAVIGATION MACHINE How does the missile know where it is at all times? It knows this because it knows where it isn't. By subtracting where it is from where it isn't
More informationRobotics and Autonomous Systems
Robotics and Autonomous Systems Lecture 6: Perception/Odometry Terry Payne Department of Computer Science University of Liverpool 1 / 47 Today We ll talk about perception and motor control. 2 / 47 Perception
More informationCalibration of a rotating multi-beam Lidar
The 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems October 18-22, 2010, Taipei, Taiwan Calibration of a rotating multi-beam Lidar Naveed Muhammad 1,2 and Simon Lacroix 1,2 Abstract
More informationCS283: Robotics Fall 2016: Sensors
CS283: Robotics Fall 2016: Sensors Sören Schwertfeger / 师泽仁 ShanghaiTech University Robotics ShanghaiTech University - SIST - 23.09.2016 2 REVIEW TRANSFORMS Robotics ShanghaiTech University - SIST - 23.09.2016
More informationEncoder applications. I Most common use case: Combination with motors
3.5 Rotation / Motion - Encoder applications 64-424 Intelligent Robotics Encoder applications I Most common use case: Combination with motors I Used to measure relative rotation angle, rotational direction
More informationAn Intro to Gyros. FTC Team #6832. Science and Engineering Magnet - Dallas ISD
An Intro to Gyros FTC Team #6832 Science and Engineering Magnet - Dallas ISD Gyro Types - Mechanical Hubble Gyro Unit Gyro Types - Sensors Low cost MEMS Gyros High End Gyros Ring laser, fiber optic, hemispherical
More informationDigital Compass Accuracy
Digital Compass Accuracy Lindsey Hines, University of St. Thomas Mentor: Dr. James Bellingham Summer 2007 Keywords: digital compass, Microstrain, tilt measurement error ABSTRACT The overall goal of the
More informationLocalization algorithm using a virtual label for a mobile robot in indoor and outdoor environments
Artif Life Robotics (2011) 16:361 365 ISAROB 2011 DOI 10.1007/s10015-011-0951-7 ORIGINAL ARTICLE Ki Ho Yu Min Cheol Lee Jung Hun Heo Youn Geun Moon Localization algorithm using a virtual label for a mobile
More informationCANAL FOLLOWING USING AR DRONE IN SIMULATION
CANAL FOLLOWING USING AR DRONE IN SIMULATION ENVIRONMENT Ali Ahmad, Ahmad Aneeque Khalid Department of Electrical Engineering SBA School of Science & Engineering, LUMS, Pakistan {14060006, 14060019}@lums.edu.pk
More informationSelection and Integration of Sensors Alex Spitzer 11/23/14
Selection and Integration of Sensors Alex Spitzer aes368@cornell.edu 11/23/14 Sensors Perception of the outside world Cameras, DVL, Sonar, Pressure Accelerometers, Gyroscopes, Magnetometers Position vs
More informationLocalization and Map Building
Localization and Map Building Noise and aliasing; odometric position estimation To localize or not to localize Belief representation Map representation Probabilistic map-based localization Other examples
More informationExperimental Verification of Stability Region of Balancing a Single-wheel Robot: an Inverted Stick Model Approach
IECON-Yokohama November 9-, Experimental Verification of Stability Region of Balancing a Single-wheel Robot: an Inverted Stick Model Approach S. D. Lee Department of Mechatronics Engineering Chungnam National
More informationCecilia Laschi The BioRobotics Institute Scuola Superiore Sant Anna, Pisa
University of Pisa Master of Science in Computer Science Course of Robotics (ROB) A.Y. 2016/17 cecilia.laschi@santannapisa.it http://didawiki.cli.di.unipi.it/doku.php/magistraleinformatica/rob/start Robot
More informationFinal Exam Practice Fall Semester, 2012
COS 495 - Autonomous Robot Navigation Final Exam Practice Fall Semester, 2012 Duration: Total Marks: 70 Closed Book 2 hours Start Time: End Time: By signing this exam, I agree to the honor code Name: Signature:
More informationSimplified Orientation Determination in Ski Jumping using Inertial Sensor Data
Simplified Orientation Determination in Ski Jumping using Inertial Sensor Data B.H. Groh 1, N. Weeger 1, F. Warschun 2, B.M. Eskofier 1 1 Digital Sports Group, Pattern Recognition Lab University of Erlangen-Nürnberg
More informationCOMBINED BUNDLE BLOCK ADJUSTMENT VERSUS DIRECT SENSOR ORIENTATION ABSTRACT
COMBINED BUNDLE BLOCK ADJUSTMENT VERSUS DIRECT SENSOR ORIENTATION Karsten Jacobsen Institute for Photogrammetry and Engineering Surveys University of Hannover Nienburger Str.1 D-30167 Hannover, Germany
More informationMOBILE ROBOT LOCALIZATION. REVISITING THE TRIANGULATION METHODS. Josep Maria Font, Joaquim A. Batlle
MOBILE ROBOT LOCALIZATION. REVISITING THE TRIANGULATION METHODS Josep Maria Font, Joaquim A. Batlle Department of Mechanical Engineering Technical University of Catalonia (UC) Avda. Diagonal 647, 08028
More informationZürich. Roland Siegwart Margarita Chli Martin Rufli Davide Scaramuzza. ETH Master Course: L Autonomous Mobile Robots Summary
Roland Siegwart Margarita Chli Martin Rufli Davide Scaramuzza ETH Master Course: 151-0854-00L Autonomous Mobile Robots Summary 2 Lecture Overview Mobile Robot Control Scheme knowledge, data base mission
More informationRobotics (Kinematics) Winter 1393 Bonab University
Robotics () Winter 1393 Bonab University : most basic study of how mechanical systems behave Introduction Need to understand the mechanical behavior for: Design Control Both: Manipulators, Mobile Robots
More informationDYNAMIC POSITIONING CONFERENCE September 16-17, Sensors
DYNAMIC POSITIONING CONFERENCE September 16-17, 2003 Sensors An Integrated acoustic positioning and inertial navigation system Jan Erik Faugstadmo, Hans Petter Jacobsen Kongsberg Simrad, Norway Revisions
More informationWhere s the Boss? : Monte Carlo Localization for an Autonomous Ground Vehicle using an Aerial Lidar Map
Where s the Boss? : Monte Carlo Localization for an Autonomous Ground Vehicle using an Aerial Lidar Map Sebastian Scherer, Young-Woo Seo, and Prasanna Velagapudi October 16, 2007 Robotics Institute Carnegie
More informationInertial Navigation Static Calibration
INTL JOURNAL OF ELECTRONICS AND TELECOMMUNICATIONS, 2018, VOL. 64, NO. 2, PP. 243 248 Manuscript received December 2, 2017; revised April, 2018. DOI: 10.24425/119518 Inertial Navigation Static Calibration
More informationBuilding Reliable 2D Maps from 3D Features
Building Reliable 2D Maps from 3D Features Dipl. Technoinform. Jens Wettach, Prof. Dr. rer. nat. Karsten Berns TU Kaiserslautern; Robotics Research Lab 1, Geb. 48; Gottlieb-Daimler- Str.1; 67663 Kaiserslautern;
More informationEE631 Cooperating Autonomous Mobile Robots
EE631 Cooperating Autonomous Mobile Robots Lecture: Multi-Robot Motion Planning Prof. Yi Guo ECE Department Plan Introduction Premises and Problem Statement A Multi-Robot Motion Planning Algorithm Implementation
More informationMETR 4202: Advanced Control & Robotics
Position & Orientation & State t home with Homogenous Transformations METR 4202: dvanced Control & Robotics Drs Surya Singh, Paul Pounds, and Hanna Kurniawati Lecture # 2 July 30, 2012 metr4202@itee.uq.edu.au
More informationIntroduction to Mobile Robots Kinematics 3: Kinematic Models of Sensors and Actuators 1. Kinematics 3: Kinematic Models of Sensors and Actuators
1 Kinematics 3: Kinematic Models of Sensors and Actuators 1.1Axis Conventions 2 1 Vehicle Attitude in Euler Angle Form 1.1 Axis Conventions In aerospace vehicles, z points downward. Good for airplanes
More informationIntroduction to Inertial Navigation (INS tutorial short)
Introduction to Inertial Navigation (INS tutorial short) Note 1: This is a short (20 pages) tutorial. An extended (57 pages) tutorial that also includes Kalman filtering is available at http://www.navlab.net/publications/introduction_to
More informationOptimization-Based Calibration of a Triaxial Accelerometer-Magnetometer
2005 American Control Conference June 8-10, 2005. Portland, OR, USA ThA07.1 Optimization-Based Calibration of a Triaxial Accelerometer-Magnetometer Erin L. Renk, Walter Collins, Matthew Rizzo, Fuju Lee,
More informationProject 1 : Dead Reckoning and Tracking
CS3630 Spring 2012 Project 1 : Dead Reckoning and Tracking Group : Wayward Sons Sameer Ansari, David Bernal, Tommy Kazenstein 2/8/2012 Wayward Sons CS3630 Spring 12 Project 1 Page 2 of 12 CS 3630 (Spring
More informationBearing only visual servo control of a non-holonomic mobile robot. Robert Mahony
Bearing only visual servo control of a non-holonomic mobile robot. Robert Mahony Department of Engineering, Australian National University, Australia. email: Robert.Mahony@anu.edu.au url: http://engnet.anu.edu.au/depeople/robert.mahony/
More informationImproving autonomous orchard vehicle trajectory tracking performance via slippage compensation
Improving autonomous orchard vehicle trajectory tracking performance via slippage compensation Dr. Gokhan BAYAR Mechanical Engineering Department of Bulent Ecevit University Zonguldak, Turkey This study
More informationDEALING WITH SENSOR ERRORS IN SCAN MATCHING FOR SIMULTANEOUS LOCALIZATION AND MAPPING
Inženýrská MECHANIKA, roč. 15, 2008, č. 5, s. 337 344 337 DEALING WITH SENSOR ERRORS IN SCAN MATCHING FOR SIMULTANEOUS LOCALIZATION AND MAPPING Jiří Krejsa, Stanislav Věchet* The paper presents Potential-Based
More informationUsing 3D Laser Range Data for SLAM in Outdoor Environments
Using 3D Laser Range Data for SLAM in Outdoor Environments Christian Brenneke, Oliver Wulf, Bernardo Wagner Institute for Systems Engineering, University of Hannover, Germany [brenneke, wulf, wagner]@rts.uni-hannover.de
More informationQuaternion Kalman Filter Design Based on MEMS Sensors
, pp.93-97 http://dx.doi.org/10.14257/astl.2014.76.20 Quaternion Kalman Filter Design Based on MEMS Sensors Su zhongbin,yanglei, Kong Qingming School of Electrical and Information. Northeast Agricultural
More informationInertial Measurement Units I!
! Inertial Measurement Units I! Gordon Wetzstein! Stanford University! EE 267 Virtual Reality! Lecture 9! stanford.edu/class/ee267/!! Lecture Overview! coordinate systems (world, body/sensor, inertial,
More informationAnalysis of Euler Angles in a Simple Two-Axis Gimbals Set
Vol:5, No:9, 2 Analysis of Euler Angles in a Simple Two-Axis Gimbals Set Ma Myint Myint Aye International Science Index, Mechanical and Mechatronics Engineering Vol:5, No:9, 2 waset.org/publication/358
More informationMEMS technology quality requirements as applied to multibeam echosounder. Jerzy DEMKOWICZ, Krzysztof BIKONIS
MEMS technology quality requirements as applied to multibeam echosounder Jerzy DEMKOWICZ, Krzysztof BIKONIS Gdansk University of Technology Gdansk, Narutowicza str. 11/12, Poland demjot@eti.pg.gda.pl Small,
More informationExperimental results of a Differential Optic-Flow System
Experimental results of a Differential Optic-Flow System Richard Kathage and Jonghyuk Kim Department of Engineering The Australian National University, Australia {richard.kathage, jonghyuk.kim}@anu.edu.au
More informationDevelopment of a Test Field for the Calibration and Evaluation of Kinematic Multi Sensor Systems
Development of a Test Field for the Calibration and Evaluation of Kinematic Multi Sensor Systems DGK-Doktorandenseminar Graz, Austria, 26 th April 2017 Erik Heinz Institute of Geodesy and Geoinformation
More informationCAMERA GIMBAL PERFORMANCE IMPROVEMENT WITH SPINNING-MASS MECHANICAL GYROSCOPES
8th International DAAAM Baltic Conference "INDUSTRIAL ENGINEERING 19-21 April 2012, Tallinn, Estonia CAMERA GIMBAL PERFORMANCE IMPROVEMENT WITH SPINNING-MASS MECHANICAL GYROSCOPES Tiimus, K. & Tamre, M.
More informationKeeping Track of Position and Orientation of Moving Indoor Systems by Correlation of Range-Finder Scans
Gerhard Weiß, Christopher Wetzler, Ewald von Puttkamer Paper-Ref-Nr. E / Keeping Track of Position and Orientation of Moving Indoor Systems by Correlation of Range-Finder Scans Gerhard Weiß, Christopher
More informationGPS + Inertial Sensor Fusion
GPS + Inertial Sensor Fusion Senior Project Proposal Aleksey Lykov, William Tarpley, Anton Volkov Advisors: Dr. In Soo Ahn, Dr. Yufeng Lu Date: November 26, 2013 Project Summary The objective of this project
More informationStrapdown inertial navigation technology
Strapdown inertial navigation technology D. H. Titterton and J. L. Weston Peter Peregrinus Ltd. on behalf of the Institution of Electrical Engineers Contents Preface Page xiii 1 Introduction 1 1.1 Navigation
More informationCOARSE LEVELING OF INS ATTITUDE UNDER DYNAMIC TRAJECTORY CONDITIONS. Paul G. Savage Strapdown Associates, Inc.
COARSE LEVELIG OF IS ATTITUDE UDER DYAMIC TRAJECTORY CODITIOS Paul G. Savage Strapdown Associates, Inc. SAI-W-147 www.strapdownassociates.com January 28, 215 ASTRACT Approximate attitude initialization
More information3D Audio Perception System for Humanoid Robots
3D Audio Perception System for Humanoid Robots Norbert Schmitz, Carsten Spranger, Karsten Berns University of Kaiserslautern Robotics Research Laboratory Kaiserslautern D-67655 {nschmitz,berns@informatik.uni-kl.de
More informationElectronics Design Contest 2016 Wearable Controller VLSI Category Participant guidance
Electronics Design Contest 2016 Wearable Controller VLSI Category Participant guidance June 27, 2016 Wearable Controller is a wearable device that can gather data from person that wears it. Those data
More informationGravity compensation in accelerometer measurements for robot navigation on inclined surfaces
Available online at www.sciencedirect.com Procedia Computer Science 6 (211) 413 418 Complex Adaptive Systems, Volume 1 Cihan H. Dagli, Editor in Chief Conference Organized by Missouri University of Science
More informationAutomatic Merging of Lidar Point-Clouds Using Data from Low-Cost GPS/IMU Systems
Utah State University DigitalCommons@USU Electrical and Computer Engineering Faculty Publications Electrical and Computer Engineering 6-1-2011 Automatic Merging of Lidar Point-Clouds Using Data from Low-Cost
More informationTracking a Mobile Robot Position Using Vision and Inertial Sensor
Tracking a Mobile Robot Position Using Vision and Inertial Sensor Francisco Coito, António Eleutério, Stanimir Valtchev, and Fernando Coito Faculdade de Ciências e Tecnologia Universidade Nova de Lisboa,
More informationDevelopment of a Ground Based Cooperating Spacecraft Testbed for Research and Education
DIPARTIMENTO DI INGEGNERIA INDUSTRIALE Development of a Ground Based Cooperating Spacecraft Testbed for Research and Education Mattia Mazzucato, Sergio Tronco, Andrea Valmorbida, Fabio Scibona and Enrico
More informationACE Project Report. December 10, Reid Simmons, Sanjiv Singh Robotics Institute Carnegie Mellon University
ACE Project Report December 10, 2007 Reid Simmons, Sanjiv Singh Robotics Institute Carnegie Mellon University 1. Introduction This report covers the period from September 20, 2007 through December 10,
More informationSlope Traversal Experiments with Slip Compensation Control for Lunar/Planetary Exploration Rover
Slope Traversal Eperiments with Slip Compensation Control for Lunar/Planetary Eploration Rover Genya Ishigami, Keiji Nagatani, and Kazuya Yoshida Abstract This paper presents slope traversal eperiments
More information