Segway RMP Experiments at Georgia Tech


Segway RMP Experiments at Georgia Tech
DARPA MARS Segway Workshop, September 23, 2003
Tom Collins

Major activities
- Interface to MissionLab
- Teleoperation experiments
- Use of laser range finder for terrain characterization

Physical modifications
- Basic operation requires mounting only two components:
  - Ruggedized case containing laptop, power converters, 802.11, camera, and video encoder (on top plate, 4 bolts)
  - Battery (on base, in custom bracket)
- SICK mount for laser experiments
- Optional protective kill switches (slight modification of the UMass design)

Software and Interface
- Uses a Kvaser LapCAN card for the RMP interface
- RMP drivers run as part of the HServer application
- HServer provides a uniform interface across various robot platforms and sensors
- RMP updates are acquired at about 50 Hz
- MissionLab generates executable code for the RMP like any other supported robot
- Interface to the SICK and the encoded video stream is unchanged from our other platforms

Initial experiments
- Work was stalled by the bailment agreement
- Interface completed in the lab during late July, with little actual robot usage
- Video shows the first significant run: teleoperation at Ft. Benning

Lessons learned
- Only one unexplained instance of the robot dying (in the lab, very early in testing)
- Virtually no tipping until we started trying
- Vulnerable to tipping during sudden acceleration on loose soil or gravel
- Hill-climbing capability is limited
- Battery power is impressive for the vehicle size
- Speed and turning radius are better than our other outdoor robots

Ft. Benning runs - MOUT

Ft. Benning runs - Sewer

Ft. Benning runs - Leader/Follower

Ft. Benning runs - Stress testing

Terrain characterization
- Laser rangefinders have been used extensively on robots on or near the ground:
  - Road following
  - Footfall selection
  - Vegetation characterization
  - Localization and/or visualization
- The RMP is at least as vulnerable to discontinuities as a legged robot
- The RMP has a free tilt mechanism
- Seems worthwhile to revisit the terrain issue with the RMP in mind

Geometry of a SICK on the RMP
- First consider ONLY the reading taken directly ahead (azimuth = 0)
- Let (x0, y0, z0) be the sensed point on the ground in egocentric coordinates
- Angles and distances are defined as in the figure
- Then:
  x0 = 0
  y0 = l sin(ρ) - d cos(ρ) + r0 cos(α + ρ)
  z0 = w + l cos(ρ) - d sin(ρ) - r0 sin(α + ρ)
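As a rough illustration, here is a minimal Python sketch of that single-beam calculation, assuming the sign conventions written above (the function and symbol names are ours, not HServer's):

```python
import math

def ground_point_ahead(r0, rho, alpha, l, d, w):
    """Sensed ground point for the straight-ahead (azimuth = 0) SICK beam,
    in egocentric coordinates.

    r0    -- laser range reading (m)
    rho   -- RMP pitch angle (rad)
    alpha -- downward mounting angle of the SICK relative to the body (rad)
    l, d  -- lever arm and offset of the SICK from the wheel axle (m)
    w     -- wheel axle height above the ground (m)
    """
    x0 = 0.0
    y0 = l * math.sin(rho) - d * math.cos(rho) + r0 * math.cos(alpha + rho)
    z0 = w + l * math.cos(rho) - d * math.sin(rho) - r0 * math.sin(alpha + rho)
    return x0, y0, z0
```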

RMP pitch behavior
- Data taken directly from the RMP pitch sensor
- Shows the tip needed to move across fairly level ground
- (Plot annotations: average tilt, initiates motion, stops)

Closer look at pitch
- Gross control operates at ~0.15 Hz, varying with payload, etc.
- Maintains speed?
- Fine control operates at about 1 Hz
- Hopefully, all of these effects can be made to disappear in the range readings

Raw laser range readings
- Still considering only the single reading straight ahead (and tilted down)
- Data taken during the same maneuver as the previous pitch data, during the active movement phase
- Note the appearance of the same frequencies
- More apparent when scaled sin(pitch) is superimposed (inset)
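One way to see why the raw reading should track the pitch: setting z0 = 0 in the geometry above gives the range the straight-ahead beam would return on perfectly flat ground, r0 = (w + l cos(ρ) - d sin(ρ)) / sin(α + ρ), which rises and falls with the pitch angle. A small sketch using the same assumed symbols:

```python
import math

def flat_ground_range(rho, alpha, l, d, w):
    """Range the straight-ahead beam would return on perfectly flat ground
    (z0 = 0), using the same assumed geometry as above. Illustrates how the
    raw reading oscillates with the pitch angle rho."""
    return (w + l * math.cos(rho) - d * math.sin(rho)) / math.sin(alpha + rho)
```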

Corrected laser range readings
- Apply the correction for z0
- Plotted along with the raw range data for comparison
- Since this was fairly level ground, the plot should stay near zero
- It actually approximates Δz relative to z at the wheel
- Deviation from zero is mostly due to minor slopes
- z0 = w + l cos(ρ) - d sin(ρ) - r0 sin(α + ρ)

Check for latency issues
- Pitch and range are acquired from different devices
- Timestamps are applied at the computer running HServer
- Latency characteristics of the RMP are unknown
- So some high-frequency errors may be due to picking the wrong pitch data
- Graph shows that the best choice of pitch data is the most recent sample at the time of laser scan completion
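A minimal sketch of that pairing rule, assuming both streams carry host-applied timestamps (the function and argument names are illustrative, not HServer's):

```python
import bisect

def pitch_for_scan(scan_time, pitch_times, pitch_values):
    """Pick the most recent pitch sample at or before the laser scan
    completion time. pitch_times must be sorted ascending; both streams
    are assumed to use timestamps applied on the host computer."""
    i = bisect.bisect_right(pitch_times, scan_time) - 1
    if i < 0:
        raise ValueError("no pitch sample precedes the laser scan")
    return pitch_values[i]
```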

3D terrain
- Use data for all azimuths, not just straight ahead
- Calculate x and y for every point
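Extending the single-beam geometry to the full scan only adds a lateral term; a minimal sketch under the same assumed sign conventions, with the beam azimuth theta measured from straight ahead and the scan plane rotated about the lateral axis:

```python
import math

def scan_to_points(ranges, azimuths, rho, alpha, l, d, w):
    """Convert one tilted SICK scan into egocentric 3D points.

    Uses the same (assumed) symbols as the single-beam case: rho is pitch,
    alpha the downward mounting angle, l/d the sensor lever arm and offset,
    w the axle height. theta = 0 is the straight-ahead beam."""
    y_s = l * math.sin(rho) - d * math.cos(rho)       # sensor forward offset
    z_s = w + l * math.cos(rho) - d * math.sin(rho)   # sensor height
    points = []
    for r, theta in zip(ranges, azimuths):
        x = r * math.sin(theta)                                  # lateral
        y = y_s + r * math.cos(theta) * math.cos(alpha + rho)    # forward
        z = z_s - r * math.cos(theta) * math.sin(alpha + rho)    # height
        points.append((x, y, z))
    return points
```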

Terrain and obstacles

More visualizations

How this data can be used
- Autonomous operation
  - Reactive sensing of terrain considerations (perceptual schemas)
  - Without even attempting to register data in a larger world map, it provides:
    - Local positive and negative obstacles smaller than the wheel width
    - Sideslope of the path ahead (the RMP is vulnerable to side tipping)
    - Fore/aft slope
  - All of these features can be expressed as simple avoidance vectors (see the sketch below)
- Teleoperation
  - Visualization of terrain (can be displayed alongside the visual image)
  - Could be processed to produce simple operator cues (possibly generated by the same perceptual schemas above)
  - Low-light operation
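As one way to make the "avoidance vector" idea concrete, here is a toy sketch that fits a plane to the local egocentric ground points and returns an uphill-pointing vector when the fitted slope is steep. The plane-fitting approach and all names are our illustration, not the MissionLab perceptual schemas themselves:

```python
import numpy as np

def ground_slope_schema(points, slope_limit=0.2):
    """Toy slope-avoidance sketch (illustrative only).

    Fits a plane z = a*x + b*y + c to local ground points (x, y, z) in
    egocentric coordinates. When the fitted slope exceeds slope_limit,
    returns a unit vector pointing uphill that a reactive controller could
    blend with its other motion vectors; otherwise returns the zero vector."""
    pts = np.asarray(points, dtype=float)
    A = np.c_[pts[:, 0], pts[:, 1], np.ones(len(pts))]
    (a, b, _), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    grad = np.array([a, b])              # gradient of the fitted plane
    steepness = np.linalg.norm(grad)     # rise per unit horizontal distance
    if steepness > slope_limit:
        return grad / steepness          # unit vector pointing uphill
    return np.zeros(2)
```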

Future work
- Full autonomous operation
- Multiagent experiments
  - Heterogeneous, with existing GT robots
  - Homogeneous, with the MARS 2020 team
- Use SICK intensity information to display surface brightness (in IR)
- Estimation of surface material (sand, gravel, other difficult surfaces)