Segway RMP Experiments at Georgia Tech
DARPA MARS Segway Workshop, September 23, 2003
Tom Collins
Major activities
- Interface to MissionLab
- Teleoperation experiments
- Use of laser range finder for terrain characterization
Physical modifications
- Basic operation requires mounting of only two components:
  - Ruggedized case containing laptop, power converters, 802.11, camera, and video encoder (on top plate, 4 bolts)
  - Battery (on base, in custom bracket)
- SICK mount for laser experiments
- Optional protective kill switches (slight modification of the UMass design)
Software and interface
- Uses a Kvaser LapCAN card for the RMP interface
- RMP drivers run as part of the HServer application
- HServer provides a uniform interface across various robot platforms and sensors
- RMP updates are acquired at about 50 Hz
- MissionLab generates executable code for the RMP like any other supported robot
- Interface to the SICK and the encoded video stream is unchanged from our other platforms
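The uniform-interface idea can be sketched as an abstract driver class. This is illustrative only: HServer's real API is not shown in these slides, and `RMPDriver` is a hypothetical stand-in for the CAN-based backend.

```python
import abc

class RobotDriver(abc.ABC):
    """Uniform robot interface in the spirit of HServer (a sketch;
    the real HServer API is not given in the slides)."""

    @abc.abstractmethod
    def get_pose(self):
        """Return (x, y, heading) for the platform."""

    @abc.abstractmethod
    def set_velocity(self, linear, angular):
        """Command base velocities."""

class RMPDriver(RobotDriver):
    """Hypothetical RMP backend; the real driver talks CAN through a
    Kvaser LapCAN card, with updates arriving at about 50 Hz."""

    def __init__(self):
        self._pose = (0.0, 0.0, 0.0)
        self._last_cmd = (0.0, 0.0)

    def get_pose(self):
        # On hardware, this would return the latest 50 Hz CAN update.
        return self._pose

    def set_velocity(self, linear, angular):
        # On hardware, this would be translated into CAN messages.
        self._last_cmd = (linear, angular)
```

Mission code written against `RobotDriver` is then independent of which platform (RMP or otherwise) is underneath, which is the property MissionLab relies on.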
Initial experiments
- Work was stalled by the bailment agreement
- Interface completed in the lab during late July, with little actual robot usage
- Video shows the first significant run: teleoperation at Ft. Benning
Lessons learned
- Only one unexplained instance of the robot dying (in lab, very early in testing)
- Virtually no tipping until we started trying
- Vulnerable to tipping during sudden acceleration in loose soil or gravel
- Hill-climbing capabilities are limited
- Battery power is impressive for the vehicle size
- Speed and turning radius are better than our other outdoor robots
Ft. Benning runs: MOUT
Ft. Benning runs: Sewer
Ft. Benning runs: Leader/Follower
Ft. Benning runs: Stress testing
Terrain characterization
- Laser rangefinders have been used extensively on robots on or near the ground:
  - Road following
  - Footfall selection
  - Vegetation characterization
  - Localization and/or visualization
- The RMP is at least as vulnerable to discontinuities as a legged robot
- The RMP has a free tilt mechanism
- It seems worthwhile to revisit the terrain issue with the RMP in mind
Geometry of a SICK on the RMP
- First consider ONLY the reading taken directly ahead (azimuth = 0)
- Let (x0, y0, z0) be the sensed point on the ground in egocentric coordinates
- Let angles and distances be defined as in the figure
- Then:
    x0 = 0
    y0 = l sin(ρ) + d cos(ρ) + r0 cos(α + ρ)
    z0 = w + l cos(ρ) - d sin(ρ) - r0 sin(α + ρ)
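The straight-ahead formulas can be coded directly. The mount parameters below (wheel radius w, axle-to-laser distance l, forward offset d, laser tilt α) are illustrative assumptions; the slides define the symbols only in the figure, not their values.

```python
import math

# Hypothetical mount parameters (meters / radians); the actual
# RMP/SICK values are not given in the slides.
W = 0.24                     # wheel radius (w)
L = 0.50                     # distance from axle to laser along body axis (l)
D = 0.10                     # forward offset of the laser (d)
ALPHA = math.radians(15.0)   # downward tilt of the laser beam (alpha)

def ground_point(r0, rho):
    """Egocentric (x0, y0, z0) of the straight-ahead (azimuth = 0) return.

    r0  -- raw laser range reading (m)
    rho -- RMP pitch angle (rad), positive tipped forward
    """
    x0 = 0.0
    y0 = L * math.sin(rho) + D * math.cos(rho) + r0 * math.cos(ALPHA + rho)
    z0 = W + L * math.cos(rho) - D * math.sin(rho) - r0 * math.sin(ALPHA + rho)
    return x0, y0, z0
```

On level ground the beam should hit at z0 ≈ 0, which gives a quick sanity check: with these parameters, a range of about (w + l) / sin(α) should come back with near-zero height.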
RMP pitch behavior
- Data taken directly from the RMP pitch sensor
- Shows the tip needed to move across fairly level ground
- (Plot annotations: average tilt; initiates motion; stops)
Closer look at pitch
- Gross control operates at ~0.15 Hz, varying with payload, etc. (maintains speed?)
- Fine control operates at about 1 Hz
- Hopefully, all of these effects can be made to disappear in the range readings
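One way to confirm the ~0.15 Hz and ~1 Hz components is a simple FFT of the pitch trace. The sketch below uses a synthetic signal with those two frequencies (the recorded pitch data is not reproduced here), sampled at the RMP's ~50 Hz update rate.

```python
import numpy as np

FS = 50.0                        # RMP updates arrive at about 50 Hz
t = np.arange(0, 60, 1.0 / FS)   # 60 s of synthetic pitch data

# Synthetic pitch trace: a large "gross control" oscillation at 0.15 Hz
# plus a smaller "fine control" component at 1 Hz (illustrative only).
pitch = 2.0 * np.sin(2 * np.pi * 0.15 * t) + 0.5 * np.sin(2 * np.pi * 1.0 * t)

spectrum = np.abs(np.fft.rfft(pitch))
freqs = np.fft.rfftfreq(len(pitch), d=1.0 / FS)
dominant = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
```

With real data, peaks near 0.15 Hz and 1 Hz would corroborate the gross/fine control interpretation; their absence after correction would show the pitch effects have been removed from the range readings.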
Raw laser range readings
- Still considering only the single reading straight ahead (and tilted down)
- Data taken during the same maneuver as the previous pitch data, during the active movement phase
- Note the appearance of the same frequencies
- More apparent when scaled sin(pitch) is superimposed (inset)
Corrected laser range readings
- Apply the correction for z0:
    z0 = w + l cos(ρ) - d sin(ρ) - r0 sin(α + ρ)
- Plotted along with the raw range data for comparison
- Since this was fairly level ground, the plot should stay near zero
- It actually approximates Δz relative to z at the wheel
- Deviation from zero is mostly due to minor slopes
Check for latency issues
- Pitch and range are acquired from different devices
- Timestamps are applied at the computer running HServer
- Latency characteristics of the RMP are unknown
- So, some high-frequency errors may be due to picking the wrong pitch data
- Graph shows that the best choice of pitch data is the most recent at the time of laser scan completion
3D terrain
- Use data for all azimuths, not just straight ahead
- Calculate x and y for every point
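The azimuth = 0 formulas can be extended across a full scan. The per-azimuth decomposition below (lateral component r sin θ, in-plane forward component r cos θ) is an assumption, since the slides give only the straight-ahead case; the mount parameters are the same illustrative values as before.

```python
import math

# Illustrative mount parameters (same assumptions as the azimuth = 0 case).
W, L, D = 0.24, 0.50, 0.10
ALPHA = math.radians(15.0)

def scan_to_points(ranges, rho, fov_deg=180.0):
    """Convert one SICK scan into egocentric 3D points.

    ranges  -- range readings across the scan, left to right (m)
    rho     -- RMP pitch at scan completion (rad)
    A sketch: each beam is split into a lateral part and a part in the
    forward (tilted, pitched) plane, which then reuses the azimuth = 0
    formulas.
    """
    n = len(ranges)
    points = []
    for k, r in enumerate(ranges):
        theta = math.radians(-fov_deg / 2 + k * fov_deg / (n - 1))
        fwd = r * math.cos(theta)   # component in the forward plane
        x = r * math.sin(theta)     # lateral component
        y = L * math.sin(rho) + D * math.cos(rho) + fwd * math.cos(ALPHA + rho)
        z = W + L * math.cos(rho) - D * math.sin(rho) - fwd * math.sin(ALPHA + rho)
        points.append((x, y, z))
    return points
```

The center beam of the scan (θ = 0) reduces exactly to the straight-ahead case, which provides a consistency check between the two computations.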
Terrain and obstacles
More visualizations
How this data can be used
- Autonomous operation
  - Reactive sensing of terrain considerations (perceptual schemas)
  - Without even attempting to register data in a larger world map, it provides:
    - Local positive and negative obstacles smaller than wheel width
    - Side-slope of the path ahead (the RMP is vulnerable to side tipping)
    - Fore/aft slope
  - All of these features can be expressed as simple avoidance vectors
- Teleoperation
  - Visualization of terrain (can be displayed alongside the visual image)
  - Could be processed to produce simple operator cues (possibly generated by the same perceptual schemas above)
  - Low-light operation
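As one example of turning a terrain feature into an avoidance vector, the side-slope of the path ahead can be fit from a single corrected scan. This is purely illustrative (the slides state only that such features can be expressed as simple avoidance vectors, not how); the least-squares fit and the uphill-steering convention are assumptions.

```python
def side_slope_vector(points, gain=1.0, max_slope=0.3):
    """Reactive avoidance vector from the side-slope of the path ahead.

    points -- list of (x, y, z) terrain points from one corrected scan
    Fits the lateral slope dz/dx by least squares and returns an (x, y)
    vector steering toward the uphill side, saturated at max_slope,
    in the spirit of a perceptual schema output.
    """
    xs = [p[0] for p in points]
    zs = [p[2] for p in points]
    n = len(points)
    mean_x = sum(xs) / n
    mean_z = sum(zs) / n
    num = sum((x - mean_x) * (z - mean_z) for x, z in zip(xs, zs))
    den = sum((x - mean_x) ** 2 for x in xs) or 1.0
    slope = num / den                       # lateral slope dz/dx
    mag = gain * min(abs(slope), max_slope)
    return (mag if slope > 0 else -mag, 0.0)
```

Fore/aft slope and local obstacle vectors could be produced the same way, and all of them summed with the other behaviors' outputs in the usual schema-based fashion.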
Future work
- Full autonomous operation
- Multiagent experiments
  - Heterogeneous with existing GT robots
  - Homogeneous with the MARS 2020 team
- Use SICK intensity information to display surface brightness (in IR)
- Estimation of surface material (sand, gravel, other difficult surfaces)