Intern Presentation:


Gripper Stereo and Assisted Teleoperation. Stanford University, December 13, 2010

Outline 1. Introduction 2. Hardware 3. Research 4. Packages 5. Conclusion

Motivation (1/2) Why use a gripper stereo sensor? It is easy to obtain good viewpoints. Object modeling: data from multiple perspectives, and potentially less noise thanks to the short sensing distance. Exploration: see in and around occluding objects, at low cost because the base does not move.

Motivation (2/2) Why use a gripper stereo sensor? There is no kinematic chain between sensor and end-effector. With an uncalibrated URDF, head stereo and gripper stereo differ by more than 3 cm, yet grasping based on gripper stereo works fine.

Outline 1. Introduction 2. Hardware 3. Research 4. Packages 5. Conclusion

Small-Baseline Stereo (1/3) The off-the-shelf approach: custom mounting of Logitech C905 webcams. Rolling shutter, not synchronized. Baseline = 25 mm. FOV 65°. Minimum distance 8 cm. 640x480 resolution.

Small-Baseline Stereo (2/3) The custom camera approach: a custom stereo camera made from WGE100 cameras, functionally identical to the PR2 head stereo. Synchronized, global (instantaneous) shutter. Baseline = 30 mm. Wide-angle, FOV 90°. Minimum distance 10 cm. 640x480 resolution.

Small-Baseline Stereo (3/3) How do they compare? There is a trade-off between FOV and depth noise. The rolling shutter is fine for static scenes only. The narrower angle suits autonomy; the wider angle suits teleoperation.
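The FOV/depth-noise trade-off can be made concrete with the standard pinhole stereo error model. This sketch is not from the slides: the 0.25 px disparity noise and the focal lengths derived from FOV are illustrative assumptions.

```python
import math

def depth_sigma(z_m, fov_deg, width_px, baseline_m, disp_sigma_px=0.25):
    """Approximate depth uncertainty of a stereo pair at range z_m.

    Uses the pinhole relation z = f*b/d, so a disparity error sigma_d
    maps to a depth error of roughly z^2 * sigma_d / (f * b).
    """
    # Focal length in pixels implied by the horizontal field of view.
    f_px = (width_px / 2.0) / math.tan(math.radians(fov_deg) / 2.0)
    return z_m ** 2 * disp_sigma_px / (f_px * baseline_m)

# The two gripper-stereo candidates from the slides, at 1 m range:
webcam = depth_sigma(1.0, fov_deg=65.0, width_px=640, baseline_m=0.025)
wge100 = depth_sigma(1.0, fov_deg=90.0, width_px=640, baseline_m=0.030)
```

Under these assumptions the 65° / 25 mm webcam pair comes out slightly less noisy in depth at 1 m than the 90° / 30 mm WGE100 pair, because the wider lens spreads the same 640 pixels over a larger angle, which is one way to see the trade-off the slide names.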

Hardware Calibration Stereo camera calibration is natively supported in ROS. pr2_calibration was modified to calibrate the gripper stereo frame location on the PR2 model.

Outline 1. Introduction 2. Hardware 3. Research 4. Packages 5. Conclusion

Grasp Adjustment - Overview Developed a novel grasp adjustment method for cluttered environments: a versatile, feature-based local point cloud grasp planner, ideal for a co-oriented sensor and gripper. Published at ISER 2010.

Grasp Adjustment - Evaluation Grasp quality is computed from features of the local points. Distance to centroid: pulls the gripper into the object. Points in collision: penalizes collisions. Symmetry: favors an even distribution of points in the vicinity of the gripper. Normals: favors point normals aligned with the gripper frame axes.
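A feature-based cost of this kind can be sketched as below. This is a toy illustration with made-up weights and gripper box dimensions, not the published planner; only the four feature families are taken from the slide.

```python
import numpy as np

def grasp_cost(points, normals, pose_t, pose_R,
               palm_depth=0.06, finger_halfwidth=0.05):
    """Toy cost for a candidate gripper pose over a local point cloud.

    points, normals: (N, 3) arrays in the world frame.
    pose_t, pose_R : gripper position and rotation (columns = gripper axes).
    Lower cost = better grasp. Weights and geometry are illustrative.
    """
    local = (points - pose_t) @ pose_R           # points in gripper frame
    # 1. Distance to centroid: pull the gripper toward the mass of points.
    centroid_term = np.linalg.norm(local.mean(axis=0))
    # 2. Points in collision: count points inside a box behind the palm.
    in_palm = (local[:, 0] < -palm_depth) & (np.abs(local[:, 1]) < finger_halfwidth)
    collision_term = float(in_palm.sum())
    # 3. Symmetry: penalize uneven mass between the two fingers (y axis).
    symmetry_term = abs(local[:, 1].mean())
    # 4. Normals: prefer normals aligned with the gripper closing axis (y).
    align = np.abs((normals @ pose_R)[:, 1])
    normal_term = 1.0 - align.mean()
    return centroid_term + 10.0 * collision_term + symmetry_term + normal_term
```

A pose centered on a small point cluster scores lower than the same pose shifted away from it, which is the behavior the centroid and collision terms are meant to produce.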

Grasp Adjustment - Search Seed poses are chosen from a grid centered at the input pose. A brief gradient search corrects unlucky seed poses. All poses are ranked according to cost, and the best poses are selected for full optimization.

Assisted Teleoperation "So... am I helping the robot, or is it helping me?" Developed joystick teleoperation of the PR2 arms with grasp assistance. Multiple control perspectives aid maneuvering. Preset shortcuts for gripper orientations aid positioning. Grasp adjustment is used to suggest grasp poses. Rviz is used for viewing cameras and the virtual scene.

Outline 1. Introduction 2. Hardware 3. Research 4. Packages 5. Conclusion

pr2_gripper_stereo Package for managing and using gripper cameras. Launch files for calibration, stereo processing. URDF and meshes for modified hardware. Proof-of-concept nodes for point-cloud concatenation.

pr2_gripper_grasp_adjust Package for running the grasp adjust server. The service uses the GripperGraspAdjust service message. Could be expanded into a general grasp planner. NOT dependent on gripper stereo.

pr2_remote_teleop Package enabling full teleoperation of the PR2. Head and base control is similar to the pr2_teleop package. Grippers are controlled using a Cartesian J-inverse controller. Spacenav and PS3 controllers are supported by default. A config file supports button remapping on any joystick. Useful as a local pr2_teleop replacement, or for remote operation.
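A "Cartesian J-inverse" controller maps a commanded end-effector twist to joint velocities through a (pseudo-)inverse of the arm Jacobian. A minimal damped-least-squares sketch of one such step, illustrative rather than the actual PR2 controller code:

```python
import numpy as np

def cartesian_step(J, twist, q, dt, damping=1e-2):
    """One resolved-rate step: joint update for a commanded twist.

    J     : 6 x n manipulator Jacobian at configuration q
    twist : desired end-effector velocity [vx, vy, vz, wx, wy, wz]
    Damped least squares, dq = J^T (J J^T + lambda*I)^-1 * twist,
    keeps the step well-behaved near singular configurations.
    """
    J = np.asarray(J, dtype=float)
    dq = J.T @ np.linalg.solve(J @ J.T + damping * np.eye(6), twist)
    return q + dt * dq
```

Driven from a joystick, the commanded twist would come from the stick axes each cycle; the damping term trades a little tracking accuracy for bounded joint speeds near singularities, which matters when a human is in the loop.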

What's Next? Object modeling using gripper stereo: multiple-cloud registration, surface smoothing, integration with the octomap framework. Human-in-the-loop robotics: find strategies for human-robot multi-tasking. Robot: grasp assist, collision avoidance, object tracking. Human: object identification, edge-case recovery. Teleoperation would greatly benefit from RVE / Rviz2.

Conclusion Gripper stereo provides valuable new viewpoint. Grasp adjustment works well on cluttered scenes. Remote operation is ready for added levels of autonomy.

The Final Slide Acknowledgments: Kaijen Hsiao (WG mentor), Blaise Gassend (much camera help), Kenneth Salisbury (PhD advisor). Questions?