Final Grant Report Grant/Contract Number: NNJ04HI11A
1 Final Grant Report Grant/Contract Number: NNJ04HI11A Augmented Reality, Path Planning, and Physically-based Modeling for Remote Robotic Operations Abhilash Pandya (PI), Keshav Chintamani, R. Darin Ellis, Chin-An Tan, Alex Cao Wayne State University Kenneth Baker/ ER4, Chiun-Hong Chien / ER4, James C. Maida/ SF3, Barbara Woolford (COTR) NASA Johnson Space Center apandya@ece.eng.wayne.edu June/2008
2 Intro This is an interactive presentation with movies linked from an external website. When you see this symbol, click on the movie link next to it. This presentation was given to the Robotics and Human Factors community at Johnson Space Center, to Astronaut Nancy Currie, and to the head of ISS robotics, Dr. Shakeel Razvi. 2
3 Related Robotics Projects This clip shows our lab's related robotics projects in: Medical Robotics Military Robotics Space Robotics 3 Video: Related Robotics Projects
4 Previous NASA grant work in AR for Medical Application Video Shows AR models overlaid on a phantom skull Trajectory information overlaid Danger zones blinking Coordinate axis displayed Video: Link to movie 4
5 Background & Significance The astronaut controls the robot by manipulating the end-effector with joysticks Visual feedback is provided by a camera mounted on the end-effector and other peripheral cameras The astronaut depends on the video views to manipulate the robot This can be error-prone due to oblique views Difficult to simulate space dynamics on earth 5
6 Main Research Questions Can Augmented Reality (AR) cues improve user performance? Can optimal path planning help? Can we train with virtual AR robots /environments? Can dynamic environments be used for training? Are there better ways to control /select camera views? 6
7 Outline Camera control issues (pan/tilt/zoom) AR navigation cues Path planning Dynamic simulation for training Conclusions Future work 7
8 Preview The Video Shows: Video: A virtual robot registered to an actual robot. The virtual robot can be detached from the actual robot and manipulated Once the final position is reached, the actual robot can be triggered to follow the virtual robot's movements. The research that follows details aspects of this work. Link to Movie 8
9 Tele-robotic Test Bed (Hardware) This is the test bed setup at Wayne State University. Top, remote user console with joysticks and camera views. Bottom, robotic test bed with cameras 9
10 Tele-robotic Test Bed (Software) User Input 10
11 Where We Left Off Last Our software for axis augmentation has been implemented on the Dexterous Manipulator Trainer at NASA/JSC The joysticks were color-coded AR coordinate frames are displayed at the end-effector in multiple camera views 11
12 Camera control issues (pan/tilt/zoom) AR navigation cues Path planning Dynamic simulation for training Conclusions Future work CAMERA CONTROL ISSUES (PAN/TILT/ZOOM) 12
13 Camera Model Extrinsic Camera Parameters: Change with pan/tilt (6 Parameters) Translation component Rotation component Intrinsic Camera Parameters: Change with zoom (9 Parameters) Focal length Principal point Distortion 13 Camera Control Issues
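The split between extrinsic and intrinsic parameters above can be sketched with a minimal pinhole projection. This is an illustrative sketch, not the report's code; all numeric values and the function name are made up, and the distortion terms (the remaining intrinsics) are omitted.

```python
import numpy as np

def project_point(p_world, R, t, fx, fy, cx, cy):
    """Project a 3D world point to pixel coordinates with a pinhole model.

    R, t   -- extrinsic rotation (3x3) and translation (3,): change with pan/tilt.
    fx, fy -- focal lengths; cx, cy -- principal point: change with zoom.
    """
    x, y, z = R @ np.asarray(p_world, float) + t   # world -> camera frame (extrinsics)
    return fx * x / z + cx, fy * y / z + cy        # perspective divide + intrinsics

# Identity pose; a point 0.5 m right, 0.25 m up, 2 m in front of the camera:
u, v = project_point([0.5, 0.25, 2.0], np.eye(3), np.zeros(3),
                     fx=800.0, fy=800.0, cx=320.0, cy=240.0)
print(u, v)  # 520.0 340.0
```

Keeping the AR overlay registered means re-evaluating exactly these two stages: R and t whenever the camera pans or tilts, and fx, fy, cx, cy (plus distortion) whenever it zooms.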
14 Changing Camera Extrinsics: Pan/Tilt Allow AR with pan / tilt of camera Allow camera to track end-effector or other objects Optimize camera views 14 Camera Control Issues
15 Camera Movement with AR Video clip shows The ability to pan and tilt the camera and maintain the AR scene. The ability to auto-track the end effector of the robot. Video: Link to Movie 15 Camera Control Issues
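Auto-tracking the end-effector reduces, at its core, to solving for the pan/tilt angles that aim the optical axis at a target point. A sketch under an assumed frame convention (x right, y up, z along the optical axis at zero pan/tilt; not necessarily the report's convention):

```python
import numpy as np

def aim_angles(target):
    """Pan/tilt angles that point the optical (z) axis at `target`,
    expressed in the pan/tilt unit's base frame."""
    x, y, z = target
    pan  = np.arctan2(x, z)               # rotate about the vertical axis first
    tilt = np.arctan2(y, np.hypot(x, z))  # then elevate toward the target
    return pan, tilt

# Target 1 m to the right and 1 m ahead, at camera height -> 45 degree pan, no tilt:
pan, tilt = aim_angles(np.array([1.0, 0.0, 1.0]))
```

Re-running this each frame with the end-effector's estimated position, and re-deriving the extrinsic rotation from the commanded pan/tilt, is what lets the AR scene stay registered while the camera moves.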
16 Zoom: Changing the Intrinsic Camera Parameters Zoom level vs. parameter: calibrated at 18 zoom levels, with fits for the 9 intrinsic parameters 16 Camera Control Issues
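One plausible way to obtain continuous-zoom intrinsics from discrete calibrations, as the slide describes: calibrate at a set of zoom levels, then fit each parameter as a smooth function of zoom. The data, fit order, and parameter shown below are illustrative (the report fit all 9 intrinsics over 18 levels):

```python
import numpy as np

# Hypothetical calibration data: focal length (pixels) measured at a few
# discrete zoom levels. Only one parameter and 6 levels are shown here.
zoom_levels = np.array([0.0, 3.0, 6.0, 9.0, 12.0, 15.0])
focal_px    = np.array([800.0, 950.0, 1130.0, 1360.0, 1650.0, 2010.0])

# Fit a low-order polynomial so the parameter can be evaluated at ANY zoom.
coeffs = np.polyfit(zoom_levels, focal_px, deg=2)

def focal_at(zoom):
    """Interpolated focal length at an arbitrary (possibly uncalibrated) zoom."""
    return float(np.polyval(coeffs, zoom))

print(f"focal length at zoom 7.5 ~ {focal_at(7.5):.0f} px")
```

Evaluating the fitted curves at the current zoom setting is what makes the AR overlay survive continuous zooming between calibrated levels.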
17 Updating Zoom Parameters Video Shows The ability to zoom a camera within the AR scene. The effects of intrinsic parameter updates. Video: Link to Movie 17 Camera Control Issues
18 Camera control issues (pan/tilt/zoom) Augmented reality navigation cues Path planning Dynamic simulation for training Conclusions Future work AUGMENTED REALITY NAVIGATION CUES 18
19 Robot Navigation using Augmented Coordinates EE-Hand controller misalignments Hinder performance Errors in direction inputs Proposed Method Simple hand controller to EE axes mapping using AR Cues for ORU/receptacle alignment 19 Navigation Cues
20 Methods Each HC axis: equipped with color labels Indicate direction (+/-) Translation Rotation End-effector axes: Coordinate system with axes in RGB Additional orientation cues 3D cones for direction and depth Worksite axes Stationary coordinates in RGB Variation in graphics to prevent confusion 20 Navigation Cues
21 Navigation using Augmented Coordinates Video Shows The use of an augmented coordinate system for navigation The user has color-coded joysticks corresponding to the AR cues. She manipulates the end effector and aligns the coordinate frames for insertion. Video: Link to Movie 21 Navigation Cues
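The core of the axis-mapping cue above can be illustrated as rotating a hand-controller command, entered along the color-coded end-effector axes, into the frame the robot executes in. The frame convention and names here are assumptions for illustration, not the report's implementation:

```python
import numpy as np

def hc_to_world(v_hc, R_world_ee):
    """Rotate a hand-controller translation command, given along the
    end-effector's augmented axes, into the world frame for execution."""
    return R_world_ee @ np.asarray(v_hc, float)

# End-effector frame yawed 90 degrees about world z: pushing the stick along
# the EE's red (+x) axis should move the tool along world +y.
R_world_ee = np.array([[0.0, -1.0, 0.0],
                       [1.0,  0.0, 0.0],
                       [0.0,  0.0, 1.0]])
v_world = hc_to_world([1.0, 0.0, 0.0], R_world_ee)
print(v_world)  # [0. 1. 0.]
```

Without such a mapping (or the AR cue that makes it visible), the operator must mentally rotate every input, which is the misalignment error the slide attributes degraded performance to.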
22 Results Path Deviation (p<0.007) 22 Navigation Cues
23 Results Experiment 1: within-subject, experienced users (12); conditions: NO AR, AR. Experiment 2: between-groups, experienced users (6 per group); conditions: NO AR, AR. Charts compare path distance (cm) and reversal errors across conditions: path distance (p<0.005), reversal errors (p<0.003). 23 Navigation Cues
24 Extensions to Augmented Waypoints Waypoints can be used to break the task into subtasks Waypoint (WP) Location along a collision free path Orientation/position directed towards goal Going from WP i to WP i+1 1 rotation 1 translation 24 Navigation Cues
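The "1 rotation, 1 translation" step between consecutive waypoints can be sketched as computing the relative pose between two waypoint frames. The function name and example poses below are illustrative:

```python
import numpy as np

def relative_motion(R_i, p_i, R_j, p_j):
    """Decompose the move from waypoint i to waypoint j into one rotation and
    one translation, both expressed in waypoint i's frame."""
    R_rel = R_i.T @ R_j           # single rotation taking frame i onto frame j
    t_rel = R_i.T @ (p_j - p_i)   # single translation to j, seen from frame i
    return R_rel, t_rel

# Waypoint i at the origin; waypoint j 1 m along x, yawed 90 degrees about z:
R_j = np.array([[0.0, -1.0, 0.0],
                [1.0,  0.0, 0.0],
                [0.0,  0.0, 1.0]])
R_rel, t_rel = relative_motion(np.eye(3), np.zeros(3), R_j,
                               np.array([1.0, 0.0, 0.0]))
print(np.round(t_rel, 3))  # [1. 0. 0.]
```

Displaying R_rel and t_rel as AR cues at the current waypoint tells the operator exactly which single rotation and translation reach the next one.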
25 Augmented Waypoints Video Shows A system that places a waypoint coordinate system for the user Once the user achieves the necessary tolerance, the next waypoint is displayed, until the payload is inserted. Video: Link to Movie 25 Navigation Cues
26 Operator Trajectories AR without Waypoints Trial 1 Trial 2 Trial 3 AR with Waypoints 26 Navigation Cues
27 Limitations Waypoints were pre-defined Placing waypoints manually is a cumbersome process Tolerances led to deviations with waypoint progression Displays get cluttered Visual collision checks only Real-time path generation is required 27 Navigation Cues
28 Camera control issues (pan/tilt/zoom) Augmented reality navigation cues Path planning Dynamic simulation for training Conclusions Future work PATH PLANNING 28
29 Augmented Collision-free GPS Paths Navigation support in complex cluttered environments Operator can preview Collision free paths Robot configurations required to reach goal Assist mission planning with subtasks Uncluttered AR cues to help user guide the robot along the path 29 Path Planning
30 Collision-free Paths from Probabilistic Searches Video shows Probabilistic Roadmap (PRM) used to find paths that are Collision free Kinematically comfortable Assumptions Environment geometry is known Immobile obstacles Can be extended to moving obstacles Video: Link to Movie 30 Path Planning
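A minimal PRM for a 2D point robot among disc obstacles, sketching the sample/connect/search pipeline. Everything here (workspace size, obstacle, parameters, helper names) is illustrative, not the report's implementation, which planned for a full manipulator:

```python
import heapq
from itertools import combinations
import numpy as np

rng = np.random.default_rng(0)
obstacles = [((5.0, 5.0), 2.0)]           # (center, radius) discs in a 10x10 workspace

def collision_free(p, q, steps=20):
    """Check the straight segment p->q against all obstacles by sampling."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    for s in np.linspace(0.0, 1.0, steps):
        x = (1 - s) * p + s * q
        for c, r in obstacles:
            if np.linalg.norm(x - c) < r:
                return False
    return True

def prm(start, goal, n_samples=150, radius=3.0):
    # 1) Sample free configurations (start and goal become nodes 0 and 1).
    nodes = [np.asarray(start, float), np.asarray(goal, float)]
    nodes += [rng.uniform(0.0, 10.0, 2) for _ in range(n_samples)]
    nodes = [n for n in nodes
             if all(np.linalg.norm(n - c) >= r for c, r in obstacles)]
    # 2) Connect nearby pairs with collision-free straight edges.
    graph = {i: [] for i in range(len(nodes))}
    for i, j in combinations(range(len(nodes)), 2):
        d = np.linalg.norm(nodes[i] - nodes[j])
        if d < radius and collision_free(nodes[i], nodes[j]):
            graph[i].append((j, d))
            graph[j].append((i, d))
    # 3) Dijkstra search over the roadmap from start (0) to goal (1).
    dist, prev, pq = {0: 0.0}, {}, [(0.0, 0)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == 1:
            break
        if d > dist.get(u, np.inf):
            continue
        for v, w in graph[u]:
            if d + w < dist.get(v, np.inf):
                dist[v], prev[v] = d + w, u
                heapq.heappush(pq, (d + w, v))
    if 1 not in dist:
        return None                       # goal not connected to the roadmap
    path, u = [], 1                       # walk predecessors back to the start
    while u != 0:
        path.append(nodes[u])
        u = prev[u]
    return [nodes[0]] + path[::-1]

path = prm((1.0, 1.0), (9.0, 9.0))
```

The same three stages carry over to the manipulator case; only the sampling space (joint configurations instead of 2D points) and the collision check change, which is also where "kinematically comfortable" cost terms can be added to the edge weights.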
31 Path Planning Video Video Shows Path generated in multiple views Virtual robot arm being moved along planned path Video: Link to Movie 31 Path Planning
32 Path Planning/AR: Applications Training Navigating in cluttered environments Preview of optimal robot motions Minimum distance, angles, energy Smart guidance cues for EE navigation On-orbit Intelligently help operator guide robot through tasks & subtasks Optimal payload manipulation 32 Path Planning
33 Modes of Robot Control Manual Provide on-demand cues for navigation Supervisory Allow robot to proceed along planned trajectory in user-controlled increments Preview robot movements with virtual robot Autonomous Provide visual monitoring cues for safe operations 33 Path Planning
34 Manual Navigation Camera 1 Camera 2 34 Path Planning
35 Supervisory Navigation Camera 1 Camera 2 35 Path Planning
36 Autonomous Provide information on the robot's current and future objectives Display proximity to obstacles Display error modes and robot constraints 36 Path Planning
37 Path Planning Plans Human Robot Interaction Evaluation Performance Situational awareness Usability testing Training Study impact on training curves Proficiency Advanced Development Intelligent path guidance Real-time path generation Dynamic environments 37 Path Planning
38 Camera control issues (pan/tilt/zoom) Augmented reality navigation cues Path planning Dynamic simulation for training Conclusions Future work DYNAMIC SIMULATION FOR TRAINING 38
39 Rationale for Dynamic AR Simulation Tele-operated arms: Very effective in space (SSRMS, SPDM) Pre-mission training: Dexterous Manipulator Trainer (DMT) Operational environment: Microgravity Training environment: Earth Methods of training: Orbital Replacement Units (ORU) (simulated) Space shuttle mock up environment Helium simulates weightlessness 39 Dynamic Simulation
40 Terrestrial Test Beds? Problem: current test beds are limited E.g. the DMT cannot interact with larger modules Expensive equipment (robots, ORUs, mockups) Cumbersome setup procedures Safety measures AR solution: Physically-based virtual environment embedded in the live video stream Provides detailed virtual equipment Provides dynamic simulation of on-orbit conditions No safety concerns over virtual robot movement Dynamic simulation in AR 40 Dynamic Simulation
41 Simulation: Satellite Payload Rendezvous/Collision (g ≈ 0.001 m/s²) 41 Dynamic Simulation
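The scale of the simulated residual acceleration is easy to build intuition for: under g ≈ 0.001 m/s², a payload starting at rest drifts on the order of meters within a minute, which is why even "small" accelerations matter for rendezvous and collision scenarios. A minimal semi-implicit Euler sketch (the step size and duration are arbitrary choices, not the report's):

```python
def drift(accel=1e-3, dt=0.1, t_end=60.0):
    """1D position and velocity after t_end seconds under a constant
    residual acceleration, starting at rest (semi-implicit Euler)."""
    x, v = 0.0, 0.0
    for _ in range(round(t_end / dt)):  # 600 steps of 0.1 s
        v += accel * dt                 # velocity kick
        x += v * dt                     # position drift
    return x, v

x, v = drift()
print(f"after 60 s: x = {x:.2f} m, v = {v:.4f} m/s")  # ~1.80 m at 0.06 m/s
```

The same integrator, extended to 3D with the payload's modeled inertial properties, is the kind of first-order dynamics the AR simulation embeds in the live video.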
42 Virtual Robot Interacts with Virtual Object Video Shows the interaction of a virtual satellite with a virtual robot. Inertial properties are modeled. Link to Movie 42 Dynamic Simulation
43 Actual Robot Interacts with Virtual Satellite Video Shows An actual robot interacts with the virtual satellite The inertial properties of the satellite are modeled. Video: Link to Movie 43 Dynamic Simulation
44 Dynamic Simulation: Issues Physical AR simulation may be an alternative to current training methods. End-user training proficiency evaluation needed Object occlusion First-order simulation inaccuracies Fidelity of robot/environment models Haptic capability: Will it help? 44 Dynamic Simulation
45 Summary Original Grant Aims Hybrid AR system prototype development HF analysis for the DMT Continuous zoom / pan-tilt camera calibration Achievements Developed fully-functioning test bed with software compatibility with the DMT HF testing performed on the WSU test bed (due to budget cuts) Demonstration of continuous zoom/pan-tilt camera calibration 45
46 Beyond Grant Aims Developed path planning techniques for operator guidance Dynamic simulations in AR Demonstrated alternative advanced methods for camera control 46
47 Future Work Implement advanced concepts on the DMT (NASA test bed) Models of the DMT Kinematics of DMT Perform HF evaluations on the DMT Access to expert users Access to DMT Study training effects of physically-based AR Extend concepts to Unmanned Ground Vehicles. Study the occlusion problem For additional information and related references, contact the PI (apandya@ece.eng.wayne.edu) 47
48 Acknowledgements WSU Dr. Greg Auner Dr. Cheng-Zhong Xu Aditya Nawab Dr. Robert Erlandson David Sant Students Thomas Overgaard Anuj Awasthi Vukasin Denic Kelly Foster Robert Xu NASA/JSC Kristian Mueller Andrew Cheang John Pace Charles Bowen Nancy Currie Shakeel Razvi Alberto Magh
More informationNavigation Templates for PSA
Navigation Templates for PSA Li Tan & David P. Miller School of Aerospace and Mechanical Engineering University of Oklahoma Norman, OK 73019 USA litan@ou.edu & dpmiller@ou.edu Abstract Navigation Templates
More informationSingle View Geometry. Camera model & Orientation + Position estimation. Jianbo Shi. What am I? University of Pennsylvania GRASP
Single View Geometry Camera model & Orientation + Position estimation Jianbo Shi What am I? 1 Camera projection model The overall goal is to compute 3D geometry of the scene from just 2D images. We will
More informationAided-inertial for GPS-denied Navigation and Mapping
Aided-inertial for GPS-denied Navigation and Mapping Erik Lithopoulos Applanix Corporation 85 Leek Crescent, Richmond Ontario, Canada L4B 3B3 elithopoulos@applanix.com ABSTRACT This paper describes the
More informationAutonomous Vehicle Navigation Using Stereoscopic Imaging
Autonomous Vehicle Navigation Using Stereoscopic Imaging Functional Description and Complete System Block Diagram By: Adam Beach Nick Wlaznik Advisors: Dr. Huggins Dr. Stewart December 14, 2006 I. Introduction
More informationWAVE2: Warriors Autonomous Vehicle IGVC Entry. Wayne State University. Advising Faculty: Dr. R. Darin Ellis. Team Members: Justin Ar-Rasheed
WAVE2: Warriors Autonomous Vehicle 2010 IGVC Entry Wayne State University Advising Faculty: Dr. R. Darin Ellis Team Members: Justin Ar-Rasheed Shawn Hunt Sam Li Vishal Lowalekar Prem Sivakumar Ali Syed
More informationBuild and Test Plan: IGV Team
Build and Test Plan: IGV Team 2/6/2008 William Burke Donaldson Diego Gonzales David Mustain Ray Laser Range Finder Week 3 Jan 29 The laser range finder will be set-up in the lab and connected to the computer
More informationModeling the Virtual World
Modeling the Virtual World Joaquim Madeira November, 2013 RVA - 2013/2014 1 A VR system architecture Modeling the Virtual World Geometry Physics Haptics VR Toolkits RVA - 2013/2014 2 VR object modeling
More informationDirect Plane Tracking in Stereo Images for Mobile Navigation
Direct Plane Tracking in Stereo Images for Mobile Navigation Jason Corso, Darius Burschka,Greg Hager Computational Interaction and Robotics Lab 1 Input: The Problem Stream of rectified stereo images, known
More informationDJI GS PRO. User Manual V
DJI GS PRO User Manual V1.4 2017.03 Video Tutorials Virtual Fence Mission 3D Map Area Mission Waypoint Flight Mission 2 2017 DJI All Rights Reserved. Contents Video Tutorials 2 Disclaimer 4 Warning 4 Introduction
More informationAdvanced Driver Assistance Systems: A Cost-Effective Implementation of the Forward Collision Warning Module
Advanced Driver Assistance Systems: A Cost-Effective Implementation of the Forward Collision Warning Module www.lnttechservices.com Table of Contents Abstract 03 Introduction 03 Solution Overview 03 Output
More informationMAPPING ALGORITHM FOR AUTONOMOUS NAVIGATION OF LAWN MOWER USING SICK LASER
MAPPING ALGORITHM FOR AUTONOMOUS NAVIGATION OF LAWN MOWER USING SICK LASER A thesis submitted in partial fulfillment of the requirements for the degree of Master of Science in Engineering By SHASHIDHAR
More informationInternational Journal of Advance Engineering and Research Development
Scientific Journal of Impact Factor (SJIF): 4.14 International Journal of Advance Engineering and Research Development Volume 3, Issue 3, March -2016 e-issn (O): 2348-4470 p-issn (P): 2348-6406 Research
More informationBehavior Learning for a Mobile Robot with Omnidirectional Vision Enhanced by an Active Zoom Mechanism
Behavior Learning for a Mobile Robot with Omnidirectional Vision Enhanced by an Active Zoom Mechanism Sho ji Suzuki, Tatsunori Kato, Minoru Asada, and Koh Hosoda Dept. of Adaptive Machine Systems, Graduate
More informationset for a fixed view. Install the PTZ camera and the stationary camera in close proximity of each other
CHAPTER 3 3.1 Object Tracking and Zooming Object Tracking provides you the real-time tracking and automatic magnification of a single moving object by the combination of one PTZ camera and one stationary
More informationPPGEE Robot Dynamics I
PPGEE Electrical Engineering Graduate Program UFMG April 2014 1 Introduction to Robotics 2 3 4 5 What is a Robot? According to RIA Robot Institute of America A Robot is a reprogrammable multifunctional
More informationThis week. CENG 732 Computer Animation. Warping an Object. Warping an Object. 2D Grid Deformation. Warping an Object.
CENG 732 Computer Animation Spring 2006-2007 Week 4 Shape Deformation Animating Articulated Structures: Forward Kinematics/Inverse Kinematics This week Shape Deformation FFD: Free Form Deformation Hierarchical
More informationGeolocation with FW 6.4x & Video Security Client Geolocation with FW 6.4x & Video Security Client 2.1 Technical Note
Geolocation with FW 6.4x & Video Security Client 2.1 1 10 Geolocation with FW 6.4x & Video Security Client 2.1 Technical Note Geolocation with FW 6.4x & Video Security Client 2.1 2 10 Table of contents
More informationCSE 4392/5369. Dr. Gian Luca Mariottini, Ph.D.
University of Texas at Arlington CSE 4392/5369 Introduction to Vision Sensing Dr. Gian Luca Mariottini, Ph.D. Department of Computer Science and Engineering University of Texas at Arlington WEB : http://ranger.uta.edu/~gianluca
More informationAutonomous navigation in industrial cluttered environments using embedded stereo-vision
Autonomous navigation in industrial cluttered environments using embedded stereo-vision Julien Marzat ONERA Palaiseau Aerial Robotics workshop, Paris, 8-9 March 2017 1 Copernic Lab (ONERA Palaiseau) Research
More informationSpace Robot Path Planning for Collision Avoidance
Space Robot Path Planning for ollision voidance Yuya Yanoshita and Shinichi Tsuda bstract This paper deals with a path planning of space robot which includes a collision avoidance algorithm. For the future
More informationipcam-wo Wireless Outdoor
POWER NETWORK Total Connect Online Help Guide for: ip Cameras ipcam-wi Wireless Indoor ipcam-pt Pan and Tilt ipcam-wo Wireless Outdoor 800-08456 3/11 Rev. A TRADEMARKS Honeywell is a registered trademark
More informationMobile Procedure Viewer Short Duration Mission. mobipv-sdm /14/NL/AT. Final Presentation
Mobile Procedure Viewer Short Duration Mission mobipv-sdm 4000110425/14/NL/AT Final Presentation Space Applications Services, Belgium Keshav Chintamani Project Manager Boris Van Lierde Project Engineer
More informationOutline Sensors. EE Sensors. H.I. Bozma. Electric Electronic Engineering Bogazici University. December 13, 2017
Electric Electronic Engineering Bogazici University December 13, 2017 Absolute position measurement Outline Motion Odometry Inertial systems Environmental Tactile Proximity Sensing Ground-Based RF Beacons
More information