ArchGenTool: A System-Independent Collaborative Tool for Robotic Architecture Design


ArchGenTool: A System-Independent Collaborative Tool for Robotic Architecture Design
Emanuele Ruffaldi (SSSA), I. Kostavelis, D. Giakoumis, D. Tzovaras (CERTH)

Overview
- Problem Statement
- Existing Solutions
- ArchGen Principles
- ArchGen Implementation
- Example
- Conclusions

Context
Service robotics aims at supporting the activity and life of people in home environments. The H2020 RAMCIP project (2015-2017) targets people with Mild Cognitive Impairment by means of a new service robot with both manipulative and cognitive capabilities. Based on ROS, the software architecture of RAMCIP aims at:
- Understanding the environment
- Understanding user state and action
- Supporting the user activity
- Controlling the arm with dexterous hand
- Controlling platform motion
- Controlling the communication module
How can we develop and manage such a complex system?

Key Concept
Robotic architectures are defined along three elements:
1. Software component libraries providing functionalities
2. A software architecture that describes the instantiation of the components and their interconnection
3. Functional requirements

State of the Art
- ROBOTML: based on a DSL for a robot ontology, integrated with SysML
- BRIDE (BRICS project): allows designing models of components (ROS/Orocos); originally purely Eclipse-based, now moving also to a textual representation; auto-code generation
- HyperFlex: Eclipse-based, symbolic representation of architecture variation points and variants; generates output for ROS/Orocos
- Robotics Run-time Adaptation Toolchain: run-time adaptation on top of HyperFlex
- Architecture Modeling and Analysis Language (AMAL)
These are mostly Eclipse-based, with a strong focus on the specific, detailed ROS/Orocos architecture. We wanted a more general tool for refining the architecture.

ArchGen Principles
ArchGen comes as a standalone tool for creating and manipulating robotic architectures in a collaborative way. It deals with both the high-level functional architecture (conceptual analysis) and the low-level, ROS-based implementation details (design time, architecture deployment).
[Diagram: the robot vision spans high-level conceptual analysis (functional components, logical analysis, diagnostics) and low-level implementation details (ROS, design time, architecture deployment), built around collaboration, modularity, documentation, reusability and expandability.]

ArchGen Features
- Component-based model
- YAML-based representation
- Collaboration via Git (GitHub)
- Multiple back-ends: visualization, report, analysis

ArchGen Workflow

High-Level
The high-level functional architecture is the abstract specification of the robot components:
- Component instances
- Connected by data interfaces
- Additional properties: performance criteria, operative requirements
[Diagram: example components such as Grasp_Planner, Imaging, Robot Arm Workspace Identification, Hand_over_Controller, Grasp_Controller and Lower_body_pHRI_planner, connected by data interfaces such as the workspace environment (occupancy grid).]
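As an illustration, a functional component and its data interfaces might be declared along the following lines. This is a minimal sketch only: the field names (components, inputs, outputs, properties) are assumptions for illustration, not ArchGen's documented schema, while the component and interface names come from the slide above.

# Hypothetical sketch of a high-level functional component in the
# YAML representation; key names are assumed, not the tool's actual schema.
components:
  - name: Grasp_Planner
    group: Robot Action Planner
    inputs:
      - name: workspace_environment
        type: Occupancy grid              # high-level (abstract) data type
        from: Robot_Arm_Workspace_Identification
    outputs:
      - name: grasp_strategy
        type: Grasp strategy and parameters
        to: Grasp_Controller
    properties:
      performance_criteria: "..."         # e.g. expected latency or accuracy
      operative_requirements: "..."       # e.g. required sensors or workspace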

Tool Output
The high-level execution step produces YAML files for further automatic processing:
- Diagrams for the individual parts
- A Word document
- Web pages
- A report of the architecture analysis
The generated YAML files represent the aggregated contributions (per group and overall). The Word document is used for EU project reporting.

ROS Mapping
The implementation phase declares how to realize the functional components, each into one or more ROS components:
- Which existing ROS components are used
- Their interconnection
- The mapping of high-level types to ROS types
The implementation can be attached as an attribute of a functional component, or instead placed in a separate YAML document. Each entry records the ROS package name and the execution method.
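A minimal sketch of such a separate mapping document follows. The ramcip_grasp_planner package appears in the case study later in the talk; the key names (implementations, package, node, method) are assumptions for illustration.

# Hypothetical separate YAML mapping document; key names are assumed.
implementations:
  Grasp_Planner:
    package: ramcip_grasp_planner       # ROS package providing the component
    node: grasp_planner                 # executable inside the package
    method: node                        # execution method, e.g. node / nodelet / launch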

Dealing with Time
Timing is an important aspect of a robotic architecture, in particular for the allocation of hardware resources and the understanding of criticalities. ArchGen tackles this with specific attributes:
- Mode of timing: periodic, event-driven, on-demand (service)
- Frequency (Hz)
The timing of unspecified components is inferred: ArchGen propagates the timings of signals along the graph and produces a colored graph of components (covering both message timing and component timing), similar to the one used in Simulink.
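In the YAML representation this could look roughly as follows; the mode values match the slide, while the attribute names are assumptions for illustration.

# Hypothetical timing attributes; key names are assumed.
timings:
  Imaging:
    mode: periodic
    frequency: 30        # Hz, e.g. the camera frame rate
  Grasp_Planner:
    mode: on-demand      # service: timing inferred from its callers
  Hand_over_Controller:
    mode: event-driven   # triggered by incoming messages; its rate is
                         # propagated from upstream signals along the graph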

Collaboration
Collaboration is based on well-known code management tools: git/GitHub.
- Each team member writes in a common YAML file or in a per-member/partner YAML file
- Architectural changes can be discussed using issue management
- Some properties of the model can be specified in external files (mapping, timings) so that changes can be easily tracked using branches
- The tool validates the architecture using continuous integration tools (e.g. CircleCI/Shippable)
- Reports and diagrams can be generated online (e.g. on Shippable)
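The continuous-integration hook can be very small. The sketch below is a CircleCI-2.0-style configuration; the archgen.py entry point, its subcommands and the ramcip/ directory layout are all hypothetical, since the source does not document the tool's command-line interface.

# Hypothetical CI configuration (CircleCI 2.0 style). The archgen.py
# invocation is an assumed entry point, not a documented CLI.
version: 2
jobs:
  build:
    docker:
      - image: python:3
    steps:
      - checkout
      - run: pip install -r requirements.txt            # tool dependencies (assumed)
      - run: python archgen.py validate ramcip/*.yaml   # check model consistency
      - run: python archgen.py report ramcip/*.yaml     # regenerate report and diagrams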

RAMCIP Case Study (http://www.ramcip-project.eu/)
- High-level component groups: 10, including Robot State, Environment State, Environment Model, Human State, Human Model, Cognitive Reasoning, Robot Task Scheduler, Robot Action Planner and Communication Planner
- Functional components: 61

RAMCIP High-Level Groups (except Human Model and Communication)
[Diagram: the component groups and their functional components; labels shown include Current Human State, Current Robot State, Current Environment State, Current Environment Model, Human Tracking, Vital Signs Monitoring, Robot Arm Workspace Identification, Robot Localization, Robot Hardware, Imaging, Robot Sensors, Robot Task Scheduler, Human Posture Recognition, Global Mapping, Library of Capabilities, Environment Large Object Tracking, Object Detection / Identification, Fine-Grained Body Motion Analysis, Human Action Recognition, Human Identification, Hierarchical Spaces / Objects Modelling, Task Planner, Cognitive Reasoning, Robot Action Planner, Human Activity Recognition, High-Level Assistance Decision Maker, Task Integrator, Lower-Body pHRI Planner, Robot Navigation, Manipulation Planner and Control, Grasp Planner, Hand-over Controller and Grasp Controller.]

[Overall diagram: the complete RAMCIP ROS architecture, mapping each functional component to its ROS package/node and showing the exchanged data. Examples of the mapping: Library of Capabilities → ramcip_library_of_capabilities/library_of_capabilities; Task Planner → ramcip_task_planner/task_planner; Task Integrator → ramcip_task_integrator/task_integrator; Manipulation Planner and Control → ramcip_manipulation_planner_and_control/manipulation_planner_and_control; Grasp Planner → ramcip_grasp_planner/grasp_planner; Grasp Controller → ramcip_grasp_controller/grasp_controller; Hand-over Controller → ramcip_hand_over_controller/hand_over_controller; Lower-Body pHRI Planner → ramcip_lower_body_phri_planner/lower_body_phri_planner; Robot Navigation → ramcip_nav_plugin/human_detection_plugin and move_base/move_base_node; Robot Localization → ramcip_localization/initial_pose and amcl/amcl_node; Imaging → openni_launch/openni (Kinect1), kinect2_bridge/kinect2_bridge (Kinect2), realsense_camera/realsense_camera (F200) and ramcip_camera_helpers/camera_server; Robot Arm Workspace Identification → ramcip_robot_workspace/arm_workspace_node; Global Mapping → ramcip_global_mapping/global_mapping; Object Detection / Identification → ramcip_object_detection/object_detection; Environment Large Object Tracking → ramcip_large_object_tracking/large_object_tracking; Hierarchical Spaces / Objects Modelling → ramcip_hierarchical_spaces/hierarchical_spaces; Human Tracking → ramcip_human_tracking/skeleton_tracker_node; Fine-Grained Body Motion Analysis → ramcip_body_motion_analysis/body_motion_analysis; Human Action / Activity / Posture Recognition, Human Identification, Vital Signs Monitoring and Affect Recognizer → the corresponding ramcip_* nodes; VUM Management Engine → ramcip_vum/vum_management_engine; High-Level Assistance Decision Maker → ramcip_assistance_decision_maker/assistance_decision_maker; Communication Decision Maker, Gesture Recognition Engine and Augmented Reality Projector → ramcip_communication_decision_maker, ramcip_gesture_recognition_engine and the ramcip_ar_projector nodes. The edges carry, among others, camera images and calibration data, registered point clouds, localization poses, occupancy grids, grasp strategies, joint/torque and trajectory targets, and task initialized/abort/finished signals.]

RAMCIP ROS Mapping Example
[Diagram: mapping of high-level data types to low-level (ROS) types.]
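For instance, the abstract types used in the high-level interfaces would be bound to concrete ROS message types roughly as follows. The type_mapping key is an assumed name for illustration; the ROS message types on the right are standard.

# Hypothetical sketch of a high-level-to-ROS type mapping.
type_mapping:
  "Occupancy grid": nav_msgs/OccupancyGrid
  "Undistorted, Rectified RGB image": sensor_msgs/Image
  "Registered point cloud": sensor_msgs/PointCloud2
  "Robot localization pose": geometry_msgs/PoseWithCovarianceStamped
  "Velocity of base platform": geometry_msgs/Twist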

Conclusion
- ArchGen addresses the problem of architecture specification in large robotic projects while supporting the implementation
- Its semantics is component-based, spanning both the high and the low level
- It is being tested within a running project (RAMCIP)

Future Work
- Making the tool open source
- Improving the quality of the graphs
- More features: semantic analysis, verification tools, integration with ROS
- RAMCIP-specific: continuing the project development and integration
- Other projects: REMEDI (FP7) is a good candidate

Thanks! Questions?