AASS, Örebro University


Henrik Andreasson, Abdelbaki Bouguerra, Marcello Cirillo, Dimitar Dimitrov, Dimiter Driankov, Martin Magnusson, Federico Pecora, and Achim J. Lilienthal. Mobile Robotics and Olfaction Lab, AASS, Örebro University

Content
1. Transport
2. The SAUNA Project (Safe Autonomous Navigation)
3. Sensors
4. Rich 3D Perception (what and why?)
5. Rich 3D in SAUNA

1. Professional Service Robots for Transport Applications at AASS

1. Major Projects at the AASS MR&O Lab
o research areas: Mobile Robot Olfaction, Robot Vision, Safe Navigation & Planning, Rich 3D Perception
o application areas: Industrial Vehicles, Autonomous Unloading of Containers, Grasping and Manipulation, Artificial Olfaction Systems, Other
o projects: SAVIE, SaNa, SAUNA, ALL-4-eHAM, ELWAYS, RobCab, RobLog, HANDLE, Robosanis, Gasbot, DIADEM

1. Professional Service Robots for Transport Applications
o Forklift Trucks (Danaher Motion, Linde MH, Stora Enso)
  » environment with a dynamic "background"
  » requires 3D sensing (e.g., a 1-metre drop to the railway tracks)
o Wheel Loaders (VolvoCE, VolvoTech, NCC)
o Mining Vehicles (Atlas Copco, Fotonic)
o Hospital Transport Vehicles (RobCab)
o Garbage Bin Collection and Cleaning (RoboTech)
Requirements
o perception
o manipulation (for picking up and unloading material/goods)
o planning and execution
o safe navigation → SAUNA!

2. SAUNA: Leading-Edge Research in Safe Autonomous Navigation

2. SAUNA: Mobile Service Vehicles for Autonomous Transportation
o transportation of goods by one or more vehicles
o time and resource constraints
o non-holonomic vehicles
Challenges
o fleets of mixed autonomous and human-operated vehicles
o high speeds (up to 30 km/h)
o rich 3D perception for enhanced safety and performance
o automated mission planning capabilities at several levels of abstraction
o flexible operation, accommodating run-time changes
o collision avoidance throughout mission planning, trajectory computation, and execution

2. SAUNA Scientific Objectives
o Rich 3D Perception (depth + additional modalities)
  » mapping, localization
  » object identification and localization
  » real-time performance
  » long-term operation
o Safe Motion (very reliable collision avoidance)
  » tracking of vehicles and humans; predicted collisions between our vehicle and other vehicles
  » collision avoidance (trajectory modification)
  » real-time response at medium and high speeds (up to 30 km/h)
o Flexible Operation (no fixed action plans)
  » automated mission planning process (both mission and motion planning)
  » multiple types of requirements/constraints
  » incomplete prior knowledge
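The predicted-collision idea under the Safe Motion objective can be sketched as a minimal constant-velocity check between two tracked agents. This is an illustrative toy, not SAUNA's actual tracker: the function name, safety radius, and horizon are assumptions made for the example.

```python
import numpy as np

def predict_collision(p1, v1, p2, v2, safety_radius=2.0, horizon=5.0, dt=0.1):
    """Predict a collision between two constant-velocity agents.

    Returns the earliest time (s) within the horizon at which the
    inter-agent distance drops below safety_radius, or None."""
    for t in np.arange(0.0, horizon + dt, dt):
        d = np.linalg.norm((p1 + v1 * t) - (p2 + v2 * t))
        if d < safety_radius:
            return float(t)
    return None

# Two vehicles on a head-on course, 20 m apart, closing at 8 m/s total.
t_hit = predict_collision(np.array([0.0, 0.0]), np.array([5.0, 0.0]),
                          np.array([20.0, 0.0]), np.array([-3.0, 0.0]))
```

In a real system the predicted trajectory would come from the mission/motion planner rather than a constant-velocity model, and a predicted hit would trigger trajectory modification.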

2. SAUNA Realization
o in simulation
o on real-world prototype robots

2. SAUNA Assumptions on the Environment
o minimal infrastructure
  » high-speed navigation without additional infrastructure (MALTA)
o environment is 3D, but navigation is in 2D
  » 3D workspace map
  » planning in 2D (locally flat environment)
o fully dynamic objects
  » other vehicles, humans
  » navigation areas can be blocked
o slow dynamic changes
  » e.g., piles of objects/gravel
o incomplete prior knowledge about the whereabouts of objects to transport
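The "locally flat" assumption behind planning in 2D can be made concrete with a toy drivability test on an elevation grid: cells whose local slope exceeds a threshold are excluded. The gradient-based slope estimate, cell size, and threshold are hypothetical choices for illustration, not the project's implementation.

```python
import numpy as np

def drivable_mask(height_map, cell_size=0.5, max_slope_deg=10.0):
    """Mark cells of a 2D elevation grid (metres) as drivable when the
    local slope stays below a threshold (locally-flat assumption)."""
    gy, gx = np.gradient(height_map, cell_size)   # metres per metre
    slope = np.degrees(np.arctan(np.hypot(gx, gy)))
    return slope < max_slope_deg

# Flat ground with a 1 m drop (think: the drop to the railway tracks).
h = np.zeros((10, 10))
h[:, 6:] = -1.0
mask = drivable_mask(h)
```

Cells far from the drop stay drivable while the cells straddling the edge are ruled out, which is exactly the information a 2D planner needs from the 3D workspace map.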

3. Sensors

3. Sensors for Autonomous Vehicles
o laser scanner (2D LRF)
o laser scanner (3D LRF)
o laser scanner (3D a-LRF)
o TOF camera (Fotonic B70, ~5k)
o range sensor + perspective camera (→ Rich 3D) (image courtesy B. Huhle)
  » Kinect sensor (~0.2k$)
  » "industrial Kinect"
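As a rough illustration of what a TOF camera or Kinect delivers, a depth image can be back-projected into a 3D point cloud with the standard pinhole model, X = (u - cx)·Z/fx and Y = (v - cy)·Z/fy. The intrinsics below are invented for the example; real values come from sensor calibration.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (metres) into an (N, 3) point cloud
    using the pinhole camera model; zero-depth pixels are dropped."""
    v, u = np.indices(depth.shape)
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]

# A 2x2 depth image, every pixel 1 m away, principal point at the centre.
pts = depth_to_points(np.ones((2, 2)), fx=1.0, fy=1.0, cx=0.5, cy=0.5)
```

Pairing such a depth sensor with a perspective camera then amounts to projecting each 3D point into the colour image and attaching the colour value, which is the "range sensor + perspective camera" route to rich 3D above.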

4. Rich 3D Perception

4. What is Rich 3D Perception?
o rich 3D = spatial data augmented with additional information
  » additional dimensions: e.g. colour, temperature, remission, semantic label, confidence estimate
  » each point: x_i = (x_i, y_i, z_i, r_i^(1), ..., r_i^(n))
o examples of rich 3D data: coloured point cloud [Andreasson, 2005]; "thermal" point cloud [Andreasson, 2006]
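The per-point vector x_i = (x_i, y_i, z_i, r_i^(1), ..., r_i^(n)) maps naturally onto an (N, 3+n) array. The channel choice here (RGB colour plus temperature, n = 4) is just one example of the additional dimensions listed above.

```python
import numpy as np

# Geometry: N = 2 points with (x, y, z) coordinates in metres.
xyz = np.array([[0.0, 1.0, 2.0],
                [0.5, 1.5, 2.5]])
# Additional dimensions: RGB colour and a temperature reading.
rgb = np.array([[255.0, 0.0, 0.0],
                [0.0, 255.0, 0.0]])
temp = np.array([[21.5],
                 [36.6]])                  # degrees Celsius
# A rich 3D cloud: one row per point, shape (N, 3 + n) with n = 4.
rich_cloud = np.hstack([xyz, rgb, temp])
```

Any algorithm that consumes plain point clouds (registration, clustering, mapping) can in principle be extended to weigh these extra columns alongside the geometry.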

4. Why Rich 3D Perception? Why do we need "rich" 3D?
o helps to establish the identity of obstacles
  » contains more information
  » can be important to maintain people tracks
  » can help to identify drivable areas
o improved localization in areas without geometrical features
o object separation in cluttered scenes
  » may work with plain 3D perception; in some scenes rich 3D is necessary
o additional dimensions can carry the meaning of a map
  » classification labels, uncertainty
o may even allow for more compact representations
  » provides ways to instantaneously identify the most important information in the "rich 3D space"
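How the extra channels help in geometrically featureless areas can be sketched as a weighted nearest-neighbour association: when two candidate correspondences are (nearly) identical in geometry, the appearance channels break the tie. The metric, weights, and values below are illustrative assumptions only.

```python
import numpy as np

def match(query, targets, w_geo=1.0, w_app=0.01):
    """Associate a rich 3D query point (xyz + appearance channels) with
    the closest target under a weighted geometry/appearance metric."""
    d_geo = np.linalg.norm(targets[:, :3] - query[:3], axis=1)
    d_app = np.linalg.norm(targets[:, 3:] - query[3:], axis=1)
    return int(np.argmin(w_geo * d_geo + w_app * d_app))

# Two wall points at almost the same location: geometry alone is
# ambiguous, but the colour channels pick the right correspondence.
targets = np.array([[5.0, 0.0, 1.0, 200.0, 10.0, 10.0],   # red patch
                    [5.0, 0.1, 1.0, 10.0, 10.0, 10.0]])   # grey wall
query = np.array([5.0, 0.05, 1.0, 190.0, 12.0, 8.0])
idx = match(query, targets)
```

The same weighting idea carries over to registration and localization: a scan-to-map matcher in a featureless corridor can still converge if colour or remission varies along the walls.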

5. Rich 3D Perception in SAUNA

5. Rich 3D Perception in SAUNA: Produce Really Useful Maps
Expected Outcome
o (1) develop algorithms to create and maintain a single compact and truthful rich 3D workspace model
o (2) localize within the rich 3D workspace model
o (3) detect obstacles w.r.t. the rich 3D workspace model
o (4) identify drivable areas in the rich 3D workspace model
In short: create and maintain a compact and useful rich 3D workspace model, and use it!
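Outcome (3), obstacle detection w.r.t. the workspace model, reduces in its simplest form to distance-to-map thresholding: scan points far from every map point are obstacle candidates. This brute-force nearest-neighbour sketch stands in for whatever model representation is actually used; the threshold and data are invented for the example.

```python
import numpy as np

def detect_obstacles(scan, map_points, thresh=0.3):
    """Return scan points farther than `thresh` (metres) from every
    map point (brute-force nearest neighbour, for clarity only)."""
    d = np.linalg.norm(scan[:, None, :] - map_points[None, :, :], axis=2)
    return scan[d.min(axis=1) > thresh]

# Map: samples of flat ground. Scan: ground hits plus one raised point.
ground = np.array([[x, 0.0, 0.0] for x in np.arange(0.0, 5.0, 0.5)])
scan = np.array([[1.0, 0.0, 0.02],    # ground hit, explained by the map
                 [2.5, 0.0, 0.9]])    # unexpected object above the ground
obs = detect_obstacles(scan, ground)
```

A real system would use a spatial index or voxel grid instead of the O(N·M) distance matrix, and a rich 3D model could additionally compare colour or labels to tell a new obstacle from a known but slowly changing pile.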

Henrik Andreasson, Abdelbaki Bouguerra, Marcello Cirillo, Dimitar Dimitrov, Dimiter Driankov, Martin Magnusson, Federico Pecora, and Achim J. Lilienthal. AASS, Örebro University

Contact: Prof. Achim J. Lilienthal, Mobile Robotics and Olfaction Lab, AASS, Örebro University
www.aass.oru.se/~lilien | achim.lilienthal@oru.se


More information

ON THE DUALITY OF ROBOT AND SENSOR PATH PLANNING. Ashleigh Swingler and Silvia Ferrari Mechanical Engineering and Materials Science Duke University

ON THE DUALITY OF ROBOT AND SENSOR PATH PLANNING. Ashleigh Swingler and Silvia Ferrari Mechanical Engineering and Materials Science Duke University ON THE DUALITY OF ROBOT AND SENSOR PATH PLANNING Ashleigh Swingler and Silvia Ferrari Mechanical Engineering and Materials Science Duke University Conference on Decision and Control - December 10, 2013

More information

ArchGenTool: A System-Independent Collaborative Tool for Robotic Architecture Design

ArchGenTool: A System-Independent Collaborative Tool for Robotic Architecture Design ArchGenTool: A System-Independent Collaborative Tool for Robotic Architecture Design Emanuele Ruffaldi (SSSA) I. Kostavelis, D. Giakoumis, D. Tzovaras (CERTH) Overview Problem Statement Existing Solutions

More information

3D Collision Avoidance for Navigation in Unstructured Environments

3D Collision Avoidance for Navigation in Unstructured Environments 3D Collision Avoidance for Navigation in Unstructured Environments Armin Hornung Humanoid Robots Lab, University of Freiburg May 5, 2011 Motivation Personal robots are to operate in arbitrary complex environments:

More information

Applying AI to Mapping

Applying AI to Mapping Applying AI to Mapping Dr. Ryan Wolcott Manager, Simultaneous Localization and Mapping (SLAM) 1 TRI: Who We Are Our mission is to improve the quality of human life through advances in artificial intelligence,

More information

Creating Affordable and Reliable Autonomous Vehicle Systems

Creating Affordable and Reliable Autonomous Vehicle Systems Creating Affordable and Reliable Autonomous Vehicle Systems Shaoshan Liu shaoshan.liu@perceptin.io Autonomous Driving Localization Most crucial task of autonomous driving Solutions: GNSS but withvariations,

More information

Operation of machine vision system

Operation of machine vision system ROBOT VISION Introduction The process of extracting, characterizing and interpreting information from images. Potential application in many industrial operation. Selection from a bin or conveyer, parts

More information

Safe Prediction-Based Local Path Planning using Obstacle Probability Sections

Safe Prediction-Based Local Path Planning using Obstacle Probability Sections Slide 1 Safe Prediction-Based Local Path Planning using Obstacle Probability Sections Tanja Hebecker and Frank Ortmeier Chair of Software Engineering, Otto-von-Guericke University of Magdeburg, Germany

More information

3D Vision System. Vision for Your Automation

3D Vision System. Vision for Your Automation 3D Vision System Vision for Your Automation Experience and Innovation 2003 2006 2009 2012 2014 2015 2016 3D robot guidance 3D robot guidance Complete range of software for 3D 3D robot guidance Launching

More information

Design of a Home Multi- Robot System for the Elderly and Disabled PAT RICK BENAVID EZ

Design of a Home Multi- Robot System for the Elderly and Disabled PAT RICK BENAVID EZ Design of a Home Multi- Robot System for the Elderly and Disabled PAT RICK BENAVID EZ Overview Introduction Proposed Home Robotic System Software and Simulation Environment Hardware Summary of Contributions

More information

UAV Autonomous Navigation in a GPS-limited Urban Environment

UAV Autonomous Navigation in a GPS-limited Urban Environment UAV Autonomous Navigation in a GPS-limited Urban Environment Yoko Watanabe DCSD/CDIN JSO-Aerial Robotics 2014/10/02-03 Introduction 2 Global objective Development of a UAV onboard system to maintain flight

More information

Simultaneous Localization and Mapping (SLAM)

Simultaneous Localization and Mapping (SLAM) Simultaneous Localization and Mapping (SLAM) RSS Lecture 16 April 8, 2013 Prof. Teller Text: Siegwart and Nourbakhsh S. 5.8 SLAM Problem Statement Inputs: No external coordinate reference Time series of

More information

Robotics. Haslum COMP3620/6320

Robotics. Haslum COMP3620/6320 Robotics P@trik Haslum COMP3620/6320 Introduction Robotics Industrial Automation * Repetitive manipulation tasks (assembly, etc). * Well-known, controlled environment. * High-power, high-precision, very

More information

Field-of-view dependent registration of point clouds and incremental segmentation of table-tops using time-offlight

Field-of-view dependent registration of point clouds and incremental segmentation of table-tops using time-offlight Field-of-view dependent registration of point clouds and incremental segmentation of table-tops using time-offlight cameras Dipl.-Ing. Georg Arbeiter Fraunhofer Institute for Manufacturing Engineering

More information

MULTI-MODAL MAPPING. Robotics Day, 31 Mar Frank Mascarich, Shehryar Khattak, Tung Dang

MULTI-MODAL MAPPING. Robotics Day, 31 Mar Frank Mascarich, Shehryar Khattak, Tung Dang MULTI-MODAL MAPPING Robotics Day, 31 Mar 2017 Frank Mascarich, Shehryar Khattak, Tung Dang Application-Specific Sensors Cameras TOF Cameras PERCEPTION LiDAR IMU Localization Mapping Autonomy Robotic Perception

More information

Vision Aided 3D Laser Scanner Based Registration

Vision Aided 3D Laser Scanner Based Registration 1 Vision Aided 3D Laser Scanner Based Registration Henrik Andreasson Achim Lilienthal Centre of Applied Autonomous Sensor Systems, Dept. of Technology, Örebro University, Sweden Abstract This paper describes

More information

Actuated Sensor Networks: a 40-minute rant on convergence of vision, robotics, and sensor networks

Actuated Sensor Networks: a 40-minute rant on convergence of vision, robotics, and sensor networks 1 Actuated Sensor Networks: a 40-minute rant on convergence of vision, robotics, and sensor networks Chad Jenkins Assistant Professor Computer Science Department Brown University Women in Computer Science

More information

Machine Vision based Data Acquisition, Processing & Automation for Subsea Inspection & Detection

Machine Vision based Data Acquisition, Processing & Automation for Subsea Inspection & Detection Machine Vision based Data Acquisition, Processing & Automation for Subsea Inspection & Detection 1 st Nov 2017 SUT Presentation: The Leading Edge of Value Based Subsea Inspection Adrian Boyle CEO Introduction

More information

Indoor Mobile Robot Navigation and Obstacle Avoidance Using a 3D Camera and Laser Scanner

Indoor Mobile Robot Navigation and Obstacle Avoidance Using a 3D Camera and Laser Scanner AARMS Vol. 15, No. 1 (2016) 51 59. Indoor Mobile Robot Navigation and Obstacle Avoidance Using a 3D Camera and Laser Scanner Peter KUCSERA 1 Thanks to the developing sensor technology in mobile robot navigation

More information

Overview of Course. Practical Robot Implementation Details. August 28, 2008

Overview of Course. Practical Robot Implementation Details. August 28, 2008 Overview of Course + Practical Robot Implementation Details August 28, 2008 Announcements/Questions Course mailing list set up: cs594mr-students@eecs.utk.edu If you haven t received a welcome message from

More information

IEEE Control Systems Society, SCV November 18, 2015

IEEE Control Systems Society, SCV November 18, 2015 Multi-Robot Adaptive Navigation IEEE Control Systems Society, SCV November 18, 2015 Dr. Christopher Kitts Director, Robotic Systems Laboratory Associate Dean of Research & Faculty Development School of

More information

3D Terrain Sensing System using Laser Range Finder with Arm-Type Movable Unit

3D Terrain Sensing System using Laser Range Finder with Arm-Type Movable Unit 3D Terrain Sensing System using Laser Range Finder with Arm-Type Movable Unit 9 Toyomi Fujita and Yuya Kondo Tohoku Institute of Technology Japan 1. Introduction A 3D configuration and terrain sensing

More information

5G promotes the intelligence connected vehicles. Dr. Menghua Tao Senior Solution Manager China Unicom

5G promotes the intelligence connected vehicles. Dr. Menghua Tao Senior Solution Manager China Unicom 5G promotes the intelligence connected vehicles Dr. Menghua Tao Senior Solution Manager China Unicom ICT enabled automated driving One of the important features of a smart car is automated driving. As

More information

A System for Real-time Detection and Tracking of Vehicles from a Single Car-mounted Camera

A System for Real-time Detection and Tracking of Vehicles from a Single Car-mounted Camera A System for Real-time Detection and Tracking of Vehicles from a Single Car-mounted Camera Claudio Caraffi, Tomas Vojir, Jiri Trefny, Jan Sochman, Jiri Matas Toyota Motor Europe Center for Machine Perception,

More information

Active Light Time-of-Flight Imaging

Active Light Time-of-Flight Imaging Active Light Time-of-Flight Imaging Time-of-Flight Pulse-based time-of-flight scanners [Gvili03] NOT triangulation based short infrared laser pulse is sent from camera reflection is recorded in a very

More information

OPEN SIMULATION INTERFACE. INTRODUCTION AND OVERVIEW.

OPEN SIMULATION INTERFACE. INTRODUCTION AND OVERVIEW. OPEN SIMULATION INTERFACE. INTRODUCTION AND OVERVIEW. DIFFERENTIATION OF SIMULATION DATA INTERFACES. Odometry / Dynamics Interface Open Interface (OSI) Map Interface Dynamics Model Map Model Vehicle Dynamics

More information

Mapping Contoured Terrain Using SLAM with a Radio- Controlled Helicopter Platform. Project Proposal. Cognitive Robotics, Spring 2005

Mapping Contoured Terrain Using SLAM with a Radio- Controlled Helicopter Platform. Project Proposal. Cognitive Robotics, Spring 2005 Mapping Contoured Terrain Using SLAM with a Radio- Controlled Helicopter Platform Project Proposal Cognitive Robotics, Spring 2005 Kaijen Hsiao Henry de Plinval Jason Miller Introduction In the context

More information

CMU Amazon Picking Challenge

CMU Amazon Picking Challenge CMU Amazon Picking Challenge Team Harp (Human Assistive Robot Picker) Graduate Developers Abhishek Bhatia Alex Brinkman Lekha Walajapet Feroza Naina Rick Shanor Search Based Planning Lab Maxim Likhachev

More information

CS283: Robotics Fall 2016: Software

CS283: Robotics Fall 2016: Software CS283: Robotics Fall 2016: Software Sören Schwertfeger / 师泽仁 ShanghaiTech University Mobile Robotics ShanghaiTech University - SIST - 18.09.2016 2 Review Definition Robot: A machine capable of performing

More information

Postprint.

Postprint. http://www.diva-portal.org Postprint This is the accepted version of a paper presented at 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, September 28-October

More information

Intra-Logistics with Integrated Automatic Deployment: Safe and Scalable Fleets in Shared Spaces

Intra-Logistics with Integrated Automatic Deployment: Safe and Scalable Fleets in Shared Spaces Intra-Logistics with Integrated Automatic Deployment: Safe and Scalable Fleets in Shared Spaces H2020-ICT-2016-2017 Grant agreement no: 732737 DELIVERABLE 1.2 Precise localisation and mapping in dynamic

More information

Electronic Travel Aids for Blind Guidance an Industry Landscape Study

Electronic Travel Aids for Blind Guidance an Industry Landscape Study Electronic Travel Aids for Blind Guidance an Industry Landscape Study Kun (Linda) Li EECS, UC Berkeley Dec 4 th, 2015 IND ENG 290 Final Presentation Visual Impairment Traditional travel aids are limited,

More information

How Microcontrollers help GPUs in Autonomous Drive

How Microcontrollers help GPUs in Autonomous Drive How Microcontrollers help GPUs in Autonomous Drive GTC 2017 Munich, 2017-10-12 Hans Adlkofer, VP Automotive System department Outline 1 Main Safety concepts 2 Sensor Fusion architecture and functionalities

More information

Flexible Visual Inspection. IAS-13 Industrial Forum Horizon 2020 Dr. Eng. Stefano Tonello - CEO

Flexible Visual Inspection. IAS-13 Industrial Forum Horizon 2020 Dr. Eng. Stefano Tonello - CEO Flexible Visual Inspection IAS-13 Industrial Forum Horizon 2020 Dr. Eng. Stefano Tonello - CEO IT+Robotics Spin-off of University of Padua founded in 2005 Strong relationship with IAS-LAB (Intelligent

More information

Optimally Placing Multiple Kinect Sensors for Workplace Monitoring

Optimally Placing Multiple Kinect Sensors for Workplace Monitoring Nima Rafibakhsht Industrial Engineering Southern Illinois University Edwardsville, IL 62026 nima1367@gmail.com H. Felix Lee Industrial Engineering Southern Illinois University Edwardsville, IL 62026 hflee@siue.edu

More information

Deep Incremental Scene Understanding. Federico Tombari & Christian Rupprecht Technical University of Munich, Germany

Deep Incremental Scene Understanding. Federico Tombari & Christian Rupprecht Technical University of Munich, Germany Deep Incremental Scene Understanding Federico Tombari & Christian Rupprecht Technical University of Munich, Germany C. Couprie et al. "Toward Real-time Indoor Semantic Segmentation Using Depth Information"

More information

TURN AROUND BEHAVIOR GENERATION AND EXECUTION FOR UNMANNED GROUND VEHICLES OPERATING IN ROUGH TERRAIN

TURN AROUND BEHAVIOR GENERATION AND EXECUTION FOR UNMANNED GROUND VEHICLES OPERATING IN ROUGH TERRAIN 1 TURN AROUND BEHAVIOR GENERATION AND EXECUTION FOR UNMANNED GROUND VEHICLES OPERATING IN ROUGH TERRAIN M. M. DABBEERU AND P. SVEC Department of Mechanical Engineering, University of Maryland, College

More information

Photoneo's brand new PhoXi 3D Camera is the highest resolution and highest accuracy area based 3D

Photoneo's brand new PhoXi 3D Camera is the highest resolution and highest accuracy area based 3D Company: Photoneo s.r.o. Germany Contact: Veronika Pulisova E-mail: pulisova@photoneo.com PhoXi 3D Camera Author: Tomas Kovacovsky & Jan Zizka Description of the innovation: General description Photoneo's

More information

Has Something Changed Here? Autonomous Difference Detection for Security Patrol Robots

Has Something Changed Here? Autonomous Difference Detection for Security Patrol Robots Has Something Changed Here? Autonomous Difference Detection for Security Patrol Robots Henrik Andreasson, Martin Magnusson and Achim Lilienthal Centre of Applied Autonomous Sensor Systems, Dept. of Technology,

More information

3D-2D Laser Range Finder calibration using a conic based geometry shape

3D-2D Laser Range Finder calibration using a conic based geometry shape 3D-2D Laser Range Finder calibration using a conic based geometry shape Miguel Almeida 1, Paulo Dias 1, Miguel Oliveira 2, Vítor Santos 2 1 Dept. of Electronics, Telecom. and Informatics, IEETA, University

More information

The Kinect Sensor. Luís Carriço FCUL 2014/15

The Kinect Sensor. Luís Carriço FCUL 2014/15 Advanced Interaction Techniques The Kinect Sensor Luís Carriço FCUL 2014/15 Sources: MS Kinect for Xbox 360 John C. Tang. Using Kinect to explore NUI, Ms Research, From Stanford CS247 Shotton et al. Real-Time

More information

H2020 Space Robotic SRC- OG4

H2020 Space Robotic SRC- OG4 H2020 Space Robotic SRC- OG4 2 nd PERASPERA workshop Presentation by Sabrina Andiappane Thales Alenia Space France This project has received funding from the European Union s Horizon 2020 research and

More information

COS Lecture 13 Autonomous Robot Navigation

COS Lecture 13 Autonomous Robot Navigation COS 495 - Lecture 13 Autonomous Robot Navigation Instructor: Chris Clark Semester: Fall 2011 1 Figures courtesy of Siegwart & Nourbakhsh Control Structure Prior Knowledge Operator Commands Localization

More information

AUTOMATED GENERATION OF VIRTUAL DRIVING SCENARIOS FROM TEST DRIVE DATA

AUTOMATED GENERATION OF VIRTUAL DRIVING SCENARIOS FROM TEST DRIVE DATA F2014-ACD-014 AUTOMATED GENERATION OF VIRTUAL DRIVING SCENARIOS FROM TEST DRIVE DATA 1 Roy Bours (*), 1 Martijn Tideman, 2 Ulrich Lages, 2 Roman Katz, 2 Martin Spencer 1 TASS International, Rijswijk, The

More information

Announcements. Exam #2 next Thursday (March 13) Covers material from Feb. 11 through March 6

Announcements. Exam #2 next Thursday (March 13) Covers material from Feb. 11 through March 6 Multi-Robot Path Planning and Multi-Robot Traffic Management March 6, 2003 Class Meeting 16 Announcements Exam #2 next Thursday (March 13) Covers material from Feb. 11 through March 6 Up to Now Swarm-Type

More information

Terrain Data Real-time Analysis Based on Point Cloud for Mars Rover

Terrain Data Real-time Analysis Based on Point Cloud for Mars Rover Terrain Data Real-time Analysis Based on Point Cloud for Mars Rover Haoruo ZHANG 1, Yuanjie TAN, Qixin CAO. Abstract. With the development of space exploration, more and more aerospace researchers pay

More information