HANSE - A Low-Cost Autonomous Underwater Vehicle


Dariush Forouher, Jan Hartmann, Jan Helge Klüssendorff, Erik Maehle, Benjamin Meyer, Christoph Osterloh, and Thomas Tosik
Institute of Computer Engineering, University of Lübeck, Ratzeburger Allee 160, Lübeck, Germany

Abstract. HANSE is a low-cost Autonomous Underwater Vehicle (AUV) capable of solving many common underwater challenges. In this paper we present HANSE's modular and expandable hardware and software design, the underwater simulator MARS, and robust, efficient sonar-based localization and vision-based object detection algorithms, with which we have successfully participated in the Student Autonomous Underwater Vehicle Challenge in Europe (SAUC-E).

1 Introduction

Autonomous underwater vehicles (AUVs) have become increasingly popular, for example in the fields of environmental monitoring and inspection of underwater structures. The underwater environment is a challenge to autonomous robots, as it limits both sensing and actuation. Changes in the environment, e.g. water currents, direct sunlight, or floating obstacles, may affect the robot significantly. HANSE is a low-cost AUV with an innovative commodity hardware-based, modular, and expandable design. It solves localization using only sonar sensors and an attitude and heading reference system (AHRS), and it provides robust vision-based underwater object detection and tracking. We present the HANSE AUV, its design, and its algorithms in this paper.

Robotic competitions, e.g. the RoboCup, have been shown to improve research and promote international collaboration. The Student Autonomous Underwater Vehicle Challenge in Europe (SAUC-E) provides an opportunity to demonstrate solutions to a variety of problems in underwater robotics, including localization and navigation, visual and acoustic object detection and tracking, and manipulation. We have participated in SAUC-E for the last three years with HANSE.
Our experiences at the competition will be used to evaluate our design choices and algorithms.

1.1 Related Work

Ten teams from nine different European universities competed in SAUC-E. The competing AUVs are typically cylindrically shaped, one major difference to HANSE. Most AUVs use scanning sonars and an AHRS for localization, and multiple cameras for object detection. Teams with large budgets, for example Heriot-Watt University with Nessie

Fig. 1. The autonomous underwater robot HANSE.

IV [MCJ+09], winner of SAUC-E 2009, and the University of Girona with Sparus, winner of SAUC-E 2010 [APL+11], can afford additional, higher-quality sensors, e.g. Doppler Velocity Logs (DVL) for odometry and additional sonar sensors. In particular, the inclusion of a DVL may be an advantage. Ribas et al. describe an approach to Simultaneous Localization and Mapping (SLAM) using a scanning sonar in [RRTN07]. They extract line features from sonar images using the Hough transform and use them as landmarks for localization. Distortions due to movement of the vehicle are corrected by estimating the motion of the robot with the help of a DVL and adjusting the sonar images accordingly, resulting in an accurate and robust localization. A comparison of different camera-based object tracking algorithms in underwater environments is given in [SD06]. A feature-based approach to visual SLAM is presented in [ACL+10]. For HANSE, we use a much simpler color-based object detection algorithm, which nevertheless proved to be accurate and robust.

Generally, there are two main differences between HANSE and other similarly successful AUVs at the SAUC-E competition. Firstly, HANSE was built on a comparatively low budget, limiting the choices in both mechanical design and sensors. Secondly, HANSE was built and maintained mainly by undergraduate students, which again significantly affected the hardware and software design. This paper will therefore focus on these two aspects.

2 Design

Our design decisions are based on the SAUC-E mission rules on the one hand and budget limits and student requirements on the other. The SAUC-E mission rules state the general tasks, or missions, the AUV must perform as well as the competition rules. Missions include, for example, following a yellow pipeline on the seabed, following a harbour wall, or tracking a surface vehicle.
We generally see localization and object detection as the main challenges at SAUC-E. Localization is needed to navigate between missions and to provide feedback on the success or failure of a mission. Object detection is needed to identify the mission target. The main focus has to be on robustness, as several runs have to be completed successfully at different times of day and potentially under different weather conditions. Due to the limited budget, we built HANSE with part costs of less than 14,000 Euro. We could therefore only include the most necessary sensors, e.g. a scanning sonar for localization and multiple cameras for object detection.

Fig. 2. Electronic components of HANSE.

Student participation affects the development process and the choice of algorithms. The amount of work that can be put into the development of the AUV through practical courses and theses is limited; the AUV must therefore be designed iteratively over several years, requiring a modular and expandable design. We further rely heavily on our underwater simulator MARS, which enables students to develop and thoroughly test algorithms without needing the actual robot.

2.1 Hardware

The biggest challenge in this competition is, by far, the construction of the AUV. HANSE is by its nature an experimental prototype, requiring high maintenance and multiple team members to operate. Acknowledging that, we kept HANSE's design simple, so that failures and breakdowns can be repaired easily, often within minutes. All parts are easily accessible and easy to replace. The base frame of our AUV is shaped like a sledge. All sensors and thrusters are attached to the frame using pipe clamps, allowing for easy reconfiguration. The base frame is made of 50 mm polypropylene (PP) tubes. We chose this material because of its light weight and because single parts can easily be welded together. To increase its solidity, the frame was strengthened with a glass fiber sheathing. Drilled holes allow the frame to flood; it therefore has near neutral buoyancy. A commercial waterproof case, designed for photo equipment, is fixed on the base frame and serves as the main pressure hull of the robot. With inside measurements of 30 x 22.5 x 13.2 cm, the case contains the laptop, batteries, and auxiliary electronics. Similar designs have also been proposed for hobby AUVs (see e.g. [Boh97]). Thrusters, sonar devices, pressure sensor, and cameras are placed outside the main case and connected to it by waterproof connectors (BULGIN Buccaneer).
To ensure a stable horizontal orientation of the robot, additional lead weights are attached to the frame.

Fig. 2 gives an overview of the major electronic components of HANSE. The central processing unit of HANSE is a 12.1-inch laptop. All peripheral hardware is connected by USB. The motor controllers and the pressure sensor are accessed via an I2C bus. An ATmega-based microcontroller board (Chip45 Crumb168) acts as a bridge between the laptop and the I2C bus.
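To illustrate the bridge design, the sketch below frames a hypothetical I2C write command for the serial link. The start byte, field layout, and XOR checksum are invented for this example and are not the actual HANSE protocol:

```python
def frame_i2c_write(addr, reg, value):
    """Build one hypothetical bridge packet: start byte, 3-byte payload,
    XOR checksum. The framing is illustrative, not the real protocol."""
    if not (0 <= addr < 128 and 0 <= reg < 256 and 0 <= value < 256):
        raise ValueError("field out of range")
    payload = bytes([addr, reg, value])
    checksum = 0
    for b in payload:
        checksum ^= b                    # XOR over the payload bytes
    return bytes([0x7E]) + payload + bytes([checksum])

# Example: write 128 to register 2 of a (made-up) motor controller at 0x10.
pkt = frame_i2c_write(addr=0x10, reg=0x02, value=128)
# On the robot, such a packet would be written to the USB serial port,
# e.g. serial.Serial("/dev/ttyUSB0", 115200).write(pkt) with pyserial.
```

The microcontroller would unpack the payload and issue the corresponding write on the I2C bus; the checksum lets either side detect corrupted frames on the serial link.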

Fig. 3. Snapshot of the simulation environment MARS.

2.2 Software

We chose to develop a custom software framework for HANSE, the main reason being that no cross-platform framework was available and we did not want to commit to an operating system at that time. The development of a custom, light-weight framework further eases the introduction of new students joining the team and increases our degree of control over the system. The framework consists of a slim core, which includes the raw graphical user interface and helper classes, e.g. logging, a serial port interface, and configuration storage. It is further divided into modules, each executed in its own thread. Modelled after the data flow of the robot, modules are arranged in a directed acyclic graph. Modules can be roughly separated into three levels: drivers, behaviours, and tasks. Driver modules provide abstracted access to low-level hardware, e.g. thrusters and sensors. High-level algorithm modules process the sensor data and provide information for the behaviour modules. A task module encapsulates the different behaviours of a SAUC-E mission. Both behaviours and tasks are implemented as state machines.

2.3 Simulation

AUVs operate in a risky and ever-changing environment. Due to the complexity of a system such as an AUV, extensive testing is necessary to minimize these risks. One solution to this problem is the use of simulation. There are several underwater simulators, e.g. Neptune [RBRC04], MarineSim [SWL+10], or SubSim [BB06]. But these are either not available to us (Neptune), provide only rudimentary underwater simulation (MarineSim and SubSim), or are closed source and not extendable (SubSim). We have therefore developed our own simulator, the Marine Robotics Simulator (MARS, see Fig. 3). MARS is a hardware-in-the-loop (HIL) simulation environment for multiple AUVs written mainly in Java. It uses the Bullet Physics Library as a physics engine and

Fig. 4. Illustration of the sonar image filter chain. Left: a 360° scan as provided by the scanning sonar (in polar coordinates). Center: sonar image after applying the gradient filter. Right: extracted wall features after applying non-maximum suppression and heuristics.

jMonkeyEngine 3 for 3D rendering. MARS runs independently of the HANSE framework and offers a straightforward TCP connection. To ensure natural behaviour of the AUV, the physics, the world model, and the sensor and actuator models have to be as realistic as possible. On the physics side, static buoyancy, drag, waves, and currents have been taken into account. The volume and center of buoyancy of the AUV are calculated through ray collisions depending on the current water height. Drag is simulated by parallel projection and pixel counting. The world model consists of height map-based terrain, in which other objects such as buoys or pipelines can be placed. To perceive and interact with the world, several sensors and actuators were implemented. The sonar is simulated through raycasting, where several rays collide with the world. The length and the angle of impact are taken into account, resulting in a more realistic sonar image. The sonar noise is based on experimental data. The simulated cameras allow more natural-looking underwater environments through the use of various filters, e.g. underwater fog and depth of field. The thruster model is based on experimental data taken from the Seabotix thrusters. Noise, or a custom noise function, can be added to all sensors and actuators.

With the use of MARS, we were able to perform thorough dry-dock testing of all behaviours and higher-level code in HANSE. Complete runs with several tasks could be performed, and several team members could work in parallel and evaluate our algorithms, resulting in a faster development cycle.

3 Localization

In underwater scenarios, localization is a challenging problem [RRTN07].
A scanning sonar provides information about the environment and an AHRS provides information about the robot's orientation. Without a DVL for odometry, one challenge lies in the slow update rate of the scanning sonar, which takes at least 8 s for a 360° rotation. Another challenge is the robust extraction of features from the noisy sonar images. Our localization algorithm is based on a landmark-based particle filter framework as described in [FHLM11]. We have further improved feature extraction using a multi-scale gradient filter to enhance the regions of the sonar image characteristic of walls,

which are used as landmarks. 1D Haar wavelet responses at different scales are multiplied for each beam pixel to form the beam gradient G as

  G(x) = ∏_{k ∈ K} ( Σ_{i=x−k}^{x} B(i) − Σ_{i=x+1}^{x+k} B(i) ),    (1)

where x is the distance from the robot, K is the set of all scales to be evaluated, and B(i) is the echo intensity at distance i. The Haar wavelet responses can be calculated efficiently using integral images. The filtered sonar image is then analyzed to find wall locations. First, a non-maximum suppression is performed to identify potential walls. For each beam, all local maxima with a gradient intensity of less than one tenth of the maximum gradient intensity in the last sonar image are discarded. Further heuristics concerning the neighbors of wall features are applied, based on assumptions about the continuity of walls and the minimum lengths of wall segments. The result of the filter chain is illustrated in Fig. 4.

4 Object Detection

The challenge for the computer vision algorithm lies in robustness to changing outer conditions, e.g. different lighting or particles in the water, as well as the poor image quality of the low-cost webcams that were used. Visual object detection is thus performed based on color-based segmentation rather than edge- or corner-based feature extraction techniques. Each image is segmented in the appropriate color channel (e.g. red for the pipeline) using Otsu's algorithm for automatic threshold selection [Ots79]. This method provides an efficient object detection that is fairly robust to image noise and lighting. The presence of an object is decided based on the number of pixels belonging to the object class and the mean position of these pixels. The object orientation θ can further be found using centralized image moments [BSA91] of the segmented image. An example of the object detection algorithm is shown in Fig. 5. The extracted information can then be used by simple homing and following behaviours to perform a mission, e.g. follow the yellow pipeline.
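The segmentation-and-moments pipeline can be sketched as follows, in pure Python on a tiny synthetic single-channel image (the real system operates on a colour channel of a webcam frame; the image here is an assumption for illustration). The thresholding is Otsu's method and the orientation formula is the standard one from second-order central moments:

```python
import math

def otsu_threshold(pixels):
    """Otsu's method: pick the threshold maximising between-class variance."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = 0       # background pixel count up to threshold t
    sum0 = 0.0   # background intensity sum up to threshold t
    for t in range(256):
        w0 += hist[t]
        sum0 += t * hist[t]
        if w0 == 0 or w0 == total:
            continue
        w1 = total - w0
        mu0, mu1 = sum0 / w0, (total_sum - sum0) / w1
        var = w0 * w1 * (mu0 - mu1) ** 2   # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def orientation(mask):
    """Object orientation theta from centralized second-order moments."""
    pts = [(y, x) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
    n = len(pts)
    cy = sum(y for y, _ in pts) / n
    cx = sum(x for _, x in pts) / n
    mu11 = sum((x - cx) * (y - cy) for y, x in pts)
    mu20 = sum((x - cx) ** 2 for _, x in pts)
    mu02 = sum((y - cy) ** 2 for y, _ in pts)
    return 0.5 * math.atan2(2 * mu11, mu20 - mu02)

# Synthetic 'red channel': a bright diagonal stripe on a dark background.
img = [[200 if x == y else 20 for x in range(8)] for y in range(8)]
t = otsu_threshold([p for row in img for p in row])
mask = [[p > t for p in row] for row in img]
theta = orientation(mask)   # the stripe's angle in image coordinates
```

The pixel count and centroid computed inside `orientation` are the same quantities the presence decision is based on, so in practice one pass over the segmented image yields all three results.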
Fig. 5. Object detection using segmentation. Left: RGB image. Center: segmentation channel (red). Right: Otsu thresholding. The blue line indicates the position and direction of a yellow pipeline.
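Returning to the sonar side, the beam gradient of Eq. (1) can be evaluated cheaply with a prefix sum over each beam, the 1D analogue of the integral images mentioned above: each windowed sum then costs O(1) per scale. The beam data and scale set below are illustrative assumptions, not competition data:

```python
def beam_gradient(B, K):
    """G(x) = prod over k in K of (sum B[x-k..x] - sum B[x+1..x+k]), Eq. (1)."""
    n = len(B)
    P = [0] * (n + 1)               # prefix sum: P[i] = B[0] + ... + B[i-1]
    for i, b in enumerate(B):
        P[i + 1] = P[i] + b

    def wsum(lo, hi):               # sum of B[lo..hi] in O(1) via P
        return P[hi + 1] - P[lo]

    kmax = max(K)
    G = [0.0] * n
    for x in range(kmax, n - kmax):  # skip beam ends where a window runs out
        g = 1.0
        for k in K:
            g *= wsum(x - k, x) - wsum(x + 1, x + k)
        G[x] = g
    return G

# A step edge (open water -> wall echo) yields negative Haar responses at
# every scale; with an even number of scales their product is a strong
# positive peak right at the edge, while flat regions stay small.
B = [1] * 10 + [9] * 10
G = beam_gradient(B, K=[2, 4])
peak = max(range(len(G)), key=lambda i: G[i])   # index of strongest response
```

Multiplying responses across scales also suppresses isolated noise spikes, since a spike that fires at only one scale contributes a near-zero factor at the others.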

Fig. 6. Robot path estimated by the localization algorithm at the SAUC-E competition (axes in m). The start point is at the red circle, the end point at the red square. Walls are depicted in thick black. The different phases of the run are shown as described in the text.

5 Evaluation

An evaluation of the presented hardware and software design and of the algorithm performance at SAUC-E is difficult due to the lack of ground truth data. The performance of the presented localization and object detection algorithms may, however, be inferred from data recorded at the competition. Fig. 6, for example, shows the estimated robot path for a qualifying run at SAUC-E. The AUV started at the side of the test site and moved to the pipeline in the center of the test site (dotted black line). The target point was reached accurately, as the pipeline was found successfully. The curved pipeline was then followed back and forth (solid red line). The path in this section closely resembles the actual pipeline, an indication of both accurate localization and accurate object detection. The AUV then moved to the lower right to start the wall following mission (dotted black line). The wall was followed at a distance of 5 m until the end of the run (solid blue line), which matches the distance from the wall estimated by the localization algorithm.

Generally, several missions were completed successfully and repeatedly in the qualifying, semi-final, and final runs, an indication of the robustness of our algorithms. Pipeline following and wall following were completed successfully in all runs, which demonstrates both the robustness of our approach to underwater object detection and the stability of the localization, which was used to navigate from the start to the pipeline as well as from the pipeline to the wall to be followed. We were further the only team to successfully find and cut loose the red buoy.

6 Conclusion

In this paper, we have presented our AUV HANSE. A number of design decisions make HANSE special.
With a limited budget and a focus on student participation in the development of both hard- and software, simplicity and robustness have been key features of our work. The modular hardware design based on commodity parts was the basis of

an iterative improvement over the last three years. In terms of software, we developed a slim custom software framework, accurate localization, and camera-based object detection. A special focus was laid on repeatability under changes in the environment, resulting in a particle filter-based localization algorithm with sophisticated feature extraction and an automatic color segmentation algorithm for object detection. With HANSE, we have participated in the Student Autonomous Underwater Vehicle Challenge in Europe (SAUC-E), winning the innovation prize in our first year of participation in Gosport, England. At SAUC-E 2011 in La Spezia, Italy, we were able to achieve great consistency throughout the entire competition, thus winning the first prize. In the future, as the missions and therefore our algorithms become more complex, the next step will be a migration to the Robot Operating System (ROS) framework, which will enable us to incorporate advanced state-of-the-art algorithms more easily. The expandable design will enable us to equip HANSE with new sensors, e.g. an acoustic modem, as needed to compete successfully in the years to come.

References

[ACL+10] J. Aulinas, M. Carreras, X. Llado, J. Salvi, R. Garcia, R. Prados, and Y. Petillot. Feature extraction for underwater visual SLAM. In IEEE OCEANS, 2010.
[APL+11] J. Aulinas, Y. Petillot, X. Llado, J. Salvi, and R. Garcia. Vision-based underwater SLAM for the SPARUS AUV. In International Conference on Computer and IT Applications in the Maritime Industries (COMPIT), 2011.
[BB06] A. Boeing and T. Bräunl. SubSim: An autonomous underwater vehicle simulation package. In International Symposium on Autonomous Minirobots for Research and Edutainment (AMiRE). Springer Berlin Heidelberg, 2006.
[Boh97] Harry Bohm. Build Your Own Underwater Robot and Other Wet Projects. Westcoast Words, January 1997.
[BSA91] S. O. Belkasim, M. Shridhar, and M. Ahmadi.
Pattern recognition with moment invariants: A comparative study and new results. Pattern Recognition, 24(12), 1991.
[FHLM11] D. Forouher, J. Hartmann, M. Litza, and E. Maehle. Sonar-based FastSLAM in an underwater environment using walls as features. In International Conference on Advanced Robotics (ICAR), 2011.
[MCJ+09] F. Maurelli, J. Cartwright, N. Johnson, G. Bossant, P.-L. Garmier, P. Regis, J. Sawas, and Y. Petillot. Nessie IV autonomous underwater vehicle. In Unmanned Underwater Vehicles Showcase (UUVS), 2009.
[Ots79] N. Otsu. A threshold selection method from gray-level histograms. IEEE Transactions on Systems, Man and Cybernetics, 9(1):62-66, 1979.
[RBRC04] P. Ridao, E. Batlle, D. Ribas, and M. Carreras. NEPTUNE: A HIL simulator for multiple UUVs. In IEEE OCEANS, 2004.
[RRTN07] D. Ribas, P. Ridao, J. D. Tardos, and J. Neira. Underwater SLAM in a marina environment. In IEEE International Conference on Intelligent Robots and Systems (IROS), 2007.
[SD06] J. Sattar and G. Dudek. On the performance of color tracking algorithms for underwater robots under varying lighting and visibility. In IEEE International Conference on Robotics and Automation (ICRA), 2006.
[SWL+10] P. G. C. N. Senarathne, W. S. Wijesoma, Kwang Wee Lee, B. Kalyan, M. D. P. Moratuwage, N. M. Patrikalakis, and F. S. Hover. MarineSIM: Robot simulation for marine environments. In IEEE OCEANS, 2010.


More information

MEMS technology quality requirements as applied to multibeam echosounder. Jerzy DEMKOWICZ, Krzysztof BIKONIS

MEMS technology quality requirements as applied to multibeam echosounder. Jerzy DEMKOWICZ, Krzysztof BIKONIS MEMS technology quality requirements as applied to multibeam echosounder Jerzy DEMKOWICZ, Krzysztof BIKONIS Gdansk University of Technology Gdansk, Narutowicza str. 11/12, Poland demjot@eti.pg.gda.pl Small,

More information

Monocular SLAM for a Small-Size Humanoid Robot

Monocular SLAM for a Small-Size Humanoid Robot Tamkang Journal of Science and Engineering, Vol. 14, No. 2, pp. 123 129 (2011) 123 Monocular SLAM for a Small-Size Humanoid Robot Yin-Tien Wang*, Duen-Yan Hung and Sheng-Hsien Cheng Department of Mechanical

More information

Using Layered Color Precision for a Self-Calibrating Vision System

Using Layered Color Precision for a Self-Calibrating Vision System ROBOCUP2004 SYMPOSIUM, Instituto Superior Técnico, Lisboa, Portugal, July 4-5, 2004. Using Layered Color Precision for a Self-Calibrating Vision System Matthias Jüngel Institut für Informatik, LFG Künstliche

More information

DEVELOPMENT OF POSITION MEASUREMENT SYSTEM FOR CONSTRUCTION PILE USING LASER RANGE FINDER

DEVELOPMENT OF POSITION MEASUREMENT SYSTEM FOR CONSTRUCTION PILE USING LASER RANGE FINDER S17- DEVELOPMENT OF POSITION MEASUREMENT SYSTEM FOR CONSTRUCTION PILE USING LASER RANGE FINDER Fumihiro Inoue 1 *, Takeshi Sasaki, Xiangqi Huang 3, and Hideki Hashimoto 4 1 Technica Research Institute,

More information

Building Reliable 2D Maps from 3D Features

Building Reliable 2D Maps from 3D Features Building Reliable 2D Maps from 3D Features Dipl. Technoinform. Jens Wettach, Prof. Dr. rer. nat. Karsten Berns TU Kaiserslautern; Robotics Research Lab 1, Geb. 48; Gottlieb-Daimler- Str.1; 67663 Kaiserslautern;

More information

A Simple Automated Void Defect Detection for Poor Contrast X-ray Images of BGA

A Simple Automated Void Defect Detection for Poor Contrast X-ray Images of BGA Proceedings of the 3rd International Conference on Industrial Application Engineering 2015 A Simple Automated Void Defect Detection for Poor Contrast X-ray Images of BGA Somchai Nuanprasert a,*, Sueki

More information

CS 223B Computer Vision Problem Set 3

CS 223B Computer Vision Problem Set 3 CS 223B Computer Vision Problem Set 3 Due: Feb. 22 nd, 2011 1 Probabilistic Recursion for Tracking In this problem you will derive a method for tracking a point of interest through a sequence of images.

More information

Modelling and Simulation of the Autonomous Underwater Vehicle (AUV) Robot

Modelling and Simulation of the Autonomous Underwater Vehicle (AUV) Robot 21st International Congress on Modelling and Simulation, Gold Coast, Australia, 29 Nov to 4 Dec 2015 www.mssanz.org.au/modsim2015 Modelling and Simulation of the Autonomous Underwater Vehicle (AUV) Robot

More information

A Compact Control Language for AUV Acoustic Communication Roger P. Stokey Lee E. Freitag Matthew D. Grund

A Compact Control Language for AUV Acoustic Communication Roger P. Stokey Lee E. Freitag Matthew D. Grund A Compact Control Language for AUV Acoustic Communication Roger P. Stokey Lee E. Freitag Matthew D. Grund Woods Hole Oceanographic Institution Woods Hole, MA 054 ccl@whoi.edu Abstract - Acoustic communication

More information

CS4758: Rovio Augmented Vision Mapping Project

CS4758: Rovio Augmented Vision Mapping Project CS4758: Rovio Augmented Vision Mapping Project Sam Fladung, James Mwaura Abstract The goal of this project is to use the Rovio to create a 2D map of its environment using a camera and a fixed laser pointer

More information

International Journal of Advance Engineering and Research Development

International Journal of Advance Engineering and Research Development Scientific Journal of Impact Factor (SJIF): 4.14 International Journal of Advance Engineering and Research Development Volume 3, Issue 3, March -2016 e-issn (O): 2348-4470 p-issn (P): 2348-6406 Research

More information

Introduction to Autonomous Mobile Robots

Introduction to Autonomous Mobile Robots Introduction to Autonomous Mobile Robots second edition Roland Siegwart, Illah R. Nourbakhsh, and Davide Scaramuzza The MIT Press Cambridge, Massachusetts London, England Contents Acknowledgments xiii

More information

Team Description Paper Team AutonOHM

Team Description Paper Team AutonOHM Team Description Paper Team AutonOHM Jon Martin, Daniel Ammon, Helmut Engelhardt, Tobias Fink, Tobias Scholz, and Marco Masannek University of Applied Science Nueremberg Georg-Simon-Ohm, Kesslerplatz 12,

More information

Machine Learning on Physical Robots

Machine Learning on Physical Robots Machine Learning on Physical Robots Alfred P. Sloan Research Fellow Department or Computer Sciences The University of Texas at Austin Research Question To what degree can autonomous intelligent agents

More information

Research and application of volleyball target tracking algorithm based on surf corner detection

Research and application of volleyball target tracking algorithm based on surf corner detection Acta Technica 62 No. 3A/217, 187 196 c 217 Institute of Thermomechanics CAS, v.v.i. Research and application of volleyball target tracking algorithm based on surf corner detection Guowei Yuan 1 Abstract.

More information

Active2012 HOSEI UNIVERSITY

Active2012 HOSEI UNIVERSITY Active2012 HOSEI UNIVERSITY Faculty of Science and Engineering, Hosei University 3-7-2 Kajinocho Koganei, Tokyo 194-8584, Japan E-mail; ikko@hosei.ac.jp Faculty Advisor Statement I hereby certify that

More information

A Robot Recognizing Everyday Objects

A Robot Recognizing Everyday Objects A Robot Recognizing Everyday Objects -- Towards Robot as Autonomous Knowledge Media -- Hideaki Takeda Atsushi Ueno Motoki Saji, Tsuyoshi Nakano Kei Miyamato The National Institute of Informatics Nara Institute

More information

ICTINEU AUV Takes the Challenge

ICTINEU AUV Takes the Challenge ICTINEU AUV Takes the Challenge D. Ribas, N. Palomeras., X. Ribas, G. García de Marina, E. Hernàndez, F.Chung, N. Hurtós, J. Massich, A. Almohaya and J. Vila. Institute of Informatics and Applications.

More information

An Angle Estimation to Landmarks for Autonomous Satellite Navigation

An Angle Estimation to Landmarks for Autonomous Satellite Navigation 5th International Conference on Environment, Materials, Chemistry and Power Electronics (EMCPE 2016) An Angle Estimation to Landmarks for Autonomous Satellite Navigation Qing XUE a, Hongwen YANG, Jian

More information

Equipment Site Description

Equipment Site Description Equipment Site Description The 150 g-ton geotechnical centrifuge NEES facility is located in the basement of the Jonsson Engineering Center (JEC) of the RPI campus in Troy, NY. This building also contains

More information

Robotics Programming Laboratory

Robotics Programming Laboratory Chair of Software Engineering Robotics Programming Laboratory Bertrand Meyer Jiwon Shin Lecture 8: Robot Perception Perception http://pascallin.ecs.soton.ac.uk/challenges/voc/databases.html#caltech car

More information

Using Side Scan Sonar to Relative Navigation

Using Side Scan Sonar to Relative Navigation Using Side Scan Sonar to Relative Navigation Miguel Pinto, Bruno Ferreira, Aníbal Matos, Nuno Cruz FEUP-DEEC, Rua Dr. Roberto Frias, 4200-465 Porto, Portugal ee04134@fe.up.pt, ee04018@fe.up.pt, anibal@fe.up.pt,

More information

Recognition of a Predefined Landmark Using Optical Flow Sensor/Camera

Recognition of a Predefined Landmark Using Optical Flow Sensor/Camera Recognition of a Predefined Landmark Using Optical Flow Sensor/Camera Galiev Ilfat, Alina Garaeva, Nikita Aslanyan The Department of Computer Science & Automation, TU Ilmenau 98693 Ilmenau ilfat.galiev@tu-ilmenau.de;

More information

Table of Contents. Introduction 1. Software installation 2. Remote control and video transmission 3. Navigation 4. FAQ 5.

Table of Contents. Introduction 1. Software installation 2. Remote control and video transmission 3. Navigation 4. FAQ 5. Table of Contents Introduction 1. Software installation 2. Remote control and video transmission 3. Navigation 4. FAQ 5. Maintenance 1.1 1.2 1.3 1.4 1.5 1.6 2 Introduction Introduction Introduction The

More information

DYNAMIC POSITIONING CONFERENCE September 16-17, Sensors

DYNAMIC POSITIONING CONFERENCE September 16-17, Sensors DYNAMIC POSITIONING CONFERENCE September 16-17, 2003 Sensors An Integrated acoustic positioning and inertial navigation system Jan Erik Faugstadmo, Hans Petter Jacobsen Kongsberg Simrad, Norway Revisions

More information

Canny Edge Based Self-localization of a RoboCup Middle-sized League Robot

Canny Edge Based Self-localization of a RoboCup Middle-sized League Robot Canny Edge Based Self-localization of a RoboCup Middle-sized League Robot Yoichi Nakaguro Sirindhorn International Institute of Technology, Thammasat University P.O. Box 22, Thammasat-Rangsit Post Office,

More information

3D Fusion of Infrared Images with Dense RGB Reconstruction from Multiple Views - with Application to Fire-fighting Robots

3D Fusion of Infrared Images with Dense RGB Reconstruction from Multiple Views - with Application to Fire-fighting Robots 3D Fusion of Infrared Images with Dense RGB Reconstruction from Multiple Views - with Application to Fire-fighting Robots Yuncong Chen 1 and Will Warren 2 1 Department of Computer Science and Engineering,

More information

Mapping Contoured Terrain Using SLAM with a Radio- Controlled Helicopter Platform. Project Proposal. Cognitive Robotics, Spring 2005

Mapping Contoured Terrain Using SLAM with a Radio- Controlled Helicopter Platform. Project Proposal. Cognitive Robotics, Spring 2005 Mapping Contoured Terrain Using SLAM with a Radio- Controlled Helicopter Platform Project Proposal Cognitive Robotics, Spring 2005 Kaijen Hsiao Henry de Plinval Jason Miller Introduction In the context

More information

Localization of Multiple Robots with Simple Sensors

Localization of Multiple Robots with Simple Sensors Proceedings of the IEEE International Conference on Mechatronics & Automation Niagara Falls, Canada July 2005 Localization of Multiple Robots with Simple Sensors Mike Peasgood and Christopher Clark Lab

More information

using an omnidirectional camera, sufficient information for controlled play can be collected. Another example for the use of omnidirectional cameras i

using an omnidirectional camera, sufficient information for controlled play can be collected. Another example for the use of omnidirectional cameras i An Omnidirectional Vision System that finds and tracks color edges and blobs Felix v. Hundelshausen, Sven Behnke, and Raul Rojas Freie Universität Berlin, Institut für Informatik Takustr. 9, 14195 Berlin,

More information

Evaluating optical flow vectors through collision points of object trajectories in varying computergenerated snow intensities for autonomous vehicles

Evaluating optical flow vectors through collision points of object trajectories in varying computergenerated snow intensities for autonomous vehicles Eingebettete Systeme Evaluating optical flow vectors through collision points of object trajectories in varying computergenerated snow intensities for autonomous vehicles 25/6/2018, Vikas Agrawal, Marcel

More information

Pedestrian Detection Using Correlated Lidar and Image Data EECS442 Final Project Fall 2016

Pedestrian Detection Using Correlated Lidar and Image Data EECS442 Final Project Fall 2016 edestrian Detection Using Correlated Lidar and Image Data EECS442 Final roject Fall 2016 Samuel Rohrer University of Michigan rohrer@umich.edu Ian Lin University of Michigan tiannis@umich.edu Abstract

More information

Introduction. Chapter Overview

Introduction. Chapter Overview Chapter 1 Introduction The Hough Transform is an algorithm presented by Paul Hough in 1962 for the detection of features of a particular shape like lines or circles in digitalized images. In its classical

More information

Learning Semantic Environment Perception for Cognitive Robots

Learning Semantic Environment Perception for Cognitive Robots Learning Semantic Environment Perception for Cognitive Robots Sven Behnke University of Bonn, Germany Computer Science Institute VI Autonomous Intelligent Systems Some of Our Cognitive Robots Equipped

More information

A Reactive Bearing Angle Only Obstacle Avoidance Technique for Unmanned Ground Vehicles

A Reactive Bearing Angle Only Obstacle Avoidance Technique for Unmanned Ground Vehicles Proceedings of the International Conference of Control, Dynamic Systems, and Robotics Ottawa, Ontario, Canada, May 15-16 2014 Paper No. 54 A Reactive Bearing Angle Only Obstacle Avoidance Technique for

More information

CS4758: Moving Person Avoider

CS4758: Moving Person Avoider CS4758: Moving Person Avoider Yi Heng Lee, Sze Kiat Sim Abstract We attempt to have a quadrotor autonomously avoid people while moving through an indoor environment. Our algorithm for detecting people

More information

Estimation of Planar Surfaces in Noisy Range Images for the RoboCup Rescue Competition

Estimation of Planar Surfaces in Noisy Range Images for the RoboCup Rescue Competition Estimation of Planar Surfaces in Noisy Range Images for the RoboCup Rescue Competition Johannes Pellenz pellenz@uni-koblenz.de with Sarah Steinmetz and Dietrich Paulus Working Group Active Vision University

More information

Next Generation Bluefin-9:

Next Generation Bluefin-9: Next Generation Bluefin-9: A COTS AUV Enabling On-going and Advanced Platform Research Cheryl Mierzwa, Systems Engineer Mikell Taylor, Systems Engineer 31 July 2013 553 South Street Quincy, Massachusetts

More information

Image Thickness Correction for Navigation with 3D Intra-cardiac Ultrasound Catheter

Image Thickness Correction for Navigation with 3D Intra-cardiac Ultrasound Catheter Image Thickness Correction for Navigation with 3D Intra-cardiac Ultrasound Catheter Hua Zhong 1, Takeo Kanade 1,andDavidSchwartzman 2 1 Computer Science Department, Carnegie Mellon University, USA 2 University

More information

Calibration of a rotating multi-beam Lidar

Calibration of a rotating multi-beam Lidar The 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems October 18-22, 2010, Taipei, Taiwan Calibration of a rotating multi-beam Lidar Naveed Muhammad 1,2 and Simon Lacroix 1,2 Abstract

More information

CS 231A Computer Vision (Fall 2012) Problem Set 3

CS 231A Computer Vision (Fall 2012) Problem Set 3 CS 231A Computer Vision (Fall 2012) Problem Set 3 Due: Nov. 13 th, 2012 (2:15pm) 1 Probabilistic Recursion for Tracking (20 points) In this problem you will derive a method for tracking a point of interest

More information

Design and Execution of Model Experiments to Validate Numerical Modelling of 2D Ship Operations in Pack Ice

Design and Execution of Model Experiments to Validate Numerical Modelling of 2D Ship Operations in Pack Ice Design and Execution of Model Experiments to Validate Numerical Modelling of 2D Ship Operations in Pack Ice Roelof C. Dragt Offshore Engineering Faculty of Mechanical, Maritime and Material Engineering

More information

Autonomous Underwater Vehicles for the 2018 RoboSub Competition

Autonomous Underwater Vehicles for the 2018 RoboSub Competition HEU-AUV Harbin Engineering University 1 of 5 Autonomous Underwater Vehicles for the 2018 RoboSub Competition Liu Wenzhi, Li Haibo,Ye,Xiufen, Zhou Hanwen, Liu Hong, Sun Xiangren,Houjun, Abstract The underwater

More information

1 Mission Level Design. of Autonomous Underwater Vehicles

1 Mission Level Design. of Autonomous Underwater Vehicles Mission Level Design of Autonomous Underwater Vehicles Thomas Liebezeit, Volker Zerbe Department of Automatic Control and System Engineering, TU Ilmenau, Germany e-mail: thomas.liebezeit@tu-ilmenau.de

More information

Summary of Computing Team s Activities Fall 2007 Siddharth Gauba, Toni Ivanov, Edwin Lai, Gary Soedarsono, Tanya Gupta

Summary of Computing Team s Activities Fall 2007 Siddharth Gauba, Toni Ivanov, Edwin Lai, Gary Soedarsono, Tanya Gupta Summary of Computing Team s Activities Fall 2007 Siddharth Gauba, Toni Ivanov, Edwin Lai, Gary Soedarsono, Tanya Gupta 1 OVERVIEW Input Image Channel Separation Inverse Perspective Mapping The computing

More information

Feature based SLAM using Side-scan salient objects

Feature based SLAM using Side-scan salient objects Feature based SLAM using Side-scan salient objects Josep Aulinas, Xavier Lladó and Joaquim Salvi Computer Vision and Robotics Group Institute of Informatics and Applications University of Girona 17071

More information

Complex Sensors: Cameras, Visual Sensing. The Robotics Primer (Ch. 9) ECE 497: Introduction to Mobile Robotics -Visual Sensors

Complex Sensors: Cameras, Visual Sensing. The Robotics Primer (Ch. 9) ECE 497: Introduction to Mobile Robotics -Visual Sensors Complex Sensors: Cameras, Visual Sensing The Robotics Primer (Ch. 9) Bring your laptop and robot everyday DO NOT unplug the network cables from the desktop computers or the walls Tuesday s Quiz is on Visual

More information

The NAO Robot, a case of study Robotics Franchi Alessio Mauro

The NAO Robot, a case of study Robotics Franchi Alessio Mauro The NAO Robot, a case of study Robotics 2013-2014 Franchi Alessio Mauro alessiomauro.franchi@polimi.it Who am I? Franchi Alessio Mauro Master Degree in Computer Science Engineer at Politecnico of Milan

More information

Collaborative Multi-Vehicle Localization and Mapping in Marine Environments

Collaborative Multi-Vehicle Localization and Mapping in Marine Environments Collaborative Multi-Vehicle Localization and Mapping in Marine Environments Moratuwage M.D.P., Wijesoma W.S., Kalyan B., Dong J.F., Namal Senarathne P.G.C., Franz S. Hover, Nicholas M. Patrikalakis School

More information

Autonomous Vehicle Navigation Using Stereoscopic Imaging

Autonomous Vehicle Navigation Using Stereoscopic Imaging Autonomous Vehicle Navigation Using Stereoscopic Imaging Project Proposal By: Beach Wlaznik Advisors: Dr. Huggins Dr. Stewart December 7, 2006 I. Introduction The objective of the Autonomous Vehicle Navigation

More information

Object Conveyance Algorithm for Multiple Mobile Robots based on Object Shape and Size

Object Conveyance Algorithm for Multiple Mobile Robots based on Object Shape and Size Object Conveyance Algorithm for Multiple Mobile Robots based on Object Shape and Size Purnomo Sejati, Hiroshi Suzuki, Takahiro Kitajima, Akinobu Kuwahara and Takashi Yasuno Graduate School of Tokushima

More information

3D Time-of-Flight Image Sensor Solutions for Mobile Devices

3D Time-of-Flight Image Sensor Solutions for Mobile Devices 3D Time-of-Flight Image Sensor Solutions for Mobile Devices SEMICON Europa 2015 Imaging Conference Bernd Buxbaum 2015 pmdtechnologies gmbh c o n f i d e n t i a l Content Introduction Motivation for 3D

More information

Towards Autonomous Robotic Valve Turning

Towards Autonomous Robotic Valve Turning BULGARIAN ACADEMY OF SCIENCES CYBERNETICS AND INFORMATION TECHNOLOGIES Volume 12, No 3 Sofia 2012 Print ISSN: 1311-9702; Online ISSN: 1314-4081 DOI: 10.2478/cait-2012-0018 Towards Autonomous Robotic Valve

More information

Accurate Motion Estimation and High-Precision 3D Reconstruction by Sensor Fusion

Accurate Motion Estimation and High-Precision 3D Reconstruction by Sensor Fusion 007 IEEE International Conference on Robotics and Automation Roma, Italy, 0-4 April 007 FrE5. Accurate Motion Estimation and High-Precision D Reconstruction by Sensor Fusion Yunsu Bok, Youngbae Hwang,

More information

Hough Transform Run Length Encoding for Real-Time Image Processing

Hough Transform Run Length Encoding for Real-Time Image Processing Hough Transform Run Length Encoding for Real-Time Image Processing C. H. Messom 1, G. Sen Gupta 2,3, S. Demidenko 4 1 IIMS, Massey University, Albany, New Zealand 2 IIS&T, Massey University, Palmerston

More information

Snow cover change detection with laser scanning range and brightness measurements

Snow cover change detection with laser scanning range and brightness measurements Snow cover change detection with laser scanning range and brightness measurements Sanna Kaasalainen, Harri Kaartinen, Antero Kukko, Henri Niittymäki Department of Remote Sensing and Photogrammetry 5th

More information

UMASIS, AN ANALYSIS AND VISUALIZATION TOOL FOR DEVELOPING AND OPTIMIZING ULTRASONIC INSPECTION TECHNIQUES

UMASIS, AN ANALYSIS AND VISUALIZATION TOOL FOR DEVELOPING AND OPTIMIZING ULTRASONIC INSPECTION TECHNIQUES UMASIS, AN ANALYSIS AND VISUALIZATION TOOL FOR DEVELOPING AND OPTIMIZING ULTRASONIC INSPECTION TECHNIQUES A.W.F. Volker, J. G.P. Bloom TNO Science & Industry, Stieltjesweg 1, 2628CK Delft, The Netherlands

More information

Efficient SLAM Scheme Based ICP Matching Algorithm Using Image and Laser Scan Information

Efficient SLAM Scheme Based ICP Matching Algorithm Using Image and Laser Scan Information Proceedings of the World Congress on Electrical Engineering and Computer Systems and Science (EECSS 2015) Barcelona, Spain July 13-14, 2015 Paper No. 335 Efficient SLAM Scheme Based ICP Matching Algorithm

More information

Hand Gesture Recognition Based On The Parallel Edge Finger Feature And Angular Projection

Hand Gesture Recognition Based On The Parallel Edge Finger Feature And Angular Projection Hand Gesture Recognition Based On The Parallel Edge Finger Feature And Angular Projection Zhou Yimin 1,2, Jiang Guolai 1, Xu Guoqing 1 and Lin Yaorong 3 1 Shenzhen Institutes of Advanced Technology, Chinese

More information

Robosub of the Palouse. University of Idaho Division Design Review

Robosub of the Palouse. University of Idaho Division Design Review Robosub of the Palouse University of Idaho Division Design Review Project Background 2015 Robosub Competition History: 18th annual Type: Autonomous Location: San Diego, CA Participants: International When:

More information

Ball tracking with velocity based on Monte-Carlo localization

Ball tracking with velocity based on Monte-Carlo localization Book Title Book Editors IOS Press, 23 1 Ball tracking with velocity based on Monte-Carlo localization Jun Inoue a,1, Akira Ishino b and Ayumi Shinohara c a Department of Informatics, Kyushu University

More information

SUMMARY: DISTINCTIVE IMAGE FEATURES FROM SCALE- INVARIANT KEYPOINTS

SUMMARY: DISTINCTIVE IMAGE FEATURES FROM SCALE- INVARIANT KEYPOINTS SUMMARY: DISTINCTIVE IMAGE FEATURES FROM SCALE- INVARIANT KEYPOINTS Cognitive Robotics Original: David G. Lowe, 004 Summary: Coen van Leeuwen, s1460919 Abstract: This article presents a method to extract

More information