
Status update: Path planning/following for a snake

Major project components:
- Sensors
- Robot hardware/software integration
- Kinematic model generation
- High-level control

2. Optical mouse

Optical mouse technology:
- CMOS camera is 16×16 to 24×24 pixels in resolution
- The center 5×5 region of the new frame is compared to the entirety of the old frame
- Processing is optimized to detect translation

The optical mouse is based on a main optical sensor with four main parts (Fig. 1): illumination device, illumination lens, camera, and camera lens. Illumination is an important problem for a vision system. A light-emitting diode (LED) with wavelength peak from 639 to 875 nm, in combination with a plastic lens and mirrors, is used to illuminate the surface under the sensor. The chip contains a CMOS camera as the image acquisition system and a proprietary DSP for image processing and external communications. The camera is very close to the floor surface, with 7.45 mm being the recommended height. The mouse structure protects the sensor from external illumination. (Fig. 1: Optical mouse parts.)

3. Optical mouse calibration

The following results were obtained using a mouse with a declared resolution of 0.0635 mm (15.75 pulses per millimeter, or 400 cpi). The optical mouse is the HDNS-2000 type [19]. Displacements were measured with a precision of ±0.05 mm, and speed was limited to 1.5 mm/s to minimize error over the displacements. The pulses generated by the displacement were obtained using a MATLAB program, with all software tools and utilities that modify mouse behavior disabled. Finally, the x-axis refers to transversal displacement and the y-axis to longitudinal.

Image from "The optical mouse for indoor mobile robot odometry measurement," by J. Palacin, I. Valgañon, and R. Pernia. Sensors and Actuators A, 2006.
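To make the frame-comparison step concrete, here is a minimal Python sketch of the technique, assuming sum-of-absolute-differences block matching; the HDNS-2000's actual DSP algorithm is proprietary, so the function name and scoring here are illustrative only.

import numpy as np

def estimate_translation(old_frame, new_frame, patch=5):
    """Estimate the (dx, dy) shift between two mouse-sensor frames by
    comparing the center patch of the new frame against every position
    in the old frame (sum of absolute differences)."""
    h, w = new_frame.shape
    cy, cx = (h - patch) // 2, (w - patch) // 2
    template = new_frame[cy:cy + patch, cx:cx + patch].astype(np.int32)

    best_score, best_shift = None, (0, 0)
    for y in range(h - patch + 1):
        for x in range(w - patch + 1):
            window = old_frame[y:y + patch, x:x + patch].astype(np.int32)
            score = int(np.abs(window - template).sum())
            if best_score is None or score < best_score:
                # Feature at (x, y) in the old frame is at (cx, cy) now.
                best_score, best_shift = score, (cx - x, cy - y)
    return best_shift  # pixels the surface features moved between frames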

Mouse hardware

Arc versus Linear Motion

- Arc motion is severely underestimated
- There is a constant multiple between arc- and linear-motion pulse counts
- Mitigation plan: if my tests duplicate this result, then I will use a second filter on the mouse inputs, based on the estimated arc and linear motion components

(Fig. 8: Mouse pulses measured when the sensor completes an arc of 45° at different radii r: expected result according to the initial linear calibration (triangles) and real measurements (circles).)

Table 1. Typical European indoor floor surfaces tested (parameters a, b, d, h in mm; see Fig. 9 of the paper):

Tile type   Code  Color  a  b    d    h
Stoneware   S1    Grey   3  0.0  0    0.5
Stoneware   S2    Grey   3  0.0  50   0.5
Stoneware   S3    Dark   3  0.0  0    0.5
Ceramic     C1    White  2  0.1  1.0  1.6
Ceramic     C2    Red    2  0.1  0.7  1.0
Parquet     P     Wood   3  0.0  0.0  0.1

Image from "The optical mouse for indoor mobile robot odometry measurement," by J. Palacin, I. Valgañon, and R. Pernia. Sensors and Actuators A 126 (2006) 141-147.
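A rough sketch of what such a second filter could look like, assuming the arc fraction of the motion can be estimated (e.g. from the commanded snake-gait curvature) and that arc_gain is a calibration constant still to be measured; everything here is hypothetical until my tests confirm the paper's result.

def corrected_displacement(pulses, arc_fraction, arc_gain, mm_per_pulse=0.0635):
    """Hypothetical second filter from the mitigation plan: split the
    measured pulses into linear and arc components and rescale the arc
    part by the constant ratio the paper reports between arc and linear
    pulse counts."""
    linear_mm = pulses * (1.0 - arc_fraction) * mm_per_pulse
    arc_mm = pulses * arc_fraction * arc_gain * mm_per_pulse  # undo underestimate
    return linear_mm + arc_mm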

Sensor height

The measured resolution is 0.0630 mm (15.87 pulses per millimeter) over the x-axis and 0.0624 mm (16.02 pulses per millimeter) over the y-axis, very close to the declared resolution, with an average error of 1.27%.

3.2. Height dependence

The optical sensor can be placed at heights ranging from 7.15 to 7.75 mm, with 7.45 mm as the manufacturer's recommendation.

- Resolution goes down as sensor height goes up
- Mitigation plan: use the original mouse base to set the sensor height, possibly with a suspension to keep it level

3.3. Angular diagonal displacement

In this experiment, the mouse was moved 30.0 mm diagonally at an angle varying from 0° (x-axis) to 90° (y-axis) to test the behavior of the optical sensor in angular displacements. Fig. 6 shows the diagonal pulses measured depending on the displacement.

(Fig. 3: Evolution of the standard deviation depending on the displacement. Fig. 5: Influence of the height offset on the measured mouse pulses; offset 0.0 corresponds to the original sensor height of 7.35 mm.)

Image from "The optical mouse for indoor mobile robot odometry measurement," by J. Palacin, I. Valgañon, and R. Pernia. Sensors and Actuators A, 2006.

Next Steps
- Add mice to the snake
- Calibrate mice and compare to the paper's results
- Add center wheel-set to the snake
- Get control software for the snake motors

State Estimation & Path Planning for Scarab Rover
Robot Motion Planning, Fall 2007
Michael Dille, Umashankar Nagarajan

Overview
- Primary goal: state estimation
  - Fuse IMU data with a downward-pointing optic flow sensor
  - Exercise in Kalman/particle filtering
- Path planning component
  - A* and/or D* Lite in a real or simulated obstacle-laden environment

State Estimation
- Data used here is from the Zoe robot
- Application of a linear Kalman filter to fuse IMU and GPS data to estimate the state
- Validate the state estimates against the existing {GPS, INS, Odometry} Kalman filter
- The research goal is to do state estimation without GPS, using only the IMU and odometry

Linear Kalman Filter

Time update equations:
$\hat{x}_k^- = A \hat{x}_{k-1}$
$P_k^- = A P_{k-1} A^T + Q$

Measurement update equations:
$K_k = P_k^- H^T (H P_k^- H^T + R)^{-1}$
$\hat{x}_k = \hat{x}_k^- + K_k (z_k - H \hat{x}_k^-)$
$P_k = (I - K_k H) P_k^-$
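A compact NumPy sketch of these standard updates, with names matching the slides' A, Q, H, and R (the control term is omitted since U = 0, as assumed below):

import numpy as np

def lkf_predict(x, P, A, Q):
    """Time update (prediction), with no control input (U = 0)."""
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    return x_pred, P_pred

def lkf_update(x_pred, P_pred, z, H, R):
    """Measurement update (correction)."""
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x = x_pred + K @ (z - H @ x_pred)    # correct with the innovation
    P = (np.eye(P_pred.shape[0]) - K @ H) @ P_pred
    return x, P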

Sensors and Data Available
- GPS: (x, y, z) position
- IMU: angular velocities about roll, pitch, and yaw; linear accelerations of x, y, and z
- Odometry: distance travelled

Discrete LKF implementation
- The state is defined as a 15×1 vector: $[x, y, z, \theta_{roll}, \theta_{pitch}, \theta_{yaw}, \dot{x}, \dot{y}, \dot{z}, \dot{\theta}_{roll}, \dot{\theta}_{pitch}, \dot{\theta}_{yaw}, \ddot{x}, \ddot{y}, \ddot{z}]^T$
- The measurement vector is a 9×1 vector: $[x, y, z, \dot{\theta}_{roll}, \dot{\theta}_{pitch}, \dot{\theta}_{yaw}, \ddot{x}, \ddot{y}, \ddot{z}]^T$
- The matrix A represents the kinematic equations, assuming that motion in one dimension is independent of the others (see the sketch below)
- It is assumed that there is no control, i.e. U = 0
- The sensor model H maps the state variables to the measurement variables
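The slides do not give A and H explicitly; under their stated assumptions (independent axes, constant acceleration over a step, no control), one plausible construction is the following sketch, where dt is an assumed filter period.

import numpy as np

def make_A(dt):
    """15x15 transition matrix for the state
    [pos(3), angles(3), vel(3), angular rates(3), accel(3)]^T,
    treating each dimension as independent, as the slide assumes."""
    A = np.eye(15)
    A[0:3, 6:9] = dt * np.eye(3)             # position += velocity * dt
    A[0:3, 12:15] = 0.5 * dt**2 * np.eye(3)  # position += 0.5 * accel * dt^2
    A[3:6, 9:12] = dt * np.eye(3)            # angles += angular rates * dt
    A[6:9, 12:15] = dt * np.eye(3)           # velocity += accel * dt
    return A

def make_H():
    """9x15 sensor model: GPS measures position; the IMU measures
    angular rates and linear accelerations."""
    H = np.zeros((9, 15))
    H[0:3, 0:3] = np.eye(3)    # x, y, z from GPS
    H[3:6, 9:12] = np.eye(3)   # roll/pitch/yaw rates from IMU
    H[6:9, 12:15] = np.eye(3)  # linear accelerations from IMU
    return H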

Discrete LKF implementation
- There is an existing {GPS, INS, Odometry} Kalman filter against which the discrete LKF implementation is validated
- The process error covariance matrix is $Q = E[v_k v_k^T]$, where $v_k$ is the process-noise vector
- The measurement error covariance matrix R is derived using the confidence measures available from the GPS data

Discrete LKF implementation: Results (result plots shown in the original slides)

Error Propagation
- What to do when there is no GPS or compass? E.g., on the Moon!
- Must use just odometry & IMU: we have only $[s_{forward}, \dot{\theta}_{roll}, \dot{\theta}_{pitch}, \dot{\theta}_{yaw}]$
- No absolute position/orientation fixes, so error will grow without bound
- But we can still track confidence: just run the Kalman prediction step & covariance update (see the toy sketch below)
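A self-contained toy illustration of that last point, using a 2-state constant-velocity model with assumed noise values:

import numpy as np

# With no GPS or compass, run only the Kalman prediction step; the
# state covariance P grows without bound but still tracks confidence.
dt = 0.1
A = np.array([[1.0, dt],
              [0.0, 1.0]])        # state: [position, velocity]
Q = np.diag([1e-4, 1e-3])         # assumed process noise covariance
P = np.diag([1e-2, 1e-2])         # initial state covariance
for _ in range(1000):             # 100 s of dead reckoning
    P = A @ P @ A.T + Q           # prediction step only, no measurements
print("position std-dev:", np.sqrt(P[0, 0]))  # grows with elapsed time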

Error Propagation: With Orientation (plot in original slides)

Error Propagation: No Orientation (plot in original slides)

Path Planning + Ultrasonic Mapping
Sean Hyde

Path Planning + Ultrasonic Mapping

Project description: Construct and deploy a robot that will find a path from start to goal using potential-field planning. The potential function for obstacles will include a probabilistic component based on how many of the 5 sensors can see the object and how close it is (a sketch follows).

Path planning: Path planning will be done using a potential function with a probabilistic component. The robot will log its position by dead reckoning based on wheel encoders.

Filtering: Filtering will be based on the work by Elfes and Moravec.

Demo scenario: Give the robot a (Δx, Δy) goal from its start position and let it loose!
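As a sketch of how the probabilistic component might enter the potential, here is a hypothetical repulsive term weighted by the fraction of sensors that see the obstacle; the gain, influence radius, and dropoff shape are placeholders, since the dropoff question is revisited in the "Probability Component?" slide below.

import numpy as np

def obstacle_potential(robot_xy, obstacle_xy, n_seen, n_sensors=5,
                       gain=1.0, influence_radius=1.0):
    """Hypothetical repulsive potential with a probabilistic component:
    a standard (1/d - 1/d0)^2 repulsive term, weighted by the fraction
    of the ultrasonic sensors that currently see the obstacle."""
    d = np.linalg.norm(np.asarray(robot_xy) - np.asarray(obstacle_xy))
    d = max(d, 1e-9)                 # avoid division by zero at contact
    if d >= influence_radius:
        return 0.0
    p_obstacle = n_seen / n_sensors  # probabilistic weighting
    return gain * p_obstacle * (1.0 / d - 1.0 / influence_radius) ** 2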

Physical Construction
- 5 ultrasonic range sensors: 2 Devantech SRF04, 3 Parallax Ping))) sensors
- 2-wheel differential servo drive
- Quadrature optical encoders: 44 spokes, 176 transitions/revolution, 0.045 linear resolution (see the dead-reckoning sketch below)
- 7.2 V 850 mAh LiPoly battery
- 18F4525 PIC processor
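Since the robot logs position by dead reckoning from these encoders, a minimal sketch of one update follows; DIST_PER_TICK comes from the slide's 0.045 linear resolution (units as given there), while WHEEL_BASE is an assumed placeholder in the same units.

import math

DIST_PER_TICK = 0.045   # linear distance per encoder transition (from slide)
WHEEL_BASE = 6.0        # distance between wheels (assumed placeholder)

def dead_reckon(x, y, theta, left_ticks, right_ticks):
    """One differential-drive dead-reckoning update from the encoder
    transitions counted since the previous update."""
    dl = left_ticks * DIST_PER_TICK
    dr = right_ticks * DIST_PER_TICK
    d = (dl + dr) / 2.0               # distance moved by the robot center
    dtheta = (dr - dl) / WHEEL_BASE   # change in heading
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    return x, y, theta + dtheta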

Using Computational Design Tools for Path Planning
Siddharth Sanan

Platform for Implementation
- Scout robot in the REL lab
- Sensors: ultrasonic range sensors, wheel encoders

(Figure: sonar range sensor output.)

Data from ultrasonic range sensors (plots: static environment, dynamic environment)

Experiments on position data from the robot
- How do we know it's improving the state estimate? Make the robot run in a loop.
- Sensor data: $[X, Y, \theta]^T$ from integrated encoder values
- State estimated: $[X, Y, \theta]^T$ using a linear Kalman filter

Topology Optimization for Path Generation

Initial exercise in generating paths in free space. (Figure: 5×5 node grid with Start at node 1 and Goal at node 25, showing the path found.)

Objective function:
$\min_x \; SE = \frac{1}{2} U^T K U$
(U: nodal displacements, K: stiffness matrix; x is a vector containing the thicknesses of the elements)

Applying a force in the direction from the goal to the start, we start the optimization. Then, starting from the goal node, a connection is made to the next node through the element with the greatest thickness.
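A small sketch of the path-extraction rule in that last sentence; the data-structure names (thickness, neighbors) are hypothetical, standing in for the optimizer's output on the node grid.

def extract_path(goal, start, thickness, neighbors):
    """Starting at the goal node, repeatedly cross the incident element
    with the greatest optimized thickness until the start is reached.

    thickness : dict mapping a sorted node pair (a, b) -> element thickness
    neighbors : dict mapping a node -> list of adjacent nodes
    """
    path, node, visited = [goal], goal, {goal}
    while node != start:
        candidates = [(thickness[tuple(sorted((node, n)))], n)
                      for n in neighbors[node] if n not in visited]
        if not candidates:
            break                  # dead end: no path recovered
        _, node = max(candidates)  # follow the thickest element
        visited.add(node)
        path.append(node)
    return path[::-1]              # start -> goal order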

Even in free space there seem to be some issues with the current objective function. We do get paths, but not necessarily of optimal length, particularly at nodes which are not connected along a single line. (Figure: the same 5×5 node grid with the path found.)

Future Directions
- Implement other objective functions
- Include obstacles
- Implement the paths on the Scout robot for an a priori known map

Thank you

Physical Construction
- Nearly complete
- Missing wheel encoder mounts (need separate left/right)
- Microprocessor/circuit complete except for sensor connections

Software Design

Software Design
- Varied duty cycle PWM for servo control
- 4-bit parallel bus for character LCD
- Distance calculation based on speed of sound (see the sketch below)
- Encoder update
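For the distance calculation, a minimal sketch: SRF04 and Ping))) sensors both report a round-trip echo time, so halve it and multiply by the speed of sound (the 0.0343 cm/µs figure assumes roughly 20 °C air).

SPEED_OF_SOUND_CM_PER_US = 0.0343   # cm per microsecond at ~20 C

def echo_to_distance_cm(echo_time_us):
    """Convert a round-trip echo time in microseconds to range in cm."""
    return (echo_time_us / 2.0) * SPEED_OF_SOUND_CM_PER_US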

Probability Component?
- What is the difference between exponential/linear/quadratic dropoff and a probabilistic effect?
- What is the difference between additive potential vectors and a combined view?

Other Issues
- Dead reckoning drift
- Local minima