MTRX4700: Experimental Robotics


Stefan B. Williams, April 2013
MTRX4700: Experimental Robotics, Assignment 3

Note: This assignment contributes 10% towards your final mark. It is due on Friday, May 10th during Week 9, before the tutorial (9 am), or via email prior to that time. The demonstrations will be conducted during the supplementary lab session in Week 9. Submit your report to the assignment box on the 3rd floor outside the drawing office in the Mechanical Engineering Building, or give it to your tutor/lecturer during the tutorial. Late assignments will not be marked unless accompanied by a valid special consideration form. Plagiarism will be dealt with in accordance with the University of Sydney plagiarism policy. You must complete and submit the compliance statement available online.

The objective of this assignment is to explore data fusion and mapping techniques suitable for use with mobile robotic systems. The choice of tools will depend on your preference and level of proficiency in Matlab, Python and/or C/C++. This assignment should take an average student 15-20 hours to complete to a passing grade.

Total Marks: 60

The front page of your report should include:
- Your name and SID
- The names of your group members (you are strongly encouraged to work in small groups of 2 or 3)

1 Data Fusion

One important aspect of robotic navigation is the ability to fuse multiple sources of data. In the case of a mobile robot, we might have a number of sensors telling us our current position. Each of these sensors is subject to noise and errors of various kinds. Some sources of position data, such as triangulated range and bearing observations to known targets, are often quite noisy and subject to short-term errors. Other position information, such as dead reckoning, is subject to an accumulation of errors that result from inaccuracies in our model and noise on the control lines. The fusion of these two sources of data can, however, give much better results, since one is good over the long term while the other is fairly reliable for predicting our position over a short distance.

We have given you a number of data files collected during deployment of a simulated robotic vehicle. Write a robot position estimator that will fuse the data received from the vehicle to generate a reliable estimate of the robot's position. The information you are interested in is the velocity and turn rate derived from the vehicle's wheel encoders, together with observations from a compass, a global positioning system and of the laser strips in the environment. In this case the predictions will act as a low pass filter for the noisy pose estimates.

There are typically three stages involved in an estimator. The first stage is the prediction stage, in which we use the control data to predict the position of the vehicle given a vehicle model. When an observation such as a range and bearing to a beacon is made, the estimate of the vehicle position can be updated.

1.1 Prediction Stage

In the prediction stage we will use a simple kinematics-based vehicle model. Vehicle control signals consisting of velocity and steer angle can be retrieved from the robot. The vehicle model and update equations are included here for reference purposes.

Figure 1: Vehicle model, showing the vehicle pose (x, y, Ψ) in the global frame.

x̂(k|k−1) = x̂(k−1|k−1) + Δt V(k) cos( Ψ̂(k−1|k−1) )
ŷ(k|k−1) = ŷ(k−1|k−1) + Δt V(k) sin( Ψ̂(k−1|k−1) )
Ψ̂(k|k−1) = Ψ̂(k−1|k−1) + Δt Ψ̇(k)

where V(k) is the measured vehicle velocity and Ψ̇(k) is the turn rate at time k. The hat denotes the fact that this is our estimate of the value, while (k|k−1) indicates that this is the estimate of the state at time k given the information available at time k−1.

1.2 The Observation Stage

The observation stage consists of a series of observations that arrive from other sensors. In this case, the vehicle is equipped with a laser range finder and a noisy GPS-style system. Start by using the noisy position and compass observations we have provided. Observations of retro-reflective beacons in the environment can also be used to provide a noisy position estimate if the position of the beacons is known. This signal is likely to be quite accurate over the long term (i.e. at low frequency), assuming that you can associate the observations with the correct beacon, but will be subject to high frequency noise induced by errors in the sensor readings. The update stage will be used to filter out this noise.

Given range and bearing observations (R_1, θ_1) and (R_2, θ_2) to two known targets T_1 = (T_x1, T_y1) and T_2 = (T_x2, T_y2), the vehicle pose can be computed as

Ψ = tan⁻¹( (T_y2 − T_y1) / (T_x2 − T_x1) ) − tan⁻¹( (R_2 sin θ_2 − R_1 sin θ_1) / (R_2 cos θ_2 − R_1 cos θ_1) )    (1)
x = T_x1 − R_1 cos(θ_1 + Ψ)
y = T_y1 − R_1 sin(θ_1 + Ψ)

Figure 2: Computing the position of the vehicle P in a global reference frame using range and bearing information from the known targets T_1 and T_2.
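As a point of reference, the prediction and triangulation equations above can be written in a few lines of Python (one of the languages suggested for this assignment). This is only a minimal sketch under our own assumptions: the function names, the pose layout (x, y, Ψ) as a NumPy array, and the use of arctan2 in place of the plain arctangent are ours, and the sketch does not reflect the format of the provided data files.

```python
import numpy as np

def predict(pose, v, turn_rate, dt):
    """Dead-reckoning prediction of Section 1.1: propagate the pose
    (x, y, psi) one time step using the kinematic vehicle model."""
    x, y, psi = pose
    x_pred = x + dt * v * np.cos(psi)
    y_pred = y + dt * v * np.sin(psi)
    psi_pred = psi + dt * turn_rate
    return np.array([x_pred, y_pred, psi_pred])

def triangulate(t1, t2, r1, theta1, r2, theta2):
    """Pose observation from range/bearing to two known beacons t1 and t2
    (Section 1.2). Beacons are (x, y) tuples; angles are in radians."""
    # Heading: bearing of the beacon baseline in the world frame minus the
    # bearing of the same baseline as seen from the vehicle.
    psi = (np.arctan2(t2[1] - t1[1], t2[0] - t1[0])
           - np.arctan2(r2 * np.sin(theta2) - r1 * np.sin(theta1),
                        r2 * np.cos(theta2) - r1 * np.cos(theta1)))
    # Position: step back from beacon 1 along the observed ray.
    x = t1[0] - r1 * np.cos(theta1 + psi)
    y = t1[1] - r1 * np.sin(theta1 + psi)
    return np.array([x, y, psi])
```

Using arctan2 rather than a plain arctangent keeps the heading well defined in all quadrants, which matters once the vehicle drives all the way around the beacons.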

1.3 The Update Stage

The update stage is responsible for fusing the predictions and the observations to provide a more accurate estimate of the true position of the vehicle. The update equations for a simple low pass filter are

x̂(k|k) = (1 − α_p) x̂(k|k−1) + α_p x̂_Obs(k)    (2)
ŷ(k|k) = (1 − α_p) ŷ(k|k−1) + α_p ŷ_Obs(k)
Ψ̂(k|k) = (1 − α_Ψ) Ψ̂(k|k−1) + α_Ψ Ψ̂_Obs(k)    (3)

where the constants α_i are weighting factors that represent our confidence in the observed data. If we trust our predictions more than our observations, α should be small. We'd like you to change the value of α to observe what happens to the estimate. Use the values α = {1.0, 0.9, 0.5, 0.1, 0.0}. Describe what happens to the estimated path and the obstacle map in each case.

The relationship between the prediction, observations and update is shown in Figure 3.

Figure 3: The Prediction-Observation-Update stages of the estimator, showing the true path, the dead reckoning (prediction) path, the triangulation (observation) path and the fused path estimate. Notice that the estimate always lies between the Prediction and the Observation.
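One possible shape for the overall prediction/update loop is sketched below. The record layout (time step, velocity, turn rate, optional observed pose), the default α values and the angle-wrapping helper are our assumptions, not part of the handout.

```python
import numpy as np

def wrap_angle(a):
    """Wrap an angle to (-pi, pi] so heading blending behaves near +/-pi."""
    return (a + np.pi) % (2.0 * np.pi) - np.pi

def fuse(pred, obs, alpha_p, alpha_psi):
    """Low-pass update of Section 1.3: blend the predicted pose with an
    observed pose. alpha weights the observation (alpha = 0 ignores it)."""
    x = (1.0 - alpha_p) * pred[0] + alpha_p * obs[0]
    y = (1.0 - alpha_p) * pred[1] + alpha_p * obs[1]
    # Equivalent to (1 - alpha) * pred + alpha * obs, but blended through
    # the angular difference so wrap-around does not corrupt the heading.
    psi = wrap_angle(pred[2] + alpha_psi * wrap_angle(obs[2] - pred[2]))
    return np.array([x, y, psi])

def run_filter(records, pose0, alpha_p=0.1, alpha_psi=0.1):
    """records: iterable of (dt, v, turn_rate, obs) tuples where obs is an
    observed (x, y, psi) pose or None when no observation arrived."""
    x, y, psi = pose0
    path = [np.array([x, y, psi])]
    for dt, v, turn_rate, obs in records:
        # Prediction stage (Section 1.1): dead reckoning from the encoders.
        x += dt * v * np.cos(psi)
        y += dt * v * np.sin(psi)
        psi = wrap_angle(psi + dt * turn_rate)
        pose = np.array([x, y, psi])
        # Update stage (Section 1.3): only when an observation is available.
        if obs is not None:
            pose = fuse(pose, np.asarray(obs, dtype=float), alpha_p, alpha_psi)
            x, y, psi = pose
        path.append(pose)
    return np.array(path)
```

Running the loop once per α value in {1.0, 0.9, 0.5, 0.1, 0.0} and plotting the resulting paths alongside the dead-reckoned and observed paths gives the comparison asked for above.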

2 Occupancy Grid Mapping

The occupancy grid is a fairly simple representation of an obstacle map. It consists of dividing an area into equal squares and checking for the existence of obstacles in each grid location. When an obstacle is detected in a particular location in space, the count in that grid space is incremented. When the count exceeds a certain threshold (set to ignore spurious data readings), the grid square is considered to be occupied by an obstacle (see Figure 4).

Figure 4: Occupancy grid after 10 scans.

Our robot is equipped with a laser range scanner that gives us range and bearing information about obstacles in the environment. The robot is moving around an area with the laser positioned on the centre of the robot. Whenever a laser reading arrives, the data should be analysed to check for obstacles and the occupancy grid should be updated using the current position estimate. What happens to the quality of the map when you use different filter values to generate different path estimates?

Note that when there is no obstacle at a particular bearing, the laser returns its maximum range of 8.0 m. This should not be considered a valid return and should not be used to update obstacle positions.
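As a data structure, the grid can be as simple as the hit-count array sketched below; the class and method names are ours, and converting a laser return into a cell index (i, j) is handled by the coordinate transform described in the next section. Maximum-range (8.0 m) returns should be discarded before a cell count is incremented.

```python
import numpy as np

class OccupancyGrid:
    """Hit-count occupancy grid: each cell counts the laser returns that
    fall inside it; cells whose count exceeds a threshold are occupied."""

    def __init__(self, n_rows, n_cols):
        self.counts = np.zeros((n_rows, n_cols), dtype=int)

    def add_hit(self, i, j):
        """Increment the count of cell (i, j) for one valid laser return."""
        if 0 <= i < self.counts.shape[0] and 0 <= j < self.counts.shape[1]:
            self.counts[i, j] += 1

    def binary_map(self, threshold):
        """Threshold the counts into 0/1 (unoccupied/occupied), so that
        isolated spurious readings are ignored."""
        return (self.counts >= threshold).astype(int)
```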

Figure 5: (a) Computing the position of a target T in a global reference frame using range and bearing information from the moving reference frame P on the robot:

T_x = x + R cos(Ψ + θ)
T_y = y + R sin(Ψ + θ)

(b) There is also a direct relationship between the position in world coordinates and the location of the grid cell (i, j). This relationship is a function of the size of the occupancy grid, its resolution and the relative position of the origin within the map.

Your second task in this assignment will be to generate an occupancy grid map of the environment in which the robot is operating. We will use the position estimates of the vehicle pose you generated earlier as your reference. You should create an array of integers that is large enough to accommodate the environment in which the robot is working and that has enough resolution to discriminate targets. You should provide a mechanism for changing the resolution of the grid and experiment with different map resolutions. Consider how the storage requirement changes in terms of the size and resolution of the map.

Each time a laser reading is received, it should be translated from world coordinates (i.e. (T_x, T_y) in Figure 5) to grid coordinates (i, j), which will act as an index into your grid. The value of that grid cell can then be incremented. Once you have finished creating the map, select an appropriate threshold and transform the array into a binary array of ones and zeros representing occupied and unoccupied space. Find some means of displaying this data in a meaningful format (e.g. in Matlab, Excel or as ASCII text). Alternatively, you may wish to investigate methods for creating bitmaps from an arbitrary array from within your program or, if you have experience in this area, you may wish to create a graphical user interface (in Matlab, C++ or Java perhaps) to display the data. This could be done in real time as the data is being received from the robot or offline once the occupancy grid is complete.

3 Bonus! Real Data

We will provide you with a second data set collected using one of our iRobot Create robots, our acoustic localisation system, and a Hokuyo laser rangefinder. The format of the data will match the synthetic data already provided. Your task is to rerun the path estimation and occupancy grid generation components of the assignment using this real data set. The acoustic positioning estimates are noisy, and some of them are wildly incorrect. How can we detect these outliers? What effect do they have on the quality of the path estimate and map? How can we eliminate them? Show how the quality of your path estimate changes after mitigating the effect of outliers.
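A sketch of the coordinate handling described in Section 2, together with a simple distance gate for the outlying acoustic fixes raised in Section 3, is given below. The function names, the grid origin/resolution parameters and the 1.0 m rejection threshold are illustrative assumptions only.

```python
import numpy as np

def laser_to_world(pose, r, theta):
    """Project a laser return (range r, bearing theta) into world
    coordinates using the current vehicle pose, as in Figure 5(a)."""
    x, y, psi = pose
    tx = x + r * np.cos(psi + theta)
    ty = y + r * np.sin(psi + theta)
    return tx, ty

def world_to_grid(tx, ty, x_min, y_min, resolution):
    """Map world coordinates (Tx, Ty) to integer grid indices (i, j).
    (x_min, y_min) locates the grid origin and resolution is the cell
    size in metres; changing resolution rescales the whole map."""
    j = int((tx - x_min) / resolution)
    i = int((ty - y_min) / resolution)
    return i, j

def accept_fix(pred_xy, obs_xy, max_jump=1.0):
    """Very simple outlier gate for the acoustic fixes of Section 3:
    reject any fix further than max_jump metres from the current
    prediction. The 1.0 m threshold is purely illustrative."""
    return np.hypot(obs_xy[0] - pred_xy[0], obs_xy[1] - pred_xy[1]) <= max_jump
```

A valid return is then recorded with grid.add_hit(*world_to_grid(tx, ty, x_min, y_min, resolution)); note that the storage requirement of the counts array grows with (map width / resolution) times (map height / resolution), which is worth reporting when you experiment with different resolutions.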

4 Report & Marking

You are to submit a brief report detailing the work you have undertaken as part of this lab. We are particularly interested in seeing a discussion of the principles covered by this lab. Your report should discuss your implementation of the data fusion and occupancy grid methods discussed above. We expect that you will understand and be able to explain the fundamental principles behind estimation and be able to relate the results you obtain to those principles. Plots of the Prediction-Observation-Update cycle similar to the one shown in this handout, but using real data, will help you to demonstrate your understanding of how the filter is working. We'd also like some discussion of the software design decisions you made in the implementation of the filter and the occupancy grid.

A demonstration of your working system will be expected during the supplementary lab session in Week 9. You and your group members will be asked to answer questions about your demonstration. You must be present at the demonstration to receive demonstration marks and are expected to be able to answer questions relevant to all aspects of the system you and your team members are demonstrating.

The grading will be based 50% on the demonstration of the system operation, including a discussion of the system design, and 50% on the report. The grading will fall roughly into the following divisions:

Pass: implement the occupancy grid using dead-reckoned position estimates, with some elementary form of plotting of the grid.
Credit: as above, plus a more elaborate display of the output. Independent results of the position data fusion process should also be shown.
Distinction: use of the fused position information to improve the resulting map, and answers to the questions posed throughout the lab handout.
High Distinction: a useful front end added to the program to allow the user to see the results being generated on the fly while the program is connected via Player to the robot and/or simulator.
Bonus: complete Section 3 (Real Data) of the assignment. Partial marks will be awarded for path estimation and mapping without considering outliers in the acoustic data. Full marks will be awarded for improving your results by considering these outliers and answering the related questions posed.

Marks will be assigned according to the following breakdown:

1. Demonstration: 30
2. Description of the problem: 5
3. Data Fusion Results: 10
4. Mapping Results: 10
5. Presentation: 5
6. TOTAL: 60
7. BONUS: 10