Build and Test Plan: IGV Team

2/6/2008
William Burke Donaldson Diego Gonzales David Mustain Ray

Laser Range Finder - Week 3 - Jan 29
The laser range finder will be set up in the lab and connected to the computer via the USB interface. The data acquired from the laser range finder will be tested for accuracy. This will involve placing predefined objects at known distances from the laser range finder and comparing the measured data against those reference distances. The objects will include traffic barrels, cones, and barricades to get a feel for what the robot will actually encounter on the obstacle course.

LRF to XY - Week 3 - Jan 29
Laser range finder data needs to be converted from polar coordinates to x-y coordinates for integration with the rest of the obstacle detection system. (The conversion is sketched in the first listing below.)

Matlab Compiler - Week 3 - Jan 29
Have the Matlab compiler functioning. We should be able to run a simple Matlab program using the framework.

Motor Controller - Week 3 - Jan 29
The motor controller operates via an RS-232 serial interface. The goal during this phase is to have smooth communication between the computer and the motor controllers using basic functions such as connect, disconnect, setspeed, setdirection, getspeed, gettemp (transistor temperature), stoprobot, getbattery (battery voltage), and reset. These functions will be used extensively in phase 2 for following a path. This will be tested on last year's robot in a controlled environment. A basic program will be written that uses every function and logs speed, temperature, and battery voltage for review at the end of the run. (A serial wrapper is sketched in the second listing below.)

Update Website - Week 3 - Jan 29
The website should be updated from the previous year to reflect the new members and the new design of the robot. This is important for fundraising as well. The update will be complete by this point.

Camera - Week 4 - Feb 5
This is the basic interfacing of the camera to the computer. We should be able to change resolution, exposure, etc., and acquire images. This is placed in week 4 because we will be waiting for the FireWire adapter card.

Camera Matrix - Week 4 - Feb 5
Once the laser pointer assembly from Mr. James is finished, debugging of the projection matrix function in Matlab can be completed. This will be used to create a projection matrix for any position that the camera is in. It may be possible to create ideal projection matrices based on the derived camera matrix and rotation and position data once the first projection matrix has been calculated. However, it will probably be more accurate to continue using the empirical method.
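
For the LRF to XY milestone, the conversion is one rotation per beam. A minimal sketch, assuming beam i sits at startAngle + i * angleStep; the real scanner's geometry and frame conventions should be substituted:

    #include <cmath>
    #include <vector>

    struct PointXY { double x; double y; };

    // One LRF return (bearing in radians, range in meters) to the
    // sensor frame: x forward, y to the left.
    PointXY polarToXY(double bearing, double range) {
        return { range * std::cos(bearing), range * std::sin(bearing) };
    }

    // Convert a whole scan under the assumed beam layout.
    std::vector<PointXY> scanToXY(const std::vector<double>& ranges,
                                  double startAngle, double angleStep) {
        std::vector<PointXY> pts;
        pts.reserve(ranges.size());
        for (std::size_t i = 0; i < ranges.size(); ++i)
            pts.push_back(polarToXY(startAngle + i * angleStep, ranges[i]));
        return pts;
    }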

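For the Motor Controller milestone, a thin wrapper class is a natural shape for the connect/setspeed/getbattery functions. This sketch assumes a POSIX serial port; the command strings ("!G", "?V") and the baud rate are made up for illustration, and the real protocol must be taken from the controller's manual:

    #include <fcntl.h>
    #include <termios.h>
    #include <unistd.h>
    #include <string>

    class MotorController {
    public:
        bool connect(const char* dev) {
            fd_ = open(dev, O_RDWR | O_NOCTTY);
            if (fd_ < 0) return false;
            termios tio;
            tcgetattr(fd_, &tio);
            cfmakeraw(&tio);
            cfsetispeed(&tio, B9600);   // baud rate is an assumption
            cfsetospeed(&tio, B9600);
            return tcsetattr(fd_, TCSANOW, &tio) == 0;
        }
        void disconnect() { if (fd_ >= 0) close(fd_); fd_ = -1; }
        void setSpeed(int s) { send("!G " + std::to_string(s) + "\r"); }
        void stopRobot()     { send("!G 0\r"); }
        std::string getBattery() { send("?V\r"); return readLine(); }
    private:
        void send(const std::string& cmd) { write(fd_, cmd.data(), cmd.size()); }
        std::string readLine() {
            std::string out; char c;
            while (read(fd_, &c, 1) == 1 && c != '\r') out += c;
            return out;
        }
        int fd_ = -1;
    };

The remaining functions (setdirection, getspeed, gettemp, reset) follow the same send/readLine pattern, and the logging program is a loop that calls the getters and appends to a file.
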
Emergency Distance Sensors - Week 4 - Feb 5
The emergency distance sensors will be mounted around the perimeter of the robot and will constantly compare the measured distance to a threshold. The sensors will be connected to a microcontroller programmed to monitor each one; if a sensor's distance reading crosses the threshold programmed into the microcontroller, a switch will be triggered to shut the motors off. Testing will begin once the sensors are attached to the microcontroller and the monitor-and-trigger code is in place. The threshold will then be adjusted to the desired distance. (A monitoring-loop sketch appears after this week's milestones.)

GPS - Week 4 - Feb 5
The GPS also operates via an RS-232 serial interface and uses the NMEA 0183 protocol to transmit data. The GPS will be interfaced to our Linux machine using either modified code from last year or the gpsd daemon; both will be tested, and the more reliable method will be used. Testing will go as follows: the GPS will be moved to an area where it can receive a signal (outside), and the position and heading will be queried at close to the maximum refresh rate while the GPS is mobile. The method that produces the most consistent and error-free data will be chosen. (An NMEA parsing sketch appears below.)

Wireless E-Stop - Week 4 - Feb 5
A circuit will be constructed by the team to receive a signal via an RF module that communicates over RS-232. This circuit will be wired to the emergency stop input on the motor controller so that the robot comes to a complete stop whenever the signal is received. The team will also build a transmitter device with a button that activates the emergency stop; it will contain a transmitter RF module and a switch, and will be battery powered for portability. The receiver will be wired to the motor controllers and tested along with the motor control code. Because this system is important for the safety of the testers, it will be tested first.

Extract Ground Plane - Week 5 - Feb 12
Once the projection matrix calculation is working, it will be possible to map the laser range finder data into the image. The next issue will be segmentation of the ground plane. At this point, we will be able to take laser range finder data and run the image through a function that generates a new image containing just the ground plane.
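
For the GPS milestone, NMEA 0183 sentences are comma-delimited ASCII lines. A minimal parsing sketch, assuming the standard $GPRMC sentence layout and omitting checksum verification (which the real code should add):

    #include <sstream>
    #include <string>
    #include <vector>

    struct GpsFix { double lat, lon, headingDeg; bool valid; };

    // Split an NMEA sentence on commas.
    static std::vector<std::string> fields(const std::string& s) {
        std::vector<std::string> f; std::string tok; std::istringstream in(s);
        while (std::getline(in, tok, ',')) f.push_back(tok);
        return f;
    }

    // NMEA ddmm.mmmm -> signed decimal degrees.
    static double toDegrees(const std::string& v, const std::string& hemi) {
        if (v.empty()) return 0.0;
        double raw = std::stod(v);
        int    deg = static_cast<int>(raw / 100);
        double out = deg + (raw - deg * 100) / 60.0;
        return (hemi == "S" || hemi == "W") ? -out : out;
    }

    // $GPRMC: field 2 is status (A = valid), 3-6 lat/lon, 8 track (deg).
    GpsFix parseRMC(const std::string& sentence) {
        GpsFix fix = {0, 0, 0, false};
        std::vector<std::string> f = fields(sentence);
        if (f.size() < 9 || f[0] != "$GPRMC" || f[2] != "A") return fix;
        fix.lat = toDegrees(f[3], f[4]);
        fix.lon = toDegrees(f[5], f[6]);
        fix.headingDeg = f[8].empty() ? 0.0 : std::stod(f[8]);
        fix.valid = true;
        return fix;
    }

(gpsd, the alternative named above, does this parsing itself and hands back ready-made position and track data.)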

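For the Emergency Distance Sensors milestone, the microcontroller's job reduces to a polling loop with a debounce. A sketch under assumed names: readSensorCm() and cutMotorPower() are hypothetical stand-ins for the real sensor read and kill-switch output, and the sensor count and threshold are placeholders:

    const int NUM_SENSORS  = 6;   // placeholder sensor count
    const int THRESHOLD_CM = 40;  // placeholder trip distance
    const int TRIP_SAMPLES = 3;   // consecutive readings required

    // Stubs standing in for the real ADC read and kill-switch pin.
    int  readSensorCm(int channel) { (void)channel; return 999; }
    void cutMotorPower() { /* drive the motor kill-switch line here */ }

    int hits[NUM_SENSORS];

    void monitorLoop() {
        for (;;) {
            for (int i = 0; i < NUM_SENSORS; ++i) {
                if (readSensorCm(i) < THRESHOLD_CM) {
                    // Require several consecutive trips so a single
                    // noisy sample does not shut the motors off.
                    if (++hits[i] >= TRIP_SAMPLES) cutMotorPower();
                } else {
                    hits[i] = 0;
                }
            }
        }
    }
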
Hardware Abstraction Layer - Week 5 - Feb 12
This will be the concierge of hardware data: a simplified and standardized C++ class that will include all functions needed to communicate with any low-level hardware. Testing will include calling each function and comparing its output with known output to see if they match. (An interface sketch appears below.)

Position, Velocity, and Heading - Week 5 - Feb 12
The hardware abstraction layer will be used to determine the current speed via GPS and encoders. The position will be determined through the GPS, and the GPS's compass will be used to determine heading. These values will be checked against a handheld GPS unit. This will be tested on either the old robot or the new robot, depending on the status of the chassis.

Chassis - Week 6 - Feb 19
The chassis will be constructed out of 6063 aluminum. The design, which is currently in SolidWorks, will be exported into a metal cut sheet. The ME students will practice TIG welding on small lengths until they feel proficient enough to conquer the actual chassis construction. The chassis will be tacked together for fitment and then fully TIG welded at every seam. This should provide an extremely strong, lightweight chassis.

Match LRF to Obstacles - Week 6 - Feb 19
This milestone involves the segmentation of obstacles. Once the obstacles are distinct areas, they can be matched to the laser range finder data in preparation for obstacle characterization.

Parameterize Lines - Week 6 - Feb 19
Building on the previous week's ground plane extraction, the lines on the ground should be detected and parameterized at this point.

A-Star - Week 7 - Feb 26
This phase will consist of developing the A-Star algorithm, which will be used for following the map that is in place. A weighted grid will be taken in, and the lowest-weight path will be computed from it. Optimizations will be made for smoother turns. This will be tested using multiple grids and confirming the results by hand. (A minimal implementation is sketched below.)

Assembly - Week 7 - Feb 26
The team will be assembling the robot at this time. This includes mounting all of the components to the chassis and implementing auxiliary systems such as the access drawers and the humidity-reduction heat vent. This stage also includes neatly wiring the robot and ensuring full functionality of all previously working systems. The robot will be built so that the sensors can be removed easily, and the camera mount can also be removed for possible shipping of the robot.
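
The Hardware Abstraction Layer milestone calls for a single standardized C++ class; an abstract interface is one way to shape it. A sketch with illustrative method names, not the team's final API:

    struct Pose { double lat, lon, headingDeg; };

    class HardwareLayer {
    public:
        virtual ~HardwareLayer() {}

        // Locomotion: wraps the motor controller's RS-232 protocol.
        virtual void setSpeed(double metersPerSec) = 0;
        virtual void setDirection(double headingDeg) = 0;
        virtual void stopRobot() = 0;

        // Sensing: wraps the GPS, encoders, and motor controller queries.
        virtual Pose   getPose() = 0;
        virtual double getSpeed() = 0;
        virtual double getBatteryVolts() = 0;
        virtual double getTransistorTempC() = 0;
    };

The testing plan above then amounts to calling each method on a concrete implementation and comparing the result with a known value, for example checking getBatteryVolts() against a multimeter reading.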

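For the A-Star milestone, a minimal version of the search over a weighted grid is sketched below, assuming 4-connected moves and a Manhattan-distance heuristic (admissible when every cell cost is at least 1); the turn-smoothing optimizations mentioned above are omitted:

    #include <algorithm>
    #include <cstdlib>
    #include <functional>
    #include <queue>
    #include <utility>
    #include <vector>

    struct Cell { int r, c; };

    std::vector<Cell> aStar(const std::vector<std::vector<double> >& cost,
                            Cell start, Cell goal) {
        const int R = (int)cost.size(), C = (int)cost[0].size();
        const double INF = 1e18;
        std::vector<std::vector<double> > g(R, std::vector<double>(C, INF));
        std::vector<std::vector<int> > came(R, std::vector<int>(C, -1));
        // Manhattan-distance heuristic to the goal cell.
        auto h = [&](int r, int c) {
            return std::abs(r - goal.r) + std::abs(c - goal.c);
        };
        typedef std::pair<double, std::pair<int, int> > Node; // (f, (r, c))
        std::priority_queue<Node, std::vector<Node>, std::greater<Node> > open;
        g[start.r][start.c] = 0;
        open.push(Node(h(start.r, start.c), std::make_pair(start.r, start.c)));
        const int dr[4] = {-1, 1, 0, 0}, dc[4] = {0, 0, -1, 1};
        while (!open.empty()) {
            int r = open.top().second.first, c = open.top().second.second;
            open.pop();
            if (r == goal.r && c == goal.c) break;
            for (int k = 0; k < 4; ++k) {
                int nr = r + dr[k], nc = c + dc[k];
                if (nr < 0 || nr >= R || nc < 0 || nc >= C) continue;
                double ng = g[r][c] + cost[nr][nc]; // cost of entering cell
                if (ng < g[nr][nc]) {
                    g[nr][nc] = ng;
                    came[nr][nc] = k;               // move used to arrive
                    open.push(Node(ng + h(nr, nc), std::make_pair(nr, nc)));
                }
            }
        }
        // Walk back from the goal using the stored arrival moves.
        std::vector<Cell> path;
        Cell p = goal;
        while (!(p.r == start.r && p.c == start.c)) {
            if (came[p.r][p.c] < 0) return std::vector<Cell>(); // unreachable
            path.push_back(p);
            int k = came[p.r][p.c];
            p.r -= dr[k]; p.c -= dc[k];
        }
        path.push_back(start);
        std::reverse(path.begin(), path.end());
        return path;
    }

This matches the milestone's test plan directly: run aStar on several hand-made grids and compare the returned cells against the path worked out by hand.
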
Parameterize Obstacles - Week 7 - Feb 26
Once the laser range finder data has been matched to color regions in the image, we can estimate the depth and area of the obstacles using our assumptions. This function should be operational at this time.

Motor Controller Tuning - Week 8 - Mar 4
The feedback constants (kp, ki, kd) will be adjusted to get the robot to follow the path smoothly. Turns should be smooth, and acceleration should not be jerky.

Potholes - Week 8 - Mar 4
Potholes will be a difficult task in themselves. At this point, we should be able to detect them in the ground plane.

Overlay Multiple Maps - Week 9 - Mar 11
As the vehicle moves through the course, it will be necessary to combine the information from maps generated at various positions and angles. At this point, the algorithm for doing so will be finished, and the code will be written and tested. (A pose-transform sketch appears after this week's milestones.)

Path Following - Week 9 - Mar 11
The robot must be able to follow a path, using sensor data to calculate the error in this path and correct it. To do this, a PID motion control system will be implemented at this stage. A test path will be made with GPS coordinates, and the robot must be able to follow the path as closely as possible. (A PID sketch appears below.)

Spring and Tire Tuning - Week 9 - Mar 11
In order to accurately predict and control the vibrations of the vehicle, the characteristics of the tires and casters must be determined. In particular, the spring rate and possibly the damping coefficient can be estimated experimentally. The wheels will first be mounted to the 2007 vehicle frame or some other rigid testing rig that allows vertical movement as the tires are loaded. Next, incremental weights will be added and the deflection measured. This experiment will be repeated at multiple tire air pressures. In a similar fashion, the caster spring rate will be estimated for multiple spring designs. This information can be used to properly model the movement of the vehicle. We will also attempt to estimate the damping coefficients associated with each air pressure and spring by measuring the response of each to a given set of initial conditions. Plots of the displacement response will be acquired from the accelerometer.
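
For the Path Following milestone, the PID loop acts on the heading error between the bearing to the next GPS waypoint and the current heading. A minimal sketch; the gains are placeholders for the values tuned during Motor Controller Tuning:

    // Generic PID controller; step() is called once per control cycle.
    class Pid {
    public:
        Pid(double kp, double ki, double kd) : kp_(kp), ki_(ki), kd_(kd) {}
        double step(double error, double dt) {
            integral_ += error * dt;
            double derivative = (error - prevError_) / dt;
            prevError_ = error;
            return kp_ * error + ki_ * integral_ + kd_ * derivative;
        }
    private:
        double kp_, ki_, kd_;
        double integral_ = 0, prevError_ = 0;
    };

    // Wrap an angle into (-180, 180] so the robot turns the short way.
    double wrapDeg(double a) {
        while (a > 180)   a -= 360;
        while (a <= -180) a += 360;
        return a;
    }

    // One control cycle: heading comes from the GPS compass and the
    // bearing from the current waypoint, via the hardware layer.
    double steerCommand(Pid& pid, double headingDeg,
                        double bearingToWaypointDeg, double dt) {
        return pid.step(wrapDeg(bearingToWaypointDeg - headingDeg), dt);
    }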

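For the Overlay Multiple Maps milestone, the core operation is transforming each locally sensed point into a shared world frame using the vehicle pose at sensing time: rotate by the heading, then translate by the position. A sketch, assuming a flat x-y world frame measured from the start point:

    #include <cmath>
    #include <vector>

    struct Pt    { double x, y; };
    struct Pose2 { double x, y, thetaRad; };  // vehicle pose at scan time

    // Rotate a local point by the vehicle heading, then translate.
    Pt localToWorld(const Pt& p, const Pose2& pose) {
        double c = std::cos(pose.thetaRad), s = std::sin(pose.thetaRad);
        return { pose.x + c * p.x - s * p.y,
                 pose.y + s * p.x + c * p.y };
    }

    // Merge a newly sensed local map into the accumulated world map.
    void overlay(std::vector<Pt>& worldMap,
                 const std::vector<Pt>& localMap, const Pose2& pose) {
        for (const Pt& p : localMap)
            worldMap.push_back(localToWorld(p, pose));
    }
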
Terrain - Week 9 - Mar 11
The vision system should be able to ignore terrain features such as ramps and sand pits. This should be operational at this point.

Final Map - Week 10 - Mar 18
This represents the final functioning stage of the obstacle detection system. All core functions should be written, and the robot should be able to execute the algorithm as designed. The result will be a map of the area visible to the sensors.

Emergency Stop Handling - Week 11 - Mar 25
The emergency distance sensors will need to be incorporated into the motor control and locomotion functionality. At this point, the two systems will be interacting reliably. An object coming within range of the distance sensors should stop the robot or initiate avoidance algorithms.

Nav Operation - Week 11 - Mar 25
This step processes the GPS coordinates and determines the shortest path between them. This will be done via brute force. The resulting path determines a general direction, and the robot will use this direction to create a weighted grid containing the obstacles the sensors find along the way. (A brute-force ordering sketch appears at the end of this plan.)

Following and Planning Code - Week 12 - Apr 1
At this point, the timing for the transition and interaction between following and planning modes should be finalized and functional.

Autonomous Operation - Week 14 - Apr 15 - All
At this point, the autonomous operation should be finished.

Final Debug - Week 15 - Apr 22 - All
The final debug stage is a final run-through of all the possible situations that we will encounter in the competition. We will start with the most basic problems, without any obstacles, to verify simple operation. We will then add obstacles in stages, verifying performance with each new configuration. Finally, we will conduct torture tests that pit the robot against the most difficult situations conceivable. Any bugs found will be fixed as they arise.
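
For the Nav Operation milestone, "shortest path via brute force" amounts to trying every visiting order of the GPS waypoints and keeping the cheapest. A sketch, using straight-line x-y distance as a stand-in for the real GPS distance; with n! orders to test, this is only feasible for a handful of waypoints:

    #include <algorithm>
    #include <cmath>
    #include <numeric>
    #include <vector>

    struct Wp { double x, y; };

    double dist(const Wp& a, const Wp& b) {
        return std::hypot(a.x - b.x, a.y - b.y);
    }

    // Return waypoint indices in the shortest visiting order.
    std::vector<int> shortestOrder(const std::vector<Wp>& wps) {
        std::vector<int> order(wps.size());
        std::iota(order.begin(), order.end(), 0);  // 0, 1, ..., n-1
        std::vector<int> best = order;
        double bestLen = 1e18;
        do {
            double len = 0;
            for (std::size_t i = 1; i < order.size(); ++i)
                len += dist(wps[order[i - 1]], wps[order[i]]);
            if (len < bestLen) { bestLen = len; best = order; }
        } while (std::next_permutation(order.begin(), order.end()));
        return best;
    }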