Wall-Follower

Xiaodong Fang
EEL5666 Intelligent Machines Design Laboratory
University of Florida, School of Electrical and Computer Engineering
TAs: Tim Martin, Josh Weaver
Instructors: Dr. A. Antonio Arroyo, Dr. Eric M. Schwartz

Table of Contents

Abstract
Introduction
Integrated System
Control Board
Mobile Platform
Actuation
Sensors
Software design
Experimental Results
Future improvement
Conclusion
Acknowledgement
References

Abstract

The Wall-Follower is an autonomous wall-following and color-detecting robot. It is a small (8 in × 8 in × 6 in), highly integrated intelligent system based on an Atmel ATxmega A1 microcontroller. It uses ultrasonic and infrared range finders, together with two motors at the rear of an aluminum platform, to implement obstacle avoidance and wall following. A CMU camera provides on-board color detection.

Keywords: intelligent machine, robot, color detection, wall following, CMU camera

Introduction

Automatic navigation is widely regarded as a critical part of machine intelligence. While outdoor navigation can be achieved with GPS, satellite-based navigation is of limited use indoors because of poor signal reception and the high accuracy required. Many indoor navigation methods have therefore been developed instead, including computer vision and combinations of sensors with different fields of view.

The Wall-Follower is an indoor navigation robot that automatically approaches and follows a wall. It also detects a specified color while following the wall. This capability can serve as a building block for more advanced robots in many indoor and outdoor applications, such as parallel parking or a vacuum robot locating its charging station.

Following this brief introduction, the overall integrated system is presented. The control board, the mobile platform, the actuation and the sensors are then described separately. After an overview of the software, experimental results, future improvements and conclusions are given at the end.

Integrated System

The integrated system of the Wall-Follower is made up of five major components: the sensors, an aluminum mobile platform, the actuation system (two motors that drive and steer the car), two battery packs and, most importantly, an Epiphany DIY control board. All other components are mounted on the aluminum platform. The sensors observe the outside world and pass their readings to the microcontroller on the control board. While running the pre-stored program, the microcontroller uses the sensor data to decide how to command the motors and move the car. The finished robot is shown in figure 1, and a component outline can be found in figure 2.

Figure 1, the finished robot
Figure 2, component outline

Control Board

The Epiphany DIY (figure 3) is a highly integrated, fully featured, open-source and open-hardware development board for the ATxmega A1 or A1U series [1][2][3]. In addition to the features of the ATxmega A1 itself, the board provides built-in motor and servo controllers, 3.3 V and 5 V regulators, a USB serial communication port, a 5 V to 3.3 V ADC range shifter, and more. The board is highly recommended for IMDL students because of its abundance of counters, serial communication ports, ADCs and other peripherals.

Figure 3, Epiphany DIY board
Figure 4, aluminum four-wheel-drive Arduino platform

Mobile Platform

The Wall-Follower is built on an aluminum four-wheel-drive Arduino platform (figure 4). The two motors and two battery packs are mounted inside the body, and the control board and all the sensors are mounted on top of the car body using plastic standoffs and aluminum sheets.

Actuation

Two 6 V geared motors drive the Wall-Follower, while the other two free-spinning wheels are mounted on empty motor cases. Driving forward, reversing and turning are therefore performed by controlling the speed of the two motors separately. However, even at the lowest workable PWM duty cycle the motors turned out to be too fast for this application. The solution is to power the motors, in software, for only part of each macroscopic cycle in addition to the PWM signal sent to the motors. This scheme is illustrated in figure 5: the PWM signal is applied during only part of each macroscopic cycle, so the motors repeatedly start and stop and the robot moves more slowly.

Figure 5, software-level calibration of motor speed
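The fragment below is a minimal sketch of this duty-cycling idea, not the actual IMDL code; motor_set_speed() and delay_ms() stand in for the Epiphany DIY motor-driver call and a blocking delay, and all of the numbers are illustrative.

/* Sketch of the macroscopic duty cycle: the PWM command is applied for only
   part of each cycle, so the motors repeatedly start and stop and the robot
   moves more slowly than the lowest usable PWM duty cycle alone would allow. */
enum { LEFT_MOTOR, RIGHT_MOTOR };
void motor_set_speed(int motor, int pwm_percent);  /* assumed driver wrapper */
void delay_ms(unsigned int ms);                    /* assumed blocking delay */

#define CYCLE_MS  100   /* length of one macroscopic cycle                  */
#define ON_MS      30   /* portion of the cycle during which PWM is applied */
#define PWM_DUTY   40   /* lowest workable PWM duty cycle, in percent       */

void drive_slow_step(void)
{
    motor_set_speed(LEFT_MOTOR,  PWM_DUTY);   /* motors on for part of the cycle */
    motor_set_speed(RIGHT_MOTOR, PWM_DUTY);
    delay_ms(ON_MS);
    motor_set_speed(LEFT_MOTOR,  0);          /* motors stopped for the rest     */
    motor_set_speed(RIGHT_MOTOR, 0);
    delay_ms(CYCLE_MS - ON_MS);
}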

Sensors

There are three types of sensors on the Wall-Follower: ultrasonic range finders, infrared range finders and a CMU camera. The two IR range finders at the front of the robot are used for obstacle avoidance. The two ultrasonic range finders on the side detect the wall and tell the microcontroller the robot's position relative to it. The CMU camera tracks a predetermined color while the robot follows the wall; it reports the position of the detected colored object within the camera's field of view so that the microcontroller can react to that position.

PING))) ultrasonic range finder

Figure 6, PING))) ultrasonic range finder

This wide-range sensor detects distances between 2 cm and 3 m with a narrow acceptance angle. It runs from a 5 V supply and uses standard TTL logic. Its single signal pin is used for both input and output. When triggered by a typical 5 µs positive pulse, it returns a pulse whose width is proportional to the measured distance; its timing diagram is shown in figure 7 [4]. The ATxmega microcontroller captures this pulse width with its internal event system, without using any interrupt to control the counter [3]. A series resistor is connected between the sonar's signal pin and the on-board I/O pin; together with a built-in clamp diode, it protects the on-board input transistor from the 5 V signal returned by the sonar.

Figure 7, timing diagram of the PING))) ultrasonic range finder

The two sonars are mounted on the same side of the robot, as shown in figure 1.
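As a rough illustration only (not the event-system implementation described above), the sketch below measures one PING))) echo by busy-waiting; sig_output(), sig_input(), sig_write(), sig_read(), delay_us() and micros() are assumed helpers for the shared signal pin and a free-running microsecond counter.

#include <stdint.h>

void     sig_output(void);            /* assumed: signal pin as output      */
void     sig_input(void);             /* assumed: signal pin as input       */
void     sig_write(uint8_t level);    /* assumed: drive the signal pin      */
uint8_t  sig_read(void);              /* assumed: sample the signal pin     */
void     delay_us(uint16_t us);       /* assumed: short blocking delay      */
uint32_t micros(void);                /* assumed: free-running us counter   */

uint16_t ping_read_cm(void)
{
    sig_output();                      /* trigger: ~5 us positive pulse     */
    sig_write(0); delay_us(2);
    sig_write(1); delay_us(5);
    sig_write(0);
    sig_input();                       /* release the pin, wait for echo    */

    while (!sig_read())                /* echo pulse starts                 */
        ;
    uint32_t start = micros();
    while (sig_read())                 /* width is proportional to distance */
        ;
    uint32_t width_us = micros() - start;

    return (uint16_t)(width_us / 58);  /* roughly 58 us of round trip per cm */
}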

If the front sonar reports a longer distance to the wall than the rear sonar, the robot turns left until its front is closer to the wall than its rear, and then drives straight to approach the wall. Once it has come within a certain distance of the wall, the Wall-Follower waggles along to follow the wall while detecting the color via the CMU camera.

Sharp GP2D120XJ00F IR range finder

This range finder detects distances from 3 cm up to 30 cm. It needs a 5 V supply and gives an analog output, ranging from 0 to 3.1 V, that is inversely proportional to the detected distance [5]. If the left IR detects an obstacle, the robot turns right; if the right IR detects an obstacle, the robot turns left; if both detect obstacles, the robot halts.

Figure 8, Sharp GP2D120XJ00F IR range finder
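A minimal sketch of this decision rule is shown below; adc_read() is an assumed wrapper around one ADC conversion, halt(), turn_left() and turn_right() are assumed motion helpers, and the threshold is purely illustrative (the GP2D120 output voltage rises as an object gets closer).

#include <stdint.h>

uint16_t adc_read(uint8_t channel);   /* assumed: one ADC conversion         */
void halt(void);                      /* assumed motion helpers              */
void turn_left(void);
void turn_right(void);

#define IR_LEFT_CH    0
#define IR_RIGHT_CH   1
#define IR_THRESHOLD  1800            /* raw ADC value for "obstacle close"  */

void avoid_obstacles(void)
{
    uint16_t left  = adc_read(IR_LEFT_CH);    /* higher reading = closer object */
    uint16_t right = adc_read(IR_RIGHT_CH);

    if (left > IR_THRESHOLD && right > IR_THRESHOLD)
        halt();                               /* obstacles on both sides: stop  */
    else if (left > IR_THRESHOLD)
        turn_right();                         /* obstacle on the left           */
    else if (right > IR_THRESHOLD)
        turn_left();                          /* obstacle on the right          */
}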

CMU camera v4 (special sensor)

Figure 9, CMU camera version 4 from Sparkfun.com

The CMU camera is a fully programmable embedded computer-vision sensor. It provides on-board color tracking at 160 × 120 resolution and up to 30 fps. It requires a 5 V supply and works with 3.3 V communication. Other features include micro SD card access and automatic servo control.

Using the CMU camera is not complicated: commands in human-readable ASCII are sent to the camera over a USART port and, after some latency, the camera returns an ASCII message indicating whether the command was acknowledged, together with any return values. Two examples are given below [6].

Figure 10, communication examples between the microcontroller and the CMU camera

In the figure, the text after the colon in gray is the command and the text in black is the camera's response, where ACK means the command was acknowledged. The first command asks the camera to track pixels whose red, green and blue channel values each lie between 0 and 100 (the maximum channel value is 255). The returned message begins with T, indicating the type of data; of all the numbers, the most important are the first two, the mean X and Y coordinates of the tracked pixels, which can be taken as the center of the detected colored object. The second command asks the camera to return a histogram of the image in 2^3 bins, and the data following H are the histogram values in those 8 bins.

When the mean X coordinate lies in the left half of the camera's view, the robot reverses to track the object; when it lies in the right half, the robot moves forward to track the object.
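The following sketch shows the flavor of this exchange for the color-tracking command, assuming hypothetical usart_puts() and usart_getline() wrappers for the USART connected to the camera; the command string matches the example above, and the parsing extracts only the mean X coordinate from the T packet.

#include <stdio.h>

void usart_puts(const char *s);            /* assumed: send a string to the camera */
void usart_getline(char *buf, int size);   /* assumed: read one CR-terminated line */

/* Ask the camera to track pixels with R, G and B each between 0 and 100,
   then return the mean X coordinate (0..159) of the tracked pixels, or -1. */
int camera_get_mean_x(void)
{
    char line[64];
    int  mx, my;

    usart_puts("TC 0 100 0 100 0 100\r");        /* track-color command [6]      */

    do {                                         /* skip the ACK and blank lines  */
        usart_getline(line, sizeof line);
    } while (line[0] != 'T');

    if (sscanf(line, "T %d %d", &mx, &my) == 2)  /* "T mx my ..." data packet     */
        return mx;
    return -1;
}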

Software design

The software was developed in Atmel Studio. All code is written in C, and some of the functions come from Out of the Box Robotics [7]. The software flow chart is shown in figure 11: after initialization and blinking an LED, the main loop reads the sonar and IR values; if an obstacle is detected the robot turns left or right, or halts; otherwise, if the robot is not yet following the wall it approaches and follows the wall, and once it is following the wall it gets the mean color coordinate from the camera and tracks the color.

Figure 11, software flow chart of the Wall-Follower
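A condensed sketch of this loop is shown below; it mirrors the flow chart rather than reproducing the actual project code, and every helper name is assumed (several reuse the hypothetical functions from the earlier sketches).

void init_board(void);        /* assumed: clocks, motor driver, ADC, USARTs */
void blink_led(void);
int  obstacle_detected(void); /* assumed: an IR reading is above threshold  */
int  wall_found(void);        /* assumed: side sonars within follow range   */
void avoid_obstacles(void);   /* from the IR sketch above                   */
void approach_wall(void);     /* turn toward the wall and drive straight    */
void follow_wall(void);       /* waggle along the wall                      */
void track_color(int mean_x); /* reverse or advance depending on mean X     */
int  camera_get_mean_x(void); /* from the CMU camera sketch above           */

int main(void)
{
    init_board();
    blink_led();

    for (;;) {
        if (obstacle_detected()) {
            avoid_obstacles();            /* turn left/right, or halt       */
        } else if (!wall_found()) {
            approach_wall();              /* not yet following: approach    */
        } else {
            follow_wall();                /* follow the wall ...            */
            int mx = camera_get_mean_x(); /* ... while tracking the color   */
            if (mx >= 0)
                track_color(mx);
        }
    }
}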

Experimental Results

The Wall-Follower achieved its goals on demo day. Approaching the wall works well. After it gets close to a wall and starts to follow it, the robot moves somewhat slowly because of the CMU camera's slow response to commands; nevertheless, it detects the color reliably and then tracks it.

Future improvement

Future improvements include faster detection of the colored object and automatic adjustment of the color thresholds. Many advanced applications can be built on the Wall-Follower's capabilities, including parallel parking and automatic indoor navigation.

Conclusion

This report has given a comprehensive introduction to the hardware and software design of a car robot, the Wall-Follower. The robot works well and shows great potential for future use.

Acknowledgement

The author gives special thanks to the TAs, Tim Martin and Josh Weaver. This robot could not have been completed without their generous help and encouragement.

References

[1] Out of the Box Robotics, Epiphany DIY, http://ootbrobotics.pixelgeko.com/store/products/epiphany-diy/
[2] 8-bit AVR XMEGA A1 Microcontroller Preliminary, http://www.atmel.com/images/doc8067.pdf
[3] 8-bit AVR XMEGA A Microcontroller Manual, http://www.atmel.com/images/doc8077.pdf
[4] PING))) Ultrasonic Distance Sensor (#28015) datasheet, http://www.parallax.com/dl/docs/prod/acc/28015-ping-v1.3.pdf
[5] Sharp GP2D120XJ00F IR Range Finder datasheet, https://www.sparkfun.com/datasheets/sensors/infrared/gp2d120xj00f_ss.pdf
[6] CMUcam4 Command List v1.02, http://cmucam.org/projects/cmucam4/documents
[7] Out of the Box Robotics, http://ootbrobotics.pixelgeko.com/