9th Intelligent Ground Vehicle Competition. Design Competition Written Report. Design Change Report AMIGO


9th Intelligent Ground Vehicle Competition
Design Competition Written Report / Design Change Report: AMIGO
(AMIGO means the friends who join the IGV Competition.)
Watanabe Laboratory Team, System Control Engineering Department, Hosei University
3-7-2 Kajinocho, Koganei, Tokyo 184-8584, Japan
e-mail: bob@k.hosei.ac.jp / Fax: +81-42-387-6123
June 4th, 2001

1. Introduction
The autonomous vehicle nicknamed AMIGO (friends) was designed by a team of one undergraduate and four graduate students from Hosei University. Professor Kajiro Watanabe and Assistant Professor Kazuyuki Kobayashi are the advisers. The team was organized in early April 2001. In the fall of 1999, four graduate students tentatively set out to design a new vehicle, later nicknamed AMIGO, and the team members have since improved on the 2000 version of AMIGO. Figure 1 shows the team organizational chart; all team members in this chart are cross-listed in the team roster shown in Table 1.

The vehicle AMIGO is a four-wheel-driven autonomous vehicle powered entirely by a 24-volt DC power source. "Vision of dragonfly" and "robustness" are the key words that express the design concept of AMIGO. An omni-directional camera, a one-directional camera, and a laser radar are employed in the new model, and new algorithms were designed to make full use of these sensing devices. By "robustness" we mean that the vehicle autonomously runs a sharply curving course with a variety of obstacles toward the goal in a very severe outdoor environment.

Figure 1 The team organizational chart
  Team Leader: Kenji Amada
  Software design / Autonomous Challenge: Kei Odagiri
  Hardware design / Electrical design: Kei Odagiri
  Design Competition Presentation: Ken Ishikawa
  Navigation Challenge: Masayoshi Ito
  Body design: Kenji Amada
  Written report: Kenji Amada
  Follow the Leader: Shinya Ogawa

Table 1 Team roster
  Function          Name           Major                       Academic Level
  Team Leader       Kenji Amada    System Control Engineering  Graduate
  Technical Leader  Kei Odagiri    System Control Engineering  Graduate
  -                 Masayoshi Ito  System Control Engineering  Graduate
  -                 Shinya Ogawa   System Control Engineering  Graduate
  -                 Ken Ishikawa   System Control Engineering  Undergraduate

2. Design Process
The vehicle AMIGO was designed through the following processes:
(1) Mechanical design
  (a) Base vehicle
  (b) Arrangement of the components (Changed!)
  (c) Body design (Changed!)
(2) Electric and sensing hardware design
  (a) Vehicle design: power supply system; controller hardware
  (b) Sensing design: vision sensing (Changed!); obstacle sensing (Changed!)
(3) Software design
  (a) Autonomous Challenge: lane detection algorithm (New!); obstacle detection; autonomous driving control
  (b) Navigation Challenge (New!): navigation algorithm; vehicle control algorithm
  (c) Follow the Leader Challenge: leader vehicle detection (Changed!); follower vehicle control (New!)
(4) Other design
  (a) Safety: mechanical braking system; E-Stop

3. Mechanical Design
3.1 Base Vehicle
The vehicle was developed on the basis of an electric wheelchair (SUZUKI Co. Ltd.) for people with leg disabilities. The vehicle body is 90 cm long and 180 cm tall, and weighs 80 kg. The base vehicle has two DC motors for driving and a stepping motor for steering. For convenience in airplane transportation, our choice of base vehicle was limited: it had to be compact, and thus we selected the electric wheelchair as the base. The maximum speed is 6.0 km/h, and the vehicle can climb a 50 mm ramp.

3.2 Arrangement of the Components (Changed!)
The vehicle AMIGO has an upper stage and a lower row. A personal computer for control and the E-Stop elements are set on the upper stage. The lower row contains a power supply circuit and a DC-AC converter, and will also carry the payload. Two 2-D color CCD cameras are used for image processing. One camera is set at the highest position of the vehicle within the regulation limit, to acquire as much environmental information as possible; the other is set at a low front position to recognize the near environment. The laser radar, used for obstacle detection and avoidance and for the Follow the Leader challenge, is set at the tip of the vehicle.

3.3 Body Design (Changed!)
The body covers all components except the personal computer, for which a removable roof was prepared to protect it from rainfall and shade it from sunshine. The cover is made compactly from lightweight materials. We installed some small air blowers to keep the PC from overheating.

4. Electric and Sensing Hardware Design
4.1 Electrical Design
(1) Power Supply System
Two 12 V 35 Ah batteries are mounted on the vehicle. The system produces DC 12 V, DC 24 V, and AC 100 V: DC 12 V for the CCD cameras and the RS422-RS232C converter, DC 24 V for the laser radar, and AC 100 V for the personal computer.
(2) Controller Hardware
The vehicle can be operated either manually or autonomously. In manual mode, the speed of the vehicle and the steering angle are given by the joystick. In autonomous mode, the personal computer calculates the speed and the angle based on information acquired by the color CCD cameras and the laser radar. When the E-Stop button is pushed or the E-Stop radio signal is transmitted to the E-Stop circuit, the vehicle stops.
(3) Battery Life
Twelve hours are needed to charge the batteries from empty to full. The vehicle can run for about 5 hours on a full charge.

4.2 Sensing Design
(1) Vision Sensing (Changed!)
The dragonfly vision system shown in Figure 2 was developed. Through the 1999 competition, we recognized that the main cause of errors in detecting the lane of the course was the limitation of camera characteristics: the dynamic range of the camera is so narrow that it cannot capture both shadowed and sunlit areas in one frame of the image. Furthermore, the field of view was not wide enough to let the vehicle run reliably. Thus, since 2000, an omni-directional optical method has been employed to solve the problems we experienced in 1999. The omni-directional optical system combines a color CCD camera with a conventional lens and a hemispherical mirror of the kind used as a front mirror on a large truck or bus. The mirror is set so that its center faces the ground and the camera looks up at the center of the mirror from below. This simple arrangement yields an omni-directional optical system that acquires a 360-degree image. A color CCD camera with automatic focus, automatic iris, and an optical filter was employed. The vision system covers a field of 3 m radius from the camera position. The image acquired by the camera is transmitted to the personal computer via an image capture board. This year, we employed one more CCD camera, set at the tip of the vehicle to look at the near area: the omni-directional camera installed at the high position covers a wide area, and the front camera looks at the near front area.

(2) Obstacle Sensing (Changed!)
A laser radar with a scan angle of 180 degrees and an angular resolution of 0.5 degrees is used to detect obstacles. The maximum measurement distance is 50 m, the range resolution is 5 cm, and the response time is 40 ms. Figure 3 shows the laser radar.

Figure 3 Laser radar
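A scan with the specification above yields 361 range readings per sweep, which are later converted from polar to x-y coordinates for obstacle mapping. The following is a minimal Python sketch of that conversion (the team's software was written in MATLAB; the function name and frame convention here are illustrative):

```python
import math

SCAN_ANGLE_DEG = 180.0   # total scan angle of the radar
ANGULAR_RES_DEG = 0.5    # angular resolution
MAX_RANGE_M = 50.0       # longest measurement distance

def scan_to_xy(ranges):
    """Convert one radar scan (list of ranges in meters, one per 0.5-degree
    step from 0 to 180 degrees) into (x, y) points in the sensor frame."""
    points = []
    for i, r in enumerate(ranges):
        if r <= 0.0 or r > MAX_RANGE_M:
            continue  # drop invalid or out-of-range readings
        theta = math.radians(i * ANGULAR_RES_DEG)  # 0 deg = right, 180 deg = left
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# A reading of 10 m straight ahead (index 180 corresponds to 90 degrees):
scan = [0.0] * 361
scan[180] = 10.0
print(scan_to_xy(scan))  # one point at approximately (0.0, 10.0)
```

The resulting points can then be drawn onto the camera image or a local map, as described in Section 5.1.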

5. Software Design
The computer software was programmed in MATLAB on Microsoft Windows 9x. The use of MATLAB is advantageous: complicated programming is simplified by the many supplied functions, and development time is considerably shortened by MATLAB's interpreted nature. The most effective feature for our purposes is GUI (Graphical User Interface) programming. For some special parts of the program, MATLAB Executable (MEX) functions were developed in C or C++. Separate software was developed for each challenge.

5.1 Autonomous Challenge
(1) Lane Detection (New!)
A vision system that robustly detects the lane of the course under various environmental changes was developed. The lane detection method was designed to make full use of the omni-directional camera. The captured image is one seen from above the vehicle, which is intuitively understandable for a human, so algorithm development becomes easy. However, the captured image is strongly distorted. Figure 4-(a) shows the original image with the strong distortion; Figure 4-(b) shows the image reconstructed under the assumption that the ground is flat and horizontal. The reconstructed image is used to detect the lane. The Active Contour Model (ACM) method, known as a robust method for detecting the contour of a target, was employed to detect the lanes. The lane-detection image processing by ACM is shown in Figure 4-(b) and Figure 4-(c). The additional camera at the tip compensates for errors in lane detection by the omni-directional camera.
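The flat-ground reconstruction can be sketched as follows. This is a minimal Python sketch, not the team's MATLAB implementation; the radially symmetric mirror model, the focal constant f, and the camera height h are illustrative assumptions, since the report does not give the actual calibration:

```python
import math

def ground_to_image_radius(d, f=120.0, h=1.5):
    """Hypothetical calibration curve: image radius (pixels) as a function of
    ground distance d (meters) from the camera axis, assuming flat,
    horizontal ground. A real system would calibrate this for its mirror."""
    return f * math.atan2(d, h)

def unwarp(omni_pixel_at, center, out_w=360, out_h=100, max_d=3.0):
    """Build a bird's-eye view: column = bearing (degrees), row = distance.
    omni_pixel_at(x, y) returns the pixel of the omni image at (x, y)."""
    cx, cy = center
    out = [[None] * out_w for _ in range(out_h)]
    for row in range(out_h):
        d = max_d * (row + 1) / out_h      # ground distance for this row
        rho = ground_to_image_radius(d)    # image radius for that distance
        for col in range(out_w):
            phi = math.radians(col)        # bearing around the vehicle
            x = cx + rho * math.cos(phi)
            y = cy + rho * math.sin(phi)
            out[row][col] = omni_pixel_at(int(round(x)), int(round(y)))
    return out
```

Each output row samples a ring of the omni image at the radius corresponding to one ground distance, which produces the overhead view used for lane detection.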

Figure 4 (a) Original image; (b) Reconstructed image; (c) Extraction

(2) Obstacle Detection
(a) Detection using the laser radar
Obstacle data are acquired from the laser radar. The obstacle position, initially given by the laser in polar coordinates, is converted into x-y coordinates, and the converted position data are mapped onto the image. The route the vehicle will take is calculated from the size and extent of the obstacle. If the vehicle recovers its steering angle too soon after passing an obstacle, it may collide with it; to avoid this, we keep the obstacle data in memory to remember that the obstacle is still there. Figure 5 shows the debugging screen.

Figure 5 Debugging screen

(b) Obstacle detection using color information
In the competition, a variety of obstacles, and thus a variety of colors and shapes, are selected, and it is very difficult to handle all of them. The gray-level information of the image is not enough to discriminate among them, so we also employ color information. The obstacles are mainly detected by the laser radar, but those lying low on the ground are frequently missed by the radar; an image-based obstacle avoidance strategy covers such low-lying obstacles. First, the color information of the obstacles is stored beforehand as knowledge using the k-means method. The colors of detected obstacles are then compared with those stored as knowledge.

(3) Autonomous Driving Control
The omni-directional camera and the lane detection signal processing provide the information on where the vehicle should go. With this information, pre-view (optimal) control can be employed, and this pre-view control strategy was installed in the vehicle. If the vehicle approaches the lane and is about to cross it, the front camera catches the situation and a control signal to avoid the crossing is generated.
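The color-knowledge step can be sketched as follows (Python sketch; the cluster count, sample colors, and distance threshold are illustrative assumptions, not the team's actual calibration):

```python
import math

def dist2(a, b):
    """Squared Euclidean distance between two RGB triples."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mean(pts):
    """Component-wise mean of a list of RGB triples."""
    n = len(pts)
    return tuple(sum(p[i] for p in pts) / n for i in range(3))

def kmeans(points, k, iters=20):
    """Plain k-means on RGB triples; returns the cluster centers."""
    centers = points[:k]  # naive initialization from the first k samples
    for _ in range(iters):
        buckets = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: dist2(p, centers[c]))
            buckets[j].append(p)
        centers = [mean(b) if b else centers[j] for j, b in enumerate(buckets)]
    return centers

def is_known_obstacle_color(pixel, centers, thresh=40.0):
    """Compare a pixel against the stored obstacle-color knowledge."""
    return any(math.sqrt(dist2(pixel, c)) < thresh for c in centers)

# Learn from samples of two hypothetical obstacle colors, then classify:
samples = [(250, 60, 40), (245, 55, 45), (255, 120, 0), (250, 125, 5)]
centers = kmeans(samples, k=2)
print(is_known_obstacle_color((248, 58, 42), centers))  # True: near the red cluster
print(is_known_obstacle_color((30, 200, 30), centers))  # False: green is unknown
```

In practice the training samples would come from images of the actual competition obstacles, and pixels matching a stored cluster would be flagged as obstacle regions.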

5.2 Navigation Challenge (New!)
(1) Navigation Algorithm
The vehicle is navigated using GPS latitude and longitude information. However, GPS data may include errors and cannot, by themselves, guide the vehicle correctly to the destination. First, to obtain the necessary information, the following procedures are carried out: (1) transformation from global coordinates to local coordinates, and (2) calibration of the initial position.

Figure 6

Second, to mitigate the effect of GPS errors, the GPS data are statistically analyzed. Once correct local coordinates are available, the navigation algorithm can be developed easily.

(2) Vehicle Control Algorithm
The destination position is calculated in local coordinates, and the vehicle position is given by the GPS. The GPS data are filtered to obtain a more accurate self-position. Once the starting self-position is estimated, the starting angle of the vehicle toward the destination is adjusted and the vehicle begins to run. While running, the steering is controlled based on the gyro information.
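The global-to-local transformation can be sketched with a flat-Earth (equirectangular) approximation, which is adequate over a course of a few hundred meters. This Python sketch is an assumption about the projection, since the report does not specify one, and the simple averaging stands in for the report's unspecified statistical analysis:

```python
import math

EARTH_R = 6378137.0  # Earth equatorial radius in meters (WGS-84)

def gps_to_local(lat_deg, lon_deg, origin_lat_deg, origin_lon_deg):
    """Convert latitude/longitude to local (east, north) meters relative to a
    calibrated origin, using a flat-Earth approximation valid for the short
    distances of the course."""
    lat0 = math.radians(origin_lat_deg)
    east = EARTH_R * math.radians(lon_deg - origin_lon_deg) * math.cos(lat0)
    north = EARTH_R * math.radians(lat_deg - origin_lat_deg)
    return east, north

def filtered_position(fixes):
    """Average a window of local-coordinate GPS fixes to suppress noise."""
    n = len(fixes)
    return (sum(e for e, _ in fixes) / n, sum(nm for _, nm in fixes) / n)

# 0.001 degree of latitude is roughly 111 m of northing:
e, n = gps_to_local(35.001, 139.000, 35.000, 139.000)
print(round(n, 1))  # 111.3
```

The origin would be set during the initial-position calibration step, after which waypoints and the vehicle's own fixes all live in the same local frame.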

5.3 Follow the Leader
(1) Leader Vehicle Detection (Changed!)
The follow-the-leader vehicle mounts an image sensor as well as the laser radar, and a new algorithm based on these two sensors is presented to follow the leader. The image sensor is special: it is the omni-directional camera. Its 360-degree image can be expressed in the same polar coordinates in which the laser radar data are expressed, so the image and range information are naturally fused in one coordinate frame.

Figure 7

The conventional follow-the-leader vehicle mounted only the laser radar; the new scheme with laser radar and omni-directional camera provides a more robust sensing function than the conventional one. The image in this coordinate frame is used to catch the leader vehicle, and the use of two different kinds of information yields robust following.

(2) Follower Vehicle Control (New!)
The follow-the-leader vehicle requires not only lateral control but also longitudinal control. The longitudinal control keeps the distance between the two vehicles constant; as the control strategy, sliding mode control, one of the robust control methods, is used. The lateral control requires a special algorithm to prevent the vehicle from cutting corners of the leader's driving loci. A new algorithm to trace the leader vehicle's driving loci is presented.
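The distance-keeping control can be sketched as a basic sliding mode law (Python sketch; the sliding surface, gains, desired gap, and boundary layer are illustrative assumptions, as the report gives no equations):

```python
def sliding_mode_speed(gap, gap_rate, leader_speed,
                       desired_gap=2.0, lam=1.0, k=0.8, boundary=0.1):
    """Sliding mode longitudinal control for the follower vehicle.

    gap:          measured distance to the leader (m)
    gap_rate:     rate of change of the gap (m/s)
    leader_speed: estimated leader speed (m/s)

    Sliding surface s = gap_rate + lam * (gap - desired_gap); driving s to
    zero makes the gap converge to desired_gap. A saturated switching term
    (boundary layer) replaces a pure sign(s) to reduce chattering.
    """
    s = gap_rate + lam * (gap - desired_gap)
    sat = max(-1.0, min(1.0, s / boundary))  # saturated switching term
    return leader_speed + lam * (gap - desired_gap) + k * sat

# Too far behind (gap 5 m vs. desired 2 m): command exceeds leader speed.
print(sliding_mode_speed(gap=5.0, gap_rate=0.0, leader_speed=1.5) > 1.5)  # True
# At the desired gap with matched speed: command equals leader speed.
print(sliding_mode_speed(gap=2.0, gap_rate=0.0, leader_speed=1.5))  # 1.5
```

The switching term gives the robustness to disturbances that motivates sliding mode control here; the lateral path-tracing algorithm would run alongside this law.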

6. Other Design Issues
6.1 Safety
(1) Mechanical Braking System
The original base vehicle (an electric-powered four-wheeled wheelchair) itself has a mechanical braking system for safety purposes. When both the speed control and steering control signals are 0, the vehicle is mechanically braked. Furthermore, if all of the vehicle's batteries are off or exhausted, the vehicle is again mechanically braked and locked. In addition, sudden steering is limited during high-speed driving in order to prevent rollover.
(2) E-Stop
We designed two different types of emergency stop mechanisms. The first is a contact switch at the back of the pillar, where an operator can easily reach it. The second is stopping via wireless transmission: the stop signal is transmitted by a transceiver. We use a transceiver so that various signals besides the stop signal can be transmitted; it has a wide communication range of up to 2 km on an open site. This function is used only for debugging and must not be used in the competition. Figure 8 shows the E-Stop.

Figure 8 E-Stop

6.2 Innovations
The renewed items of the changed vehicle are as follows:
(1) Autonomous Challenge

The development of the dragonfly vision system is the first innovation: two cameras with different functions, one looking around a 360-degree view and the other looking at a local focus area, just like dragonfly vision, were installed.
(2) Navigation Challenge
This challenge is our first trial, and everything about it is innovative. A discrete GPS measurement method was employed, and between GPS measurements the vehicle runs based only on the gyro information.
(3) Follow the Leader
The omni-directional camera to detect the leader vehicle is the first innovation. The new follower vehicle control algorithms, i.e., the path tracking algorithm for lateral control and sliding mode control for longitudinal control, are the second innovation in the Follow the Leader Challenge.

6.3 Cost
The costs of developing the AMIGO vehicle are summarized in Table 2. The most expensive item was the laser radar, the second-most expensive was the base vehicle (electric four-wheeled chair), and the third-most expensive were the PCs.

Table 2 Cost of the vehicle's design
  Item                                                  Cost     Remarks
  Electric Powered Wheelchair (SUZUKI Co. Ltd.)         $5,500   Base vehicle
  Personal Laptop Computer x 2 (Panasonic and Fujitsu)  $3,800   Pentium III 800 MHz / Pentium MMX 150 MHz
  CCD camera (SONY) x 2                                 $400
  Transceiver                                           $200     For wireless E-Stop
  Laser Radar (SICK)                                    $9,000
  Electronics Parts                                     $400     Electric parts
  Mechanical Parts                                      $250     Frame steel
  Body Cover                                            $100
  Total                                                 $19,650