Autonomous Landing of an Unmanned Aerial Vehicle

Joel Hermansson, Andreas Gising
CybAero AB
SE-581 12 Linköping, Sweden
Email: {joel.hermansson, andreas.gising}@cybaero.se

Martin Skoglund and Thomas B. Schön
Division of Automatic Control
Linköping University
SE-581 83 Linköping, Sweden
Email: {ms, schon}@isy.liu.se

Abstract: This paper is concerned with the problem of autonomously landing an unmanned aerial vehicle (UAV) on a stationary platform. Our solution consists of two parts: a sensor fusion framework producing estimates of the UAV state and a control system computing appropriate actuator commands. Three sensors are used: a camera, a GPS and a compass. Besides describing the solution, we also present experimental results obtained when using our system to autonomously land a UAV.

I. INTRODUCTION

This is an industry application paper, where we provide a solution to the problem of autonomously landing an unmanned helicopter, hereafter also referred to as an unmanned aerial vehicle (UAV), using measurements from a GPS, a compass and a camera. In Figure 1 we show a successful landing during winter conditions. In this paper we briefly describe the problem and our solution, which we divide into two parts: the sensor fusion framework and the control system. The sensor fusion framework produces estimates of the UAV state and the control system computes appropriate actuator commands. These systems are briefly introduced in Section II and Section III, respectively. Furthermore, we also provide results from an experimental evaluation of our solution.

The work presented in this paper constitutes an important part of CybAero's patented system for landing helicopters on moving vehicles. This system is called MALLS, Mobile Automatic Launch and Landing Station, and includes subsystems for close navigation, precision landing, post-landing helicopter fixation, wave compensation and take-off support.

In a recent overview of aerial robotics [1], the problem of autonomously landing a UAV is pinpointed as an active research area where more work is needed. Despite this, the problem is by no means new; several interesting approaches have already been presented, see e.g. [2]-[6]. In [2] the authors make use of a GPS and a vision system to autonomously land a UAV on a pattern similar to ours. They report 14 test flights with an average position accuracy of 40 cm. In comparison, our fifteen landings during the last test flight resulted in a mean position accuracy of 34 cm. Furthermore, the vision system in [2] only provides estimates of the horizontal position and the heading angle, whereas our sensor fusion framework also provides estimates of the altitude and the roll and pitch angles.

Fig. 1. A snapshot of a successful outdoor autonomous precision landing.

II. SENSORS AND SENSOR FUSION

In the sensor fusion framework we use three sensors: a camera, a GPS and a compass. The GPS provides measurements of the position and the velocity and the compass measures the heading. The camera is used by the vision system, which provides measurements of the position and orientation (pose) of the UAV relative to the platform. A time-varying bias in the GPS measurements implies that the vision system is required in order to land with high precision. The measurements from the sensors introduced above are combined into a resulting state estimate using an Extended Kalman Filter (EKF), see for instance [7].
The output from the EKF is an estimate of the state X, which is given by

    X = [ Ẍ  Ÿ  Z̈  Ẋ  Ẏ  Ż  X  Y  Z  θ̇  ϕ̇  γ̇  θ  ϕ  γ  P_X  P_Y  P_Z  P_θ ]ᵀ,    (1)

where X, Y and Z are the position of the helicopter in the ground coordinate system and θ, ϕ and γ are Euler angles describing the orientation of the helicopter. Furthermore, P_X, P_Y and P_Z are the position of the platform in the ground coordinate system and P_θ describes the orientation of the landing platform, which is assumed to lie flat on the ground.
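
The paper does not spell out the process and measurement models used in the EKF. As a rough illustration of how the three sensors could be fused around the state in (1), the sketch below assumes generic linear(ized) models; the matrices F, Q, H and R, and the state indexing, are placeholder assumptions and not the authors' actual design or tuning.

    # Minimal EKF sketch (an assumption-laden illustration, not the authors' code).
    import numpy as np

    def ekf_predict(x, P, F, Q):
        """Time update: propagate the state estimate x and covariance P."""
        x = F @ x
        P = F @ P @ F.T + Q
        return x, P

    def ekf_update(x, P, z, h, H, R):
        """Measurement update with measurement z, model h(x) and Jacobian H."""
        y = z - h(x)                      # innovation
        S = H @ P @ H.T + R               # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
        x = x + K @ y
        P = (np.eye(len(x)) - K @ H) @ P
        return x, P

    # Example: a GPS position measurement of (X, Y, Z), which are entries 6-8
    # of the state vector in (1) (zero-based indexing assumed here).
    n = 19
    H_gps = np.zeros((3, n)); H_gps[:, 6:9] = np.eye(3)
    h_gps = lambda x: H_gps @ x

Compass and vision measurements would enter the same update step with their own (generally nonlinear) measurement functions and Jacobians.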

A. GPS

The GPS provides measurements of the position and the velocity of the helicopter by measuring the time it takes for the radio signals to travel from the satellites to the receiver. There is an unknown time-varying bias present in these measurements. The bias is for example caused by inaccurate estimation of the satellite positions, by the satellite signals bouncing on objects in the environment before reaching the receiver, and by time measurement inaccuracies. The bias in the GPS measurements can be seen in Figure 2, which shows measurements from a GPS lying still on the ground.

Fig. 2. Measurements from the GPS lying still on the ground for three minutes. The bias in the measurements is obvious as the actual position is constant. The arrow shows a sudden jump in the measurements caused by a change in the satellites included in the GPS position estimation.

B. Vision system

The vision system uses camera images together with a predefined pattern of rectangles on the platform, see Figure 3, to estimate the helicopter pose relative to the platform. The estimation is based on finding points in the camera image which correspond to known positions on the platform. In this case, the known points are the corners of the white rectangles.

Fig. 3. A photograph of the landing platform.

In order to get the vision system to run in real time, a fast algorithm for finding the corners of the pattern rectangles is required. Algorithm 1 uses the knowledge of an approximate position of the landing platform and the camera in order to speed up the image processing. Given this knowledge, only a small part of the image needs to be searched in order to find the corners. Using Algorithm 1, the vision system can easily run at the camera frame rate, which is 30 Hz.

Algorithm 1 Track pattern
  calculate the expected corners and centers of the blobs given an approximate position of the camera and the landing platform
  calculate a threshold separating the dark background from the white rectangles
  for each blob b do
    get the expected middle m of b
    for each expected corner c of b do
      set v to the normalized vector from m to c
      set the new corner c_new = m
      set moved = true
      while moved do
        set moved = false
        set v_cw to v rotated 45 degrees CW
        set v_ccw to v rotated 45 degrees CCW
        if image intensity at c_new + v >= threshold then
          set c_new = c_new + v and moved = true
        else if image intensity at c_new + v_cw >= threshold then
          set c_new = c_new + v_cw and moved = true
        else if image intensity at c_new + v_ccw >= threshold then
          set c_new = c_new + v_ccw and moved = true
        end if
      end while
      the new position of c is c_new
    end for
  end for

III. CONTROL SYSTEM

Figure 4 gives an overview of the control system. The Helicommand is a pilot support system that uses accelerometers and gyroscopes to stabilize the helicopter. The high level controller uses the low level controller to actuate its decisions.

Fig. 4. An overview of the controller hierarchy.

The low level controller provides the essential hovering capability and controls the helicopter to a certain reference position. The high level controller makes larger decisions such as which position to fly to, when to land and so on. The low level controller consists of anti-windup PID controllers for roll, elevator and collective pitch, and a customized P-controller for the heading (yaw). The initial parameters were obtained from simulations using a model identified from flight test data. The parameters were then further tuned during test flights.
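
The controller equations are not given in the paper. The following is a minimal sketch of one anti-windup PID channel of the kind described above, using simple clamping anti-windup; the gains, output limits and sample time are placeholder assumptions, not the parameters identified from the flight test data.

    # Sketch of a single anti-windup PID channel (e.g. roll); values are placeholders.
    class AntiWindupPID:
        def __init__(self, kp, ki, kd, dt, u_min, u_max):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.u_min, self.u_max = u_min, u_max
            self.integral = 0.0
            self.prev_error = 0.0

        def step(self, reference, measurement):
            error = reference - measurement
            derivative = (error - self.prev_error) / self.dt
            u = self.kp * error + self.ki * self.integral + self.kd * derivative
            u_sat = min(max(u, self.u_min), self.u_max)
            # Clamping anti-windup: stop integrating when the output is saturated
            # and the error would push it further into saturation.
            if u == u_sat or (u > u_sat) != (error > 0.0):
                self.integral += error * self.dt
            self.prev_error = error
            return u_sat

    # One controller per actuated channel, stepped at a fixed rate (50 Hz assumed here).
    roll_pid = AntiWindupPID(kp=0.8, ki=0.2, kd=0.1, dt=0.02, u_min=-1.0, u_max=1.0)

A customized P-controller for yaw, as mentioned above, would simply scale the heading error without the integral and derivative terms.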

The high level controller consists of two modes, referred to as assisted mode and landing mode, respectively. In assisted mode, the ground station operator is able to command the helicopter to certain positions. In landing mode, the UAV finds the landing platform given an approximate GPS position and uses the vision system to land autonomously with high precision. The landing strategy employed by the high level controller is provided in Algorithm 2. The goal is to always keep the helicopter horizontally above the platform while decreasing the altitude stepwise. The initial altitude is 5 m above the platform and this altitude is decreased when certain conditions apply, according to Algorithm 2.

One problem during landing is to actually get the helicopter down to the ground. When the helicopter is less than one meter from the ground the dynamics change. The helicopter becomes more sensitive in its horizontal position and less lift is required to keep it flying. This is called the ground effect and is caused by disturbance of the airflow around the helicopter when it comes close to the ground. A solution to this problem is to get the helicopter down to the ground quickly. Therefore, the last step in the landing phase is to set the altitude reference to 3.0 m below the platform. This takes the helicopter down fast, to avoid drifting away horizontally, but still not too fast, in order to get a smooth landing.

Algorithm 2 Landing algorithm
  set the reference height above the platform, rh = 5.0 m
  while controlling do
    set the horizontal reference position to the estimated position of the platform
    set d to the distance between the helicopter and the platform
    set s to the helicopter speed
    set a to the difference between the actual height above the platform and the reference height above the platform
    if d < 0.7 m AND s < 0.2 AND a < 0.2 AND rh > 2.0 m then
      rh = rh - 1.0 m
    else if d < 0.5 m AND s < 0.15 AND a < 0.2 AND rh > 1.0 m then
      rh = rh - 0.5 m
    else if d < 0.2 m AND s < 0.1 AND a < 0.2 AND rh > 0.0 m then
      rh = -3.0 m
    end if
  end while

In Figure 5, data from a real test flight is shown. The data demonstrates how the helicopter finds the platform before landing. During the first 100 seconds the helicopter was hovering above its starting position (0, 0). After that, the high level controller was changed from assisted to landing mode. The high level controller immediately changes the reference position to an approximate platform position that was given before the test started. We can see from the GPS measurements that the helicopter starts moving towards this position. When the helicopter is close to the platform, the vision system detects the platform and starts producing measurements. Using these measurements, the EKF estimates a new platform position, which again causes a sudden change in the reference position. Instead of modeling the GPS bias in the EKF, the landing platform's position is changed. Therefore, the platform position, and hence the reference position, keeps changing after the platform has first been seen.

Fig. 5. GPS and vision system position estimates from real test flight data: (a) longitude position, (b) latitude position. In order to compare the GPS and the camera measurements, the camera measurements have been transformed into the ground coordinate system. The reference position, which is the desired position of the helicopter, is also given. From the start the reference position is the origin of the ground coordinate system. After 100 seconds the helicopter is commanded to the platform and then the reference position is the same as the platform position estimate.
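
The transformation from platform-relative camera measurements to the ground coordinate system, mentioned in the caption of Figure 5, is not spelled out in the paper. A minimal sketch under the simplifying assumption that only the platform yaw P_θ from (1) matters (the platform is assumed to lie flat) could look as follows; all numbers are purely illustrative.

    # Sketch: express a vision measurement given relative to the platform in the
    # ground coordinate system, using the estimated platform pose from (1).
    # Assumption: the platform lies flat, so a rotation about the vertical
    # axis by P_theta is sufficient.  Illustration only, not the authors' code.
    import numpy as np

    def platform_to_ground(p_rel, platform_pos, platform_yaw):
        """p_rel: UAV position relative to the platform (3-vector).
        platform_pos: (P_X, P_Y, P_Z) in the ground frame.
        platform_yaw: P_theta in radians."""
        c, s = np.cos(platform_yaw), np.sin(platform_yaw)
        R = np.array([[c, -s, 0.0],
                      [s,  c, 0.0],
                      [0.0, 0.0, 1.0]])
        return np.asarray(platform_pos) + R @ np.asarray(p_rel)

    # Example: the vision system reports the UAV 0.3 m to the side of the platform
    # centre and 2.0 m above it; the platform pose values here are made up.
    ground_xyz = platform_to_ground([0.3, 0.0, 2.0], [10.0, -5.0, 0.0], np.deg2rad(30.0))
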
IV. HARDWARE ARCHITECTURE

The system mainly consists of a helicopter and a ground control station. The ground control station receives data from the helicopter through a wireless network in order to continuously provide the operator with information about the helicopter state. The operator can also send commands over the network to the helicopter.

The helicopter used is an Align TREX 600 model helicopter, see Figure 6. It weighs 5 kg with all the equipment and has an electric engine.

Fig. 6. The helicopter and its equipment.

The helicopter can be operated in two modes: autonomous mode and manual mode. In manual mode the pilot has command of the control signals and in autonomous mode the computer controls the helicopter. This is done by a switch, see Figure 7, which is controlled by the pilot.

This means that the pilot can take control of the helicopter at any time. In both manual and autonomous mode the helicopter is controlled through a pilot support system called Helicommand, see Figure 7. The Helicommand uses accelerometers and gyroscopes to stabilize the helicopter. As already described in Section II, three different sensors were used: a GPS, a compass and a camera, which provide the control computer with measurements. The measurements are used to compute an estimate of the state of the system in order to make control decisions.

Fig. 7. An overview of the system. The communication directions are marked with arrows.

V. FLIGHT TESTS

Real flight tests have been made in order to develop and validate the controllers and the landing strategies. Below, the experimental setup and the results of these tests are described.

A. Experimental Setup

During all flight tests a backup pilot, who can control the helicopter using an RC controller, is available. The pilot can switch the helicopter between manual and autonomous mode. In manual mode the pilot controls the helicopter and in autonomous mode the control computer has command over the helicopter. In this work no algorithm for starting the helicopter has been developed and therefore this is carried out by the pilot. The pilot has also been supporting the development by taking command of the helicopter when something goes wrong.

A test of the landing algorithm starts by providing the high level controller with an approximate position of the landing platform. After that, the pilot takes the helicopter to a certain altitude and then switches to autonomous mode. When the controllers have settled, the user of the ground station changes the high level controller from assisted to landing mode. This causes the high level controller to initiate the landing sequence according to Algorithm 2.

B. Results

A perfect landing has the center of the helicopter positioned in the middle of the landing platform. During development and tuning of the landing algorithm many landings have been performed. During the last tests the distance between the center of the helicopter and the middle of the platform was measured. Fifteen landings were made in these tests and the results are shown in Figure 8. As the figure shows, fourteen of these landings are within 0.5 m from the center of the platform and most of them are a lot better. The average Euclidean distance from the landing target was 34 cm. Figure 9 shows pictures from one of the landing sequences. The UAV enters the landing mode roughly five meters above the landing platform and then slowly descends until it has landed.

Fig. 8. The results from the last landing tests. The cross shows a perfect landing and the dots show where the helicopter actually landed. The outer circle is 1.0 m and the inner circle 0.5 m from a perfect landing.

VI. CONCLUSION AND FUTURE WORK

Now that we can land on a stationary platform, the obvious next step is to move on and perform autonomous landing on moving platforms as well. The first step will be to land on a moving truck. The final challenge is to land on a ship, which moves in six dimensions due to the wave motions.

REFERENCES

[1] E. Feron and E. N. Johnson, "Aerial robotics," in Springer Handbook of Robotics, B. Siciliano and O. Khatib, Eds. Berlin, Germany: Springer, 2008, ch. 44, pp. 1009-1029.

[2] S. Saripalli, J. F. Montgomery, and G. S. Sukhatme, "Visually guided landing of an unmanned aerial vehicle," IEEE Transactions on Robotics and Automation, vol. 19, no. 3, pp. 371-380, Jun. 2003.
[3] S. Saripalli and G. Sukhatme, "Landing a helicopter on a moving target," in IEEE International Conference on Robotics and Automation (ICRA), Roma, Italy, Apr. 2007, pp. 2030-2035.
[4] C. Theodore, D. Rowley, D. Hubbard, A. Ansar, L. Matthies, S. Goldberg, and M. Whalley, "Flight trials of a rotorcraft unmanned aerial vehicle landing autonomously at unprepared sites," in Proceedings of the American Helicopter Society 62nd Annual Forum, Phoenix, AZ, USA, May 2006.
[5] P. Corke, "An inertial and visual sensing system for a small autonomous helicopter," Journal of Robotic Systems, vol. 21, no. 2, pp. 43-51, 2004.
[6] F. Kendoul, Z. Yu, and K. Nonami, "Guidance and nonlinear control system for autonomous flight of minirotorcraft unmanned aerial vehicles," Journal of Field Robotics, 2010, in press.
[7] T. Kailath, A. H. Sayed, and B. Hassibi, Linear Estimation. Upper Saddle River, NJ: Prentice Hall, 2000.

Fig. 9. Sequence illustrating an autonomous helicopter landing. In the top photograph, the UAV has just arrived, using the assisted mode, at a GPS position roughly 5 m above the landing platform. Here, the landing mode is activated and the UAV lands autonomously, relying on the camera and the vision system for measurements.