Marker Based Localization of a Quadrotor. Akshat Agarwal & Siddharth Tanwar


Objective

Introduction Objective: to implement a high-level control pipeline on a quadrotor that can autonomously take off, hover over a marker, and land on it with high precision. Quadrotors have Vertical Take-Off and Landing (VTOL) ability, but limited flight time because of battery technology. In any autonomous deployment, quads must be able to find a suitable landing pad and land on it; in any long-term deployment, they need the ability to land on a charging platform and dock with it autonomously, which demands high precision. Paper followed: Yang, Shuo, et al. "Precise quadrotor autonomous landing with SRUKF vision perception." 2015 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2015.

What makes landing difficult? Ground effect: when a quad flies close to the ground, the air pushed downward by the rotors has nowhere to go, causing a build-up in air pressure. This produces non-linear lifting forces, making landing much more unstable. Solution: use a landing platform raised above the ground. Mechanical docking with chargers needs extremely high precision. Mitigating solution: use guiding mechanical structures, such as a cone. Target: perform landing with error < 5 cm.

Hardware Setup

Nayan Quadrotor A high-performance quadrotor by Aarav Unmanned Systems (AUS). Uses an Odroid-XU4 on-board computer running Lubuntu 14.04 (ARM octa-core, 2 GB RAM). Twin ARM Cortex-M4 processors with an RTOS (real-time OS) for the flight controller (HLP + LLP). Image Source: http://www.aus.co.in/

Sensor Setup on Nayan IMU and gyro, monocular camera, PX4Flow camera. Image Source: http://www.aus.co.in/

Monocular Camera Matrix Vision BlueFox USB 2.0 MLC (high-quality grayscale CMOS) camera. Resolution: 752 x 480. Max. frame rate: 90 Hz. Adjustable exposure and gain for adapting to low-light conditions, but performs much better in well-lit environments. The package: BlueFox driver on ROS. Image Source: http://www.aus.co.in/

PX4Flow (Optical Flow sensor) Optical flow is the pattern of apparent motion of objects, surfaces and edges in a scene caused by the relative motion between an observer and the scene. Optical flow processing (gives x- and y-velocities) at 400 Hz. Installed facing downward. Designed to work in both indoor and outdoor low-light conditions. A supplementary sonar sensor gives the distance to the ground. The package: PX4Flow driver on ROS. Image Source: http://www.aus.co.in/
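The relation the sensor exploits is simple: an angular flow rate (rad/s) scaled by the sonar height gives metric ground velocity, once the flow induced by the camera's own rotation is removed. A minimal sketch of that relation (the function name and sign conventions are illustrative, not the PX4Flow firmware's actual API):

```python
def flow_to_velocity(flow_x_rad, flow_y_rad, height_m, gyro_x=0.0, gyro_y=0.0):
    """Metric ground velocity (m/s) from angular flow rates (rad/s) and sonar height (m).

    Rotating the camera also induces apparent flow, so the matching gyro
    rates are subtracted before scaling by the distance to the ground.
    Sign conventions here are illustrative, not the PX4Flow firmware's.
    """
    vx = (flow_x_rad - gyro_y) * height_m
    vy = (flow_y_rad + gyro_x) * height_m
    return vx, vy
```

This is also why the supplementary sonar matters: without the height, optical flow alone only gives velocity up to scale.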

IMU and Gyroscope Provides linear acceleration (-8 g to +8 g) and angular rates (max 2000 deg/s) to the flight controller. Linear acceleration from accelerometers; changes in pitch/roll from gyroscopes. An absolute reference frame (towards North). Installed on the flight controller board (body frame). Used to estimate the orientation of the quad. Image Source: https://pixhawk.org/

Architecture Block diagram: flight controller with IMU & gyro, connected over USB to the PX4Flow optical-flow sensor. Image Source: Yang et al., ICRA 2015

Libraries

ROS - Robot Operating System A flexible framework for writing robot software and managing inter-process communication. ROS offers a huge community, and all our sensors have robust ROS packages. We used ROS Indigo on the Nayan to integrate all sensors and observe them from the ground station.

ArUco Markers and Library Provides a library to generate markers that are easily detectable by a camera. Comes with an image-processing pipeline to obtain the pose of the marker in the camera frame. Integrates with ROS through the aruco_ros package.

Explored Frameworks

Marker Detection The marker detection and identification steps are: image segmentation; contour extraction and filtering; marker code extraction; marker identification and error correction; obtaining the corner points of the detected marker. Since the real size of the marker is known, the 2D-3D correspondences are easily established, and the PnP problem is solved to get R, t. Image Source: Yang et al., ICRA 2015
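Because the marker is planar and its side length is known, the PnP step can be sketched as a homography decomposition: estimate H from the four corner correspondences by DLT, then peel R and t out of K⁻¹H. A self-contained numpy sketch (aruco_ros solves PnP via OpenCV internally; the corner ordering and frame conventions below are assumptions):

```python
import numpy as np

def homography_dlt(obj_pts, img_pts):
    """Homography H mapping planar marker coordinates (X, Y) to pixels (u, v), via DLT."""
    A = []
    for (X, Y), (u, v) in zip(obj_pts, img_pts):
        A.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        A.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 3)          # null vector of A, up to scale

def pose_from_marker(corners_px, marker_len, K):
    """Recover rotation R and translation t of a square marker from its 4 pixel corners."""
    s = marker_len / 2.0
    obj = [(-s, s), (s, s), (s, -s), (-s, -s)]       # marker frame, z = 0 plane
    H = np.linalg.inv(K) @ homography_dlt(obj, corners_px)
    H /= np.sqrt(np.linalg.norm(H[:, 0]) * np.linalg.norm(H[:, 1]))
    if H[2, 2] < 0:                                  # marker must lie in front of the camera
        H = -H
    r1, r2, t = H[:, 0], H[:, 1], H[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)                      # re-project onto SO(3)
    R = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    return R, t
```

With exact correspondences this recovers the pose to machine precision; with noisy corners the SVD steps give a least-squares fit.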

Rotation Compensation The IMU rotation is better than the rotation from the PnP solver (ArUco), especially in the pitch and roll components. Decompose the rotation into two components: R = R_tilt R_torsion, where R_torsion only involves rotation about the yaw axis, while R_tilt contains the rotation about the pitch and roll axes. The torsion component of the PnP rotation is multiplied with the tilt component of the IMU rotation to get a stable, precise rotation measure. Image Source: Yang et al., ICRA 2015
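The decomposition itself is short: the tilt is the minimal rotation aligning the world z-axis with the body z-axis, and the torsion is whatever yaw remains. A numpy sketch of this idea (not the paper's code; axis conventions are assumptions):

```python
import numpy as np

def tilt_torsion(R):
    """Split R = R_tilt @ R_torsion, with R_torsion a pure rotation about the z (yaw) axis."""
    z = R[:, 2]                                # body z-axis expressed in the world frame
    axis = np.cross([0.0, 0.0, 1.0], z)
    s, c = np.linalg.norm(axis), z[2]          # sin and cos of the tilt angle
    if s < 1e-12:                              # already upright: no tilt
        R_tilt = np.eye(3)
    else:
        k = axis / s
        Kx = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
        R_tilt = np.eye(3) + s * Kx + (1 - c) * (Kx @ Kx)   # Rodrigues formula
    return R_tilt, R_tilt.T @ R

def fuse_rotations(R_imu, R_pnp):
    """Tilt (pitch/roll) from the IMU, torsion (yaw) from the PnP solution."""
    tilt_imu, _ = tilt_torsion(R_imu)
    _, torsion_pnp = tilt_torsion(R_pnp)
    return tilt_imu @ torsion_pnp
```

By construction R_torsion maps the z-axis to itself, so it is a pure yaw rotation, and the product of the two factors reproduces the original R.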

SRUKF (Square-Root UKF) In the UKF, the most computationally intensive step is calculating a new set of sigma points at each time update. In the SRUKF, the square root S of the covariance P is propagated directly. For the parameter-estimation formulation it has time complexity O(L²), unlike the UKF, which has time complexity O(L³). It is also numerically more stable and guarantees positive semi-definiteness of the state covariances. Image Source: Yang et al., ICRA 2015
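To give a flavour of the square-root formulation, the sigma points can be drawn directly from the Cholesky factor S (with P = S Sᵀ) instead of re-factoring P at every step. A minimal numpy sketch using the standard scaled-sigma-point weights (not the paper's implementation; the scaling parameters are conventional defaults):

```python
import numpy as np

def sigma_points_from_sqrt(x, S, alpha=1e-3, beta=2.0, kappa=0.0):
    """Scaled sigma points drawn directly from the Cholesky factor S (P = S @ S.T)."""
    L = x.size
    lam = alpha**2 * (L + kappa) - L
    c = np.sqrt(L + lam)
    pts = np.column_stack([x, x[:, None] + c * S, x[:, None] - c * S])
    Wm = np.full(2 * L + 1, 0.5 / (L + lam))     # mean weights
    Wm[0] = lam / (L + lam)
    Wc = Wm.copy()                                # covariance weights
    Wc[0] += 1.0 - alpha**2 + beta
    return pts, Wm, Wc
```

The weighted mean of the points reproduces x and their weighted outer products reproduce P, which is what the time and measurement updates rely on.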

Models Motion Model Measurement Model

UKF and KF After trying the previous model with the UKF, we decided to use a linear model and a simple Kalman filter. State x = [x; y; z; ẋ; ẏ; ż]. Linear motion model: x_i = x_{i-1} + ẋ_{i-1}·dt. The input u = [a_x; a_y; a_z]. The measurement model observes the translation vector from the camera and the velocity from the optical-flow sensor; height above the ground is also observed using sonar.
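This linear model and filter fit in a few lines. A numpy sketch (the state layout matches the slide; the noise magnitudes q and r are illustrative assumptions, to be tuned):

```python
import numpy as np

def kf_predict(x, P, u, dt, q=0.05):
    """Predict with the linear model x_i = x_{i-1} + v_{i-1}*dt, driven by IMU acceleration u.

    State x = [px, py, pz, vx, vy, vz]; q is an illustrative process-noise level.
    """
    F = np.eye(6)
    F[:3, 3:] = dt * np.eye(3)
    B = np.vstack([0.5 * dt**2 * np.eye(3), dt * np.eye(3)])
    x = F @ x + B @ u
    P = F @ P @ F.T + q * np.eye(6)
    return x, P

def kf_update(x, P, z, H, r=0.02):
    """Standard KF update; H selects what z observes (ArUco position, PX4Flow velocity, sonar height)."""
    R = r * np.eye(len(z))
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

An ArUco position fix uses H = [I₃ | 0₃], a PX4Flow velocity measurement uses H = [0₃ | I₃], and sonar height observes the z component alone.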

Data Flow

Data-flow diagram: IMU linear acceleration feeds the KF predict step; the translation from ArUco, PX4Flow velocity, and PX4Flow/sonar ground distance feed the KF update, which produces the state output. The rotations from the IMU and ArUco go through rotation compensation (for the transformation to the world frame).

What did we accomplish?

Testing: Sensor Setup Due to problems with the Nayan, we used a Pixhawk + BlueFox camera + PX4Flow camera to record a rosbag file. (Pictured: Pixhawk, BlueFox camera, PX4Flow camera)

ArUco Marker Detection

A Localization Example

Conclusion and Future Work We built an architecture for implementing the SRUKF, UKF or KF on the quadrotor, and tracked the pose of the quadrotor using ArUco markers and variants of the Kalman filter. Future work: run the UKF/SRUKF/KF on the Nayan platform, send controls to the quad's flight controller, and observe its performance, tuning the filter accordingly to achieve precision landing.

Acknowledgements Thanks to Mr. Krishna Raj Gaur for helping us set up the testing module in the absence of the quadrotor, and to Mr. Shakti and Mr. Radhe Shyam of the Intelligent Systems Lab for their support.

Thank You Questions?