Chapter 13. Vision-Based Guidance. Beard & McLain, Small Unmanned Aircraft, Princeton University Press, 2012


Chapter 13: Vision-Based Guidance
Beard & McLain, Small Unmanned Aircraft, Princeton University Press, 2012

Architecture with Camera
[Block diagram: the camera feeds a vision-based guidance block, which supplies targets to track/avoid; waypoints and status flow between the guidance block and the path manager; the path manager sends the path definition to the path-following block (tracking error fed back), which issues airspeed, altitude, and heading commands to the autopilot (position error fed back); the autopilot sends servo commands to the unmanned aircraft, which is subject to wind; the state estimator processes the on-board sensors to produce the estimate $\hat{x}(t)$ used by the guidance and control blocks.]

Gimbal and Camera Reference Frames
Gimbal-1 frame: $\mathcal{F}^{g1} = (\mathbf{i}^{g1}, \mathbf{j}^{g1}, \mathbf{k}^{g1})$
Gimbal frame: $\mathcal{F}^{g} = (\mathbf{i}^{g}, \mathbf{j}^{g}, \mathbf{k}^{g})$
Camera frame: $\mathcal{F}^{c} = (\mathbf{i}^{c}, \mathbf{j}^{c}, \mathbf{k}^{c})$
Gimbal-1 frame: rotate the body frame about the $\mathbf{k}^b$ axis by the gimbal azimuth angle $\alpha_{az}$:
$$R_b^{g1}(\alpha_{az}) \triangleq \begin{pmatrix} \cos\alpha_{az} & \sin\alpha_{az} & 0 \\ -\sin\alpha_{az} & \cos\alpha_{az} & 0 \\ 0 & 0 & 1 \end{pmatrix}$$

Gimbal Reference Frames
Gimbal frame: rotate the gimbal-1 frame about the $\mathbf{j}^{g1}$ axis by the gimbal elevation angle $\alpha_{el}$:
$$R_{g1}^{g}(\alpha_{el}) \triangleq \begin{pmatrix} \cos\alpha_{el} & 0 & -\sin\alpha_{el} \\ 0 & 1 & 0 \\ \sin\alpha_{el} & 0 & \cos\alpha_{el} \end{pmatrix}$$
$$R_b^g = R_{g1}^g R_b^{g1} = \begin{pmatrix} \cos\alpha_{el}\cos\alpha_{az} & \cos\alpha_{el}\sin\alpha_{az} & -\sin\alpha_{el} \\ -\sin\alpha_{az} & \cos\alpha_{az} & 0 \\ \sin\alpha_{el}\cos\alpha_{az} & \sin\alpha_{el}\sin\alpha_{az} & \cos\alpha_{el} \end{pmatrix}$$

Camera Reference Frame
Camera frame:
$\mathbf{i}^c$ points to the right in the image
$\mathbf{j}^c$ points down in the image
$\mathbf{k}^c$ points along the optical axis
$$R_g^c = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{pmatrix}$$
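For concreteness, the two gimbal rotations and the fixed gimbal-to-camera permutation can be coded directly. A minimal NumPy sketch; the function and variable names are mine, not from the book:

```python
import numpy as np

def R_b_to_g(alpha_az, alpha_el):
    """Rotation from body frame to gimbal frame: R^g_b = R^g_g1(el) @ R^g1_b(az)."""
    ca, sa = np.cos(alpha_az), np.sin(alpha_az)
    ce, se = np.cos(alpha_el), np.sin(alpha_el)
    R_g1_b = np.array([[ ca,  sa, 0.0],
                       [-sa,  ca, 0.0],
                       [0.0, 0.0, 1.0]])     # azimuth rotation about k^b
    R_g_g1 = np.array([[ ce, 0.0, -se],
                       [0.0, 1.0, 0.0],
                       [ se, 0.0,  ce]])     # elevation rotation about j^g1
    return R_g_g1 @ R_g1_b

# Fixed rotation from gimbal to camera frame (i_c right, j_c down, k_c along optical axis).
R_c_g = np.array([[0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0],
                  [1.0, 0.0, 0.0]])
```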

Pinhole Camera Model
Describes the mathematical relationship between the coordinates of a 3-D point and its projection onto the 2-D image plane. Assumes a point aperture and no lenses. A good 1st-order approximation.

Pinhole Camera Model
Goal: from the pixel location $(\epsilon_x, \epsilon_y)$ in the image plane and the camera parameters, determine the location of the object in the world.
Approach: use similar-triangles geometry.

Camera Model
$f$: focal length in pixels
$P$: converts pixels to meters
$M$: width of the square pixel array
$\eta_v$: field of view
$\ell^c$: 3-D location of the point
$$f = \frac{M}{2\tan(\eta_v/2)}$$
Location of the projection of the object in the camera frame: $(P\epsilon_x,\, P\epsilon_y,\, Pf)$
Pixel location (in pixels): $(\epsilon_x, \epsilon_y)$
Distance (in pixels) from the origin of the camera frame to the image of the object:
$$F = \sqrt{f^2 + \epsilon_x^2 + \epsilon_y^2}$$

Camera Model
By similar triangles:
$$\frac{\ell_{cx}}{L} = \frac{P\epsilon_x}{PF} = \frac{\epsilon_x}{F}$$
Similarly, $\ell_{cy}/L = \epsilon_y/F$ and $\ell_{cz}/L = f/F$. Combining:
$$\ell^c = \begin{pmatrix}\ell_{cx}\\ \ell_{cy}\\ \ell_{cz}\end{pmatrix} = \frac{L}{F}\begin{pmatrix}\epsilon_x\\ \epsilon_y\\ f\end{pmatrix}$$
Problem: $L$ is unknown...

Camera Model
Unit direction vector to the target:
$$\frac{\ell^c}{L} = \frac{1}{F}\begin{pmatrix}\epsilon_x\\ \epsilon_y\\ f\end{pmatrix} = \frac{1}{\sqrt{\epsilon_x^2 + \epsilon_y^2 + f^2}}\begin{pmatrix}\epsilon_x\\ \epsilon_y\\ f\end{pmatrix}$$
Define the notation:
$$\check\ell = \begin{pmatrix}\check\ell_x\\ \check\ell_y\\ \check\ell_z\end{pmatrix} \triangleq \frac{\ell}{L}$$
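A small sketch of the focal-length and unit line-of-sight computations (names are illustrative):

```python
import numpy as np

def focal_length_pixels(M, eta_v):
    """Focal length in pixels: f = M / (2 tan(eta_v / 2)), eta_v in radians."""
    return M / (2.0 * np.tan(eta_v / 2.0))

def unit_los_camera(eps_x, eps_y, f):
    """Unit line-of-sight vector to the target, expressed in the camera frame."""
    v = np.array([eps_x, eps_y, f])
    return v / np.linalg.norm(v)   # norm is F = sqrt(eps_x^2 + eps_y^2 + f^2)
```

For example, a 640-pixel-wide array with a 60-degree field of view gives $f \approx 554$ pixels.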

Two Scenarios
Gimbal pointing:
- Point the gimbal at a given world coordinate ("point at this GPS location")
- Point the gimbal so that the optical axis aligns with a certain point in the image plane ("point at this object")
Gimbal dynamics: assume rate control inputs.

Scenario 1: Point Gimbal at World Coordinate
$p^i_{\text{obj}}$: known location of the object in the inertial frame (world coordinate)
$p^i_{\text{MAV}} = (p_n, p_e, p_d)^\top$: location of the MAV in the inertial frame
Objective: align the optical axis of the camera with the desired relative position vector
$$\ell^i_d \triangleq p^i_{\text{obj}} - p^i_{\text{MAV}}$$
Body-frame unit vector that points in the desired direction of the specified world coordinates:
$$\check\ell^b_d = R_i^b \frac{\ell^i_d}{\|\ell^i_d\|}$$

Scenario 2: Point Gimbal at Object in Image
Objective: maneuver the gimbal so that an object at pixel location $(\epsilon_x, \epsilon_y)$ is pushed to the center of the image.
Desired direction of the optical axis:
$$\check\ell^c_d = \frac{1}{\sqrt{f^2 + \epsilon_x^2 + \epsilon_y^2}}\begin{pmatrix}\epsilon_x\\ \epsilon_y\\ f\end{pmatrix}$$
Desired direction of the optical axis expressed in the body frame:
$$\check\ell^b_d = R_g^b R_c^g\, \check\ell^c_d$$

Gimbal Pointing Angles
We know the direction in which to point the gimbal. What are the corresponding azimuth and elevation angles?
Optical axis in the camera frame: $(0, 0, 1)^\top$.
Objective: select commanded gimbal angles $\alpha^c_{az}$ and $\alpha^c_{el}$ so that
$$\check\ell^b_d = \begin{pmatrix}\check\ell_{bx_d}\\ \check\ell_{by_d}\\ \check\ell_{bz_d}\end{pmatrix} = R_g^b(\alpha^c_{az}, \alpha^c_{el})\, R_c^g \begin{pmatrix}0\\ 0\\ 1\end{pmatrix} = \begin{pmatrix} -\sin\alpha^c_{az} & \sin\alpha^c_{el}\cos\alpha^c_{az} & \cos\alpha^c_{el}\cos\alpha^c_{az} \\ \cos\alpha^c_{az} & \sin\alpha^c_{el}\sin\alpha^c_{az} & \cos\alpha^c_{el}\sin\alpha^c_{az} \\ 0 & \cos\alpha^c_{el} & -\sin\alpha^c_{el} \end{pmatrix}\begin{pmatrix}0\\ 0\\ 1\end{pmatrix} = \begin{pmatrix}\cos\alpha^c_{el}\cos\alpha^c_{az}\\ \cos\alpha^c_{el}\sin\alpha^c_{az}\\ -\sin\alpha^c_{el}\end{pmatrix}$$

Gimbal Pointing Angles
Objective: select commanded gimbal angles $\alpha^c_{az}$ and $\alpha^c_{el}$ so that
$$\begin{pmatrix}\check\ell_{bx_d}\\ \check\ell_{by_d}\\ \check\ell_{bz_d}\end{pmatrix} = \begin{pmatrix}\cos\alpha^c_{el}\cos\alpha^c_{az}\\ \cos\alpha^c_{el}\sin\alpha^c_{az}\\ -\sin\alpha^c_{el}\end{pmatrix}$$
Solving for $\alpha^c_{az}$ and $\alpha^c_{el}$ gives the desired azimuth and elevation angles:
$$\alpha^c_{az} = \tan^{-1}\!\left(\frac{\check\ell_{by_d}}{\check\ell_{bx_d}}\right), \qquad \alpha^c_{el} = \sin^{-1}\!\left(-\check\ell_{bz_d}\right)$$
The gimbal servo commands can be selected as
$$u_{az} = k_{az}(\alpha^c_{az} - \alpha_{az}), \qquad u_{el} = k_{el}(\alpha^c_{el} - \alpha_{el})$$
where $k_{az}$ and $k_{el}$ are positive control gains.
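The angle extraction and proportional rate commands take only a few lines. A minimal sketch, assuming the desired LOS vector is already unit length; the gains are arbitrary placeholders:

```python
import numpy as np

def gimbal_angle_commands(l_b_d):
    """Commanded azimuth/elevation from a desired unit LOS vector in the body frame."""
    lx, ly, lz = l_b_d
    alpha_az_c = np.arctan2(ly, lx)   # tan^-1(l_by_d / l_bx_d), four-quadrant
    alpha_el_c = np.arcsin(-lz)       # sin^-1(-l_bz_d)
    return alpha_az_c, alpha_el_c

def gimbal_rate_commands(alpha_az_c, alpha_el_c, alpha_az, alpha_el,
                         k_az=5.0, k_el=5.0):
    """Proportional rate commands u_az, u_el driving the gimbal to the commanded angles."""
    return k_az * (alpha_az_c - alpha_az), k_el * (alpha_el_c - alpha_el)
```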

Geolocation
Relative position vector between the target and the MAV:
$$\ell = p_{\text{obj}} - p_{\text{MAV}}$$
Define $L = \|\ell\|$ (range to target) and $\check\ell = \ell/L$ (unit vector pointing from the aircraft to the target).
From the geometry:
$$p^i_{\text{obj}} = p^i_{\text{MAV}} + R_b^i R_g^b R_c^g\, \ell^c = p^i_{\text{MAV}} + L\, R_b^i R_g^b R_c^g\, \check\ell^c$$
where $p^i_{\text{MAV}} = (p_n, p_e, p_d)^\top$, $R_b^i = R_b^i(\phi, \theta, \psi)$, and $R_g^b = R_g^b(\alpha_{az}, \alpha_{el})$.
$L$ is unknown! Solving the geolocation problem reduces to estimating the range to the target, $L$.

Range to Target: Flat-Earth Model
From the geometry: $\cos\varphi = h/L$.
From the Cauchy-Schwarz equality: $\cos\varphi = k^{i\top}\check\ell^i = k^{i\top} R_b^i R_g^b R_c^g\, \check\ell^c$.
Equating:
$$L = \frac{h}{k^{i\top} R_b^i R_g^b R_c^g\, \check\ell^c}$$
From before: $p^i_{\text{obj}} = p^i_{\text{MAV}} + L\, R_b^i R_g^b R_c^g\, \check\ell^c$. Therefore:
$$p^i_{\text{obj}} = \begin{pmatrix}p_n\\ p_e\\ p_d\end{pmatrix} + h\, \frac{R_b^i R_g^b R_c^g\, \check\ell^c}{k^{i\top} R_b^i R_g^b R_c^g\, \check\ell^c}$$
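A sketch of the flat-earth geolocation formula; argument names are mine (`R_b2i` denotes $R_b^i$, and so on):

```python
import numpy as np

def geolocate_flat_earth(p_mav_i, h, R_b2i, R_g2b, R_c2g, l_c):
    """Flat-earth target position: p_obj = p_mav + h * (R l) / (k^T R l).

    p_mav_i : MAV position in the inertial (NED) frame
    h       : height above ground
    l_c     : unit LOS vector in the camera frame
    """
    k_i = np.array([0.0, 0.0, 1.0])       # inertial down axis k^i
    l_i = R_b2i @ R_g2b @ R_c2g @ l_c     # LOS unit vector rotated to the inertial frame
    return p_mav_i + h * l_i / (k_i @ l_i)
```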

Geolocation Errors
The position of the object is
$$p^i_{\text{obj}} = \begin{pmatrix}p_n\\ p_e\\ p_d\end{pmatrix} + h\, \frac{R_b^i R_g^b R_c^g\, \check\ell^c}{k^{i\top} R_b^i R_g^b R_c^g\, \check\ell^c}$$
The equation is highly sensitive to:
- measurement errors
- attitude estimation errors
- gimbal pointing angle errors
- gimbal/autopilot alignment errors
Two types of errors:
- bias errors (address with calibration)
- random errors (address with an EKF)

Geolocation Using the EKF
The states are $x = (p^{i\top}_{\text{obj}}, L)^\top$. We need the propagation equations $\dot x = f(x, u)$. Assuming the object is stationary:
$$\dot p^i_{\text{obj}} = 0$$
Also:
$$\dot L = \frac{d}{dt}\sqrt{(p^i_{\text{obj}} - p^i_{\text{MAV}})^\top (p^i_{\text{obj}} - p^i_{\text{MAV}})} = \frac{(p^i_{\text{obj}} - p^i_{\text{MAV}})^\top (\dot p^i_{\text{obj}} - \dot p^i_{\text{MAV}})}{L} = -\frac{(p^i_{\text{obj}} - p^i_{\text{MAV}})^\top \dot p^i_{\text{MAV}}}{L}$$
where $p^i_{\text{MAV}} = p^i_{\text{obj}} - L\, R_b^i R_g^b R_c^g\, \check\ell^c$, and for constant-altitude flight:
$$\dot p^i_{\text{MAV}} = \begin{pmatrix}\hat V_g \cos\hat\psi\\ \hat V_g \sin\hat\psi\\ 0\end{pmatrix}$$

Geolocation Using the EKF
Prediction equations:
$$\begin{pmatrix}\dot{\hat p}^i_{\text{obj}}\\ \dot{\hat L}\end{pmatrix} = \begin{pmatrix}0\\ -\frac{(\hat p^i_{\text{obj}} - \hat p^i_{\text{MAV}})^\top \dot{\hat p}^i_{\text{MAV}}}{\hat L}\end{pmatrix}$$
Jacobian of the prediction equation:
$$\frac{\partial f}{\partial x} = \begin{pmatrix}0_{3\times3} & 0_{3\times1}\\ -\frac{\dot{\hat p}^{i\top}_{\text{MAV}}}{\hat L} & \frac{(\hat p^i_{\text{obj}} - \hat p^i_{\text{MAV}})^\top \dot{\hat p}^i_{\text{MAV}}}{\hat L^2}\end{pmatrix}$$
Measurement equation:
$$p^i_{\text{MAV}} = p^i_{\text{obj}} - L\, R_b^i R_g^b R_c^g\, \check\ell^c$$
Jacobian of the measurement equation:
$$\frac{\partial h}{\partial x} = \begin{pmatrix}I & -R_b^i R_g^b R_c^g\, \check\ell^c\end{pmatrix}$$
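These equations map onto a small continuous-discrete EKF. The sketch below assumes Euler integration of the state and covariance between measurement updates; the class and argument names are illustrative, not from the book:

```python
import numpy as np

class GeolocationEKF:
    """EKF with state x = [p_obj_ned (3), L], following the slide equations."""

    def __init__(self, x0, P0, Q, R):
        self.x, self.P, self.Q, self.R = x0, P0, Q, R

    def predict(self, p_mav, pdot_mav, dt):
        p_obj, L = self.x[:3], self.x[3]
        d = p_obj - p_mav
        f = np.zeros(4)
        f[3] = -d @ pdot_mav / L              # Ldot from the slide
        A = np.zeros((4, 4))                  # Jacobian df/dx
        A[3, :3] = -pdot_mav / L
        A[3, 3] = d @ pdot_mav / L**2
        self.x = self.x + dt * f              # Euler propagation (assumption)
        self.P = self.P + dt * (A @ self.P + self.P @ A.T + self.Q)

    def update(self, p_mav_meas, R_los):
        """R_los = R_b^i R_g^b R_c^g l_check^c: LOS unit vector in the inertial frame."""
        p_obj, L = self.x[:3], self.x[3]
        h = p_obj - L * R_los                 # predicted MAV position
        H = np.hstack((np.eye(3), -R_los.reshape(3, 1)))
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (p_mav_meas - h)
        self.P = (np.eye(4) - K @ H) @ self.P
```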

Geolocation Architecture
[Block diagram: GPS measurements pass through a smoother; the smoother output, the vision-processing output, the attitude estimate, and the gimbal angles all feed the geolocation block.]

Target Motion in the Image Plane
The objective is to track targets seen in the image plane. Feature tracking, or other computer-vision algorithms, are used to track pixels in the image plane. Feature tracking is noisy, so the pixel location must be low-pass filtered.
Motion of the target in the image plane is due to two sources:
- self motion, or ego motion, primarily due to rotation of the MAV
- relative target motion
Since we are primarily interested in the relative motion, we must subtract the pixel movement due to ego motion.

Pixel LPF and Differentiation
Define the following:
$\epsilon = (\epsilon_x, \epsilon_y)^\top$: raw pixel measurements
$\bar\epsilon = (\bar\epsilon_x, \bar\epsilon_y)^\top$: filtered pixel location
$\dot{\bar\epsilon} = (\dot{\bar\epsilon}_x, \dot{\bar\epsilon}_y)^\top$: filtered pixel velocity
Basic idea: low-pass filter to obtain $\bar\epsilon$, and use a dirty derivative to obtain $\dot{\bar\epsilon}$:
$$\bar\epsilon(s) = \frac{a}{s+a}\,\epsilon(s), \qquad \dot{\bar\epsilon}(s) = \frac{as}{s+a}\,\epsilon(s)$$

Digital Approximation
Converting to the z-domain using the Tustin approximation $s \mapsto \frac{2}{T_s}\frac{z-1}{z+1}$ gives
$$\bar\epsilon[z] = \frac{\frac{aT_s}{2+aT_s}(z+1)}{z - \frac{2-aT_s}{2+aT_s}}\,\epsilon[z], \qquad \dot{\bar\epsilon}[z] = \frac{\frac{2a}{2+aT_s}(z-1)}{z - \frac{2-aT_s}{2+aT_s}}\,\epsilon[z].$$
Taking the inverse z-transform gives the difference equations
$$\bar\epsilon[n] = \frac{2-aT_s}{2+aT_s}\,\bar\epsilon[n-1] + \frac{aT_s}{2+aT_s}\left(\epsilon[n] + \epsilon[n-1]\right)$$
$$\dot{\bar\epsilon}[n] = \frac{2-aT_s}{2+aT_s}\,\dot{\bar\epsilon}[n-1] + \frac{2a}{2+aT_s}\left(\epsilon[n] - \epsilon[n-1]\right),$$
where $\bar\epsilon[0] = \epsilon[0]$ and $\dot{\bar\epsilon}[0] = 0$.

Digital Approximation
Define $\alpha = (2 - aT_s)/(2 + aT_s)$, and note that $1 - \alpha = 2aT_s/(2 + aT_s)$. Then
$$\bar\epsilon[n] = \alpha\,\bar\epsilon[n-1] + (1-\alpha)\underbrace{\frac{\epsilon[n] + \epsilon[n-1]}{2}}_{\text{average of last two samples}}$$
$$\dot{\bar\epsilon}[n] = \alpha\,\dot{\bar\epsilon}[n-1] + (1-\alpha)\underbrace{\frac{\epsilon[n] - \epsilon[n-1]}{T_s}}_{\text{Euler approximation of derivative}},$$
where $\bar\epsilon[0] = \epsilon[0]$ and $\dot{\bar\epsilon}[0] = 0$.
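A direct implementation of these difference equations; a minimal sketch with the initialization from the slide (class name is mine):

```python
import numpy as np

class PixelFilter:
    """Low-pass filter plus dirty derivative for tracked pixel coordinates."""

    def __init__(self, a, Ts):
        self.alpha = (2.0 - a * Ts) / (2.0 + a * Ts)
        self.Ts = Ts
        self.eps_prev = None      # epsilon[n-1]
        self.eps_bar = None       # filtered pixel location
        self.eps_bar_dot = None   # filtered pixel velocity

    def update(self, eps):
        eps = np.asarray(eps, dtype=float)
        if self.eps_prev is None:             # eps_bar[0] = eps[0], eps_bar_dot[0] = 0
            self.eps_bar = eps.copy()
            self.eps_bar_dot = np.zeros_like(eps)
        else:
            a = self.alpha
            self.eps_bar = a * self.eps_bar + (1 - a) * (eps + self.eps_prev) / 2.0
            self.eps_bar_dot = (a * self.eps_bar_dot
                                + (1 - a) * (eps - self.eps_prev) / self.Ts)
        self.eps_prev = eps
        return self.eps_bar, self.eps_bar_dot
```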

Apparent Motion
Let $\check\ell \triangleq \ell/L = (p_{\text{obj}} - p_{\text{MAV}})/\|p_{\text{obj}} - p_{\text{MAV}}\|$ be the normalized relative position vector. The Coriolis formula (expressed in the camera frame) gives
$$\underbrace{\frac{d\check\ell^c}{dt_i}}_{\text{true relative translational motion between target and MAV}} = \underbrace{\frac{d\check\ell^c}{dt_c}}_{\text{motion of target in image plane}} + \underbrace{\omega^c_{c/i}\times\check\ell^c}_{\text{apparent motion due to rotation of MAV and gimbal}}$$

Apparent Motion, cont.
Motion of the target in the image plane:
$$\frac{d\check\ell^c}{dt_c} = \frac{d}{dt_c}\left[\frac{1}{F}\begin{pmatrix}\epsilon_x\\ \epsilon_y\\ f\end{pmatrix}\right] = \frac{1}{F}\begin{pmatrix}\dot\epsilon_x\\ \dot\epsilon_y\\ 0\end{pmatrix} - \frac{\dot F}{F^2}\begin{pmatrix}\epsilon_x\\ \epsilon_y\\ f\end{pmatrix}$$
With $\dot F = \frac{\epsilon_x\dot\epsilon_x + \epsilon_y\dot\epsilon_y}{F}$, this becomes
$$\frac{d\check\ell^c}{dt_c} = \frac{1}{F^3}\begin{pmatrix}\epsilon_y^2 + f^2 & -\epsilon_x\epsilon_y\\ -\epsilon_x\epsilon_y & \epsilon_x^2 + f^2\\ -\epsilon_x f & -\epsilon_y f\end{pmatrix}\begin{pmatrix}\dot\epsilon_x\\ \dot\epsilon_y\end{pmatrix} = Z(\epsilon)\,\dot\epsilon,$$
where
$$Z(\epsilon) \triangleq \frac{1}{F^3}\begin{pmatrix}\epsilon_y^2 + f^2 & -\epsilon_x\epsilon_y\\ -\epsilon_x\epsilon_y & \epsilon_x^2 + f^2\\ -\epsilon_x f & -\epsilon_y f\end{pmatrix}.$$

Apparent Motion, cont.
Ego motion, $\omega^c_{c/i}\times\check\ell^c$:
$$\omega^c_{c/i} = \underbrace{\omega^c_{c/g}}_{0} + \underbrace{\omega^c_{g/b}}_{R^c_g R^g_b\begin{pmatrix}-\sin(\alpha_{az})\dot\alpha_{el}\\ \cos(\alpha_{az})\dot\alpha_{el}\\ \dot\alpha_{az}\end{pmatrix}} + \underbrace{\omega^c_{b/i}}_{R^c_g R^g_b\begin{pmatrix}p\\ q\\ r\end{pmatrix}}$$
Therefore
$$\check\ell^c_{\text{app}} \triangleq \omega^c_{c/i}\times\check\ell^c = \frac{1}{F}\left[R^c_g R^g_b\begin{pmatrix}p - \sin(\alpha_{az})\dot\alpha_{el}\\ q + \cos(\alpha_{az})\dot\alpha_{el}\\ r + \dot\alpha_{az}\end{pmatrix}\right]\times\begin{pmatrix}\epsilon_x\\ \epsilon_y\\ f\end{pmatrix}$$

Total Motion in the Image Plane
$$\frac{d\check\ell^c}{dt_i} = \underbrace{Z(\epsilon)\,\dot\epsilon}_{\text{target motion}} + \underbrace{\frac{1}{F}\left[R^c_g R^g_b\begin{pmatrix}p - \sin(\alpha_{az})\dot\alpha_{el}\\ q + \cos(\alpha_{az})\dot\alpha_{el}\\ r + \dot\alpha_{az}\end{pmatrix}\right]\times\begin{pmatrix}\epsilon_x\\ \epsilon_y\\ f\end{pmatrix}}_{\text{ego motion}}$$
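Both terms are straightforward to compute from the filtered pixel states and the rotation matrices defined earlier. A sketch with illustrative names:

```python
import numpy as np

def Z_matrix(eps_x, eps_y, f):
    """3x2 matrix Z(eps) mapping pixel velocity to LOS motion in the camera frame."""
    F3 = (f**2 + eps_x**2 + eps_y**2) ** 1.5
    return np.array([[eps_y**2 + f**2, -eps_x * eps_y],
                     [-eps_x * eps_y,  eps_x**2 + f**2],
                     [-eps_x * f,      -eps_y * f]]) / F3

def ego_motion(eps_x, eps_y, f, R_c_g, R_g_b, pqr, alpha_az, az_dot, el_dot):
    """Apparent LOS motion due to MAV body rates (p, q, r) and gimbal rates."""
    F = np.sqrt(f**2 + eps_x**2 + eps_y**2)
    w_b = pqr + np.array([-np.sin(alpha_az) * el_dot,
                          np.cos(alpha_az) * el_dot,
                          az_dot])
    return np.cross(R_c_g @ R_g_b @ w_b, np.array([eps_x, eps_y, f])) / F
```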

Time to Collision
For all objects in the camera field of view, it is possible to compute the time to collision to that obstacle, defined as the time to collision if the vehicle were to proceed along the line-of-sight vector to the obstacle at the current closing velocity. Therefore, for each obstacle, the time to collision is given by
$$t_c \triangleq -\frac{L}{\dot L}$$
This is impossible to compute directly using only a monocular camera because of the scale ambiguity. Two techniques:
- Looming: requires the target size.
- Flat-earth approximation: requires the height above ground.

Time to Collision: Looming
From similar triangles:
$$\frac{S_{\text{obj}}}{L} = \frac{P\epsilon_s}{PF} = \frac{\epsilon_s}{F}$$
so $L = \frac{S_{\text{obj}} F}{\epsilon_s}$. If $S_{\text{obj}}$ is not changing, then
$$\dot L = S_{\text{obj}}\left[\frac{\dot F}{\epsilon_s} - \frac{F\dot\epsilon_s}{\epsilon_s^2}\right]$$
$$\frac{\dot L}{L} = \frac{\dot F}{F} - \frac{\dot\epsilon_s}{\epsilon_s} = \frac{\epsilon_x\dot\epsilon_x + \epsilon_y\dot\epsilon_y}{F^2} - \frac{\dot\epsilon_s}{\epsilon_s},$$
the negative inverse of which is the time to collision $t_c$.
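The looming estimate needs only the pixel coordinates, the pixel size $\epsilon_s$, and their filtered rates. A sketch (names are illustrative):

```python
def Ldot_over_L_looming(eps_x, eps_y, eps_x_dot, eps_y_dot, eps_s, eps_s_dot, f):
    """Looming estimate of Ldot/L, assuming the physical target size is constant."""
    F2 = f**2 + eps_x**2 + eps_y**2
    return (eps_x * eps_x_dot + eps_y * eps_y_dot) / F2 - eps_s_dot / eps_s

# Time to collision: t_c = -1 / (Ldot/L); meaningful when the range is closing (Ldot < 0).
```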

Time to Collision: Flat Earth
From $L = \frac{h}{\cos\varphi}$, differentiation gives
$$\frac{\dot L}{L} = \frac{\dot h}{h} + \dot\varphi\tan\varphi.$$
From the geometry we have
$$\cos\varphi = \check\ell^i_z.$$
Differentiate and solve for $\dot\varphi$:
$$\dot\varphi = -\frac{1}{\sin\varphi}\,\frac{d}{dt_i}\check\ell^i_z$$
Therefore,
$$\dot\varphi\tan\varphi = -\frac{1}{\cos\varphi}\,\frac{d}{dt_i}\check\ell^i_z = -\frac{1}{\check\ell^i_z}\,\frac{d}{dt_i}\check\ell^i_z$$
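The flat-earth variant is even shorter; a sketch, assuming $\dot h$ and $d\check\ell^i_z/dt$ are available from the state estimator:

```python
def Ldot_over_L_flat_earth(h, h_dot, l_iz, l_iz_dot):
    """Flat-earth estimate of Ldot/L from height above ground and the
    inertial-frame z-component of the LOS unit vector and its rate."""
    return h_dot / h - l_iz_dot / l_iz
```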

Precision Landing
For precision landing, we use the guidance model
$$\dot p_n = V_g\cos\psi\cos\gamma$$
$$\dot p_e = V_g\sin\psi\cos\gamma$$
$$\dot p_d = -V_g\sin\gamma$$
$$\dot\psi = \frac{g}{V_g}\tan\phi$$
Define $\phi = u_1$ and $\gamma = u_2$. Then
$$p^i_{\text{MAV}} = (p_n,\ p_e,\ p_d)^\top$$
$$v^i_{\text{MAV}} = (V_g\cos\psi\cos\gamma,\ V_g\sin\psi\cos\gamma,\ -V_g\sin\gamma)^\top$$
$$v^{v2}_{\text{MAV}} = (V_g,\ 0,\ 0)^\top$$

Proportional Navigation (PN)
Proportional navigation: align the line-of-sight rate $\dot\ell$ with the negative of the line-of-sight vector $\ell$. Define
$$\Omega_\perp = \check\ell\times\frac{\dot\ell}{L}, \qquad \dot\ell = v_{\text{obj}} - v_{\text{MAV}}.$$
Assuming constant velocity, we can only accelerate in the plane perpendicular to the velocity vector. Therefore, to zero $\Omega_\perp$, the desired acceleration is
$$a_{\text{MAV}} = N\,\Omega_\perp\times v_{\text{MAV}},$$
where $N > 0$ is the (tunable) navigation constant.

Acceleration Command
To implement the acceleration command, we must convert it to the vehicle-2 frame:
$$a^{v2}_{\text{MAV}} = \mu N\,\Omega^{v2}_\perp\times v^{v2}_{\text{MAV}} = \mu N\begin{pmatrix}\Omega^{v2}_{\perp,x}\\ \Omega^{v2}_{\perp,y}\\ \Omega^{v2}_{\perp,z}\end{pmatrix}\times\begin{pmatrix}V_g\\ 0\\ 0\end{pmatrix} = \begin{pmatrix}0\\ \mu N V_g\,\Omega^{v2}_{\perp,z}\\ -\mu N V_g\,\Omega^{v2}_{\perp,y}\end{pmatrix}$$
where
$$\Omega^{v2}_\perp = R^{v2}_b R^b_g R^g_c\, \Omega^c_\perp, \qquad \Omega^c_\perp = \check\ell^c\times\frac{\dot\ell^c}{L}$$
$$\frac{\dot\ell^c}{L} = \frac{\dot L}{L}\check\ell^c + Z(\epsilon)\,\dot\epsilon + \check\ell^c_{\text{app}},$$
and the inverse of the time to collision, $\dot L/L$, can be estimated using the looming or flat-earth models.
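A sketch of the PN acceleration command in the vehicle-2 frame; the function and argument names are mine, and the default $N$ and $\mu$ are arbitrary tunable placeholders:

```python
import numpy as np

def pn_accel_v2(l_c, ldot_c_over_L, R_v2_b, R_b_g, R_g_c, Vg, N=3.0, mu=1.0):
    """Vehicle-2-frame PN acceleration command a^v2 = mu*N * (Omega_perp x v^v2)."""
    omega_c = np.cross(l_c, ldot_c_over_L)         # LOS rotation rate, camera frame
    omega_v2 = R_v2_b @ R_b_g @ R_g_c @ omega_c    # rotate to the vehicle-2 frame
    v_v2 = np.array([Vg, 0.0, 0.0])                # airspeed along the vehicle-2 x axis
    return mu * N * np.cross(omega_v2, v_v2)       # = (0, mu*N*Vg*w_z, -mu*N*Vg*w_y)
```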

Polar Converting Logic
Use polar converting logic to convert the acceleration command $a^{v2}$ into roll-angle and flight-path-angle commands:
$$\phi^c = \tan^{-1}\!\left(\frac{a^{v2}_y}{-a^{v2}_z}\right), \qquad \gamma^c = \frac{1}{V_g}\sqrt{(a^{v2}_y)^2 + (a^{v2}_z)^2}.$$

Polar Converting Logic, cont.
General rule:
$$\phi^c = \tan^{-1}\!\left(\frac{a^{v2}_y}{-a^{v2}_z}\right), \qquad \gamma^c = -\frac{\operatorname{sign}(a^{v2}_z)}{V_g}\sqrt{(a^{v2}_y)^2 + (a^{v2}_z)^2}$$
Note the discontinuity in $\phi^c$ at $(a^{v2}_y, a^{v2}_z) = (0, 0)$:
- when $a^{v2}_z = 0$ and $a^{v2}_y > 0$, $\phi^c = \pi/2$;
- when $a^{v2}_z = 0$ and $a^{v2}_y < 0$, $\phi^c = -\pi/2$.
Remove the discontinuity as
$$\phi^c = \sigma(a^{v2}_y)\,\tan^{-1}\!\left(\frac{a^{v2}_y}{-a^{v2}_z}\right)$$
where
$$\sigma(a^{v2}_y) = \operatorname{sign}(a^{v2}_y)\,\frac{e^{k a^{v2}_y} - e^{-k a^{v2}_y}}{e^{k a^{v2}_y} + e^{-k a^{v2}_y}}.$$
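A hedged sketch of the smoothed polar converting logic; the sign conventions and the smoothing function are reconstructed from the garbled transcript (note that $\operatorname{sign}(x)\frac{e^{kx}-e^{-kx}}{e^{kx}+e^{-kx}} = |\tanh(kx)|$), so treat them as assumptions rather than the book's exact formulation:

```python
import numpy as np

def polar_convert(a_y, a_z, Vg, k=10.0):
    """Roll and flight-path-angle commands from a vehicle-2 acceleration command.

    Sign conventions and the smoothing gain k are assumptions reconstructed
    from the transcript, not verified against the original slides.
    """
    sigma = np.sign(a_y) * np.tanh(k * a_y)            # smooth blend, zero near a_y = 0
    phi_c = sigma * np.arctan2(a_y, -a_z)              # roll command, smoothed at (0, 0)
    gamma_c = -np.sign(a_z) * np.hypot(a_y, a_z) / Vg  # flight-path-angle command
    return phi_c, gamma_c
```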