Chapter 13: Vision-Based Guidance. Beard & McLain, Small Unmanned Aircraft, Princeton University Press, 2012.


Slide 1: Chapter 13, Vision-Based Guidance (Beard & McLain, Small Unmanned Aircraft, Princeton University Press, 2012).

Slide 2: Architecture with Camera

[Figure: block diagram of the guidance and control architecture with a camera added. Blocks: vision-based guidance, path manager, path following, autopilot, unmanned aircraft, state estimator, on-board sensors, camera. Signals: targets to track/avoid, waypoints, status, path definition, tracking error, airspeed/altitude/heading commands, position error, servo commands, wind, and the state estimate x̂(t).]

Slide 3: Gimbal and Camera Reference Frames

Gimbal-1 frame: F^{g1} = (i^{g1}, j^{g1}, k^{g1}). Gimbal frame: F^g = (i^g, j^g, k^g). Camera frame: F^c = (i^c, j^c, k^c).

Gimbal-1 frame: rotate the body frame about the k^b axis by the gimbal azimuth angle α_az:

R_b^{g_1}(\alpha_{az}) = \begin{pmatrix} \cos\alpha_{az} & \sin\alpha_{az} & 0 \\ -\sin\alpha_{az} & \cos\alpha_{az} & 0 \\ 0 & 0 & 1 \end{pmatrix}

Slide 4: Gimbal Reference Frames

Gimbal frame: rotate the gimbal-1 frame about the j^{g1} axis by the gimbal elevation angle α_el:

R_{g_1}^{g}(\alpha_{el}) = \begin{pmatrix} \cos\alpha_{el} & 0 & -\sin\alpha_{el} \\ 0 & 1 & 0 \\ \sin\alpha_{el} & 0 & \cos\alpha_{el} \end{pmatrix}

The composite rotation from the body frame to the gimbal frame is

R_b^{g} = R_{g_1}^{g} R_b^{g_1} = \begin{pmatrix} \cos\alpha_{el}\cos\alpha_{az} & \cos\alpha_{el}\sin\alpha_{az} & -\sin\alpha_{el} \\ -\sin\alpha_{az} & \cos\alpha_{az} & 0 \\ \sin\alpha_{el}\cos\alpha_{az} & \sin\alpha_{el}\sin\alpha_{az} & \cos\alpha_{el} \end{pmatrix}

Slide 5: Camera Reference Frame

Camera frame: i^c points to the right in the image, j^c points down in the image, and k^c points along the optical axis. Therefore

R_g^{c} = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{pmatrix}
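A minimal numerical sketch of these frame rotations in Python with NumPy (the function names and layout are illustrative, not taken from the book's companion code):

```python
import numpy as np

def R_g1_b(alpha_az):
    """Body to gimbal-1 frame: rotation about k^b by the azimuth angle."""
    c, s = np.cos(alpha_az), np.sin(alpha_az)
    return np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])

def R_g_g1(alpha_el):
    """Gimbal-1 to gimbal frame: rotation about j^g1 by the elevation angle."""
    c, s = np.cos(alpha_el), np.sin(alpha_el)
    return np.array([[c, 0.0, -s], [0.0, 1.0, 0.0], [s, 0.0, c]])

# Gimbal to camera frame: i^c = j^g, j^c = k^g, k^c = i^g.
R_c_g = np.array([[0.0, 1.0, 0.0], [0.0, 0.0, 1.0], [1.0, 0.0, 0.0]])

def R_c_b(alpha_az, alpha_el):
    """Composite body-to-camera rotation R^c_b = R^c_g R^g_g1 R^g1_b."""
    return R_c_g @ R_g_g1(alpha_el) @ R_g1_b(alpha_az)
```

The transpose of each matrix gives the reverse transformation, for example R^b_c = R_c_b(alpha_az, alpha_el).T.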

Slide 6: Pinhole Camera Model

Describes the mathematical relationship between the coordinates of a 3-D point and its projection onto the 2-D image plane. Assumes a point aperture and no lenses. A good first-order approximation.

Slide 7: Pinhole Camera Model

Goal: from the pixel location (ε_x, ε_y) in the image plane and the camera parameters, determine the location ℓ^c of the object in the world. Approach: use similar-triangles geometry between the object and its projection on the image plane.

Slide 8: Camera Model

Definitions: f is the focal length in pixels; P converts pixels to meters; M is the width of the square pixel array; v is the field of view; ℓ^c is the 3-D location of the point. The focal length is

f = \frac{M}{2\tan(v/2)}

Location of the projection of the object in the camera frame: (P ε_x, P ε_y, P f). Pixel location (in pixels): (ε_x, ε_y). Distance from the origin of the camera frame to the image of the object:

F = \sqrt{f^2 + \varepsilon_x^2 + \varepsilon_y^2}

Slide 9: Camera Model

By similar triangles:

\frac{\ell_{cx}}{L} = \frac{P\varepsilon_x}{PF} = \frac{\varepsilon_x}{F}

Similarly, ℓ_{cy}/L = ε_y/F and ℓ_{cz}/L = f/F. Combining:

\ell^c = \begin{pmatrix} \ell_{cx} \\ \ell_{cy} \\ \ell_{cz} \end{pmatrix} = \frac{L}{F}\begin{pmatrix} \varepsilon_x \\ \varepsilon_y \\ f \end{pmatrix}

Problem: L is unknown.

Slide 10: Camera Model

Unit direction vector to the target:

\frac{\ell^c}{L} = \frac{1}{F}\begin{pmatrix} \varepsilon_x \\ \varepsilon_y \\ f \end{pmatrix} = \frac{1}{\sqrt{\varepsilon_x^2 + \varepsilon_y^2 + f^2}}\begin{pmatrix} \varepsilon_x \\ \varepsilon_y \\ f \end{pmatrix}

Define the notation

\check{\ell} \triangleq \begin{pmatrix} \check{\ell}_x \\ \check{\ell}_y \\ \check{\ell}_z \end{pmatrix} \triangleq \frac{\ell}{L}
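The pinhole-camera relations above reduce to a few lines of code. A sketch in Python with NumPy (the names are illustrative, and the 480-pixel array and 60-degree field of view in the example are assumptions, not values from the book):

```python
import numpy as np

def focal_length_pixels(M, fov):
    """Focal length in pixels: f = M / (2 tan(fov/2)), with fov in radians."""
    return M / (2.0 * np.tan(fov / 2.0))

def unit_direction_camera(eps_x, eps_y, f):
    """Unit vector (eps_x, eps_y, f)^T / F pointing from the camera toward the
    object that projects to pixel location (eps_x, eps_y)."""
    F = np.sqrt(f**2 + eps_x**2 + eps_y**2)
    return np.array([eps_x, eps_y, f]) / F

# Example: 480-pixel-wide array, 60-degree field of view, target at pixel (10, -25).
f = focal_length_pixels(480, np.radians(60.0))
l_check_c = unit_direction_camera(10.0, -25.0, f)
```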

Slide 11: Gimbal Pointing

Two scenarios:
- Point the gimbal at a given world coordinate ("point at this GPS location").
- Point the gimbal so that the optical axis aligns with a certain point in the image plane ("point at this object").

Gimbal dynamics: assume rate control inputs.

Slide 12: Scenario 1: Point Gimbal at World Coordinate

p^i_obj: known location of the object in the inertial frame (world coordinate).
p^i_MAV = (p_n, p_e, p_d)^T: location of the MAV in the inertial frame.

Objective: align the optical axis of the camera with the desired relative position vector

\ell_d^i \triangleq p_{obj}^i - p_{MAV}^i

The body-frame unit vector that points in the desired direction of the specified world coordinate is

\check{\ell}_d^b = R_i^b \frac{\ell_d^i}{\|\ell_d^i\|}

Slide 13: Scenario 2: Point Gimbal at Object in Image

Objective: maneuver the gimbal so that an object at pixel location (ε_x, ε_y) is pushed to the center of the image. The desired direction of the optical axis is

\check{\ell}_d^c = \frac{1}{\sqrt{f^2 + \varepsilon_x^2 + \varepsilon_y^2}}\begin{pmatrix} \varepsilon_x \\ \varepsilon_y \\ f \end{pmatrix}

Expressed in the body frame, the desired direction of the optical axis is

\check{\ell}_d^b = R_g^b R_c^g \check{\ell}_d^c

Slide 14: Gimbal Pointing Angles

We know the direction in which to point the gimbal; what are the corresponding azimuth and elevation angles? The optical axis in the camera frame is (0, 0, 1)^T. Objective: select commanded gimbal angles α_az^c and α_el^c so that

\check{\ell}_d^b = \begin{pmatrix} \check{\ell}_{bxd} \\ \check{\ell}_{byd} \\ \check{\ell}_{bzd} \end{pmatrix} = R_g^b(\alpha_{az}^c, \alpha_{el}^c)\, R_c^g \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix} = \begin{pmatrix} \cos\alpha_{el}^c \cos\alpha_{az}^c \\ \cos\alpha_{el}^c \sin\alpha_{az}^c \\ -\sin\alpha_{el}^c \end{pmatrix}

Slide 15: Gimbal Pointing Angles

Objective: select commanded gimbal angles α_az^c and α_el^c so that

\begin{pmatrix} \check{\ell}_{bxd} \\ \check{\ell}_{byd} \\ \check{\ell}_{bzd} \end{pmatrix} = \begin{pmatrix} \cos\alpha_{el}^c \cos\alpha_{az}^c \\ \cos\alpha_{el}^c \sin\alpha_{az}^c \\ -\sin\alpha_{el}^c \end{pmatrix}

Solving for α_az^c and α_el^c gives the desired azimuth and elevation angles:

\alpha_{az}^c = \tan^{-1}\!\left(\frac{\check{\ell}_{byd}}{\check{\ell}_{bxd}}\right), \qquad \alpha_{el}^c = \sin^{-1}\!\left(-\check{\ell}_{bzd}\right)

The gimbal servo commands can be selected as

u_{az} = k_{az}(\alpha_{az}^c - \alpha_{az}), \qquad u_{el} = k_{el}(\alpha_{el}^c - \alpha_{el})

where k_az and k_el are positive control gains.
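A sketch of the gimbal-pointing computation in Python with NumPy, combining scenario 1 with the angle extraction on this slide (function names and gains are illustrative; arctan2 is used in place of a plain arctangent to handle all quadrants):

```python
import numpy as np

def desired_direction_body(p_obj_i, p_mav_i, R_b_i):
    """Scenario 1: body-frame unit vector toward a known world coordinate."""
    l_i = np.asarray(p_obj_i, dtype=float) - np.asarray(p_mav_i, dtype=float)
    return R_b_i @ (l_i / np.linalg.norm(l_i))

def gimbal_commands(l_b_d, alpha_az, alpha_el, k_az=1.0, k_el=1.0):
    """Commanded gimbal angles and rate commands from the desired direction."""
    alpha_az_c = np.arctan2(l_b_d[1], l_b_d[0])   # azimuth command
    alpha_el_c = np.arcsin(-l_b_d[2])             # elevation command
    u_az = k_az * (alpha_az_c - alpha_az)         # azimuth rate command
    u_el = k_el * (alpha_el_c - alpha_el)         # elevation rate command
    return alpha_az_c, alpha_el_c, u_az, u_el
```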

Slide 16: Geolocation

Relative position vector between the target and the MAV: ℓ = p_obj − p_MAV. Define L = ‖ℓ‖ (range to target) and ℓ̌ = ℓ/L (unit vector pointing from the aircraft to the target). From the geometry,

p_{obj}^i = p_{MAV}^i + R_b^i R_g^b R_c^g\, \ell^c = p_{MAV}^i + L\, R_b^i R_g^b R_c^g\, \check{\ell}^c

where p^i_MAV = (p_n, p_e, p_d)^T, R^i_b = R^i_b(φ, θ, ψ), and R^b_g = R^b_g(α_az, α_el). L is unknown: solving the geolocation problem reduces to estimating the range to the target, L.

Slide 17: Range to Target: Flat-Earth Model

From the geometry, cos φ = h/L, where φ is the angle between the inertial down axis k^i and the line of sight, and h = −p_d is the height above the (flat) ground. From the Cauchy-Schwarz relation for unit vectors,

\cos\varphi = k^i \cdot \check{\ell}^i = k^i \cdot R_b^i R_g^b R_c^g\, \check{\ell}^c

Equating the two expressions gives

L = \frac{h}{k^i \cdot R_b^i R_g^b R_c^g\, \check{\ell}^c}

From before, p^i_obj = p^i_MAV + L R_b^i R_g^b R_c^g ℓ̌^c. Therefore

p_{obj}^i = \begin{pmatrix} p_n \\ p_e \\ p_d \end{pmatrix} + h\, \frac{R_b^i R_g^b R_c^g\, \check{\ell}^c}{k^i \cdot R_b^i R_g^b R_c^g\, \check{\ell}^c}
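A sketch of flat-earth geolocation in Python with NumPy (the rotation matrices and the camera-frame unit vector are assumed to come from the state estimator and the routines sketched earlier; names are illustrative):

```python
import numpy as np

def geolocate_flat_earth(p_mav_i, h, R_i_b, R_b_g, R_g_c, l_check_c):
    """Target position and range from the flat-earth model.
    Assumes the target is on flat ground and the camera looks below the horizon."""
    l_check_i = R_i_b @ R_b_g @ R_g_c @ l_check_c   # unit LOS vector, inertial frame
    k_i = np.array([0.0, 0.0, 1.0])                 # inertial down axis
    L = h / (k_i @ l_check_i)                       # range to target
    p_obj_i = np.asarray(p_mav_i, dtype=float) + L * l_check_i
    return p_obj_i, L
```

Here h is the height above ground, which equals -p_d over flat terrain.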

Slide 18: Geolocation Errors

The position of the object is

p_{obj}^i = \begin{pmatrix} p_n \\ p_e \\ p_d \end{pmatrix} + h\, \frac{R_b^i R_g^b R_c^g\, \check{\ell}^c}{k^i \cdot R_b^i R_g^b R_c^g\, \check{\ell}^c}

This equation is highly sensitive to measurement errors: attitude estimation errors, gimbal pointing-angle errors, and gimbal/autopilot alignment errors. There are two types of errors: bias errors (addressed with calibration) and random errors (addressed with an EKF).

Slide 19: Geolocation Using the EKF

The states are x = (p_obj^{i\top}, L)^\top. We need propagation equations ẋ = f(x, u). Assuming the object is stationary, ṗ^i_obj = 0. Also,

\dot{L} = \frac{d}{dt}\sqrt{(p_{obj}^i - p_{MAV}^i)^\top (p_{obj}^i - p_{MAV}^i)} = \frac{(p_{obj}^i - p_{MAV}^i)^\top(\dot{p}_{obj}^i - \dot{p}_{MAV}^i)}{L} = -\frac{(p_{obj}^i - p_{MAV}^i)^\top \dot{p}_{MAV}^i}{L}

where

p_{MAV}^i = p_{obj}^i - L\, R_b^i R_g^b R_c^g\, \check{\ell}^c

and, for constant-altitude flight,

\dot{p}_{MAV}^i = \begin{pmatrix} \hat{V}_g \cos\hat{\chi} \\ \hat{V}_g \sin\hat{\chi} \\ 0 \end{pmatrix}

Slide 20: Geolocation Using the EKF

Prediction equations:

\begin{pmatrix} \dot{\hat{p}}_{obj}^i \\ \dot{\hat{L}} \end{pmatrix} = \begin{pmatrix} 0 \\ -\dfrac{(\hat{p}_{obj}^i - \hat{p}_{MAV}^i)^\top \dot{\hat{p}}_{MAV}^i}{\hat{L}} \end{pmatrix}

Jacobian of the propagation model:

\frac{\partial f}{\partial x} = \begin{pmatrix} 0_{3\times 3} & 0_{3\times 1} \\ -\dfrac{\dot{\hat{p}}_{MAV}^{i\top}}{\hat{L}} & \dfrac{(\hat{p}_{obj}^i - \hat{p}_{MAV}^i)^\top \dot{\hat{p}}_{MAV}^i}{\hat{L}^2} \end{pmatrix}

Measurement equation:

p_{MAV}^i = p_{obj}^i - L\, R_b^i R_g^b R_c^g\, \check{\ell}^c

Jacobian of the measurement model:

\frac{\partial h}{\partial x} = \begin{pmatrix} I & -R_b^i R_g^b R_c^g\, \check{\ell}^c \end{pmatrix}
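A compact continuous-discrete EKF sketch built directly from these prediction and measurement Jacobians, in Python with NumPy (the noise matrices Q and R_meas, the sample time Ts, and the class interface are illustrative choices, not the book's companion implementation):

```python
import numpy as np

class GeolocationEKF:
    """State x = (p_obj^i, L); the measurement is the MAV position (e.g., from GPS)."""

    def __init__(self, x0, P0, Q, R_meas, Ts):
        self.x, self.P, self.Q, self.R, self.Ts = x0, P0, Q, R_meas, Ts

    def propagate(self, pdot_mav, los_i):
        # los_i = R_i_b @ R_b_g @ R_g_c @ l_check_c: unit LOS vector, inertial frame.
        p_obj, L = self.x[:3], self.x[3]
        p_mav = p_obj - L * los_i                          # implied MAV position
        d = p_obj - p_mav
        f = np.hstack((np.zeros(3), -d @ pdot_mav / L))    # stationary target
        A = np.zeros((4, 4))
        A[3, :3] = -pdot_mav / L
        A[3, 3] = d @ pdot_mav / L**2
        self.x = self.x + self.Ts * f
        self.P = self.P + self.Ts * (A @ self.P + self.P @ A.T + self.Q)

    def update(self, p_mav_meas, los_i):
        # Measurement model: p_mav = p_obj - L * los_i.
        p_obj, L = self.x[:3], self.x[3]
        h = p_obj - L * los_i
        C = np.hstack((np.eye(3), -los_i.reshape(3, 1)))
        S = C @ self.P @ C.T + self.R
        K = self.P @ C.T @ np.linalg.inv(S)
        self.x = self.x + K @ (p_mav_meas - h)
        self.P = (np.eye(4) - K @ C) @ self.P
```

The filter is initialized with a coarse guess x0 = (p_obj_guess, L_guess) and propagated at the estimator rate, updating whenever a new GPS-smoothed MAV position is available.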

Slide 21: Geolocation Architecture

[Figure: block diagram of the geolocation architecture. Blocks: GPS, smoother, vision processing, attitude estimation, gimbal, and geolocation.]

Slide 22: Target Motion in the Image Plane

The objective is to track targets seen in the image plane. Feature tracking, or other computer-vision algorithms, are used to track pixels in the image plane. Feature tracking is noisy, so the pixel location must be low-pass filtered. Motion of the target in the image plane is due to two sources: self motion (ego motion), primarily due to rotation of the MAV, and relative target motion. Since we are primarily interested in the relative motion, we must subtract the pixel movement due to ego motion.

Slide 23: Pixel LPF and Differentiation

Define: \varepsilon = (\varepsilon_x, \varepsilon_y)^\top, the raw pixel measurements; \bar{\varepsilon} = (\bar{\varepsilon}_x, \bar{\varepsilon}_y)^\top, the filtered pixel location; and \dot{\bar{\varepsilon}} = (\dot{\bar{\varepsilon}}_x, \dot{\bar{\varepsilon}}_y)^\top, the filtered pixel velocity. Basic idea: low-pass filter to obtain \bar{\varepsilon}, and use a dirty derivative to obtain \dot{\bar{\varepsilon}}:

\bar{\varepsilon}(s) = \frac{1}{s+1}\,\varepsilon(s), \qquad \dot{\bar{\varepsilon}}(s) = \frac{s}{s+1}\,\varepsilon(s)

Slide 24: Digital Approximation

Converting to the z-domain with the Tustin approximation s \mapsto \frac{2}{T_s}\frac{z-1}{z+1} gives

\bar{\varepsilon}[z] = \frac{\frac{T_s}{2+T_s}(z+1)}{z - \frac{2-T_s}{2+T_s}}\,\varepsilon[z], \qquad \dot{\bar{\varepsilon}}[z] = \frac{\frac{2}{2+T_s}(z-1)}{z - \frac{2-T_s}{2+T_s}}\,\varepsilon[z]

Taking the inverse z-transform gives the difference equations

\bar{\varepsilon}[n] = \frac{2-T_s}{2+T_s}\,\bar{\varepsilon}[n-1] + \frac{T_s}{2+T_s}\left(\varepsilon[n] + \varepsilon[n-1]\right)

\dot{\bar{\varepsilon}}[n] = \frac{2-T_s}{2+T_s}\,\dot{\bar{\varepsilon}}[n-1] + \frac{2}{2+T_s}\left(\varepsilon[n] - \varepsilon[n-1]\right)

where \bar{\varepsilon}[0] = \varepsilon[0] and \dot{\bar{\varepsilon}}[0] = 0.

Slide 25: Digital Approximation

Define \beta \triangleq (2 - T_s)/(2 + T_s), and note that 1 - \beta = 2T_s/(2 + T_s). Then

\bar{\varepsilon}[n] = \beta\,\bar{\varepsilon}[n-1] + (1-\beta)\,\underbrace{\frac{\varepsilon[n] + \varepsilon[n-1]}{2}}_{\text{average of last two samples}}

\dot{\bar{\varepsilon}}[n] = \beta\,\dot{\bar{\varepsilon}}[n-1] + (1-\beta)\,\underbrace{\frac{\varepsilon[n] - \varepsilon[n-1]}{T_s}}_{\text{Euler approximation of derivative}}

where \bar{\varepsilon}[0] = \varepsilon[0] and \dot{\bar{\varepsilon}}[0] = 0.
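The difference equations translate directly into a small filter class. A sketch in Python with NumPy (the class name and interface are illustrative; Ts is the camera frame period):

```python
import numpy as np

class PixelFilter:
    """Low-pass filter and dirty derivative for the pixel location."""

    def __init__(self, eps0, Ts):
        self.beta = (2.0 - Ts) / (2.0 + Ts)
        self.Ts = Ts
        self.eps_prev = np.asarray(eps0, dtype=float)  # epsilon[n-1]
        self.eps_bar = np.asarray(eps0, dtype=float)   # filtered pixel location
        self.eps_dot = np.zeros_like(self.eps_bar)     # filtered pixel velocity

    def update(self, eps):
        eps = np.asarray(eps, dtype=float)
        b = self.beta
        self.eps_bar = b * self.eps_bar + (1 - b) * (eps + self.eps_prev) / 2.0
        self.eps_dot = b * self.eps_dot + (1 - b) * (eps - self.eps_prev) / self.Ts
        self.eps_prev = eps
        return self.eps_bar, self.eps_dot
```

For example, with a 30 Hz camera the filter would be constructed with Ts = 1/30 and updated once per frame with the raw feature-tracker output.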

Slide 26: Apparent Motion

Let \check{\ell} \triangleq \ell/L = (p_{obj} - p_{MAV})/\|p_{obj} - p_{MAV}\| be the normalized relative position vector. The Coriolis formula, expressed in the camera frame, gives

\underbrace{\frac{d\check{\ell}^c}{dt_i}}_{\text{true relative translational motion between target and MAV}} = \underbrace{\frac{d\check{\ell}^c}{dt_c}}_{\text{motion of target in image plane}} + \underbrace{\omega_{c/i}^c \times \check{\ell}^c}_{\text{apparent motion due to rotation of MAV and gimbal}}

Slide 27: Apparent Motion, cont.

Motion of the target in the image plane:

\frac{d\check{\ell}^c}{dt_c} = \frac{d}{dt_c}\left[\frac{1}{F}\begin{pmatrix}\varepsilon_x\\ \varepsilon_y\\ f\end{pmatrix}\right] = \frac{1}{F}\begin{pmatrix}\dot{\varepsilon}_x\\ \dot{\varepsilon}_y\\ 0\end{pmatrix} - \frac{\dot{F}}{F^2}\begin{pmatrix}\varepsilon_x\\ \varepsilon_y\\ f\end{pmatrix}, \qquad \dot{F} = \frac{\varepsilon_x\dot{\varepsilon}_x + \varepsilon_y\dot{\varepsilon}_y}{F}

= \frac{1}{F^3}\begin{pmatrix}(\varepsilon_y^2 + f^2)\dot{\varepsilon}_x - \varepsilon_x\varepsilon_y\dot{\varepsilon}_y\\ -\varepsilon_x\varepsilon_y\dot{\varepsilon}_x + (\varepsilon_x^2 + f^2)\dot{\varepsilon}_y\\ -\varepsilon_x f\dot{\varepsilon}_x - \varepsilon_y f\dot{\varepsilon}_y\end{pmatrix} = Z(\varepsilon)\,\dot{\varepsilon}

where

Z(\varepsilon) \triangleq \frac{1}{F^3}\begin{pmatrix}\varepsilon_y^2 + f^2 & -\varepsilon_x\varepsilon_y\\ -\varepsilon_x\varepsilon_y & \varepsilon_x^2 + f^2\\ -\varepsilon_x f & -\varepsilon_y f\end{pmatrix}

Slide 28: Apparent Motion, cont.

Ego motion, \omega_{c/i}^c \times \check{\ell}^c:

\omega_{c/i}^c = \underbrace{\omega_{c/g}^c}_{=\,0} + \underbrace{R_g^c R_b^g\begin{pmatrix}-\dot{\alpha}_{el}\sin\alpha_{az}\\ \dot{\alpha}_{el}\cos\alpha_{az}\\ \dot{\alpha}_{az}\end{pmatrix}}_{\omega_{g/b}^c} + \underbrace{R_g^c R_b^g\begin{pmatrix}p\\ q\\ r\end{pmatrix}}_{\omega_{b/i}^c}

Therefore

\check{\ell}_{app}^c \triangleq \omega_{c/i}^c \times \check{\ell}^c = \frac{1}{F}\left[R_g^c R_b^g\begin{pmatrix}p - \dot{\alpha}_{el}\sin\alpha_{az}\\ q + \dot{\alpha}_{el}\cos\alpha_{az}\\ r + \dot{\alpha}_{az}\end{pmatrix}\right] \times \begin{pmatrix}\varepsilon_x\\ \varepsilon_y\\ f\end{pmatrix}

Slide 29: Total Motion in the Image Plane

\frac{d\check{\ell}^c}{dt_i} = \underbrace{Z(\varepsilon)\,\dot{\varepsilon}}_{\text{target motion}} + \underbrace{\frac{1}{F}\left[R_g^c R_b^g\begin{pmatrix}p - \dot{\alpha}_{el}\sin\alpha_{az}\\ q + \dot{\alpha}_{el}\cos\alpha_{az}\\ r + \dot{\alpha}_{az}\end{pmatrix}\right] \times \begin{pmatrix}\varepsilon_x\\ \varepsilon_y\\ f\end{pmatrix}}_{\text{ego motion}}
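A sketch of this computation in Python with NumPy: Z(ε) maps the filtered pixel velocity into image-plane motion of the unit LOS vector, and the rotational ego-motion term is combined with it to recover the relative motion of the line of sight (names are illustrative; the rotation matrices are those defined on the earlier slides):

```python
import numpy as np

def Z_matrix(eps_x, eps_y, f):
    """Z(eps): maps pixel velocity to image-plane motion of the unit LOS vector."""
    F = np.sqrt(eps_x**2 + eps_y**2 + f**2)
    return (1.0 / F**3) * np.array([[eps_y**2 + f**2, -eps_x * eps_y],
                                    [-eps_x * eps_y,  eps_x**2 + f**2],
                                    [-eps_x * f,      -eps_y * f]])

def ego_motion(eps_x, eps_y, f, pqr, az_rate, el_rate, alpha_az, R_c_g, R_g_b):
    """Apparent motion omega^c_{c/i} x l_check^c due to MAV and gimbal rotation."""
    p, q, r = pqr
    F = np.sqrt(eps_x**2 + eps_y**2 + f**2)
    omega_b = np.array([p - el_rate * np.sin(alpha_az),
                        q + el_rate * np.cos(alpha_az),
                        r + az_rate])
    return np.cross(R_c_g @ R_g_b @ omega_b, np.array([eps_x, eps_y, f])) / F

def relative_motion(eps, eps_dot, f, pqr, az_rate, el_rate, alpha_az, R_c_g, R_g_b):
    """Total inertial motion of l_check^c: image-plane term plus ego-motion term."""
    image_term = Z_matrix(eps[0], eps[1], f) @ np.asarray(eps_dot, dtype=float)
    return image_term + ego_motion(eps[0], eps[1], f, pqr, az_rate, el_rate,
                                   alpha_az, R_c_g, R_g_b)
```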

Slide 30: Time to Collision

For every object in the camera's field of view it is possible to compute a time to collision with that obstacle, defined as the time to collision if the vehicle were to proceed along the line-of-sight vector to the obstacle at the current closing velocity. Therefore, for each obstacle, the time to collision is given by

t_c \triangleq -\frac{L}{\dot{L}}

Because of scale ambiguity, the range cannot be computed using only a monocular camera. Two techniques for estimating the time to collision:
- Looming: assumes the target's physical size is constant.
- Flat-earth approximation: requires the height above ground.

Slide 31: Time to Collision: Looming

From similar triangles,

\frac{S_{obj}}{L} = \frac{P\varepsilon_s}{PF} = \frac{\varepsilon_s}{F}

where S_obj is the physical size of the object and ε_s is its size in pixels, so L = S_obj F / ε_s. If S_obj is not changing, then

\frac{\dot{L}}{L} = \frac{\dot{F}}{F} - \frac{\dot{\varepsilon}_s}{\varepsilon_s} = \frac{\varepsilon_x\dot{\varepsilon}_x + \varepsilon_y\dot{\varepsilon}_y}{F^2} - \frac{\dot{\varepsilon}_s}{\varepsilon_s}

the inverse of which gives the time to collision t_c.
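A sketch of the looming computation in Python with NumPy (the pixel-size measurement eps_s and its rate eps_s_dot would come from the same kind of low-pass filter and dirty derivative used for the pixel location; names are illustrative):

```python
import numpy as np

def time_to_collision_looming(eps, eps_dot, eps_s, eps_s_dot, f):
    """Estimate Ldot/L and the time to collision from the growth of the
    object's pixel size, assuming the object's physical size is constant."""
    eps_x, eps_y = eps
    eps_x_dot, eps_y_dot = eps_dot
    F = np.sqrt(eps_x**2 + eps_y**2 + f**2)
    Ldot_over_L = (eps_x * eps_x_dot + eps_y * eps_y_dot) / F**2 - eps_s_dot / eps_s
    t_c = -1.0 / Ldot_over_L      # positive when closing on the obstacle
    return Ldot_over_L, t_c
```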

Slide 32: Time to Collision: Flat Earth

With L = h/\cos\varphi, differentiation gives

\frac{\dot{L}}{L} = \frac{\dot{h}}{h} + \dot{\varphi}\tan\varphi

From the geometry we have \cos\varphi = \check{\ell}_z^i. Differentiating and solving for \dot{\varphi}:

\dot{\varphi} = -\frac{1}{\sin\varphi}\,\frac{d}{dt_i}\check{\ell}_z^i

Therefore

\dot{\varphi}\tan\varphi = -\frac{1}{\cos\varphi}\,\frac{d}{dt_i}\check{\ell}_z^i = -\frac{1}{\check{\ell}_z^i}\,\frac{d}{dt_i}\check{\ell}_z^i

Slide 33: Precision Landing

For precision landing, we use the guidance model

\dot{p}_n = V_g\cos\chi\cos\gamma, \qquad \dot{p}_e = V_g\sin\chi\cos\gamma, \qquad \dot{p}_d = -V_g\sin\gamma, \qquad \dot{\chi} = \frac{g}{V_g}\tan\phi

with the inputs \phi = u_1 and \gamma = u_2. Define

p_{MAV}^i = (p_n, p_e, p_d)^\top, \qquad v_{MAV}^i = (V_g\cos\chi\cos\gamma,\; V_g\sin\chi\cos\gamma,\; -V_g\sin\gamma)^\top, \qquad v_{MAV}^{v2} = (V_g, 0, 0)^\top

Slide 34: Proportional Navigation (PN)

Proportional navigation: align the line-of-sight rate ℓ̇ with the negative of the line-of-sight vector ℓ. Define the line-of-sight rotation rate

\Omega_\perp = \check{\ell} \times \frac{\dot{\ell}}{L}, \qquad \dot{\ell} = v_{obj} - v_{MAV}

Assuming constant velocity, the MAV can only accelerate in the plane perpendicular to its velocity vector. Therefore, to drive Ω_⊥ to zero, the desired acceleration is

a_{MAV} = N\, \Omega_\perp \times v_{MAV}

where N is the (tunable) navigation constant.

[Figure: engagement geometry showing p_MAV, p_obj, the line of sight ℓ, the velocities v_MAV and v_obj, and the commanded acceleration a_MAV.]

Slide 35: Acceleration Command

To implement the acceleration command, it must be converted to the vehicle-2 frame:

a_{MAV}^{v2} = \mu N\, \Omega_\perp^{v2} \times v_{MAV}^{v2} = \mu N \begin{pmatrix}\Omega_{\perp,x}^{v2}\\ \Omega_{\perp,y}^{v2}\\ \Omega_{\perp,z}^{v2}\end{pmatrix} \times \begin{pmatrix}V_g\\ 0\\ 0\end{pmatrix} = \begin{pmatrix}0\\ \mu N V_g\,\Omega_{\perp,z}^{v2}\\ -\mu N V_g\,\Omega_{\perp,y}^{v2}\end{pmatrix}

where

\Omega_\perp^{v2} = R_b^{v2} R_g^b R_c^g\, \Omega_\perp^c, \qquad \Omega_\perp^c = \check{\ell}^c \times \frac{\dot{\ell}^c}{L}, \qquad \frac{\dot{\ell}^c}{L} = \frac{\dot{L}}{L}\check{\ell}^c + Z(\varepsilon)\dot{\varepsilon} + \check{\ell}_{app}^c

and the inverse time to collision \dot{L}/L can be estimated using looming or the flat-earth model.
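A sketch of the proportional-navigation acceleration command in Python with NumPy, assembled from the image-plane quantities on the previous slides (names are illustrative; the factor μN from the slide is collapsed into the single tunable gain N here):

```python
import numpy as np

def pn_acceleration_v2(l_check_c, Ldot_over_L, eps_dot_term, l_app_c,
                       R_v2_b, R_b_g, R_g_c, Vg, N=3.0):
    """PN acceleration command in the vehicle-2 frame.
    eps_dot_term = Z(eps) @ eps_dot, and l_app_c is the ego-motion term."""
    ldot_c_over_L = Ldot_over_L * l_check_c + eps_dot_term + l_app_c
    Omega_c = np.cross(l_check_c, ldot_c_over_L)    # LOS rotation rate, camera frame
    Omega_v2 = R_v2_b @ R_b_g @ R_g_c @ Omega_c     # rotate to vehicle-2 frame
    v_v2 = np.array([Vg, 0.0, 0.0])                 # groundspeed along i^v2
    return N * np.cross(Omega_v2, v_v2)             # (0, N Vg Omega_z, -N Vg Omega_y)
```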

Slide 36: Polar Converting Logic

Use polar converting logic to convert the acceleration command a^{v2} into roll-angle and flight-path-angle commands:

\phi^c = \tan^{-1}\!\left(\frac{a_y^{v2}}{-a_z^{v2}}\right), \qquad \gamma^c = \frac{\sqrt{(a_y^{v2})^2 + (a_z^{v2})^2}}{V_g}

Slide 37: Polar Converting Logic, cont.

General rule:

\phi^c = \tan^{-1}\!\left(\frac{a_y^{v2}}{-a_z^{v2}}\right), \qquad \gamma^c = -\operatorname{sign}(a_z^{v2})\,\frac{\sqrt{(a_y^{v2})^2 + (a_z^{v2})^2}}{V_g}

Note the discontinuity in \phi^c: when a_y^{v2} > 0, \phi^c \to \pi/2, and when a_y^{v2} < 0, \phi^c \to -\pi/2, so \phi^c is discontinuous at (a_y^{v2}, a_z^{v2}) = (0, 0). The discontinuity is removed by scaling the roll command,

\phi^c = \sigma(a_y^{v2})\,\tan^{-1}\!\left(\frac{a_y^{v2}}{-a_z^{v2}}\right), \qquad \sigma(a_y^{v2}) = \operatorname{sign}(a_y^{v2})\,\frac{1 - e^{-k a_y^{v2}}}{1 + e^{-k a_y^{v2}}}
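One way to implement the blended polar-converting rule reconstructed above, in Python with NumPy (the gain k and the exact form of the smoothing function σ are illustrative assumptions):

```python
import numpy as np

def polar_convert(a_y, a_z, Vg, k=10.0):
    """Roll-angle and flight-path-angle commands from a vehicle-2 acceleration
    command (a_y, a_z), with a sigmoid blend removing the discontinuity at (0, 0)."""
    # sigma = sign(a_y) * (1 - exp(-k a_y)) / (1 + exp(-k a_y)) = tanh(k |a_y| / 2)
    sigma = np.tanh(k * np.abs(a_y) / 2.0)
    phi_c = sigma * np.arctan2(a_y, -a_z)                       # roll command
    gamma_c = -np.sign(a_z) * np.sqrt(a_y**2 + a_z**2) / Vg     # flight-path command
    return phi_c, gamma_c
```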
