Chapter 13: Vision-Based Guidance
Beard & McLain, Small Unmanned Aircraft, Princeton University Press, 2012
Architecture with Camera
[Block diagram: the camera and on-board sensors feed the state estimator, which produces $\hat{x}(t)$. Vision-based guidance uses camera targets to track/avoid and produces waypoints and status for the path manager; the path manager produces a path definition; path following converts tracking error into airspeed, altitude, and heading commands; the autopilot converts position error into servo commands for the unmanned aircraft, which is subject to wind.]
Gimbal and Camera Reference Frames
Gimbal-1 frame: $\mathcal{F}^{g1} = (\mathbf{i}^{g1}, \mathbf{j}^{g1}, \mathbf{k}^{g1})$
Gimbal frame: $\mathcal{F}^{g} = (\mathbf{i}^{g}, \mathbf{j}^{g}, \mathbf{k}^{g})$
Camera frame: $\mathcal{F}^{c} = (\mathbf{i}^{c}, \mathbf{j}^{c}, \mathbf{k}^{c})$
Gimbal-1 frame: rotate the body frame about the $\mathbf{k}^b$ axis by the gimbal azimuth angle $\alpha_{az}$:
\[ R_b^{g1}(\alpha_{az}) \triangleq \begin{pmatrix} \cos\alpha_{az} & \sin\alpha_{az} & 0 \\ -\sin\alpha_{az} & \cos\alpha_{az} & 0 \\ 0 & 0 & 1 \end{pmatrix} \]
Gimbal Reference Frames
Gimbal frame: rotate the gimbal-1 frame about the $\mathbf{j}^{g1}$ axis by the gimbal elevation angle $\alpha_{el}$:
\[ R_{g1}^{g}(\alpha_{el}) \triangleq \begin{pmatrix} \cos\alpha_{el} & 0 & -\sin\alpha_{el} \\ 0 & 1 & 0 \\ \sin\alpha_{el} & 0 & \cos\alpha_{el} \end{pmatrix} \]
\[ R_b^{g} = R_{g1}^{g} R_b^{g1} = \begin{pmatrix} \cos\alpha_{el}\cos\alpha_{az} & \cos\alpha_{el}\sin\alpha_{az} & -\sin\alpha_{el} \\ -\sin\alpha_{az} & \cos\alpha_{az} & 0 \\ \sin\alpha_{el}\cos\alpha_{az} & \sin\alpha_{el}\sin\alpha_{az} & \cos\alpha_{el} \end{pmatrix} \]
Camera Reference Frame
Camera frame: $\mathbf{i}^c$ points to the right in the image, $\mathbf{j}^c$ points down in the image, and $\mathbf{k}^c$ points along the optical axis.
\[ R_g^{c} = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{pmatrix} \]
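The frame transformations above can be sketched in code. A minimal numpy sketch (function names are illustrative, not from the slides):

```python
import numpy as np

def R_b_to_g1(alpha_az):
    """Rotate the body frame about k_b by the gimbal azimuth angle."""
    c, s = np.cos(alpha_az), np.sin(alpha_az)
    return np.array([[ c,   s,   0.0],
                     [-s,   c,   0.0],
                     [0.0, 0.0, 1.0]])

def R_g1_to_g(alpha_el):
    """Rotate the gimbal-1 frame about j_g1 by the gimbal elevation angle."""
    c, s = np.cos(alpha_el), np.sin(alpha_el)
    return np.array([[ c,  0.0,  -s],
                     [0.0, 1.0, 0.0],
                     [ s,  0.0,   c]])

def R_b_to_g(alpha_az, alpha_el):
    """Composite rotation from the body frame to the gimbal frame."""
    return R_g1_to_g(alpha_el) @ R_b_to_g1(alpha_az)

# Camera frame: i_c right, j_c down, k_c along the optical axis.
R_g_to_c = np.array([[0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0],
                     [1.0, 0.0, 0.0]])
```

Since each factor is a rotation, the composite is orthonormal, which is a useful sanity check when implementing these by hand.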
Pinhole Camera Model
Describes the mathematical relationship between the coordinates of a 3-D point and its projection onto the 2-D image plane.
Assumes a point aperture and no lenses.
Good 1st-order approximation.
Pinhole Camera Model
Goal: from the pixel location $(\epsilon_x, \epsilon_y)$ in the image plane and the camera parameters, determine the location $\ell^c$ of the object in the world.
Approach: use similar-triangles geometry in the image plane.
Camera Model
$f$: focal length in pixels; $P$: converts pixels to meters; $M$: width of the square pixel array; $\eta$: field of view; $\ell^c$: 3-D location of the point.
\[ f = \frac{M}{2\tan(\eta/2)} \]
Location of the projection of the object in the camera frame: $(P\epsilon_x,\, P\epsilon_y,\, Pf)$.
Pixel location (in pixels): $(\epsilon_x, \epsilon_y)$.
Distance from the origin of the camera frame to the object image:
\[ F = \sqrt{f^2 + \epsilon_x^2 + \epsilon_y^2} \]
Camera Model
By similar triangles:
\[ \frac{\ell_x^c}{L} = \frac{P\epsilon_x}{PF} = \frac{\epsilon_x}{F} \]
Similarly, $\ell_y^c / L = \epsilon_y / F$ and $\ell_z^c / L = f / F$. Combining:
\[ \ell^c = \begin{pmatrix} \ell_x^c \\ \ell_y^c \\ \ell_z^c \end{pmatrix} = \frac{L}{F} \begin{pmatrix} \epsilon_x \\ \epsilon_y \\ f \end{pmatrix} \]
Problem: the range $L$ is unknown.
Camera Model
Unit direction vector to the target:
\[ \frac{\ell^c}{L} = \frac{1}{F} \begin{pmatrix} \epsilon_x \\ \epsilon_y \\ f \end{pmatrix} = \frac{1}{\sqrt{\epsilon_x^2 + \epsilon_y^2 + f^2}} \begin{pmatrix} \epsilon_x \\ \epsilon_y \\ f \end{pmatrix} \]
Define the notation
\[ \check\ell = \begin{pmatrix} \check\ell_x \\ \check\ell_y \\ \check\ell_z \end{pmatrix} \triangleq \frac{\ell}{L} \]
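The focal-length formula and the unit line-of-sight vector above translate directly to code. A short sketch (names are illustrative):

```python
import numpy as np

def focal_length_pixels(M, eta):
    """f = M / (2 tan(eta/2)) for an M-pixel-wide array with field of view eta."""
    return M / (2.0 * np.tan(eta / 2.0))

def unit_los_camera(eps_x, eps_y, f):
    """Unit line-of-sight vector to the target, expressed in the camera frame."""
    F = np.sqrt(f**2 + eps_x**2 + eps_y**2)
    return np.array([eps_x, eps_y, f]) / F
```

A pixel at the image center maps to the optical axis $(0, 0, 1)^\top$, as expected.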
Two Scenarios
Gimbal pointing:
1. Point the gimbal at a given world coordinate ("point at this GPS location").
2. Point the gimbal so that the optical axis aligns with a certain point in the image plane ("point at this object").
Gimbal dynamics: assume rate control inputs.
Scenario 1: Point Gimbal at World Coordinate
$p_{obj}^i$: known location of the object in the inertial frame (world coordinate).
$p_{MAV}^i = (p_n, p_e, p_d)^\top$: location of the MAV in the inertial frame.
Objective: align the optical axis of the camera with the desired relative position vector
\[ \ell_d^i \triangleq p_{obj}^i - p_{MAV}^i \]
Body-frame unit vector that points in the desired direction of the specified world coordinate:
\[ \check\ell_d^b = R_i^b \frac{\ell_d^i}{\|\ell_d^i\|} \]
Scenario 2: Point Gimbal at Object in Image
Objective: maneuver the gimbal so that an object at pixel location $(\epsilon_x, \epsilon_y)$ is pushed to the center of the image.
Desired direction of the optical axis:
\[ \check\ell_d^c = \frac{1}{\sqrt{f^2 + \epsilon_x^2 + \epsilon_y^2}} \begin{pmatrix} \epsilon_x \\ \epsilon_y \\ f \end{pmatrix} \]
Desired direction of the optical axis expressed in the body frame:
\[ \check\ell_d^b = R_g^b R_c^g \check\ell_d^c \]
Gimbal Pointing Angles
We know the direction in which to point the gimbal; what are the corresponding azimuth and elevation angles?
The optical axis in the camera frame is $(0, 0, 1)^\top$.
Objective: select the commanded gimbal angles $\alpha_{az}^c$ and $\alpha_{el}^c$ so that
\[ \check\ell_d^b = \begin{pmatrix} \check\ell_{xd}^b \\ \check\ell_{yd}^b \\ \check\ell_{zd}^b \end{pmatrix} = R_g^b(\alpha_{az}^c, \alpha_{el}^c)\, R_c^g \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix} = R_g^b(\alpha_{az}^c, \alpha_{el}^c) \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} = \begin{pmatrix} \cos\alpha_{el}^c \cos\alpha_{az}^c \\ \cos\alpha_{el}^c \sin\alpha_{az}^c \\ -\sin\alpha_{el}^c \end{pmatrix} \]
Gimbal Pointing Angles
Objective: select the commanded gimbal angles $\alpha_{az}^c$ and $\alpha_{el}^c$ so that
\[ \begin{pmatrix} \check\ell_{xd}^b \\ \check\ell_{yd}^b \\ \check\ell_{zd}^b \end{pmatrix} = \begin{pmatrix} \cos\alpha_{el}^c \cos\alpha_{az}^c \\ \cos\alpha_{el}^c \sin\alpha_{az}^c \\ -\sin\alpha_{el}^c \end{pmatrix} \]
Solving for $\alpha_{az}^c$ and $\alpha_{el}^c$ gives the desired azimuth and elevation angles:
\[ \alpha_{az}^c = \tan^{-1}\!\left( \frac{\check\ell_{yd}^b}{\check\ell_{xd}^b} \right), \qquad \alpha_{el}^c = -\sin^{-1}\!\left( \check\ell_{zd}^b \right) \]
The gimbal servo commands can be selected as
\[ u_{az} = k_{az}(\alpha_{az}^c - \alpha_{az}), \qquad u_{el} = k_{el}(\alpha_{el}^c - \alpha_{el}) \]
where $k_{az}$ and $k_{el}$ are positive control gains.
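The angle extraction and the proportional servo law above can be sketched as follows (a minimal illustration, with a four-quadrant `arctan2` in place of $\tan^{-1}$):

```python
import numpy as np

def gimbal_commands(l_d_b):
    """Commanded azimuth/elevation aligning the optical axis with the
    desired body-frame unit vector l_d_b."""
    az_c = np.arctan2(l_d_b[1], l_d_b[0])
    el_c = -np.arcsin(l_d_b[2])
    return az_c, el_c

def gimbal_servo(az_c, el_c, az, el, k_az=1.0, k_el=1.0):
    """Proportional rate commands driving the gimbal toward the commanded angles."""
    return k_az * (az_c - az), k_el * (el_c - el)
```

Building a unit vector from known angles and recovering them is a quick way to validate the sign conventions.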
Geolocation
Relative position vector between target and MAV: $\ell = p_{obj} - p_{MAV}$.
Define $L = \|\ell\|$ (range to target) and $\check\ell = \ell / L$ (unit vector pointing from aircraft to target).
From the geometry:
\[ p_{obj}^i = p_{MAV}^i + R_b^i R_g^b R_c^g\, \ell^c = p_{MAV}^i + L\, R_b^i R_g^b R_c^g\, \check\ell^c \]
where $p_{MAV}^i = (p_n, p_e, p_d)^\top$, $R_b^i = R_b^i(\phi, \theta, \psi)$, and $R_g^b = R_g^b(\alpha_{az}, \alpha_{el})$.
$L$ is unknown, so solving the geolocation problem reduces to estimating the range to the target $L$.
Range to Target: Flat-Earth Model
From the geometry: $\cos\varphi = h / L$.
Since $\mathbf{k}_i$ and $\check\ell^i$ are unit vectors, the dot product gives
\[ \cos\varphi = \mathbf{k}_i^\top \check\ell^i = \mathbf{k}_i^\top R_b^i R_g^b R_c^g\, \check\ell^c \]
Equating:
\[ L = \frac{h}{\mathbf{k}_i^\top R_b^i R_g^b R_c^g\, \check\ell^c} \]
From before, $p_{obj}^i = p_{MAV}^i + L\, R_b^i R_g^b R_c^g\, \check\ell^c$; therefore
\[ p_{obj}^i = \begin{pmatrix} p_n \\ p_e \\ p_d \end{pmatrix} + h\, \frac{R_b^i R_g^b R_c^g\, \check\ell^c}{\mathbf{k}_i^\top R_b^i R_g^b R_c^g\, \check\ell^c} \]
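The flat-earth geolocation formula above is a one-liner once the rotation matrices are available. A minimal sketch (rotation matrices passed in explicitly; names are illustrative):

```python
import numpy as np

def geolocate_flat_earth(p_mav, h, R_i_from_b, R_b_from_g, R_g_from_c, l_c):
    """Flat-earth geolocation: h is the MAV's height above ground and
    l_c the unit line-of-sight vector in the camera frame."""
    k_i = np.array([0.0, 0.0, 1.0])                    # inertial down axis
    l_i = R_i_from_b @ R_b_from_g @ R_g_from_c @ l_c   # unit LOS in inertial frame
    L = h / (k_i @ l_i)                                # range, from cos(phi) = h / L
    return p_mav + L * l_i
```

With all rotations set to the identity the geometry reduces to a right triangle, which makes a convenient test case.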
Geolocation Errors
The position of the object is
\[ p_{obj}^i = \begin{pmatrix} p_n \\ p_e \\ p_d \end{pmatrix} + h\, \frac{R_b^i R_g^b R_c^g\, \check\ell^c}{\mathbf{k}_i^\top R_b^i R_g^b R_c^g\, \check\ell^c} \]
This equation is highly sensitive to measurement errors: attitude estimation errors, gimbal pointing angle errors, and gimbal/autopilot alignment errors.
Two types of errors: bias errors (address with calibration) and random errors (address with an EKF).
Geolocation Using EKF
The states are $x = (p_{obj}^{i\top}, L)^\top$. We need propagation equations $\dot x = f(x, u)$. Assuming the object is stationary, $\dot p_{obj}^i = 0$. Also,
\[ \dot L = \frac{d}{dt} \sqrt{(p_{obj}^i - p_{MAV}^i)^\top (p_{obj}^i - p_{MAV}^i)} = \frac{(p_{obj}^i - p_{MAV}^i)^\top (\dot p_{obj}^i - \dot p_{MAV}^i)}{L} = -\frac{(p_{obj}^i - p_{MAV}^i)^\top \dot p_{MAV}^i}{L} \]
where $p_{MAV}^i = p_{obj}^i - L\, R_b^i R_g^b R_c^g\, \check\ell^c$ and, for constant-altitude flight,
\[ \dot p_{MAV}^i = \begin{pmatrix} \hat V_g \cos\hat\chi \\ \hat V_g \sin\hat\chi \\ 0 \end{pmatrix} \]
Geolocation Using EKF
Prediction equations:
\[ \begin{pmatrix} \dot{\hat p}_{obj}^i \\ \dot{\hat L} \end{pmatrix} = \begin{pmatrix} 0 \\ -\dfrac{(\hat p_{obj}^i - \hat p_{MAV}^i)^\top \dot{\hat p}_{MAV}^i}{\hat L} \end{pmatrix} \]
Jacobian of the prediction equation:
\[ \frac{\partial f}{\partial x} = \begin{pmatrix} 0 & 0 \\ -\dfrac{\dot{\hat p}_{MAV}^{i\top}}{\hat L} & \dfrac{(\hat p_{obj}^i - \hat p_{MAV}^i)^\top \dot{\hat p}_{MAV}^i}{\hat L^2} \end{pmatrix} \]
Measurement equation:
\[ p_{MAV}^i = p_{obj}^i - L\, R_b^i R_g^b R_c^g\, \check\ell^c \]
Jacobian of the measurement equation:
\[ \frac{\partial h}{\partial x} = \begin{pmatrix} I & -R_b^i R_g^b R_c^g\, \check\ell^c \end{pmatrix} \]
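The propagation model and its Jacobian above can be sketched directly; this is only the $f$ and $\partial f / \partial x$ pieces of the filter, not a full EKF (names are illustrative):

```python
import numpy as np

def geolocation_f(x, p_mav, v_mav):
    """Propagation model for x = (p_obj (3,), L): stationary target, range
    rate driven entirely by MAV motion."""
    p_obj, L = x[:3], x[3]
    Ldot = -(p_obj - p_mav) @ v_mav / L
    return np.concatenate([np.zeros(3), [Ldot]])

def geolocation_F(x, p_mav, v_mav):
    """Jacobian of the propagation model with respect to x."""
    p_obj, L = x[:3], x[3]
    J = np.zeros((4, 4))
    J[3, :3] = -v_mav / L
    J[3, 3] = (p_obj - p_mav) @ v_mav / L**2
    return J
```

Flying straight at a stationary target, the range should close at exactly the ground speed.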
Geolocation Architecture
[Block diagram: GPS measurements pass through a smoother; vision processing and attitude estimation feed the geolocation block, which also interacts with the gimbal.]
Target Motion in Image Plane
The objective is to track targets seen in the image plane. Feature tracking, or other computer vision algorithms, are used to track pixels in the image plane. Feature tracking is noisy, so the pixel location must be low-pass filtered.
Motion of the target in the image plane is due to two sources:
- Self motion, or ego motion, primarily due to rotation of the MAV.
- Relative target motion.
Since we are primarily interested in the relative motion, we must subtract the pixel movement due to ego motion.
Pixel LPF and Differentiation
Define the following:
$\epsilon = (\epsilon_x, \epsilon_y)^\top$: raw pixel measurements
$\bar\epsilon = (\bar\epsilon_x, \bar\epsilon_y)^\top$: filtered pixel location
$\dot{\bar\epsilon} = (\dot{\bar\epsilon}_x, \dot{\bar\epsilon}_y)^\top$: filtered pixel velocity
Basic idea: low-pass filter $\epsilon$ to obtain $\bar\epsilon$, and use a dirty derivative to obtain $\dot{\bar\epsilon}$:
\[ \bar\epsilon(s) = \frac{b}{s + b}\,\epsilon(s), \qquad \dot{\bar\epsilon}(s) = \frac{b\,s}{s + b}\,\epsilon(s), \]
where $b$ is the filter bandwidth.
Digital Approximation
Converting to the z-domain using the Tustin approximation $s \mapsto \frac{2}{T_s}\frac{z-1}{z+1}$ gives
\[ \bar\epsilon[z] = \frac{b}{\frac{2}{T_s}\frac{z-1}{z+1} + b}\,\epsilon[z] = \frac{\frac{b T_s}{2 + b T_s}(z+1)}{z - \frac{2 - b T_s}{2 + b T_s}}\,\epsilon[z], \qquad \dot{\bar\epsilon}[z] = \frac{\frac{2b}{2 + b T_s}(z-1)}{z - \frac{2 - b T_s}{2 + b T_s}}\,\epsilon[z]. \]
Taking the inverse z-transform gives the difference equations
\[ \bar\epsilon[n] = \frac{2 - b T_s}{2 + b T_s}\,\bar\epsilon[n-1] + \frac{b T_s}{2 + b T_s}\left(\epsilon[n] + \epsilon[n-1]\right) \]
\[ \dot{\bar\epsilon}[n] = \frac{2 - b T_s}{2 + b T_s}\,\dot{\bar\epsilon}[n-1] + \frac{2b}{2 + b T_s}\left(\epsilon[n] - \epsilon[n-1]\right), \]
where $\bar\epsilon[0] = \epsilon[0]$ and $\dot{\bar\epsilon}[0] = 0$.
Digital Approximation
Define $\beta = (2 - b T_s)/(2 + b T_s)$, and note that $1 - \beta = 2 b T_s / (2 + b T_s)$. Then
\[ \bar\epsilon[n] = \beta\,\bar\epsilon[n-1] + (1 - \beta)\,\underbrace{\frac{\epsilon[n] + \epsilon[n-1]}{2}}_{\text{average of last two samples}} \]
\[ \dot{\bar\epsilon}[n] = \beta\,\dot{\bar\epsilon}[n-1] + (1 - \beta)\,\underbrace{\frac{\epsilon[n] - \epsilon[n-1]}{T_s}}_{\text{Euler approximation of derivative}}, \]
where $\bar\epsilon[0] = \epsilon[0]$ and $\dot{\bar\epsilon}[0] = 0$.
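The two difference equations above can be sketched as a small stateful filter; a minimal illustration, with the class name and initialization convention assumed:

```python
class PixelFilter:
    """Low-pass filter plus dirty derivative for a noisy pixel coordinate,
    discretized with the Tustin approximation (bandwidth b, sample time Ts)."""

    def __init__(self, b, Ts, eps0=0.0):
        self.beta = (2.0 - b * Ts) / (2.0 + b * Ts)
        self.Ts = Ts
        self.eps_prev = eps0   # previous raw measurement
        self.eps_bar = eps0    # filtered pixel location
        self.eps_dot = 0.0     # filtered pixel velocity

    def update(self, eps):
        b = self.beta
        # LPF of the average of the last two samples
        self.eps_bar = b * self.eps_bar + (1 - b) * (eps + self.eps_prev) / 2
        # LPF of the Euler derivative approximation
        self.eps_dot = b * self.eps_dot + (1 - b) * (eps - self.eps_prev) / self.Ts
        self.eps_prev = eps
        return self.eps_bar, self.eps_dot
```

For a constant input the filtered location converges to that value and the filtered velocity decays to zero, which is an easy sanity check.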
Apparent Motion
Let $\check\ell \triangleq \ell / L = (p_{obj} - p_{MAV}) / \|p_{obj} - p_{MAV}\|$ be the normalized relative position vector. The Coriolis formula (expressed in the camera frame) gives
\[ \underbrace{\frac{d\check\ell^c}{dt_i}}_{\substack{\text{true relative translational motion} \\ \text{between target and MAV}}} = \underbrace{\frac{d\check\ell^c}{dt_c}}_{\substack{\text{motion of target} \\ \text{in image plane}}} + \underbrace{\omega_{c/i}^c \times \check\ell^c}_{\substack{\text{apparent motion due to rotation} \\ \text{of MAV and gimbal}}} \]
Apparent Motion, cont.
Motion of the target in the image plane:
\[ \frac{d\check\ell^c}{dt_c} = \frac{d}{dt_c}\left[ \frac{1}{F} \begin{pmatrix} \epsilon_x \\ \epsilon_y \\ f \end{pmatrix} \right] = \frac{1}{F} \begin{pmatrix} \dot\epsilon_x \\ \dot\epsilon_y \\ 0 \end{pmatrix} - \frac{\dot F}{F^2} \begin{pmatrix} \epsilon_x \\ \epsilon_y \\ f \end{pmatrix}, \qquad \dot F = \frac{\epsilon_x \dot\epsilon_x + \epsilon_y \dot\epsilon_y}{F} \]
\[ = \frac{1}{F^3} \begin{pmatrix} \epsilon_y^2 + f^2 & -\epsilon_x \epsilon_y \\ -\epsilon_x \epsilon_y & \epsilon_x^2 + f^2 \\ -\epsilon_x f & -\epsilon_y f \end{pmatrix} \begin{pmatrix} \dot\epsilon_x \\ \dot\epsilon_y \end{pmatrix} = Z(\epsilon)\,\dot\epsilon, \]
where
\[ Z(\epsilon) \triangleq \frac{1}{F^3} \begin{pmatrix} \epsilon_y^2 + f^2 & -\epsilon_x \epsilon_y \\ -\epsilon_x \epsilon_y & \epsilon_x^2 + f^2 \\ -\epsilon_x f & -\epsilon_y f \end{pmatrix}. \]
Apparent Motion, cont.
Ego motion, $\omega_{c/i}^c \times \check\ell^c$:
\[ \omega_{c/i}^c = \underbrace{\omega_{c/g}^c}_{0} + \underbrace{\omega_{g/b}^c}_{R_g^c R_b^g \begin{pmatrix} -\sin(\alpha_{az})\,\dot\alpha_{el} \\ \cos(\alpha_{az})\,\dot\alpha_{el} \\ \dot\alpha_{az} \end{pmatrix}} + \underbrace{\omega_{b/i}^c}_{R_g^c R_b^g \begin{pmatrix} p \\ q \\ r \end{pmatrix}} \]
Therefore
\[ \dot{\check\ell}_{app}^c \triangleq \omega_{c/i}^c \times \check\ell^c = \frac{1}{F}\left[ R_g^c R_b^g \begin{pmatrix} p - \sin(\alpha_{az})\,\dot\alpha_{el} \\ q + \cos(\alpha_{az})\,\dot\alpha_{el} \\ r + \dot\alpha_{az} \end{pmatrix} \right] \times \begin{pmatrix} \epsilon_x \\ \epsilon_y \\ f \end{pmatrix} \]
Total Motion in Image Plane
\[ \frac{d\check\ell^c}{dt_i} = \underbrace{Z(\epsilon)\,\dot\epsilon}_{\text{target motion}} + \underbrace{\frac{1}{F}\left[ R_g^c R_b^g \begin{pmatrix} p - \sin(\alpha_{az})\,\dot\alpha_{el} \\ q + \cos(\alpha_{az})\,\dot\alpha_{el} \\ r + \dot\alpha_{az} \end{pmatrix} \right] \times \begin{pmatrix} \epsilon_x \\ \epsilon_y \\ f \end{pmatrix}}_{\text{ego motion}} \]
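The two terms above can be sketched in code; a minimal illustration of $Z(\epsilon)$ and the ego-motion term (function names are illustrative):

```python
import numpy as np

def Z_of_eps(eps_x, eps_y, f):
    """3x2 matrix mapping pixel velocity to motion of the unit LOS vector."""
    F3 = (f**2 + eps_x**2 + eps_y**2) ** 1.5
    return np.array([[eps_y**2 + f**2, -eps_x * eps_y],
                     [-eps_x * eps_y,  eps_x**2 + f**2],
                     [-eps_x * f,      -eps_y * f]]) / F3

def apparent_motion(eps_x, eps_y, f, R_c_from_g, R_g_from_b,
                    pqr, az, az_dot, el_dot):
    """Image-plane motion induced by MAV and gimbal rotation (ego motion)."""
    F = np.sqrt(f**2 + eps_x**2 + eps_y**2)
    w = R_c_from_g @ R_g_from_b @ (pqr + np.array([-np.sin(az) * el_dot,
                                                    np.cos(az) * el_dot,
                                                    az_dot]))
    return np.cross(w, np.array([eps_x, eps_y, f])) / F
```

Because $\check\ell$ is a unit vector, both terms must be perpendicular to $\check\ell$; that property makes a good numerical check.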
Time to Collision
For every object in the camera field of view, it is possible to compute the time to collision to that obstacle, defined as the time to collision if the vehicle were to proceed along the line-of-sight vector to the obstacle at the current closing velocity. Therefore, for each obstacle, the time to collision is given by
\[ t_c \triangleq -\frac{L}{\dot L} \]
This is impossible to compute using only a monocular camera because of scale ambiguity. Two techniques:
- Looming: requires a constant target size.
- Flat-earth approximation: requires height above ground.
Time to Collision - Looming
From similar triangles:
\[ \frac{S_{obj}}{L} = \frac{P\epsilon_s}{PF} = \frac{\epsilon_s}{F} \quad\Longrightarrow\quad L = F\,\frac{S_{obj}}{\epsilon_s} \]
If $S_{obj}$ is not changing, then
\[ \frac{\dot L}{L} = \frac{\dot F}{F} - \frac{\dot\epsilon_s}{\epsilon_s} = \frac{\epsilon_x \dot\epsilon_x + \epsilon_y \dot\epsilon_y}{F^2} - \frac{\dot\epsilon_s}{\epsilon_s}, \]
the negative inverse of which is the time to collision $t_c$.
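The looming computation above is a few lines of arithmetic; a minimal sketch (the function name and argument order are illustrative):

```python
def time_to_collision_looming(eps_x, eps_y, eps_x_dot, eps_y_dot,
                              eps_s, eps_s_dot, f):
    """t_c = -L/Ldot via looming: eps_s is the target's size in pixels,
    assumed to grow only because the range is closing."""
    F2 = f**2 + eps_x**2 + eps_y**2
    Ldot_over_L = (eps_x * eps_x_dot + eps_y * eps_y_dot) / F2 - eps_s_dot / eps_s
    return -1.0 / Ldot_over_L
```

For a centered, stationary-in-image target whose pixel size grows at 10% per second, the time to collision is 10 seconds.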
Time to Collision - Flat Earth
Differentiating $L = h / \cos\varphi$ gives
\[ \frac{\dot L}{L} = \frac{\dot h}{h} + \dot\varphi \tan\varphi. \]
From the geometry we have $\cos\varphi = \check\ell_z^i$. Differentiate and solve for $\dot\varphi$:
\[ \dot\varphi = -\frac{1}{\sin\varphi}\,\frac{d}{dt_i}\check\ell_z^i \]
Therefore,
\[ \dot\varphi \tan\varphi = -\frac{1}{\cos\varphi}\,\frac{d}{dt_i}\check\ell_z^i = -\frac{1}{\check\ell_z^i}\,\frac{d}{dt_i}\check\ell_z^i \]
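Combining the flat-earth relations above with $t_c = -L/\dot L$ gives another few-line computation; a minimal sketch under those assumptions:

```python
def time_to_collision_flat_earth(h, h_dot, l_z, l_z_dot):
    """t_c = -L/Ldot with Ldot/L = hdot/h + phidot*tan(phi), where
    phidot*tan(phi) = -(d/dt l_z)/l_z and l_z = cos(phi) is the inertial
    down-component of the unit LOS vector."""
    Ldot_over_L = h_dot / h - l_z_dot / l_z
    return -1.0 / Ldot_over_L
```

In level flight ($\dot h = 0$), a LOS down-component growing at 10% per second gives a 10-second time to collision.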
Precision Landing
For precision landing, we use the guidance model
\[ \dot p_n = V_g \cos\chi \cos\gamma, \qquad \dot p_e = V_g \sin\chi \cos\gamma, \qquad \dot p_d = -V_g \sin\gamma, \qquad \dot\chi = \frac{g}{V_g}\tan\phi. \]
Define $\dot\chi = u_1$ and $\dot\gamma = u_2$. Then
\[ p_{MAV}^i = (p_n, p_e, p_d)^\top, \qquad v_{MAV}^i = (V_g \cos\chi\cos\gamma,\; V_g \sin\chi\cos\gamma,\; -V_g \sin\gamma)^\top, \qquad v_{MAV}^{v2} = (V_g, 0, 0)^\top. \]
Proportional Navigation (PN)
Proportional navigation: align the line-of-sight rate $\dot\ell$ with the negative line-of-sight vector $\ell$. Define
\[ \omega_\perp = \check\ell \times \frac{\dot\ell}{L}, \qquad \dot\ell = v_{obj} - v_{MAV}. \]
Assuming constant velocity, we can only accelerate in the plane perpendicular to the velocity vector. Therefore, to zero $\omega_\perp$, the desired acceleration is
\[ a_{MAV} = N\,\omega_\perp \times v_{MAV}, \]
where $N$ is the (tunable) positive navigation constant.
Acceleration Command
To implement the acceleration command, it must be converted to the vehicle-2 frame as
\[ a_{MAV}^{v2} = \mu N\,\omega_\perp^{v2} \times v_{MAV}^{v2} = \mu N \begin{pmatrix} \omega_{\perp,x}^{v2} \\ \omega_{\perp,y}^{v2} \\ \omega_{\perp,z}^{v2} \end{pmatrix} \times \begin{pmatrix} V_g \\ 0 \\ 0 \end{pmatrix} = \begin{pmatrix} 0 \\ \mu N V_g\,\omega_{\perp,z}^{v2} \\ -\mu N V_g\,\omega_{\perp,y}^{v2} \end{pmatrix}, \]
where
\[ \omega_\perp^{v2} = R_b^{v2} R_g^b R_c^g\, \omega_\perp^c, \qquad \omega_\perp^c = \check\ell^c \times \frac{\dot\ell^c}{L}, \qquad \frac{\dot\ell^c}{L} = \frac{\dot L}{L}\check\ell^c + Z(\epsilon)\,\dot\epsilon + \dot{\check\ell}_{app}^c, \]
and the inverse time to collision $\dot L / L$ can be estimated using looming or the flat-earth model.
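The cross products above can be sketched directly; a minimal illustration of the PN acceleration in the vehicle-2 frame (names and default gains are illustrative):

```python
import numpy as np

def omega_perp(l_hat, l_dot_over_L):
    """LOS-rate component perpendicular to the LOS direction."""
    return np.cross(l_hat, l_dot_over_L)

def pn_accel_v2(omega_perp_v2, Vg, N=3.0, mu=1.0):
    """Vehicle-2-frame PN acceleration a = mu*N*(omega_perp x v), with the
    vehicle-2-frame velocity v = (Vg, 0, 0)."""
    return mu * N * np.cross(omega_perp_v2, np.array([Vg, 0.0, 0.0]))
```

Crossing with $(V_g, 0, 0)^\top$ zeroes the along-track component, matching the closed-form result $(0,\ \mu N V_g \omega_{\perp,z},\ -\mu N V_g \omega_{\perp,y})^\top$.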
Polar Converting Logic
Use polar converting logic to convert the acceleration command $a^{v2}$ into roll-angle and flight-path commands:
\[ \phi^c = \tan^{-1}\!\left( \frac{a_y^{v2}}{-a_z^{v2}} \right), \qquad \dot\gamma^c = \frac{\sqrt{(a_y^{v2})^2 + (a_z^{v2})^2}}{V_g}. \]
Polar Converting Logic, cont.
General rule:
\[ \phi^c = \tan^{-1}\!\left( \frac{a_y^{v2}}{-a_z^{v2}} \right), \qquad \dot\gamma^c = -\operatorname{sign}(a_z^{v2})\,\frac{\sqrt{(a_y^{v2})^2 + (a_z^{v2})^2}}{V_g}. \]
Note the discontinuity in $\phi^c$ at $(a_y^{v2}, a_z^{v2}) = (0, 0)$: when $a_y^{v2} > 0$, $\phi^c \to \pi/2$; when $a_y^{v2} < 0$, $\phi^c \to -\pi/2$. Remove the discontinuity as follows: when $a_z^{v2} \ge 0$,
\[ \phi^c = \sigma(a_y^{v2})\,\tan^{-1}\!\left( \frac{a_y^{v2}}{-a_z^{v2}} \right), \]
where
\[ \sigma(a_y^{v2}) = \operatorname{sign}(a_y^{v2})\,\frac{1 - e^{-k\,|a_y^{v2}|}}{1 + e^{-k\,|a_y^{v2}|}}. \]
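One plausible reading of the blending rule above can be sketched as follows; the blend condition, gain $k$, and function names are assumptions for illustration, not the slides' exact logic:

```python
import numpy as np

def sigma(a_y, k=10.0):
    """Smooth, odd blending function; approaches sign(a_y) away from zero."""
    return np.sign(a_y) * (1.0 - np.exp(-k * abs(a_y))) / (1.0 + np.exp(-k * abs(a_y)))

def polar_convert(a_y, a_z, Vg, k=10.0):
    """Convert a vehicle-2-frame acceleration command (a_y, a_z) into a roll
    command and a flight-path-rate command, blending through the
    discontinuity at (0, 0)."""
    phi_c = np.arctan2(a_y, -a_z)
    if a_z >= 0.0:                 # blend near the discontinuity (assumed condition)
        phi_c = sigma(a_y, k) * phi_c
    gamma_dot_c = -np.sign(a_z) * np.sqrt(a_y**2 + a_z**2) / Vg
    return phi_c, gamma_dot_c
```

A pure pull-up command ($a_y = 0$, $a_z < 0$) should give zero roll and a positive flight-path rate.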