Bias Estimation for Optical Sensor Measurements with Targets of Opportunity


Djedjiga Belfadel, Richard W. Osborne, III and Yaakov Bar-Shalom
Electrical and Computer Engineering, University of Connecticut, Storrs, CT, USA
djedjiga.belfadel@uconn.edu, rosborne@engr.uconn.edu, ybs@engr.uconn.edu

Abstract—This paper provides a solution for bias estimation of multiple passive sensors using common targets of opportunity. The measurements provided by these sensors are assumed time-coincident (synchronous) and perfectly associated. Since these sensors provide only line of sight (LOS) measurements, the formation of a single composite Cartesian measurement, obtained by fusing the LOS measurements from different sensors, is needed to avoid the need for nonlinear filtering. The evaluation of the Cramer-Rao Lower Bound (CRLB) on the covariance of the bias estimates, i.e., the quantification of the available information about the biases, combined with simulations, shows that this method is statistically efficient, even for small sample sizes.

Index Terms—Bias estimation, bias observability, statistical efficiency, CRLB, composite measurements, maximum likelihood.

I. INTRODUCTION

In order to carry out data fusion, registration error correction is crucial in multisensor systems. This requires estimation of the sensor measurement biases. Measurement bias, in target tracking problems, can result from a number of different sources. Some primary sources of bias errors include: measurement biases due to the deterioration of initial sensor calibration over time; attitude errors caused by biases in the gyros of the inertial measurement units of (airborne, seaborne or spaceborne) sensors; and timing errors due to biases in the onboard clock of each sensor platform [8]. The effect of biases introduced in the process of converting sensor measurements from polar (or spherical) coordinates to Cartesian coordinates has been discussed extensively in [2], together with the limits of validity of the standard transformation. One of
the most studied examples of converted measurement tracking involves the conversion from polar or spherical measurements to Cartesian coordinates for use in a linear Kalman filter. If the conversion process is unbiased, the performance of a converted measurement Kalman filter is superior to a mixed coordinate EKF (i.e., target motion in Cartesian coordinates and measurements in polar coordinates) [2]. The approaches for conversion include the conventional conversion, the Unbiased Converted Measurement (UCM), the Modified Unbiased Converted Measurement (MUCM), and the Unscented Transform (UT). Recently, a decorrelated version of the UCM technique (DUCM) has been developed to address both conversion and estimation bias [5], [6]. Another example of biased measurement conversion is the estimation of range-rate from a moving platform. To measure range rate using the Doppler effect, it is necessary to nullify the impact of platform motion. The conventional nullification approach suffers from a similar bias problem as the position measurement conversion [3]. A novel scheme was proposed in [3] and [4] by applying the DUCM technique to own-Doppler nullification to eliminate this bias. Time-varying bias estimation based on a nonlinear least squares formulation and the singular value decomposition using truth data was presented in [8]; however, this work did not discuss the CRLB for bias estimation. An approach using maximum a posteriori (MAP) data association for concurrent bias estimation and data association based on sensor-level track state estimates was proposed in [9] and extended in [10]. For angle-only sensors, imperfect registration leads to LOS angle measurement biases in azimuth and elevation. It is important to correct for these bias errors so that the multiple sensor measurements and/or tracks can be referenced to a common tracking coordinate system (frame). If uncorrected, registration error can lead to large tracking errors and potentially to the formation of multiple tracks
(ghosts) on the same target. In this paper, bias estimation is investigated when only targets of opportunity are available. We assume the sensor locations are known and we estimate their orientation biases. Estimation of sensor biases only was discussed in [7]. The estimation of range and location biases for active sensors was presented in [11]. First, we discuss the bias estimation for synchronous biased sensors. Then we evaluate the corresponding Cramer-Rao lower bound (CRLB) on the covariance of the bias estimates, which is the quantification of the available information on the sensor biases. Section II presents the problem formulation and solution in detail. Section III describes the simulations performed and gives the results. Finally, Section IV gives the conclusions.

II. PROBLEM FORMULATION

The fundamental frame of reference used in this paper is a 3D Cartesian Common Coordinate System (CCS) defined by the orthogonal set of unit vectors (e_x, e_y, e_z). In a multisensor scenario, each sensor platform will typically have a

sensor reference frame associated with it (the measurement frame of the sensor) defined by the orthogonal set of unit vectors (e_ξs, e_ηs, e_ζs). The origin of the measurement frame of the sensor is a translation of the CCS origin, and its axes are rotated with respect to the CCS axes. The rotation between these frames can be described by a set of Euler angles. We will refer to these angles φ_s + Δφ_s, ρ_s + Δρ_s, ψ_s + Δψ_s of sensor s as roll, pitch and yaw, respectively [8], where φ_s is the nominal roll angle, Δφ_s is the roll bias, etc. Each angle defines a rotation about a prescribed axis, in order to align the sensor frame axes with the CCS axes. The xyz rotation sequence is chosen, which is accomplished by first rotating about the x axis by φ_s, then rotating about the y axis by ρ_s, and finally rotating about the z axis by ψ_s. The rotation sequence can be expressed by the matrices

T_s(ψ_s, ρ_s, φ_s) = T_z(ψ_s) T_y(ρ_s) T_x(φ_s)
  [  cos ψ_s  sin ψ_s  0 ]   [ cos ρ_s  0  −sin ρ_s ]   [ 1     0         0     ]
= [ −sin ψ_s  cos ψ_s  0 ] · [    0     1     0     ] · [ 0  cos φ_s   sin φ_s ]
  [     0        0     1 ]   [ sin ρ_s  0   cos ρ_s ]   [ 0  −sin φ_s  cos φ_s ]    (1)

Assume there are S synchronized passive sensors, with known positions in the CCS, ξ_s = [ξ_s, η_s, ζ_s]′, s = 1, 2, ..., S, and N_t targets, located at x_i = [x_i, y_i, z_i]′, i = 1, 2, ..., N_t, in the same CCS. With the previous convention, the operations needed to transform the position of a given target i expressed in the CCS into a position expressed in the coordinate system of sensor s (x_is = [x_is, y_is, z_is]′) can be expressed as

x_is = T(ω_s)(x_i − ξ_s)    i = 1, 2, ..., N_t,  s = 1, 2, ..., S    (2)

where ω_s = [φ_s, ρ_s, ψ_s]′ is the nominal orientation of sensor s, T(ω_s) is the corresponding rotation matrix, and the translation (x_i − ξ_s) is the difference between the position vector of target i and the position vector of sensor s, both expressed in the CCS. As shown in Figure 1, the azimuth angle α_is is the angle in the sensor xz plane between the sensor z axis and the line of sight to the target, while the elevation angle ε_is is the angle
between the line of sight to the target and its projection onto the xz plane, that is

[ α_is ]   [ tan⁻¹( x_is / z_is )              ]
[ ε_is ] = [ tan⁻¹( y_is / √(x_is² + z_is²) ) ]    (3)

Fig. 1. Optical sensor coordinate system with the origin in the center of the focal plane.

The model for the biased noise-free LOS measurements is then

[ α_is^b ]                           [ g_1(x_i, ξ_s, ω_s, b_s) ]
[ ε_is^b ] = g(x_i, ξ_s, ω_s, b_s) = [ g_2(x_i, ξ_s, ω_s, b_s) ]    (4)

where g_1 and g_2 denote the sensor Cartesian coordinates-to-azimuth/elevation angle mapping that can be found by inserting equations (2) and (3) into (4). The bias vector of sensor s is

b_s = [Δφ_s, Δρ_s, Δψ_s]′    (5)

and

θ_is = [x_is, y_is, z_is, Δψ_s, Δρ_s, Δφ_s]′    (6)

denotes the augmented parameter vector for target i and sensor s. For a given target, each sensor provides the noisy LOS measurements

z_is = g(x_i, ξ_s, ω_s, b_s) + w_is    (7)

where

w_is = [w_is^α, w_is^ε]′    (8)

The measurement noises w_is^α and w_is^ε are zero-mean, white, with corresponding standard deviations σ_s^α and σ_s^ε, and are assumed mutually independent. The problem is to estimate the bias vectors for all sensors and obtain composite position measurements in Cartesian coordinates from the LOS measurements, which will then be used in a tracking filter. We shall obtain the maximum likelihood (ML) estimate of the parameter vector

θ = [x_1′, ..., x_{N_t}′, b_1′, ..., b_S′]′    (9)

which contains all the elements of θ_is, i = 1, ..., N_t, s = 1, ..., S, rearranged and transformed into the CCS, by maximizing the likelihood function

Λ(θ) = ∏_{i=1}^{N_t} ∏_{s=1}^{S} p(z_is | θ)    (10)

where

p(z_is | θ) = |2πR_s|^{−1/2} exp( −½ [z_is − h_is(θ)]′ R_s^{−1} [z_is − h_is(θ)] )    (11)

and

h_is(θ) = g(x_i, ξ_s, ω_s, b_s)    (12)
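To make the geometry of (1)–(3) concrete, the chain "rotate into the sensor frame, then map to azimuth/elevation" can be sketched in a few lines of Python (standard library only; the sensor position, target position and angle values below are hypothetical, not taken from the paper's scenarios):

```python
import math

def T_x(phi):   # rotation about the x axis
    c, s = math.cos(phi), math.sin(phi)
    return [[1, 0, 0], [0, c, s], [0, -s, c]]

def T_y(rho):   # rotation about the y axis
    c, s = math.cos(rho), math.sin(rho)
    return [[c, 0, -s], [0, 1, 0], [s, 0, c]]

def T_z(psi):   # rotation about the z axis
    c, s = math.cos(psi), math.sin(psi)
    return [[c, s, 0], [-s, c, 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def los_angles(x_ccs, xi_sensor, psi, rho, phi):
    """Noise-free (azimuth, elevation) of a target at x_ccs, seen by a
    sensor at xi_sensor with orientation angles (psi, rho, phi)."""
    T = matmul(T_z(psi), matmul(T_y(rho), T_x(phi)))            # eq. (1)
    d = [x_ccs[k] - xi_sensor[k] for k in range(3)]
    x, y, z = (sum(T[r][k] * d[k] for k in range(3)) for r in range(3))  # eq. (2)
    az = math.atan2(x, z)                   # eq. (3), quadrant-safe arctan
    el = math.atan2(y, math.hypot(x, z))
    return az, el

# Hypothetical geometry: nominal orientation plus a small roll bias.
az, el = los_angles([1000.0, 500.0, 8000.0], [0.0, 0.0, 0.0],
                    psi=0.1, rho=0.05, phi=0.02 + 0.005)
```

A biased measurement in the sense of (4) is obtained simply by evaluating `los_angles` with the bias added to each nominal angle, as in the last call above.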

The ML estimate (MLE) is then

θ̂_ML = arg max_θ Λ(θ)    (13)

In order to find the MLE, one has to solve a nonlinear least squares problem for the exponent in (11). This will be done using a numerical search via the Iterated Least Squares (ILS) technique [1].

A. Conditions for bias estimability

First condition for bias estimability: For a given target we have 2 measurements from each sensor (the two LOS angles of the target point). We assume that each sensor sees all the targets at a common time.¹ Therefore the number of measurements of N_t targets seen by S sensors is 2 N_t S. Given that the dimension of the position vector of each target and of the bias vector of each sensor is 3, and knowing that the number of measurements has to be at least equal to the number of parameters to be estimated (target positions and biases), we must have

2 N_t S ≥ 3(N_t + S)    (14)

For example, to estimate the biases of 3 sensors (9 bias components) we need 3 targets (9 position components), i.e., the search is in an 18-dimensional space. For 2 sensors (6 bias components) we need at least 6 targets (18 position components) to have complete observability (estimability); in this case a search in a 24-dimensional space is needed.

Second condition for bias estimability: This is the invertibility of the Fisher Information Matrix [1], to be discussed later.

B. Iterated Least Squares

Given the estimate θ̂^j after j iterations, the ILS estimate after the (j+1)th iteration will be

θ̂^{j+1} = θ̂^j + [(H^j)′ R^{−1} H^j]^{−1} (H^j)′ R^{−1} [z − h(θ̂^j)]    (15)

where

z = [z_11′, ..., z_is′, ..., z_{N_t S}′]′    (16)

h(θ̂^j) = [h_11(θ̂^j)′, ..., h_is(θ̂^j)′, ..., h_{N_t S}(θ̂^j)′]′    (17)

R = diag(R_1, R_2, ..., R_S)    (18)

where R_s is the covariance matrix of sensor s, and

H^j = ∂h(θ)/∂θ |_{θ = θ̂^j}    (19)

is the Jacobian matrix of the vector consisting of the stacked measurement functions (17) w.r.t. (9), evaluated at the ILS estimate from the previous iteration j. In this case, the Jacobian matrix is, with the

¹ This can also be the same target at different times, as long as the sensors are synchronized.
iteration index omitted for conciseness,

H = [H_11′ ... H_is′ ... H_{N_t S}′]′    (20)

where H_is is the 2 × 3(N_t + S) block row whose columns correspond to x_1, y_1, z_1, ..., x_{N_t}, y_{N_t}, z_{N_t}, Δψ_1, Δρ_1, Δφ_1, ..., Δψ_S, Δρ_S, Δφ_S,

H_is = [ ∂g_is/∂x_1 ... ∂g_is/∂x_{N_t}   ∂g_is/∂b_1 ... ∂g_is/∂b_S ]    (21)

in which only the blocks corresponding to target i and sensor s are nonzero. The appropriate partial derivatives are given in the Appendix.

C. Initialization

In order to perform the numerical search via ILS, an initial estimate θ̂⁰ is required. Assuming that the biases are null, the LOS measurements from the first and the second sensor, α_i1, α_i2 and ε_i1, can be used to solve for each initial Cartesian target position, in the CCS, as

x_i⁰ = ( ξ_2 − ξ_1 + ζ_1 tan α_i1 − ζ_2 tan α_i2 ) / ( tan α_i1 − tan α_i2 )    (22)

z_i⁰ = ( tan α_i1 (ξ_2 + tan α_i2 (ζ_1 − ζ_2)) − ξ_1 tan α_i2 ) / ( tan α_i1 − tan α_i2 )    (23)

y_i⁰ = η_1 + tan ε_i1 [ (ξ_1 − ξ_2) cos α_i2 + (ζ_2 − ζ_1) sin α_i2 ] / sin(α_i1 − α_i2)    (24)

D. Cramer-Rao Lower Bound

In order to evaluate the efficiency of the estimator, the CRLB must be calculated. The CRLB provides a lower bound on the covariance matrix of an unbiased estimator as [1]

E{(θ − θ̂)(θ − θ̂)′} ≥ J^{−1}    (25)

where J is the Fisher Information Matrix (FIM), θ is the true parameter vector to be estimated, and θ̂ is the estimate. The FIM is

J = E{ [∇_θ ln Λ(θ)] [∇_θ ln Λ(θ)]′ } |_{θ = θ_true}    (26)

where the log-likelihood function is

λ(θ) = ln Λ(θ)    (27)
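The ILS recursion (15) is a Gauss-Newton iteration on the stacked residual. A minimal self-contained sketch (not the paper's implementation: a toy 2D bearings-only localization with three hypothetical observer posts, a finite-difference Jacobian in place of the closed-form partials of the Appendix, and unit measurement variances) illustrates the mechanics:

```python
import math

def ils(theta0, z, h, r_inv, n_iter=10, eps=1e-6):
    """Iterated least squares, eq. (15):
    theta <- theta + (H' R^-1 H)^-1 H' R^-1 [z - h(theta)],
    specialized to a 2-parameter problem, H formed by finite differences."""
    theta = list(theta0)
    m = len(z)
    for _ in range(n_iter):
        base = h(theta)
        res = [z[k] - base[k] for k in range(m)]          # z - h(theta)
        # Finite-difference Jacobian H (m x 2)
        H = [[0.0, 0.0] for _ in range(m)]
        for j in range(2):
            tp = list(theta)
            tp[j] += eps
            hp = h(tp)
            for k in range(m):
                H[k][j] = (hp[k] - base[k]) / eps
        # Normal equations: A = H' R^-1 H,  b = H' R^-1 (z - h)
        A = [[sum(H[k][i] * r_inv[k] * H[k][j] for k in range(m))
              for j in range(2)] for i in range(2)]
        b = [sum(H[k][i] * r_inv[k] * res[k] for k in range(m))
             for i in range(2)]
        det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
        theta[0] += (A[1][1] * b[0] - A[0][1] * b[1]) / det
        theta[1] += (A[0][0] * b[1] - A[1][0] * b[0]) / det
    return theta

# Toy problem: locate a stationary emitter from noise-free bearings
# taken at three known posts (all values hypothetical).
posts = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
truth = (4.0, 7.0)

def h(theta):
    return [math.atan2(theta[1] - py, theta[0] - px) for (px, py) in posts]

z = h(list(truth))
theta_hat = ils([3.0, 6.0], z, h, [1.0] * len(z))
```

Note that the normal-equation matrix A built inside the loop is the toy counterpart of H′R⁻¹H; evaluated at convergence it plays the role of the FIM, whose inverse gives the CRLB on the covariance of θ̂.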

Fig. 2. Scenario 1.
Fig. 3. Scenario 2.

The gradient of the log-likelihood is

∇_θ λ(θ) = Σ_{i=1}^{N_t} Σ_{s=1}^{S} H_is′ R_s^{−1} (z_is − h_is(θ))    (28)

which, when plugged into (26), gives

J = Σ_{i=1}^{N_t} Σ_{s=1}^{S} H_is′ R_s^{−1} H_is |_{θ=θ_true} = H′ R^{−1} H |_{θ=θ_true}    (29)

III. SIMULATIONS

We simulated three optical sensors at various locations observing a target at three (unknown) points in time (which is equivalent to viewing three different targets at unknown locations). Five scenarios of three sensors are examined for a set of target locations; they are shown in Figures 2–6. Each scenario is such that each target position can be observed by all sensors. As discussed in the previous section, the three sensor biases were roll, pitch and yaw angle offsets. The biases for each sensor were set to 5 mrad. We made 100 Monte Carlo runs for each scenario. In order to establish a baseline for evaluating the performance of our algorithm, we also ran the simulations without bias, and with bias but without bias estimation. The horizontal and vertical field of view of each sensor is assumed to be 60°. The measurement noise standard deviation σ_s (identical across sensors for both azimuth and elevation measurements) was assumed to be 34 mrad.

A. Description of the Scenarios

The sensors are assumed to provide LOS angle measurements. We denote by ξ_1, ξ_2, ξ_3 the 3D Cartesian sensor positions, and by x_1, x_2, x_3 the 3D Cartesian target positions (all in the CCS). The three target positions are the same for all the scenarios, and they were chosen from the trajectory of a ballistic target as follows (m):

x_1 = (86, 0, 68)′
x_2 = (359, 0, 815)′
x_3 = (413, 0, 6451)′

Fig. 4. Scenario 3.

Table I summarizes the sensor positions (in m) for the 5 scenarios considered.

B. Statistical efficiency of the estimates

In order to test for the statistical efficiency of the estimate (of the 18-dimensional vector (9)), the normalized estimation error squared (NEES) [1] is used, with the CRLB as the covariance matrix. The sample average NEES over 100 Monte Carlo runs
is given in Table II for all the scenarios. The NEES is calculated using the FIM evaluated at both the true bias values and target positions, as well as at the estimated biases and target positions. According to the CRLB, the FIM has to be evaluated at the true parameter. Since this is not

available in practice, however, it is useful to evaluate the FIM also at the estimated parameter, the only one available in real world implementations [1], [13]. The results are practically identical regardless of which values are chosen for evaluation of the FIM.

TABLE I. Sensor positions (m) for the scenarios considered.
Fig. 5. Scenario 4.
Fig. 6. Scenario 5.
Fig. 7. Sample average NEES over 100 Monte Carlo runs for all 5 scenarios.
Fig. 8. Sample average bias NEES (CRLB evaluated at the estimate), for each of the 9 biases, over 100 Monte Carlo runs for all 5 scenarios.

The 95% probability region for the 100-run sample average NEES of the 18-dimensional parameter vector is [16.84, 19.19]. For all 5 scenarios the estimate is found to be within this interval and is therefore considered a statistically efficient estimate. Figure 7 shows the NEES for all scenarios, and Figure 8 shows the individual bias component NEES for all scenarios. The 95% probability region for the 100-run sample average single component NEES is [0.74, 1.29]. For all 5 scenarios the estimates are found to be within this interval. The RMS position errors for the 3 targets are summarized in Table III. In this table, the first estimation scheme was established as a baseline using bias-free LOS measurements to

estimate the target positions. For the second scheme, we used biased LOS measurements but we only estimated the target positions. In the last scheme, we used biased LOS measurements and we simultaneously estimated the target positions and sensor biases. Bias estimation yields significantly improved target RMS position errors in the presence of biases.

TABLE II. Sample average NEES over 100 Monte Carlo runs for the 5 scenarios (based on 18 × 100 = 1800 degrees of freedom chi-square distribution).
TABLE III. Sample average position RMSE (m) for the 3 targets, over 100 Monte Carlo runs, for the 3 estimation schemes.
Fig. 9. GDOPs for the 5 scenarios considered.

Each component of θ should also be individually consistent with its corresponding σ_CRLB (the square root of the corresponding diagonal element of the inverse of the FIM). In this case, the sample average bias RMSE over 100 Monte Carlo runs should be within 15% of its corresponding bias standard deviation from the CRLB (σ_CRLB) with 97.5% probability. Table IV demonstrates the consistency of the individual bias estimates. To confirm that the bias estimates are unbiased, the average bias error b̄ over 100 Monte Carlo runs, given in Table V, is found to be less than its 95% probability bound, i.e., these estimates are unbiased. In order to examine the statistical efficiency for a variety of target-sensor geometries, the sensor locations were varied from one scenario to another in order to vary the Geometric Dilution of Precision (GDOP), defined as

GDOP = RMSE / ( r √(σ_α² + σ_ε²) )    (30)

where RMSE is the RMS position error for a target location (in the absence of biases), r is the range to the target, and σ_α and σ_ε are the azimuth and elevation measurement error standard deviations, respectively. Figure 9 shows the various GDOP levels in
the 9 target-sensor combinations for which statistical efficiency was confirmed.

IV. CONCLUSIONS

In this paper, we presented an algorithm that uses targets of opportunity for estimation of measurement biases. The first step was formulating a general bias model for synchronized optical sensors. The association of measurements is assumed to be perfect. Based on this, we used an ML approach that led to a nonlinear least-squares estimation scheme for concurrent estimation of the 3D Cartesian positions of the targets and the angle measurement biases of the sensors. The bias estimates, obtained via ILS, were shown to be unbiased and statistically efficient.

APPENDIX

The appropriate partial derivatives of (21) follow from the chain rule: for each component of g_is and for each parameter u ∈ {x_i, y_i, z_i, Δψ_s, Δρ_s, Δφ_s},

∂g_is/∂u = (∂g_is/∂x_is)(∂x_is/∂u) + (∂g_is/∂y_is)(∂y_is/∂u) + (∂g_is/∂z_is)(∂z_is/∂u)    (31)–(42)

TABLE IV. Sample average bias RMSE over 100 Monte Carlo runs and the corresponding bias standard deviation from the CRLB (σ_CRLB), for all configurations (mrad).
TABLE V. Sample average bias error b̄ over 100 Monte Carlo runs for all configurations (mrad) (to confirm that the bias estimates are unbiased).

Given that (2) can be written as

        [ x_is ]                    [ T_s11  T_s12  T_s13 ]   [ x_i − ξ_s ]
x_is =  [ y_is ] = T_s (x_i − ξ_s) = [ T_s21  T_s22  T_s23 ] · [ y_i − η_s ]
        [ z_is ]                    [ T_s31  T_s32  T_s33 ]   [ z_i − ζ_s ]    (43)

one has

x_is = T_s11 (x_i − ξ_s) + T_s12 (y_i − η_s) + T_s13 (z_i − ζ_s)    (44)
y_is = T_s21 (x_i − ξ_s) + T_s22 (y_i − η_s) + T_s23 (z_i − ζ_s)    (45)
z_is = T_s31 (x_i − ξ_s) + T_s32 (y_i − η_s) + T_s33 (z_i − ζ_s)    (46)

therefore

∂x_is/∂x_i = T_s11,  ∂x_is/∂y_i = T_s12,  ∂x_is/∂z_i = T_s13
∂y_is/∂x_i = T_s21,  ∂y_is/∂y_i = T_s22,  ∂y_is/∂z_i = T_s23
∂z_is/∂x_i = T_s31,  ∂z_is/∂y_i = T_s32,  ∂z_is/∂z_i = T_s33    (47)

and, denoting ∂T_skl/∂ψ_s by T_skl^ψ (and similarly for ρ_s and φ_s),

∂x_is/∂Δψ_s = T_s11^ψ (x_i − ξ_s) + T_s12^ψ (y_i − η_s) + T_s13^ψ (z_i − ζ_s)    (48)
∂x_is/∂Δρ_s = T_s11^ρ (x_i − ξ_s) + T_s12^ρ (y_i − η_s) + T_s13^ρ (z_i − ζ_s)    (49)
∂x_is/∂Δφ_s = T_s11^φ (x_i − ξ_s) + T_s12^φ (y_i − η_s) + T_s13^φ (z_i − ζ_s)    (50)
∂y_is/∂Δψ_s = T_s21^ψ (x_i − ξ_s) + T_s22^ψ (y_i − η_s) + T_s23^ψ (z_i − ζ_s)    (51)
∂y_is/∂Δρ_s = T_s21^ρ (x_i − ξ_s) + T_s22^ρ (y_i − η_s) + T_s23^ρ (z_i − ζ_s)    (52)
∂y_is/∂Δφ_s = T_s21^φ (x_i − ξ_s) + T_s22^φ (y_i − η_s) + T_s23^φ (z_i − ζ_s)    (53)
∂z_is/∂Δψ_s = T_s31^ψ (x_i − ξ_s) + T_s32^ψ (y_i − η_s) + T_s33^ψ (z_i − ζ_s)    (54)
∂z_is/∂Δρ_s = T_s31^ρ (x_i − ξ_s) + T_s32^ρ (y_i − η_s) + T_s33^ρ (z_i − ζ_s)    (55)

∂z_is/∂Δφ_s = T_s31^φ (x_i − ξ_s) + T_s32^φ (y_i − η_s) + T_s33^φ (z_i − ζ_s)    (56)

The partial derivatives of g_is = [g_1, g_2]′ with respect to the sensor-frame coordinates are

∂g_1/∂x_is = z_is / (x_is² + z_is²)    (57)
∂g_1/∂y_is = 0    (58)
∂g_1/∂z_is = −x_is / (x_is² + z_is²)    (59)
∂g_2/∂x_is = −x_is y_is / [ √(x_is² + z_is²)(x_is² + y_is² + z_is²) ]    (60)
∂g_2/∂y_is = √(x_is² + z_is²) / (x_is² + y_is² + z_is²)    (61)
∂g_2/∂z_is = −y_is z_is / [ √(x_is² + z_is²)(x_is² + y_is² + z_is²) ]    (62)

The entries of the derivative matrices ∂T_s/∂ψ_s, ∂T_s/∂ρ_s and ∂T_s/∂φ_s (denoted T_skl^ψ, T_skl^ρ, T_skl^φ) are

T_s11^ψ = −sin ψ_s cos ρ_s    (63)
T_s12^ψ = cos ψ_s cos φ_s − sin ψ_s sin ρ_s sin φ_s    (64)
T_s13^ψ = cos ψ_s sin φ_s + sin ψ_s sin ρ_s cos φ_s    (65)
T_s21^ψ = −cos ψ_s cos ρ_s    (66)
T_s22^ψ = −sin ψ_s cos φ_s − cos ψ_s sin ρ_s sin φ_s    (67)
T_s23^ψ = −sin ψ_s sin φ_s + cos ψ_s sin ρ_s cos φ_s    (68)
T_s31^ψ = 0    (69)
T_s32^ψ = 0    (70)
T_s33^ψ = 0    (71)
T_s11^ρ = −cos ψ_s sin ρ_s    (72)
T_s12^ρ = cos ψ_s cos ρ_s sin φ_s    (73)
T_s13^ρ = −cos ψ_s cos ρ_s cos φ_s    (74)
T_s21^ρ = sin ψ_s sin ρ_s    (75)
T_s22^ρ = −sin ψ_s cos ρ_s sin φ_s    (76)
T_s23^ρ = sin ψ_s cos ρ_s cos φ_s    (77)
T_s31^ρ = cos ρ_s    (78)
T_s32^ρ = sin ρ_s sin φ_s    (79)
T_s33^ρ = −sin ρ_s cos φ_s    (80)
T_s11^φ = 0    (81)
T_s12^φ = −sin ψ_s sin φ_s + cos ψ_s sin ρ_s cos φ_s    (82)
T_s13^φ = sin ψ_s cos φ_s + cos ψ_s sin ρ_s sin φ_s    (83)
T_s21^φ = 0    (84)
T_s22^φ = −cos ψ_s sin φ_s − sin ψ_s sin ρ_s cos φ_s    (85)
T_s23^φ = cos ψ_s cos φ_s − sin ψ_s sin ρ_s sin φ_s    (86)
T_s31^φ = 0    (87)
T_s32^φ = −cos ρ_s cos φ_s    (88)
T_s33^φ = −cos ρ_s sin φ_s    (89)

REFERENCES

[1] Y. Bar-Shalom, X.-R. Li, and T. Kirubarajan, Estimation with Applications to Tracking and Navigation: Theory, Algorithms and Software. J. Wiley and Sons, 2001.
[2] Y. Bar-Shalom, P. K. Willett, and X. Tian, Tracking and Data Fusion. YBS Publishing, 2011.
[3] S. V. Bordonaro, P. Willett, and Y. Bar-Shalom, "Tracking with Converted Position and Doppler Measurements," in Proc. SPIE Conf. Signal and Data Processing of Small Targets, #8137-1, San Diego, CA, Aug. 2011.
[4] S. V. Bordonaro, P. Willett, and Y. Bar-Shalom, "Consistent Linear Tracker with Position and Range Rate Measurements," in Proc. Asilomar Conf., Asilomar, CA, Nov. 2012.
[5] S. V. Bordonaro, P. Willett, and Y. Bar-Shalom, "Unbiased Tracking with Converted Measurements," in Proc. 2012 IEEE Radar Conference, Atlanta, GA, May 2012.
[6] S. V. Bordonaro, P. Willett, and Y. Bar-Shalom, "Unbiased Tracking with Converted Measurements,"
in Proc. 51st IEEE Conf. on Decision and Control, Maui, HI, Dec. 2012.
[7] D. F. Crouse, R. Osborne, III, K. Pattipati, P. Willett, and Y. Bar-Shalom, "2D Location Estimation of Angle-Only Arrays Using Targets of Opportunity," in Proc. 13th International Conference on Information Fusion, Edinburgh, Scotland, July 2010.
[8] B. D. Kragel, S. Danford, S. M. Herman, and A. B. Poore, "Bias Estimation Using Targets of Opportunity," in Proc. SPIE Conf. on Signal and Data Processing of Small Targets, #6699, Aug. 2007.
[9] B. D. Kragel, S. Danford, S. M. Herman, and A. B. Poore, "Joint MAP Bias Estimation and Data Association: Algorithms," in Proc. SPIE Conf. on Signal and Data Processing of Small Targets, #6699-1E, Aug. 2007.
[10] B. D. Kragel, S. Danford, and A. B. Poore, "Concurrent MAP Data Association and Absolute Bias Estimation with an Arbitrary Number of Sensors," in Proc. SPIE Conf. on Signal and Data Processing of Small Targets, #6969-5, May 2008.
[11] X. Lin, Y. Bar-Shalom, and T. Kirubarajan, "Exact Multisensor Dynamic Bias Estimation with Local Tracks," IEEE Trans. on Aerospace and Electronic Systems, vol. 40, no. 2, April 2004.
[12] R. W. Osborne, III, and Y. Bar-Shalom, "Statistical Efficiency of Composite Position Measurements from Passive Sensors," in Proc. SPIE Conf. on Signal Proc., Sensor Fusion, and Target Recognition, #85-7, Orlando, FL, April 2011.
[13] R. W. Osborne, III, and Y. Bar-Shalom, "Statistical Efficiency of Composite Position Measurements from Passive Sensors," IEEE Trans. on Aerospace and Electronic Systems, to appear, 2013.


More information

Computer Vision cmput 428/615

Computer Vision cmput 428/615 Computer Vision cmput 428/615 Basic 2D and 3D geometry and Camera models Martin Jagersand The equation of projection Intuitively: How do we develop a consistent mathematical framework for projection calculations?

More information

Array Shape Tracking Using Active Sonar Reverberation

Array Shape Tracking Using Active Sonar Reverberation Lincoln Laboratory ASAP-2003 Worshop Array Shape Tracing Using Active Sonar Reverberation Vijay Varadarajan and Jeffrey Kroli Due University Department of Electrical and Computer Engineering Durham, NC

More information

METR 4202: Advanced Control & Robotics

METR 4202: Advanced Control & Robotics Position & Orientation & State t home with Homogenous Transformations METR 4202: dvanced Control & Robotics Drs Surya Singh, Paul Pounds, and Hanna Kurniawati Lecture # 2 July 30, 2012 metr4202@itee.uq.edu.au

More information

A Non-Iterative Approach to Frequency Estimation of a Complex Exponential in Noise by Interpolation of Fourier Coefficients

A Non-Iterative Approach to Frequency Estimation of a Complex Exponential in Noise by Interpolation of Fourier Coefficients A on-iterative Approach to Frequency Estimation of a Complex Exponential in oise by Interpolation of Fourier Coefficients Shahab Faiz Minhas* School of Electrical and Electronics Engineering University

More information

ξ ν ecliptic sun m λ s equator

ξ ν ecliptic sun m λ s equator Attitude parameterization for GAIA L. Lindegren (1 July ) SAG LL Abstract. The GAIA attaitude may be described by four continuous functions of time, q1(t), q(t), q(t), q4(t), which form a quaternion of

More information

Geometric transformations in 3D and coordinate frames. Computer Graphics CSE 167 Lecture 3

Geometric transformations in 3D and coordinate frames. Computer Graphics CSE 167 Lecture 3 Geometric transformations in 3D and coordinate frames Computer Graphics CSE 167 Lecture 3 CSE 167: Computer Graphics 3D points as vectors Geometric transformations in 3D Coordinate frames CSE 167, Winter

More information

Geometrical Feature Extraction Using 2D Range Scanner

Geometrical Feature Extraction Using 2D Range Scanner Geometrical Feature Extraction Using 2D Range Scanner Sen Zhang Lihua Xie Martin Adams Fan Tang BLK S2, School of Electrical and Electronic Engineering Nanyang Technological University, Singapore 639798

More information

Optimization-Based Calibration of a Triaxial Accelerometer-Magnetometer

Optimization-Based Calibration of a Triaxial Accelerometer-Magnetometer 2005 American Control Conference June 8-10, 2005. Portland, OR, USA ThA07.1 Optimization-Based Calibration of a Triaxial Accelerometer-Magnetometer Erin L. Renk, Walter Collins, Matthew Rizzo, Fuju Lee,

More information

Robotics (Kinematics) Winter 1393 Bonab University

Robotics (Kinematics) Winter 1393 Bonab University Robotics () Winter 1393 Bonab University : most basic study of how mechanical systems behave Introduction Need to understand the mechanical behavior for: Design Control Both: Manipulators, Mobile Robots

More information

Camera Model and Calibration

Camera Model and Calibration Camera Model and Calibration Lecture-10 Camera Calibration Determine extrinsic and intrinsic parameters of camera Extrinsic 3D location and orientation of camera Intrinsic Focal length The size of the

More information

Graphics and Interaction Transformation geometry and homogeneous coordinates

Graphics and Interaction Transformation geometry and homogeneous coordinates 433-324 Graphics and Interaction Transformation geometry and homogeneous coordinates Department of Computer Science and Software Engineering The Lecture outline Introduction Vectors and matrices Translation

More information

Introduction to Robotics

Introduction to Robotics Université de Strasbourg Introduction to Robotics Bernard BAYLE, 2013 http://eavr.u-strasbg.fr/ bernard Modelling of a SCARA-type robotic manipulator SCARA-type robotic manipulators: introduction SCARA-type

More information

Passive Differential Matched-field Depth Estimation of Moving Acoustic Sources

Passive Differential Matched-field Depth Estimation of Moving Acoustic Sources Lincoln Laboratory ASAP-2001 Workshop Passive Differential Matched-field Depth Estimation of Moving Acoustic Sources Shawn Kraut and Jeffrey Krolik Duke University Department of Electrical and Computer

More information

COMP30019 Graphics and Interaction Transformation geometry and homogeneous coordinates

COMP30019 Graphics and Interaction Transformation geometry and homogeneous coordinates COMP30019 Graphics and Interaction Transformation geometry and homogeneous coordinates Department of Computer Science and Software Engineering The Lecture outline Introduction Vectors and matrices Translation

More information

Video integration in a GNSS/INS hybridization architecture for approach and landing

Video integration in a GNSS/INS hybridization architecture for approach and landing Author manuscript, published in "IEEE/ION PLANS 2014, Position Location and Navigation Symposium, Monterey : United States (2014)" Video integration in a GNSS/INS hybridization architecture for approach

More information

Subspace Clustering with Global Dimension Minimization And Application to Motion Segmentation

Subspace Clustering with Global Dimension Minimization And Application to Motion Segmentation Subspace Clustering with Global Dimension Minimization And Application to Motion Segmentation Bryan Poling University of Minnesota Joint work with Gilad Lerman University of Minnesota The Problem of Subspace

More information

Autonomous Navigation for Flying Robots

Autonomous Navigation for Flying Robots Computer Vision Group Prof. Daniel Cremers Autonomous Navigation for Flying Robots Lecture 3.1: 3D Geometry Jürgen Sturm Technische Universität München Points in 3D 3D point Augmented vector Homogeneous

More information

Error Simulation and Multi-Sensor Data Fusion

Error Simulation and Multi-Sensor Data Fusion Error Simulation and Multi-Sensor Data Fusion AERO4701 Space Engineering 3 Week 6 Last Week Looked at the problem of attitude determination for satellites Examined several common methods such as inertial

More information

Task selection for control of active vision systems

Task selection for control of active vision systems The 29 IEEE/RSJ International Conference on Intelligent Robots and Systems October -5, 29 St. Louis, USA Task selection for control of active vision systems Yasushi Iwatani Abstract This paper discusses

More information

Planar Robot Kinematics

Planar Robot Kinematics V. Kumar lanar Robot Kinematics The mathematical modeling of spatial linkages is quite involved. t is useful to start with planar robots because the kinematics of planar mechanisms is generally much simpler

More information

Tightly-Integrated Visual and Inertial Navigation for Pinpoint Landing on Rugged Terrains

Tightly-Integrated Visual and Inertial Navigation for Pinpoint Landing on Rugged Terrains Tightly-Integrated Visual and Inertial Navigation for Pinpoint Landing on Rugged Terrains PhD student: Jeff DELAUNE ONERA Director: Guy LE BESNERAIS ONERA Advisors: Jean-Loup FARGES Clément BOURDARIAS

More information

Chapters 1 9: Overview

Chapters 1 9: Overview Chapters 1 9: Overview Chapter 1: Introduction Chapters 2 4: Data acquisition Chapters 5 9: Data manipulation Chapter 5: Vertical imagery Chapter 6: Image coordinate measurements and refinements Chapters

More information

Position Error Reduction of Kinematic Mechanisms Using Tolerance Analysis and Cost Function

Position Error Reduction of Kinematic Mechanisms Using Tolerance Analysis and Cost Function Position Error Reduction of Kinematic Mechanisms Using Tolerance Analysis and Cost Function B.Moetakef-Imani, M.Pour Department of Mechanical Engineering, Faculty of Engineering, Ferdowsi University of

More information

Chapter 6. Curves and Surfaces. 6.1 Graphs as Surfaces

Chapter 6. Curves and Surfaces. 6.1 Graphs as Surfaces Chapter 6 Curves and Surfaces In Chapter 2 a plane is defined as the zero set of a linear function in R 3. It is expected a surface is the zero set of a differentiable function in R n. To motivate, graphs

More information

Flexible Calibration of a Portable Structured Light System through Surface Plane

Flexible Calibration of a Portable Structured Light System through Surface Plane Vol. 34, No. 11 ACTA AUTOMATICA SINICA November, 2008 Flexible Calibration of a Portable Structured Light System through Surface Plane GAO Wei 1 WANG Liang 1 HU Zhan-Yi 1 Abstract For a portable structured

More information

Nonlinear Filtering with IMM Algorithm for Coastal Radar Target Tracking System

Nonlinear Filtering with IMM Algorithm for Coastal Radar Target Tracking System TELKOMNIKA, Vol.13, No.1, March 2015, pp. 211~220 ISSN: 1693-6930, accredited A by DIKTI, Decree No: 58/DIKTI/Kep/2013 DOI: 10.12928/TELKOMNIKA.v13i1.791 211 Nonlinear Filtering with IMM Algorithm for

More information

Quaternion-Based Tracking of Multiple Objects in Synchronized Videos

Quaternion-Based Tracking of Multiple Objects in Synchronized Videos Quaternion-Based Tracking of Multiple Objects in Synchronized Videos Quming Zhou 1, Jihun Park 2, and J.K. Aggarwal 1 1 Department of Electrical and Computer Engineering The University of Texas at Austin

More information

To do this the end effector of the robot must be correctly positioned relative to the work piece.

To do this the end effector of the robot must be correctly positioned relative to the work piece. Spatial Descriptions and Transformations typical robotic task is to grasp a work piece supplied by a conveyer belt or similar mechanism in an automated manufacturing environment, transfer it to a new position

More information

Geometric Transformations

Geometric Transformations Geometric Transformations CS 4620 Lecture 9 2017 Steve Marschner 1 A little quick math background Notation for sets, functions, mappings Linear and affine transformations Matrices Matrix-vector multiplication

More information

Extraction of surface normal and index of refraction using a pair of passive infrared polarimetric sensors

Extraction of surface normal and index of refraction using a pair of passive infrared polarimetric sensors Extraction of surface normal and index of refraction using a pair of passive infrared polarimetric sensors Firooz Sadjadi Lockheed Martin Corporation Saint Paul, Minnesota firooz.sadjadi@ieee.org Farzad

More information

An Efficient Method for Solving the Direct Kinematics of Parallel Manipulators Following a Trajectory

An Efficient Method for Solving the Direct Kinematics of Parallel Manipulators Following a Trajectory An Efficient Method for Solving the Direct Kinematics of Parallel Manipulators Following a Trajectory Roshdy Foaad Abo-Shanab Kafr Elsheikh University/Department of Mechanical Engineering, Kafr Elsheikh,

More information

Camera Parameters Estimation from Hand-labelled Sun Sositions in Image Sequences

Camera Parameters Estimation from Hand-labelled Sun Sositions in Image Sequences Camera Parameters Estimation from Hand-labelled Sun Sositions in Image Sequences Jean-François Lalonde, Srinivasa G. Narasimhan and Alexei A. Efros {jlalonde,srinivas,efros}@cs.cmu.edu CMU-RI-TR-8-32 July

More information

State-space models for 3D visual tracking

State-space models for 3D visual tracking SA-1 State-space models for 3D visual tracking Doz. G. Bleser Prof. Stricker Computer Vision: Object and People Tracking Example applications Head tracking, e.g. for mobile seating buck Object tracking,

More information

MTRX4700 Experimental Robotics

MTRX4700 Experimental Robotics MTRX 4700 : Experimental Robotics Lecture 2 Stefan B. Williams Slide 1 Course Outline Week Date Content Labs Due Dates 1 5 Mar Introduction, history & philosophy of robotics 2 12 Mar Robot kinematics &

More information

Perspective Projection in Homogeneous Coordinates

Perspective Projection in Homogeneous Coordinates Perspective Projection in Homogeneous Coordinates Carlo Tomasi If standard Cartesian coordinates are used, a rigid transformation takes the form X = R(X t) and the equations of perspective projection are

More information

Robust Kernel Methods in Clustering and Dimensionality Reduction Problems

Robust Kernel Methods in Clustering and Dimensionality Reduction Problems Robust Kernel Methods in Clustering and Dimensionality Reduction Problems Jian Guo, Debadyuti Roy, Jing Wang University of Michigan, Department of Statistics Introduction In this report we propose robust

More information

CSE 252B: Computer Vision II

CSE 252B: Computer Vision II CSE 252B: Computer Vision II Lecturer: Serge Belongie Scribe: Sameer Agarwal LECTURE 1 Image Formation 1.1. The geometry of image formation We begin by considering the process of image formation when a

More information

Smartphone Video Guidance Sensor for Small Satellites

Smartphone Video Guidance Sensor for Small Satellites SSC13-I-7 Smartphone Video Guidance Sensor for Small Satellites Christopher Becker, Richard Howard, John Rakoczy NASA Marshall Space Flight Center Mail Stop EV42, Huntsville, AL 35812; 256-544-0114 christophermbecker@nasagov

More information

Simultaneous Vanishing Point Detection and Camera Calibration from Single Images

Simultaneous Vanishing Point Detection and Camera Calibration from Single Images Simultaneous Vanishing Point Detection and Camera Calibration from Single Images Bo Li, Kun Peng, Xianghua Ying, and Hongbin Zha The Key Lab of Machine Perception (Ministry of Education), Peking University,

More information

Multi-focus image fusion using de-noising and sharpness criterion

Multi-focus image fusion using de-noising and sharpness criterion Multi-focus image fusion using de-noising and sharpness criterion Sukhdip Kaur, M.Tech (research student) Department of Computer Science Guru Nanak Dev Engg. College Ludhiana, Punjab, INDIA deep.sept23@gmail.com

More information

A New Algorithm for Measuring and Optimizing the Manipulability Index

A New Algorithm for Measuring and Optimizing the Manipulability Index DOI 10.1007/s10846-009-9388-9 A New Algorithm for Measuring and Optimizing the Manipulability Index Ayssam Yehia Elkady Mohammed Mohammed Tarek Sobh Received: 16 September 2009 / Accepted: 27 October 2009

More information

Inverse Kinematics Analysis for Manipulator Robot With Wrist Offset Based On the Closed-Form Algorithm

Inverse Kinematics Analysis for Manipulator Robot With Wrist Offset Based On the Closed-Form Algorithm Inverse Kinematics Analysis for Manipulator Robot With Wrist Offset Based On the Closed-Form Algorithm Mohammed Z. Al-Faiz,MIEEE Computer Engineering Dept. Nahrain University Baghdad, Iraq Mohammed S.Saleh

More information

Lecture Note 3: Rotational Motion

Lecture Note 3: Rotational Motion ECE5463: Introduction to Robotics Lecture Note 3: Rotational Motion Prof. Wei Zhang Department of Electrical and Computer Engineering Ohio State University Columbus, Ohio, USA Spring 2018 Lecture 3 (ECE5463

More information

Centre for Autonomous Systems

Centre for Autonomous Systems Robot Henrik I Centre for Autonomous Systems Kungl Tekniska Högskolan hic@kth.se 27th April 2005 Outline 1 duction 2 Kinematic and Constraints 3 Mobile Robot 4 Mobile Robot 5 Beyond Basic 6 Kinematic 7

More information

Accurate Computation of Quaternions from Rotation Matrices

Accurate Computation of Quaternions from Rotation Matrices Accurate Computation of Quaternions from Rotation Matrices Soheil Sarabandi and Federico Thomas Institut de Robòtica i Informàtica Industrial (CSIC-UPC) Llorens Artigas 4-6, 0808 Barcelona, Spain soheil.sarabandi@gmail.com,

More information

Math D Printing Group Final Report

Math D Printing Group Final Report Math 4020 3D Printing Group Final Report Gayan Abeynanda Brandon Oubre Jeremy Tillay Dustin Wright Wednesday 29 th April, 2015 Introduction 3D printers are a growing technology, but there is currently

More information

Epipolar Geometry and the Essential Matrix

Epipolar Geometry and the Essential Matrix Epipolar Geometry and the Essential Matrix Carlo Tomasi The epipolar geometry of a pair of cameras expresses the fundamental relationship between any two corresponding points in the two image planes, and

More information

10. Cartesian Trajectory Planning for Robot Manipulators

10. Cartesian Trajectory Planning for Robot Manipulators V. Kumar 0. Cartesian rajectory Planning for obot Manipulators 0.. Introduction Given a starting end effector position and orientation and a goal position and orientation we want to generate a smooth trajectory

More information

Introduction to quaternions. Mathematics. Operations

Introduction to quaternions. Mathematics. Operations Introduction to quaternions Topics: Definition Mathematics Operations Euler Angles (optional) intro to quaternions 1 noel.h.hughes@gmail.com Euler's Theorem y y Angle! rotation follows right hand rule

More information

DD2429 Computational Photography :00-19:00

DD2429 Computational Photography :00-19:00 . Examination: DD2429 Computational Photography 202-0-8 4:00-9:00 Each problem gives max 5 points. In order to pass you need about 0-5 points. You are allowed to use the lecture notes and standard list

More information

(1) and s k ωk. p k vk q

(1) and s k ωk. p k vk q Sensing and Perception: Localization and positioning Isaac Sog Project Assignment: GNSS aided INS In this project assignment you will wor with a type of navigation system referred to as a global navigation

More information

Calibration and Noise Identification of a Rolling Shutter Camera and a Low-Cost Inertial Measurement Unit

Calibration and Noise Identification of a Rolling Shutter Camera and a Low-Cost Inertial Measurement Unit sensors Article Calibration and Noise Identification of a Rolling Shutter Camera and a Low-Cost Inertial Measurement Unit Chang-Ryeol Lee 1 ID, Ju Hong Yoon 2 and Kuk-Jin Yoon 3, * 1 School of Electrical

More information

Camera Calibration for Video See-Through Head-Mounted Display. Abstract. 1.0 Introduction. Mike Bajura July 7, 1993

Camera Calibration for Video See-Through Head-Mounted Display. Abstract. 1.0 Introduction. Mike Bajura July 7, 1993 Camera Calibration for Video See-Through Head-Mounted Display Mike Bajura July 7, 1993 Abstract This report describes a method for computing the parameters needed to model a television camera for video

More information

CS612 - Algorithms in Bioinformatics

CS612 - Algorithms in Bioinformatics Fall 2017 Structural Manipulation November 22, 2017 Rapid Structural Analysis Methods Emergence of large structural databases which do not allow manual (visual) analysis and require efficient 3-D search

More information

Vehicle Tracking with Heading Estimation using a Mono Camera System

Vehicle Tracking with Heading Estimation using a Mono Camera System Master of Science Thesis in Electrical Engineering Department of Electrical Engineering, Linköping University, 218 Vehicle Tracking with Heading Estimation using a Mono Camera System Fredrik Nilsson Master

More information

Outline. Target Tracking: Lecture 1 Course Info + Introduction to TT. Course Info. Course Info. Course info Introduction to Target Tracking

Outline. Target Tracking: Lecture 1 Course Info + Introduction to TT. Course Info. Course Info. Course info Introduction to Target Tracking REGLERTEKNIK Outline AUTOMATIC CONTROL Target Tracking: Lecture 1 Course Info + Introduction to TT Emre Özkan emre@isy.liu.se Course info Introduction to Target Tracking Division of Automatic Control Department

More information

Short on camera geometry and camera calibration

Short on camera geometry and camera calibration Short on camera geometry and camera calibration Maria Magnusson, maria.magnusson@liu.se Computer Vision Laboratory, Department of Electrical Engineering, Linköping University, Sweden Report No: LiTH-ISY-R-3070

More information

Spatter Tracking in Laser Machining

Spatter Tracking in Laser Machining Spatter Tracking in Laser Machining No Author Given No Institute Given Abstract. In laser drilling, an assist gas is often used to remove material from the drilling point. In order to design assist gas

More information

Contents. 1 Introduction Background Organization Features... 7

Contents. 1 Introduction Background Organization Features... 7 Contents 1 Introduction... 1 1.1 Background.... 1 1.2 Organization... 2 1.3 Features... 7 Part I Fundamental Algorithms for Computer Vision 2 Ellipse Fitting... 11 2.1 Representation of Ellipses.... 11

More information

Adaptive Doppler centroid estimation algorithm of airborne SAR

Adaptive Doppler centroid estimation algorithm of airborne SAR Adaptive Doppler centroid estimation algorithm of airborne SAR Jian Yang 1,2a), Chang Liu 1, and Yanfei Wang 1 1 Institute of Electronics, Chinese Academy of Sciences 19 North Sihuan Road, Haidian, Beijing

More information

INCREMENTAL DISPLACEMENT ESTIMATION METHOD FOR VISUALLY SERVOED PARIED STRUCTURED LIGHT SYSTEM (ViSP)

INCREMENTAL DISPLACEMENT ESTIMATION METHOD FOR VISUALLY SERVOED PARIED STRUCTURED LIGHT SYSTEM (ViSP) Blucher Mechanical Engineering Proceedings May 2014, vol. 1, num. 1 www.proceedings.blucher.com.br/evento/10wccm INCREMENAL DISPLACEMEN ESIMAION MEHOD FOR VISUALLY SERVOED PARIED SRUCURED LIGH SYSEM (ViSP)

More information

Robots are built to accomplish complex and difficult tasks that require highly non-linear motions.

Robots are built to accomplish complex and difficult tasks that require highly non-linear motions. Path and Trajectory specification Robots are built to accomplish complex and difficult tasks that require highly non-linear motions. Specifying the desired motion to achieve a specified goal is often a

More information

Forward Time-of-Flight Geometry for CLAS12

Forward Time-of-Flight Geometry for CLAS12 Forward Time-of-Flight Geometry for CLAS12 D.S. Carman, Jefferson Laboratory ftof geom.tex April 13, 2016 Abstract This document details the nominal geometry for the CLAS12 Forward Time-of- Flight System

More information

High-precision, consistent EKF-based visual-inertial odometry

High-precision, consistent EKF-based visual-inertial odometry High-precision, consistent EKF-based visual-inertial odometry Mingyang Li and Anastasios I. Mourikis, IJRR 2013 Ao Li Introduction What is visual-inertial odometry (VIO)? The problem of motion tracking

More information

Inertial Measurement Units I!

Inertial Measurement Units I! ! Inertial Measurement Units I! Gordon Wetzstein! Stanford University! EE 267 Virtual Reality! Lecture 9! stanford.edu/class/ee267/!! Lecture Overview! coordinate systems (world, body/sensor, inertial,

More information

arxiv:hep-ex/ v1 24 Jun 1994

arxiv:hep-ex/ v1 24 Jun 1994 MULTIPLE SCATTERING ERROR PROPAGATION IN PARTICLE TRACK RECONSTRUCTION M. Penţia, G. Iorgovan INSTITUTE OF ATOMIC PHYSICS RO-76900 P.O.Box MG-6, Bucharest, ROMANIA e-mail: pentia@roifa.bitnet arxiv:hep-ex/9406006v1

More information

Spatial Alignment of Passive Radar and Infrared Image Data on Seeker

Spatial Alignment of Passive Radar and Infrared Image Data on Seeker Available online at www.sciencedirect.com Procedia Engineering 4 (011) 649 653 011 International Conference on Advances in Engineering Spatial Alignment of Passive Radar and Infrared Image Data on Seeker

More information

3D Tracking Using Two High-Speed Vision Systems

3D Tracking Using Two High-Speed Vision Systems 3D Tracking Using Two High-Speed Vision Systems Yoshihiro NAKABO 1, Idaku ISHII 2, Masatoshi ISHIKAWA 3 1 University of Tokyo, Tokyo, Japan, nakabo@k2.t.u-tokyo.ac.jp 2 Tokyo University of Agriculture

More information

Machine learning based automatic extrinsic calibration of an onboard monocular camera for driving assistance applications on smart mobile devices

Machine learning based automatic extrinsic calibration of an onboard monocular camera for driving assistance applications on smart mobile devices Technical University of Cluj-Napoca Image Processing and Pattern Recognition Research Center www.cv.utcluj.ro Machine learning based automatic extrinsic calibration of an onboard monocular camera for driving

More information

Visualisation Pipeline : The Virtual Camera

Visualisation Pipeline : The Virtual Camera Visualisation Pipeline : The Virtual Camera The Graphics Pipeline 3D Pipeline The Virtual Camera The Camera is defined by using a parallelepiped as a view volume with two of the walls used as the near

More information

Egomotion Estimation by Point-Cloud Back-Mapping

Egomotion Estimation by Point-Cloud Back-Mapping Egomotion Estimation by Point-Cloud Back-Mapping Haokun Geng, Radu Nicolescu, and Reinhard Klette Department of Computer Science, University of Auckland, New Zealand hgen001@aucklanduni.ac.nz Abstract.

More information

Hw 4 Due Feb 22. D(fg) x y z (

Hw 4 Due Feb 22. D(fg) x y z ( Hw 4 Due Feb 22 2.2 Exercise 7,8,10,12,15,18,28,35,36,46 2.3 Exercise 3,11,39,40,47(b) 2.4 Exercise 6,7 Use both the direct method and product rule to calculate where f(x, y, z) = 3x, g(x, y, z) = ( 1

More information