Metrological Characterization of a Vision-Based System for Relative Pose Measurements with Fiducial Marker Mapping for Spacecrafts


Article

Metrological Characterization of a Vision-Based System for Relative Pose Measurements with Fiducial Marker Mapping for Spacecrafts

Marco Pertile 1,2,*, Sebastiano Chiodini 1, Riccardo Giubilato 1, Mattia Mazzucato 1, Andrea Valmorbida 2, Alberto Fornaser 3, Stefano Debei 1,2 and Enrico C. Lorenzini 1,2

1 CISAS G. Colombo, University of Padova, Padova, Italy; sebastiano.chiodini@unipd.it (S.C.); riccardo.giubilato@gmail.com (R.G.); matt.mazzucato@gmail.com (M.M.); stefano.debei@unipd.it (S.D.); enrico.lorenzini@unipd.it (E.C.L.)
2 Industrial Engineering Department, University of Padova, Padova, Italy; andrea.valmorbida@unipd.it
3 Industrial Engineering Department, University of Trento, Povo (TN), Italy; alberto.fornaser@unitn.it
* Correspondence: marco.pertile@unipd.it

Received: 13 June 2018; Accepted: 6 August 2018; Published: 14 August 2018

Abstract: An improved approach for the measurement of the relative pose between a target and a chaser spacecraft is presented. The selected method is based on a single camera, which can be mounted on the chaser, and a plurality of fiducial markers, which can be mounted on the external surface of the target. The measurement procedure comprises a closed-form solution of the Perspective from n Points (PnP) problem, a RANdom SAmple Consensus (RANSAC) procedure, a non-linear local optimization and a global Bundle Adjustment refinement of the marker map and of the relative poses. A metrological characterization of the measurement system is performed using an experimental set-up that can impose rotations combined with a linear translation and can measure them. The rotation and position measurement errors are calculated with reference instrumentations and their uncertainties are evaluated by the Monte Carlo method. The experimental laboratory tests highlight the significant improvements provided by the Bundle Adjustment refinement. Moreover, a set of possible influencing physical parameters is defined and their correlations with the rotation and position errors and uncertainties are analyzed. Using both numerical quantitative correlation coefficients and qualitative graphical representations, the most significant parameters for the final measurement errors and uncertainties are determined. The obtained results give clear indications and advice for the design of future measurement systems and for the selection of the marker positioning on a satellite surface.

Keywords: vision system; pose measurement; uncertainty evaluation; metrological calibration

1. Introduction

There are two main applications that require an accurate measurement of the relative pose (position and orientation) between two spacecraft: autonomous rendezvous and docking for on-orbit servicing, and formation flight of two or more spacecraft. The first example of relative pose measurement in space using a vision system in the docking phase was provided by the Proximity Operation Sensor (PXS), on board the 7th mission of the Engineering Test Satellite Program (ETS-VII) launched in 1997. The measurement system comprises a single visible camera mounted on the chaser spacecraft and seven non-coplanar, passive, round-shaped markers placed near the docking interface of the target spacecraft; see Reference [1] for details. The PXS is an example of cooperative pose measurement and exhibits centimetric accuracy in relative position on a measurement range up to 3 m. In May 2007, a fully

autonomous rendezvous and capture was successfully performed by DARPA's Orbital Express (OE) mission [2,3], whose suite of sensors, called the Autonomous Rendezvous and Capture Sensor System, comprises the Advanced Video Guidance Sensor (AVGS) and a laser rangefinder. The AVGS is a laser-based vision system, capable of a full six degrees-of-freedom relative pose measurement at near ranges, and employs a monocular camera mounted on the chaser and a set of markers (Corner Cube Reflectors) mounted on the target spacecraft at known positions. To improve the markers' visibility, the chaser spacecraft comprises two sets of laser diodes operating at two different wavelengths. The reflectors are conceived to reflect only one of the two wavelengths. A first image is acquired with the target illuminated by the first set of diodes, while a second image is acquired with the other wavelength set of diodes. The two images are subtracted to remove the background and to make the detection of the markers easier. The accuracy requirements of the AVGS are ±12 mm on a measurement range of 1-3 m; ±35 mm on 3-5 m; and ±150 mm on 5-10 m, as defined in Reference [3]. In flight tests at close ranges, the standard deviation of the measured errors was an order of magnitude smaller than the specifications. Another interesting space application that requires a relative pose determination is the formation flight of two or more spacecraft. In Reference [4], the PRISMA formation flight demonstrator is described, which is a successful example of a cooperative relative pose measurement in space between two spacecraft (Mango and Tango). The PRISMA mission comprises several hardware and software experiments involving vision systems, Global Navigation Satellite System, Radio-Frequency navigation, and Guidance, Navigation and Control algorithms. The Vision-Based Sensor (VBS), on board the PRISMA mission, comprises a close-range camera mounted on the Mango spacecraft and active optical markers (flashing Light Emitting Diodes) mounted on the Tango spacecraft, fixed in known positions. Similar to ETS-VII and OE, the three-dimensional (3D) positions of the markers on the Tango spacecraft and the two-dimensional (2D) position measurement of the markers observed by the camera allow us to evaluate the relative pose between Tango and Mango. The VBS exhibits a standard uncertainty of 1-2 cm (depending on the direction) for a measurement range of 10 m [4]. In References [5,6], flight tests to measure the relative position, orientation, and velocities between two SPHERES satellites are described. The SPHERES micro-satellites were developed by the MIT Space Systems Laboratory and are flown inside the International Space Station. A stereo-camera is mounted on a SPHERES satellite and is employed to acquire four circular markers attached to another SPHERES satellite. These observations allow us to evaluate the relative pose between the two satellites, while a Multiplicative Extended Kalman Filter is employed to find the relative velocities. As reported in Reference [6], a 0.5 cm maximum error is obtained for a total linear translation of 40 cm, while for a 45° rotation around a vertical axis a maximum error of 4° is achieved. The authors in [7] describe a measurement system suitable for nanosatellites, which comprises a single camera and infrared LEDs designed to be mounted on a chaser spacecraft, while two passive patterns of four retro-reflective fiducial markers are conceived to be mounted on a target spacecraft. The system was tested in the laboratory and yields measurement errors of the target-chaser relative position generally better than 5 mm, while the maximum attitude error is between 3° and 5°.
All the reported examples cope with the same geometrical problem: the evaluation of the relative chaser-target pose from a set of 3D points (fiducial markers) known in the target frame of reference and from the 2D measurements of their projections on a chaser camera. As described in Reference [8], this problem is known as Perspective from n Points (PnP) and there are numerous solutions, some of which are described in References [9-16] and briefly introduced in our previous work [8]. The authors in [9,13-16] describe five well-known and widely employed methods, while more general descriptions can be found in References [10-12]. The previously reported SPHERES case [5,6] employs an iterative solution. The solution that will be described in Section 2 was originally proposed by Reference [16] and is selected since it is particularly efficient, closed-form, it requires only four points, it can be easily integrated into a RANdom SAmple Consensus (RANSAC) scheme, and it proved to be more accurate (see Section 5). As explained in Reference [8], a vision-based instrument is selected since it is generally smaller, lighter and less expensive than other laser-based sensors; an example of this is provided

in Reference [17]. A comprehensive and well-written review of the different available methods is Reference [18], which analyses both the cooperative and uncooperative cases of relative pose determination. The whole measurement procedure begins with the acquisition of images of the satellite provided with known fiducial markers and comprises four main steps: the detection of the 2D marker points on each acquired image; the solution of a P3P problem inside a RANSAC scheme to find a first approximation of the relative pose of the satellite; the local refinement of the first solution using an iterative non-linear optimization; and the global refinement of all the detected poses and of all the fiducial marker 3D positions, based on the Bundle Adjustment (BA) method. The measurement output, i.e., the relative pose, is compared with a reference instrumentation to evaluate the measurement errors. Moreover, the output uncertainty is evaluated using a Monte Carlo propagation approach. In this way, the whole measurement procedure can be evaluated from a metrological point of view. With reference to the known literature and to the previous manuscript [8], the main contributions of the present work are the following three points:
1. The effect of several potentially influencing parameters is analyzed for the measurement errors and uncertainties of the satellite poses, to highlight which ones exhibit a greater numerical correlation with the obtained errors and/or uncertainties. Since the considered parameters can be adjusted by a proper selection of the positioning of the fiducial markers on the satellite surface, the aim of the proposed analysis is to yield useful advice for the design of future systems for relative pose measurement.
2. The BA approach is well known and widely employed in the robotics and computer vision fields. In the experimental verification of the BA approach applied to pose measurement with fiducial markers, the evaluated errors and uncertainties demonstrate a superior performance of the BA approach, also taking into account the known numerical results described in References [4-8].
3. A more detailed uncertainty analysis is performed, taking into account more uncertainty sources than in Reference [8] for the 2D positions of the markers in the images, such as the possible inaccuracy of the corner detection algorithm.
The first highlighted contribution is particularly important since, to the authors' knowledge, there are very few works in the literature related to a parametric study for vision-based navigation systems. One of the most significant recent works is Reference [19], which points out that the number and location of the sensors are deeply related to the accuracy of the relative navigation algorithm and investigates an optimal placement of multiple Position Sensitive Diode (PSD) sensors mounted on a chaser satellite. The PSD sensors are employed to detect the 2D projections of active LEDs mounted on a target spacecraft. Similar investigations are carried out for sun sensors in References [20,21]. However, the system and the measurement approach selected in the present work are different, since only one camera is used and supposed to be mounted on the chaser, while multiple fiducial markers are mounted on the target spacecraft. The proposed method is conceived to perform an instant measurement of the relative pose. It means that the pose measurement and the marker detection and identification are performed starting from scratch in every relative position (time instant), without the possible aid of previous knowledge. We wanted to concentrate on how the instant measurement can be improved; thus, the possible tracking of the relative pose and/or of the acquired fiducial markers is not considered. Any exploitation of previous knowledge (time-wise) is beyond the main purpose of this work. For a similar reason, any dynamics between the camera and the target is not considered.
The dynamics would introduce both advantages and drawbacks. On one hand, the explicit consideration of the dynamics could help the measurement procedure and improve the final uncertainty, if tracking algorithms of the relative pose and/or of the observed fiducial markers are properly conceived and designed. On the other hand, the dynamics would introduce time constraints: the exposure time of each image should be short to prevent motion blur in the acquired images, the frame rate between images should be sufficiently high, and the computational time should be low to allow for a real-time measurement. The proposed work wants to provide useful indications to improve

the instant pose measurement, and the definition and analysis of these requirements (on exposure time, frame rate, computational time) are beyond the main purpose. However, it is worth highlighting that the presented results remain valid in the presence of relative dynamics as well, if the cited time requirements are satisfied.

In the following section (Section 2), the measurement method is described: the marker detection in Section 2.1, the P3P problem solution in Section 2.2, the non-linear local refinement in Section 2.3, and the global refinement with BA in Section 2.4. Section 3 presents the uncertainty evaluation, while Section 4 deals with the experimental set-up. A preliminary method comparison is presented in Section 5, the improvements achieved by the BA approach are described in Section 6, and the influence of the parameters is presented in Section 7.

2. Measurement Algorithm
The whole measurement procedure, outlined in Figure 1, comprises four main steps: (A) the detection of the 2D marker points on the image acquired in each relative pose; (B) the solution of the P3P problem inside a RANSAC approach for each relative pose; (C) the local refinement of the evaluated roto-translation, for each relative pose at a time, using a non-linear optimization; (D) the global refinement of all the evaluated roto-translations (all the relative poses) and of the 3D points simultaneously, using the BA approach. Figures 2-4 (in Section 2.3) show the details of the four main steps A-D, which will be described in the following sub-sections.

Figure 1. Scheme of the whole measurement procedure.

Figure 2. The scheme of the steps: (step A) the detection of the 2D marker points and (step B) the P3P solution inside RANSAC.

2.1. Step A: Detection of the 2D Marker Points

The pose measurement is based on the 2D position determination of known 3D points projected on the image plane. The 3D points that are employed come from fiducial markers attached to the external surface of a satellite (a laboratory mock-up as described in Section 4). They are fixed and their positions ^S X_i are known in a satellite frame S. In the literature, several different types of fiducial markers are described, e.g., ARToolKit [22] and Reference [23]. These known types proved to be reliable in many situations, particularly in the simultaneous presence of several markers. However, on a spacecraft, there are only a few regular free surfaces that can accept fiducial markers, as highlighted also by Reference [5]. Thus, the number of employable markers is low. In References [5,6], the selected markers are concentric circles, but the projection of the circles on the image plane may yield measurement errors of the detected centroids, as explained in Reference [24], if the circle size is not small. For these reasons, a simplified set of square markers is designed. Each square marker provides five 3D points ^S X_i: the four internal vertices and their centroid. The total number of 3D points N_points is the maximum number of 2D projections that can be detected in the acquired images. Each marker comprises a known number of internal dots, which are exploited to identify each observed marker. In the images analyzed, the number of internal dots is always correctly detected. In a future improvement, to cope with more severe testing conditions that could cause erroneous detections of the internal dots, the known 3D geometrical constraints among the markers could be employed to perform a consistency check after the marker projections are identified in each image, or to enforce detection rules in the images; in the current implementation, the 3D geometrical constraints are employed only for the identification of the vertices of each marker, as described at the beginning of Section 2.2. In the proposed algorithm, possible erroneous matchings between the markers in the images and in 3D space can be identified and eliminated by the RANSAC approach illustrated in Section 2.2. Another future improvement that could be useful in case of erroneous detection of the internal dots is a tracking method for the markers: if the algorithm knows the position of each marker in a previous time instant and a suitable motion model is devised, the position of each marker can be foreseen, providing a support helpful for both the position detection and the identification of the markers. As explained in the introduction, the exploitation of tracking and previous knowledge is beyond the main purpose of this work. A camera, mounted on the chaser spacecraft, is aimed to observe the 3D marker points on the target spacecraft. Before the camera can be used for measurements, it should be calibrated; for this purpose, the approach presented in Reference [25] is used. After the camera calibration, the measurement procedure begins with the detection of the 2D projections of the markers on each acquired image. The detection of the markers and of their 2D points in each image is described in Figure 2 and an output example is illustrated in Figure 3a. The detection of the 2D points comprises the following steps:
- image binarization using a selected threshold and removal of all the small connected regions having less than a set number of pixels; the selected number of pixels is 4, and it depends on the maximum distance of the observed markers and on the noise level of the images. This removal reduces the image clutter and the effect of noise, but could make the internal dot detection less reliable.
- extraction of the internal contour of each square marker, using the Moore-Neighbor algorithm described in Reference [26]; for each detected boundary, its perimeter and number of internal boundaries can be evaluated; only objects having a number of internal boundaries lower than the maximum number of internal dots and having dimensions compatible with the markers are analyzed. In this way, each observed marker can be identified.
- polygonal simplification using the Douglas-Peucker algorithm, which is called the most widely used high-quality curve simplification algorithm in Reference [27]. Each contour extracted in the previous step is a closed polygon with several vertices, and the Douglas-Peucker algorithm reduces it to a four-vertex polygon, allowing us to detect the four internal vertices of each square fiducial marker. Then, from the four vertices, the marker centroid is calculated.
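For illustration, the detection chain just described (binarization, small-region removal, contour extraction with internal-dot counting, polygonal simplification) can be sketched with standard image-processing primitives. The sketch below uses Python/OpenCV rather than the authors' Matlab implementation; the threshold value and the dark-marker-on-bright-background assumption are placeholders, and the contour area is used as a proxy for the connected-region pixel count.

```python
import cv2
import numpy as np

def detect_marker_vertices(gray, threshold=128, min_area=4):
    """Sketch of step A: binarize, find square contours, simplify to 4 vertices."""
    # Image binarization with a selected threshold (markers assumed dark on bright).
    _, binary = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY_INV)

    # Contour extraction with a two-level hierarchy, so that the internal dots of
    # each marker can be counted (children of the marker contour in the hierarchy).
    contours, hierarchy = cv2.findContours(binary, cv2.RETR_CCOMP,
                                           cv2.CHAIN_APPROX_SIMPLE)
    markers = []
    for idx, contour in enumerate(contours):
        # Removal of small connected regions (clutter / noise suppression).
        if cv2.contourArea(contour) < min_area:
            continue
        # Polygonal simplification (Douglas-Peucker); keep only quadrilaterals.
        perimeter = cv2.arcLength(contour, True)
        poly = cv2.approxPolyDP(contour, 0.02 * perimeter, True)
        if len(poly) != 4:
            continue
        vertices = poly.reshape(4, 2).astype(float)
        centroid = vertices.mean(axis=0)
        # Count child contours (internal dots) to identify which marker this is.
        n_dots = sum(1 for h in hierarchy[0] if h[3] == idx)
        markers.append({"id": n_dots, "vertices": vertices, "centroid": centroid})
    return markers
```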

Figure 3. (a) The detection of the 2D points in each acquired image; (b) the vertices of the markers.

In this way, five 2D points ^C x_i are detected for each marker in each image and are expressed in the camera frame of reference C. The number of internal dots of each marker allows us to identify which fiducial marker is observed.

2.2. Step B: P3P Solution inside RANSAC

Up to now, the algorithm knows the five 2D projections of each identified marker and which marker they belong to, but does not know the order of the observed vertices. Before any pose evaluation can be performed, the algorithm has to associate each detected 2D vertex in the camera frame C with the corresponding 3D point in the satellite frame S. The 3D geometrical constraints among the markers and their known shape are employed to enforce rules on the detected vertices, in terms of lines and distances in the acquired images. For instance, with reference to Figure 3b, the two top vertices v_1,1 and v_1,4 of the marker with one internal dot (marker 1) will always be roughly aligned with the two top vertices v_4,1 and v_4,4 of marker 4 (with four internal dots), and vertex v_1,1 will always have a distance from the aligned vertices v_4,1 and v_4,4 larger than vertex v_1,4. The same two rules (alignment and distance) are verified by the vertices v_1,2, v_1,3, v_4,2, v_4,3. Thus, suppose we want to identify the vertices of marker 1 (detected as described in Section 2.1):
1. two adjacent vertices v_1,x and v_1,y are considered;
2. the algorithm verifies if they are aligned with two vertices of marker 4 or of marker 2; suppose they are aligned with two vertices of marker 4;
3. the algorithm evaluates which one is farther from the vertices of marker 4; suppose v_1,x is farther; it means v_1,x could be v_1,1 or v_1,2, while v_1,y could be v_1,3 or v_1,4;
4. the other vertex v_1,z adjacent to v_1,x is considered;
5. the algorithm evaluates if v_1,x or v_1,z is nearer to the vertices of marker 2; suppose v_1,x is nearer;
6. in this case, v_1,x = v_1,2, v_1,y = v_1,3, v_1,z = v_1,1, and the fourth vertex of marker 1 remains identified.
Different orders of the associations are obtained if the verifications at points 2, 3, and 5 yield different results. Similar rules on the image plane allow us to perform the correct matching between the detected vertices of all the markers in the projected images and in 3D space. In this way, for each 2D observed point ^C x_i, a corresponding 3D point ^S X_i is associated in the spacecraft frame S. An image is acquired in every camera-target relative pose; each observed 2D point depends on the corresponding 3D point and on the camera-target relative pose; thus, for each image k (k = 1, ..., N_poses), the matching phase yields N_k pairs (^C x_k,i, ^S X_i) with i = 1, ..., N_k <= N_points.
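The two geometric tests used in steps 1-6 (alignment with the line through two reference vertices, and relative distance) can be made concrete with a few lines of 2D geometry. The coordinates and the pixel tolerance below are made up for illustration and are not taken from the actual set-up.

```python
import numpy as np

def point_line_distance(p, a, b):
    """Perpendicular distance of point p from the line through a and b."""
    ab = b - a
    ap = p - a
    # Magnitude of the 2D cross product divided by the segment length.
    return abs(ab[0] * ap[1] - ab[1] * ap[0]) / np.linalg.norm(ab)

def is_roughly_aligned(p, a, b, tol_px=5.0):
    """Alignment test of step 2: p lies near the line through a and b."""
    return point_line_distance(p, a, b) < tol_px

# Illustrative check with made-up pixel coordinates:
v41, v44 = np.array([100.0, 50.0]), np.array([300.0, 52.0])  # marker 4, top edge
v1x, v1y = np.array([420.0, 55.0]), np.array([380.0, 54.0])  # marker 1 candidates
if is_roughly_aligned(v1x, v41, v44) and is_roughly_aligned(v1y, v41, v44):
    # Distance test of step 3: the vertex farther from marker 4 is v_1,1 or v_1,2.
    d_x = min(np.linalg.norm(v1x - v41), np.linalg.norm(v1x - v44))
    d_y = min(np.linalg.norm(v1y - v41), np.linalg.norm(v1y - v44))
    print("farther from marker 4:", "v1x" if d_x > d_y else "v1y")
```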

After the pairs (^C x_k,i, ^S X_i) are evaluated, a first approximation of the relative pose can be obtained using one of the solutions of the PnP problem cited in the introduction. Preliminary tests were performed using three different approaches: Gao [14], EPnP [9], and Kneip [16]. The results, reported in Section 5, highlight the advantage of the Kneip solution [16], which is selected for the second step B. The Kneip method is chosen also because it is direct (the intermediate step of calculating the 3D points ^C X_i in the camera frame C is completely eliminated), closed-form (no iterations are needed), and requires only three plus one pairs. Using three pairs (^C x_k,i, ^S X_i), four rotation matrices ^C_S R_j from the satellite frame S to the camera frame C and four position vectors ^C X_S,j of the satellite frame S expressed in the camera frame C, with j = 1, ..., 4, can be evaluated. The four rotation matrices ^C_S R_j and position vectors ^C X_S,j are evaluated from the roots of a fourth-order polynomial, which is solved by applying Ferrari's closed-form solution. The equations are described in four pages of Reference [16], and a software implementation of the computation procedure is available in an on-line directory reported in Reference [16]. Then, the 3D point ^S X_i of a fourth pair (^C x_k,i, ^S X_i) is re-projected on image k using the four evaluated roto-translations ^C_S R_j, ^C X_S,j, and the solution which provides the smallest re-projection error is selected as the correct rotation matrix ^C_S R_k and translation ^C X_S,k. The re-projection equations can be found in Reference [8]. Due to the small number of points required by the closed form, the Kneip solution [16] of the P3P problem is particularly efficient and suitable to be embedded in a RANSAC scheme [28]. For each image k, the Kneip method is applied several times and, in each iteration, the following steps are performed (see Figure 2): four pairs (^C x_k,i, ^S X_i) are randomly selected; the rotation ^C_S R_k and translation ^C X_S,k are evaluated with the Kneip method; the 2D re-projections ^C x̂_k,i of all the available 3D points ^S X_i are calculated; and the N_k,inlier (with N_k,inlier <= N_k <= N_points) pairs whose re-projection error e_k,i = ||^C x̂_k,i - ^C x_k,i|| is less than a threshold l_th are considered inliers and define the consensus set of the examined iteration. At the end of the RANSAC scheme, the set with the larger consensus (e.g., the maximum number of inliers) is selected. Even if the application of a RANSAC scheme in the presence of observed fiducial markers with known shapes may appear unnecessary, the experimental tests highlight that it is useful to reject inaccurate 2D points detected in the images. Inaccurate 2D positions can happen mainly due to a partial visibility or high deformation of an observed marker. The outputs of the RANSAC procedure are two: the selected roto-translation ^C_S R_k and ^C X_S,k for each image k, calculated by the Kneip solution, and the set of pairs (^C x_k,i, ^S X_i), with i = 1, ..., N_k,inlier <= N_k <= N_points, which are considered inliers.
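As a point of reference, a P3P solution inside a RANSAC loop can be obtained in a single call with OpenCV. This is only an approximate stand-in for step B: cv2.SOLVEPNP_P3P implements the Gao et al. solver rather than the Kneip one selected here, and the authors' code is in Matlab. The reprojectionError argument plays the role of the inlier threshold l_th.

```python
import cv2
import numpy as np

def p3p_ransac(pts_3d_sat, pts_2d_img, camera_matrix, l_th_px=2.0):
    """Sketch of step B: P3P inside RANSAC (OpenCV's Gao P3P, not Kneip's)."""
    dist_coeffs = np.zeros(5)  # assuming an already undistorted image
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        pts_3d_sat.astype(np.float64),      # S X_i, marker points in frame S
        pts_2d_img.astype(np.float64),      # C x_k,i, detections in the image
        camera_matrix, dist_coeffs,
        reprojectionError=l_th_px,          # inlier test on e_k,i < l_th
        flags=cv2.SOLVEPNP_P3P)             # P3P needs at least 4 points in total
    if not ok or inliers is None:
        raise RuntimeError("no consensus set found")
    R, _ = cv2.Rodrigues(rvec)              # rotation C_S R_k as a matrix
    return R, tvec.ravel(), inliers.ravel() # pose and the inlier index set
```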
2.3. Step C: Local Refinement with Non-Linear Optimization

For each relative position and orientation between the camera (chaser) and the spacecraft (target), i.e., each image k, the rotation matrix ^C_S R_k and the translation vector ^C X_S,k found by the P3P solution plus RANSAC can be refined using a non-linear optimization. The scheme is illustrated in Figure 4 (left). The rotation has to be expressed by three Euler angles or a quaternion. In this work, the Euler angle approach is selected since it gives us a clear understanding of the obtained results. For each image k, if the roto-translation ^C_S R_k and ^C X_S,k between the camera and the satellite is known, it can be used to re-project the 3D points ^S X_i of all the N_k,inlier pairs (^C x_k,i, ^S X_i) on image k, obtaining the re-projections ^C x̂_k,i. Then, the 2D re-projection errors can be evaluated and the following cost function can be calculated:

$$\sum_{i=1}^{N_{k,\text{inlier}}} (e_{k,i})^2 = \sum_{i=1}^{N_{k,\text{inlier}}} \left\| {}^{C}\hat{x}_{k,i} - {}^{C}x_{k,i} \right\|^2 \qquad (1)$$

In Equation (1), only the N_k,inlier pairs coming from the RANSAC procedure are used. Equation (1) becomes the cost function of a minimization problem with variables ^C_S R_k and ^C X_S,k. A numerical solution of the minimization problem can be found with the Levenberg-Marquardt (LM) algorithm [29]. The LM algorithm requires initial values of the variables ^C_S R_k and ^C X_S,k to start the minimum search. In this case, the values obtained at the end of the previous step B are employed. During the searching process, the N_k,inlier pairs (^C x_k,i, ^S X_i) with i = 1, ..., N_k,inlier are kept constant, while the variables ^C_S R_k and ^C X_S,k are iteratively changed. At the end of the searching process, the LM algorithm provides the refined roto-translation ^C_S R_k and ^C X_S,k. As illustrated in Figure 1, steps A-C are repeated for all the N_poses camera-satellite relative poses. The local refinement step C involves only the N_k,inlier pairs (^C x_k,i, ^S X_i) of the considered image k and of the associated relative pose.
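A minimal sketch of the local refinement of Equation (1), using SciPy's Levenberg-Marquardt implementation in place of the authors' Matlab code: the pose is parametrized as a rotation vector plus a translation (instead of Euler angles), and only the inlier pairs of image k enter the residual.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(points_3d, rvec, tvec, K):
    """Pinhole re-projection of the points S X_i with pose (rvec, tvec)."""
    pts_cam = Rotation.from_rotvec(rvec).apply(points_3d) + tvec
    pts = pts_cam @ K.T
    return pts[:, :2] / pts[:, 2:3]

def refine_pose_lm(rvec0, tvec0, pts_3d_in, pts_2d_in, K):
    """Sketch of step C: minimize Equation (1) over the single pose of image k."""
    def residuals(x):
        # Residuals are the stacked 2D re-projection errors e_k,i of the inliers;
        # the inlier pairs are kept constant while the 6 pose parameters change.
        return (project(pts_3d_in, x[:3], x[3:], K) - pts_2d_in).ravel()
    x0 = np.concatenate([rvec0, tvec0])   # start from the step-B solution
    sol = least_squares(residuals, x0, method="lm")
    return sol.x[:3], sol.x[3:]           # refined rotation vector and translation
```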

Figure 4. The scheme of the steps: (step C) the local refinement with non-linear optimization and (step D) the global refinement with BA.

2.4. Step D: Global Refinement with BA

The measurement phase described in the previous Section 2.3 is called local refinement since the optimization is performed for each relative position and orientation and affects only the evaluated roto-translation. Up to the local refinement phase, the 3D point positions ^S X_i in the spacecraft frame S and their uncertainties are obtained by a manual direct measurement with a caliper and define an initial 3D map. Since these 3D point positions are input quantities of the examined measurement procedure and they are kept constant up to the local refinement phase, errors in their measurements can yield systematic effects on the final indirect measurement of the relative roto-translation. The experimental results in the following Section 5 will show that the analysis described in the previous work [8] is partially affected by a systematic effect due to the 3D point positions (the 3D initial map). To compensate for this undesired systematic effect, in the present work, a BA global refinement is introduced. After the roto-translations ^C_S R_k and ^C X_S,k for all the considered relative positions and orientations (k = 1, ..., N_poses) are locally refined as described in Section 2.3, the BA approach of References [30,31] adjusts all the already-evaluated roto-translations and the 3D point positions, i.e., the 3D map, to globally minimize the sum of the squared re-projection errors.
With reference to Figure 4 (right), the input quantities of this step D are all the roto-translations ^C_S R_k and ^C X_S,k with k = 1, ..., N_poses, which come from the iterated local refinement step C, and all the pairs (^C x_k,i, ^S X_i) of inliers with k = 1, ..., N_poses and i = 1, ..., N_k,inlier, which are identified by the RANSAC scheme repeated for all the relative poses N_poses.

Similarly to the local refinement step C, a new cost function can be defined:

$$\sum_{k=1}^{N_{\text{poses}}} \left[ \sum_{i=1}^{N_{k,\text{inlier}}} (e_{k,i})^2 \right] = \sum_{k=1}^{N_{\text{poses}}} \left[ \sum_{i=1}^{N_{k,\text{inlier}}} \left\| {}^{C}\hat{x}_{k,i} - {}^{C}x_{k,i} \right\|^2 \right] \qquad (2)$$

In this case, the constants that do not change during the minimum searching process are all the detected 2D points ^C x_k,i that belong to a pair of inliers (^C x_k,i, ^S X_i), with k = 1, ..., N_poses and i = 1, ..., N_k,inlier <= N_k <= N_points, while the variables that are changed in the searching process are all the roto-translations ^C_S R_k and ^C X_S,k with k = 1, ..., N_poses and all the 3D points ^S X_i with i = 1, ..., N_points. The LM algorithm, which is used also for the minimization of the cost function (Equation (2)), starts the minimum search from the values ^C_S R_k and ^C X_S,k obtained at the end of the iterations of the previous step C, and from the values ^S X_i that come from the manual direct measurements. The BA approach yields both refined roto-translations and a refined set of 3D points. In the present work, two different implementations of BA are considered and compared. In the first implementation, the final output quantities are the roto-translations ^C_S R_k and ^C X_S,k with k = 1, ..., N_poses obtained at the end of the BA approach as described. In this case, the globally refined 3D positions ^S X_i, obtained as an additional output of the BA approach, are not used. In a second implementation (see Figure 5), the whole procedure as described, comprising steps A-D, is applied to a subset of the available N_poses relative poses, and then only steps A (the 2D point detection), B (the P3P solution inside RANSAC) and C (the local refinement with non-linear optimization) are performed for all the N_poses relative poses, using as 3D points ^S X_i those refined by the BA approach. In this case, the roto-translations obtained at the end of the BA approach are not considered, while the globally refined 3D points ^S X_i with i = 1, ..., N_points of all the images k of the subset become the input of the simplified procedure with steps A-C. Obviously, for those images whose 2D points ^C x_k,i were already detected for the whole procedure A-D, the first step A is not repeated, but the same 2D points ^C x_k,i are employed. With reference to Figure 5, in the second implementation, the whole procedure is applied with a number of poses N′_poses <= N_poses and the only outputs that are retained are the globally refined 3D points ^S X_i (bottom of Figure 5). Then, steps A-C inside the second red rectangle of Figure 5 are repeated for all the poses N_poses, using the new refined 3D points (in the green input rectangle). The following Section 5 will compare the results of these two implementations.

Figure 5. The scheme of the second implementation of the BA approach.
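A compact sketch of the global cost of Equation (2): all the poses and all the 3D points are stacked into one parameter vector and refined jointly. This is a didactic version under the same assumptions as the previous sketch, without the Jacobian sparsity exploitation that a production BA implementation would use.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(points_3d, rvec, tvec, K):
    """Pinhole re-projection of 3D points with pose (rvec, tvec)."""
    pts_cam = Rotation.from_rotvec(rvec).apply(points_3d) + tvec
    pts = pts_cam @ K.T
    return pts[:, :2] / pts[:, 2:3]

def bundle_adjust(rvecs, tvecs, pts_3d, obs, K):
    """Sketch of step D. obs is a list of inlier observations (k, i, xy):
    image index k, 3D point index i, and detected 2D point C x_k,i."""
    n_poses, n_pts = len(rvecs), len(pts_3d)

    def unpack(x):
        poses = x[:6 * n_poses].reshape(n_poses, 6)   # rvec | tvec per image
        points = x[6 * n_poses:].reshape(n_pts, 3)    # the 3D map S X_i
        return poses, points

    def residuals(x):
        poses, points = unpack(x)
        res = []
        for k, i, xy in obs:  # double sum of Equation (2): all images, all inliers
            proj = project(points[i:i + 1], poses[k, :3], poses[k, 3:], K)[0]
            res.append(proj - xy)
        return np.concatenate(res)

    # Start from the locally refined poses (step C) and the caliper-measured map;
    # LM needs at least as many residuals as parameters.
    x0 = np.concatenate([np.concatenate([r, t]) for r, t in zip(rvecs, tvecs)]
                        + [np.asarray(pts_3d, dtype=float).ravel()])
    sol = least_squares(residuals, x0, method="lm")
    poses, points = unpack(sol.x)
    # A production BA would also fix the gauge (e.g., hold some points fixed);
    # here everything floats freely for brevity.
    return poses, points  # refined roto-translations and refined 3D map
```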

3. Uncertainty Evaluation

The uncertainties of the relative positions and orientations between the target spacecraft and the camera (chaser) are evaluated using the procedures foreseen by References [32,33] in the case of an indirect measurement with a plurality of input quantities. As explained in Reference [8], the Monte Carlo (MC) propagation approach is chosen due to the non-linearity of the measurement method. The MC method allows us to propagate the uncertainties evaluated for the input quantities to the uncertainties of the output relative pose. According to the MC method, the uncertainty propagation is performed by a numerical procedure, which allows us to take into account the non-linearity of the functional relationship between inputs and outputs. The Kline-McClintock propagation formula is not used since it approximates (by linearization) the propagation model. The input quantities are the intrinsic parameters of the camera, the 3D point positions in the satellite frame S, and the positions of the 2D points detected on each image. The uncertainties of the intrinsic parameters of the camera are evaluated using the Zhang procedure [25] for camera calibration, while the uncertainty of each 3D point is evaluated from repeated experimental measurements manually acquired by three different operators using a caliper, according to the type A contributions described in Reference [32]. For the 2D positions in each image, three uncertainty contributions are considered, evaluated and then combined together: the effect of the selected threshold for image binarization, the inexact detection of the four internal vertices of each marker, and the residual uncorrected optical distortions. The contribution of the threshold selection is evaluated by changing the threshold level within a suitable range. The contribution of the contour extraction algorithm and of the Douglas-Peucker polygonal simplification is evaluated experimentally, comparing the vertices detected by the algorithm and manually by an operator in different orientations of the spacecraft. Finally, the contribution of the optical distortions that the distortion model is not able to remove is evaluated by acquiring a planar chessboard orthogonal to the optical axis. The uncertainties of the 2D points are evaluated more carefully than in Reference [8], experimentally taking into account that the Douglas-Peucker algorithm can sometimes yield a 2D point near but not exactly coincident with the desired corner of a marker. For this reason, the evaluated uncertainties depicted in Sections 6 and 7 may be slightly larger than those obtained in Reference [8] using the same non-linear approach without BA. The uncertainties evaluated for the output relative roto-translations allow us to verify the compatibility of the experimentally obtained errors (the difference between the values obtained by the vision system and those obtained by the reference instrumentation).
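A minimal sketch of how such an MC propagation could look: each trial perturbs the input quantities according to their standard uncertainties, re-runs the measurement function, and the spread of the outputs gives the output uncertainty. The measure_pose callable and the u_2d, u_3d, u_f values are placeholders, not the uncertainties actually evaluated in this work.

```python
import numpy as np

rng = np.random.default_rng(0)

def monte_carlo_pose_uncertainty(measure_pose, pts_2d, pts_3d, K,
                                 u_2d=0.5, u_3d=0.2, u_f=2.0, n_trials=10_000):
    """Sketch of the MC propagation; measure_pose(pts_2d, pts_3d, K) -> pose
    is the whole measurement function (steps A-C), supplied by the caller.
    u_2d [px], u_3d [mm], u_f [px] are placeholder standard uncertainties."""
    samples = []
    for _ in range(n_trials):
        # Perturb each input quantity according to its evaluated uncertainty.
        pts_2d_mc = pts_2d + rng.normal(0.0, u_2d, pts_2d.shape)
        pts_3d_mc = pts_3d + rng.normal(0.0, u_3d, pts_3d.shape)
        K_mc = K.copy()
        K_mc[0, 0] += rng.normal(0.0, u_f)   # focal length contributions
        K_mc[1, 1] += rng.normal(0.0, u_f)
        samples.append(measure_pose(pts_2d_mc, pts_3d_mc, K_mc))
    samples = np.asarray(samples)
    # Standard uncertainty of each output component; an expanded uncertainty
    # with 99.7% coverage can then be read off the distribution percentiles.
    return samples.mean(axis=0), samples.std(axis=0)
```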
4. Experimental Set-Up

The purpose of the experimental tests is to calibrate the whole measurement system for the satellite position and orientation measurement. To this aim, a satellite mock-up, provided with eight square fiducial markers as depicted in Figure 6, is mounted on a high-precision motor-driven rotary stage and on a linear slide, which allows us to impose linear displacements and/or rotations to the satellite mock-up, while the camera is fixed. At the same time, the rotary stage and the linear slide can measure the roto-translations and are used as reference instruments to evaluate the measurement errors. The set-up, comprising the rotary stage, the linear slide, and the camera, is the same as that employed and described in Reference [8], which can be consulted for details. Twenty-two different positions along the linear slide are tested, with a longitudinal step of 50 mm. The smallest distance between the camera and the mock-up is 1500 mm, while the farthest one is 2550 mm. For each linear position, the mock-up attitude is varied within 90° in 2° steps (when the attitude angle is near 0° and 90°, the observed markers are substantially orthogonal to the optical axis, with an evaluated misalignment of 1.3°). Thus, a total number of 22 × 45 = 990 measurements of position and orientation are acquired using the fixed camera and are compared with the corresponding imposed values. The experimental data set, i.e., the imposed rotations and translations and the corresponding acquired images, is the same employed in Reference [8]. Thus, the error results illustrated in Sections 5-7 can be directly compared with

Reference [8], while the obtained uncertainties may be slightly larger than in Reference [8], as explained in Section 3. Figure 7 shows the images acquired by the camera when the distance between the camera and the satellite reaches its maximum (a-c) and minimum (d-f) value, and when the rotation angle is 0° (a,d), 45° (b,e), and 90° (c,f). The partial occlusion of the markers, which happens when the mock-up is near its minimum distance, affects the number of observed 2D points.

Figure 6. The laboratory set-up.

Figure 7. The images acquired in different positions and rotations: (a-c) with the maximum distance and rotation angles of, respectively, 0°, 45°, 90°; (d-f) with the minimum distance (partial occlusion of the markers) and rotation angles of, respectively, 0°, 45°, 90°.

The algorithm, described in Section 2, measures all three Euler angles and the three relative position components, but the experimental set-up allows us to evaluate the errors only for two degrees of freedom: one translation and one rotation.
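For illustration, the two comparable scalars could be extracted from a measured pose as follows; the Euler sequence and the choice of the vertical axis are assumptions that depend on the actual frame definitions of the set-up.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def comparable_quantities(R_cs, t_cs):
    """Extract the two degrees of freedom checked against the references:
    the rotation about the vertical axis and the total camera-target distance."""
    # First Euler angle about the (assumed) vertical y-axis of the rotary stage.
    yaw_deg = Rotation.from_matrix(R_cs).as_euler("yxz", degrees=True)[0]
    # Total distance between the camera frame C and the spacecraft frame S,
    # compared with the position imposed along the linear slide.
    distance = np.linalg.norm(t_cs)
    return yaw_deg, distance

# Example with an exact 30 degree rotation about y and a 2000 mm distance:
R = Rotation.from_euler("y", 30, degrees=True).as_matrix()
print(comparable_quantities(R, np.array([0.0, 0.0, 2000.0])))  # ~ (30.0, 2000.0)
```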

The rotary stage allows us to impose and directly measure only the first Euler angle, which is a rotation around a vertical axis, and the position along the linear slide, whose longitudinal axis has an evaluated misalignment of 1.3° with reference to the optical axis of the camera. Thus, the angular error can be assessed only around the vertical axis. Similarly, from the three position components measured by the described approach, the total distance between the camera frame C and the spacecraft frame S is evaluated and it is compared with the imposed position along the linear slide, defining the position error depicted in the following figures. The accuracy of the linear slide (1 mm) and of the rotary stage (72 arcsec) is much better than the uncertainties achievable by the proposed method; thus, they are employed as reference instruments and their uncertainties are neglected. In the employed laboratory set-up, the images of the markers are acquired in normal/good lighting conditions, while in a typical space application, the acquired scene could be affected by a low light condition and the presence of extremely bright areas and deep shadows. These space lighting conditions could worsen the uncertainties of some detected 2D points and, consequently, the final results. The harsh lighting conditions of the space environment are certainly an additional uncertainty source and should be considered in a future work; the possible aid of a marker illumination system may also be analyzed. However, an additional uncertainty source could also hide or mask the possible correlations between the pose errors/uncertainties and the geometrical distribution and position of the markers, making the correlations less clear to be observed. That said, the increase of the 2D point uncertainty due to poor lighting can make the re-projection error larger than the threshold l_th of the RANSAC procedure. Thus, the net effect is a reduction of the available pairs (^C x_k,i, ^S X_i) of inliers, which generally has a negative effect on the obtained errors and uncertainties. This is confirmed by our results, as described in Section 7. However, as it will be shown in Figure 24, it is worth pointing out that a reduction of the number of points (from a maximum value of 38 to a flat value of 14 in Figure 24) could be associated with only a slight worsening of the final error. The results that will be described in Sections 5-7 are obtained implementing the numerical measurement procedure on a desktop PC using Matlab. Since neither the hardware nor the software code is optimized for a real-time application on a satellite, the proposed analysis cannot give useful indications about the absolute speed and timing of the procedure. However, as it will be explained in the following Section 5, the selected implementation of the BA approach may not increase the computational burden during the real-time measurements.

5. Results: Preliminary Method Comparison

In a preliminary phase of the presented work, three different solutions of the general PnP problem were analyzed and compared: Gao [14], EPnP [9], and Kneip [16]. For the two additional methods (Gao and EPnP), the implemented code can be found on the internet: the Gao solution is comprised in the Computer Vision Toolbox of Matlab, while the EPnP implementation can be downloaded from a web link reported in Reference [9]. For the three methods, Figure 8 compares the evaluated attitude angle in a single position, while Figure 9 shows the root mean square (RMS) errors calculated for each angle considering all the 22 positions. In our application, the number of available pairs (^C x_k,i, ^S X_i) is rather low (as it will be depicted in Figure 22, it varies from 14 to 38) and the EPnP solution had problems with some internal singular matrices evaluated during the calculation. The best errors are obtained by the Kneip solution, which is selected for step B of the measurement procedure.
As described in Section 2.4, in the preliminary comparison of the methods, two different implementations of the BA approach are compared: the first implementation executes the four steps A-D as described for all the available relative poses N_poses, while the second one executes the four steps A-D on a sub-set of N′_poses (<= N_poses) poses, and then repeats steps A-C (without the BA approach) for all the relative poses N_poses, using the refined 3D points obtained by BA; see Figure 5.

In this second case, steps A-D are executed with the main purpose of evaluating a globally refined set of 3D points. A clear advantage of the second implementation is that the relative pose is evaluated using only steps A-C, while the BA approach (with its additional computational burden) can be executed only once, e.g., before the real-time measurement begins, or only when it is needed, e.g., when thermal deformations may have slightly changed the positions of the 3D points. An important aspect is the choice of the relative poses used to apply the BA refinement of the 3D points. In this work, eight linear positions and eight angular rotations (a total number of 8 × 8 = 64 = N′_poses of the N_poses = 990) were selected to cover the whole linear and angular measurement range. Figure 9, together with the three PnP solutions, also shows the RMS attitude error versus the imposed rotation obtained: after steps A-C, i.e., after the local refinement with non-linear minimization (purple line); after steps A-D, i.e., after the global refinement with the BA approach according to the first implementation (light blue line), in which case the BA approach is applied considering all the available relative poses N_poses; and according to the second implementation of the BA approach (green line).

Figure 8. The evaluated attitude angle versus the imposed rotation for a single position along the linear slide: obtained by the Kneip solution (blue); by the Gao solution (orange line); by the EPnP solution (yellow line).

Figure 9. The root mean square (RMS) error of the attitude angle versus the imposed rotation: obtained by the Kneip solution (blue); by the Gao solution (orange line); by the EPnP solution (yellow line); by the non-linear local optimization (step C) after the Kneip solution inside the RANSAC procedure (step B) (purple line); by the first implementation of the BA approach (light blue line); by the second implementation of the BA approach (green line).

The result is that the second implementation yields the lowest error. Due to the implementation advantage (only steps A-C must be implemented in real-time) and to this preliminary result, the second implementation of the BA approach is selected and is employed in all the analyses presented in Sections 6 and 7. Using the second implementation, the benefits of the BA approach that will be presented in the following Section 6 can be obtained without any increase of the computational burden or worsening of the speed and timing performance of the algorithm during the real-time measurements. The reason is that the global refinement of the 3D points (steps A-D) can be embedded into a calibration phase and can be performed only at the beginning of the operational activities of the system, or on request when it is deemed useful. After the images of the markers are acquired, this calibration phase with BA can be performed off-line using the on-board hardware resources without any timing constraints, or even on the ground (if the images are available), since the purpose is the refinement of the 3D points and not the relative pose evaluation.

6. Results: Effect of Bundle Adjustment (BA)

A more in-depth analysis of the results without the BA approach, as in Reference [8], allows us to find out a partial systematic effect due to the measurements of the 3D points. In detail, Figure 10 (blue line) shows the angular errors obtained without the BA approach, averaged for the 22 distances, versus the rotation angle. These values cannot be directly compared with the errors depicted in the following Figures 11-14, since they are the average of positive and negative values, while in Figures 11-14 an RMS calculation is carried out. The blue line in Figure 10 clearly shows that there is an average angular overestimation for angles lower than ~50° and vice versa for larger angles. This systematic effect can potentially modify the experimental conclusions. However, the orange line in Figure 10 shows the same averaged errors in the case with the BA approach. It is clear that, after the 3D point positions are refined by BA, the systematic over/under-estimation is heavily reduced, even if the 3D points refined with BA remain compatible with the position uncertainty of the 3D points measured by the caliper.

Figure 10. The angular errors, averaged for the 22 distances, versus the rotation angle, obtained with the non-linear approach without BA (blue) and with BA (orange).

After the reduction of the systematic effect allowed by the BA approach (second implementation), Figures 11 and 13 compare the measurement error (the difference between the measured value and the imposed one) for the rotation around the vertical axis obtained with the non-linear approach as in Reference [8] (red line with circular dots) and with the BA approach (blue line with circular dots). The same figures also depict the evaluated extended uncertainties as shaded areas.
Figures 12 and 14 show the measurement errors and the corresponding uncertainties for the linear position without BA (red line and shaded area) and with BA (blue line and shaded area). Figures 11 and 12 illustrate the root mean square (RMS) errors calculated for each angle considering all the 22 positions, while Figures 13 and 14 depict the RMS errors calculated for each position considering all the 45 angles. In a similar way, Figures 11 and 12 show the extended uncertainties with a level of confidence of 99.7% evaluated for each angle, averaging all the 22 positions, while

Figures 13 and 14 depict the extended uncertainties evaluated for each position, averaging all the 45 rotations. Figures 11-14 show that the observed measurement errors are compatible with the evaluated extended uncertainties and highlight the significant improvement provided by the global refinement due to the BA approach. In several cases, the error or uncertainty obtained with BA is less than half of the corresponding value observed without BA. Thus, considering the achievable errors and uncertainties, a clear result of the presented analysis is the undoubted superiority of the approach with the BA refinement.

Figure 11. The root mean square (RMS) error of the attitude angle and its extended uncertainty versus the imposed rotation: the angular errors without BA (red line with circular dots); the extended uncertainties without BA (red shaded area); the angular errors with BA (blue line with circular dots); the extended uncertainties with BA (blue shaded area).

Figure 12. The root mean square (RMS) error of the linear position and its extended uncertainty versus the imposed rotation: the position errors without BA (red line with circular dots); the extended uncertainties without BA (red shaded area); the position errors with BA (blue line with circular dots); the extended uncertainties with BA (blue shaded area).

As it can be observed in Figure 11, a clear effect highlighted by the experimental tests is a reduction of the angular error and uncertainty when the rotation angle is around 45°. This effect was already present in Reference [8], but an interesting result of the present work is that the same behavior is still

As in Reference [8], another effect, highlighted by Figure 14 even after the BA correction of the systematic effect discussed above, is the increase with distance of the position errors and uncertainties. A similar increase of the error and uncertainty can be observed also for the attitude angle (see Figure 13), even if it is less evident. The reason is that an increase in the distance of the observed object yields smaller observed markers. The performed tests allow us to quantify this effect on the obtained measurements. The same effect of the distance will also be analyzed in Section 7.8.

Figure 13. The root mean square (RMS) error of the attitude angle and its extended uncertainty versus the imposed position: angular errors without BA (red line with circular dots); extended uncertainties without BA (red shaded area); angular errors with BA (blue line with circular dots); extended uncertainties with BA (blue shaded area).

Figure 14. The root mean square (RMS) error of the linear position and its extended uncertainty versus the imposed position: position errors without BA (red line with circular dots); extended uncertainties without BA (red shaded area); position errors with BA (blue line with circular dots); extended uncertainties with BA (blue shaded area).

7. Results: Influencing Parameters

To investigate the possible causes and the most influencing characteristics that affect the error and uncertainty of the rotation angle and of the position, the following eight parameters are defined, with reference to Figure 15:

a. The mean angle β between the optical axis and the normal to the marker plane of each point that passes the RANSAC selection; see Figure 15a.
b. The mean angle α between the camera projection line and the normal to the marker plane of each point that passes the RANSAC selection; see Figure 15b.
c. The mean angle α2 or α3 between the tangential or local velocity and the camera projection line of each point that passes the RANSAC selection. When rotations are examined, the tangential velocity with reference to the rotation center is employed, as depicted in Figure 15c, for the calculation of α2. If positions are analyzed, the local velocity of the translation motion is considered for each marker. In the linear displacement case, the direction of the local velocity is the same for all the markers and is considered parallel to the longitudinal slide of the set-up for the calculation of the mean angle α3.
d. The maximum longitudinal distance LongZ of the 3D points; see Figure 15d.
e. The maximum transverse distance TransvX of the 3D points; see Figure 15e.
f. The mean distance DistBar of the 3D points from their barycenter; see Figure 15f.
g. The number N_points of 2D points observed in each image that pass the RANSAC selection. The number N_points takes into account both the observability of the markers, which is generally lower at short distances (see Figure 7d–f) or when only one face of the mock-up is visible (see Figure 7a,c,d,f), and the consensus set yielded by the RANSAC procedure.
h. The distance Dist_BC between the camera (the origin of the C frame) and the barycenter of the 3D points that pass the RANSAC selection.

Figure 15. The schematic representation of the parameters a–f that potentially affect the measurement errors and uncertainties (the yellow square is the top view of the satellite mock-up; the circular red dots are the corners of the square markers; the camera is depicted in red); subplot a represents parameter a (the mean angle β), subplot b represents parameter b (the mean angle α), ..., and subplot f represents parameter f (the mean distance DistBar).

The interest in the angular parameters α2 and α3 comes from the theory of projective geometry: if a 3D point moves along its projection line, i.e., its local velocity is parallel to its projection line, its movement cannot be observed by the camera, and in this case the measurement error should be large. This theoretical reasoning leads to a preference for high values of α2 and α3.
The direction of the local velocity can be easily detected in our set-up, since the rotation is imposed along a known rotation axis by the rotary stage and the displacement is along the linear slide, but in a general motion the parameters α2 or α3 could be difficult to evaluate. Thus, the simplified parameters α and β were conceived.
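A minimal sketch of how α and β can be evaluated for one image, assuming the marker corners and the unit normals of their planes are already expressed in the camera frame (all names are illustrative, not from the paper):

import numpy as np

def mean_angles(pts_cam, normals_cam):
    # pts_cam: Nx3 corners passing the RANSAC selection, in the camera frame
    # normals_cam: Nx3 unit normals of the corresponding marker planes
    z = np.array([0.0, 0.0, 1.0])                      # optical axis
    proj = pts_cam / np.linalg.norm(pts_cam, axis=1, keepdims=True)
    # alpha_i: angle between each projection line and its marker normal
    alpha = np.degrees(np.arccos(np.clip(
        np.abs(np.sum(proj * normals_cam, axis=1)), 0.0, 1.0)))
    # beta_i: angle between the optical axis and each marker normal
    beta = np.degrees(np.arccos(np.clip(np.abs(normals_cam @ z), 0.0, 1.0)))
    return alpha.mean(), beta.mean()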

LongZ, TransvX, and DistBar are parameters that characterize the dispersion of the 3D points. If the 3D points are far from each other and the satellite is rotating around an axis near their barycenter, even a small rotation will yield a significant displacement of the 3D points, and the rotation angle will be easily evaluated with a low error. Thus, the theoretical rationale behind the dispersion parameters LongZ, TransvX, and DistBar is that they could be correlated with the rotation errors and uncertainties. The distance Dist_BC is selected since far 3D points are observed with a large position uncertainty; thus, Dist_BC could be correlated with the position errors. The number of points N_points is introduced since its higher values are generally associated with lower errors in the solution of PnP problems.

For each one of the 990 positions and orientations examined, the eight parameters already defined are evaluated. Then, the Pearson correlation coefficients between the obtained angle and distance measurement errors and uncertainties, and the eight considered parameters, are calculated, to find out if there is any positive or negative correlation between them and to highlight the most influencing parameters. The following Table 1 summarizes the most significant results: each column considers a parameter or its reciprocal value. The analysis considers the evaluated values of each parameter to find out if there is a linear correlation between the achieved errors or uncertainties and the considered parameter. At the same time, the reciprocal values of each parameter are evaluated to find out if there is a hyperbolic correlation. Each row of Table 1 shows a different output result: Rotation Errors (RE) or Incremental Rotation Errors (dRE); Rotation (Extended) Uncertainties (RU) or Incremental Rotation (Extended) Uncertainties (dRU); Position Errors (PE); Position (Extended) Uncertainties (PU). For the rotation angles, both the total rotation measurements, which vary between 0° and 90°, and the incremental rotation measurements are analyzed; in Table 1, only the most significant case (total or incremental) is reported. For each row, the parameters are ordered from the most significant (first column on the left) to the least significant (last column on the right). For each output result (each row), the three most influencing parameters are highlighted.

Table 1. The influencing parameters ordered by the Pearson correlation coefficients, from the most significant (left) to the least significant (right), for the following output results: Rotation Errors (RE) or Incremental Rotation Errors (dRE); Rotation (Extended) Uncertainties (RU) or Incremental Rotation (Extended) Uncertainties (dRU); Position Errors (PE); Position (Extended) Uncertainties (PU).

RE/dRE: 1/α, β, 1/LongZ, 1/α2, 1/DistBar, 1/N_points, TransvX, Dist_BC
RU/dRU: 1/α, β, LongZ, 1/N_points, 1/DistBar, 1/α2, TransvX, Dist_BC
PE: LongZ, β, α, Dist_BC, 1/DistBar, N_points, TransvX, 1/α3
PU: 1/α, LongZ, β, 1/α3, 1/DistBar, N_points, TransvX, Dist_BC
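The ranking logic behind Table 1 can be reproduced with a few lines; the sketch below (hypothetical names, plain NumPy) tests each parameter and its reciprocal against one output result, e.g., the rotation errors:

import numpy as np

def pearson(x, y):
    x = x - x.mean()
    y = y - y.mean()
    return float((x * y).sum() / np.sqrt((x ** 2).sum() * (y ** 2).sum()))

# params: dict mapping a parameter name to its values over the 990 poses;
# out: the matching errors or uncertainties (same length)
def rank_parameters(params, out):
    scores = {}
    for name, v in params.items():
        scores[name] = pearson(v, out)               # linear correlation
        scores['1/' + name] = pearson(1.0 / v, out)  # hyperbolic correlation
    # order from the most to the least correlated (in absolute value)
    return sorted(scores.items(), key=lambda kv: -abs(kv[1]))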
Table 1 gives a clear and coherent idea of the most influencing parameters. For all the output results (the rotation errors and uncertainties, and the position errors and uncertainties), the three most important parameters are the same: the two mean angles α and β, and the maximum longitudinal distance of the observed 3D points. The angles α and β take into account the mean inclination between the marker plane and the projection line or the optical axis, respectively; see Figure 15a,b. This result highlights that the achievable errors and uncertainties are lower (better) if the mean angles α and β are higher. This conclusion is true for the considered values of α and β, which vary between 0° and 50–60° in the experimental tests, as depicted in the following figures. Thus, in the design of a measurement system based on markers, a clear indication is to select the satellite surfaces for the markers in order to maximize those two mean angles α and β. This means that the obtained errors and uncertainties are better if the markers are inclined, and not orthogonal, with reference to the projection lines and/or the optical axis. Nothing sure can be said if the markers tend to be parallel to the projection lines and/or the

optical axis, since the maximum α and β achieved in the described tests are about 50–60°. A second clear indication is to select the marker positions to maximize the longitudinal distance among them.

Since the Pearson correlation coefficients give a concise but partial representation of a possible correlation between two quantities, in the following sections the rotation and/or position errors and/or extended uncertainties are graphically depicted versus each considered parameter. The graphical representation could highlight correlations in a more general way. Sections 7.1–7.8 analyze each parameter one at a time, to search for possible correlations not highlighted by the correlation coefficients or to confirm the findings of Table 1.

Figure 16. The rotation errors vs. the mean angle β.

Figure 17. The rotation extended uncertainties vs. the mean angle β.

Figure 18. The rotation extended uncertainties vs. the mean angle β.

Figure 19. The incremental rotation errors vs. the mean angle α.

Figure 20. The rotation extended uncertainties vs. the mean angle α.

Figure 21. The position extended uncertainties vs. the mean angle α.

7.1. Parameter a: Mean Angle β

According to Table 1, the mean angle β is the second most influencing parameter in three cases (rows) out of four. In all four cases (rows) of Table 1, the correlation is more significant with the β values and not with their reciprocal ones. Thus, the correlation coefficient indicates that a linear correlation is more evident than a hyperbolic one. These findings are confirmed by Figure 16 (rotation errors vs. β) and Figure 18 (rotation extended uncertainties vs. β), while Figure 17 (rotation extended uncertainties vs. β) seems to suggest an intermediate behavior between linear and hyperbolic.

7.2. Parameter b: Mean Angle α

In three rows out of four of Table 1, the highest values of the correlation coefficient are obtained for the mean angle α, between each projection line and the normal to the marker plane. The absolute highest value of the correlation coefficient is achieved between the rotation extended uncertainties and the reciprocal values of the angles α. Figure 20 depicts this best correlation case graphically and confirms a hyperbolic behavior. Figure 19 shows the incremental rotation errors, while Figure 21 depicts the case of the position extended uncertainties. In all Figures 19–21, there is a hyperbolic increase of the errors and uncertainties when the angle α decreases. Both Table 1 and the figures suggest that the correlation between the errors/uncertainties and α is slightly stronger than in the case of β.

Figure 22 shows the angles α and β and the number of points N_points (see Section 7.7) versus the imposed rotation angle; for each imposed rotation angle, the depicted value is averaged for all the 22 positions. Taking into account Figure 22 and the strong correlation between α and the rotation errors and uncertainties, the reduction of the angular error and uncertainty near 45–50° (of imposed rotation), observed in Figure 11 in Section 6, can be understood: for the central angular positions near 45–50°, when the observed surfaces of the satellite mock-up are both inclined (see Figure 7b,e), the angles α and β reach their maximum values and the obtained errors and uncertainties are lower. This is a clear indication for the position selection of the fiducial markers on a satellite surface.

Figure 22 shows that the mean number of points N_points remains near 16 points up to an imposed rotation equal to 18°, and near 20 points over a higher angular range. Comparing Figures 11 and 22, and observing only those ranges of the imposed angle, there is an evident error and uncertainty variation in the presence of a sensible variation of the mean angles α and β, while the mean number of pairs (^C x_{k,i}, ^S X_i) remains constant. However, since the shape of the depicted curve of N_points (versus the imposed rotation) is similar to the corresponding curves of α and β in Figure 22, and since it is generally accepted that an increase of the number of points improves the final results, the doubt that the observed error reduction near 45–50° is due to the number of points and is not associated with the values of α and β could legitimately arise. To clarify this doubt, the whole analysis is repeated imposing that the total number of points N_points is not larger than 14. This limitation is achieved in the RANSAC procedure by imposing that the consensus set cannot have more than 14 pairs (^C x_{k,i}, ^S X_i). Since in the analyzed images the minimum number of available points is always larger than 14, the effect of the consensus set limitation is that the number of points N_points is constant and does not change. Figure 23 compares the parameters α and β and the RMS error of the attitude angle, all obtained with the number of points N_points kept constant and equal to 14. In this way, the effect of N_points is removed from the analysis. It is clear that the correlation between α, β, and the RMS angular error highlighted by Figures 11 and 22, with N_points variable, is very similar and still evident in the case of Figure 23, without the effect of N_points. Figure 24 compares the RMS error of the attitude angle evaluated considering all the 22 positions, in the case with variable N_points (up to 38) and in the case with constant N_points (equal to 14). It is clear that the error is generally lower in the case with all the points. To conclude, from Figures 22–24, both the parameters α/β and N_points can be exploited to improve the final measurement error.

A possible explanation of the results illustrated in Figures 11 and 16–23 is that, in our experimental set-up, a rotation of the target spacecraft yields larger movements of the 2D projected points if the mean angles α, β, and α2 are high. Larger movements for the same rotation mean that the imposed angular displacement is better observed by the vision system and the final measurement errors and uncertainties are lower.
For high values of the angles α, β, and α2, the positive effect of the larger observed movements is mitigated and balanced by the negative effect of the projective deformation of the markers, which increases the uncertainty of the detected 2D points. This second negative effect with high values of the angles α, β, and α2 is not evident in the presented analysis, since their maximum values are limited. The same explanation of the larger movements of the projected points is also valid for the following parameters LongZ and TransvX, as described at the end of Section 7.5.
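The constant-N_points check described above can be approximated as follows. This is only a sketch: the paper imposes the limit inside the RANSAC procedure itself, while here, for brevity, the inlier set returned by OpenCV's RANSAC is truncated before the non-linear refinement (names are illustrative).

import cv2
import numpy as np

def pose_with_capped_points(obj_pts, img_pts, K, dist, cap=14, seed=0):
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(obj_pts, img_pts, K, dist)
    idx = inliers.ravel()
    if idx.size > cap:   # keep N_points constant and equal to cap
        idx = np.random.default_rng(seed).choice(idx, size=cap, replace=False)
    # local non-linear refinement on the capped consensus set only
    ok, rvec, tvec = cv2.solvePnP(obj_pts[idx], img_pts[idx], K, dist,
                                  rvec, tvec, useExtrinsicGuess=True)
    return rvec, tvec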

Figure 22. The mean angle α (black), β (red), and the mean number of points N_points (blue) vs. the imposed rotations.

Figure 23. The mean angle α (black), β (red), and the RMS error of the attitude angle (blue) vs. the imposed rotations, computed with N_points = 14.

Figure 24. The RMS error of the attitude angle vs. the imposed rotations, computed using all the available points (orange line) and with N_points = 14 (blue line).

Coming back to the analysis with all the points and looking at Figure 22, it can be verified that up to 18° and between 86° and 90°, all the detected points belong to a planar face of the satellite mock-up. Someone may argue that, in the presence of coplanarity among the 3D points, the homography transformation could be exploited, instead of the P3P solution. A selective employment of the homography transformation could be the subject of an interesting future investigation, but it is beyond the purpose of the present work. The same authors of the selected closed-form direct solution of the P3P problem, [16], do not highlight any degenerate coplanar configuration of the 3D points. Moreover, in Figure 11, comparing the angular ranges with coplanarity (0–18° and 86–90°) and without coplanarity (20–84°), there is not an evident discontinuity of the obtained errors and uncertainties. Thus, from Figures 11 and 22, even if an effect of coplanarity cannot be excluded, other factors (e.g., the mean angles α and β) appear more important and influencing for the final errors and uncertainties.
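For the ranges in which all the points are coplanar, a planar-specific solution is readily available in common libraries; the sketch below shows, for illustration only, both a homography fit and OpenCV's planar PnP solver. This is an assumption about how such an alternative could look, not the method of the paper, which keeps the P3P solution of Reference [16] throughout.

import cv2
import numpy as np

def planar_pose(obj_xy, img_pts, K):
    # obj_xy: Nx2 marker corners on the common plane (z = 0); img_pts: Nx2
    H, mask = cv2.findHomography(obj_xy, img_pts, cv2.RANSAC)
    obj_xyz = np.hstack([obj_xy, np.zeros((len(obj_xy), 1))]).astype(np.float64)
    # SOLVEPNP_IPPE is specialized for coplanar point configurations
    ok, rvec, tvec = cv2.solvePnP(obj_xyz, img_pts, K, None,
                                  flags=cv2.SOLVEPNP_IPPE)
    return H, rvec, tvec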

7.3. Parameter c: Mean Angles α2 and α3

The correlation coefficients of Table 1 obtained in the case of the mean angles α2 and α3 are less significant than those of the other two mean angles α and β. However, all the following Figures 25–27 show a clear error/uncertainty increase when α2/α3 decreases. Moreover, from a theoretical point of view, the angular distance between the projection line and the local/tangential velocity (considered in α2 and α3), instead of the marker plane (considered in α and β), should be significant for incremental rotations or displacements, since the motion of a 2D point cannot be observed if its angle α2/α3 is equal to zero (the considered 2D point moves along the projection line and its position in the image does not change). These observations lead to the indication that the angles α2 and α3 are also significant and should be taken into account in the design of a measurement system based on markers.

Figure 25. The incremental rotation errors vs. the mean angle α2.

Figure 26. The rotation extended uncertainties vs. the mean angle α2.

Figure 27. The position extended uncertainties vs. the mean angle α3.

7.4. Parameter d: Maximum Longitudinal Distance LongZ

The maximum longitudinal distance of the 3D points is the third most influencing parameter if the correlation coefficients of Table 1 are considered. The following Figures 28 and 29 depict, respectively, the rotation and position extended uncertainties versus the parameter LongZ and confirm a clear negative correlation between them.

Figure 28. The rotation extended uncertainties vs. the maximum longitudinal distance LongZ.

Figure 29. The position extended uncertainties vs. the maximum longitudinal distance LongZ.
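The dispersion parameters are straightforward to evaluate; a minimal sketch follows, assuming the longitudinal axis is z and the transverse axis is x in the frame used by the set-up (names and axis conventions are illustrative assumptions):

import numpy as np

def dispersion_parameters(pts):
    # pts: Nx3 3D marker corners that pass the RANSAC selection
    long_z = pts[:, 2].max() - pts[:, 2].min()     # LongZ, Figure 15d
    transv_x = pts[:, 0].max() - pts[:, 0].min()   # TransvX, Figure 15e
    bary = pts.mean(axis=0)
    dist_bar = np.linalg.norm(pts - bary, axis=1).mean()  # DistBar, Figure 15f
    return long_z, transv_x, dist_bar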

7.5. Parameter e: Maximum Transverse Distance TransvX

The maximum transverse distance among the 3D points exhibits the second least significant correlation coefficient in all the four rows of Table 1. Figure 30 shows the angle error versus the parameter TransvX and confirms that there is no linear or hyperbolic correlation between them. The effect of the transversal distribution of the 3D points appears substantially different from the longitudinal one. From Figure 30, it is clear that both low and high values of TransvX are associated with low angular errors, while the higher errors take place when TransvX takes intermediate values. The different behavior of the two parameters LongZ and TransvX (as highlighted by the values of the correlation coefficients and by Figure 28 vs. Figure 30) can be explained considering the observed movements of the 2D projected points. If the 3D points (markers) are dispersed and spaced out along the longitudinal direction, a rotation (around an axis roughly orthogonal to the longitudinal optical axis, as in the experimental set-up, and near the 3D point barycenter) yields transverse displacements of the 3D points and larger observed movements of the projected 2D points than in the case with the 3D points dispersed along a transverse direction. Larger observed 2D movements allow us to obtain lower errors and uncertainties in the rotation measurement.

Figure 30. The rotation errors vs. the maximum transverse distance TransvX.

7.6. Parameter f: Mean Distance DistBar of the 3D Points from Their Barycenter

The parameter DistBar, which is the mean distance between the 3D points and their barycenter (Figure 15f), does not excel for its correlation coefficients in Table 1. This low correlation is graphically confirmed by Figure 31, which illustrates the case of the position extended uncertainties and shows that all uncertainty values are possible with the same value of DistBar (see a vertical line). However, if the rotations are considered, Figure 32 shows that the angular errors exhibit a slight negative correlation with DistBar: the longitudinal motion does not seem affected by DistBar in a general and evident way, while the rotation errors increase if DistBar gets lower.

Figure 31. The position extended uncertainties vs. the mean distance DistBar.

Figure 32. The rotation errors vs. the mean distance DistBar.

7.7. Parameter g: Number of Points N_points

Both in Table 1 and in the following Figures 33 and 34, the number of points N_points exhibits an effect very similar to that of the parameter DistBar: no evident correlation with the position and a slight negative correlation with the rotation. Figures 22, 24, and 33 illustrate a slight correlation with the final angular errors and their uncertainties, as in Table 1, confirming the generally accepted belief that an increase of N_points reduces the final error. A possible effect of the spatial distribution (with/without coplanarity), when N_points is low, has already been discussed in Section 7.2.

Figure 33. The rotation errors vs. the number of points N_points.

Figure 34. The position extended uncertainties vs. the number of points N_points.

7.8. Parameter h: Distance Dist_BC of the Barycenter from the Camera

In Table 1, the distance between the point barycenter and the camera is the least significant parameter in three out of four cases (rows). However, both Figure 14 of Section 6 and the following Figures 35 and 36 highlight a slight positive correlation, which cannot be detected only from the correlation coefficient (except in the position error case, summarized in the third row of Table 1): if the observed target gets farther, the errors and uncertainties increase. Looking at Figure 36, the position uncertainties are aligned along almost straight lines, and each line corresponds to a different inclination of the spacecraft. The best line, i.e., the one with the lowest uncertainty increment for the same distance variation, exhibits an inclination roughly equal to 4 mm/m. This means that an increment of 1 m in the camera-satellite distance yields an uncertainty increase of 4 mm. The worst case has an inclination of 21 mm/m, while the average inclination, which can be evaluated from Figure 14, is roughly 8 mm/m. We warn that these results are obtained with a limited total distance variation of about 1.05 m and that any extrapolation could be dangerous.

Figure 35. The position errors vs. the distance of the barycenter from the camera Dist_BC.

Figure 36. The position extended uncertainties vs. the distance of the barycenter from the camera Dist_BC.

As already said in Section 6, this behavior seems reasonable and coherent with the reduction of the observed size of the markers when the distance increases.
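The line inclinations quoted above are ordinary least-squares slopes; a small illustrative sketch (hypothetical names), applied to the subset of poses sharing one spacecraft inclination, could be:

import numpy as np

def uncertainty_slope(dist_m, unc_mm):
    # dist_m: camera-satellite distances in metres; unc_mm: extended
    # position uncertainties in millimetres, for one inclination
    slope, intercept = np.polyfit(dist_m, unc_mm, 1)
    return slope   # mm of extra uncertainty per metre of distance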
8. Conclusions

An improved approach based on a single camera and fiducial markers for the measurement of the relative pose between a target and a chaser spacecraft is presented and described. The measurement procedure integrates a closed-form solution of the PnP problem, a RANSAC procedure, a non-linear local optimization, and a global Bundle Adjustment refinement of both the measured poses and the 3D point map. The employed experimental set-up allows us a metrological characterization and an evaluation of the measurement errors, while a Monte Carlo analysis allows us to evaluate the position and rotation uncertainties.

The application of the method to an experimental data set acquired in the laboratory highlights the significant improvements provided by the Bundle Adjustment refinement. A set of possible influencing physical parameters is defined, and their correlations with the rotation and position errors and uncertainties are analyzed. Using both numerical quantitative correlation coefficients and qualitative graphical representations, the most significant parameters for the final measurement errors and uncertainties are found. In detail, the mean angle α between each projection line of the 3D points and the normal to the marker planes resulted the most influencing parameter, whose maximum values, around 50 deg, exhibit the best uncertainty and the lowest errors. The obtained results give clear indications and advice for the design of future measurement systems and for the selection of the marker positioning on a satellite surface.

Author Contributions: Conceptualization, M.P., S.C., R.G., S.D., and E.C.L.; Methodology, M.P., S.C., R.G., M.M., A.V., and A.F.; Software, M.P., S.C., R.G., M.M., and A.F.; Validation, M.P., S.C., R.G., A.F., S.D., and E.C.L.; Formal Analysis, M.P.; Investigation, M.P., S.C., R.G., M.M., A.V., A.F., S.D., and E.C.L.; Resources, M.P.; Data Curation, M.P. and M.M.; Writing-Original Draft Preparation, M.P.; Writing-Review & Editing, M.P., S.C., R.G., M.M., A.V., A.F., S.D., and E.C.L.; Supervision, M.P., S.D., and E.C.L.; Project Administration, M.P., S.D., and E.C.L.; Funding Acquisition, S.D. and E.C.L.

Funding: This research received no external funding.

Conflicts of Interest: The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, and in the decision to publish the results.

References

1. Mokuno, M.; Kawano, I. In-orbit demonstration of an optical navigation system for autonomous rendezvous and docking. AIAA J. Spacecr. Rockets 2011, 48. [CrossRef]
2. Howard, R.; Heaton, A.; Pinson, R.; Carrington, C. Orbital Express Advanced Video Guidance Sensor. In Proceedings of the 2008 IEEE Aerospace Conference, Big Sky, MT, USA, 1-8 March 2008.
3. Howard, R.T.; Heaton, A.F.; Pinson, R.M.; Carrington, C.L.; Lee, J.E.; Bryan, T.C.; Robertson, B.A.; Spencer, S.H.; Johnson, J.E. The Advanced Video Guidance Sensor: Orbital Express and the next generation. In AIP Conference Proceedings; AIP: College Park, MD, USA, 2008; Volume 969.
4. Bodin, P.; Noteborn, R.; Larsson, R.; Karlsson, T.; D'Amico, S.; Ardaens, J.S.; Delpech, M.; Berges, J.C. PRISMA formation flying demonstrator: Overview and conclusions from the nominal mission. Adv. Astronaut. Sci. 2012, 144.
5. Tweddle, B.E. Relative Computer Vision Based Navigation for Small Inspection Spacecraft. In Proceedings of the AIAA Guidance, Navigation and Control Conference, Portland, OR, USA, 8-11 August 2011.
6. Tweddle, B.E.; Saenz-Otero, A. Relative Computer Vision-Based Navigation for Small Inspection Spacecraft. J. Guid. Control Dyn. 2014, 38. [CrossRef]
7. Sansone, F.; Branz, F.; Francesconi, A. A Relative Navigation Sensor for Cubesats Based on Retro-reflective Markers. In Proceedings of the International Workshop on Metrology for Aerospace, Padua, Italy, June 2017.
8. Pertile, M.; Mazzucato, M.; Chiodini, S.; Debei, S.; Lorenzini, E. Uncertainty evaluation of a vision system for pose measurement of a spacecraft with fiducial markers. In Proceedings of the International Workshop on Metrology for Aerospace, Benevento, Italy, 4-5 June.
9. Lepetit, V.; Moreno-Noguer, F.; Fua, P. EPnP: An Accurate O(n) solution to the PnP problem. Int. J. Comput. Vis. 2009, 81. [CrossRef]
10. Ma, Y.; Soatto, S.; Kosecka, J.; Sastry, S.S. An Invitation to 3-D Vision; Springer Science & Business Media: Berlin, Germany, 2004.
11. Hartley, R.; Zisserman, A. Multiple View Geometry in Computer Vision, 2nd ed.; Cambridge University Press: New York, NY, USA, 2003.
12. Haralick, R.; Lee, C.; Ottenberg, K.; Nolle, M. Review and Analysis of Solutions of the Three Point Perspective Pose Estimation Problem. Int. J. Comput. Vis. 1994, 13. [CrossRef]

13. Quan, L.; Lan, Z. Linear N-point camera pose determination. IEEE Trans. Pattern Anal. Mach. Intell. 1999, 21. [CrossRef]
14. Gao, X.; Hou, X.; Tang, J.; Cheng, H. Complete solution classification for the perspective-three-point problem. IEEE Trans. Pattern Anal. Mach. Intell. 2003, 25.
15. Nister, D.; Stewenius, H. A minimal solution to the generalized 3-point pose problem. J. Math. Imaging Vis. 2006, 27. [CrossRef]
16. Kneip, L.; Scaramuzza, D.; Siegwart, R. A Novel Parametrization of the Perspective-Three-Point Problem for a Direct Computation of Absolute Camera Position and Orientation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2011, Colorado Springs, CO, USA, June 2011.
17. Christian, J.A.; Robinson, S.B.; D'Souza, C.N.; Ruiz, J.P. Cooperative Relative Navigation of Spacecraft Using Flash Light Detection and Ranging Sensors. J. Guid. Control Dyn. 2014, 37. [CrossRef]
18. Opromolla, R.; Fasano, G.; Rufino, G.; Grassi, M. A review of cooperative and uncooperative spacecraft pose determination techniques for close-proximity operations. Prog. Aerosp. Sci. 2017, 93. [CrossRef]
19. Jeong, J.; Kim, S.; Suk, J. Parametric study of sensor placement for vision-based relative navigation system of multiple spacecraft. Acta Astronaut. 2017, 141. [CrossRef]
20. Yousefian, P.; Durali, M.; Jalali, M.A. Optimal design and simulation of sensor arrays for solar motion estimation. IEEE Sens. J. 2017, 17. [CrossRef]
21. Jackson, B.; Carpenter, B. Optimal placement of spacecraft sun sensors using stochastic optimization. In Proceedings of the IEEE Aerospace Conference, Big Sky, MT, USA, 6-13 March 2004; Volume 6.
22. Kato, H.; Billinghurst, M. Marker tracking and HMD calibration for a video-based augmented reality conferencing system. In Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality, IWAR '99; IEEE Computer Society: Washington, DC, USA, October 1999.
23. Garrido-Jurado, S.; Muñoz-Salinas, R.; Madrid-Cuevas, F.J.; Marín-Jiménez, M.J. Automatic generation and detection of highly reliable fiducial markers under occlusion. Pattern Recognit. 2014, 47. [CrossRef]
24. Ahn, S.J.; Warnecke, H.J.; Kotowski, R. Systematic Geometric Image Measurement Errors of Circular Object Targets: Mathematical Formulation and Correction. Photogramm. Rec. 1999, 16. [CrossRef]
25. Zhang, Z. A Flexible New Technique for Camera Calibration. IEEE Trans. PAMI 2000, 22. [CrossRef]
26. Gonzalez, R.C.; Woods, R.E.; Eddins, S.L. Digital Image Processing Using MATLAB; Pearson Prentice Hall: Upper Saddle River, NJ, USA, 2004.
27. Heckbert, P.S.; Garland, M. Survey of Polygonal Surface Simplification Algorithms; Multiresolution Surface Modeling Course SIGGRAPH 97, Carnegie Mellon University Technical Report; Carnegie Mellon University School of Computer Science: Pittsburgh, PA, USA, 1997.
28. Fischler, M.A.; Bolles, R.C. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Commun. ACM 1981, 24. [CrossRef]
29. Moré, J.J. The Levenberg-Marquardt Algorithm: Implementation and Theory. In Numerical Analysis; Lecture Notes in Mathematics 630; Watson, G.A., Ed.; Springer: Berlin, Germany, 1977.
30. Lourakis, M.I.A.; Argyros, A.A. SBA: A Software Package for Generic Sparse Bundle Adjustment. ACM Trans. Math. Softw. 2009. [CrossRef]
31. Triggs, B.; McLauchlan, P.; Hartley, R.; Fitzgibbon, A. Bundle Adjustment: A Modern Synthesis. In International Workshop on Vision Algorithms; Springer: Berlin/Heidelberg, Germany, 1999.
32. BIPM; IEC; IFCC; ILAC; ISO; IUPAC; IUPAP; OIML.
Evaluation of Measurement Data: Guide to the Expression of Uncertainty in Measurement; International Organization for Standardization: Geneva, Switzerland, 2008.
33. BIPM; IEC; IFCC; ILAC; ISO; IUPAC; IUPAP; OIML. Evaluation of Measurement Data: Supplement 1 to the Guide to the Expression of Uncertainty in Measurement; Propagation of Distributions Using a Monte Carlo Method; International Organization for Standardization: Geneva, Switzerland, 2008.

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.


More information

3D Model Acquisition by Tracking 2D Wireframes

3D Model Acquisition by Tracking 2D Wireframes 3D Model Acquisition by Tracking 2D Wireframes M. Brown, T. Drummond and R. Cipolla {96mab twd20 cipolla}@eng.cam.ac.uk Department of Engineering University of Cambridge Cambridge CB2 1PZ, UK Abstract

More information

Ch 22 Inspection Technologies

Ch 22 Inspection Technologies Ch 22 Inspection Technologies Sections: 1. Inspection Metrology 2. Contact vs. Noncontact Inspection Techniques 3. Conventional Measuring and Gaging Techniques 4. Coordinate Measuring Machines 5. Surface

More information

arxiv: v1 [cs.cv] 28 Sep 2018

arxiv: v1 [cs.cv] 28 Sep 2018 Camera Pose Estimation from Sequence of Calibrated Images arxiv:1809.11066v1 [cs.cv] 28 Sep 2018 Jacek Komorowski 1 and Przemyslaw Rokita 2 1 Maria Curie-Sklodowska University, Institute of Computer Science,

More information

Geometric Rectification of Remote Sensing Images

Geometric Rectification of Remote Sensing Images Geometric Rectification of Remote Sensing Images Airborne TerrestriaL Applications Sensor (ATLAS) Nine flight paths were recorded over the city of Providence. 1 True color ATLAS image (bands 4, 2, 1 in

More information

Robot Vision without Calibration

Robot Vision without Calibration XIV Imeko World Congress. Tampere, 6/97 Robot Vision without Calibration Volker Graefe Institute of Measurement Science Universität der Bw München 85577 Neubiberg, Germany Phone: +49 89 6004-3590, -3587;

More information

UAV Position and Attitude Sensoring in Indoor Environment Using Cameras

UAV Position and Attitude Sensoring in Indoor Environment Using Cameras UAV Position and Attitude Sensoring in Indoor Environment Using Cameras 1 Peng Xu Abstract There are great advantages of indoor experiment for UAVs. Test flights of UAV in laboratory is more convenient,

More information

Stereo Vision. MAN-522 Computer Vision

Stereo Vision. MAN-522 Computer Vision Stereo Vision MAN-522 Computer Vision What is the goal of stereo vision? The recovery of the 3D structure of a scene using two or more images of the 3D scene, each acquired from a different viewpoint in

More information

Satellite Attitude Determination

Satellite Attitude Determination Satellite Attitude Determination AERO4701 Space Engineering 3 Week 5 Last Week Looked at GPS signals and pseudorange error terms Looked at GPS positioning from pseudorange data Looked at GPS error sources,

More information

Vision-based Localization of an Underwater Robot in a Structured Environment

Vision-based Localization of an Underwater Robot in a Structured Environment Vision-based Localization of an Underwater Robot in a Structured Environment M. Carreras, P. Ridao, R. Garcia and T. Nicosevici Institute of Informatics and Applications University of Girona Campus Montilivi,

More information

NUMERICAL METHOD TO ESTIMATE TOLERANCES COMBINED EFFECTS ON A MECHANICAL SYSTEM

NUMERICAL METHOD TO ESTIMATE TOLERANCES COMBINED EFFECTS ON A MECHANICAL SYSTEM INTERNATIONAL DESIGN CONFERENCE - DESIGN 2004 Dubrovnik, May 18-21, 2004. NUMERICAL METHOD TO ESTIMATE TOLERANCES COMBINED EFFECTS ON A MECHANICAL SYSTEM D. Cambiaghi, A. Magalinia and D. Vetturi Keywords:

More information

PERFORMANCE CAPTURE FROM SPARSE MULTI-VIEW VIDEO

PERFORMANCE CAPTURE FROM SPARSE MULTI-VIEW VIDEO Stefan Krauß, Juliane Hüttl SE, SoSe 2011, HU-Berlin PERFORMANCE CAPTURE FROM SPARSE MULTI-VIEW VIDEO 1 Uses of Motion/Performance Capture movies games, virtual environments biomechanics, sports science,

More information

A novel approach to motion tracking with wearable sensors based on Probabilistic Graphical Models

A novel approach to motion tracking with wearable sensors based on Probabilistic Graphical Models A novel approach to motion tracking with wearable sensors based on Probabilistic Graphical Models Emanuele Ruffaldi Lorenzo Peppoloni Alessandro Filippeschi Carlo Alberto Avizzano 2014 IEEE International

More information

3D-2D Laser Range Finder calibration using a conic based geometry shape

3D-2D Laser Range Finder calibration using a conic based geometry shape 3D-2D Laser Range Finder calibration using a conic based geometry shape Miguel Almeida 1, Paulo Dias 1, Miguel Oliveira 2, Vítor Santos 2 1 Dept. of Electronics, Telecom. and Informatics, IEETA, University

More information

CS231A Midterm Review. Friday 5/6/2016

CS231A Midterm Review. Friday 5/6/2016 CS231A Midterm Review Friday 5/6/2016 Outline General Logistics Camera Models Non-perspective cameras Calibration Single View Metrology Epipolar Geometry Structure from Motion Active Stereo and Volumetric

More information

Miniature faking. In close-up photo, the depth of field is limited.

Miniature faking. In close-up photo, the depth of field is limited. Miniature faking In close-up photo, the depth of field is limited. http://en.wikipedia.org/wiki/file:jodhpur_tilt_shift.jpg Miniature faking Miniature faking http://en.wikipedia.org/wiki/file:oregon_state_beavers_tilt-shift_miniature_greg_keene.jpg

More information

Available online at ScienceDirect. Energy Procedia 69 (2015 )

Available online at   ScienceDirect. Energy Procedia 69 (2015 ) Available online at www.sciencedirect.com ScienceDirect Energy Procedia 69 (2015 ) 1885 1894 International Conference on Concentrating Solar Power and Chemical Energy Systems, SolarPACES 2014 Heliostat

More information

FIDUCIAL BASED POSE ESTIMATION ADEWOLE AYOADE ALEX YEARSLEY

FIDUCIAL BASED POSE ESTIMATION ADEWOLE AYOADE ALEX YEARSLEY FIDUCIAL BASED POSE ESTIMATION ADEWOLE AYOADE ALEX YEARSLEY OVERVIEW Objective Motivation Previous Work Methods Target Recognition Target Identification Pose Estimation Testing Results Demonstration Conclusion

More information

3D-OBJECT DETECTION METHOD BASED ON THE STEREO IMAGE TRANSFORMATION TO THE COMMON OBSERVATION POINT

3D-OBJECT DETECTION METHOD BASED ON THE STEREO IMAGE TRANSFORMATION TO THE COMMON OBSERVATION POINT 3D-OBJECT DETECTION METHOD BASED ON THE STEREO IMAGE TRANSFORMATION TO THE COMMON OBSERVATION POINT V. M. Lisitsyn *, S. V. Tikhonova ** State Research Institute of Aviation Systems, Moscow, Russia * lvm@gosniias.msk.ru

More information

Study on Gear Chamfering Method based on Vision Measurement

Study on Gear Chamfering Method based on Vision Measurement International Conference on Informatization in Education, Management and Business (IEMB 2015) Study on Gear Chamfering Method based on Vision Measurement Jun Sun College of Civil Engineering and Architecture,

More information

Object Recognition with Invariant Features

Object Recognition with Invariant Features Object Recognition with Invariant Features Definition: Identify objects or scenes and determine their pose and model parameters Applications Industrial automation and inspection Mobile robots, toys, user

More information

Computer and Machine Vision

Computer and Machine Vision Computer and Machine Vision Lecture Week 12 Part-2 Additional 3D Scene Considerations March 29, 2014 Sam Siewert Outline of Week 12 Computer Vision APIs and Languages Alternatives to C++ and OpenCV API

More information

Camera calibration. Robotic vision. Ville Kyrki

Camera calibration. Robotic vision. Ville Kyrki Camera calibration Robotic vision 19.1.2017 Where are we? Images, imaging Image enhancement Feature extraction and matching Image-based tracking Camera models and calibration Pose estimation Motion analysis

More information

Structure from motion

Structure from motion Structure from motion Structure from motion Given a set of corresponding points in two or more images, compute the camera parameters and the 3D point coordinates?? R 1,t 1 R 2,t R 2 3,t 3 Camera 1 Camera

More information

arxiv: v1 [cs.cv] 18 Sep 2017

arxiv: v1 [cs.cv] 18 Sep 2017 Direct Pose Estimation with a Monocular Camera Darius Burschka and Elmar Mair arxiv:1709.05815v1 [cs.cv] 18 Sep 2017 Department of Informatics Technische Universität München, Germany {burschka elmar.mair}@mytum.de

More information

Scene Reconstruction from Uncontrolled Motion using a Low Cost 3D Sensor

Scene Reconstruction from Uncontrolled Motion using a Low Cost 3D Sensor Scene Reconstruction from Uncontrolled Motion using a Low Cost 3D Sensor Pierre Joubert and Willie Brink Applied Mathematics Department of Mathematical Sciences University of Stellenbosch, South Africa

More information

Model-based segmentation and recognition from range data

Model-based segmentation and recognition from range data Model-based segmentation and recognition from range data Jan Boehm Institute for Photogrammetry Universität Stuttgart Germany Keywords: range image, segmentation, object recognition, CAD ABSTRACT This

More information

Advanced Vision Guided Robotics. David Bruce Engineering Manager FANUC America Corporation

Advanced Vision Guided Robotics. David Bruce Engineering Manager FANUC America Corporation Advanced Vision Guided Robotics David Bruce Engineering Manager FANUC America Corporation Traditional Vision vs. Vision based Robot Guidance Traditional Machine Vision Determine if a product passes or

More information

Introduction to Digitization Techniques for Surgical Guidance

Introduction to Digitization Techniques for Surgical Guidance Introduction to Digitization Techniques for Surgical Guidance Rebekah H. Conley, MS and Logan W. Clements, PhD Department of Biomedical Engineering Biomedical Modeling Laboratory Outline Overview of Tracking

More information

Agenda. Rotations. Camera models. Camera calibration. Homographies

Agenda. Rotations. Camera models. Camera calibration. Homographies Agenda Rotations Camera models Camera calibration Homographies D Rotations R Y = Z r r r r r r r r r Y Z Think of as change of basis where ri = r(i,:) are orthonormal basis vectors r rotated coordinate

More information

DRC A Multi-Camera System on PC-Cluster for Real-time 3-D Tracking. Viboon Sangveraphunsiri*, Kritsana Uttamang, and Pongsakon Pedpunsri

DRC A Multi-Camera System on PC-Cluster for Real-time 3-D Tracking. Viboon Sangveraphunsiri*, Kritsana Uttamang, and Pongsakon Pedpunsri The 23 rd Conference of the Mechanical Engineering Network of Thailand November 4 7, 2009, Chiang Mai A Multi-Camera System on PC-Cluster for Real-time 3-D Tracking Viboon Sangveraphunsiri*, Kritsana Uttamang,

More information

Exploitation of GPS-Control Points in low-contrast IR-imagery for homography estimation

Exploitation of GPS-Control Points in low-contrast IR-imagery for homography estimation Exploitation of GPS-Control Points in low-contrast IR-imagery for homography estimation Patrick Dunau 1 Fraunhofer-Institute, of Optronics, Image Exploitation and System Technologies (IOSB), Gutleuthausstr.

More information

Practical Robotics (PRAC)

Practical Robotics (PRAC) Practical Robotics (PRAC) A Mobile Robot Navigation System (1) - Sensor and Kinematic Modelling Nick Pears University of York, Department of Computer Science December 17, 2014 nep (UoY CS) PRAC Practical

More information

Calibration of IRS-1C PAN-camera

Calibration of IRS-1C PAN-camera Calibration of IRS-1C PAN-camera Karsten Jacobsen Institute for Photogrammetry and Engineering Surveys University of Hannover Germany Tel 0049 511 762 2485 Fax -2483 Email karsten@ipi.uni-hannover.de 1.

More information

ELEC Dr Reji Mathew Electrical Engineering UNSW

ELEC Dr Reji Mathew Electrical Engineering UNSW ELEC 4622 Dr Reji Mathew Electrical Engineering UNSW Review of Motion Modelling and Estimation Introduction to Motion Modelling & Estimation Forward Motion Backward Motion Block Motion Estimation Motion

More information

CALIBRATION BETWEEN DEPTH AND COLOR SENSORS FOR COMMODITY DEPTH CAMERAS. Cha Zhang and Zhengyou Zhang

CALIBRATION BETWEEN DEPTH AND COLOR SENSORS FOR COMMODITY DEPTH CAMERAS. Cha Zhang and Zhengyou Zhang CALIBRATION BETWEEN DEPTH AND COLOR SENSORS FOR COMMODITY DEPTH CAMERAS Cha Zhang and Zhengyou Zhang Communication and Collaboration Systems Group, Microsoft Research {chazhang, zhang}@microsoft.com ABSTRACT

More information

Pattern Feature Detection for Camera Calibration Using Circular Sample

Pattern Feature Detection for Camera Calibration Using Circular Sample Pattern Feature Detection for Camera Calibration Using Circular Sample Dong-Won Shin and Yo-Sung Ho (&) Gwangju Institute of Science and Technology (GIST), 13 Cheomdan-gwagiro, Buk-gu, Gwangju 500-71,

More information

ROBUST LINE-BASED CALIBRATION OF LENS DISTORTION FROM A SINGLE VIEW

ROBUST LINE-BASED CALIBRATION OF LENS DISTORTION FROM A SINGLE VIEW ROBUST LINE-BASED CALIBRATION OF LENS DISTORTION FROM A SINGLE VIEW Thorsten Thormählen, Hellward Broszio, Ingolf Wassermann thormae@tnt.uni-hannover.de University of Hannover, Information Technology Laboratory,

More information

Geometric camera models and calibration

Geometric camera models and calibration Geometric camera models and calibration http://graphics.cs.cmu.edu/courses/15-463 15-463, 15-663, 15-862 Computational Photography Fall 2018, Lecture 13 Course announcements Homework 3 is out. - Due October

More information

MACHINE VISION AS A METHOD FOR CHARACTERIZING SOLAR TRACKER PERFORMANCE

MACHINE VISION AS A METHOD FOR CHARACTERIZING SOLAR TRACKER PERFORMANCE MACHINE VISION AS A METHOD FOR CHARACTERIZING SOLAR TRACKER PERFORMANCE M. Davis, J. Lawler, J. Coyle, A. Reich, T. Williams GreenMountain Engineering, LLC ABSTRACT This paper describes an approach to

More information

(Refer Slide Time 00:17) Welcome to the course on Digital Image Processing. (Refer Slide Time 00:22)

(Refer Slide Time 00:17) Welcome to the course on Digital Image Processing. (Refer Slide Time 00:22) Digital Image Processing Prof. P. K. Biswas Department of Electronics and Electrical Communications Engineering Indian Institute of Technology, Kharagpur Module Number 01 Lecture Number 02 Application

More information

Geometric Accuracy Evaluation, DEM Generation and Validation for SPOT-5 Level 1B Stereo Scene

Geometric Accuracy Evaluation, DEM Generation and Validation for SPOT-5 Level 1B Stereo Scene Geometric Accuracy Evaluation, DEM Generation and Validation for SPOT-5 Level 1B Stereo Scene Buyuksalih, G.*, Oruc, M.*, Topan, H.*,.*, Jacobsen, K.** * Karaelmas University Zonguldak, Turkey **University

More information

Calibration of a Different Field-of-view Stereo Camera System using an Embedded Checkerboard Pattern

Calibration of a Different Field-of-view Stereo Camera System using an Embedded Checkerboard Pattern Calibration of a Different Field-of-view Stereo Camera System using an Embedded Checkerboard Pattern Pathum Rathnayaka, Seung-Hae Baek and Soon-Yong Park School of Computer Science and Engineering, Kyungpook

More information

Reconstructing Images of Bar Codes for Construction Site Object Recognition 1

Reconstructing Images of Bar Codes for Construction Site Object Recognition 1 Reconstructing Images of Bar Codes for Construction Site Object Recognition 1 by David E. Gilsinn 2, Geraldine S. Cheok 3, Dianne P. O Leary 4 ABSTRACT: This paper discusses a general approach to reconstructing

More information

Product information. Hi-Tech Electronics Pte Ltd

Product information. Hi-Tech Electronics Pte Ltd Product information Introduction TEMA Motion is the world leading software for advanced motion analysis. Starting with digital image sequences the operator uses TEMA Motion to track objects in images,

More information

Colorado School of Mines. Computer Vision. Professor William Hoff Dept of Electrical Engineering &Computer Science.

Colorado School of Mines. Computer Vision. Professor William Hoff Dept of Electrical Engineering &Computer Science. Professor William Hoff Dept of Electrical Engineering &Computer Science http://inside.mines.edu/~whoff/ 1 Bundle Adjustment 2 Example Application A vehicle needs to map its environment that it is moving

More information

STRAIGHT LINE REFERENCE SYSTEM STATUS REPORT ON POISSON SYSTEM CALIBRATION

STRAIGHT LINE REFERENCE SYSTEM STATUS REPORT ON POISSON SYSTEM CALIBRATION STRAIGHT LINE REFERENCE SYSTEM STATUS REPORT ON POISSON SYSTEM CALIBRATION C. Schwalm, DESY, Hamburg, Germany Abstract For the Alignment of the European XFEL, a Straight Line Reference System will be used

More information

BSB663 Image Processing Pinar Duygulu. Slides are adapted from Selim Aksoy

BSB663 Image Processing Pinar Duygulu. Slides are adapted from Selim Aksoy BSB663 Image Processing Pinar Duygulu Slides are adapted from Selim Aksoy Image matching Image matching is a fundamental aspect of many problems in computer vision. Object or scene recognition Solving

More information

Structured Light II. Thanks to Ronen Gvili, Szymon Rusinkiewicz and Maks Ovsjanikov

Structured Light II. Thanks to Ronen Gvili, Szymon Rusinkiewicz and Maks Ovsjanikov Structured Light II Johannes Köhler Johannes.koehler@dfki.de Thanks to Ronen Gvili, Szymon Rusinkiewicz and Maks Ovsjanikov Introduction Previous lecture: Structured Light I Active Scanning Camera/emitter

More information

Everything you did not want to know about least squares and positional tolerance! (in one hour or less) Raymond J. Hintz, PLS, PhD University of Maine

Everything you did not want to know about least squares and positional tolerance! (in one hour or less) Raymond J. Hintz, PLS, PhD University of Maine Everything you did not want to know about least squares and positional tolerance! (in one hour or less) Raymond J. Hintz, PLS, PhD University of Maine Least squares is used in varying degrees in -Conventional

More information

Project: Camera Rectification and Structure from Motion

Project: Camera Rectification and Structure from Motion Project: Camera Rectification and Structure from Motion CIS 580, Machine Perception, Spring 2018 April 18, 2018 In this project, you will learn how to estimate the relative poses of two cameras and compute

More information

TOLERANCE ALLOCATION IN FLEXIBLE ASSEMBLIES: A PRACTICAL CASE

TOLERANCE ALLOCATION IN FLEXIBLE ASSEMBLIES: A PRACTICAL CASE TOLERANCE ALLOCATION IN FLEXIBLE ASSEMBLIES: A PRACTICAL CASE Pezzuti E., Piscopo G., Ubertini A., Valentini P.P. 1, Department of Mechanical Engineering University of Rome Tor Vergata via del Politecnico

More information

Vision-based Mobile Robot Localization and Mapping using Scale-Invariant Features

Vision-based Mobile Robot Localization and Mapping using Scale-Invariant Features Vision-based Mobile Robot Localization and Mapping using Scale-Invariant Features Stephen Se, David Lowe, Jim Little Department of Computer Science University of British Columbia Presented by Adam Bickett

More information

Robot-to-Camera Calibration: A Generic Approach Using 6D Detections

Robot-to-Camera Calibration: A Generic Approach Using 6D Detections Robot-to-Camera Calibration: A Generic Approach Using 6D Detections Christian Nissler and Zoltan-Csaba Marton Institute of Robotics and Mechatronics, German Aerospace Center (DLR) Muenchner Str. 2, D-82234

More information

A Desktop 3D Scanner Exploiting Rotation and Visual Rectification of Laser Profiles

A Desktop 3D Scanner Exploiting Rotation and Visual Rectification of Laser Profiles A Desktop 3D Scanner Exploiting Rotation and Visual Rectification of Laser Profiles Carlo Colombo, Dario Comanducci, and Alberto Del Bimbo Dipartimento di Sistemi ed Informatica Via S. Marta 3, I-5139

More information

A Low Power, High Throughput, Fully Event-Based Stereo System: Supplementary Documentation

A Low Power, High Throughput, Fully Event-Based Stereo System: Supplementary Documentation A Low Power, High Throughput, Fully Event-Based Stereo System: Supplementary Documentation Alexander Andreopoulos, Hirak J. Kashyap, Tapan K. Nayak, Arnon Amir, Myron D. Flickner IBM Research March 25,

More information

Data Association for SLAM

Data Association for SLAM CALIFORNIA INSTITUTE OF TECHNOLOGY ME/CS 132a, Winter 2011 Lab #2 Due: Mar 10th, 2011 Part I Data Association for SLAM 1 Introduction For this part, you will experiment with a simulation of an EKF SLAM

More information

Three-Dimensional Sensors Lecture 2: Projected-Light Depth Cameras

Three-Dimensional Sensors Lecture 2: Projected-Light Depth Cameras Three-Dimensional Sensors Lecture 2: Projected-Light Depth Cameras Radu Horaud INRIA Grenoble Rhone-Alpes, France Radu.Horaud@inria.fr http://perception.inrialpes.fr/ Outline The geometry of active stereo.

More information

coding of various parts showing different features, the possibility of rotation or of hiding covering parts of the object's surface to gain an insight

coding of various parts showing different features, the possibility of rotation or of hiding covering parts of the object's surface to gain an insight Three-Dimensional Object Reconstruction from Layered Spatial Data Michael Dangl and Robert Sablatnig Vienna University of Technology, Institute of Computer Aided Automation, Pattern Recognition and Image

More information