Vision Based GNC Systems for Planetary Exploration

S. Mancuso
ESA ESTEC, Noordwijk, The Netherlands

Abstract

In the frame of planetary exploration, among the Guidance, Navigation and Control (GNC) activities in ESA, vision-based navigation is one that has received much attention, because of the flexibility and cost-saving aspects associated with camera-type sensors. Two applications currently under investigation in this field are Autonomous Interplanetary Navigation [1] and Navigation for Planetary Approach and Landing [2]. The present paper illustrates the problems involved, the progress achieved and the possible future developments of these activities in the frame of future missions.

Introduction

Regarding interplanetary navigation, the basic concept consists in estimating the state of an interplanetary spacecraft by using information provided by a camera imaging asteroids and/or planets, together with information on the inertial attitude provided by the camera itself or by an additional star sensor. The study has also concentrated on the autonomous and real-time aspects of a possible future flight implementation. Validation of the results will be performed via simulation in three parallel ways: by using images generated synthetically by an image simulator; by using images acquired by a ground telescope; and by using images acquired by the Smart-1 AMIE camera during its mission to the Moon.

Regarding planetary landing, a very challenging concept has been elaborated which allows estimating the position of the spacecraft with respect to the landing site during its final descent onto a planet. The application has been developed in the frame of the Bepi Colombo mission to Mercury, which represents the baseline mission considered. The estimation of the spacecraft state is based on information provided by the camera, in particular the coordinates of feature points, which are extracted and tracked automatically. In order to comply with the strict real-time constraints of the fast dynamic environment typical of a final descent phase, the image processing algorithms have been implemented in hardware. In addition, an ad-hoc APS (Active Pixel Sensor) camera will be developed up to the level of an elegant breadboard. Validation will be performed via simulation with hardware in the loop.

Autonomous vision-based navigation for interplanetary missions

The Autonomous On-Board Navigation for Interplanetary Missions (AutoNav) study has been performed by a consortium of European industries under an ESA contract: EADS Astrium, France (prime contractor); GMV, Spain; INETI, Portugal; IMCCE, France.

Objective

The main objective of the study is to develop, for the cruise and encounter phases of an interplanetary mission, an autonomous navigation system based on camera measurements of celestial bodies, in order to restitute the spacecraft position and velocity autonomously (without intervention by ground stations).

Classical navigation for interplanetary probes is based on radar tracking from very large ground stations, able to provide range and range-rate measurements very accurately. By employing complex filtering techniques it is then possible to estimate the full state of the spacecraft. These techniques have proven very accurate and robust; however, they require almost constant operator supervision, implying frequent ground-spacecraft contacts, especially during critical phases. Also, classical navigation is not a real-time process: due to the enormous distances involved in interplanetary missions, the signal delay can reach up to 6 hours within our solar system, without considering the time required for the processing of the data itself. For particular applications where relative navigation is required, such as fly-bys, swing-bys, planetary entry, or rendezvous with celestial objects, the obtainable accuracy depends strongly on the knowledge of the ephemeris of the target celestial body, since position and velocity of the spacecraft must be provided relative to such bodies; the typical accuracy reachable for minor bodies, for which ephemerides are not very well known, is of the order of km.

An autonomous system such as the vision-based system under consideration can overcome some of these drawbacks. In particular, being an on-board autonomous system, it can fulfil real-time requirements and does not need a frequent man-in-the-loop presence. Also, since during relative navigation phases direct measurements relative to the target celestial body are processed, errors in the ephemeris of the target body do not affect the relative navigation accuracy. The absolute navigation accuracy with respect to inertial space depends largely on the complexity of the filter implemented on board. Due to limitations in hardware capabilities for space applications, it is not conceivable to implement on board complex filters using very detailed environment models (such as those used on the ground); therefore a lower accuracy is to be expected for absolute navigation in comparison with classical ground navigation. However, the accuracy requirements for the absolute navigation phase are less stringent, and the accuracy provided by the vision-based on-board system is more than sufficient to navigate through this phase efficiently.

Navigation concept

The basic idea of the vision-based navigation concept is to use celestial bodies (asteroids and planets) as beacons for the on-board state estimation process. Celestial bodies with well-known ephemerides are needed in order to provide sufficient accuracy. Although ephemerides are better known for planets (with an error of about 1 km) than for asteroids (errors can be up to 1000 km for the faintest ones), asteroids will mostly be used, simply because they are much more numerous than planets and more evenly spread out.
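As a toy illustration of this beacon concept, the following minimal planar sketch (invented numbers and simplified geometry, not the study's algorithm) recovers a spacecraft position from two LOS directions to beacons with known ephemeris positions, by linear least squares:

```python
import numpy as np

# Two beacons with well-known ephemeris positions (illustrative numbers),
# and unit line-of-sight (LOS) directions measured toward them, expressed
# in the inertial frame (attitude assumed known from star measurements).
b1 = np.array([2.0e8, 1.0e8])    # beacon 1 position [km]
b2 = np.array([-1.5e8, 2.5e8])   # beacon 2 position [km]
u1 = np.array([np.cos(0.7), np.sin(0.7)])  # LOS unit vector to beacon 1
u2 = np.array([np.cos(2.1), np.sin(2.1)])  # LOS unit vector to beacon 2

def los_fix(beacons, los_dirs):
    """Least-squares position from LOS directions to beacons at known positions.

    Each LOS constrains the spacecraft position r to the line through the
    beacon b along the measured direction u: (I - u u^T)(r - b) = 0.
    Stacking these conditions gives a small linear system in r.
    """
    A = np.zeros((2, 2))
    y = np.zeros(2)
    for b, u in zip(beacons, los_dirs):
        P = np.eye(2) - np.outer(u, u)   # projector orthogonal to the LOS
        A += P
        y += P @ b
    return np.linalg.solve(A, y)

r_est = los_fix([b1, b2], [u1, u2])
print("estimated spacecraft position [km]:", r_est)
```

Each LOS constrains the spacecraft to a line through the corresponding beacon; with two or more well-separated beacons the lines intersect at the spacecraft position, which is exactly the trigonometric idea developed next.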

The position of the spacecraft can then be estimated via simple trigonometric relations, as the planar example of Figure 1 shows.

[Figure 1: Navigation concept: the LOS measurement errors to beacons 1 and 2 intersect at the spacecraft, bounding the vehicle position uncertainty.]

This concept is not new and was shown to work during the US mission Deep Space 1, which flew in 1998 [3]. The mission objective of demonstrating the feasibility of vision-based navigation was fully accomplished.

Two main phases are identified for the interplanetary mission: the cruise phase and the encounter phase. The navigation processing is stopped about 3 hours before encounter, since afterwards it would become too expensive to perform correction manoeuvres.

During the cruise phase, navigation is absolute: the inertial position of the spacecraft is determined by knowing the line of sight (LOS) to the beacons, the beacon positions and the inertial spacecraft attitude. During this phase no particular accuracy requirement applies; the only constraint is to allow switching to the following encounter phase. Typical navigation accuracy is of the order of km. An imaging plan shall be established a priori; typically, two images per day on average will be needed for an arc of about one month. However, taking images very frequently would impose constraints at system level; a more efficient strategy is to concentrate all the measurements for one arc at the beginning or at the end, and then process the batches of time-tagged data in the navigation filter.

During the encounter phase, navigation is relative to the target body. The relative state of the spacecraft can be determined by knowing the LOS to the target body and the previous state history (see Figure 2). Knowledge of the shape of the target can improve the state estimation process, especially with respect to the along-track direction.
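As a concrete illustration of this encounter-phase principle (a hypothetical sketch with invented numbers; the real system folds this geometry into the navigation filter rather than solving it in isolation), two LOS unit vectors measured at different epochs, combined with the relative velocity known from the propagated state history, determine the range to the target:

```python
import numpy as np

# LOS unit vectors to the target at two epochs, and the relative velocity
# over the interval (known from the propagated state history). All numbers
# are illustrative only.
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.98, 0.02, 0.0]); u2 /= np.linalg.norm(u2)
v = np.array([-5.0, 1.0, 0.0])   # relative velocity of target w.r.t. spacecraft [km/s]
dt = 600.0                       # time between the two images [s]

# Writing the target position as rho_i * u_i at each epoch, the kinematic
# relation rho2*u2 - rho1*u1 = v*dt is linear in the two unknown ranges.
A = np.column_stack((-u1, u2))
(rho1, rho2), *_ = np.linalg.lstsq(A, v * dt, rcond=None)
print(f"range at t1 ~ {rho1:.0f} km, range at t2 ~ {rho2:.0f} km")
```

The solution degrades when the LOS direction changes slowly, which is why the along-track component is the weak axis and why knowledge of the target shape helps, as noted above.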

A summary of the accuracy requirements for the various phases is reported in Table 1.

[Figure 2: Navigation principle for the encounter phase: LOS directions measured at successive times t1 and t2, with ranges r1 and r2 to the target.]

Table 1: Accuracy requirements

Phase | Requirement | Justification
Asteroid encounter | 1 km cross-track; 3% of range along-track (up to encounter) | To allow close encounter. To enable instrument operation scheduling.
Planet encounter (gravity assist) | 10 km cross-track; 3% of range along-track (up to 3 hours) | To minimize error-correction consumption.
Cruise (asteroid encounter) | 500 km | To enable detection of faint objects. To meet along-track error requirements.
Cruise (planet encounter) | km | To meet along-track error requirements.
Low-thrust deep-space manoeuvre | km | To allow correction of manoeuvre realization errors at minimum cost.

Camera specifications and image processing

As the Deep Space 1 mission showed, the achievable performance depends strongly on the quality and capability of the camera; the LOS error accuracy is a good quality index in this respect:

LOS error accuracy = IFOV * subpixel accuracy (1)

where IFOV is the instantaneous field of view, namely the field of view of a single pixel. It has been found that for minor-body encounters a LOS error accuracy of 5-15 µrad is needed in order to meet the performance requirements specified in Table 1. For approaches to major bodies (planets) this requirement can be relaxed, since the brightness and size of the object prevent the image processing algorithms from achieving high accuracy; a LOS accuracy of 50 µrad is sufficient in this case. The FOV of the camera is another important parameter: for a given detector, the smaller the FOV, the higher the accuracy. However, with a small FOV there will be a lower probability of having enough bright stars in the camera background to reconstruct attitude information. It has been found that 0.8 deg is an optimal value.
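To give Eq. (1) numbers, a small sketch; only the 0.8 deg FOV comes from the text, while the detector width and subpixel factor below are assumptions chosen for illustration:

```python
import numpy as np

# Eq. (1): LOS error accuracy = IFOV * subpixel accuracy.
fov_deg = 0.8     # camera FOV quoted as optimal in the text [deg]
n_pixels = 1024   # assumed detector width [pixels] (not from the study)
subpixel = 0.1    # assumed centroiding accuracy [pixels] (not from the study)

ifov = np.deg2rad(fov_deg) / n_pixels   # instantaneous field of view [rad]
los_acc = ifov * subpixel               # Eq. (1) [rad]
print(f"IFOV = {ifov * 1e6:.1f} urad/pixel, LOS accuracy = {los_acc * 1e6:.2f} urad")
```

Under these assumptions the LOS error accuracy comfortably meets the 5-15 µrad level required for minor-body encounters.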

The Image Processing (IP) tasks are responsible for delivering the LOS measurement information to the navigation process. Starting from an image, the IP tasks have to estimate the correct position of the beacon's centre of mass in the detector frame and, at the same time, if possible, provide information about the position of the stars in the background.

When the spacecraft-beacon distance is large, the beacon appears as an unresolved object in the image. In order to obtain subpixel accuracy in this case, Multiple Cross-Correlation (MCC) techniques will be employed. These techniques consist of correlating each object in an image with a mask template extracted from the same image. Several non-coincident correlation maxima are generated, which are interpolated to produce the final reference point with subpixel accuracy.

As the spacecraft-beacon distance decreases, the beacon becomes a resolved object in the camera field of view (FOV), and different image processing techniques have to be applied: centre-finding techniques when the object is completely within the camera FOV, or processing two images of part of the limb to estimate the centre of mass of the body from a known a-priori shape. Note that these processes work quite well for regular shapes (such as a sphere or an ellipsoid), but for an irregular asteroid the estimation error is higher; in this case the centre of brightness will be estimated instead.
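As an illustration of the subpixel refinement at the heart of such correlation techniques, the toy sketch below fits a parabola through a correlation peak and its neighbours in each axis (the study's actual MCC interpolation scheme is not detailed in the text, so this is an assumed, textbook variant):

```python
import numpy as np

def subpixel_peak(corr):
    """Return (row, col) of the correlation maximum with subpixel refinement."""
    i, j = np.unravel_index(np.argmax(corr), corr.shape)

    def parabolic_offset(m1, c, p1):
        # vertex of the parabola through (-1, m1), (0, c), (+1, p1)
        denom = m1 - 2.0 * c + p1
        return 0.0 if denom == 0 else 0.5 * (m1 - p1) / denom

    di = dj = 0.0
    if 0 < i < corr.shape[0] - 1:
        di = parabolic_offset(corr[i - 1, j], corr[i, j], corr[i + 1, j])
    if 0 < j < corr.shape[1] - 1:
        dj = parabolic_offset(corr[i, j - 1], corr[i, j], corr[i, j + 1])
    return i + di, j + dj

# toy correlation surface with a true peak located between pixel centres
y, x = np.mgrid[0:15, 0:15]
corr = np.exp(-((x - 7.3) ** 2 + (y - 6.6) ** 2) / 4.0)
print(subpixel_peak(corr))   # approximately (6.6, 7.3)
```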

Validation

The validation of the navigation concept illustrated in the previous sections will be carried out in three different ways: via simulation using synthesized images; via simulation using real images acquired by a ground telescope; and via simulation using flight images acquired by the Smart-1 spacecraft with the AMIE camera. The objective of the validation process is twofold: first to validate the developed navigation concept and algorithms, then to validate the real-time aspects. Two simulators will be employed: the so-called Functional Engineering Simulator (FES), and the Prototype Validation Simulator (PVS), which basically represents the AutoNav breadboard and will be used to validate real-time aspects. Matlab/Simulink has been used for the simulators, along with encapsulated C and Fortran code.

The first set of simulations will be performed in closed loop using the FES and employing a synthetic image generator; these tests will validate the complete GNC chain. Then tests using images acquired by ground telescopes will be performed with the PVS simulator; the objective is to validate the IP and navigation algorithms on real images, by estimating the position of the observatory in inertial space, and also to validate the image simulator by comparing synthetic images with real ones. Finally, tests using real in-flight images acquired by the Smart-1 satellite with the AMIE camera will be performed with the PVS simulator; the objective is to validate the IP and navigation algorithms on real flight images by conducting a realistic orbit determination process, and also to validate real-time aspects. Note that while the FES tests are closed loop, the PVS tests (ground and flight images) are necessarily open-loop tests, since the guidance and control processes cannot be employed to command the spacecraft.

Results

The AutoNav study is to be completed by the end of the summer. At the time of writing, some preliminary simulations have been performed using both covariance and Monte Carlo methods. Preliminary results show that an accuracy of about 300 km can be obtained for the ground tests representing the cruise phase, and that an accuracy of 40 km should be achievable for the Smart-1 case with electric propulsion, while for encounter the expected accuracy is of the order of a few km for asteroids and of about 10 km for planets. These are results of preliminary simulations without image processing, and they show that the requirements outlined under "Navigation concept" above can all be fulfilled. The image processing code has just been delivered and is being integrated in the FES simulator. The above performances will be confirmed with detailed simulations including image processing of the acquired images.

Vision-based navigation for planetary approach and landing

The Navigation for Planetary Approach and Landing (NPAL) study has been performed by a consortium of European industries under an ESA contract: EADS Astrium, France (prime contractor); INETI, Portugal; University of Dundee, UK; Science Systems, UK; Galileo Avionica, Italy.

Objectives

The main objective of the study is to develop a vision-based navigation system for the final descent phase of the landing, based on a single camera, to allow soft and precise landing on a planet. The classical approach to navigation for landing is based on a very accurate orbit determination from the ground before initiation of the descent, and on initialization of an inertial measurement unit (IMU). After the de-orbit burn, the navigation during the descent is completely inertial. Navigation can be augmented in the last part of the descent with radar or laser altimeters; in this way one ensures that the vertical parameters (position and velocity) are within the soft-landing domain. However, there is no improvement in the horizontal velocity and position, and therefore it is not possible to achieve precision landing with the classical approach: typically, the overall horizontal GNC dispersion at landing will be of the order of km. Also, with the classical type of navigation it is not possible to perform landing-site recognition, hazard avoidance or retargeting.

In the vision-based navigation concept, on the contrary, relative navigation is performed, better accuracy is achieved, and both soft and precision landing are possible. With the vision system it would also be possible to perform retargeting, hazard avoidance and landing-site recognition, if an accurate map of the terrain is known a priori. There are of course also some disadvantages related to a vision-based system, such as the daylight constraint for landing, which can impose stringent thermal requirements at system level, or adverse environmental conditions (heavy dust storms or fog) which could affect the vision system.

In the frame of the study, the reference baseline considered is the Bepi Colombo mission, an ESA scientific mission to Mercury. In particular, the last part of the descent is analyzed here, starting from an altitude of 8 km at a range of about 20 km from the landing site, when the landing site becomes visible in the camera FOV (high gate), approximately 60 s before touchdown.

Navigation concept

The basic idea of the vision-based navigation concept is to use information from the IMU and from images of the terrain, and to fuse them together to estimate the state of the spacecraft relative to the landing site. The fusion is performed by the navigation filter; an extended Kalman filter with UD formulation has been employed in the frame of this study. The main difficulty here is that the terrain features are unknown a priori, so landmark recognition cannot be carried out; the navigation processing has to be based on unknown features. The solution employed in the present study is the following. Instead of using velocity information from the IMU, obtained via integration of the acceleration measurements (and therefore affected by errors), the NPAL concept provides a navigation solution via dynamic filtering based on the acceleration estimate. To illustrate this concept, refer to the simplified example shown in Figure 3.

[Figure 3: NPAL concept: spacecraft-mounted camera geometry, with viewing angle θ, distance x to the ground and landmark separation D.]

The measurement equation can be written as

q = 1 / tan(θ) = x / D (2)

where q is the raw measurement from the camera (after image processing), x is the distance to the ground (to be estimated) and D is the distance between landmarks (unknown but constant). By differentiating this equation twice we obtain

d²q/dt² = (d²x/dt²) / D (3)

where d²x/dt² is the total acceleration acting on the spacecraft. The IMU provides information on the non-gravitational acceleration; for the gravitational acceleration we must rely on on-board models which, although not very accurate, produce an error one order of magnitude smaller than the other estimation errors. Once the total acceleration is known, D can be found from Eq. (3), and then x from Eq. (2). With this technique at least 3 points are needed to estimate the state of the spacecraft; however, many more (200 in our baseline) will be tracked for robustness.
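A minimal numeric sketch of this scale-recovery idea (synthetic data and invented numbers; the study performs the equivalent estimation inside the UD extended Kalman filter rather than by raw differencing):

```python
import numpy as np

# Synthetic descent profile (all numbers invented for illustration).
dt = 0.05                                   # 20 Hz image rate [s]
t = np.arange(0.0, 10.0, dt)
D_true = 40.0                               # landmark separation: unknown but constant [m]
xdd = -1.5                                  # total (gravity + thrust) acceleration [m/s^2]
x = 8000.0 - 60.0 * t + 0.5 * xdd * t**2    # true distance profile [m]
q = x / D_true                              # Eq. (2): dimensionless camera measurement

# Eq. (3): the second derivative of q, together with the known total
# acceleration, reveals the scale D; Eq. (2) then gives the metric distance.
qdd = np.gradient(np.gradient(q, dt), dt)   # numerical second derivative
D_est = xdd / np.median(qdd[5:-5])          # trim edges where gradient is one-sided
x_est = q * D_est

print(f"estimated D = {D_est:.1f} m, initial distance = {x_est[0]:.0f} m")
```

Raw double differentiation amplifies measurement noise, which is one reason the actual system embeds Eqs. (2) and (3) in a Kalman filter instead of differencing the camera measurements directly.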

Image processing

An essential part of the navigation concept is the image processing (IP). The camera takes images at a relatively high frame rate (20 Hz) in order to cope with the fast dynamics of the spacecraft; two tasks are then performed at image processing level: extraction and tracking. The choice of the algorithms was driven by several factors: robustness with respect to noise, repeatability, and simplicity of hardware implementation. For the extraction process, the trade-off selected the Harris algorithm [4], which works on gradient masks and provides a criterion map. For the tracking process, tracking via correlation is performed using a special technique: each point extracted in frame N is projected into frame N+1 with a prediction search window; a correlation process is performed within this window, using a predefined correlation template, to generate a local maximum: the tracked point in frame N+1. In this way, the extraction process is only performed on the initial frame and every time a track is lost. This technique has proven very robust, also in the presence of noise. The output of the IP task is the list of tracked points, which is sent to the navigation filter.
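For reference, a minimal sketch of the Harris criterion map in its textbook form [4] (the FEIC hardware implementation is certainly more elaborate; the window size and the constant k below are conventional choices, not values from the study):

```python
import numpy as np

def harris_response(img, k=0.04, win=2):
    """Return the Harris corner criterion for each pixel of a grayscale image."""
    # image gradients
    Iy, Ix = np.gradient(img.astype(float))
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

    # sum the gradient products over a (2*win+1)^2 window around each pixel
    def box(a):
        out = np.zeros_like(a)
        for dy in range(-win, win + 1):
            for dx in range(-win, win + 1):
                out += np.roll(np.roll(a, dy, axis=0), dx, axis=1)
        return out

    Sxx, Syy, Sxy = box(Ixx), box(Iyy), box(Ixy)
    det = Sxx * Syy - Sxy * Sxy          # determinant of the structure tensor
    trace = Sxx + Syy
    return det - k * trace ** 2          # corners give a strong positive response

# toy image: a bright square on a dark background has four corners
img = np.zeros((32, 32)); img[10:22, 10:22] = 1.0
R = harris_response(img)
print("strongest corner at", np.unravel_index(np.argmax(R), R.shape))
```

Local maxima of the criterion map above a threshold become the extracted features; tracking then only requires correlation within the predicted search window, as described above.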

Camera and image processing hardware

As mentioned before, due to the fast dynamics of the final descent phase, the requirements in terms of computation speed are quite stringent. Nominally, for robustness, 200 points will be tracked at a frame rate of 20 Hz. This would require too high a data-rate flow between the camera and the on-board computer (above 300 Mbit/s). To solve this problem, the image processing algorithms will be implemented in an integrated circuit named FEIC (Feature Extraction and Image Correlation). Regarding the camera detector, an Active Pixel Sensor (APS) has been chosen to meet the compactness and speed requirements, coupled with a large 70 deg FOV in order to keep the landing site in the FOV while descending. Within the present study only a camera breadboard will be manufactured, as similar as possible to the flight version. The APS Fillfactory Star1000 has been selected as the best candidate detector for the breadboard. The final camera will have a mass of about 0.5 kg and a power consumption of about 5 W.

Validation approach

The validation of the navigation concept illustrated in the previous sections will be carried out via simulation, in successive steps. The simulator employed for the validation has been developed under Matlab/Simulink and is named VBNAT (Vision Based Navigation Tool). Four main incremental steps are envisioned for the simulator implementation and for the validation process.

VBNAT v1 implements the generic blocks; validation at this stage concerns the simulator itself, the dynamic equations and the environment models. Vision-based navigation is not implemented, and only inertial or ideal navigation is used. VBNAT v2 is the next step; it implements and validates the vision-based navigation block, but no image processing is embedded at this stage and the navigation block input is a list of feature points elaborated a priori. VBNAT v3 encapsulates the image processing algorithms, which are fed with images; the output of the image processing block is then sent to the navigation processing. Images are provided by the synthetic terrain generator PANGU, developed under a previous ESA study [5], capable of generating high-resolution realistic images at a rate of a couple of frames per second. VBNAT v4 is the final step; it implements and validates the FEIC by integrating the hardware in the loop. The FEIC will be fed with images generated by PANGU residing on a dedicated workstation.

Further to the original plan, an extension has been planned to provide a real-time demonstration of the navigation system. The navigation system will be tested by implementing it on a computer with a real-time operating system (LinuxRT), with the camera breadboard and the FEIC in the loop. In order to support the real-time tests, images (collected a priori) will be injected directly into the camera memory.

Results

The NPAL study is to be completed by the end of the year. At the time of writing, the validation campaign with VBNAT v3, the detailed design of the camera and the FEIC prototyping have been performed. What remains to be done is the manufacturing of the camera, the integration of VBNAT v4 and the corresponding validation campaign. The real-time demonstration and an investigation of the applicability of the concept to the case of a Mars landing will also be conducted.

In Table 2, the main results for navigation accuracy from the VBNAT v3 campaign are summarized. A comparison is made with an analogous scenario based on pure inertial navigation via an IMU sensor (type III, 0.1 deg/h bias stability).

Table 2: Navigation estimation accuracy

Quantity | Vision-based | IMU (pure inertial)
Position wrt landing site | 32 m (cross-track), 11 m (along-track) | 760 m (cross-track), 520 m (along-track)
Feature point position | 10 m | -
Velocity | 1 m/s (cross-track), 1.8 m/s (along-track) | 8 m/s (cross-track), 8 m/s (along-track)

The results show that the concept developed in the frame of the present study is feasible, and that landing accuracy in the metric range is achieved. The most complex task, i.e. the estimation of the velocity without any further aiding from dissimilar sensors (e.g. altimeters), has also been performed successfully.

Conclusions

Camera-based navigation systems present many advantages with respect to other, classical systems, in particular regarding flexibility, compactness (mass and dimensions) and cost. The present paper has provided an overview of two current GNC activities at ESA in the area of vision-based navigation: Autonomous Interplanetary Navigation and Navigation for Planetary Approach and Landing. The associated preliminary results show that both concepts are feasible and provide very good performance. The conclusions provided in the individual sections trace the directions for possible future improvements and continuation.

Regarding the AutoNav study, the next step would be to solve all the open real-time issues with the help of a real-time avionics test bed. The concept should also be extended to the late encounter phase (from 3 hours before encounter onwards), for the benefit of aerocapture and aerobraking missions. Finally, a flight experiment should be performed by implementing the concept as part of the GNC system of a spacecraft; a possibility could be provided by the ESA Aurora programme.

Regarding the NPAL study for landing, the next steps in the development of the complete vision-based landing system are the detailed development of hazard detection/avoidance and piloting techniques (already started in the frame of another ESA study). The natural continuation of the study would be the development of an avionics test bed for real-time simulation, and the development of a real terrestrial demonstrator; this would represent the ultimate and decisive test to prove the robustness of the concept. After that, possible implementation as a flight experiment on lander probes to the Moon or Mars should be investigated.

References

[1] B. Polle, B. Frapard, O. Saint-Pé, V. Crombez, S. Mancuso, F. Ankersen, J. Fertig, "Autonomous On-Board Navigation for Interplanetary Missions", 26th Annual AAS Guidance and Control Conference, 5-9 Feb. 2003, Breckenridge, CO, USA.
[2] B. Frapard, B. Polle, G. Flandin, P. Bernard, C. Vétel, X. Sembely, S. Mancuso, "Navigation for Planetary Approach and Landing", 5th International ESA Conference on Guidance, Navigation and Control Systems, Oct. 2002, Frascati, Italy.
[3] J. E. Riedel, S. Bhaskaran, et al., "Autonomous Optical Navigation DS1 Technology Validation Report", Jet Propulsion Laboratory, California, USA.
[4] C. G. Harris, M. Stephens, "A Combined Corner and Edge Detector", 4th Alvey Vision Conference, 31 Aug - 2 Sept 1988, Manchester, UK.
[5] S. M. Parkes, I. Martin, M. Dunstan, S. Mills, "Mercury Surface Simulation for Bepi Colombo Lander", DASIA Conference, May 2002, Dublin, Ireland.


More information

An Experimental Study of the Autonomous Helicopter Landing Problem

An Experimental Study of the Autonomous Helicopter Landing Problem An Experimental Study of the Autonomous Helicopter Landing Problem Srikanth Saripalli 1, Gaurav S. Sukhatme 1, and James F. Montgomery 2 1 Department of Computer Science, University of Southern California,

More information

Turning an Automated System into an Autonomous system using Model-Based Design Autonomous Tech Conference 2018

Turning an Automated System into an Autonomous system using Model-Based Design Autonomous Tech Conference 2018 Turning an Automated System into an Autonomous system using Model-Based Design Autonomous Tech Conference 2018 Asaf Moses Systematics Ltd., Technical Product Manager aviasafm@systematics.co.il 1 Autonomous

More information

Sentinel-2 Calibration and Validation : from the Instrument to Level 2 Products

Sentinel-2 Calibration and Validation : from the Instrument to Level 2 Products Sentinel-2 Calibration and Validation : from the Instrument to Level 2 Products Vincent Lonjou a, Thierry Tremas a, Sophie Lachérade a, Cécile Dechoz a, Florie Languille a, Aimé Meygret a, Olivier Hagolle

More information

CHARACTERIZATION AND CALIBRATION OF MEMS INERTIAL MEASUREMENT UNITS

CHARACTERIZATION AND CALIBRATION OF MEMS INERTIAL MEASUREMENT UNITS CHARACTERIZATION AND CALIBRATION OF MEMS INERTIAL MEASUREMENT UNITS ökçen Aslan 1,2, Afşar Saranlı 2 1 Defence Research and Development Institute (SAE), TÜBİTAK 2 Dept. of Electrical and Electronics Eng.,

More information

'Smartphone satellite' developed by Surrey space researchers

'Smartphone satellite' developed by Surrey space researchers Press Release 24th January 2011 'Smartphone satellite' developed by Surrey space researchers Space researchers at the University of Surrey and Surrey Satellite Technology Limited (SSTL) have developed

More information

SSASIM: AN EARTH-ORBITING OBJECTS CATALOGUE MAINTENANCE SIMULATOR

SSASIM: AN EARTH-ORBITING OBJECTS CATALOGUE MAINTENANCE SIMULATOR SSASIM: AN EARTH-ORBITING OBJECTS CATALOGUE MAINTENANCE SIMULATOR Alberto Águeda Maté (1), Isaac Juárez Villares (2), Pablo Muñoz Muñoz (3), Francisco M. Martínez Fadrique (4) (1) GMV, Isaac Newton 11.

More information

InFuse: A Comprehensive Framework for Data Fusion in Space Robotics

InFuse: A Comprehensive Framework for Data Fusion in Space Robotics InFuse InFuse: A Comprehensive Framework for Data Fusion in Space Robotics June 20 th, 2017 Shashank Govindaraj (Space Applications Services, Belgium) Overview 1. Motivations & Objectives 2. InFuse within

More information

Noise Estimation for Star Tracker Calibration and Enhanced Precision Attitude Determination

Noise Estimation for Star Tracker Calibration and Enhanced Precision Attitude Determination Noise Estimation for Star Tracker Calibration and Enhanced Precision Attitude Determination Quang Lam, Craig Woodruff, and Sanford Ashton David Martin Swales Aerospace NASA GSFC 5050 Powder Mill Road Greenbelt

More information

SPAN. novatel.com. Tightly coupled GNSS+INS technology performance for exceptional 3D, continuous position, velocity & attitude

SPAN. novatel.com. Tightly coupled GNSS+INS technology performance for exceptional 3D, continuous position, velocity & attitude SPAN novatel.com Tightly coupled GNSSINS technology performance for exceptional 3D, continuous position, velocity & attitude For comprehensive SPAN information, visit: www.novatel.com/span SPAN Technology

More information