ASTRIUM Space Transportation


SIMU-LANDER: Hazard avoidance & advanced GNC for interplanetary descent and soft landing
S. Reynaud, E. Ferreira, S. Trinh, T. Jean-Marius
3rd International Workshop on Astrodynamics Tools and Techniques, ESTEC, 04/10/06

Outline
- Simulator presentation
- Models
- Hazard avoidance: hazard mapping function, piloting function
- Navigation function
- Guidance & control functions
- Improvements
- Demonstration

Simulator Functionalities
Functional blocks of the simulated GNC loop:
- Hazard detection with a camera or LIDAR (or radar, if there is no hazard-mapping need), producing risk maps: scientific interest, local slope, shadow zones
- Piloting: map and time fusion, re-targeting from the old site to a new site
- Navigation: IMU plus the hazard-detection sensor (camera or LIDAR)
- Simplified guidance with re-targeting under guidance constraints (propellant, visibility)
- Guidance & control: control of the relative velocity toward the new target

Simulator Objectives
Hazard avoidance & GNC platform for interplanetary powered descent and soft landing, developed for:
- Lander mission and vehicle concept studies
- G, N, C and sensor trade-offs and specifications
- GNC and hazard-avoidance algorithm assessment
- Performance analysis
The platform is dedicated to Mars or lunar lander mission studies, to design the mission and the GNC of a descent & landing system.

Simulator Architecture
- Simulink environment
- Modular and generic architecture allowing easy improvement of models and flight software, and adaptation to any planet
- Data flows transferred by means of Simulink buses
- Environment/Dynamics: models currently available for the Moon or Mars
Block diagram: environment, hazard-mapping & navigation sensor, IMU, flight software, propulsion.

Flight Software Architecture
- Several navigation modes: inertial-only navigation; vision-, radar- and LIDAR-based navigation (LIDAR under development)
- Retargeting (supervisor in the loop) or not
- Step-by-step G, N, C validation of the algorithms available
Block diagram: inputs from the IMU and complementary sensors feed navigation, guidance, control & assignment, and supervision.

Environment Models
- Planet: spherical or ellipsoidal; Kepler, J2 or J3 gravity model
- Environment (Mars): atmosphere (EMCDB), aerodynamics (6 DoF) and wind models
- Propulsion/Dynamics: parametric propelled lander (up to 20 thrusters)
- MCI: variable mass, constant C & I, CoG offset, sloshing modes
- Monte Carlo mode

Environment Models: Scene Generator
PANGU (V2.6), a synthetic scene generator:
- Craters, boulders, dunes, ground model
- Shadows, illumination, ambient light, light-reflection models

Main assets of PANGU:
- Client/server interface with MATLAB: MATLAB sends the camera position/attitude, PANGU returns the associated picture
- NASA MOLA DEMs can be used to generate the surfaces
- Surfaces of different resolutions can be created
(Figure: Mars photograph next to the PANGU picture generated from MOLA data.)

Sensor Models
- IMU: biases and scale factors
- LIDAR: currently the model from PANGU
- Camera model: blur effect, pixelization, optical transfer function; the image is convolved with the optical impulse response, then CCD noise is added
(Figure: original image, processed image, noisy image.)
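The camera-model chain above (blur by the optical transfer function, pixel quantization, additive CCD noise) can be sketched as follows; the kernel size, noise level and 8-bit quantization are illustrative assumptions, not the slide's values:

```python
import numpy as np

def camera_model(image, kernel_size=5, sigma=1.0, noise_std=2.0, rng=None):
    """Toy camera sensor model: Gaussian blur (stand-in for the optical
    transfer function), additive CCD noise, then clipping/quantization
    to 8-bit grey levels (pixelization)."""
    rng = np.random.default_rng(0) if rng is None else rng
    # Separable Gaussian kernel approximating the optical impulse response
    ax = np.arange(kernel_size) - kernel_size // 2
    k = np.exp(-0.5 * (ax / sigma) ** 2)
    k /= k.sum()
    # Convolve rows then columns (separable blur)
    blurred = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1,
                                  image.astype(float))
    blurred = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0,
                                  blurred)
    # Additive Gaussian CCD noise, then clip and quantize to 8 bits
    noisy = blurred + rng.normal(0.0, noise_std, image.shape)
    return np.clip(np.round(noisy), 0, 255).astype(np.uint8)
```

A real simulator would use the measured optical transfer function of the flight camera; the Gaussian is only a convenient placeholder.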

Sensor Models: Doppler Radar
- Generic raw-data model (r, rdot)
- Detailed error models: misalignments, biases
- Operational limitations
- Antenna lobes, beam swaths

Hazard Avoidance
Provides the guidance function with the location of a landing site considered safe. The process is divided into two functions:
- Site classification: produce risk maps for different criteria (slopes, shadows, obstacles such as craters and boulders)
- Piloting function: map fusion and a decision process to determine, given the guidance and mission-management constraints, the best safe landing target
Objective: maximise mission success even over difficult terrain.

Hazard Mapping (Camera): Shadow and Saturated-Zone Detection
A passive camera cannot detect obstacles in dark or saturated areas, so the "shadow and saturated zones risk map" can be built with simple thresholding techniques: risk is maximal (255) for grey levels below 20 or above 235, ramps between 20-50 and 205-235, and is zero in between.
(Figure: image generated by PANGU and the corresponding shadow risk map.)
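The thresholding just described can be sketched directly; the breakpoints 20/50 and 205/235 are read off the slide's grey-level axis, and the linear ramps between them are an assumption:

```python
import numpy as np

def shadow_saturation_risk(image, lo=(20, 50), hi=(205, 235)):
    """Shadow/saturation risk map by thresholding grey levels.
    Risk is 255 below lo[0] or above hi[1], 0 between lo[1] and hi[0],
    with linear ramps in between (the ramp shape is assumed)."""
    g = image.astype(float)
    risk = np.zeros_like(g)
    risk[g <= lo[0]] = 255.0            # too dark: full risk
    risk[g >= hi[1]] = 255.0            # saturated: full risk
    ramp_dn = (g > lo[0]) & (g < lo[1])  # dark ramp, 255 -> 0
    risk[ramp_dn] = 255.0 * (lo[1] - g[ramp_dn]) / (lo[1] - lo[0])
    ramp_up = (g > hi[0]) & (g < hi[1])  # bright ramp, 0 -> 255
    risk[ramp_up] = 255.0 * (g[ramp_up] - hi[0]) / (hi[1] - hi[0])
    return risk.astype(np.uint8)
```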

Hazard Mapping (Camera): Obstacle Detection
Texture analysis: an obstacle shows up as a spatially varying image response. Methods:
- Variance
- Correlation
- Derivative of the correlation
CPU requirements and obstacle-size coverage can be improved with Gaussian or Laplacian pyramids.
(Figure: PANGU image with the resulting variance and derivative-of-correlation maps.)
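A minimal sketch of the variance method: for each pixel, compute the grey-level variance over a local window, so that crater rims and boulders (textured areas) score high. The window size is an assumption; the slide's multi-resolution variant would repeat this at several window sizes or pyramid levels:

```python
import numpy as np

def variance_map(image, win=5):
    """Local-variance texture map: grey-level variance over a win x win
    neighbourhood around each pixel (reflect padding at the borders)."""
    g = image.astype(float)
    pad = win // 2
    gp = np.pad(g, pad, mode="reflect")
    # All win x win windows at once, shape (H, W, win, win)
    windows = np.lib.stride_tricks.sliding_window_view(gp, (win, win))
    return windows.var(axis=(2, 3))
```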

Craters and boulders are located precisely (example: the variance method with three window sizes, i.e. multi-resolution, applied to the image). With a camera field of view of 70° and an image size of 1024x1024 pixels, the smallest obstacle detected is 50 cm at a 100 m altitude.

Hazard Mapping (Camera): Slope Estimation
Shape from shading (Carlotto). Typical assumptions needed:
- Lambertian reflectance model
- Constant albedo over the surface of the Moon
- The Sun is the only light source and is distant
- Known Sun elevation
3D reconstruction from motion, without the simplifying assumptions above (currently under development):
- From image motion between several images
- Reconstruction from 3D points delivered by the camera (NPAL)

The slope map is obtained from the altitude map (DEM) by computing the mean plane with the least median of squares method.
(Figure: altitude map from the DEM, the derived slope map, and the reference slope map.)
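Fitting a mean plane to a DEM patch and reading off its slope can be sketched as below. Note the hedge: this uses ordinary least squares for brevity, whereas the slide specifies the more outlier-robust least median of squares; the patch size and grid spacing are illustrative:

```python
import numpy as np

def local_slope_deg(z, cell=1.0):
    """Slope (degrees) of the mean plane fitted to a 2-D altitude patch
    z with grid spacing cell. Plain least squares stands in for the
    slide's least-median-of-squares fit."""
    h, w = z.shape
    yy, xx = np.mgrid[0:h, 0:w]
    # Fit z = a*x + b*y + c over the patch
    A = np.column_stack([xx.ravel() * cell, yy.ravel() * cell,
                         np.ones(h * w)])
    (a, b, c), *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)
    # Slope is the tilt of the fitted plane's gradient (a, b)
    return np.degrees(np.arctan(np.hypot(a, b)))
```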

Hazard Mapping (LIDAR)
LIDAR-based hazard mapping (under development):
- Regridding of the 3D elevation model
- Mean-slope computation
- Roughness computation

Hazard Avoidance: Piloting Function
Inputs: the predefined site, the hazard-mapping risk maps (shadows, slopes, texture, plus scientific interest & Sun visibility), and GNC information (lander-to-candidate-sites state, the guidance function for site accessibility, the evolution of the site risk, the available fuel).
The fusion & decision processes maintain a list of candidate sites and a time history of sites and attributes; the decision maker, based on evidence theory (Dempster-Shafer) or fuzzy logic, triggers retargeting to a new landing site.

Navigation Function
Provides guidance with the vehicle's state relative to the chosen landing site. Navigation state:
- Relative position
- Relative velocity
- Attitude quaternion of the vehicle with respect to the local geographic frame
- IMU defects (biases and drifts)

Generic navigation filter: a state-of-the-art extended Kalman filter with inertial propagation, updated with altitude, velocity or attitude measurements. The implementation is generic, so measurements can come from any external sensor:
- Radar altimeter
- Doppler radar
- Camera
- Imaging radar
- LIDAR
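The measurement-update step of such a filter can be sketched for the simplest case, a radar-altimeter altitude observation; the two-state model and all noise values here are illustrative assumptions, not the deck's actual filter design:

```python
import numpy as np

def kf_altitude_update(x, P, z_alt, r_alt=4.0):
    """One Kalman measurement update for an altitude observation.
    State x = [altitude, vertical velocity], covariance P (2x2);
    z_alt is the measured altitude, r_alt its noise variance."""
    H = np.array([[1.0, 0.0]])        # altitude observes state[0] only
    S = H @ P @ H.T + r_alt           # innovation covariance (1x1)
    K = P @ H.T / S                   # Kalman gain (2x1)
    x = x + (K * (z_alt - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

In the actual EKF, the same update pattern is applied to the full navigation state (position, velocity, attitude quaternion, IMU biases), with H replaced by the linearized measurement Jacobian.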

Navigation Function: Camera
Sensor information: a measure of relative motion. Vision-based navigation:
- Tracking of landmarks through successive images: feature-point extraction and matching
- Computation of the homography (camera rotation and translation) with Horn's algorithm; a range sensor is needed to resolve the scale
- Processing of the camera rotation and translation to update the filter
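The rotation/translation step above is an absolute-orientation problem between matched landmark sets. As a sketch, the SVD-based Kabsch solution is shown here in place of Horn's closed-form quaternion method named on the slide; both solve the same problem:

```python
import numpy as np

def rigid_transform(P, Q):
    """Rotation R and translation t such that Q ≈ R @ P + t, for
    matched 3-D landmark sets P, Q of shape (3, N)."""
    cp, cq = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    H = (P - cp) @ (Q - cq).T          # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    # Correct a possible reflection so that det(R) = +1
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```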

Navigation Function: Doppler Radar
Sensor measurements are processed to update the filter.
(Figure: illustration of open-loop Doppler-radar navigation performance.)

Navigation Function: LIDAR
LIDAR raw-data processing (under development). A scanning LIDAR provides 3D maps in the form of two angles and a range. Raw measurements are processed into a 3D map on a regular grid in the local geographic frame, used both for hazard mapping and for measurement extraction:
- Altitude
- Velocity, from two consecutive maps

Velocity extraction:
- Estimate the common area W between two consecutive maps Z1(x,y) and Z2(x,y)
- The translation (Tx, Ty, Tz) between the maps is obtained by minimizing

    ε = Σ_{(x,y) ∈ W} ( Z1(x + Tx, y + Ty) + Tz − Z2(x, y) )²

- Knowledge of the translation and of the delay between the two maps gives the velocity measurement
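The minimization above can be sketched by brute force over integer grid shifts, an assumption made here for brevity (a real implementation would refine to sub-cell accuracy); for each horizontal shift, the optimal Tz is simply the mean elevation offset over the overlap:

```python
import numpy as np

def map_translation(Z1, Z2, max_shift=3):
    """Translation (Tx, Ty, Tz) between two gridded elevation maps by
    minimizing eps = sum over the overlap W of
    (Z1(x+Tx, y+Ty) + Tz - Z2(x, y))^2, restricted to integer shifts.
    Dividing the result by the delay between maps gives the velocity."""
    best = (0, 0, 0.0, np.inf)
    h, w = Z1.shape
    for tx in range(-max_shift, max_shift + 1):
        for ty in range(-max_shift, max_shift + 1):
            # Overlapping window W for this shift: a holds Z1(x+Tx, y+Ty),
            # b holds Z2(x, y) over the same samples
            a = Z1[max(tx, 0):h + min(tx, 0), max(ty, 0):w + min(ty, 0)]
            b = Z2[max(-tx, 0):h + min(-tx, 0), max(-ty, 0):w + min(-ty, 0)]
            tz = float((b - a).mean())       # optimal Tz given (Tx, Ty)
            eps = float(((a + tz - b) ** 2).sum())
            if eps < best[3]:
                best = (tx, ty, tz, eps)
    return best[:3]
```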

Guidance and Control Functions
Guidance: acceleration commanded as a rate of the maximum available thrust. Selection among several techniques: gravity turn, neural networks, feedback trajectory control, bilinear tangent law, predictor-corrector. Robust to retargeting.
Attitude control: classical PID controller for the three axes.
Thrust monitoring function:
- Braking: number of engines to be ignited (0-1-2-3 for each cluster)
- Y-Z (X) control: deadband zone for ignition of the braking (roll) thrusters
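A single-axis version of the classical PID attitude controller can be sketched as follows; the gains are made-up illustrative values, not the flight tuning, and the real controller runs one such loop per axis:

```python
class AxisPID:
    """Classical PID regulator for one attitude axis: returns a torque
    command from the attitude error (gains are illustrative only)."""
    def __init__(self, kp=4.0, ki=0.1, kd=3.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integ = 0.0
        self.prev_err = None

    def step(self, err, dt):
        self.integ += err * dt                      # integral term
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integ + self.kd * deriv
```

Driving a unit-inertia axis with this controller pulls an initial attitude error back toward zero, which is all the attitude loop needs between retargetings.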

Guidance: Statistical Analysis
Monte Carlo analysis based on 100 runs, with dispersions on propulsion, aerodynamics and initial conditions. Vertical velocity at landing < 2.2 m/s at 3σ (mean = 1.8 m/s).

Demonstration
Video.