Cooperative Fusion Architecture in a Network Centric Environment
Dr. Omar Aboutalib
System Analysis and Sensor Fusion Corporation, 1142 South Diamond Bar Boulevard, M/S 247, Diamond Bar, California 91765, USA
omaraboutalib@hotmail.com

Abstract - Modern network-centric technology within the global information grid (GIG) permits the rapid and effective sharing of information to such a degree that nodes should be able to "pull" information, rather than having centralized locations predict their information needs and "push" it to them. In this paper, we introduce an innovative cooperative fusion architecture in which nodes with similar on-board sensors and algorithms fuse cooperatively, followed by fusion with relevant sources that may be available in the vertical layered sensing environment. This flattens the traditional network hierarchy and yields enhanced target location and reliable combat target identification.

Keywords: network centric fusion; cooperative fusion; decision level fusion; relevant and common operating picture; GPS-denied navigation.

I. PROBLEM STATEMENT

The role of information fusion, and the need to share fused information quickly, has become critical in light of modern asymmetric warfare. Sensors, commanders, and shooters must be networked to flatten the network hierarchy and to enhance the precision and reliability of information fusion. Modern network-centric technology within the GIG permits the rapid and effective sharing of information to such a degree that the nodes conducting military missions should themselves be able to "pull" information, rather than having centralized locations predict their information needs and "push" it to them. This implies a major flattening of the traditional network-centric architecture. However, the extensive amount of data and information collected in a network-centric enterprise can potentially overwhelm the war fighters to the point that critical event or threat information may not be addressed in time.
Therefore, the war fighters need effective processing of the vast amount of data and associated contextual information to generate timely information, mediate multiple hypotheses, and make the proper decisions. The advantages of network-centric fusion are the ability to adapt to unanticipated and unforeseen situations, to eliminate single points of failure, and to remain continuously operational while being dynamically updated. The advent of network-centric systems has served to accelerate the move toward systems of systems. Some of the challenges in building an effective network-centric fusion architecture are: networking all relevant nodes into one self-forming, self-healing network; network coordination, where every mobile node, whether a piece of equipment or a human participant, becomes a potential source or relay of information; maintaining a reliable common and/or relevant operating picture throughout; accurate location awareness where global positioning system (GPS) coverage is weak or denied; and the considerable interest in providing data fusion as a service in a service-oriented architecture (SOA) environment. The SOA concept was first initiated by the computer science community. In a SOA environment, functional capabilities are made available to users as services that interact through the core enterprise services, which include security control, messaging, and discovery services that allow users to become aware of available services. This need will impact the design of a fusion architecture intended to operate in the SOA environment. Multi-tier fusion in a vertically integrated ISR enterprise assumes that each node within the network knows its position fairly accurately. This assumption is not valid if key nodes of the GPS constellation are attacked. Therefore, a fallback method of navigation is required in time of crisis.
The all-source fusion of vision and navigation sensors is required to provide continuous and precise navigation in both GPS and non-GPS environments.

II. NETWORK CENTRIC FUSION ARCHITECTURE

In this section, we propose an innovative two-tier fusion architecture. Tier 1 is the cooperative fusion of similar nodes with similar on-board sensors and similar algorithms; in tier 2, each node then fuses with the relevant sources to which it may have access in the vertical layered sensing environment. Tier 1 fusion yields enhanced performance by enabling the distribution of all participating nodes in preferred configurations and generates a reliable common or relevant operating picture for the nodes participating in cooperative fusion. In tier 1 fusion, all participating nodes have the same sensors and processing algorithms and communicate over a fast local network. In tier 2, each node fuses its operating picture with any available information from sensors or sources in the vertical layered sensing environment. A new update to the common or relevant operating picture is then performed among the participating nodes in tier 1, as shown in Figure 1. Tier 1 fusion represents the fusion of sensors on board a single platform and across platforms in a given formation of like platforms using similar on-board sensors and algorithms,
and tier 2 fusion represents the additional vertical fusion with other sources in the layered sensing environment. As discussed in sections A and B, care must be exercised in multi-tiered fusion to avoid fusing dependent or duplicate information, which would build false belief.

Figure 1: Two-tier fusion in a network-centric environment maintains a common or relevant operating picture.

Tier 1 fusion encompasses the fusion of the detections or tracks of the on-board sensors of each aircraft to produce a Single-Platform Multi-Sensor (SPMS) database, as discussed in more detail in section A. Subsequently, cooperative fusion among the platforms within the formation yields a common or relevant operating picture by producing a Multi-Platform Multi-Sensor (MPMS) database, as shown in Figure 3. Tier 2 fusion is the fusion of each platform's operating picture with the relevant off-board assets in the vertically integrated sensing environment, followed by a subsequent cooperative fusion among the platforms within the formation that yields an updated common/relevant operating picture. A relevant operating picture includes the fused results in the common footprint area and any tracks outside the common footprint deemed to be of interest to a given platform (e.g., tracks that are expected to show up in the platform's field of regard). A common operating picture includes the fused results in the common footprint area as well as the combination of all other tracks.

A. Tier 1 Fusion for Target Tracking

Figure 2 depicts tier 1 single-platform multi-sensor fusion. As shown in Figure 1, raw detections from each single sensor on board each platform in the formation are correlated and fused into single-sensor tracks.
The single-sensor tracks are also used to build a database of sensor detections correlated to sensor tracks (marked as database 1 in Figure 2), which may be used for centralized fusion with other on-board sensor detections, as shown in Figure 2. There are two methods of fusion to enhance tracking performance: centralized fusion (marked as block A in Figure 2) and track-level fusion (marked as block B in Figure 2). Track fusion suffers from the cross-correlation of the tracking errors of two or more sensors tracking the same target; this cross-correlation must be taken into account in fusing the track data [1]-[3]. In general, centralized fusion tends to be more accurate than track fusion. A hybrid of the two approaches may be more appropriate, where track fusion is used for situation awareness of targets at long ranges while centralized fusion is used for fire control and targeting at closer ranges. An SPMS track database is first initiated using single-sensor tracks. The correlations of single-sensor tracks and SPMS tracks are then used to develop the centralized or track fusion databases, as shown in Figure 2. Cooperative fusion is performed as part of tier 1 fusion, where a portion of the on-board SPMS database is communicated to the other platforms within the formation via the in-flight data link, as shown in Figures 1 and 3. The transmit filter shown in Figure 3 selects the transmitted database based on geographical and temporal constraints. For example, the transmitted database may represent targets in the common footprint of two platforms for cooperative fusion with the other platform's tracks, or may represent tracks relevant to the other platform. The platform's receive filter will receive data from the other platforms, which may include sensor detections and/or tracks for multi-platform multi-sensor (MPMS) centralized or track fusion.
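Track-level fusion with the cross-covariance term retained can be sketched with the Bar-Shalom-Campo fused-track equations of [1]. The state dimension, covariances, and cross-covariance values below are illustrative assumptions, not data from the paper:

```python
import numpy as np

def fuse_tracks(x1, P1, x2, P2, P12):
    """Bar-Shalom-Campo track-to-track fusion: account for the
    cross-covariance P12 between the two tracks' estimation errors,
    e.g. due to common process noise [1]."""
    P21 = P12.T
    S = P1 + P2 - P12 - P21           # covariance of the track difference
    K = (P1 - P12) @ np.linalg.inv(S)
    x = x1 + K @ (x2 - x1)            # fused state estimate
    P = P1 - K @ (P1 - P21)           # fused covariance
    return x, P

# Two 2-D position tracks of the same target from different sensors
x1 = np.array([100.0, 50.0]); P1 = np.diag([4.0, 4.0])
x2 = np.array([102.0, 49.0]); P2 = np.diag([9.0, 9.0])
P12 = 0.5 * np.eye(2)                 # common-process-noise coupling (assumed)
xf, Pf = fuse_tracks(x1, P1, x2, P2, P12)
```

When P12 is zero the expression reduces to the familiar convex combination of independent estimates; ignoring a nonzero P12 overstates the confidence of the fused track.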
It should be noted that the received data must be inspected, using a maintained pedigree table, to determine whether it has been contaminated by the ownship database, as shown in Figure 3. Different track fusion methods apply to remove the estimation-error cross-correlation with ownship tracks [1]-[3]. A common and/or relevant operating picture is then formed and shared, as shown in Figure 1.

Figure 2: Tier 1 single-platform multi-sensor fusion for continuous and precise target tracking.

B. Tier 1 Fusion for Combat Identification

Existing combat identification (CID) implementations lack consistency and the required high confidence due to: 1) stovepipe architectures leading to disparate CID contributors with limited fusion; 2) lack of CID algorithm characterization for proper fusion; 3) available ISR data that is not fully utilized due to latency and ambiguity; 4) lack of target models representative of all potential target articulations and variations; 5) large variability in the extended operating conditions, including deployment of partially obscured targets and targets employing denial and deception (D&D) techniques; and 6)
black-box implementations of CID systems, which lead to expensive software validation and verification (V&V) as the CID technology matures and new and advanced CID algorithms become available. There are various levels of CID fusion, including pixel-level fusion, feature- or attribute-level fusion, and decision-level fusion [4]-[7]. Let us assume, without loss of generality, a formation of two platforms. Each aircraft has a wide-area-search Electronic Warfare (EW) or Electronic Support Measures (ESM) sensor for detection and identification of emitting air defense radars, a high-resolution Synthetic Aperture Radar (SAR) for long-range target detection and identification, and an Electro-Optical/Imaging Infrared (EO/IIR) terminal sensor. Figure 4 illustrates some of the potential CID fusion levels for the cited example. Figure 5 illustrates various levels of requirements for CID algorithm characterization. Level 1 demands the least from each CID algorithm: a single confusion matrix represents the minimum amount of information for the characterization of a single CID algorithm. Level 2 requires multiple confusion matrices representative of various levels of probability of correct classification (e.g., low, medium, and high) as well as extended operating conditions (e.g., different look-down angles and various weather states). Level 3 occurs if the multi-sensor CID algorithms are characterized jointly. It should be noted that in level 3 the CID algorithms are tightly coupled, but this level yields the most reliable fusion results. In level 1, the CID algorithms are loosely coupled, but may yield unreliable fusion performance. The various competing CID algorithms will start loosely coupled but over time become tightly coupled through an adaptive learning process. For track fusion methods, the estimation-error cross-correlation among tracks must be removed, as described in [1]-[3].
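Level-1 characterization, and the decision-level fusion it supports, can be sketched as follows. The confusion matrices, three-class target set, and uniform prior are illustrative assumptions; note that multiplying the per-algorithm likelihoods embodies exactly the independence assumption that section C cautions against, which joint (level-3) characterization avoids:

```python
import numpy as np

# Level-1 characterization: one confusion matrix per CID algorithm,
# rows = true class, cols = declared class (classes T1..T3, assumed).
conf_sar = np.array([[0.80, 0.15, 0.05],
                     [0.10, 0.85, 0.05],
                     [0.10, 0.10, 0.80]])
conf_eo  = np.array([[0.70, 0.20, 0.10],
                     [0.15, 0.75, 0.10],
                     [0.10, 0.15, 0.75]])

def fuse_declarations(decl_sar, decl_eo, prior):
    """Decision-level fusion of two declarations via Bayes' rule.
    The product of per-algorithm likelihoods assumes the algorithms
    err independently -- the simplification section C warns about."""
    like = conf_sar[:, decl_sar] * conf_eo[:, decl_eo]
    post = like * prior
    return post / post.sum()

post = fuse_declarations(1, 1, np.ones(3) / 3)   # both algorithms declare T2
```

With agreeing declarations the fused posterior concentrates sharply on the agreed class; with historically correlated algorithms, this product rule would overstate that confidence.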
Figure 3: Cooperative fusion among the nodes of a given formation.

C. Tier 2 CID Decision-Level Fusion and Track Fusion

In a vertically integrated sensing environment, CID decision-level fusion and track-level fusion are the most applicable. The CID decision-level fusion methods used in industry are not particularly effective: in some situations no performance gain is obtained from fusion, and in others the fused decision performance is worse than that of an individual sensor algorithm. This is due to assumptions made to simplify the mathematics of decision-level fusion, specifically the assumption that the joint decision probability of multiple identification algorithms is the product of the individual decision probabilities. An optimal CID decision-level fusion for either tier 1 or tier 2 should avoid this assumption and should utilize all available historical information to perform global optimization of the joint decision probability. All CID algorithms are characterized offline (a process known as automatic target recognition algorithm training) and their respective decision performance metrics are collected. These historical CID performance metrics should be used for the decision optimization.

Figure 4: Potential CID fusion levels in cooperative fusion.

Figure 5: CID algorithm characterization and decision-level fusion performance.

D. CHALLENGES OF GPS DENIAL IN LAYERED SENSING

While GPS/INS navigation performance is fairly accurate, navigation performance degrades significantly if GPS is denied. Multi-tier fusion in a vertically integrated ISR enterprise assumes that each node within the network knows its position fairly accurately. This assumption is not valid if key nodes of the GPS constellation are attacked. Terrain-relative navigation can be used as an alternative to rectify the GPS-denied problem. Optical flow processing can be used to infer the velocity vector of the flying platform. When the platform with the imaging sensor or camera moves with respect to a stationary three-dimensional (3-D) point, the 3-D velocity of the platform as seen by that stationary point can be mapped onto the sensor's image plane as a two-dimensional (2-D) image motion. As the platform moves, projected points on the image plane appear to travel away from the optical flow center, known as the focus of expansion (FOE). The FOE defines the point on the image plane corresponding to the point where the platform's velocity vector pierces the earth's surface. The field of computer vision has produced many approaches for computing ego-motion, which estimates an observer's movement (sensor or biological organism) from optical flow measurements. Optical flow is defined as the apparent motion of the brightness patterns. It is characterized by a field of 2-D velocity vectors that are the projections of the 3-D velocity vectors of the surface points onto the image plane [8]-[11]. The 2-D velocity vectors derived from a sequence of images are shown in Figure 6. Adaptive fusion in real time of all available navigation data, such as IMU/GPS, altimeters, star tracker, horizon sensor, passive imaging sensor, and a digital elevation database, is a promising approach for navigation in a non-GPS environment. For example, the integration of passive imaging sensors has some important advantages. Foremost, the sensors are completely passive and can operate in an environment where the GPS signal may be difficult to receive.
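The focus of expansion can be recovered from a sparse flow field by least squares: for pure translation, each flow vector points radially away from the FOE. A minimal sketch on synthetic, noise-free flow data (the flow field and FOE location are illustrative):

```python
import numpy as np

def estimate_foe(pts, flows):
    """Least-squares focus of expansion: for pure translation the flow
    (u, v) at pixel (x, y) is parallel to (x - x0, y - y0), so
    v*(x - x0) - u*(y - y0) = 0 for the unknown FOE (x0, y0)."""
    u, v = flows[:, 0], flows[:, 1]
    x, y = pts[:, 0], pts[:, 1]
    A = np.stack([v, -u], axis=1)
    b = v * x - u * y
    foe, *_ = np.linalg.lstsq(A, b, rcond=None)
    return foe

# Synthetic expanding flow field about a known FOE at (20, -10) pixels
rng = np.random.default_rng(0)
pts = rng.uniform(-100, 100, size=(50, 2))
flows = 0.05 * (pts - np.array([20.0, -10.0]))   # radial expansion
foe = estimate_foe(pts, flows)
```

With real imagery the flow vectors are noisy, so a robust variant (e.g. RANSAC over the same constraint) would typically replace the plain least-squares solve.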
Secondly, the sensors are immune to disruptions in the radio spectrum. The adaptive fusion of all navigation and sensory data will provide an optimal estimate of the vehicle position, velocity, and attitude, as well as estimates of the accelerometer, rate-gyro, and clock biases. The architecture of the fusion filter should be independent of specific sensor types and data rates, and hence easily adaptable to the addition of new sensors, changing data rates, and the loss or interruption of data from any given sensor. The fusion filter can be hosted on multiple platforms (airborne, ground), small robots, or dismounted man-wearable equipment to provide real-time adaptive fusion of all sources or signals of opportunity available to any node in the network, leading to distributed and collaborative navigation solutions. At the tier 1 level, the on-board single-platform sensors, such as EO/IR, SAR, altimeter, horizon sensor, inertial measurement unit (IMU), and star tracker, should be used for navigation, as shown in Figure 7. Collaboration among multiple platforms within a formation, and with other platforms in the vertically integrated sensing environment, will be used to enhance navigation performance and to transfer navigation accuracy to platforms flying over featureless terrain such as desert or water. Fusion is the main enabler of alternate navigation in a GPS-denied environment, as described in section E.

Figure 6: Optical flow field.

The measurement of the 2-D image velocity at a given point can be used to infer the 3-D velocity of the imaging platform. Since the optical flow is a projection of a 3-D velocity vector onto a 2-D image plane, there is an inherent ambiguity in inferring the platform's 3-D velocity vector. The observed optical flow therefore needs to be fused with other measurements, such as the scene depth or height, to remove the ambiguity and to provide an estimate of the 3-D velocity vector of the imaging platform.
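The fusion-filter idea can be sketched in one dimension: an altitude channel that dead-reckons with an accelerometer and corrects with altimeter fixes via a Kalman filter. The rates, noise values, and measurements are illustrative assumptions, not parameters from the paper:

```python
import numpy as np

dt, q, r = 0.01, 0.05, 4.0             # step, process noise, altimeter noise
F = np.array([[1.0, dt], [0.0, 1.0]])  # state: [altitude, vertical rate]
H = np.array([[1.0, 0.0]])             # altimeter observes altitude only
x = np.array([1000.0, 0.0])            # poor initial altitude guess
P = np.diag([100.0, 10.0])

def predict(x, P, accel):
    """Dead-reckon the altitude channel with the sensed acceleration."""
    x = F @ x + np.array([0.5 * dt**2, dt]) * accel
    P = F @ P @ F.T + q * np.eye(2)
    return x, P

def update(x, P, alt_meas):
    """Correct the dead-reckoned state with an altimeter fix."""
    S = H @ P @ H.T + r
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ (np.array([alt_meas]) - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

for _ in range(100):                   # level flight, constant altimeter fix
    x, P = predict(x, P, accel=0.0)
    x, P = update(x, P, alt_meas=1005.0)
```

The same predict/update skeleton extends to the full position-velocity-attitude state, with each available sensor (star tracker, landmark fix, magnetometer) supplying its own measurement model H, which is what makes the filter indifferent to the sensor mix and data rates.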
Knowing the observer's velocity vector and the last GPS position estimate will, in theory, enable navigation in a non-GPS environment. Using only optical flow measurements for such navigation encounters difficulties due to the variation of depth across the image field and the presence of noise in the image sequence. Legacy approaches combine precision radar sensors with digital terrain elevation databases to serve as a backup to the GPS system. Such approaches require highly accurate digital terrain data and radar sensors, and will fail if the platform travels over featureless terrain.

Figure 7: Network-centric autonomous and collaborative navigation in a GPS-denied environment.

E. Fusion for Navigation in a non-GPS Environment

Fusion of all available vision and navigation sensor data will provide alternative navigation in a GPS-denied environment. The various sensors contribute to continuous and precise navigation in a GPS-denied environment as described below:
Inertial Measurement Unit (IMU) - IMUs usually contain three orthogonal rate gyros, which measure the angular rotation rates of the vehicle's roll, pitch, and yaw, and three orthogonal accelerometers, which measure the vehicle's linear accelerations in the vehicle coordinate system. By processing signals from these six sensors at a high rate (usually 100 Hz or more), it is possible to use initial estimates of vehicle heading and position developed at lower rates to perform dead reckoning and generate the vehicle's heading and position at 100 Hz, as shown in Figure 8. The fusion of IMU measurements for navigation in a non-GPS environment needs to account for the various types of IMU error, such as misalignment, moving bias, random-walk noise, and scale factors. IMU sensing can be unreliable at very high altitudes.

Figure 8: Dead reckoning of vehicle heading and position.

Figure 9: Surface features are detected and tracked.

Star Trackers - The star tracker provides measurements of vehicle attitude by detecting bright stars and performing star pattern recognition and vehicle attitude determination based on lines of sight to the known stars. Star tracker measurements can be used when IMU sensing is unreliable at very high altitudes, and can also be fused with IMU measurements to reduce the dead-reckoning error. With the rapid advancement of imaging sensor hardware and high-speed processors, current star trackers are small and can achieve 1-arc-second attitude accuracy at sampling rates ranging from 1 to 10 Hz.

Horizon Sensor - The horizon sensor is a scanning infrared sensor (using a small focal plane array operating at micrometer wavelengths). It senses the comparative warmth (radiated intensity) of the atmosphere against the much colder cosmic background to determine the horizon. Attitude measurements from a star tracker can be combined with the horizon plane position developed using a horizon sensor to provide the vehicle's geolocation.
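The IMU dead-reckoning step of Figure 8 can be sketched in the planar case: integrate the yaw rate and the body-frame acceleration at the 100 Hz IMU rate from an initial heading and position. The initial state and inputs below are illustrative:

```python
import numpy as np

dt = 0.01                              # 100 Hz IMU rate

def dead_reckon(pos, vel, heading, yaw_rate, accel_body, steps):
    """Planar dead reckoning: rotate body-frame acceleration into the
    navigation frame, then integrate to velocity and position."""
    for _ in range(steps):
        heading += yaw_rate * dt
        c, s = np.cos(heading), np.sin(heading)
        accel_nav = np.array([c * accel_body[0] - s * accel_body[1],
                              s * accel_body[0] + c * accel_body[1]])
        vel = vel + accel_nav * dt
        pos = pos + vel * dt
    return pos, vel, heading

# One second of straight, unaccelerated flight at 10 m/s
pos, vel, heading = dead_reckon(np.zeros(2), np.array([10.0, 0.0]),
                                0.0, 0.0, np.zeros(2), steps=100)
```

Because position is a double integral of the sensed accelerations, any bias or scale-factor error grows quadratically with time, which is why the text stresses bounding the drift with external fixes.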
Electro-Optical / Imaging Infrared (EO/IIR) Sensor - EO/IIR sensors are useful vision sensors. Optical flow measurements and surface features, as shown in Figure 9, can be detected from the imagery stream and then correlated in real time with a database of landmarks [12]-[19]. Optical flow measurements can contribute to the estimation of the vehicle's velocity vector. Landmark measurements are important navigation aids because they provide position as well as orientation fixes, as shown in Figure 10. EO/IIR sensors are limited for alternate navigation in high-altitude missions and under low cloud ceilings where visibility is limited.

Figure 10: Landmarks can be used for position and orientation fixes.

Altimeter - Altitude measurements can be combined with EO/IIR optical flow measurements to determine vehicle velocity and position. Altitude can be measured using a barometer based on atmospheric pressure. A radar altimeter measures altitude more directly, using the time taken for a radio signal to reflect from the surface back to the aircraft. The radar altimeter is used to measure height above ground level during landing in commercial and military aircraft.

Synthetic Aperture Radar (SAR) - The SAR sensor can provide all-weather, autonomous navigation. By forming SAR imagery of the terrain and then correlating persistent SAR features with a stored reference obtained from optical photography or a prestored SAR imagery database, a navigation position fix can be obtained. Position accuracies of less than a SAR resolution cell can be achieved.

Magnetometer - A magnetometer is an instrument that measures the strength or direction of magnetic fields. As mentioned, IMUs are naturally subject to drift since they are essentially integrators, and any error in the sensor reading is compounded by the integration. They are often augmented by some other sensor to correct for drift. One method is to use a magnetometer to obtain an orientation measurement based on the earth's magnetic field.
Magnetometers help correct orientation error but make no direct contribution to correcting the vehicle's position or velocity.

Digital Terrain Elevation Data (DTED) - DTED databases are of great utility for navigation in a non-GPS environment; however, their limitations and accuracies need to be considered. The utility of a DTED database combined with a radar altimeter has been demonstrated to provide position fixes for low-altitude missions in many legacy systems.
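The magnetometer orientation fix described above can be sketched for level flight under an assumed convention (body x forward, y right, NED navigation frame, magnetic field purely horizontal); the field strength and true heading are illustrative:

```python
import numpy as np

def mag_heading(mx, my):
    """Heading from the body-frame horizontal field components under
    the stated level-flight NED convention. This bounds integrated
    gyro drift but carries no position or velocity information."""
    return np.arctan2(-my, mx)

# Synthetic measurement: rotate a north-pointing horizontal field of
# strength B into the body frame of a vehicle heading 60 degrees.
B = 48.0                               # horizontal field, illustrative units
psi_true = np.deg2rad(60.0)
mx, my = B * np.cos(psi_true), -B * np.sin(psi_true)
psi = mag_heading(mx, my)
```

In practice the measurement must also be tilt-compensated with the roll and pitch estimates and corrected for local magnetic declination before it is fused as a heading fix.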
III. CONCLUSIONS

Sensors, commanders, and shooters must be networked to flatten the network hierarchy and to enhance the precision and reliability of information fusion. Modern network-centric technology within the GIG permits nodes within the network to "pull" information, rather than having centralized locations predict their information needs and "push" it to them. We introduced an innovative cooperative fusion architecture in which nodes with similar on-board sensors and algorithms fuse cooperatively, followed by fusion with relevant sources that may be available in the vertical layered sensing environment. Tier 1 fusion yields enhanced performance by enabling the distribution of all participating nodes in preferred configurations and generates a reliable common or relevant operating picture for the nodes participating in cooperative fusion. Shortcomings of existing combat identification (CID) implementations and optimized decision-level fusion were discussed. Continuous and high-precision navigation is also required for network-centric fusion. Multi-tier fusion in a vertically integrated ISR enterprise assumes that each node within the network knows its position fairly accurately. This assumption is not valid if key nodes of the GPS constellation are attacked; therefore, a fallback method of navigation is required in time of crisis. At the tier 1 level, the on-board single-platform sensors, such as EO/IR, SAR, altimeter, horizon sensor, inertial measurement unit (IMU), and star tracker, can be used for terrain-relative navigation. The concept of all-source fusion of vision and navigation sensors was introduced for navigation in a non-GPS environment, and the contributions of the various types of vision and navigation sensors were presented. For higher-tier fusion, collaboration among multiple platforms will be used to enhance navigation performance and to transfer navigation accuracy to platforms flying over featureless terrain.

REFERENCES

[1] Y. Bar-Shalom and L. Campo, "The effect of common process noise on the two-sensor fused-track covariance," IEEE Transactions on Aerospace and Electronic Systems, vol. 22, no. 6, 1986.
[2] O. E. Drummond and D. Dana-Bashian, "Comparison of tracklet filter methods for non-maneuvering targets," Proc. SPIE 5913, 59131B, 2005.
[3] X. R. Li and Z. L. Zhao, "Measuring estimator's credibility: Non-credibility index," in Proceedings of the 9th International Information Fusion Conference, pp. 1-8, July.
[4] G. S. Robinson and A. Omar Aboutalib, "Trade-off analysis of multi-sensor fusion levels," 2nd National Symposium on Sensor Fusion, Orlando, Florida, vol. II, March 1989.
[5] A. Omar Aboutalib, L. Tran, and C. Hu, "Fusion of passive imaging sensor data for target acquisition and identification," 5th National Symposium on Sensor Fusion, Orlando, Florida, vol. I, April 1992.
[6] A. Omar Aboutalib, L. Tran, and J. Cameron, "Enhancement of 4-D laser radar imagery for target identification," 1993 Infrared Information Symposia on Active Systems, Monterey, California, Proc. IRIS, vol. II, November 1993.
[7] O. Aboutalib and T. J. Klausutis, "All source adaptive fusion for aided navigation in non-GPS environment," SPIE, March.
[8] B. K. P. Horn and B. G. Schunck, "Determining optical flow," Artificial Intelligence 17.
[9] B. D. Lucas and T. Kanade, "An iterative image registration technique with an application to stereo vision," Proc. DARPA Image Understanding Workshop.
[10] J. Weickert and C. Schnörr, "Variational optic flow computation with a spatio-temporal smoothness constraint," Journal of Mathematical Imaging and Vision 14.
[11] S. S. Beauchemin and J. L. Barron, "The computation of optical flow," ACM Computing Surveys, vol. 27, no. 3.
[12] J. Shi and C. Tomasi, "Good features to track," IEEE Conference on Computer Vision and Pattern Recognition (CVPR '94).
[13] B. K. P. Horn and B. G. Schunck, "Determining optical flow," Artificial Intelligence 17.
[14] B. D. Lucas and T. Kanade, "An iterative image registration technique with an application to stereo vision," Proc. DARPA Image Understanding Workshop.
[15] J. Weickert and C. Schnörr, "Variational optic flow computation with a spatio-temporal smoothness constraint," Journal of Mathematical Imaging and Vision 14.
[16] S. S. Beauchemin and J. L. Barron, "The computation of optical flow," ACM Computing Surveys, vol. 27, no. 3.
[17] J. Shi and C. Tomasi, "Good features to track," IEEE Conference on Computer Vision and Pattern Recognition (CVPR '94).
[18] A. M. Arsenio, "An embodied approach to perceptual grouping," Conference on Computer Vision and Pattern Recognition Workshop (CVPRW '04), p. 51.
[19] D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision.
More informationSensor Modalities. Sensor modality: Different modalities:
Sensor Modalities Sensor modality: Sensors which measure same form of energy and process it in similar ways Modality refers to the raw input used by the sensors Different modalities: Sound Pressure Temperature
More informationCamera and Inertial Sensor Fusion
January 6, 2018 For First Robotics 2018 Camera and Inertial Sensor Fusion David Zhang david.chao.zhang@gmail.com Version 4.1 1 My Background Ph.D. of Physics - Penn State Univ. Research scientist at SRI
More informationPersonal Navigation and Indoor Mapping: Performance Characterization of Kinect Sensor-based Trajectory Recovery
Personal Navigation and Indoor Mapping: Performance Characterization of Kinect Sensor-based Trajectory Recovery 1 Charles TOTH, 1 Dorota BRZEZINSKA, USA 2 Allison KEALY, Australia, 3 Guenther RETSCHER,
More informationInnovative Visual Navigation Solutions for ESA s Lunar Lander Mission Dr. E. Zaunick, D. Fischer, Dr. I. Ahrns, G. Orlando, B. Polle, E.
Lunar Lander Phase B1 Innovative Visual Navigation Solutions for ESA s Lunar Lander Mission Dr. E. Zaunick, D. Fischer, Dr. I. Ahrns, G. Orlando, B. Polle, E. Kervendal p. 0 9 th International Planetary
More informationComparison Between The Optical Flow Computational Techniques
Comparison Between The Optical Flow Computational Techniques Sri Devi Thota #1, Kanaka Sunanda Vemulapalli* 2, Kartheek Chintalapati* 3, Phanindra Sai Srinivas Gudipudi* 4 # Associate Professor, Dept.
More informationStrapdown Inertial Navigation Technology
Strapdown Inertial Navigation Technology 2nd Edition David Titterton and John Weston The Institution of Engineering and Technology Preface xv 1 Introduction 1 1.1 Navigation 1 1.2 Inertial navigation 2
More informationOutline Sensors. EE Sensors. H.I. Bozma. Electric Electronic Engineering Bogazici University. December 13, 2017
Electric Electronic Engineering Bogazici University December 13, 2017 Absolute position measurement Outline Motion Odometry Inertial systems Environmental Tactile Proximity Sensing Ground-Based RF Beacons
More informationUse of Image aided Navigation for UAV Navigation and Target Geolocation in Urban and GPS denied Environments
Use of Image aided Navigation for UAV Navigation and Target Geolocation in Urban and GPS denied Environments Precision Strike Technology Symposium Alison K. Brown, Ph.D. NAVSYS Corporation, Colorado Phone:
More informationDistributed Vision-Aided Cooperative Navigation Based on Three-View Geometry
Distributed Vision-Aided Cooperative Navigation Based on hree-view Geometry Vadim Indelman, Pini Gurfil Distributed Space Systems Lab, Aerospace Engineering, echnion Ehud Rivlin Computer Science, echnion
More informationOptical flow and tracking
EECS 442 Computer vision Optical flow and tracking Intro Optical flow and feature tracking Lucas-Kanade algorithm Motion segmentation Segments of this lectures are courtesy of Profs S. Lazebnik S. Seitz,
More informationAn Overview of Applanix.
An Overview of Applanix The Company The Industry Leader in Developing Aided Inertial Technology Founded on Canadian Aerospace and Defense Industry Expertise Providing Precise Position and Orientation Systems
More informationDominant plane detection using optical flow and Independent Component Analysis
Dominant plane detection using optical flow and Independent Component Analysis Naoya OHNISHI 1 and Atsushi IMIYA 2 1 School of Science and Technology, Chiba University, Japan Yayoicho 1-33, Inage-ku, 263-8522,
More informationAn Introduction to Lidar & Forestry May 2013
An Introduction to Lidar & Forestry May 2013 Introduction to Lidar & Forestry Lidar technology Derivatives from point clouds Applied to forestry Publish & Share Futures Lidar Light Detection And Ranging
More informationSURVEY OF LOCAL AND GLOBAL OPTICAL FLOW WITH COARSE TO FINE METHOD
SURVEY OF LOCAL AND GLOBAL OPTICAL FLOW WITH COARSE TO FINE METHOD M.E-II, Department of Computer Engineering, PICT, Pune ABSTRACT: Optical flow as an image processing technique finds its applications
More informationW4. Perception & Situation Awareness & Decision making
W4. Perception & Situation Awareness & Decision making Robot Perception for Dynamic environments: Outline & DP-Grids concept Dynamic Probabilistic Grids Bayesian Occupancy Filter concept Dynamic Probabilistic
More informationNon-symmetric membership function for Fuzzy-based visual servoing onboard a UAV
1 Non-symmetric membership function for Fuzzy-based visual servoing onboard a UAV M. A. Olivares-Méndez and P. Campoy and C. Martínez and I. F. Mondragón B. Computer Vision Group, DISAM, Universidad Politécnica
More informationCamera Calibration for a Robust Omni-directional Photogrammetry System
Camera Calibration for a Robust Omni-directional Photogrammetry System Fuad Khan 1, Michael Chapman 2, Jonathan Li 3 1 Immersive Media Corporation Calgary, Alberta, Canada 2 Ryerson University Toronto,
More informationFeature Tracking and Optical Flow
Feature Tracking and Optical Flow Prof. D. Stricker Doz. G. Bleser Many slides adapted from James Hays, Derek Hoeim, Lana Lazebnik, Silvio Saverse, who in turn adapted slides from Steve Seitz, Rick Szeliski,
More informationV-Sentinel: A Novel Framework for Situational Awareness and Surveillance
V-Sentinel: A Novel Framework for Situational Awareness and Surveillance Suya You Integrated Media Systems Center Computer Science Department University of Southern California March 2005 1 Objective Developing
More informationEvaluating the Performance of a Vehicle Pose Measurement System
Evaluating the Performance of a Vehicle Pose Measurement System Harry Scott Sandor Szabo National Institute of Standards and Technology Abstract A method is presented for evaluating the performance of
More informationCLASSIFICATION OF NONPHOTOGRAPHIC REMOTE SENSORS
CLASSIFICATION OF NONPHOTOGRAPHIC REMOTE SENSORS PASSIVE ACTIVE DIGITAL CAMERA THERMAL (e.g. TIMS) VIDEO CAMERA MULTI- SPECTRAL SCANNERS VISIBLE & NIR MICROWAVE HYPERSPECTRAL (e.g. AVIRIS) SLAR Real Aperture
More informationThis was written by a designer of inertial guidance machines, & is correct. **********************************************************************
EXPLANATORY NOTES ON THE SIMPLE INERTIAL NAVIGATION MACHINE How does the missile know where it is at all times? It knows this because it knows where it isn't. By subtracting where it is from where it isn't
More informationLeica Systems Overview
RC30 AERIAL CAMERA SYSTEM Leica Systems Overview The Leica RC30 aerial film camera is the culmination of decades of development, started with Wild's first aerial camera in the 1920s. Beautifully engineered
More informationINERTIAL NAVIGATION SENSOR INTEGRATED MOTION ANALYSIS FOR OBSTACLE DETECTION
INERTIAL NAVIGATION SENSOR INTEGRATED MOTION ANALYSIS FOR OBSTACLE DETECTION Barry Roberts, Banavar Sridhar*, and Bir Bhanuf Honeywell Systems and Research Center Minneapolis, Minnesota NASA Ames Research
More informationGIS Data Collection. This chapter reviews the main methods of GIS data capture and transfer and introduces key practical management issues.
9 GIS Data Collection OVERVIEW This chapter reviews the main methods of GIS data capture and transfer and introduces key practical management issues. It distinguishes between primary (direct measurement)
More informationENY-C2005 Geoinformation in Environmental Modeling Lecture 4b: Laser scanning
1 ENY-C2005 Geoinformation in Environmental Modeling Lecture 4b: Laser scanning Petri Rönnholm Aalto University 2 Learning objectives To recognize applications of laser scanning To understand principles
More informationA Multiple-Hypothesis Tracking of Multiple Ground Targets from Aerial Video with Dynamic Sensor Control *
A Multiple-Hypothesis Tracking of Multiple Ground Targets from Aerial Video with Dynamic Sensor Control * Pablo Arambel, Matthew Antone, Constantino Rago, Herbert Landau ALPHATECH, Inc, 6 New England Executive
More informationGI-Eye II GPS/Inertial System For Target Geo-Location and Image Geo-Referencing
GI-Eye II GPS/Inertial System For Target Geo-Location and Image Geo-Referencing David Boid, Alison Brown, Ph. D., Mark Nylund, Dan Sullivan NAVSYS Corporation 14960 Woodcarver Road, Colorado Springs, CO
More informationRDT&E BUDGET ITEM JUSTIFICATION SHEET (R-2 Exhibit) February 2000
PE NUMBER: 0602702F PE TITLE: Command Control and Communications BUDGET ACTIVITY RDT&E BUDGET ITEM JUSTIFICATION SHEET (R-2 Exhibit) February 2000 PE NUMBER AND TITLE 02 - Applied Research 0602702F Command
More informationQuaternion Kalman Filter Design Based on MEMS Sensors
, pp.93-97 http://dx.doi.org/10.14257/astl.2014.76.20 Quaternion Kalman Filter Design Based on MEMS Sensors Su zhongbin,yanglei, Kong Qingming School of Electrical and Information. Northeast Agricultural
More informationSatellite and Inertial Navigation and Positioning System
Satellite and Inertial Navigation and Positioning System Project Proposal By: Luke Pfister Dan Monroe Project Advisors: Dr. In Soo Ahn Dr. Yufeng Lu EE 451 Senior Capstone Project December 10, 2009 PROJECT
More informationGeometric Rectification of Remote Sensing Images
Geometric Rectification of Remote Sensing Images Airborne TerrestriaL Applications Sensor (ATLAS) Nine flight paths were recorded over the city of Providence. 1 True color ATLAS image (bands 4, 2, 1 in
More informationLight Detection and Ranging (LiDAR)
Light Detection and Ranging (LiDAR) http://code.google.com/creative/radiohead/ Types of aerial sensors passive active 1 Active sensors for mapping terrain Radar transmits microwaves in pulses determines
More informationMissile Simulation in Support of Research, Development, Test Evaluation and Acquisition
NDIA 2012 Missile Simulation in Support of Research, Development, Test Evaluation and Acquisition 15 May 2012 Briefed by: Stephanie Brown Reitmeier United States Army Aviation and Missile Research, Development,
More informationStrapdown Inertial Navigation Technology, Second Edition D. H. Titterton J. L. Weston
Strapdown Inertial Navigation Technology, Second Edition D. H. Titterton J. L. Weston NavtechGPS Part #1147 Progress in Astronautics and Aeronautics Series, 207 Published by AIAA, 2004, Revised, 2nd Edition,
More informationRevising Stereo Vision Maps in Particle Filter Based SLAM using Localisation Confidence and Sample History
Revising Stereo Vision Maps in Particle Filter Based SLAM using Localisation Confidence and Sample History Simon Thompson and Satoshi Kagami Digital Human Research Center National Institute of Advanced
More informationAPPLICATION OF AERIAL VIDEO FOR TRAFFIC FLOW MONITORING AND MANAGEMENT
Pitu Mirchandani, Professor, Department of Systems and Industrial Engineering Mark Hickman, Assistant Professor, Department of Civil Engineering Alejandro Angel, Graduate Researcher Dinesh Chandnani, Graduate
More informationMotion and Optical Flow. Slides from Ce Liu, Steve Seitz, Larry Zitnick, Ali Farhadi
Motion and Optical Flow Slides from Ce Liu, Steve Seitz, Larry Zitnick, Ali Farhadi We live in a moving world Perceiving, understanding and predicting motion is an important part of our daily lives Motion
More informationMULTI-MODAL MAPPING. Robotics Day, 31 Mar Frank Mascarich, Shehryar Khattak, Tung Dang
MULTI-MODAL MAPPING Robotics Day, 31 Mar 2017 Frank Mascarich, Shehryar Khattak, Tung Dang Application-Specific Sensors Cameras TOF Cameras PERCEPTION LiDAR IMU Localization Mapping Autonomy Robotic Perception
More informationFAST REGISTRATION OF TERRESTRIAL LIDAR POINT CLOUD AND SEQUENCE IMAGES
FAST REGISTRATION OF TERRESTRIAL LIDAR POINT CLOUD AND SEQUENCE IMAGES Jie Shao a, Wuming Zhang a, Yaqiao Zhu b, Aojie Shen a a State Key Laboratory of Remote Sensing Science, Institute of Remote Sensing
More informationSpace and Naval Warfare Systems Center Atlantic Information Warfare Research Project (IWRP)
Space and Naval Warfare Systems Center Atlantic Information Warfare Research Project (IWRP) SSC Atlantic is part of the Naval Research & Development Establishment (NR&DE) Information Warfare Research Project
More informationCamera Drones Lecture 2 Control and Sensors
Camera Drones Lecture 2 Control and Sensors Ass.Prof. Friedrich Fraundorfer WS 2017 1 Outline Quadrotor control principles Sensors 2 Quadrotor control - Hovering Hovering means quadrotor needs to hold
More informationTerrestrial GPS setup Fundamentals of Airborne LiDAR Systems, Collection and Calibration. JAMIE YOUNG Senior Manager LiDAR Solutions
Terrestrial GPS setup Fundamentals of Airborne LiDAR Systems, Collection and Calibration JAMIE YOUNG Senior Manager LiDAR Solutions Topics Terrestrial GPS reference Planning and Collection Considerations
More informationApplying Synthetic Images to Learning Grasping Orientation from Single Monocular Images
Applying Synthetic Images to Learning Grasping Orientation from Single Monocular Images 1 Introduction - Steve Chuang and Eric Shan - Determining object orientation in images is a well-established topic
More informationSpatio-Temporal Stereo Disparity Integration
Spatio-Temporal Stereo Disparity Integration Sandino Morales and Reinhard Klette The.enpeda.. Project, The University of Auckland Tamaki Innovation Campus, Auckland, New Zealand pmor085@aucklanduni.ac.nz
More informationTransactions on Information and Communications Technologies vol 16, 1996 WIT Press, ISSN
ransactions on Information and Communications echnologies vol 6, 996 WI Press, www.witpress.com, ISSN 743-357 Obstacle detection using stereo without correspondence L. X. Zhou & W. K. Gu Institute of Information
More informationVideo Georegistration: Key Challenges. Steve Blask Harris Corporation GCSD Melbourne, FL 32934
Video Georegistration: Key Challenges Steve Blask sblask@harris.com Harris Corporation GCSD Melbourne, FL 32934 Definitions Registration: image to image alignment Find pixel-to-pixel correspondences between
More informationMotion and Target Tracking (Overview) Suya You. Integrated Media Systems Center Computer Science Department University of Southern California
Motion and Target Tracking (Overview) Suya You Integrated Media Systems Center Computer Science Department University of Southern California 1 Applications - Video Surveillance Commercial - Personals/Publics
More informationFusion of Radar and EO-sensors for Surveillance
of Radar and EO-sensors for Surveillance L.J.H.M. Kester, A. Theil TNO Physics and Electronics Laboratory P.O. Box 96864, 2509 JG The Hague, The Netherlands kester@fel.tno.nl, theil@fel.tno.nl Abstract
More informationExploitation of GPS-Control Points in low-contrast IR-imagery for homography estimation
Exploitation of GPS-Control Points in low-contrast IR-imagery for homography estimation Patrick Dunau 1 Fraunhofer-Institute, of Optronics, Image Exploitation and System Technologies (IOSB), Gutleuthausstr.
More informationUsing temporal seeding to constrain the disparity search range in stereo matching
Using temporal seeding to constrain the disparity search range in stereo matching Thulani Ndhlovu Mobile Intelligent Autonomous Systems CSIR South Africa Email: tndhlovu@csir.co.za Fred Nicolls Department
More informationMotion estimation of unmanned marine vehicles Massimo Caccia
Motion estimation of unmanned marine vehicles Massimo Caccia Consiglio Nazionale delle Ricerche Istituto di Studi sui Sistemi Intelligenti per l Automazione Via Amendola 122 D/O, 70126, Bari, Italy massimo.caccia@ge.issia.cnr.it
More informationInternational Research Journal of Engineering and Technology (IRJET) e-issn: Volume: 04 Issue: 09 Sep p-issn:
Automatic Target Detection Using Maximum Average Correlation Height Filter and Distance Classifier Correlation Filter in Synthetic Aperture Radar Data and Imagery Puttaswamy M R 1, Dr. P. Balamurugan 2
More informationAided-inertial for GPS-denied Navigation and Mapping
Aided-inertial for GPS-denied Navigation and Mapping Erik Lithopoulos Applanix Corporation 85 Leek Crescent, Richmond Ontario, Canada L4B 3B3 elithopoulos@applanix.com ABSTRACT This paper describes the
More informationMini Survey Paper (Robotic Mapping) Ryan Hamor CPRE 583 September 2011
Mini Survey Paper (Robotic Mapping) Ryan Hamor CPRE 583 September 2011 Introduction The goal of this survey paper is to examine the field of robotic mapping and the use of FPGAs in various implementations.
More informationSelection and Integration of Sensors Alex Spitzer 11/23/14
Selection and Integration of Sensors Alex Spitzer aes368@cornell.edu 11/23/14 Sensors Perception of the outside world Cameras, DVL, Sonar, Pressure Accelerometers, Gyroscopes, Magnetometers Position vs
More informationDigital Image Processing Lectures 1 & 2
Lectures 1 & 2, Professor Department of Electrical and Computer Engineering Colorado State University Spring 2013 Introduction to DIP The primary interest in transmitting and handling images in digital
More informationChapter 9 Object Tracking an Overview
Chapter 9 Object Tracking an Overview The output of the background subtraction algorithm, described in the previous chapter, is a classification (segmentation) of pixels into foreground pixels (those belonging
More informationRange Sensors (time of flight) (1)
Range Sensors (time of flight) (1) Large range distance measurement -> called range sensors Range information: key element for localization and environment modeling Ultrasonic sensors, infra-red sensors
More informationINFRARED AUTONOMOUS ACQUISITION AND TRACKING
INFRARED AUTONOMOUS ACQUISITION AND TRACKING Teresa L.P. Olson and Harry C. Lee Teresa.Lolson@lmco.com (407) 356-7109 Harrv.c.lee@lmco.com (407) 356-6997 Lockheed Martin Missiles and Fire Control - Orlando
More informationAmateur Rocketry Flight Data Logging Device Version II
Amateur Rocketry Flight Data Logging Device Version II John Litzenberger & Ben Merryman Design Requirements Document University of Colorado at Colorado Springs Table of Contents: Overview...3 Statement
More informationCourse Outline (1) #6 Data Acquisition for Built Environment. Fumio YAMAZAKI
AT09.98 Applied GIS and Remote Sensing for Disaster Mitigation #6 Data Acquisition for Built Environment 9 October, 2002 Fumio YAMAZAKI yamazaki@ait.ac.th http://www.star.ait.ac.th/~yamazaki/ Course Outline
More informationChapter 13. Vision Based Guidance. Beard & McLain, Small Unmanned Aircraft, Princeton University Press, 2012,
Chapter 3 Vision Based Guidance Beard & McLain, Small Unmanned Aircraft, Princeton University Press, 22, Chapter 3: Slide Architecture w/ Camera targets to track/avoid vision-based guidance waypoints status
More informationDynamic Modelling for MEMS-IMU/Magnetometer Integrated Attitude and Heading Reference System
International Global Navigation Satellite Systems Society IGNSS Symposium 211 University of New South Wales, Sydney, NSW, Australia 15 17 November, 211 Dynamic Modelling for MEMS-IMU/Magnetometer Integrated
More informationTesting the Possibilities of Using IMUs with Different Types of Movements
137 Testing the Possibilities of Using IMUs with Different Types of Movements Kajánek, P. and Kopáčik A. Slovak University of Technology, Faculty of Civil Engineering, Radlinského 11, 81368 Bratislava,
More informationVisible and Long-Wave Infrared Image Fusion Schemes for Situational. Awareness
Visible and Long-Wave Infrared Image Fusion Schemes for Situational Awareness Multi-Dimensional Digital Signal Processing Literature Survey Nathaniel Walker The University of Texas at Austin nathaniel.walker@baesystems.com
More informationAdaptive Multi-Stage 2D Image Motion Field Estimation
Adaptive Multi-Stage 2D Image Motion Field Estimation Ulrich Neumann and Suya You Computer Science Department Integrated Media Systems Center University of Southern California, CA 90089-0781 ABSRAC his
More informationVisual motion. Many slides adapted from S. Seitz, R. Szeliski, M. Pollefeys
Visual motion Man slides adapted from S. Seitz, R. Szeliski, M. Pollefes Motion and perceptual organization Sometimes, motion is the onl cue Motion and perceptual organization Sometimes, motion is the
More informationElectronic and Mission Systems
Electronic and Mission Systems Boeing Networks, Space, and Security Small Business Workshop Bob Tamaru November 8, 2011 Where We Fit In Boeing Boeing Military Aircraft AEW&C P-8A C-17 Italy Japan 7A7 F-18
More informationUAV Autonomous Navigation in a GPS-limited Urban Environment
UAV Autonomous Navigation in a GPS-limited Urban Environment Yoko Watanabe DCSD/CDIN JSO-Aerial Robotics 2014/10/02-03 Introduction 2 Global objective Development of a UAV onboard system to maintain flight
More informationRelating Local Vision Measurements to Global Navigation Satellite Systems Using Waypoint Based Maps
Relating Local Vision Measurements to Global Navigation Satellite Systems Using Waypoint Based Maps John W. Allen Samuel Gin College of Engineering GPS and Vehicle Dynamics Lab Auburn University Auburn,
More informationPervasive Computing. OpenLab Jan 14 04pm L Institute of Networked and Embedded Systems
Pervasive Computing Institute of Networked and Embedded Systems OpenLab 2010 Jan 14 04pm L4.1.01 MISSION STATEMENT Founded in 2007, the Pervasive Computing Group at Klagenfurt University is part of the
More informationManeuver Strategy in Beyond-Visual-Range Air Combat
2011 International Conference on Information Communication and Management IPCSIT vol.16 (2011) (2011) IACSIT Press, Singapore Maneuver Strategy in Beyond-Visual-Range Air Combat Liang Xiao, Jun Huang Beijing
More informationCS 378: Autonomous Intelligent Robotics. Instructor: Jivko Sinapov
CS 378: Autonomous Intelligent Robotics Instructor: Jivko Sinapov http://www.cs.utexas.edu/~jsinapov/teaching/cs378/ Visual Registration and Recognition Announcements Homework 6 is out, due 4/5 4/7 Installing
More informationCHARACTERIZATION AND CALIBRATION OF MEMS INERTIAL MEASUREMENT UNITS
CHARACTERIZATION AND CALIBRATION OF MEMS INERTIAL MEASUREMENT UNITS ökçen Aslan 1,2, Afşar Saranlı 2 1 Defence Research and Development Institute (SAE), TÜBİTAK 2 Dept. of Electrical and Electronics Eng.,
More informationComputer Animation and Visualisation. Lecture 3. Motion capture and physically-based animation of characters
Computer Animation and Visualisation Lecture 3. Motion capture and physically-based animation of characters Character Animation There are three methods Create them manually Use real human / animal motions
More informationAerial and Mobile LiDAR Data Fusion
Creating Value Delivering Solutions Aerial and Mobile LiDAR Data Fusion Dr. Srini Dharmapuri, CP, PMP What You Will Learn About LiDAR Fusion Mobile and Aerial LiDAR Technology Components & Parameters Project
More informationNAVIGATION AND ELECTRO-OPTIC SENSOR INTEGRATION TECHNOLOGY FOR FUSION OF IMAGERY AND DIGITAL MAPPING PRODUCTS. Alison Brown, NAVSYS Corporation
NAVIGATION AND ELECTRO-OPTIC SENSOR INTEGRATION TECHNOLOGY FOR FUSION OF IMAGERY AND DIGITAL MAPPING PRODUCTS Alison Brown, NAVSYS Corporation Paul Olson, CECOM Abstract Several military and commercial
More information