E80. Experimental Engineering. Lecture 9 Inertial Measurement


Lecture 9 Inertial Measurement http://www.volker-doormann.org/physics.htm Feb. 19, 2013 Christopher M. Clark

Where is the rocket?

Outline Sensors People Accelerometers Gyroscopes Representations State Prediction Example Systems Bounding with KF

Sensors People http://en.wikipedia.org/wiki/file:bigotolith.jpg

Sensors Accelerometers a = k x / m

Sensors Accelerometers The accelerometers are typically MEMS-based: small cantilever beams (~100 µm). Khir MH, Qu P, Qu H - Sensors (Basel) (2011)

Sensors Gyroscopes (original) Mounted on two nested gimbals, the spinning wheel of the gyroscope is free to take any orientation. A high spin rate leads to high angular momentum.

E80 Sensors http://www.youtube.com/watch?v=cquva_ipesa

Sensors Gyroscopes (original) The gyroscope resists any change of orientation caused by external torques, due to the principle of conservation of angular momentum. The orientation of the gyroscope remains nearly fixed, regardless of any motion of the platform.

Sensors Gyroscopes (original) Gyroscope orientation can be obtained by measuring the angles between adjacent gimbals.

E80 Sensors MEMS (Microelectromechanical Systems) Gyroscopes Two masses oscillate back and forth from the center of rotation with velocity v. A rotation will cause a Coriolis force in this coordinate frame.

Sensors MEMS Gyroscopes The deflection y of the masses is measured, giving the Coriolis force F_Coriolis = k y. Since the mass and drive velocity are known, the rotation rate Ω can be recovered from F_Coriolis = -2m Ω × v.
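The rate recovery above can be sketched numerically. All values here (spring constant k, proof-mass m, drive velocity v, deflection y) are illustrative assumptions, not figures from the lecture; only the magnitude relation |F_Coriolis| = 2 m Ω v is used.

```python
# Recovering the rotation rate of a MEMS gyro from its measured deflection.
# Illustrative numbers only (k, m, v, y are assumed for the sketch).
k = 50.0   # spring constant of the sensing flexure, N/m
m = 1e-9   # proof-mass, kg
v = 1.0    # drive velocity of the oscillating mass, m/s
y = 2e-9   # measured deflection, m

F_coriolis = k * y                 # F = k * y
omega = F_coriolis / (2 * m * v)   # |F| = 2 m Omega v  =>  Omega = F / (2 m v)
```

With these assumed values the computed rate is 50 rad/s; a real device would also have to calibrate out bias and scale-factor errors.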

Sensors Inertial Measurement Units 3 Accelerometers 3 Gyroscopes 3 Magnetometers(?) www.barnardmicrosystems.com

Sensors Inertial Measurement Units

Outline Sensors Representations Cartesian Coordinate Frames Transformations State Prediction Example Systems Bounding with KF

Representations Cartesian Coordinate Frames We can represent the 3D position of a vehicle with the vector r = [x y z]

Representations Euler Angles We can represent the 3D orientation of a vehicle with the vector Φ = [α β γ], where roll = α, pitch = β, yaw = γ

Representations Where do we place the origin? We can fix the origin at a specific location on earth, e.g. a rocket's launch pad. This is called the global or inertial coordinate frame.

Representations Where do we place the origin? We can ALSO fix the origin on the vehicle. This is called the local coordinate frame.

Representations Where do we place the origin? We must differentiate between these two frames. What is the real difference between them? A transformation consisting of a rotation and a translation.

Representations Transformations The rotation can be about 3 axes (i.e. roll α, pitch β, yaw γ).


Representations Transformations The translation can be in three directions.

Representations Rotations In 2D, it is easy to determine the effect of a rotation by angle α on a vector q1:

q2 = [ cos(α)  -sin(α) ] q1 = R(α) q1
     [ sin(α)   cos(α) ]

Representations Rotations In 3D, we can use similar rotation matrices, e.g. for a rotation by α about the x axis:

q2 = R_x(α) q1,  where R_x(α) = [ 1     0        0
                                  0   cos(α)  -sin(α)
                                  0   sin(α)   cos(α) ]

Representations

R_x(α) = [ 1     0        0          R_y(β) = [  cos(β)  0  sin(β)        R_z(γ) = [ cos(γ)  -sin(γ)  0
           0   cos(α)  -sin(α)                     0     1    0                      sin(γ)   cos(γ)  0
           0   sin(α)   cos(α) ]                -sin(β)  0  cos(β) ]                   0        0     1 ]

Representations Rotations For 3 rotations, we can write a general rotation matrix R(α, β, γ) = R_x(α) R_y(β) R_z(γ). Hence, we can rotate any vector with the general rotation matrix: q2 = R(α, β, γ) q1
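The three elementary rotation matrices and their composition can be sketched with NumPy. Note that the composition order R_x R_y R_z is just the convention stated on the slide; other Euler-angle orders exist and give different results.

```python
import numpy as np

def Rx(a):
    """Rotation by angle a about the x axis."""
    return np.array([[1, 0, 0],
                     [0, np.cos(a), -np.sin(a)],
                     [0, np.sin(a),  np.cos(a)]])

def Ry(b):
    """Rotation by angle b about the y axis."""
    return np.array([[ np.cos(b), 0, np.sin(b)],
                     [ 0,         1, 0        ],
                     [-np.sin(b), 0, np.cos(b)]])

def Rz(g):
    """Rotation by angle g about the z axis."""
    return np.array([[np.cos(g), -np.sin(g), 0],
                     [np.sin(g),  np.cos(g), 0],
                     [0,          0,         1]])

def R(alpha, beta, gamma):
    # General rotation matrix: compose the three elementary rotations.
    return Rx(alpha) @ Ry(beta) @ Rz(gamma)

# Yaw the x-axis unit vector by 90 degrees: it should land on the y axis.
q1 = np.array([1.0, 0.0, 0.0])
q2 = R(0.0, 0.0, np.pi / 2) @ q1
```

Rotating [1 0 0] by a 90° yaw gives [0 1 0], as the 2D picture on the earlier slide suggests.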

Outline Sensors Representations State Prediction Updating R(t) Updating r(t) Example Systems Bounding Errors with KF

State Prediction Strapdown Inertial Navigation Our IMU is fixed to the local frame. We care about the state of the vehicle in the global frame.

Updating R(t) Given: ω_L(t) = [ ω_x,L(t) ω_y,L(t) ω_z,L(t) ] Find: R(t)

Updating R(t) Let's define the rotational velocity matrix based on our gyroscope measurements ω_L = [ω_x,L(t) ω_y,L(t) ω_z,L(t)]:

Ω(t) = [   0       -ω_z(t)   ω_y(t)
         ω_z(t)      0      -ω_x(t)
        -ω_y(t)    ω_x(t)      0    ]

Updating R(t) It can be shown that a vehicle rotating with velocity Ω(t) for δt seconds will (approximately) yield the resulting rotation matrix R(t+δt): R(t+δt) = R(t) [ I + Ω(t)δt ]
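The first-order orientation update above can be sketched as follows; the gyro rates and time step are assumed values for illustration.

```python
import numpy as np

def skew(w):
    """Rotational velocity matrix Omega built from gyro rates [wx, wy, wz]."""
    wx, wy, wz = w
    return np.array([[  0, -wz,  wy],
                     [ wz,   0, -wx],
                     [-wy,  wx,   0]])

def update_R(R, w, dt):
    """First-order update R(t+dt) = R(t) [I + Omega(t) dt]."""
    return R @ (np.eye(3) + skew(w) * dt)

# Integrate a constant yaw rate of 0.1 rad/s for 1 s in 1 ms steps.
R = np.eye(3)
for _ in range(1000):
    R = update_R(R, [0.0, 0.0, 0.1], 0.001)
# R is now close to a pure z-rotation by 0.1 rad.
```

Because the update is only first-order, repeated steps slowly drive R away from a true (orthonormal) rotation matrix; practical strapdown implementations re-orthonormalize periodically or use quaternions.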

Updating r(t) Given: a_L = [ a_x,L a_y,L a_z,L ] and R(t) Find: r_G = [ x_G y_G z_G ]


Updating r(t) First, convert to the global reference frame: a_G(t) = R(t) a_L(t) Second, remove the gravity term: a_G(t) = [ a_x,G(t) a_y,G(t) a_z,G(t) - g ]

Updating r(t) Third, integrate to obtain velocity: v_G(t) = v_G(0) + ∫₀ᵗ a_G(τ) dτ Fourth, integrate to obtain position: r_G(t) = r_G(0) + ∫₀ᵗ v_G(τ) dτ

Updating r(t) In discrete time: integrate to obtain (approximate) velocity v_G(t+δt) = v_G(t) + a_G(t+δt) δt, then integrate again to obtain (approximate) position r_G(t+δt) = r_G(t) + v_G(t+δt) δt
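The four steps above (rotate, remove gravity, integrate twice) can be sketched as a single prediction function. The value of g and the stationary test input are assumptions for the sketch.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2 (assumed value)

def predict(r, v, R, a_local, dt):
    """One strapdown prediction step from a body-frame accelerometer sample."""
    a_g = R @ a_local                     # 1. convert to the global frame
    a_g = a_g - np.array([0.0, 0.0, G])   # 2. remove the gravity term
    v = v + a_g * dt                      # 3. integrate once for velocity
    r = r + v * dt                        # 4. integrate again for position
    return r, v

# Sanity check: a stationary, level vehicle's accelerometer reads +g on z,
# so the predicted position and velocity should stay at zero.
r, v = np.zeros(3), np.zeros(3)
for _ in range(100):
    r, v = predict(r, v, np.eye(3), np.array([0.0, 0.0, G]), 0.01)
```

In practice any accelerometer bias survives the gravity subtraction and is integrated twice, which is exactly the drift problem the Kalman-filter section addresses.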

Outline Sensors Representations State Prediction Example Systems LN-3 The Jaguar Bounding Errors with KF

E80 Example Systems LN-3 Inertial Navigation System Developed in the 1960s. Used gyros to keep the platform steady. Accelerometers on the platform were used to obtain accelerations in the global coordinate frame. Accelerations were (double) integrated to obtain position.

Example Systems http://www.youtube.com/watch?feature=player_embedded&v=epbr_4ymehs

Example Systems The Jaguar Lite Equipped with an IMU, camera, laser scanner, encoders, GPS

Example Systems The Jaguar Lite GUI provides acceleration measurements

Example Systems Question: Can we use the accelerometers alone to measure orientation?
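One answer sketch, under the assumption that the vehicle is static so the accelerometer measures only gravity: roll and pitch can then be recovered from the direction of the gravity vector, but yaw cannot, since rotating about gravity leaves the reading unchanged. The formulas below are the standard tilt-from-gravity relations, not taken from the lecture.

```python
import math

def tilt_from_accel(ax, ay, az):
    """Roll and pitch (rad) from a static accelerometer reading.

    Assumes the only measured acceleration is gravity. Yaw is unobservable
    this way: rotating about the gravity vector changes nothing."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

# A level sensor reads gravity entirely on the z axis: both angles are zero.
roll, pitch = tilt_from_accel(0.0, 0.0, 9.81)
```

Once the vehicle accelerates, the gravity and motion components mix, which is why tilt estimates are usually fused with gyro data rather than used alone.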

Outline Sensors Representations State Prediction Example Systems Bounding Errors with KF Exteroceptive Sensing Fusing measurements

Bounding our Errors Exteroceptive sensors Drift in inertial navigation is a problem. We often use exteroceptive sensors, which measure the world outside the robot, to bound our errors. Examples include vision systems, GPS, and range finders.

Bounding our Errors Exteroceptive sensors We can fuse measurements, e.g. integrated accelerometer measurements and range measurements, by averaging. For example, consider the 1D position estimate of the Jaguar: x = 0.5 ( x_IMU + (x_wall - x_laser) ), where x_IMU is the double-integrated IMU measurement, x_wall is the distance from the origin to the wall, and x_laser is the central range measurement.

Bounding our Errors Exteroceptive sensors Let's weight the average, where the weights reflect measurement confidence: x = ( w_IMU x_IMU + w_laser (x_wall - x_laser) ) / ( w_IMU + w_laser )

Bounding our Errors Exteroceptive sensors This leads us to a 1D Kalman filter: x_t = x_IMU,t + K_t [ (x_wall - x_laser,t) - x_IMU,t ], with gain K_t = σ²_IMU / (σ²_IMU + σ²_laser) and updated variance σ²_x = (1 - K_t) σ²_IMU
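The 1D update above is small enough to sketch directly. The numeric inputs (variances, positions) are made up for illustration.

```python
def fuse_1d(x_imu, x_wall, x_laser, var_imu, var_laser):
    """One 1D Kalman update fusing the IMU estimate with a laser range."""
    K = var_imu / (var_imu + var_laser)            # Kalman gain
    x = x_imu + K * ((x_wall - x_laser) - x_imu)   # corrected estimate
    var = (1 - K) * var_imu                        # reduced uncertainty
    return x, var

# Drifty IMU says 2.0 m; the laser sees a wall at 10 m from 8.2 m away,
# implying a position of 1.8 m. The laser is trusted more (smaller variance).
x, var = fuse_1d(2.0, 10.0, 8.2, var_imu=0.5, var_laser=0.1)
```

The fused estimate lands between the two measurements, pulled toward the lower-variance laser, and the resulting variance is smaller than either input's, which is what bounds the IMU's drift.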

Bounding our Errors IMU vs. GPS:
IMU - higher sampling rate; small errors between time steps (maybe centimeters); uncertainty increases, with large build-up over time (unbounded).
GPS - lower sampling rate; larger errors (maybe meters); uncertainty decreases; no build-up (bounded).

E80 Bounding our Errors http://www.youtube.com/watch?v=i_ccegm4x4c