
Assignments Policies
You must typeset; choices: Word (very easy to type math expressions), LaTeX (very easy to type math expressions), Google Docs, plain text + math formulas, or your favorite text/doc editor.
Submit a single PDF for the written part.
There could be some coding tasks in assignments. Unless otherwise stated, you don't need to submit your code for HW assignments unless explicitly told to do so.
Follow the instructions given in each assignment.
Collaboration is encouraged among small groups (e.g., 2-3 students). You should disclose with whom you have discussed your problems. This helps especially if multiple people make identical mistakes.

CS 460/560 Introduction to Computational Robotics
Fall 2017, Rutgers University
Lecture 04: Sensor Mechanisms
Instructor: Jingjin Yu

Outline
Common sensors:
  Encoders
  Heading/orientation
  Acceleration
  Inertial measurement unit (IMU)
  Range sensors
  Motion/speed sensors
  Vision
Modeling sensors

Encoders, Absolute
Wheel encoders measure position and rotation speed.
Absolute (binary, Gray code): measure the exact (quantized) wheel position.
Do not need to track time to get position; with time, can also measure speed.
[Figure: 3-bit Gray code disc pattern (000 001 011 010 110 111 101 100); a 13-track Gray code absolute rotary encoder. Images from Wikipedia]
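The Gray code matters because only one bit changes between adjacent positions, so a misread during a transition is off by at most one step. Below is a minimal sketch (not from the slides) of converting a Gray-code reading into an ordinary binary position index; the function and variable names are illustrative.

    # Minimal sketch: decode an absolute encoder's Gray-code reading into a
    # plain binary position index. Names are illustrative, not from the lecture.
    def gray_to_binary(gray: int) -> int:
        binary = gray
        while gray > 0:
            gray >>= 1
            binary ^= gray
        return binary

    # Example: the 3-bit Gray reading 110 corresponds to position 4 (binary 100).
    print(gray_to_binary(0b110))  # -> 4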

Encoders, Relative
Relative: mainly direction/speed.
A single set of gaps may have difficulty determining direction.
Can add an offset set of gaps to address this: offset by 1/4 period, forward and backward become asymmetric (see the quadrature-decoding sketch below).
[Figure: relative encoder channels with an index pulse over time t. Images from IMTR, www.phidgets.com/docs21/encoder_primer]
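As a rough illustration of how the offset (quadrature) channels resolve direction, here is a small decoding sketch; the transition table and names are assumptions made for illustration, not an interface from the slides.

    # Minimal sketch of quadrature decoding for a relative encoder with two
    # channels A and B offset by a quarter period: the order in which A and B
    # change reveals the direction of rotation.
    TRANSITIONS = {  # (previous (A, B), current (A, B)) -> tick change
        ((0, 0), (1, 0)): +1, ((1, 0), (1, 1)): +1,
        ((1, 1), (0, 1)): +1, ((0, 1), (0, 0)): +1,
        ((0, 0), (0, 1)): -1, ((0, 1), (1, 1)): -1,
        ((1, 1), (1, 0)): -1, ((1, 0), (0, 0)): -1,
    }

    def update_count(count, prev_ab, curr_ab):
        # Unknown transitions (e.g., missed samples) leave the count unchanged.
        return count + TRANSITIONS.get((prev_ab, curr_ab), 0)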

Encoders, Applications
Encoders are very useful in robotics: for figuring out wheel states and for figuring out positions of robotic arms, e.g., the Universal Robots UR5 arm.
Can be highly accurate! For the UR5, the repeatability is ±0.1 mm (±0.0039 in).

Orientation Sensors
Generally, there are two types of orientation sensors:
2D: SE(2) = R^2 x S^1, a single orientation on S^1
3D: SE(3) = R^3 x SO(3), three orientations, e.g., (yaw, pitch, roll)
Two-dimensional orientation sensor: compasses! They don't always work reliably:
Globally, the magnetic field changes over time.
Locally: natural magnetic objects, man-made magnetic/electromagnetic fields.
Magnetic field direction image from geomag.org/info/declination.html
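For the 2D case, a compass reduces to recovering a single angle on S^1 from the horizontal components of the magnetic field. A minimal sketch, assuming calibrated magnetometer readings and a known local declination (both assumptions, not part of the slides):

    import math

    # Minimal sketch: 2D compass heading from horizontal magnetometer readings,
    # optionally corrected by the local magnetic declination.
    def compass_heading(mag_x, mag_y, declination_rad=0.0):
        heading = math.atan2(mag_y, mag_x) + declination_rad
        return heading % (2 * math.pi)  # wrap onto S^1, i.e., [0, 2*pi)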

Orientation Sensors, Continued
Gyroscopes (three-dimensional orientation sensors)
Operational principle: angular momentum (inertia); a spinning wheel likes to stay the way it is.
A gimbal is a device allowing free rotation along one axis; a gyro is a wheel mounted on 2+ gimbals.
Used in much equipment, e.g., aircraft. Can be non-mechanical.
https://www.youtube.com/watch?v=cquva_ipesa
[Figure: a toy gyro; a gimbal camera mount; a 3-gimbal gyro. Gyroscope image from Wikipedia]
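A rate gyro (mechanical or MEMS) reports angular velocity rather than orientation, so orientation comes from integrating its output over time, which also means small biases accumulate into drift. A minimal single-axis sketch, with illustrative names and a fixed time step (both assumptions):

    # Minimal sketch: integrate a gyro's angular rate (rad/s) about one axis
    # to track yaw. Bias errors accumulate over time (drift).
    def integrate_yaw(yaw_rad, gyro_z_rad_per_s, dt_s):
        return yaw_rad + gyro_z_rad_per_s * dt_s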

Acceleration Sensor
Accelerometers are also based on inertia (linear momentum).
Newton's first law: a mass tends to keep its current momentum; changing it requires force, F = ma = kx (Newton's second law).
Measures a single direction, e.g., x; for three dimensions, need three such mechanisms (can be done with a single mass, though).
A useful trick for figuring out where the ground is: Earth's gravity is an acceleration, actually very large and of fixed magnitude, often dominating other forces, so it provides us with a ground direction (see the sketch below).
Can be made very small! May replace the spring with a piezoelectric mechanism.
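Because gravity usually dominates the other accelerations, a 3-axis accelerometer reading can be treated as an estimate of the "down" direction, which yields roll and pitch (but not yaw). A minimal sketch with illustrative names, assuming the sensor is roughly static:

    import math

    # Minimal sketch: estimate roll and pitch from a (roughly static) 3-axis
    # accelerometer reading (ax, ay, az), treating gravity as the dominant force.
    def roll_pitch_from_accel(ax, ay, az):
        roll = math.atan2(ay, az)
        pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
        return roll, pitch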

Inertial Measurement Unit (IMU)
An IMU uses both angular and linear inertial effects. How? Gyro + accelerometer.
Produces position, velocity, acceleration.
[Figure: IMU processing pipeline]
IMUs may also have magnetic and other sensors.
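As a rough sketch of the integration step in that pipeline, here is a one-axis version of dead reckoning: acceleration integrates to velocity, and velocity to position. This is an illustrative simplification (a real IMU works in 3D, rotates accelerations into the world frame using the gyro-derived orientation, and fuses other sensors to limit drift), not the slides' exact pipeline.

    # Minimal sketch: one-axis IMU dead reckoning over a time step dt.
    # Integration errors grow over time, which is why IMUs are usually fused
    # with other sensors (GPS, magnetometer, vision).
    def imu_step(pos, vel, accel, dt):
        vel = vel + accel * dt   # acceleration -> velocity
        pos = pos + vel * dt     # velocity -> position
        return pos, vel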

Range Sensors
Time-of-flight ranging: measure distance by measuring the time traveled (by sound or light), distance = speed x travel time.
Light travels much faster: accurate, but more difficult to measure. Sound is slower, but works well in water.
Many things to consider: time measurement, speed variance (esp. sound), signal uniformity (e.g., sound cone).
Triangulation (will cover this later).
Phase shift: measure distance by measuring the phase of light (light is a wave).
Examples: sonar, lidar (often uses triangulation).
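A minimal numeric sketch of time-of-flight ranging, assuming the timer measures the round trip (out to the target and back), so the one-way distance is half of speed times time; the speed-of-sound value is a nominal assumption.

    # Minimal sketch: time-of-flight distance from a round-trip echo time.
    SPEED_OF_SOUND_M_PER_S = 343.0           # in air at roughly 20 C (assumed)
    SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

    def tof_distance(round_trip_time_s, wave_speed_m_per_s):
        return wave_speed_m_per_s * round_trip_time_s / 2.0

    # Example: a sonar echo heard after 10 ms -> roughly 1.7 m to the obstacle.
    print(tof_distance(0.010, SPEED_OF_SOUND_M_PER_S))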

Range Sensors, Continued
Sonars; laser scanners (two-dimensional and three-dimensional) work on similar principles.
Other (more complex) methods: structured light, used in the Microsoft Kinect; the iPhone X has something similar.
See IAMR 4.1.

Speed Sensors and Vision Sensors
Speed sensor: Doppler effect. Uses the wave aspect of sound/light; the faster the target, the more frequent the arrivals. Very accurate.
Vision sensors: you have seen many of these. Yes, just cameras, but they can do 1 trillion FPS now.
TED version: https://www.youtube.com/watch?v=y_9vd4hwlva
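To make the Doppler idea concrete, here is a small sketch that turns a measured frequency shift into a radial speed; the round-trip (factor of 2) form below is the usual radar/sonar case and an illustrative assumption, not a formula from the slides.

    # Minimal sketch: radial speed of a reflector from the Doppler shift of a
    # round-trip (radar/sonar style) measurement.
    def doppler_speed(emitted_hz, received_hz, wave_speed_m_per_s):
        shift_hz = received_hz - emitted_hz
        return shift_hz * wave_speed_m_per_s / (2.0 * emitted_hz)

    # Example: a 40 kHz sonar echo returning at 40.2 kHz in air (~343 m/s)
    # implies the reflector is approaching at roughly 0.86 m/s.
    print(doppler_speed(40_000.0, 40_200.0, 343.0))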

Sensor Modeling
A sensor can be modeled using a function h:
X: state space
Y: observation space
h: X -> Y, the sensor (mapping) function
Often, X and Y may be the same, but not always.
Examples:
Position sensor: h: R^3 -> R^3
Orientation sensor: h: SO(3) -> SO(3)
Contact sensor: h: SE(2) -> {0, 1}
We will not do this very formally in this class.
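As a concrete (and deliberately toy) instance of such a mapping, here is a sketch of the contact-sensor example h: SE(2) -> {0, 1}; the obstacle location and radius are illustrative assumptions, not part of the lecture.

    # Minimal sketch: a contact sensor as a mapping h: SE(2) -> {0, 1}.
    # The robot state is (x, y, theta); the circular obstacle is assumed.
    def contact_sensor(x, y, theta, obstacle_x=1.0, obstacle_y=0.0, radius=0.1):
        # Observation is 1 if the robot's position touches the obstacle disc,
        # 0 otherwise; orientation theta does not matter in this simple model.
        touching = (x - obstacle_x) ** 2 + (y - obstacle_y) ** 2 <= radius ** 2
        return 1 if touching else 0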