Exam in DD2426 Robotics and Autonomous Systems

Lecturer: Patric Jensfelt, KTH, March 16, 2010, 9-12

No aids are allowed on the exam, i.e. no notes, no books, no calculators, etc. You need a minimum of 20 points to pass the exam.

Part I

You do not need to motivate your answers in this section. A single word is often enough. Each question in this part is worth 2p.

1. Can you calculate the distance to a known object using a single image?
Yes, if you know the size of the object and have a calibrated camera.

2. Mention one good thing about a feature-based representation of the world.
It is compact and fits well with an efficient Kalman filter for localization and mapping purposes, etc.

3. You start with a perfectly known position but with completely unknown orientation. If you move along a straight line, how long a distance does it take to accumulate a position uncertainty of 1 m if the uncertainty in distance traveled is 1% and there is no added uncertainty in the orientation with distance traveled?
Roughly 0.5 m. Since the orientation is completely unknown we do not know in which direction we travel, so after every distance d the robot can be anywhere on a circle of diameter 2d around the starting point. The 1% uncertainty in distance traveled is negligible in comparison.

4. Mention a disadvantage of legged locomotion.
It is mechanically complex and power hungry.

5. What do we call the equations that give the joint angles given the Cartesian position of the end effector?
Inverse kinematics.

6. In the potential field method for path planning and obstacle avoidance, think of the robot as a positively charged particle. What charge (positive/negative) would the goal have?
Negative, since the robot should be attracted to the goal.

7. Mention something good about the Swedish wheel.
It allows you to build omnidirectional robots, i.e. robots that can move in any direction and rotate at the same time.

8. Which type of locomotion is gait associated with, legged or wheeled?
Legged; the gait is the sequence of lift and release events for the individual legs.
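The reasoning in question 3 can be sketched as a small calculation (a minimal illustration with assumed numbers, not part of the exam):

```python
# With a completely unknown heading, after driving a distance d the robot can
# be anywhere on a circle of radius d around the start, i.e. a spread of 2*d.
target_spread = 1.0          # position uncertainty we want to reach [m]
d = target_spread / 2.0      # distance at which the circle's diameter is 1 m
print(d)                     # 0.5

# The 1% odometry error is negligible in comparison: after 0.5 m it only
# contributes about 5 mm of uncertainty along the direction of travel.
odometry_error = 0.01 * d
print(odometry_error)        # 0.005
```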

9. Mention an advantage of an active sensor.
It typically gives a better signal-to-noise ratio.

10. Mention a disadvantage of the sonar sensor.
Sound travels slowly in air, which means a measurement takes a long time. The sonar also typically has a large opening angle and suffers from reflections in the environment.
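The rough numbers behind the sonar disadvantage in question 10 can be sketched as follows (the 5 m range is an assumed illustrative value):

```python
# A sonar reading must wait for the echo's round trip through the air.
speed_of_sound = 343.0   # m/s in air at roughly 20 C
speed_of_light = 3.0e8   # m/s, for comparison with a laser range finder

rng = 5.0                # assumed measurement range [m]
sonar_time = 2 * rng / speed_of_sound   # round-trip time for the sound pulse
laser_time = 2 * rng / speed_of_light   # round-trip time for a light pulse

print(sonar_time)        # roughly 0.029 s per reading
print(laser_time)        # roughly 3.3e-8 s
```

Tens of milliseconds per reading limits how fast a sonar ring can be scanned, which is part of why sonar is slow compared with optical range sensors.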

Part II

In this section the answers are expected to be a bit longer. It is important to motivate your answers!

11. For each of the sensors below, explain a) what information it can give, b) how this can be used to reduce the uncertainty in the pose for an agent moving around in an indoor environment, and c) what the main limitations of the sensor are in this situation.

rate gyro (6p), accelerometer (6p), stereo camera (6p), GPS (6p)

Rate gyro: The rate gyro gives us angular speed (2p). We can integrate the angular speed to get the change in orientation, which can be very valuable for reducing the uncertainty in orientation (2p). The main problem with a gyro is that the angle estimate drifts with time, as it is sensitive to noise and bias (2p).

Accelerometer: Gives linear acceleration along an axis (2p). We can integrate the signal twice to get the position change. We can also use accelerometers to estimate the vertical direction, which can greatly help reduce the orientation uncertainty for an agent moving in 3D, as it provides an absolute measure of this direction. We can also use the accelerometers to detect collisions and other events that are likely to disturb the position (2p). When used to estimate the position we need to integrate twice, which is an extremely noise-sensitive process, and it also assumes that we know the orientation (2p).

Stereo camera: Gives us image pairs, which allow us to calculate depth information (2p). The camera information is enough to solve for the full pose motion (i.e. we get information about both position and orientation change), assuming that we can find enough texture to establish correspondences between the two images. By building a model of the environment as we move, we can reduce the pose drift and limit it if we are moving in a limited region (2p). Some limitations are the processing time needed to extract the information, the need for texture, etc. (2p).

GPS: Will not give us anything indoors, or if it does, the position information will be very coarse (2p). If we do get a signal we can use it to at least tell us roughly where we are (2p). The biggest limitation here is that the GPS system does not work well indoors (2p).

12. Assume that you are using a 3-axis accelerometer to estimate the position change of something that can move around freely in 3D. Assume that your estimate of the angle w.r.t. the vertical (rotation around the x and/or y axes in a coordinate system with z pointing up) is wrong by 5 degrees. Also assume that the body is standing still. How will this influence the estimation process? Make rough calculations to quantify the effects. (6p)

If the orientation of the body is wrong we cannot do proper gravity compensation, which will make us think that the body is moving when it is not. An error in orientation w.r.t. the vertical means that we will think the body is experiencing an acceleration of 9.81 m/s^2 * sin(5 deg) ~ 10 * sin(5 pi/180) ~ 10 * 5/60 ~ 0.8 m/s^2. This means that after a few seconds we will think the body is moving at several m/s (0.8t m/s). The position error will grow as 0.4t^2 m as a result. We will also make a mistake in the vertical direction, but it will be smaller than in the horizontal plane since cos(x) ~ 1 for small x.

13. Provide the Denavit-Hartenberg parameters for the arm below up to the end effector (p_4). (8p) Note that the links are assumed to have no width and that the motors at the joints are so small that they can be neglected. The following figure might come in handy.

Note that there is more than one way to express this; the table below is just one of them.

i | alpha_{i-1} | a_{i-1} | theta_i          | d_i
1 | 0           | 0       | theta_1          | L_1
2 | pi/2        | 0       | theta_2 - pi/2   | 0
3 | 0           | L_2     | theta_3          | 0
4 | 0           | L_3     | 0                | 0

One of the harder parts here is to realize that the theta_2 angle needs to be modified before it can be plugged in. The x-axis of coordinate system 2, before it has been rotated to be parallel to the second link, points straight out along the horizontal, whereas theta_2 is defined from the vertical.

14. Describe the principle behind measuring distance using phase shift. (3p)

Modulate the transmitted signal and correlate the returned signal with a reference signal. The shift in phase can be translated to a distance if one knows the wavelength of the modulation.
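The phase-shift principle from question 14 can be written as a formula sketch (modulation frequency and measured phase are assumed illustrative values):

```python
import math

# The modulated signal travels to the target and back, so a measured phase
# shift dphi corresponds to d = (dphi / (2*pi)) * (wavelength / 2).
f_mod = 10e6                      # assumed modulation frequency [Hz]
c = 3.0e8                         # propagation speed (speed of light) [m/s]
wavelength = c / f_mod            # 30 m modulation wavelength

dphi = math.pi / 2                # assumed measured phase shift [rad]
d = (dphi / (2 * math.pi)) * (wavelength / 2)
print(d)                          # 3.75 m

# Note the ambiguity interval: distances repeat every wavelength/2 = 15 m,
# so the phase alone cannot distinguish 3.75 m from 18.75 m.
```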

15. How will your dead reckoning estimate differ from the true motion of a differential drive robot if the wheel base is assumed to be larger than it really is? Motivate by making a rough calculation for a case where we falsely assume that it is 10% larger than the true value. (3p)

If the assumed wheel base is too large, the angular speed will be underestimated, which means that all angular changes will be underestimated as well, since the angular speed is inversely proportional to the wheel base. If the wheel base is overestimated by 10%, a turn of 90 degrees will be estimated as 90/1.1 ~ 82 degrees.

16. The figure below shows a path planning problem. The robot (circular) has to move from the current position to the goal (GOAL).

(a) Sketch how the configuration space would look. (4p)
(b) Indicate in the sketch what the closest path is. (2p)

a) Expand all obstacles by the radius of the robot and shrink the robot to a point.
b) Draw a path that hugs the expanded obstacles around the right-hand side to the goal, since the small opening has been closed by the expansion.

17. Your team has developed a new fancy lightweight arm. It has a payload-to-weight ratio of about 1, and there is just one minor thing that stops you from closing a deal with a company that makes advanced wheelchairs. You need to be able to assure your customers that the arm will not hit things in its surroundings, like the person in and around the wheelchair. The arm will be placed on the right-hand side of the wheelchair. How would you deal with this problem? (10p)

Too open to give one single good answer. Typically you need some idea of what is around the arm to avoid collisions, which calls for non-contact sensors such as cameras, infrared cameras, sonars, IR sensors, lasers, etc., but also some kind of touch sensor as a last line of defense, like an artificial skin. We should also reduce the speed of the arm to make sure that nothing gets damaged or injured if it does hit something. It is probably also a good idea to limit the workspace somewhat so that the shoulder cannot hit the person in the wheelchair; only the end effector might need to be able to touch the person if the arm is supposed to be used for interaction.
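The wheel-base effect from question 15 can be checked with a short calculation (the true wheel base value is an assumption for illustration; only the 10% ratio matters):

```python
import math

# For a differential drive, the estimated angular change is
# dtheta = (s_r - s_l) / b, i.e. inversely proportional to the wheel base b.
true_base = 0.40                  # assumed true wheel base [m]
assumed_base = 1.1 * true_base    # wheel base overestimated by 10%

true_turn = 90.0                  # actual turn [deg]
# Wheel travel difference that produced the true 90 degree turn:
ds = math.radians(true_turn) * true_base
# Dead-reckoning estimate using the too-large wheel base:
estimated_turn = math.degrees(ds / assumed_base)
print(round(estimated_turn, 1))   # 81.8
```

The result depends only on the ratio of the wheel bases, which is why the exam answer 90/1.1 ~ 82 degrees holds regardless of the actual robot dimensions.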