CS3630 Spring 2012
Project 1: Dead Reckoning and Tracking
Group: Wayward Sons
Sameer Ansari, David Bernal, Tommy Kazenstein
2/8/2012

Overview

This project involved two major parts: commanding the Rovio to travel along a path/trajectory, and using an overhead camera to track it. For Part 1, three trials were run with the Rovio attempting to travel a square trajectory using dead reckoning. For Part 2, an overhead camera was used to track the Rovio, and the equations for the homogeneous transform from world-space to pixel coordinates are derived. For Part 3, the Rovio repeated three trials on an arbitrary trajectory (chosen as an hourglass shape), followed by a discussion of techniques more advanced than dead reckoning, with pseudo-code examples.

Coding Techniques Used

All code was run from a 64-bit Windows laptop using OpenCV2 and its Python bindings. For sending and receiving commands to the Rovio, Python and the pyrovio library were used as the base; some minor syntax and logic errors were fixed inside the pyrovio library, and a main test Python file was forked off for project use. One important change was replacing the Python sleep timer with a system timer to determine update cycles, so that the code did not vary with the host computer's sleep timer intervals (which vary with current processing load, etc.). This resulted in more consistent motions of the Rovio.

For overhead tracking, Python with OpenCV2 was again used on the 64-bit Windows platform. Instead of traditional color-based blob tracking, optical flow via hierarchical Lucas-Kanade pyramids was used, with the code forked from a Python optical flow demo. Features were selected using the GoodFeaturesToTrack routine on an area specified by mouse click or by the expected initial location; the averaged optical flow of those points (on the Rovio, a face, anything of interest) was then used to update the tracked area's location. In the images, the tracked features are shown as green lines with green square dots on the features, and the tracked area is shown as a yellow circle of constant radius. An interesting future addition would be to select features with a more specific method than GoodFeaturesToTrack, and to choose the tracked area as a function of the selected points, allowing a more form-fitting region.
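Two minimal sketches of the techniques just described, in the spirit of the project code but not reproductions of it (the send-command callable and all parameter values here are assumptions):

```python
import time

UPDATE_HZ = 2.0            # 2 Hz broadcasts tended to give the most consistent motion (see Q 1.3)
PERIOD = 1.0 / UPDATE_HZ

def broadcast_at_fixed_rate(send_command, command, duration_s):
    """Re-broadcast a drive command on a fixed system-clock schedule.

    Pacing off time.monotonic() instead of bare sleep() keeps the broadcast
    interval constant even when processing time inside the loop varies.
    """
    next_tick = time.monotonic()
    end = next_tick + duration_s
    while time.monotonic() < end:
        send_command(command)                      # e.g. a pyrovio drive call
        next_tick += PERIOD
        time.sleep(max(0.0, next_tick - time.monotonic()))
```

And the tracking side, seeding features inside a clicked region and moving the region by the averaged pyramidal Lucas-Kanade flow:

```python
import cv2
import numpy as np

LK_PARAMS = dict(winSize=(15, 15), maxLevel=2,
                 criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 0.03))

def track_region(cap, cx, cy, radius=30):
    """Yield the tracked region center (cx, cy) for each new frame."""
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    mask = np.zeros_like(prev_gray)
    cv2.circle(mask, (int(cx), int(cy)), radius, 255, -1)
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
                                  qualityLevel=0.01, minDistance=5, mask=mask)
    while pts is not None and len(pts) > 0:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts,
                                                      None, **LK_PARAMS)
        good_new, good_old = new_pts[status == 1], pts[status == 1]
        if len(good_new) == 0:
            break
        # Move the tracked circle by the average flow of surviving features.
        dx, dy = (good_new - good_old).mean(axis=0)
        cx, cy = cx + dx, cy + dy
        prev_gray, pts = gray, good_new.reshape(-1, 1, 2)
        yield cx, cy
```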

Part 1: Odometry

Part 1's main task is to make the Rovio travel the pre-specified trajectory of a square using dead reckoning (no outside knowledge). The sides of the square are 1 meter each, with 90-degree turns at each corner, so that the final position and orientation are ideally the same as the initial ones.

(Diagram: the square trajectory, with 1-meter sides and the Rovio at the start position.)

Since dead reckoning generally requires close-to-ideal conditions, errors occur under normal conditions and accumulate over time. The Rovio was required to come within 30% of the commanded translational movements, not counting rotation error, and succeeded in that task.

Q 1.1

In our tests, we used the overhead camera along with OpenCV and MATLAB to determine that 1 meter in the real world was equivalent to 210 pixels in our camera view from a distance of 3 meters. This comes out to 2.1 pixels per centimeter. Three tests were conducted for the square trajectory, shown in Figure 1.

Figure 1 - Tests 1-3 segment displacements

Figure 1 shows the measured length of each segment in pixels, which are converted to real-world distances in meters using the 210 px = 1 m conversion, as shown in Table 1, along with the percent error from the 1-meter straight-line target.

Table 1 - Tests 1-3 segment displacements and percent error (straight-line target = 1 meter)

          Segment 1     Segment 2     Segment 3     Segment 4
Test 1    1.07 m (7%)   0.97 m (3%)   1.02 m (2%)   1.05 m (5%)
Test 2    1.01 m (1%)   0.98 m (2%)   0.95 m (5%)   1.00 m (0%)
Test 3    0.99 m (1%)   0.97 m (3%)   0.96 m (4%)   1.03 m (3%)
Average   1.02 m (2%)   0.97 m (3%)   0.98 m (2%)   1.03 m (3%)
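Written out, the pixel-to-meter conversion used for Table 1 (the 225 px figure is back-computed from the reported 1.07 m, for illustration):

$$ d_{\mathrm{world}} = \frac{d_{\mathrm{pixels}}}{210\ \mathrm{px/m}}, \qquad \text{e.g.}\ \frac{225\ \mathrm{px}}{210\ \mathrm{px/m}} \approx 1.07\ \mathrm{m}\ \text{(Test 1, Segment 1)}. $$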

The trajectories were further inspected at each expected corner of the square, as shown in Figure 2.

Figure 2 - Corner displacements

The angle of each turn was also determined, as shown in Figure 3. We used simple trigonometry to calculate the angles from pictures taken during the experiment.

Figure 3 - Turn angle calculation (Tests 1, 2, 3 as left, right, bottom)

To get each angle, we drew a line in the direction the Rovio was moving and connected it with a line in the direction the Rovio was facing after the rotation, forming a right triangle. A basic sine formula then returned the angle.
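One plausible reading of that construction as a formula (the side names are generic labels, not measurements from the report): with the two drawn heading lines closing a right triangle,

$$ \theta = \arcsin\!\left(\frac{\mathrm{opposite}}{\mathrm{hypotenuse}}\right), $$

where the opposite side is the perpendicular distance between the two heading lines and the hypotenuse is measured along the post-turn heading.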

Using the information from Figures 2 and 3, Table 2 displays the Rovio's offsets from its targets. Specifically, the Distance column gives the distance the Rovio ended up from its target (each corner of the square), and the Angle column gives the number of degrees the Rovio was off in its rotation. For example, in Test 3, Segment 1, the Rovio was 4.76 centimeters from the target location at the top-right corner, with an angle offset of -8 degrees, meaning it under-rotated, turning 82 degrees instead of the target 90.

Table 2 - Segment displacements and angle offsets (displacement from expected; angles in degrees)

          Segment 1          Segment 2          Segment 3          Segment 4
          Distance   Angle   Distance   Angle   Distance   Angle   Distance   Angle
Test 1    9.38 cm    0       11.76 cm   24      43.7 cm    0       65.82 cm   26
Test 2    0 cm       8       15.51 cm   0       23.22 cm   12      30.01 cm   24
Test 3    4.76 cm    -8      18.1 cm    -2      27.35 cm   6       15.51 cm   35
Average   4.71 cm    0       15.12 cm   7       31.42 cm   6       37.11 cm   28

Q 1.2

Table 3 shows the offset distance and angle of the robot's final position compared to the expected final position and orientation. The average offset across our three trials was 37 cm, meaning that in our reported tests the robot ended up displaced by 37% of a segment length from its initial position at the bottom-left of the square. This is largely due to the error sources discussed below, combined with the four rotations in the square trajectory: a single under- or over-rotation can send the Rovio a full meter in the wrong direction.

Table 3 - Trial displacements and angle offsets

          Distance Offset   Angle Offset
Test 1    65 cm             20
Test 2    30 cm             60
Test 3    18 cm             30
Average   37 cm             36

Though the distance and angle offsets vary in placement from test to test, the displacement error per segment was always within 30%, with angle offsets accounting for the majority of the displacement error accumulating over time.

Q 1.3

In running dead-reckoning tests, one assumes that the condition of the environment and of the Rovio is constant, and that the commands passed to the robot, when executed, produce the same series of actions in every trial. Unfortunately, that is not the case.

Surface friction of the ground

Though the environment is generally static, the surface friction's effect on the wheels is not taken into account: a Rovio that turns 90 degrees on carpet may turn more or less than that on a hardwood floor, and travel a different distance in a straight line. For tuning, the test surface was treated as constant and the motions tuned to it.

Rovio wheel motors

Another assumption was that for a specified speed, a wheel would rotate with a certain velocity, or at least force. This was found to be false: wheel velocity depends directly on the Rovio's battery state, as well as on other factors that generally cannot be accounted for in dead reckoning. One hypothesis is that the Rovio has limited power to distribute among its three wheels, so resistance encountered by one wheel draws more power to that wheel, reducing the power (and therefore the velocity) available to the others in a non-deterministic fashion.

Rovio wheel rolling friction

Another assumption was constant rolling friction on the Rovio's wheels, which was also found to be false. The rolling friction depends on the orientation of each wheel and on whether a roller, or the plastic in between rollers, is in contact with the surface. This turns the friction for both translation and rotation into a combination of static and rolling friction, which is generally impossible to account for without wheel encoders and consistent wheel rotation. Tests were conducted with the wheels placed in a fixed initial configuration, and these performed consistently better: when the Rovio executed the same number of wheel revolutions per trajectory step, its motions tended to be more repeatable. Variation still occurred, however, from battery level and time-varying wheel motor strength.

Network latency and update interval

The final assumption is that commands are delivered consistently and in exactly the same way. This is incorrect once wireless network latency and the Rovio's update cycle are taken into account. In testing, even though commands were broadcast consistently and on an accurate schedule, the Rovio would execute each command for a varying amount of time until its next update cycle triggered and it received the next broadcast message. These discrepancies cause the Rovio to execute the motor-wheel commands for varying time intervals between trials. This requires more testing, but broadcast frequencies between 10 Hz and 1 Hz were tried, with 2 Hz tending toward the most consistent actions.

Part 2: Tracking

A homogeneous transform from world coordinates to pixel coordinates is a useful first step: it lets one describe events in world-space coordinates and know how they would appear from the camera's point of view. For example, a real-world trajectory can be converted to the 2D path traversed in the camera image. This key ability allows camera information to be used to make decisions in the real world. In this setup, the camera is mounted 3 meters above the field, facing straight down.

The world coordinate frame is defined as follows: the origin is the point on the ground at the center of the camera's field of view, with x to the right along the surface, y up along the surface, and z up off the ground toward the camera.

Q 2.1

The transform required is a 90° rotation about the x axis, to account for the camera facing down (90° from world space). Here X, Y, Z denote world-space coordinates and X', Y', Z' denote camera-space coordinates. The transform also requires a 3 m translation along the world frame's z axis. Since the rotation happens first, the z axis becomes the y axis and the y becomes the z, so the translation is applied along the (camera) y axis. The total transformation:

$$ \begin{bmatrix} X' \\ Y' \\ Z' \\ 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 0 & -1 & 3 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} $$

This transform adapts to the camera's downward-facing orientation by rotating. The camera is also at a different z location, so a translation is required to account for that change. Together these map the world frame onto the camera's position, creating the camera frame.

Q 2.2

Once the camera-frame coordinates are found, another set of transformations is needed to adapt to the computer's representation of the frame. A rotation transforms the camera frame so that it matches an image's x and y axes: in an image, the x axis runs left to right and the y axis runs from top to bottom. The following transformation deals with this rotation:

$$ R = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 0 & -1 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} $$

Next, a scaling is needed to map the camera frame's coordinates onto the computer's pixel representation. Through testing, the scale factor was found to be 210 pixels per meter at a camera height of 286 cm; thus a 1 m distance in world-space coordinates corresponds to 210 pixels in pixel coordinates. Using this scale factor, the following transform performs the required meters-to-pixels scaling:

$$ S = \begin{bmatrix} 210 & 0 & 0 & 0 \\ 0 & 210 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} $$
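As a worked check of the Q 2.1 transform (using the matrix as reconstructed above), the square's bottom-left corner at world $(-0.5, -0.5, 0)$ maps to camera coordinates

$$ X' = X = -0.5, \qquad Y' = -Z + 3 = 3, \qquad Z' = Y = -0.5, $$

i.e. the point lies 3 m from the camera along the camera frame's y axis.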

Following the scaling, another transformation is needed to correctly match the Cartesian coordinates to the computer's pixel coordinates. For example, the origin of the camera frame is at coordinates X = 0, Y = 0; on the image, however, the origin (at the center of the image) is located at X = 320, Y = 240. The following transformation deals with this translation:

$$ T = \begin{bmatrix} 1 & 0 & 0 & 320 \\ 0 & 1 & 0 & 240 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} $$

The total transformation:

$$ T^{P}_{C} = T\,S\,R = \begin{bmatrix} 210 & 0 & 0 & 320 \\ 0 & 0 & -210 & 240 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} $$

Note that the z row of $T^{P}_{C}$ is zeroed out, as the z axis is not used in pixel coordinates. This transform adapts the computer's definition of an image to the world's definition: it rotates to fit the pixel axes, scales to the image's pixel size, and translates to shift the frame's origin to the pixel representation of the image center. It effectively transforms the camera frame onto the image (pixel) frame.

Q 2.3

The homogeneous transform that maps world coordinates to image (pixel) coordinates is the product of the world-to-camera-frame transform and the camera-frame-to-pixel transform:

$$ T^{P}_{W} = T^{P}_{C}\,T^{C}_{W} = \begin{bmatrix} 210 & 0 & 0 & 320 \\ 0 & -210 & 0 & 240 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} $$

This is the complete transform from world-space coordinates to pixel coordinates. It is interesting to note that the y and z axes end up simply inverted, a result of the two 90° rotations (the camera facing down, and the inverted y axis of pixel coordinates); a scale and a translation then finalize the conversion to pixel coordinates. It is important to note that no perspective conversion is done: this essentially assumes the camera's view is orthographic, an approximation with only minor effect in the specific scenario of this project.
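A small numeric check of $T^{P}_{W}$ against the corner coordinates reported in Table 4 below (a sketch, not the project's code):

```python
import numpy as np

# World-to-pixel homogeneous transform from Q 2.3.
T_pw = np.array([[210,    0, 0, 320],
                 [  0, -210, 0, 240],
                 [  0,    0, 0,   0],
                 [  0,    0, 0,   1]], dtype=float)

def world_to_pixel(x, y, z=0.0):
    u, v, _, w = T_pw @ np.array([x, y, z, 1.0])
    return u / w, v / w        # w stays 1 for this affine transform

corners = {"bottom-left": (-0.5, -0.5), "top-left":     (-0.5, 0.5),
           "top-right":   ( 0.5,  0.5), "bottom-right": ( 0.5, -0.5)}
for name, (x, y) in corners.items():
    print(name, world_to_pixel(x, y))   # bottom-left -> (215.0, 345.0), etc.
```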

Q 2.4

Using the homogeneous transform from Q 2.3, the world-space coordinates of the square trajectory are transformed into pixel coordinates, as shown in Table 4.

Table 4 - Transformed pixel coordinates of the square trajectory

Point   Position       World space (x, y, z) in meters   Pixel space (x, y) in pixels
1       Bottom-left    (-0.5, -0.5, 0)                   (215, 345)
2       Top-left       (-0.5, 0.5, 0)                    (215, 135)
3       Top-right      (0.5, 0.5, 0)                     (425, 135)
4       Bottom-right   (0.5, -0.5, 0)                    (425, 345)

Table 4 shows the homogeneous transformation applied to the world-space coordinates of the square trajectory, producing pixel coordinates. These are shown overlaid in Figure 4, both alone and with the robot traveling along the trajectory path.

Figure 4 - Overlaid square based on the homogeneous transform (left); Rovio traversing the path (right)

As can be seen in Figure 4, the transform produces pixel coordinates close to the actual world-space positions (black tape marks), with a slight offset in the y axis, possibly because the camera was not placed at the exact center of the field.

Q 2.5

The robot began the trajectory at pixel position [215, 345]. It traveled the first segment and stopped at pixel position [214, 130]; the expected location after the first segment is [215, 135]. Using the Euclidean distance formula, the error was found to be around 5 pixels, a minimal and relatively negligible amount. The robot's actual final position was . This compares to the expected final position, which is the same as the robot's starting position, . Assuming the robot's initial position is its expected final position, the total error distance for the whole trajectory was found to be 91.78 pixels. The total error distance in the world frame, measured by hand, was , again taking the initial position as the expected final position.
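Written out, that first-segment error is

$$ \sqrt{(215-214)^2 + (135-130)^2} = \sqrt{26} \approx 5.1\ \text{px}. $$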

Q 2.6

The pixel error for the whole trajectory was 98.6 pixels. This compares to the physical measurement of the robot's error in the world frame, which was 0.48 m. A meter is represented by 210 pixels, so converting the errors into common units gives

$$ \frac{98.6\ \text{px}}{210\ \text{px/m}} \approx 0.47\ \text{m} \quad \text{versus} \quad 0.48\ \text{m measured by hand}. $$

The two error figures are nearly identical, confirming that an error in the pixel frame matches up with the corresponding error in the world frame.

Part 3: Analysis

Q 3.1

For Part 3, a new trajectory was chosen and three trials were run. The new trajectory was arbitrarily chosen as an hourglass, with the path shown in Figure 5 and the Rovio shown in its initial (and ideal final) position and orientation.

Figure 5 - Arbitrary trajectory (hourglass, with 1-meter sides)

Figure 6 - Arbitrary trajectory, trials 1 to 3 (left to right): full path images via the overhead camera and optical flow tracking

In Figure 6, the three trials of the arbitrary trajectory are shown: green lines mark the tracked features of the Rovio, a yellow circle marks the tracked Rovio itself, and the expected trajectory is shown as a magenta overlay. The final displacements and orientation offsets are shown in Table 5.

Table 5 - Arbitrary trajectory trials, final position errors

          Displacement   Orientation offset
Trial 1   5 cm           30
Trial 2   28 cm          45
Trial 3   50 cm          55

In Table 5, the final displacement reached a maximum of 50 cm, which was surprisingly good; however, more than three trials would be needed to build an accurate picture of the testing errors. For these tests, the Rovio was kept fully charged, the initial wheel positions were held constant, and the code was kept the same. The errors accumulate at each turn, with the distance traveled being more consistent than the angles turned.

Using the transform from Part 2, the pixel coordinates of the trajectory points were determined as shown in Table 6.

Table 6 - Arbitrary trajectory, homogeneous transform from world space to pixel coordinates

Point   Position       World space (x, y, z) in meters   Pixel space (x, y) in pixels
1       Bottom-left    (-0.5, -0.5, 0)                   (215, 345)
2       Bottom-right   (0.5, -0.5, 0)                    (425, 345)
3       Top-left       (-0.5, 0.5, 0)                    (215, 135)
4       Top-right      (0.5, 0.5, 0)                     (425, 135)

Table 6 shows the arbitrary path points transformed to pixel coordinates. These points are shown as an overlay in Figure 7, which contains images of the Rovio traversing the path and after it has finished traveling.

Figure 7 - Rovio traveling along the arbitrary path, with an overlay built using the homogeneous transform
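A minimal sketch of how such an overlay can be drawn with OpenCV (the image filename is hypothetical, and the 1-2-3-4-1 visiting order for the hourglass is inferred from Table 6):

```python
import cv2
import numpy as np

# Pixel-space trajectory points from Table 6, visited in hourglass order.
corners = np.array([[215, 345], [425, 345], [215, 135], [425, 135]], np.int32)
path = corners[[0, 1, 2, 3, 0]].reshape(-1, 1, 2)

frame = cv2.imread("overhead_frame.png")        # hypothetical captured frame
cv2.polylines(frame, [path], isClosed=False, color=(255, 0, 255),
              thickness=2)                      # magenta overlay, BGR order
cv2.imwrite("overlay.png", frame)
```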

As can be seen in Figure 7, the transform from world coordinates to pixel coordinates provides an accurate representation in this scenario. The final displacement and orientation offset of the Rovio in this test were and .

Q 3.2a

Here is a possible pseudo-code implementation (Python-style, with hypothetical helper functions) for using the overhead camera to help the robot navigate a path/trajectory:

    # Hypothetical helpers: pose_from_camera(), turn_towards(), drive_forward().
    for waypoint in path:
        while dist(pose_from_camera().position, waypoint) > TOLERANCE:
            pose = pose_from_camera()      # robot position/heading via overhead camera
            turn_towards(pose, waypoint)
            step = min(dist(pose.position, waypoint), MAX_STEP_PER_CYCLE)
            drive_forward(step)

Here the step distance is the minimum of the distance to the waypoint and the distance traveled in one update cycle at maximum speed, where an update cycle is one pass through the while loop from the Rovio's point of view. This could be made slower but more accurate by adding a camera orientation check before each drive, but the principle is the same.

Q 3.2b

Using the onboard camera on the Rovio, various computer vision techniques can determine the change in position and orientation of the Rovio to a higher degree of accuracy than dead reckoning, allowing more accurate trajectory traversal. The principle is the same: the Rovio moves, checks with CV techniques what actually happened, and corrects. One way is to use a camera calibration with a known target (or set of targets), and use it to determine the robot's orientation and position. Another is to use optical flow, which does not require a preset target, and determine how the image changes frame by frame to estimate the motion and rotation of the camera (and hence the Rovio).

Contributions

Work was split up as follows:
Q1.1, Q1.2 - Tommy
Q2.1-2.3, Q2.5-2.6 - David
Overview/Coding Techniques, Part 1, Q1.3, Q2.4, Q3.1-3.2 - Sam

The team worked together to come up with the homogeneous transforms, etc. Sam did the general coding, David worked on the Rovio alternate trajectory, and Tommy was primary for running the tests for data.