EECS 4330/7330 Introduction to Mechatronics and Robotic Vision (Fall), Lab 1: Camera Calibration


Lab 1: Camera Calibration

Objective

In this experiment, students will use stereo cameras, an image acquisition program, and camera calibration algorithms to achieve the following goals:

1. Develop a procedure for camera calibration.
2. Measure the error in calibration using two approaches:
   i. a simple least-squares method, and
   ii. a non-linear method using a well-designed calibration toolbox.

Reference Materials

- Lecture notes, i.e., camera calibration using the least-squares approach
- Zhang's paper: "A Flexible New Technique for Camera Calibration"
- Caltech Camera Calibration Toolbox

Pre-lab

Please read the following materials:

- Lecture notes about reference (coordinate) frames, the Homogeneous Transformation Matrix (HTM), the camera model, linear-method camera calibration, and non-linear-method camera calibration (especially Zhang's method)
- Zhang's paper: "A Flexible New Technique for Camera Calibration"
- The "Getting Started" section of the Camera Calibration Toolbox for Matlab: http://www.vision.caltech.edu/bouguetj/calib_doc/#start

Briefly describe the functions required from the calibration toolbox.

Background

Camera calibration is the procedure of determining a mathematical model for the camera (e.g., the pinhole model) that formalizes the relation between points in the outside world and points in the image. The intrinsic and extrinsic parameters relate the camera's coordinate system to the idealized image coordinate system and to the world coordinate system, respectively. In this lab, we use two cameras (left and right) next to each Puma robot.

The pinhole camera model can be expressed as follows:

$$ z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K \begin{bmatrix} R & T \end{bmatrix} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} $$

where K is the 3x3 matrix of intrinsic parameters, R is the rotation matrix, T is the translation vector, u and v are pixel coordinates, z_c is the scaling factor, and x_w, y_w, and z_w are the coordinates of the point with respect to the world coordinate frame.
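To make the model concrete, here is a minimal projection sketch in Matlab. The intrinsic and extrinsic values are made-up illustrative numbers, not the parameters of the lab cameras:

    % Project one world point through an assumed pinhole model.
    K = [800   0  320;   % fx, skew, u0 (illustrative values, pixels)
           0 800  240;   % fy, v0
           0   0    1];
    R = eye(3);          % assumed rotation: camera axes aligned with world
    T = [0; 0; 1000];    % assumed translation: origin 1000 mm ahead

    Pw = [50; 25; 0];               % a world point (mm)
    p  = K * [R T] * [Pw; 1];       % homogeneous pixel coordinates
    uv = p(1:2) / p(3)              % divide by z_c: uv = [360; 260]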

Lab Procedure

This lab consists of the following three parts.

Part 1: Least-squares method for camera calibration

1. Take five snapshots of the calibration pattern using both the left and right cameras. Each snapshot should be taken with the pattern at a known XY position and at a different height, parallel to the table. You can use the green foam blocks in the lab to raise the pattern. The pattern must be moved only vertically for each of the three heights; that is, there is no translation in the XY direction when moving the pattern up.

   To capture left and right images, you can use puma#_save_left and puma#_save_right, where # is the robot number. To see streaming video from the cameras, you can use

       mvid_stream -i port -d IP -2

   where port and IP are:

       robot   port   IP
       puma1   1500   10.14.1.100
       puma2   1500   10.14.1.101
       puma3   1600   10.14.1.100

2. Use the corner extraction tool in the Camera Calibration Toolbox to determine the pixel coordinates of each corner in the pattern, and create a file with the XYZ-UV information for the calibration of each camera. You can find the toolbox under /usr/share/toolbox_calib; add this directory to the Matlab search path.

3. Write a simple Matlab program to find the calibration matrix (intrinsic and extrinsic parameters) using the least-squares approach. The suggestion is to use the method in your lecture notes: rewrite the equations to construct one single matrix A, then minimize ||Aq|| under the constraint ||q|| = 1, where q contains the calibration parameters in a column vector. A sketch of this step appears after this list.

   NOTE: In this case the calibration toolbox won't be able to determine the Z coordinate of the pattern. Therefore, make sure you record the different heights at which you placed the pattern and correct the Z coordinate in your code accordingly.
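As a starting point for step 3, here is a minimal least-squares (DLT-style) sketch. It assumes the correspondences from step 2 are already loaded as XYZ (n-by-3 world coordinates, with Z corrected for height) and UV (n-by-2 pixel coordinates); these variable names are illustrative, not prescribed by the lab:

    % Build A so that each correspondence contributes two rows, then
    % minimize ||A*q|| subject to ||q|| = 1 via the SVD.
    n = size(XYZ, 1);                 % need n >= 6 for 12 unknowns
    A = zeros(2*n, 12);
    for k = 1:n
        X = [XYZ(k,:) 1];             % homogeneous world point, 1-by-4
        u = UV(k,1);  v = UV(k,2);
        A(2*k-1,:) = [X, zeros(1,4), -u*X];   % row from the u equation
        A(2*k,  :) = [zeros(1,4), X, -v*X];   % row from the v equation
    end
    [~, ~, V] = svd(A);
    q = V(:, end);                    % right singular vector of smallest sigma
    M = reshape(q, 4, 3)';            % 3-by-4 matrix, M = K*[R T] up to scale

The 3-by-4 matrix M can then be decomposed into the intrinsic and extrinsic parameters as described in the lecture notes.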

Part 2: Using a non-linear method for camera calibration

In this part you will use the Calibration Toolbox.

1. Position the pattern this time at random poses with respect to the camera and take five snapshots with each camera. You may pick any random poses, but make sure each one is reasonably different from the others (in rotation and translation) and that at every pose the pattern can be clearly seen by both cameras. Also make sure that one such pose of the pattern is well known with respect to the robot/world coordinate frame; this will be the reference frame used to establish the camera-to-robot/world transformation. (Hint: use the same initial position used for Part 1.)

2. Run the Matlab programs provided in the Calibration Toolbox to obtain the calibration matrix (intrinsic and extrinsic parameters) and the distortion parameters; the program may use Zhang's method or any other approach that models distortion. You should also save the distortion parameters and the image corner points, as sketched below.
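The sketch below shows one way to pull out what Part 3 needs after the toolbox has saved its results to Calib_Results.mat. The variable names (fc, cc, kc, alpha_c, KK, omc_1, Tc_1) follow the toolbox's documented conventions, but treat this as an assumption and verify them against your own saved file:

    % Load the toolbox output and keep the pieces needed for Part 3.
    load('Calib_Results.mat');    % written by the toolbox's save step

    % fc: 2x1 focal lengths (pixels)            cc: 2x1 principal point
    % kc: 5x1 distortion [k1 k2 p1 p2 k3]       alpha_c: skew coefficient
    % KK: 3x3 intrinsic matrix assembled from the above
    R1 = rodrigues(omc_1);        % omc_1 is the rotation vector of image 1;
                                  % rodrigues() is a toolbox function
    T1 = Tc_1;                    % translation vector of image 1

    save('my_calib_subset.mat', 'kc', 'KK', 'R1', 'T1');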

Part 3: Error Measurement

At this point you have collected training images and learned the camera calibration parameters using two different methods. The next step is to test and compare the results of the calibration.

1. Take another snapshot of the pattern, with a known XYZ position, for testing purposes. (Hint: record the XYZ position with respect to the world coordinate frame.) Make sure that this image of the pattern is different from all the images used for the calibration.

2. Extract the corners of the test image using the Camera Calibration Toolbox and save the XYZ-UV coordinates of the pattern. You will use these coordinates as ground truth to compare against the results obtained from the calibration parameters learned in Parts 1 and 2.

3. Calculate the error. That is, compare:
   a) the observed UV coordinates in the image vs. the estimated UV coordinates from the calibration matrix derived in Part 1, and
   b) the observed UV coordinates in the image vs. the estimated UV coordinates from the calibration matrix derived in Part 2.

4. In step 3 above, you are comparing normalized UV coordinates to distorted coordinates. Therefore, for this step you should take distortion into account; i.e., repeat step 3 for both undistorted and distorted UV coordinates. A sketch follows the hints below.

   Hints:
   - Use Weng's or Zhang's error measurement technique for converting from normalized UV coordinates to undistorted normalized UV coordinates and vice versa.
   - Use the distortion parameters saved in Part 2.
   - The flowcharts below give a visual explanation of what you need to do.
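One possible shape for steps 3-4, reusing M from the Part 1 sketch and KK, R1, T1, kc from the Part 2 sketch; the distortion step assumes the toolbox's radial-plus-tangential model with kc = [k1 k2 p1 p2 k3]. This is a starting point under those assumptions, not the required implementation:

    % XYZ: n-by-3 test-pattern world points; UV: n-by-2 observed corners.
    % (Uses implicit expansion; MATLAB R2016b or newer.)
    P  = M * [XYZ'; ones(1, size(XYZ,1))];
    uv = (P(1:2,:) ./ P(3,:))';                    % estimated pixels, no distortion
    err_undist = sqrt(mean(sum((uv - UV).^2, 2)))  % RMS pixel error

    % With distortion: distort the normalized coordinates, then map to pixels.
    Pc = R1 * XYZ' + T1;                           % camera-frame points
    xn = Pc(1:2,:) ./ Pc(3,:);                     % normalized coordinates
    r2 = sum(xn.^2, 1);
    radial = 1 + kc(1)*r2 + kc(2)*r2.^2 + kc(5)*r2.^3;
    tang = [2*kc(3)*xn(1,:).*xn(2,:) + kc(4)*(r2 + 2*xn(1,:).^2);
            kc(3)*(r2 + 2*xn(2,:).^2) + 2*kc(4)*xn(1,:).*xn(2,:)];
    xd  = xn .* radial + tang;                     % distorted normalized coords
    pd  = KK * [xd; ones(1, size(xd,2))];
    uvd = (pd(1:2,:) ./ pd(3,:))';                 % estimated distorted pixels
    err_dist = sqrt(mean(sum((uvd - UV).^2, 2)))   % RMS pixel error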

Post-Lab Questions

Comment on your results by answering the questions below:

- What did you expect the error to be, and why?
- What error did you get, and why?

Include all your results, and describe the steps of your implementation.