A Stereo Machine Vision System for measuring three-dimensional crack-tip displacements under elastic-plastic deformation


A Stereo Machine Vision System for measuring three-dimensional crack-tip displacements under elastic-plastic deformation. Arash Karpour. Supervisor: Associate Professor K. Zarrabi. Co-Supervisor: Professor A. Crosky.

1. Aims The aims of this project are to: (1) design and manufacture a Stereo Machine Vision System (SMVS) that measures crack-tip displacement in the x, y and z directions simultaneously; and (2) keep the manufacturing cost of the SMVS low while maintaining accuracy within acceptable margins.

2. Background To study the complex behaviour of mixed modes I, II and III, a pair of grips has been specially designed. Using a universal Instron machine and these grips, a cracked specimen undergoes simultaneous in-plane and out-of-plane (mixed-mode I, II and III) displacement. A mechanical displacement sensor such as a clip gauge is used to measure the y-displacement, and two Linear Variable Displacement Transducers (LVDTs) measure the displacements in the x and z directions.

2. Background Because of the compound geometry of the grips, not all of the mechanical sensors can physically fit around the crack tip to measure the displacement. This is where the importance of a non-contact measurement device such as the SMVS becomes evident.

Mode I (Tension, Opening) Mode II (In-plane shear, Sliding) Mode III (Out-of-plane shear, Tearing)

3. System Set Up - Hardware [Figure: test set-up schematic showing the INSTRON, load cell, data acquisition unit, upper grip, specimen, lower grip, PC and SMVS.]

3. System Set Up - Hardware [Figure: universal Instron machine with upper clamp, upper load shaft, upper grip, load nut, data acquisition unit, specimen, lower grip and lower load shaft.]

3. System Set Up - Hardware [Figure: coordinate system (x, y, z) with camera angles θ = 45° and 15°.]


3. System Set Up - Hardware [Figure: camera placement showing distances L1 and L2 along the Z and X axes.]

3. System Set Up - Hardware [Figure: camera mount with micro positioners and macro positioners on the X, Y and Z axes, and locking cap screws.]

3. System Set Up - Hardware The machine vision system consists of: 1. FireWire cameras: two DMK 21AF04 FireWire cameras.

3. System Set Up - Hardware 2. Image-processing card: FireBoard-Blue 1394a PCI adapter.

3. System Set Up - Hardware 3. Micropositioner: Melles Griot 5-axis manual positioner. 4. Macropositioner: lateral and transversal linear bearings. 5. Navitar zoom lenses: the Zoom 7000 offers a 6:1 zoom ratio.

3. System Set Up - Hardware [Figure: lighting arrangement with white lights illuminating the background, a white background curtain, white lights illuminating the test specimen, a cluster of high-intensity red LEDs, and studio lights.]

4. Calibration Camera calibration is the process of identifying the camera's optical parameters and its position and orientation with respect to a pattern frame. Intrinsic parameters: focal length, pixel spacing and lens distortion. Extrinsic parameters: the rotation of the cameras, the distance between the cameras and the angle of the cameras.

4. Calibration The calibration proceeds in four steps:
1. Calculating the homographies for all points in a series of images. A homography is a matrix of perspective transformation between the checkerboard (i.e., our calibration pattern plane in the SMVS) and the camera view plane.
2. Initializing the intrinsic parameters (including distortion) to zero.
3. Finding the extrinsic parameters for each image of the pattern.
4. Optimizing all parameters by minimizing the reprojection error.
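Step 1, the homography between the checkerboard plane and each camera view, can be estimated with the standard direct linear transform (DLT). The following NumPy sketch is illustrative only (it is not the SMVS code) and checks itself against a synthetic homography:

```python
import numpy as np

def estimate_homography(pts_plane, pts_image):
    """Estimate the 3x3 homography H (up to scale) mapping planar
    pattern points (X, Y) to image points (x, y) via the DLT."""
    rows = []
    for (X, Y), (x, y) in zip(pts_plane, pts_image):
        rows.append([X, Y, 1, 0, 0, 0, -x * X, -x * Y, -x])
        rows.append([0, 0, 0, X, Y, 1, -y * X, -y * Y, -y])
    # H is the right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalise so that H[2, 2] = 1

# Synthetic check: map checkerboard corners through a known homography.
H_true = np.array([[1.2, 0.1, 5.0],
                   [0.05, 0.9, -3.0],
                   [1e-4, 2e-4, 1.0]])
plane = [(0, 0), (1, 0), (1, 1), (0, 1), (2, 1)]
image = []
for X, Y in plane:
    p = H_true @ np.array([X, Y, 1.0])
    image.append((p[0] / p[2], p[1] / p[2]))
H_est = estimate_homography(plane, image)
```

With noise-free correspondences the DLT recovers the true homography exactly (up to scale); real calibration uses many points per image and a nonlinear refinement step.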

4. Calibration [Figure: calibration rig with checkerboard pattern, rotation and goniometer (tilt) units, a second rotation and tilt unit, and a translation (x, y, z) unit.]

4. Calibration Assuming a pinhole camera, a 3-D point on the specimen surface, defined as $\{M\} = [X, Y, Z]^T$, corresponds to a point on the projected image, defined as $\{m\} = [x, y]^T$. During the estimation process, 16 frames from randomly selected angles are captured. The homography relation between $\{m\}$ and $\{M\}$ is $s\{\tilde{m}\} = [A][R \mid t]\{\tilde{M}\}$, where the tilde denotes homogeneous coordinates and $s$ is the projective scale.

4. Calibration [Figure: object frame (X, Y, Z), camera frame and image frame (x, y), related by the extrinsic transform [R t] and the intrinsic matrix [A].]

4. Calibration
$$[R] = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix}, \quad \{t\} = \begin{bmatrix} t_1 \\ t_2 \\ t_3 \end{bmatrix}, \quad [A] = \begin{bmatrix} f_L & \gamma & c_x \\ 0 & f_R & c_y \\ 0 & 0 & 1 \end{bmatrix}$$
where $f_L$ and $f_R$ are the focal lengths in pixels along the two image axes, $\gamma$ is the skew, and $(c_x, c_y)$ is the principal point.
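The pinhole projection through [A], [R] and {t} can be exercised numerically. This NumPy sketch uses hypothetical intrinsic and extrinsic values purely for illustration:

```python
import numpy as np

# Hypothetical intrinsics (focal lengths, skew, principal point), in pixels.
f_L, f_R, gamma, c_x, c_y = 800.0, 790.0, 0.0, 320.0, 240.0
A = np.array([[f_L, gamma, c_x],
              [0.0,   f_R, c_y],
              [0.0,   0.0, 1.0]])

# Hypothetical extrinsics: a 10-degree rotation about y plus a translation.
theta = np.deg2rad(10.0)
R = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
              [ 0.0,           1.0, 0.0          ],
              [-np.sin(theta), 0.0, np.cos(theta)]])
t = np.array([0.05, 0.0, 1.0])

def project(M):
    """Project a 3-D point {M} to pixel coordinates via s*m~ = A (R M + t)."""
    p = A @ (R @ M + t)
    return p[:2] / p[2]  # divide out the projective scale s

m = project(np.array([0.1, 0.2, 0.5]))
```

A useful sanity check is that any point lying on the camera's optical axis projects exactly to the principal point $(c_x, c_y)$.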

4. Calibration Looking at the homography between the model plane {M} and its image, we can assume that the model plane lies on Z = 0 of the world coordinate system. Setting Z = 0 places the world coordinate system for that particular frame on the model plane; the orientation of the coordinate system changes with the angle at which the model plane is set for that shot.

4. Calibration If the $i$-th column of the rotation matrix $[R]$ is denoted $r_i$, then
$$\begin{bmatrix} x \\ y \\ 1 \end{bmatrix} \sim [A]\,[r_1 \; r_2 \; r_3 \; t] \begin{bmatrix} X \\ Y \\ 0 \\ 1 \end{bmatrix} = [A]\,[r_1 \; r_2 \; t] \begin{bmatrix} X \\ Y \\ 1 \end{bmatrix}$$
Therefore a model plane $\{M\}$ and its image $\{m\}$ are related by the homography $[H]$:
$$\{\tilde{m}\} = [H]\{\tilde{M}\}, \quad [H] = [h_1 \; h_2 \; h_3] = \lambda [A]\,[r_1 \; r_2 \; t]$$

4. Calibration where $\lambda$ is an arbitrary scalar. Knowing that $r_1$ and $r_2$ are orthonormal (orthogonal vectors with magnitude 1), we have
$$h_1^T [A]^{-T}[A]^{-1} h_2 = 0, \qquad h_1^T [A]^{-T}[A]^{-1} h_1 = h_2^T [A]^{-T}[A]^{-1} h_2$$
$[B] = [A]^{-T}[A]^{-1}$ is called the image of the absolute conic; it is symmetric:
$$[B] = \begin{bmatrix} B_{11} & B_{12} & B_{13} \\ B_{12} & B_{22} & B_{23} \\ B_{13} & B_{23} & B_{33} \end{bmatrix}$$

4. Calibration Written out in terms of the intrinsic parameters, $[B]$ is:
$$[B] = \begin{bmatrix}
\dfrac{1}{f_L^2} & -\dfrac{\gamma}{f_L^2 f_R} & \dfrac{c_y\gamma - c_x f_R}{f_L^2 f_R} \\[1ex]
-\dfrac{\gamma}{f_L^2 f_R} & \dfrac{\gamma^2}{f_L^2 f_R^2} + \dfrac{1}{f_R^2} & -\dfrac{\gamma(c_y\gamma - c_x f_R)}{f_L^2 f_R^2} - \dfrac{c_y}{f_R^2} \\[1ex]
\dfrac{c_y\gamma - c_x f_R}{f_L^2 f_R} & -\dfrac{\gamma(c_y\gamma - c_x f_R)}{f_L^2 f_R^2} - \dfrac{c_y}{f_R^2} & \dfrac{(c_y\gamma - c_x f_R)^2}{f_L^2 f_R^2} + \dfrac{c_y^2}{f_R^2} + 1
\end{bmatrix}$$

4. Calibration Once $[B]$ is estimated (up to the scale $\lambda$), the intrinsic parameters follow in closed form:
$$c_y = \frac{B_{12}B_{13} - B_{11}B_{23}}{B_{11}B_{22} - B_{12}^2}, \qquad \lambda = B_{33} - \frac{B_{13}^2 + c_y(B_{12}B_{13} - B_{11}B_{23})}{B_{11}}$$
$$f_L = \sqrt{\lambda / B_{11}}, \qquad f_R = \sqrt{\lambda B_{11} / (B_{11}B_{22} - B_{12}^2)}$$
$$\gamma = -\frac{B_{12} f_L^2 f_R}{\lambda}, \qquad c_x = \frac{\gamma c_y}{f_R} - \frac{B_{13} f_L^2}{\lambda}$$
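The closed-form recovery of the intrinsics from [B] can be verified with a round-trip test: build [B] from a known [A] (with an arbitrary scale standing in for λ) and recover the parameters. This NumPy sketch uses Zhang's standard formulas with hypothetical values:

```python
import numpy as np

def intrinsics_from_B(B):
    """Closed-form recovery of the intrinsics from the scaled
    absolute-conic matrix B = lambda * A^{-T} A^{-1}."""
    B11, B12, B13 = B[0, 0], B[0, 1], B[0, 2]
    B22, B23, B33 = B[1, 1], B[1, 2], B[2, 2]
    c_y = (B12 * B13 - B11 * B23) / (B11 * B22 - B12**2)
    lam = B33 - (B13**2 + c_y * (B12 * B13 - B11 * B23)) / B11
    f_L = np.sqrt(lam / B11)
    f_R = np.sqrt(lam * B11 / (B11 * B22 - B12**2))
    gamma = -B12 * f_L**2 * f_R / lam
    c_x = gamma * c_y / f_R - B13 * f_L**2 / lam
    return f_L, f_R, gamma, c_x, c_y

# Round-trip check with hypothetical intrinsics.
A = np.array([[820.0,   1.5, 315.0],
              [  0.0, 805.0, 242.0],
              [  0.0,   0.0,   1.0]])
Ainv = np.linalg.inv(A)
B = 2.7 * Ainv.T @ Ainv  # arbitrary scale 2.7 stands in for lambda
f_L, f_R, gamma, c_x, c_y = intrinsics_from_B(B)
```

The unknown scale cancels out of every expression, which is why [B] only needs to be estimated up to a factor.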

4. Calibration Epipolar geometry: the point P, the optical centres O and O′ of the two cameras, and the two images p and p′ of P all lie in the same plane.

4. Calibration Epipolar geometry depends only on the parameters and relative geometry of the two cameras, not on the geometry of the scene. A matrix called the fundamental matrix, $[F]$, is the algebraic representation of the epipolar geometry: it relates a point in one image to the epipolar line it generates in the other, so corresponding points satisfy $\{m'\}^T [F] \{m\} = 0$.
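The epipolar constraint m′ᵀFm = 0 can be checked on synthetic data by composing F from the essential matrix [t]×R and the intrinsics. This sketch assumes two cameras with identical, hypothetical intrinsics:

```python
import numpy as np

def skew(v):
    """Cross-product matrix [v]_x so that skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

# Hypothetical identical intrinsics for both cameras.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])

# Relative pose of the right camera: small rotation plus a baseline along x.
th = np.deg2rad(5.0)
R = np.array([[np.cos(th), 0.0, np.sin(th)],
              [0.0,        1.0, 0.0       ],
              [-np.sin(th), 0.0, np.cos(th)]])
t = np.array([0.1, 0.0, 0.0])

# Fundamental matrix from the essential matrix E = [t]_x R.
F = np.linalg.inv(K).T @ skew(t) @ R @ np.linalg.inv(K)

def project(K, R, t, M):
    p = K @ (R @ M + t)
    return p / p[2]  # homogeneous pixel coordinates [x, y, 1]

# Any scene point must satisfy the epipolar constraint m'^T F m = 0.
M = np.array([0.3, -0.2, 2.0])
m_left = project(K, np.eye(3), np.zeros(3), M)  # left camera at the origin
m_right = project(K, R, t, M)
residual = m_right @ F @ m_left
```

Because F depends only on K, R and t, the residual is (numerically) zero for every scene point, which is exactly the statement that epipolar geometry is independent of the scene.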

4. Calibration [Figure: a pair of corresponding points in the left and right images.]

4. Calibration Once we have a corresponding object pair, we can compute the depth (z value) of the detected object using the following simple formulas:
$$z_{object} = \frac{B f}{d \cdot PS}, \qquad d = x_l - x_r$$
$$x_{object} = \frac{x_{image} \cdot PS \cdot z_{object}}{f} - \frac{B}{2}, \qquad y_{object} = \frac{y_{image} \cdot PS \cdot z_{object}}{f}$$
where $d$ is the disparity (difference between the x-values in the two image planes), $B$ is the baseline (distance between the cameras), $f$ is the focal length, and $PS$ is the size of one pixel.
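A direct transcription of these formulas, with hypothetical values standing in for the baseline B, focal length f and pixel size PS:

```python
# Hypothetical stereo-rig parameters, for illustration only.
B = 0.12    # baseline between the two cameras, metres
f = 0.05    # focal length, metres
PS = 10e-6  # size of one pixel, metres

def locate(x_l, x_r, x_image, y_image):
    """Recover (x, y, z) of an object point from its left/right image
    x-coordinates and its (x, y) pixel position."""
    d = x_l - x_r                       # disparity in pixels
    z = (B * f) / (d * PS)              # depth from disparity
    x = (x_image * PS * z) / f - B / 2  # lateral position
    y = (y_image * PS * z) / f          # vertical position
    return x, y, z

x, y, z = locate(x_l=130.0, x_r=70.0, x_image=130.0, y_image=40.0)
```

With these numbers the disparity is 60 pixels, giving a depth of 10 m; note how depth is inversely proportional to disparity, so depth resolution degrades for distant objects.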

[Flowchart: Start → frame capture → intrinsic parameter calibration → extrinsic parameter calibration (looping on frame capture until calibrated) → parameter storage; then frame capture → parameter retrieval → object location (looping until completed) → Exit.]

4. System Set Up - Software


5. Results [Figure: results at measurement points A, B and C.]


Thank you Questions?