Tracking system Object Recognition & Model Based Tracking
Motivation
Manipulating objects in domestic environments:
- Localization / Navigation
- Object Recognition
- Servoing
- Tracking
- Grasping
- Pose estimation
Steps
- Recognition (2D): where in the image?
- Tracking (2D)
- Pose estimation (3D): where in the world? Includes initial pose estimation.
Initial Pose Estimation
Recognition/tracking yields the image position (x, y); pose estimation yields the full pose (X, Y, Z, φ, ψ, γ).
Example Objects
Characteristics
- Simple geometry (polyhedra, cones, cylinders)
- Specular surfaces
- Background
- Illumination
- Slippery objects
Characteristics
- Simple geometry: wireframe models
- Specular surfaces, illumination, background: highly textured appearance
- Slippery objects: power grasps
Model Based Techniques
- Appearance based methods
- Geometry based methods: 3D wireframe models, complete pose estimation
- FUSION!
Techniques from computer graphics are used for rendering.
Object Recognition
Removes the background and preserves the object. Necessary to raise the signal-to-noise ratio for the pose estimator. Solved using color cooccurrence histograms.
Pose Estimation
An appearance based method is used to recognize the object and estimate an initial pose. A geometric model based method is then used to obtain an accurate pose. The algorithm combines the robustness of appearance based methods with the accuracy of feature based methods.
Color Cooccurrence Histograms
An appearance based method, based on color cues only. Superior to standard color histograms: invariant to translation and rotation, and robust to scale changes.
Building Color Cooccurrence Histograms
All pairs of pixels within a certain radius contribute to the histogram. Example: a 4x4 image with 3 colors and a maximum radius of 3 pixels.
Building Color Cooccurrence Histograms
When all pairs have been counted, the histogram is normalized: each bin is divided by the total number of pixel pairs.
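The counting and normalization steps above can be sketched as follows. This is a minimal illustration, not the paper's implementation; the function and parameter names are assumptions, and the input is taken to be an already color-quantized image of indices.

```python
# Sketch of building a normalized color cooccurrence histogram.
# Assumes a quantized image where each pixel holds a color index 0..n_colors-1.
import numpy as np

def cooccurrence_histogram(img, n_colors, max_radius):
    """Count all pixel pairs within max_radius (Euclidean), then normalize."""
    h, w = img.shape
    hist = np.zeros((n_colors, n_colors))
    coords = [(y, x) for y in range(h) for x in range(w)]
    for i, (y1, x1) in enumerate(coords):
        for (y2, x2) in coords[i + 1:]:
            if (y1 - y2) ** 2 + (x1 - x2) ** 2 <= max_radius ** 2:
                c1, c2 = img[y1, x1], img[y2, x2]
                # order-independent pair: always fill the lower triangle
                hist[max(c1, c2), min(c1, c2)] += 1
    return hist / hist.sum()   # divide each bin by the total number of pairs
```

Run on the slide's example size (a 4x4 image with 3 colors and radius 3), the bins sum to one after normalization.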
Color Cooccurrence Histograms - Matching
A common histogram matching method, histogram intersection, is used:

  Match = Σ_{i=1..N} min(h1[i], h2[i])

This reduces the effect of background noise, as unexpected colors do not penalize the match value.
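The intersection measure is a one-liner; a short sketch with illustrative names:

```python
import numpy as np

def histogram_match(h1, h2):
    """Histogram intersection: sum of bin-wise minima."""
    return np.minimum(h1, h2).sum()
```

A background color absent from the model histogram contributes min(x, 0) = 0, so it neither helps nor hurts the match, which is what makes the measure robust to clutter.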
Color Quantization
Before the histogram can be built, the colors in the image need to be quantized. This is done using k-means clustering.
Color Quantization
Images are normalized prior to quantization, in order to decrease the effect of varying lighting conditions. Only the red and green components are preserved:

  r_norm = r / (r + g + b)
  g_norm = g / (r + g + b)

Performance is equal to RGB and HSV.
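The normalization and the k-means quantization can be sketched together. The k-means here is a tiny hand-rolled Lloyd's algorithm for illustration; function names, iteration count, and seeding are assumptions, not from the slides.

```python
# Sketch: chromaticity normalization followed by k-means color quantization.
import numpy as np

def normalize_rg(rgb):
    """Keep only normalized red/green: r/(r+g+b), g/(r+g+b)."""
    s = rgb.sum(axis=-1, keepdims=True)
    s[s == 0] = 1                      # avoid division by zero on black pixels
    return rgb[..., :2] / s

def quantize(pixels, k, n_iter=20, seed=0):
    """Assign each pixel to one of k color clusters (Lloyd's k-means)."""
    rng = np.random.default_rng(seed)
    centers = pixels[rng.choice(len(pixels), k, replace=False)].astype(float)
    for _ in range(n_iter):
        d = ((pixels[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(axis=1)      # nearest cluster center per pixel
        for j in range(k):
            if (labels == j).any():
                centers[j] = pixels[labels == j].mean(axis=0)
    return labels, centers
```

The cluster index per pixel is then the quantized color fed to the cooccurrence histogram.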
Color Constancy Problem
If lighting conditions change, colors may fall out of their original cluster or, even worse, into another one.
Object Segmentation - Training
The system was trained using both the front and back sides of the objects. The background of the training images was manually removed before training.
Object Segmentation
A search window scans through the image, comparing its cooccurrence histogram with the stored histograms from the training images. The result is a vote matrix.
Object Segmentation
From the vote matrix, segmentation windows are constructed. Starting from the global maximum, adjacent rows and columns are added as long as the vote values give sufficient support.
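The scan-and-grow procedure can be sketched as follows. The window size, step, support threshold, and `match_fn` interface are illustrative assumptions; the slides do not give these details.

```python
# Sketch: build a vote matrix with a scanning window, then grow a
# segmentation window around its global maximum.
import numpy as np

def vote_matrix(img, model_hist, match_fn, win=32, step=8):
    """match_fn(region, model_hist) -> scalar match value for a window."""
    h, w = img.shape[:2]
    votes = np.zeros(((h - win) // step + 1, (w - win) // step + 1))
    for i in range(votes.shape[0]):
        for j in range(votes.shape[1]):
            region = img[i * step:i * step + win, j * step:j * step + win]
            votes[i, j] = match_fn(region, model_hist)
    return votes

def grow_window(votes, support=0.5):
    """Add adjacent rows/columns while their votes stay above a threshold."""
    i0, j0 = np.unravel_index(votes.argmax(), votes.shape)
    thr = support * votes[i0, j0]
    top = bottom = i0
    left = right = j0
    while top > 0 and votes[top - 1, left:right + 1].max() >= thr:
        top -= 1
    while bottom < votes.shape[0] - 1 and votes[bottom + 1, left:right + 1].max() >= thr:
        bottom += 1
    while left > 0 and votes[top:bottom + 1, left - 1].max() >= thr:
        left -= 1
    while right < votes.shape[1] - 1 and votes[top:bottom + 1, right + 1].max() >= thr:
        right += 1
    return top, bottom, left, right
```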
Object Segmentation - Results
Out of 50 test images, 49 objects were successfully segmented. The average segmentation time was 1.7 s on a 500 MHz Sun workstation.
Pose Estimation The geometric model based pose estimator requires an initial pose to converge. The initial pose is estimated using color cooccurrence histograms.
Pose Estimation - Training
70 training images were used, with the pose of the object varying over the training images. The correct pose of the object in each training image was stored together with its cooccurrence histogram.
Pose Estimation
The object with the unknown pose is compared to each of the training examples. The result is a match value graph.
Pose Estimation
The match value graph is filtered using a Gaussian kernel. This is superior to a nearest-neighbor approach.
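A minimal sketch of the filtering step, assuming one match value per training image, ordered by pose; the kernel width is an illustrative choice:

```python
# Sketch: smooth the match-value graph with a Gaussian kernel before
# picking the best-matching training pose.
import numpy as np

def smooth_matches(match_values, sigma=1.0):
    """Convolve the match-value graph with a normalized Gaussian kernel."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x ** 2 / (2 * sigma ** 2))
    kernel /= kernel.sum()
    return np.convolve(match_values, kernel, mode='same')
```

Taking the maximum of the smoothed graph favors a broad ridge of consistent matches over a single spurious peak, which is why it beats a plain nearest-neighbor pick.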
Initial Pose Estimation - Appearance based
Principal Component Analysis
- Learning stage: compressing the image set using an eigenspace representation
- Pose recognition stage: closest point search on the appearance manifold
- Fitting stage: closest line search for pose refinement
PCA
Pose and appearance: each training image i(q) is a function of the pose q. Formulated as an eigenstructure decomposition problem.
PCA
The covariance matrix is kept implicit (conjugate gradient method).
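The eigenspace construction can be sketched as follows. The slides avoid forming the covariance matrix explicitly (conjugate gradient method); for illustration this sketch uses a plain SVD of the centered image matrix, which yields the same eigenvectors. All names are illustrative.

```python
# Sketch: compress an image set into an eigenspace and project new images
# onto it (the coefficient vectors form the appearance manifold).
import numpy as np

def build_eigenspace(images, n_components):
    """images: (n_images, n_pixels). Returns mean, basis, coefficients."""
    mean = images.mean(axis=0)
    centered = images - mean
    # rows of vt are the eigenvectors of the (implicit) covariance matrix
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]
    coeffs = centered @ basis.T        # each image as a point in eigenspace
    return mean, basis, coeffs

def project(image, mean, basis):
    """Map a new image to its coordinates in the eigenspace."""
    return (image - mean) @ basis.T
```

Pose recognition then reduces to finding the stored coefficient point (or interpolated manifold point) closest to the projection of the input image.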
PCA - Pose determination
Initialization by PCA
Geometric Model Based Pose Estimation Finally, the algorithm was integrated with the model based pose estimator.
Geometric Model Based Pose Estimation
Local refinement by tracking
H = (14, 0, 60, 15, 6, 5) [mm, deg]
Modeling
Pose estimation
DeMenthon and Davis 1995: orthographic projection, iterative method, no initial guess needed. This step is followed by an extension of Lowe's nonlinear approach (Araujo, Carceroni and Brown).
Tracking - Lie algebra approach
Rigid body motion lives in SE(3), a 6D Lie group. Its tangent space is spanned by six generator matrices, translations along and rotations about the x, y, z axes:

  G_tx = [[0,0,0,1],[0,0,0,0],[0,0,0,0],[0,0,0,0]]
  G_ty = [[0,0,0,0],[0,0,0,1],[0,0,0,0],[0,0,0,0]]
  G_tz = [[0,0,0,0],[0,0,0,0],[0,0,0,1],[0,0,0,0]]
  G_ωx = [[0,0,0,0],[0,0,-1,0],[0,1,0,0],[0,0,0,0]]
  G_ωy = [[0,0,1,0],[0,0,0,0],[-1,0,0,0],[0,0,0,0]]
  G_ωz = [[0,-1,0,0],[1,0,0,0],[0,0,0,0],[0,0,0,0]]
Tracking - image motion
A model point x = (X, Y, Z, 1)^T projects to the image point (u, v) = (X/Z, Y/Z), i.e. (u, v, 1) with w = Z in homogeneous coordinates. Differentiating the projection, the image motion at that point induced by generator G_i is

  L_i = (u', v') = ((X' - u Z') / Z, (Y' - v Z') / Z),  with (X', Y', Z', 0)^T = G_i x

(in matrix form, image motion = A E G x). L_i is the motion in an image point under the i-th generator; expanding the rotational generators yields the familiar uv, u^2, vw, uw terms.
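The generators and the induced image motion can be written down directly. A minimal numpy sketch, assuming unit focal length and the generator ordering (t_x, t_y, t_z, ω_x, ω_y, ω_z); function names are illustrative:

```python
# Sketch: the six SE(3) generators and the image motion each one induces
# at a projected model point x = (X, Y, Z, 1).
import numpy as np

def se3_generators():
    """Translations along x, y, z and rotations about x, y, z."""
    G = np.zeros((6, 4, 4))
    G[0, 0, 3] = G[1, 1, 3] = G[2, 2, 3] = 1          # G_tx, G_ty, G_tz
    G[3, 1, 2], G[3, 2, 1] = -1, 1                    # G_wx
    G[4, 0, 2], G[4, 2, 0] = 1, -1                    # G_wy
    G[5, 0, 1], G[5, 1, 0] = -1, 1                    # G_wz
    return G

def image_motion(G_i, x):
    """L_i = (u', v') at the projection (u, v) = (X/Z, Y/Z) of x."""
    X, Y, Z, _ = x
    u, v = X / Z, Y / Z
    dX, dY, dZ, _ = G_i @ x            # 3D velocity of the point under G_i
    return np.array([(dX - u * dZ) / Z, (dY - v * dZ) / Z])
```

For example, translation along x at depth Z gives the flow (1/Z, 0), and rotation about the optical axis gives (-v, u), as expected.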
Normal flow
Rendering example
3D pose update
The change in pose is estimated using a least-squares approach:

  α = C⁻¹ O
  O_i = Σ_p d_p (L_p^i · n̂_p)
  C_ij = Σ_p (L_p^i · n̂_p)(L_p^j · n̂_p)

where α_i represents the quantities of Euclidean motion, d_p is the measured normal-flow distance at edge sample point p, and n̂_p is the edge normal.
3D pose update
The estimated motion M = I + Σ_i α_i G_i is composed with the current pose:

  H(R, t)_{t+1} = H(R, t)_t (I + Σ_i α_i G_i)
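One tracking iteration, the least-squares solve for α followed by the first-order pose update, can be sketched in a few lines. The input arrays (predicted per-generator flows L, edge normals n, measured distances d) are assumed to come from the edge tracker; names and shapes are illustrative.

```python
# Sketch: solve alpha = C^-1 O from measured normal flow, then update the
# pose with M = I + sum_i alpha_i G_i.
import numpy as np

def pose_update(H, G, L, n, d):
    """
    H: (4,4) current pose, G: (6,4,4) generators,
    L: (P,6,2) image motion L_p^i at edge point p under generator i,
    n: (P,2) edge normals, d: (P,) measured normal-flow distances.
    """
    # projection of each generator's flow onto the edge normal: L_p^i . n_p
    proj = np.einsum('pic,pc->pi', L, n)          # shape (P, 6)
    O = (d[:, None] * proj).sum(axis=0)           # O_i = sum_p d_p (L_p^i . n_p)
    C = proj.T @ proj                             # C_ij = sum_p (L_p^i.n_p)(L_p^j.n_p)
    alpha = np.linalg.solve(C, O)                 # alpha = C^-1 O
    M = np.eye(4) + np.einsum('i,ijk->jk', alpha, G)
    return H @ M                                  # H_{t+1} = H_t (I + sum alpha_i G_i)
```

The update is only first-order in the motion; in practice it is applied once per frame, so the small-motion approximation holds.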
Examples
Example
Task 1 - Align and Track
Task 2 - Object Positioning
Task 3 - Insertion
Insertion task
How much a priori information can we use?
Pick and Place