Geometric-Edge Random Graph Model for Image Representation
Bo JIANG, Jin TANG, Bin LUO
CVPR Group, Anhui University
2011, Beijing
Acknowledgements: This research is supported by the National Natural Science Foundation of China (No. 6173116)
Content
- Random geometric graph
- Generalized edge random graph
- G-E random graph (GERG) model
- Image representation based on GERG
- Dot product representation of graphs (DPRG) for GERG
- Image matching
- Image recognition
Spatial-domain image modeling
- Polygon, skeleton, chain code, run-length code, pyramid, and graphs
Advantages of the graph model:
- Image matching and recognition reduce to graph matching and recognition
- Capable of describing structural relationships between image units
- Solid problem-solving schemes exist in mathematics
Problems of the image graph model
- Imperfect image segmentation or key-point extraction
- Extra and missing nodes; the edge structure may be unstable
- Motivates a random graph model
Traditional graph-based image representation
- Delaunay graph / K-NN graph / geometric graph
- MST
- Region adjacency graph
- Skeleton graph / shock graph
- ARG
Traditional graph-based image representation: Delaunay graph, K-NN graph
Traditional graph-based image representation: geometric graph, skeleton graph
Motivation
- A geometric graph can be used to extract the structural information of an image
- Edge stability problem: two geometric graphs are generated from two point sets (the right image is obtained by adding Gaussian noise to the point positions of the left image); the red edges denote the structural variation between the two graphs
Geometric graph
Let ‖·‖ be some norm (such as the Euclidean norm) on R^d, and let r be a positive parameter. Given a finite set X ⊂ R^d, a geometric graph G(X; r) is an undirected graph with vertex set X and with undirected edges connecting all pairs {x, y} ⊂ X that satisfy ‖y − x‖ ≤ r.
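As a concrete illustration, a minimal sketch (in Python, assuming 2-D points and the Euclidean norm; the function name is illustrative) of building G(X; r) from a point set:

```python
import math
from itertools import combinations

def geometric_graph(points, r):
    """Build the geometric graph G(X; r): connect every pair of points
    whose Euclidean distance is at most r. Returns a list of index pairs."""
    edges = []
    for (i, p), (j, q) in combinations(enumerate(points), 2):
        if math.dist(p, q) <= r:
            edges.append((i, j))
    return edges
```

Increasing r only adds edges, which is the "dynamic evolution" viewpoint used later in the slides.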
Random geometric graph
Let X = (X_1, X_2, ..., X_n) be i.i.d. d-dimensional random variables with common density f; the random geometric graph is G(X; r), the geometric graph built on this random point set.
Properties:
1. Given X, the parameter r (which encodes the relation between graph nodes) determines the graph structure.
2. The randomness lies in the structures attached to the vertices (a vertex random graph); by contrast, in the E-R random graph the edge probability p is a constant.
Generalized edge random graph (GERG)
Given a vertex set V = {1, ..., n} and a symmetric probability function p: [n] × [n] → [0, 1], the edge probability is p_ij = p(i, j). This induces the probability measure P_p(G) on G_n:
P_p(G) = ∏_{(i,j) ∈ E} p(i, j) · ∏_{(i,j) ∉ E} (1 − p(i, j))
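A minimal sketch of this measure: sampling a GERG from a symmetric edge-probability matrix, and evaluating P_p(G) for a given edge set (function names are illustrative):

```python
import random

def sample_gerg(p, seed=None):
    """Sample a GERG: each edge (i, j), i < j, is present independently
    with probability p[i][j]."""
    rng = random.Random(seed)
    n = len(p)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if rng.random() < p[i][j]]

def graph_probability(n, edges, p):
    """P_p(G): product of p(i, j) over present edges times
    product of (1 - p(i, j)) over absent edges."""
    present = set(edges)
    prob = 1.0
    for i in range(n):
        for j in range(i + 1, n):
            prob *= p[i][j] if (i, j) in present else 1.0 - p[i][j]
    return prob
```

Because the edges are independent, the probabilities of all 2^(n(n−1)/2) graphs sum to one.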
Generalized edge random graph (GERG)
Properties:
1. The randomness lies in the structures attached to the edges.
2. The edges are independent.
3. A GERG can be represented by a symmetric matrix, so algebraic and spectral methods apply.
G-E random graph: main idea
Random geometric graph (structure) + GERG (algebraic representation)
G-E random graph
Given a vertex set V = {1, ..., n} and node attributes X = (X_1, X_2, ..., X_n), we define the symmetric function p_{G-E} as
p_{G-E}(r; i, j) = ∫_0^{r_ij} f_ij(u) du,
where u_ij = ‖X_i − X_j‖ and f_ij is the pdf of the random variable u_ij.
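When f_ij has no closed form, the integral can be approximated numerically. A sketch under the assumption (made explicit later in the slides) that each coordinate is perturbed by independent Gaussian noise around its nominal position; the function name and the Monte Carlo approach are illustrative, not part of the original model:

```python
import math
import random

def ge_edge_probability(xi, xj, r, sigma, trials=20000, seed=0):
    """Monte Carlo estimate of p_GE(r; i, j) = Pr(||X_i - X_j|| <= r),
    assuming each coordinate of each point is perturbed by independent
    N(0, sigma) noise around its nominal position."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        d2 = sum((a + rng.gauss(0.0, sigma) - b - rng.gauss(0.0, sigma)) ** 2
                 for a, b in zip(xi, xj))
        if math.sqrt(d2) <= r:
            hits += 1
    return hits / trials
```

Pairs whose nominal distance is well below r get probability near 1, pairs well above r near 0, and pairs near the threshold get intermediate probabilities, which is exactly the edge-stability smoothing the model is after.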
G-E random graph
Properties:
1. It is an edge random graph.
2. It conveys the vertex randomness to the edges.
3. It provides a way to represent the structural information of the node random variables.
Measurement
Expected average degree and maximum degree:
E(k_a) = (1/n) Σ_{i=1}^n Σ_{j≠i} p(i, j),  E(k_m) = max_i Σ_{j≠i} p(i, j)
Expected number of edges:
E(n_e) = (1/2) Σ_{i=1}^n Σ_{j≠i} p(i, j)
Dynamic evolution: G(p_{G-E}, r_n)
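These three statistics can be computed in a few lines from the edge-probability matrix; a sketch (the function name is illustrative):

```python
def gerg_statistics(p):
    """Expected statistics of a GERG with symmetric edge-probability
    matrix p (zero diagonal): average degree E(k_a), maximum expected
    degree E(k_m), and expected number of edges E(n_e)."""
    n = len(p)
    deg = [sum(p[i][j] for j in range(n) if j != i) for i in range(n)]
    return sum(deg) / n, max(deg), sum(deg) / 2.0
```

Note E(n_e) = n · E(k_a) / 2, the expected-value analogue of the handshake lemma.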
GERG image representation
Noncentral chi-square distribution: let Y_i ~ N(μ_i, σ_i²), i = 1, ..., k. Then Y = Σ_{i=1}^k (Y_i/σ_i)² ~ χ²(k, λ), with noncentrality λ = Σ_{i=1}^k (μ_i/σ_i)².
The pdf of the noncentral chi-square distribution:
f(x; k, λ) = (1/2) e^{−(x+λ)/2} (x/λ)^{k/4 − 1/2} I_{k/2−1}(√(λx)),
where I_a(y) = (y/2)^a Σ_{j=0}^∞ (y²/4)^j / (j! Γ(a + j + 1)).
GERG image representation
The cdf of the noncentral chi-square distribution:
P(x; k, λ) = Σ_{j=0}^∞ [e^{−λ/2} (λ/2)^j / j!] Q(x; k + 2j),
where Q(x; k) = γ(k/2, x/2) / Γ(k/2) and γ(k, z) is the lower incomplete gamma function.
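This cdf can be evaluated directly from the two series with only the standard library; a sketch (the truncation tolerances are ad hoc choices, not from the slides):

```python
import math

def reg_lower_gamma(s, x):
    """Regularized lower incomplete gamma, gamma(s, x) / Gamma(s),
    via the series sum_n x^(s+n) e^(-x) / Gamma(s + n + 1)."""
    if x <= 0.0:
        return 0.0
    term = x ** s * math.exp(-x) / math.gamma(s + 1.0)
    total = term
    n = 0
    while term > 1e-15 and n < 1000:
        n += 1
        term *= x / (s + n)  # ratio of consecutive series terms
        total += term
    return total

def ncx2_cdf(x, k, lam):
    """Noncentral chi-square cdf P(x; k, lam): a Poisson(lam/2)
    mixture of central chi-square cdfs Q(x; k + 2j)."""
    weight = math.exp(-lam / 2.0)  # Poisson weight for j = 0
    total = 0.0
    for j in range(500):
        total += weight * reg_lower_gamma(k / 2.0 + j, x / 2.0)
        weight *= (lam / 2.0) / (j + 1)
        if weight < 1e-15:
            break
    return total
```

For λ = 0 the mixture collapses to the central chi-square cdf, e.g. 1 − e^{−x/2} for k = 2, which is a convenient sanity check.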
GERG image representation
Graph nodes:
1. Key points of the image serve as graph nodes.
2. For node i, a random attribute vector y_i = (y_i1, y_i2) is allocated.
3. Assume y_ij ~ N(x_ij, σ²).
GERG image representation
Graph edge probability: the squared distance d_kl² = (y_k1 − y_l1)² + (y_k2 − y_l2)² follows a noncentral chi-square distribution, so
p_{G-E}(r; k, l) = P(r²; 2, λ_kl), with noncentrality λ_kl = (x_k1 − x_l1)² + (x_k2 − x_l2)².
Application: Image matching
Initial matching: embedding followed by alignment.
Idea: given the edge probabilities from the GERG, find the best-fitting feature vectors {x_i}, i = 1, ..., n, attached to the graph nodes. This is achieved with the random dot product graph; a traditional graph matching method such as the Hungarian algorithm or SVD can then be used to find the matches.
Ref. Jin Tang, Bo Jiang, Bin Luo. Graph matching based on dot product representation of graphs. Proceedings of the 8th IAPR-TC-15 International Workshop on Graph-Based Representations in Pattern Recognition (GbRPR 2011), 2011, pp. 175-184.
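A sketch of the embedding-then-alignment idea, assuming `numpy` and using a greedy assignment as a simple stand-in for the Hungarian/SVD step (all names are illustrative):

```python
import numpy as np

def dot_product_embedding(P, d=2):
    """Embed a graph with edge-probability matrix P as feature vectors X
    such that X @ X.T approximates P (random dot product graph model):
    keep the top-d eigenpairs and scale eigenvectors by sqrt(eigenvalue)."""
    vals, vecs = np.linalg.eigh(P)
    idx = np.argsort(vals)[::-1][:d]
    return vecs[:, idx] * np.sqrt(np.clip(vals[idx], 0.0, None))

def greedy_match(X1, X2):
    """Greedy one-to-one assignment from rows of X1 to rows of X2 by
    largest dot-product similarity (stand-in for Hungarian/SVD matching)."""
    sim = X1 @ X2.T
    matches, used = {}, set()
    for i in np.argsort(-sim.max(axis=1)):
        j = max((j for j in range(sim.shape[1]) if j not in used),
                key=lambda j: sim[i, j])
        matches[int(i)] = int(j)
        used.add(j)
    return matches
```

In practice the two embeddings must first be aligned (e.g. by Procrustes rotation) before the similarity scores are meaningful, which is the "alignment" half of the slide's initial matching step.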
Application: Image matching
Consistency check: remove initial mis-matches while keeping most positive correspondences.
Idea: enforce coherent spatial relationships of corresponding nodes between the graphs.
Application: Image matching Association graph
Application: Image matching
Co-embedding: we use the positive correspondences obtained from the consistency check above and integrate the embedding processes for G_1 and G_2 simultaneously. Missing values and unmatched nodes can be recovered by solving
min_X f(X) = ‖ M ∘ (P_AG − X X^T) ‖²,
where P_AG is the edge-probability matrix of the association graph, X stacks the node feature vectors of both graphs, and M is the missing-data label matrix.
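A sketch of evaluating a masked least-squares objective of this form (the exact stacking of X in the slide is unclear; this version treats X as a single stacked feature matrix and assumes `numpy`):

```python
import numpy as np

def coembedding_loss(X, P_AG, M):
    """f(X) = || M * (P_AG - X X^T) ||_F^2: masked squared error between
    the association-graph edge probabilities and their dot-product
    reconstruction; M is the missing-data label matrix (0 marks missing)."""
    R = M * (P_AG - X @ X.T)
    return float(np.sum(R * R))
```

Entries of P_AG flagged as missing contribute nothing to the loss, so the optimized X fills them in with the dot products of the recovered feature vectors.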
Application: Image matching — CMU data
Ref. Bin Luo, E.R. Hancock. Structural graph matching using the EM algorithm and singular value decomposition. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 23, No. 10, 2001, pp. 1120-1136.
Application: Image matching — CMU house sequence (comparison with EM and SVD/DPRG)
Table: number of correct, false, and missing correspondences for the EM, DPRG, and proposed methods on the house1-house6 image pairs (rotations of 5, 10, 15, 20, and 25 degrees).
Application: Image matching York data
Application: Image recognition — flow chart
Images 1 ... n → Harris corner detector → corner points → G-E random graph → G-E random graph feature extraction → characteristics G-ECRI 1 ... G-ECRI n → similarity of images via Euclidean distance.
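The final similarity step of the flow chart can be sketched as follows, assuming each image has already been reduced to a numeric G-E characteristic vector (e.g. expected average degree and expected edge number; the function names are illustrative):

```python
import math

def image_distance(f1, f2):
    """Euclidean distance between two G-E characteristic vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(f1, f2)))

def nearest_image(query, gallery):
    """Index of the gallery feature vector closest to the query."""
    return min(range(len(gallery)),
               key=lambda i: image_distance(query, gallery[i]))
```

Because the comparison happens in feature space, recognition needs no graph matching at all, only a nearest-neighbour search over the stored characteristics.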
Application: Image recognition — image data
Ref. Bin Luo, Richard C. Wilson, Edwin R. Hancock. Spectral embedding of graphs. Pattern Recognition, 2003, 36: 2213-2230.
Application: Image recognition — clustering
Fig.: (a) distance matrix, (b) MDS embedding, (c) PCA embedding. Expected-average-degree clustering result.
Application: Image recognition — clustering
Fig.: (a) distance matrix, (b) MDS embedding, (c) PCA embedding. Expected-edge-number clustering result.
Application: Image recognition Synthetic house model sequence
Application: Image recognition — object view structure
Fig.: (a) PCA embedding, (b) MDS embedding, (c) Laplacian embedding. Expected-average-degree based embedding for the model-house sequence.
Application: Image recognition — object view structure
Fig.: (a) PCA embedding, (b) MDS embedding, (c) Laplacian embedding. Expected-edge-number based embedding for the model-house sequence.
Application: Image recognition — object view structure
Fig.: (1) PCA embedding, (2) MDS embedding, (3) Laplacian embedding. Expected-average-degree based embedding for the York-House sequence.
Application: Image recognition — object view structure
Fig.: (1) PCA embedding, (2) MDS embedding, (3) Laplacian embedding. Expected-edge-number based embedding for the York-House sequence.
Application: Image recognition — object view structure
Fig.: (1) PCA embedding, (2) MDS embedding, (3) Laplacian embedding. Expected-average-degree based embedding for the MOVI-House sequence.
Application: Image recognition — object view structure
Fig. 15: (1) PCA embedding, (2) MDS embedding, (3) Laplacian embedding. Expected-edge-number based embedding for the MOVI-House sequence.
Conclusion
Contribution 1: We propose a Geometric-Edge (G-E) random graph model for image representation.
Contribution 2: We cast image matching as G-E random graph matching using the random dot product graph based matching algorithm.
The proposed method is more robust to disturbances of graph node positions; when used for graph matching, it achieves a higher correct matching rate.
Potential applications of the proposed model
- Video analysis (structured data)
- Motion analysis (graph evolution)
Thank you!