Structural Graph Matching With Polynomial Bounds On Memory and on Worst-Case Effort

Structural Graph Matching With Polynomial Bounds On Memory and on Worst-Case Effort Fred DePiero, Ph.D. Electrical Engineering Department Cal Poly State University

Goal: Subgraph Matching for Use in Real-Time Measurement Systems Graph representations for measurements: image registration, speech recognition, object recognition. Approximate maximum common subgraph. Deterministic bounds on processing effort and on memory requirements. Tolerant of low SNR and low dynamic range, as required for node and edge coloring. Results Presented: In Paper: Matching based on graph structure emphasized. In Talk: Graphs with and without color attributes. In Software: Multidimensional coloring supported.

Dynamic Horizon Used To Compare Graph Structure Locally Horizon is the number of nodes included in a local comparison of graph structure. Its dynamic nature permits the horizon to vary in extent depending on similarity of structure. Examples of horizon distance: scene labeling via discrete relaxation, unary & binary constraints, h = 1,2; superclique neighborhood comparison [Wilson & Hancock 6/97], h = constant; eigenspace projection clustering [Caelli & Kosinov 4/04], h = constant. We desire a larger horizon to help find better node-to-node correspondences: expand until affected by structural differences.
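The horizon idea above can be sketched as a bounded breadth-first expansion. This is an illustrative helper, not the author's code; `horizon_neighborhood` and its adjacency-list representation are assumptions for the example.

```python
from collections import deque

def horizon_neighborhood(adj, root, h):
    """Return the set of nodes within h hops of `root` in a graph
    given as an adjacency list (dict: node -> iterable of neighbors)."""
    seen = {root}
    frontier = deque([(root, 0)])
    while frontier:
        node, dist = frontier.popleft()
        if dist == h:
            continue  # do not expand past the horizon
        for nb in adj[node]:
            if nb not in seen:
                seen.add(nb)
                frontier.append((nb, dist + 1))
    return seen

# Example: a 6-cycle; horizon h = 1 around node 0 covers nodes 5, 0, 1.
cycle = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
print(sorted(horizon_neighborhood(cycle, 0, 1)))  # → [0, 1, 5]
```

A fixed h corresponds to the superclique and eigenspace-projection methods cited above; the paper's dynamic horizon instead lets the expansion halt where structure begins to differ.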

Local Neighborhood Created Using Basis Graphs Basis Graphs ba-bd: rooted at node 0, ordered nodes, non-symmetric. [Figure: four example basis graphs ba, bb, bc, bd, each with ordered nodes labeled 0-5.] Basis Graphs are used to create a local neighborhood (a subgraph of the input) having invariant order.

Neighborhood Constructed With Invariant Node Order 1. Depth-first placement of Basis Graph (ba), rooted at node ni of Input Graph (G1). Root node ni ba Input Graph, G1

Neighborhood Constructed With Invariant Node Order 1. Depth-first placement 2. Histogram reveals common positions for ba nodes. [Animation frames: occurrences of ba.0 through ba.5 overlaid on the Input Graph, G1.]

Neighborhood Constructed With Invariant Node Order 1. Depth-first placement 2. Histogram reveals common positions 3. Overlay instances of ba onto G1, depth-first. Select nodes with highest histogram count. Halt at ties. [Figure: first instance of ba overlaid on Input Graph, G1.]

Neighborhood Constructed With Invariant Node Order 1. Depth-first placement 2. Histogram reveals common positions 3. Overlay instances of ba 4. Neighborhood is the induced subgraph of G1 for nodes touched by ba instances. Neighborhood order is given by the ordering of nodes in the ba instances. [Figure: ba instances and the resulting ordered neighborhood within Input Graph, G1.]
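The placement-and-histogram steps above can be sketched in a few lines. As a simplification, a path on V nodes stands in for the paper's basis graphs ba-bd; `enumerate_placements` and `position_histograms` are hypothetical names, not the author's implementation.

```python
from collections import Counter

def enumerate_placements(adj, root, V):
    """All depth-first placements of a V-node path basis graph
    rooted at `root`: simple paths (root, n1, ..., n_{V-1})."""
    placements = []
    def dfs(path):
        if len(path) == V:
            placements.append(tuple(path))
            return
        for nb in adj[path[-1]]:
            if nb not in path:
                dfs(path + [nb])
    dfs([root])
    return placements

def position_histograms(placements, V):
    """For each basis-graph position, histogram which input nodes
    land there across all placements."""
    return [Counter(p[pos] for p in placements) for pos in range(V)]

# Tiny input graph: node 0 hangs off a triangle 1-2-3.
G1 = {0: [1], 1: [0, 2, 3], 2: [1, 3], 3: [1, 2]}
placements = enumerate_placements(G1, 0, 3)   # [(0, 1, 2), (0, 1, 3)]
hists = position_histograms(placements, 3)
print(hists[1].most_common(1))                # → [(1, 2)]
```

Node 1 dominates position 1 of the basis graph, so it would be selected for the ordered neighborhood, as in step 3 of the slides.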

Common Subgraph Extracted Via Local & Global Processing 1. Establish local (invariant) neighborhoods for each input graph. 2. Compare structure of local neighborhoods in each graph. Define the initial mapping probability using the complement of edit distance (a discriminative functional approximation to a PDF). 3. Use continuous relaxation to update mapping probabilities, for a fixed number of iterations. This introduces global constraints to the mapping. 4. Find the final mapping in best-first manner, using the updated mapping probabilities.

Additional Features Improve Size Of Common Subgraph 1. Similarity of degree also used to adjust mapping probabilities. Slight improvement. 2. Use multiple basis graphs, select result having larger common subgraph. Good improvement.

Test Trials Contained High Noise Input graphs randomly generated. Types: Model A, strongly regular, near-planar. Varying sizes and edge densities. Varying amounts of noise added to each input. Node order of input graphs randomized. [Figure: Model A, N=12, with 12 added noise nodes shown as an outer ring of nodes with red edges.]

Adjacency Matrix Used to Verify Result [Figure: G1 and G2 generated as an original graph plus added noise; order of G2 randomized; processing yields the common subgraph.] Adjacency matrix of the common subgraph checked for validity.
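The validity check amounts to confirming that the reported mapping induces the same adjacency pattern in both graphs. A minimal sketch, with an assumed `valid_common_subgraph` helper:

```python
import numpy as np

def valid_common_subgraph(A1, A2, mapping):
    """Check that `mapping` (dict: G1 node -> G2 node) induces the
    same adjacency pattern in both adjacency matrices."""
    nodes = list(mapping)
    for a in nodes:
        for b in nodes:
            if A1[a][b] != A2[mapping[a]][mapping[b]]:
                return False
    return True

# Triangle in G1 mapped onto a triangle sitting inside a larger G2.
A1 = np.array([[0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])
A2 = np.array([[0, 1, 1, 0],
               [1, 0, 1, 1],
               [1, 1, 0, 0],
               [0, 1, 0, 0]])
print(valid_common_subgraph(A1, A2, {0: 0, 1: 1, 2: 2}))  # → True
```

A mapping that sends a triangle node onto G2's degree-1 node 3 would fail the check, since the corresponding edge is missing in G2.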

Results Show Improvement Compared to Previous Method Graph Inputs: Model A (0.15, 0.2, 0.3); Strongly Regular (d=2,3,4). Noise added to each input (0-100%). 12 nodes, nominal. Size of common subgraph shown; min & max value of the mean plotted for each type above. 1200 trials total. Comparisons made versus previous approach: LeRP, which compares Length-R Paths. Basis graphs encounter a high density of noise nodes & edges in these tests, as shown on previous slides! [Plots: nodes in common subgraph (0-120%) versus percentage of added noise nodes (0-100%), for the Basis Graph result and for the LeRP result.]

Results Show Improvement Compared to Recent Publication Graph Inputs: Model A (0.2). Noise added to one input (0-30%). 50 nodes, nominal. Mean size of common subgraph, with one-standard-deviation error bars. Upper curve: Basis Graphs. Lower curve: LeRP. [Plot: common subgraph size (0-120%) versus added noise (0-35%).] Caelli-Kosinov, IEEE-PAMI 4/04: ~30% match for 30% added noise.

BG Can Yield Good Approximation To Maximal Subgraph

                     Basis Graphs    LeRP
  Edit Distance      0.6 +/- 1.5     4.9 +/- 2.1
  % Size Difference  2 +/- 7 %       25 +/- 13 %

Maximum common subgraph found exhaustively. Input graphs had 10 nodes nominal, with 100% additional noise nodes. Six different graph types. 120 trials reported.

Deterministic Effort and Memory Required Worst-case effort for histogram processing is bounded by O(N^V) for an N-node data graph and V-node basis graph. Mean-case effort is O(N D^(V-1)) for graphs with mean degree D. The basis graph suite, and mean V, are constant in the reported trials; maximum value of V=6 in reported tests. O(V N^2) memory required. Lower curve had integer-valued node & edge coloring, dynamic range 4. Duration of LeRP: under 0.1 seconds. [Plot: duration of processing with and without color, log scale (0.01-1000 s), versus number of nodes: 10, 20, 30, 40, 50.]
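The effort bounds count basis-graph placements. A small sanity check of the mean-case bound, again using a path as a stand-in basis graph (an illustrative sketch, not the author's code):

```python
def count_path_placements(adj, V):
    """Count depth-first placements of a V-node path basis graph
    over all roots, i.e. simple paths on V nodes."""
    def dfs(path):
        if len(path) == V:
            return 1
        return sum(dfs(path + [nb]) for nb in adj[path[-1]] if nb not in path)
    return sum(dfs([r]) for r in adj)

# Complete graph K6: N = 6 nodes, degree D = 5, basis size V = 4.
N, V = 6, 4
K = {i: [j for j in range(N) if j != i] for i in range(N)}
work = count_path_placements(K, V)       # 6 * 5 * 4 * 3 = 360 placements
print(work, "<=", N * (N - 1) ** (V - 1))  # → 360 <= 750
```

Even in this densest case the placement count (360) stays under the N * D^(V-1) bound (750); sparse measurement graphs with small mean degree D sit far below it.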

Factors Limit Increased Size of Basis Graphs Do larger graphs necessarily require larger basis graphs? Any increase in V depends on the structure of the noise & input graph. V should be kept small to reduce corruption of the local description by noise; neighborhoods can be potentially large even for modest V. Tests demonstrate best performance at low noise, implying the choice of basis is less critical in those cases. Hence cases with both high and low noise may not always require large basis graphs for acceptable performance. The purpose of a basis graph is to provide local comparison! Alternative: increase the number of relaxation iterations.

Revised Version of Paper Available Improvements to algorithm, testing, and implementation. Available via: www.ee.calpoly.edu/~fdepiero; see references. Encouraging use of approach.

Freeware Available! No charge for non-profit, non-commercial, non-military, non-defense applications. Executables only, currently. Finds the common subgraph for specific input data graphs, defined in ASCII files; mapping reported. Performs Monte Carlo simulations, specifying the number of nodes and the dynamic range of integer colors; reports mean size of common graph.

Summary & Future Studies Basis Graphs provide a local description of graph structure. Common subgraphs with the same size as the nominal input graphs can be recovered, with 100% additional noise nodes added to each input. Freeware available. Investigating: a priori PDFs formally relating edit distance of neighborhoods to mapping probability; an optimal set of Basis Graphs, which may vary depending on the structure of noise & graph style; a process to find a priori mapping PDFs and optimal basis graphs, given a style of input graphs; faster processing, forming an approximate histogram. Applications: graphs as measurements, real-time comparison; image registration & speech recognition; clustering analyses of graph-based measurements, to find probabilities of nodes & edges for typical measurements; incorporating these probabilities into the matching process.