GraphGAN: Graph Representation Learning with Generative Adversarial Nets


The 32nd AAAI Conference on Artificial Intelligence (AAAI 2018), New Orleans, Louisiana, USA

GraphGAN: Graph Representation Learning with Generative Adversarial Nets

Hongwei Wang 1,2, Jia Wang 3, Jialin Wang 4, Miao Zhao 3, Weinan Zhang 1, Fuzheng Zhang 2, Xing Xie 2, Minyi Guo 1
1 Shanghai Jiao Tong University, 2 Microsoft Research Asia, 3 The Hong Kong Polytechnic University, 4 Huazhong University of Science and Technology

November 27, 2017

Background (1/3) Graph Representation Learning

Graph representation learning tries to embed each node of a graph into a low-dimensional vector space that preserves the structural similarities or distances among the nodes in the original graph. It is also known as network embedding, graph embedding, or graph feature learning.

Background (2/3)

Graph representation learning can benefit a wide range of real-world applications:
- Link prediction (Gao, Denoyer, and Gallinari, CIKM 2011)
- Node classification (Tang, Aggarwal, and Liu, SDM 2016)
- Recommendation (Yu et al., WSDM 2014)
- Visualization (Maaten and Hinton, JMLR 2008)
- Knowledge graph representation (Lin et al., AAAI 2015)
- Clustering (Tian et al., AAAI 2014)
- Text embedding (Tang, Qu, and Mei, KDD 2015)
- Social network analysis (Liu et al., IJCAI 2016)

Background (3/3)

Researchers have examined applying representation learning methods to various types of graphs:
- Weighted graphs (Grover and Leskovec, KDD 2016)
- Directed graphs (Zhou et al., AAAI 2017)
- Signed graphs (Wang et al., SDM 2017)
- Heterogeneous graphs (Wang et al., WSDM 2018)
- Attributed graphs (Huang, Li, and Hu, WSDM 2017)

Several prior works also try to preserve specific properties during the learning process:
- Global structures (Wang, Cui, and Zhu, KDD 2017)
- Community structures (Wang et al., AAAI 2017)
- Group information (Chen, Zhang, and Huang, CIKM 2016)
- Asymmetric transitivity (Ou et al., KDD 2016)

Motivation (1/3) Generative Model

Generative graph representation learning models assume an underlying true connectivity distribution p_true(v|v_c) for each vertex v_c (similar in spirit to GMM and LDA). The edges can be viewed as observed samples generated by p_true(v|v_c), and vertex embeddings are learned by maximizing the likelihood of the edges. Examples: DeepWalk (KDD 2014) and node2vec (KDD 2016).

[Figure: the original graph and the connectivity distribution p_true(v|v_c) for a vertex v_c]

Motivation (2/3) Discriminative Model

Discriminative graph representation learning models aim to learn a classifier that predicts the existence of edges directly: two vertices v_i and v_j are considered jointly as features, and the model predicts the probability of an edge existing between them, i.e., p(edge | v_i, v_j). Examples: SDNE (KDD 2016) and PPNE (DASFAA 2017).

[Figure: p(edge | v_i, v_j) = 0.8 and p(edge | v_i, v_k) = 0.3 for two vertex pairs]

Motivation (3/3) G + D?

Generative and discriminative models are two sides of the same coin; LINE (WWW 2015) has already tried to combine the two objectives. Meanwhile, generative adversarial nets (GANs) have received a great deal of attention: a GAN sets up a game-theoretical minimax game between G and D and has achieved success in various applications:
- Image generation (Denton et al., NIPS 2015)
- Sequence generation (Yu et al., AAAI 2017)
- Dialogue generation (Li et al., arXiv 2017)
- Information retrieval (Wang et al., SIGIR 2017)
- Domain adaptation (Zhang, Barzilay, and Jaakkola, arXiv 2017)

We propose GraphGAN, a framework that unifies generative and discriminative thinking for graph representation learning.

The Minimax Game

Notation: G = (V, E) with V = {v_1, ..., v_V} and E = {e_ij}; N(v_c) is the set of neighbors of v_c; p_true(·|v_c) is the underlying true connectivity distribution for v_c.

The objective of GraphGAN is to learn the following two models:
- Generator G(v|v_c; θ_G), which tries to approximate p_true(·|v_c)
- Discriminator D(v, v_c; θ_D), which aims to discriminate the connectivity for the vertex pair (v, v_c)

The two-player minimax game:
min_{θ_G} max_{θ_D} V(G, D) = Σ_{c=1}^{V} ( E_{v ~ p_true(·|v_c)} [log D(v, v_c; θ_D)] + E_{v ~ G(·|v_c; θ_G)} [log(1 − D(v, v_c; θ_D))] )

Implementation & Optimization of D

Implementation of D (sigmoid of the inner product of the two vertices' embeddings):
D(v, v_c) = σ(d_v^T d_{v_c}) = 1 / (1 + exp(−d_v^T d_{v_c})),
where d_v, d_{v_c} ∈ R^k are the k-dimensional embedding vectors of v and v_c for D.

Gradient of V(G, D) w.r.t. θ_D:
∇_{θ_D} V(G, D) = ∇_{θ_D} log D(v, v_c) if v ~ p_true(·|v_c), and ∇_{θ_D} log(1 − D(v, v_c)) if v ~ G(·|v_c).
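As an illustration only (not code from the talk), here is a minimal NumPy sketch of this discriminator: a sigmoid over the inner product of two k-dimensional embeddings, updated by gradient ascent on log D for true pairs and on log(1 − D) for generated pairs. Names such as `d_emb` and `update_discriminator` are ours, not from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# d_emb: |V| x k matrix of discriminator embeddings d_v (hypothetical variable name)
def d_prob(d_emb, v, v_c):
    """D(v, v_c) = sigmoid(d_v . d_vc)."""
    return sigmoid(d_emb[v] @ d_emb[v_c])

def update_discriminator(d_emb, v_c, pos_v, neg_v, lr=0.01):
    """Ascend log D on a true neighbor pos_v and log(1 - D) on a generated vertex neg_v."""
    for v, label in ((pos_v, 1.0), (neg_v, 0.0)):
        p = d_prob(d_emb, v, v_c)
        coeff = label - p              # = 1 - D for true pairs, -D for fake pairs
        grad_v = coeff * d_emb[v_c]    # gradient of log D / log(1 - D) w.r.t. d_v
        grad_vc = coeff * d_emb[v]     # and w.r.t. d_vc
        d_emb[v] += lr * grad_v
        d_emb[v_c] += lr * grad_vc
```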

Optimization of G

Gradient of V(G, D) w.r.t. θ_G (policy gradient):
∇_{θ_G} V(G, D) = Σ_{c=1}^{V} E_{v ~ G(·|v_c)} [∇_{θ_G} log G(v|v_c; θ_G) · log(1 − D(v, v_c; θ_D))]
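A hedged sketch of how this policy gradient can be turned into an update rule, assuming a generic G(v|v_c; θ_G) whose log-probability gradient is available as a callback; the reward log(1 − D(v, v_c)) is treated as a constant w.r.t. θ_G. The function names `sample_from_g`, `grad_log_g`, and `d_prob_fn` are placeholders, not APIs from the paper.

```python
import numpy as np

def update_generator(theta_g, v_c, sample_from_g, grad_log_g, d_prob_fn, lr=0.01, n_samples=20):
    """REINFORCE-style update: theta_g -= lr * mean_v [ grad log G(v|v_c) * log(1 - D(v, v_c)) ]."""
    total_grad = np.zeros_like(theta_g)
    for _ in range(n_samples):
        v = sample_from_g(theta_g, v_c)                    # v ~ G(.|v_c; theta_g)
        reward = np.log(1.0 - d_prob_fn(v, v_c) + 1e-10)   # log(1 - D), constant w.r.t. theta_g
        total_grad += grad_log_g(theta_g, v, v_c) * reward
    # gradient descent on V(G, D) from the generator's side (G minimizes V)
    theta_g -= lr * total_grad / n_samples
    return theta_g
```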

GraphGAN Framework

[Figure: overview of the GraphGAN framework]

Implementation of G

Softmax? G(v|v_c; θ_G) = exp(g_v^T g_{v_c}) / Σ_{v' ≠ v_c} exp(g_{v'}^T g_{v_c}), where g_v, g_{v_c} ∈ R^k are the k-dimensional embedding vectors of v and v_c for G.
- Computationally inefficient
- Graph-structure-unaware

Hierarchical softmax?
- Graph-structure-unaware

Negative sampling?
- Not a valid probability distribution
- Graph-structure-unaware
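To make the inefficiency concrete, a small illustrative sketch (ours, not from the slides) of the full-softmax generator: every evaluation of G(·|v_c) normalizes over all |V| vertices, i.e., O(|V|·k) work per query, and the graph structure never enters the computation.

```python
import numpy as np

def full_softmax_g(g_emb, v_c):
    """G(.|v_c) via full softmax over all vertices: O(|V| * k) and graph-structure-unaware."""
    scores = g_emb @ g_emb[v_c]          # inner product with every vertex embedding
    scores[v_c] = -np.inf                # exclude v_c itself from the distribution
    scores -= scores.max()               # numerical stability
    probs = np.exp(scores)
    return probs / probs.sum()
```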

Graph Softmax (1/5) Objectives

The design of graph softmax should satisfy the following three properties:
- Normalized: the generator should produce a valid probability distribution, i.e., Σ_{v ≠ v_c} G(v|v_c; θ_G) = 1
- Graph-structure-aware: the generator should take advantage of the structural information of the graph
- Computationally efficient: the computation of G(v|v_c; θ_G) should only involve a small number of vertices in the graph

Graph Softmax (2/5) Design

Perform breadth-first search (BFS) on G from every vertex v_c, giving a BFS tree T_c rooted at v_c. For a given vertex v and one of its neighbors v_i ∈ N_c(v), define the relevance probability of v_i given v as
p_c(v_i | v) = exp(g_{v_i}^T g_v) / Σ_{v_j ∈ N_c(v)} exp(g_{v_j}^T g_v),
where N_c(v) is the set of neighbors of v in T_c.

Graph softmax is then defined along the unique path from v_c to v in tree T_c: P_{v_c → v} = (v_{r_0}, v_{r_1}, ..., v_{r_m}), where v_{r_0} = v_c and v_{r_m} = v.

Graph Softmax (3/5) Design

Graph softmax, given the unique path P_{v_c → v} = (v_{r_0}, v_{r_1}, ..., v_{r_m}) from v_c to v in tree T_c (with v_{r_0} = v_c and v_{r_m} = v):
G(v | v_c; θ_G) = ( ∏_{j=1}^{m} p_c(v_{r_j} | v_{r_{j−1}}) ) · p_c(v_{r_{m−1}} | v_{r_m})
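A minimal sketch (ours) of computing this graph softmax for one pair (v_c, v), assuming the BFS tree T_c is represented by a parent map and relevance probabilities are softmaxes over tree neighbors; names such as `tree_neighbors` and `g_emb` are placeholders.

```python
import numpy as np
from collections import deque

def bfs_tree(adj, v_c):
    """Build the BFS tree rooted at v_c; returns a parent map {vertex: parent}."""
    parent = {v_c: None}
    queue = deque([v_c])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in parent:
                parent[w] = u
                queue.append(w)
    return parent

def tree_neighbors(parent, children, v):
    """Neighbors of v in the BFS tree: its children plus its parent (if any)."""
    nbrs = list(children.get(v, []))
    if parent[v] is not None:
        nbrs.append(parent[v])
    return nbrs

def relevance_prob(g_emb, parent, children, v_i, v):
    """p_c(v_i | v): softmax of g_{v_i} . g_v over the tree neighbors of v."""
    nbrs = tree_neighbors(parent, children, v)
    scores = np.array([g_emb[w] @ g_emb[v] for w in nbrs])
    scores -= scores.max()
    probs = np.exp(scores) / np.exp(scores).sum()
    return probs[nbrs.index(v_i)]

def graph_softmax(g_emb, adj, v_c, v):
    """G(v | v_c): product of relevance probabilities along the tree path, times one back-step."""
    parent = bfs_tree(adj, v_c)
    children = {}
    for node, par in parent.items():
        if par is not None:
            children.setdefault(par, []).append(node)
    # recover the unique path v_c -> v in the BFS tree
    path, cur = [], v
    while cur is not None:
        path.append(cur)
        cur = parent[cur]
    path.reverse()                                    # (v_r0 = v_c, ..., v_rm = v)
    prob = 1.0
    for prev, nxt in zip(path[:-1], path[1:]):
        prob *= relevance_prob(g_emb, parent, children, nxt, prev)
    prob *= relevance_prob(g_emb, parent, children, path[-2], v)   # p_c(v_{r_{m-1}} | v_rm)
    return prob
```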

Graph Softmax (4/5) Properties

- Normalized: Σ_{v ≠ v_c} G(v|v_c; θ_G) = 1 in graph softmax.
- Graph-structure-aware: G(v|v_c; θ_G) decreases exponentially as the shortest distance between v and v_c in the original graph G increases.
- Computationally efficient: the calculation of G(v|v_c; θ_G) depends on O(d log V) vertices, where d is the average vertex degree and V is the number of vertices in graph G.

Graph Softmax (5/5) Generating Strategy

[Figure: online generating strategy of the generator on the BFS tree T_c]
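The slide itself only shows a figure; as a hedged illustration, here is a sketch of the online generating strategy described in the GraphGAN paper: a random walk on the BFS tree T_c driven by the relevance probabilities p_c, which returns the current vertex as the sample the moment the walk turns back to the vertex it just came from. It reuses the placeholder helpers from the graph-softmax sketch above.

```python
import numpy as np

def generate_vertex(g_emb, adj, v_c, rng=np.random.default_rng()):
    """Sample v ~ G(.|v_c) by a random walk on the BFS tree T_c (sketch based on the paper)."""
    parent = bfs_tree(adj, v_c)
    children = {}
    for node, par in parent.items():
        if par is not None:
            children.setdefault(par, []).append(node)
    prev, cur = None, v_c
    while True:
        nbrs = tree_neighbors(parent, children, cur)
        if not nbrs:                 # isolated root: nothing else to visit
            return cur
        scores = np.array([g_emb[w] @ g_emb[cur] for w in nbrs])
        probs = np.exp(scores - scores.max())
        probs /= probs.sum()
        nxt = nbrs[rng.choice(len(nbrs), p=probs)]
        if nxt == prev:              # the walk turns back: cur is the generated vertex
            return cur
        prev, cur = cur, nxt
```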

GraphGAN Algorithm

[Figure: pseudo-code of the full GraphGAN training algorithm]
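Since the algorithm box is only a figure here, the following is a rough sketch (ours, under the assumptions of the earlier snippets) of how the pieces fit together: in each iteration the generator samples vertices for every v_c and is nudged by the policy-gradient signal, then the discriminator is updated on true neighbors versus generated vertices. All helper names are the placeholders introduced above; the G-step uses a simplified surrogate for ∇ log G rather than the exact graph-softmax gradient.

```python
import numpy as np

def train_graphgan(adj, num_vertices, k=20, n_iters=10, lr=0.01, rng=np.random.default_rng()):
    """Alternating training of G and D (illustrative sketch, not the paper's exact schedule)."""
    g_emb = rng.normal(scale=0.1, size=(num_vertices, k))   # generator embeddings
    d_emb = rng.normal(scale=0.1, size=(num_vertices, k))   # discriminator embeddings
    for _ in range(n_iters):
        # G-step: sample a fake neighbor per vertex and apply a policy-gradient-style update
        for v_c in range(num_vertices):
            if not adj[v_c]:
                continue
            v_fake = generate_vertex(g_emb, adj, v_c, rng)
            reward = np.log(1.0 - sigmoid(d_emb[v_fake] @ d_emb[v_c]) + 1e-10)
            # crude surrogate for grad log G: gradient of the unnormalized score g_{v_fake}.g_{v_c}
            grad_fake = reward * g_emb[v_c]
            grad_vc = reward * g_emb[v_fake]
            g_emb[v_fake] -= lr * grad_fake
            g_emb[v_c] -= lr * grad_vc
        # D-step: true neighbor vs. generated vertex for each v_c
        for v_c in range(num_vertices):
            if not adj[v_c]:
                continue
            v_true = rng.choice(adj[v_c])
            v_fake = generate_vertex(g_emb, adj, v_c, rng)
            update_discriminator(d_emb, v_c, v_true, v_fake, lr)
    return g_emb, d_emb
```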

Experiments (1/3) Datasets

Datasets:
- arXiv-AstroPh: 18,772 vertices and 198,110 edges
- arXiv-GrQc: 5,242 vertices and 14,496 edges
- BlogCatalog: 10,312 vertices, 333,982 edges, and 39 labels
- Wikipedia: 4,777 vertices, 184,812 edges, and 40 labels
- MovieLens-1M: 6,040 users and 3,706 movies

Baselines:
- DeepWalk (KDD 2014)
- LINE (WWW 2015)
- node2vec (KDD 2016)
- struc2vec (KDD 2017)

Experiments (2/3) Link Prediction

[Figures: learning curves and link-prediction results]

Experiments (3/3) Node Classification and Recommendation

[Figures: node-classification and recommendation results]

Summary

We propose GraphGAN, a novel framework that unifies generative and discriminative thinking for graph representation learning.
- The generator G(v|v_c) tries to fit p_true(v|v_c) as closely as possible.
- The discriminator D(v, v_c) tries to tell whether an edge exists between v and v_c.
- G and D act as two players in a minimax game: G tries to produce the most indistinguishable fake vertices under the guidance provided by D, while D tries to draw a clear line between the ground truth and the counterfeits to avoid being fooled by G.

We propose graph softmax as the implementation of G.
- Graph softmax overcomes the limitations of softmax and hierarchical softmax.
- Graph softmax satisfies the properties of normalization, graph-structure awareness, and computational efficiency.

Q & A

Thanks! Visit https://hwwang55.github.io/files/2018-aaai-graphgan.pdf for the full paper.