Graph Neural Network: learning algorithm and applications. Shujia Zhang

What is Deep Learning?

Artificial Intelligence: enable computers to mimic human behaviour
Machine Learning: subset of AI to make machines learn from data
Deep Learning: subset of ML algorithms using complex artificial neural networks

Deep Learning: Convolutional Neural Network (CNN), Graph Neural Network (GNN), Graph of Graphs (GoGs). Learning algorithm and applications

Data Representations
Numeric (made of numbers): plain vectors, sequences, trees, graphs
Categorical (made of words), encoded as one-hot vectors: Female = [1, 0], Male = [0, 1]
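The one-hot encoding of the categorical attribute above can be sketched as follows (a minimal sketch; the `one_hot` helper is illustrative, not from the slides):

```python
import numpy as np

categories = ["Female", "Male"]

def one_hot(value, categories):
    """Return a one-hot vector for a categorical value."""
    vec = np.zeros(len(categories))
    vec[categories.index(value)] = 1.0
    return vec

print(one_hot("Female", categories))  # [1. 0.]
print(one_hot("Male", categories))    # [0. 1.]
```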

Data Representations: plain vectors, sequences (e.g. a user's event sequence), trees, graphs

Graph of Graphs
The nature of a node in a graph can be more complicated than a plain numeric vector: the node is itself described by another graph
Theoretically, such a structure can be nested to infinite depth
Both the nodes and the edges in the graph can carry labels
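The nesting described above can be sketched as a data structure (a minimal sketch; the `Graph`, `Node`, and `depth` names are assumptions for illustration, not the paper's code):

```python
class Graph:
    def __init__(self, label=None):
        self.label = label
        self.nodes = []          # list of Node
        self.edges = []          # list of (Node, Node, edge_label)

    def add_node(self, content):
        node = Node(content)
        self.nodes.append(node)
        return node

    def add_edge(self, a, b, label=None):
        self.edges.append((a, b, label))

class Node:
    def __init__(self, content):
        # `content` is a plain feature vector at the deepest level,
        # or another Graph at higher levels.
        self.content = content

def depth(g):
    """Nesting depth of a Graph-of-Graphs."""
    sub = [depth(n.content) for n in g.nodes if isinstance(n.content, Graph)]
    return 1 + max(sub, default=0)

# A two-level GoG: a document graph whose node is itself a word graph.
word_graph = Graph("section")
word_graph.add_node([0.1, 0.2])
doc = Graph("document")
doc.add_node(word_graph)
print(depth(doc))  # 2
```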

Web Spam Detection
One-level GoGs
Level 1: web documents connected via hyperlinks; documents are described by content-based and link-based feature vectors

Web Document Categorisation
Three-level GoGs
Level 1: web documents connected via hyperlinks
Level 2: sections of text within XML formatting elements
Level 3: word tokens in each section of text

Human Action Recognition

Step 1: interest points, dense trajectories (HOG, HOF, MBH descriptors)
Step 2: Delaunay triangulations, giving a sequence of triangle graphs

Human Action Recognition
Two-level GoGs
Level 1: sequence of frames
Level 2: Delaunay triangulation of each frame

How to encode GoGs structure using neural networks?

Graph Neural Network
A graph is processed node by node in a random order
For each node, the sum of the state vectors of its neighbouring nodes is computed and concatenated with the node's own label vector; a hidden layer then maps this input to the node's state, from which the outputs are produced
The algorithm guarantees convergence of the node states to a stable and unique solution

Deep Graph Neural Network
An encoding network encodes the Graph-of-Graphs: the sum of neighbouring states and the node label are fed through a hidden layer to produce the states
The architecture of the network depends on the depth and size of the graphs
An output network maps the states (again from the sum of states and the label, through a hidden layer) to the output; feedforward and backpropagation run through both the graph structure and the network structure

Iterative Feed Forward
Initiate states for all nodes
Compute the sum of states of all neighbouring nodes
Feed forward through the hidden layer to the state layer
Iterate the process until the states of all nodes are stable

Iterative Back Propagation
Back-propagate the error from the topmost level down through the levels until the deepest level of the graph is reached, updating the parameters of the networks in the process
Iterative BP iterates the same number of times as the feed-forward passes required to reach stable states

Deep Graph Neural Networks
Compute the states from the deepest level k of the GoGs up through levels k-1, k-2, ..., 0
Compute the outputs at the topmost level 0 and form an error function
Back-propagate the error and compute the gradients down to the deepest level k
Update the network parameters and repeat the process
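The order of computation above can be sketched on a toy two-level model (a hedged sketch: the level encoders here are plain linear maps standing in for the per-level networks, and all names and sizes are assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(1)
W = [rng.normal(size=(2, 2)) * 0.5 for _ in range(2)]  # one toy encoder per level
x = rng.normal(size=2)           # deepest-level (level 1) features
y = np.array([1.0, 0.0])         # target at the topmost level 0
lr = 0.1

loss0 = float(((W[0] @ (W[1] @ x) - y) ** 2).sum())

for step in range(500):
    # 1) Forward from the deepest level (k = 1) up to the top level 0.
    h1 = W[1] @ x
    out = W[0] @ h1
    # 2) Form the error at the topmost level.
    err = out - y
    # 3) Back-propagate through the levels down to the deepest one.
    gW0 = np.outer(err, h1)
    gh1 = W[0].T @ err
    gW1 = np.outer(gh1, x)
    # 4) Update the network parameters and repeat.
    W[0] -= lr * gW0
    W[1] -= lr * gW1

loss = float(((W[0] @ (W[1] @ x) - y) ** 2).sum())
```

The point of the sketch is the ordering (states upward, error and gradients downward, then the update), not the toy linear encoders themselves.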

What are the applications of GNN and GoGs?

Web Spam Detection
One-level GoGs capture both the properties of individual pages and the relationships among pages
Content-based and link-based numeric feature vectors
The GNN model achieved an AUC of 0.951, outperforming the other entries in the competition

Human Action Recognition
GoGs and GNN provide a deep-learning-based framework for the human action recognition problem
Different visual descriptors can be arranged in a joint spatial and temporal representation

Conclusion & Discussion
The significance of the GoGs representation is that features and structures from different depths of the problem domain are described without abstracting away the intrinsic connections among them, and GNN can encode all the elements of a GoGs as a whole
A remaining problem with this approach is that it is not easy to determine a good architecture: how many layers are required? how many neurons in each layer?
The computational cost can be high, depending on the depth of the GoGs and the structure of the GNN

Q&A