NS-CUK Joint Journal Club: V.T.Hoang, Review on "NAGphormer: A Tokenized Graph Transformer for Node Classification in Large Graphs", ICLR 2023
3. 3
Graph embedding learning
• The goal: map individual nodes to vector points in a latent space.
4. 4
GNN-based models
• GNN-based approaches provide faster, more practical training and state-of-the-art
results on benchmark datasets for downstream tasks such as node classification
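Not in the slides, but as a concrete reference point: a minimal NumPy sketch of the GCN-style propagation layer that such GNN-based models typically build on. The toy graph, feature sizes, and names (gcn_layer, A, X, W) are illustrative, not taken from the paper.

```python
import numpy as np

def gcn_layer(A, X, W):
    """One GCN-style propagation step: relu(D^-1/2 (A + I) D^-1/2 X W)."""
    A_hat = A + np.eye(A.shape[0])              # add self-loops
    d = A_hat.sum(axis=1)                       # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))      # D^{-1/2}
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt    # symmetrically normalized adjacency
    return np.maximum(A_norm @ X @ W, 0.0)      # propagate, transform, ReLU

# Toy 4-node path graph, 3-dim input features, 2 output channels.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.randn(4, 3)
W = np.random.randn(3, 2)
H = gcn_layer(A, X, W)   # node representations usable for node classification
```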
6. 6
Problems
Most of the existing GNN architectures have two fundamental weaknesses that
restrict their learning ability on general graph-structured data:
Over-smoothing
GNNs cannot be made deep enough: there is a trade-off between feature and
structural information (a toy sketch follows this list)
Noise from neighbours
GNNs seem to be tailor-made for homophilic (assortative) graphs
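A toy NumPy sketch of the over-smoothing problem flagged above: repeatedly averaging over neighbourhoods, which is effectively what deep stacks of message-passing layers do, drives all node representations towards the same vector. The graph and feature dimensions below are arbitrary, purely for illustration.

```python
import numpy as np

# Small connected graph with random node features.
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
X = np.random.randn(5, 4)

A_hat = A + np.eye(5)                              # self-loops keep the walk aperiodic
P = np.diag(1.0 / A_hat.sum(axis=1)) @ A_hat       # random-walk normalized adjacency

for depth in (1, 2, 4, 8, 16, 32):
    H = X
    for _ in range(depth):
        H = P @ H                                  # pure neighbourhood averaging, no transform
    # The mean pairwise distance between node representations shrinks towards 0
    # as depth grows: the rows of H become indistinguishable (over-smoothing).
    dists = np.linalg.norm(H[:, None, :] - H[None, :, :], axis=-1)
    print(f"depth {depth:2d}: mean pairwise distance {dists.mean():.4f}")
```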
7. 7
Problems
Existing graph transformers:
Treat nodes as independent tokens
Construct a single sequence of all node tokens to train the model
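A back-of-the-envelope sketch of why a single sequence of all node tokens does not scale: dense self-attention materializes an N x N score matrix per head. The node counts below are purely illustrative.

```python
def full_attention_memory_gb(num_nodes, bytes_per_entry=4):
    """Memory (GB) of one dense N x N attention score matrix in fp32."""
    return num_nodes ** 2 * bytes_per_entry / 1024 ** 3

for n in (10_000, 100_000, 1_000_000):
    print(f"{n:>9} node tokens -> {full_attention_memory_gb(n):,.1f} GB per attention matrix")
```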
8. 8
Contributions
Training such models on large graphs requires huge GPU resources
NAGphormer:
Hop2Token
Attention-based readout function
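A minimal NumPy sketch of the Hop2Token idea: each node gets a short sequence of K+1 tokens, where token k aggregates its k-hop neighbourhood via powers of the normalized adjacency matrix applied to the features. Function and variable names are illustrative, and the GCN-style normalization used here is an assumption of this sketch.

```python
import numpy as np

def hop2token(A, X, K):
    """Build a (K+1)-token sequence per node from its 0..K-hop neighbourhoods."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt

    tokens = [X]                         # hop 0: the node's own features
    H = X
    for _ in range(K):
        H = A_norm @ H                   # aggregate one more hop
        tokens.append(H)
    # Shape (N, K+1, d): a short, fixed-length sequence per node, so the
    # transformer can be trained on mini-batches of nodes instead of one
    # sequence containing every node in the graph.
    return np.stack(tokens, axis=1)

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
X = np.random.randn(3, 4)
seqs = hop2token(A, X, K=3)              # shape (3, 4, 4)
```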
12. 12
IMPLEMENTATION DETAILS
Structural encoding
Besides the attribute information of nodes, their structural information is also
a crucial feature for graph mining tasks
NAGphormer uses the eigenvectors of the graph's Laplacian matrix to capture the
structural information of nodes
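A hedged sketch of Laplacian-eigenvector structural encoding: eigenvectors of the normalized graph Laplacian serve as extra per-node features alongside the attributes. Which Laplacian variant is used and how many eigenvectors are kept are assumptions of this sketch, not details confirmed by the slides.

```python
import numpy as np

def laplacian_positional_encoding(A, k):
    """k eigenvectors of the normalized Laplacian (smallest non-trivial eigenvalues)."""
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    L = np.eye(A.shape[0]) - D_inv_sqrt @ A @ D_inv_sqrt    # normalized Laplacian
    eigvals, eigvecs = np.linalg.eigh(L)                     # eigenvalues ascending
    return eigvecs[:, 1:k + 1]                               # skip the trivial constant mode

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.randn(4, 5)                  # node attribute features
pe = laplacian_positional_encoding(A, k=2)
X_aug = np.concatenate([X, pe], axis=1)    # attributes + structural encoding
```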
18. 18
Pros & Cons
Pros
What Hop2Token can do:
Capture global information (up to K hops)
Handle isomorphic subgraphs
Attention layer:
Learn more informative node representations from the multi-hop neighborhoods
(see the readout sketch after this list)
Cons:
Expressiveness: at most that of the 1-dimensional Weisfeiler-Lehman (1-WL) test
Fails to capture graph structure
Noise from the neighbourhood
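As referenced in the Pros list: a minimal sketch of an attention-based readout over a node's transformed hop tokens, in the spirit of the readout function listed in the contributions. The exact scoring parametrization (the vector Wa, concatenating the node token with each hop token) is an assumption for illustration, not necessarily the paper's formulation.

```python
import numpy as np

def attention_readout(Z, Wa):
    """Combine a node's (K+1) transformed hop tokens into one representation.
    Z  : (K+1, d) tokens for a single node (token 0 = the node itself).
    Wa : (2*d,) scoring vector; this parametrization is illustrative."""
    z0, hops = Z[0], Z[1:]                                    # node token, hop tokens
    scores = np.concatenate(
        [np.tile(z0, (hops.shape[0], 1)), hops], axis=1) @ Wa
    alpha = np.exp(scores - scores.max())
    alpha = alpha / alpha.sum()                               # softmax over hops
    return z0 + (alpha[:, None] * hops).sum(axis=0)           # weighted combination

Z = np.random.randn(5, 8)       # K = 4 hop tokens plus the node token, hidden size 8
Wa = np.random.randn(16)
node_repr = attention_readout(Z, Wa)   # fed to the node classification head
```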