
NS-CUK Joint Journal Club: V.T.Hoang, Review on "Breaking the Limit of Graph Neural Networks by Improving the Assortativity of Graphs with Local Mixing Patterns", KDD 2021

30. Mar 2023

  1. Thuy Hoang Van, PhD student Network Science Lab E-mail: hoangvanthuy90@gmail.com Susheel Suresh, Vinith Budde, Jennifer Neville, Pan Li, Jianzhu Ma: Breaking the Limit of Graph Neural Networks by Improving the Assortativity of Graphs with Local Mixing Patterns. KDD 2021: 1541-1551
  2. 1 A lot of real-world data does not “live” on grids Social media
  3. 2 A lot of real-world data does not “live” on grids Biological network
  4. 3 A lot of real-world data does not “live” on grids Transportation system
  5. 4 A lot of real-world data does not “live” on grids Computer network
  6. 5 A lot of real-world data does not “live” on grids Computer program
  7. 6 Graph representation learning The goal: to generate graph representation vectors that capture the structure and features of large graphs
  8. 7 Graph representation learning: Downstream tasks
  9. 8 Graph representation learning: Constraints Proximity
  10. 9 Graph representation learning: Constraints Structural identity
  11. 10 Graph neural networks (GNNs)
  12. 11 Neural Networks on Graph Data Main Idea: Pass messages between pairs of nodes and aggregate Alternative Interpretation: Pass messages between nodes to refine node (and possibly edge) representations
  13. 12 Message Passing Neural Networks (MPNN) • Aggregate messages from neighbouring nodes • Update node information • Where: • e_vu are the features associated with edge (v, u) • M^(k-1) is a message function (e.g. an MLP) computing the message from a neighbour • U^(k) is a node update function (e.g. an MLP) combining messages and local information
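The two bullet points above correspond to the standard MPNN formulation (Gilmer et al.); the slide's own equations were images, so this is a reconstruction using the slide's symbols (e_vu, M^(k-1), U^(k)):

```latex
m_v^{(k)} = \sum_{u \in \mathcal{N}(v)} M^{(k-1)}\!\left(h_v^{(k-1)},\, h_u^{(k-1)},\, e_{vu}\right)
\qquad
h_v^{(k)} = U^{(k)}\!\left(h_v^{(k-1)},\, m_v^{(k)}\right)
```

Here h_v^(k) is the representation of node v after k rounds and N(v) is its neighbourhood.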
  14. 13 Message Passing Neural Networks (MPNN)
  15. 14 Neural Networks on Graph Data: Problems  How far do the messages come from?  The quality of the messages  Trade-offs between graph structure and node features
  16. 15 The problems of GNNs  Disassortative graphs:
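The paper's motivation centres on disassortative graphs, where connected nodes tend to have dissimilar properties. As a minimal illustration, global degree assortativity is the Pearson correlation between the degrees at either end of each edge (the paper itself goes further and works with *local* mixing patterns; `degree_assortativity` is a name introduced here for this sketch):

```python
import numpy as np

def degree_assortativity(edges):
    """Pearson correlation of the degrees at either end of each undirected edge.

    Negative values indicate a disassortative graph (high-degree nodes
    preferentially connect to low-degree nodes), positive an assortative one.
    """
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    # Count each undirected edge in both directions so the measure is symmetric.
    x = np.array([deg[u] for u, v in edges] + [deg[v] for u, v in edges], float)
    y = np.array([deg[v] for u, v in edges] + [deg[u] for u, v in edges], float)
    return np.corrcoef(x, y)[0, 1]
```

For example, a star graph (one hub, three leaves) is perfectly disassortative and yields -1.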
  17. 16 The problems of GNNs Deep enough: over-smoothing problem Noise from neighbours
  18. 17 Breaking the Limit of Graph Neural Networks
  19. 18 Structural distance Two nodes: g, h The structural distance is defined recursively via the distance between their ordered degree sequences
  20. 19 Structural distance
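The recursion referred to above can be written out struc2vec-style. This is a sketch of one common definition, not necessarily the paper's exact formulation: with R_k(v) the set of nodes at hop distance k from v and s(·) the ordered degree sequence of a node set,

```latex
f_k(g, h) = f_{k-1}(g, h) + \mathrm{dist}\big(s(R_k(g)),\, s(R_k(h))\big),
\qquad f_{-1}(g, h) = 0,
```

where dist is a distance between the two ordered degree sequences (e.g. dynamic time warping). Small f_k means g and h play structurally similar roles out to their k-hop neighbourhoods.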
  21. 20 Message Passing on the Multi-relational Computation Graph The AGGREGATE function: The importance of node v to node u: The update function is defined as:
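The AGGREGATE / importance / update pipeline on this slide can be sketched in NumPy with a GAT-style attention step: neighbour importance is a learned score, normalised with softmax, and messages are combined as a weighted sum. This is an illustrative single-relation, single-head sketch, not the paper's exact multi-relational formulation; `attention_aggregate`, `W`, and `a` are names introduced here:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_aggregate(h, neighbors, W, a):
    """One GAT-style aggregation step.

    h: (N, d) node features; neighbors: dict node -> list of neighbour ids;
    W: (d, d') shared projection; a: (2*d',) attention parameter vector.
    Returns (N, d') updated representations.
    """
    z = h @ W  # project every node once
    h_new = np.zeros_like(z)
    for v, nbrs in neighbors.items():
        # Importance score of each neighbour u to node v.
        s = np.array([a @ np.concatenate([z[v], z[u]]) for u in nbrs])
        s = np.maximum(0.2 * s, s)          # LeakyReLU, as in GAT
        alpha = softmax(s)                  # normalised attention weights
        h_new[v] = sum(w * z[u] for w, u in zip(alpha, nbrs))
    return h_new
```

A node with a single neighbour receives that neighbour's projected features with weight 1; with several neighbours the weights trade off according to the learned scores.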
  22. 21 Experiments and results Node Classification
  23. 22 Node Classification Experiments and results