NS-CUK Seminar: J.H.Lee, Review on "Hyperbolic graph convolutional neural networks," Advances in neural information processing systems 2019

  1. Joo-Ho Lee Network Science Lab Dept. of Artificial Intelligence The Catholic University of Korea E-mail: jooho414@gmail.com
  2. 1 ➢ Introduction • Limitations • Contributions • Background
  ➢ Method • Model description
  ➢ Experiment • Datasets • Baselines • Results
  ➢ Conclusion
  3. 2 1. Introduction Limitations of previous studies
  • Input node features are usually Euclidean, and it is not clear how to optimally use them as inputs to hyperbolic neural networks
  • It is not clear how to perform set aggregation, a key step in message passing, in hyperbolic space
  • One needs to choose hyperbolic spaces with the right curvature at every layer of the GCN
  4. 3 1. Introduction Contributions
  • Improved performance on graph-based tasks → Hyperbolic space is better suited for modeling the hierarchical structures that are common in many real-world graphs
  • Interpretability → HGCNs can learn hierarchical representations of graph-structured data that are more interpretable than those learned by Euclidean GCNs
  • Novelty → The paper introduces a hyperbolic attention-based aggregation scheme that captures the hierarchical structure of networks
  5. 4 1. Introduction Background
  • Hyperboloid manifold
  \mathbb{H}^{d,K} := \{x \in \mathbb{R}^{d+1} : \langle x, x \rangle_{\mathcal{L}} = -K,\; x_0 > 0\}
  \mathcal{T}_x \mathbb{H}^{d,K} := \{v \in \mathbb{R}^{d+1} : \langle v, x \rangle_{\mathcal{L}} = 0\}
  where K > 0 is the curvature parameter, \mathbb{H}^{d,K} is the d-dimensional hyperboloid manifold (with constant curvature -1/K), and \mathcal{T}_x \mathbb{H}^{d,K} is the (Euclidean) tangent space at x.
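The hyperboloid constraint above can be checked numerically. The following is a minimal NumPy sketch (not from the paper's code); the helper names `minkowski_inner` and `on_hyperboloid` are illustrative, and K = 1 is chosen for the example:

```python
import numpy as np

def minkowski_inner(x, y):
    """Minkowski (Lorentzian) inner product <x, y>_L = -x0*y0 + sum_i xi*yi."""
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def on_hyperboloid(x, K=1.0, tol=1e-9):
    """A point x in R^{d+1} lies on H^{d,K} iff <x, x>_L = -K and x0 > 0."""
    return abs(minkowski_inner(x, x) + K) < tol and x[0] > 0

# The hyperboloid origin o = (sqrt(K), 0, ..., 0) satisfies the constraint;
# a tangent vector at o is Minkowski-orthogonal to o.
K = 1.0
o = np.array([np.sqrt(K), 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])
print(on_hyperboloid(o, K))              # True
print(minkowski_inner(v, o) == 0.0)      # True: v lies in T_o H^{2,1}
```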
  6. 5 1. Introduction Background
  • Distance
  d^K_{\mathcal{L}}(x, y) = \sqrt{K}\,\operatorname{arcosh}\left(-\langle x, y \rangle_{\mathcal{L}} / K\right)
  • Exponential and logarithmic maps, for mapping between the tangent space and hyperbolic space:
  \exp^K_x(v) = \cosh\left(\frac{\|v\|_{\mathcal{L}}}{\sqrt{K}}\right) x + \sqrt{K}\,\sinh\left(\frac{\|v\|_{\mathcal{L}}}{\sqrt{K}}\right) \frac{v}{\|v\|_{\mathcal{L}}}
  \log^K_x(y) = d^K_{\mathcal{L}}(x, y)\, \frac{y + \frac{1}{K}\langle x, y \rangle_{\mathcal{L}}\, x}{\left\|y + \frac{1}{K}\langle x, y \rangle_{\mathcal{L}}\, x\right\|_{\mathcal{L}}}
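These three formulas can be sketched directly in NumPy and sanity-checked with the inverse property log ∘ exp = id. This is an illustrative sketch (function names are assumptions, not the paper's API), using K = 1 and a small numerical floor to avoid division by zero:

```python
import numpy as np

def minkowski_inner(x, y):
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def minkowski_norm(v):
    return np.sqrt(np.clip(minkowski_inner(v, v), 0.0, None))

def dist(x, y, K=1.0):
    """Geodesic distance d_L^K(x, y) = sqrt(K) * arcosh(-<x, y>_L / K)."""
    return np.sqrt(K) * np.arccosh(np.clip(-minkowski_inner(x, y) / K, 1.0, None))

def exp_map(x, v, K=1.0):
    """exp_x^K(v): move from x along tangent vector v onto the hyperboloid."""
    n = minkowski_norm(v)
    if n < 1e-12:
        return x
    return np.cosh(n / np.sqrt(K)) * x + np.sqrt(K) * np.sinh(n / np.sqrt(K)) * v / n

def log_map(x, y, K=1.0):
    """log_x^K(y): inverse of exp_x^K, mapping y back to the tangent space at x."""
    u = y + (1.0 / K) * minkowski_inner(x, y) * x
    return dist(x, y, K) * u / minkowski_norm(u)

K = 1.0
o = np.array([1.0, 0.0, 0.0])          # origin of H^{2,1}
v = np.array([0.0, 0.3, 0.4])          # tangent at o: <v, o>_L = 0, ||v||_L = 0.5
y = exp_map(o, v, K)
v_back = log_map(o, y, K)
print(np.allclose(v, v_back))          # True: log inverts exp
print(np.isclose(dist(o, y, K), 0.5))  # True: geodesic length equals ||v||_L
```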
  7. 6 2. Method Mapping from Euclidean to hyperbolic spaces
  x^{0,H} = \exp^K_o\big((0, x^{0,E})\big) = \left(\sqrt{K}\cosh\left(\frac{\|x^{0,E}\|_2}{\sqrt{K}}\right),\; \sqrt{K}\sinh\left(\frac{\|x^{0,E}\|_2}{\sqrt{K}}\right) \frac{x^{0,E}}{\|x^{0,E}\|_2}\right)
  where (0, x^{0,E}) is a point in the tangent space at the origin o of hyperbolic space.
  Feature transform in hyperbolic space (linear transforms):
  W \otimes^K x^H := \exp^K_o\big(W \log^K_o(x^H)\big)
  x^H \oplus^K b := \exp^K_{x^H}\big(P^K_{o \to x^H}(b)\big)
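The closed-form lift of a Euclidean feature onto the hyperboloid can be verified to land on the manifold. A minimal sketch, assuming K = 1 in the check (the function name `euclidean_to_hyperboloid` is mine, not the paper's):

```python
import numpy as np

def minkowski_inner(x, y):
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def euclidean_to_hyperboloid(x_e, K=1.0):
    """Lift a Euclidean feature x^E onto H^{d,K} via exp at the origin
    o = (sqrt(K), 0, ..., 0), using the closed form from the slide."""
    n = np.linalg.norm(x_e)
    if n < 1e-12:
        return np.concatenate(([np.sqrt(K)], np.zeros_like(x_e)))
    return np.concatenate((
        [np.sqrt(K) * np.cosh(n / np.sqrt(K))],
        np.sqrt(K) * np.sinh(n / np.sqrt(K)) * x_e / n,
    ))

x_e = np.array([0.3, 0.4])
x_h = euclidean_to_hyperboloid(x_e, K=1.0)
# cosh^2 - sinh^2 = 1 guarantees <x_h, x_h>_L = -K, i.e. the lift lands on H^{d,K}.
print(np.isclose(minkowski_inner(x_h, x_h), -1.0))  # True
```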
  8. 7 2. Method Neighborhood aggregation on the hyperboloid manifold
  w_{ij} = \operatorname{SOFTMAX}_{j \in \mathcal{N}(i)}\big(\operatorname{MLP}(W \log^K_o(x^H_i)\,\|\,W \log^K_o(x^H_j))\big)
  \operatorname{AGG}^K(x^H)_i = \exp^K_{x^H_i}\Big(\sum_{j \in \mathcal{N}(i)} w_{ij} \log^K_{x^H_i}(x^H_j)\Big)
  \sigma^{\otimes^{K_{l-1}, K_l}}(x^H) = \exp^{K_l}_o\big(\sigma(\log^{K_{l-1}}_o(x^H))\big)
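The aggregation step (a weighted mean in the tangent space at the center node, mapped back to the manifold) can be sketched as below. This is a simplified illustration with K = 1; the `weights` argument stands in for the attention scores w_ij, which the slide computes with an MLP over log-mapped features:

```python
import numpy as np

def minkowski_inner(x, y):
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def minkowski_norm(v):
    return np.sqrt(np.clip(minkowski_inner(v, v), 0.0, None))

def dist(x, y):
    return np.arccosh(np.clip(-minkowski_inner(x, y), 1.0, None))

def exp_map(x, v):
    n = minkowski_norm(v)
    if n < 1e-12:
        return x
    return np.cosh(n) * x + np.sinh(n) * v / n

def log_map(x, y):
    u = y + minkowski_inner(x, y) * x
    n = minkowski_norm(u)
    if n < 1e-12:
        return np.zeros_like(x)
    return dist(x, y) * u / n

def aggregate(x_i, neighbors, weights):
    """AGG^K(x^H)_i: weighted sum of neighbors in the tangent space at x_i,
    then mapped back onto the hyperboloid via exp."""
    tangent_sum = sum(w * log_map(x_i, x_j) for w, x_j in zip(weights, neighbors))
    return exp_map(x_i, tangent_sum)

o = np.array([1.0, 0.0, 0.0])
a = exp_map(o, np.array([0.0, 0.5, 0.0]))
b = exp_map(o, np.array([0.0, -0.5, 0.0]))
# Equal attention on two symmetric neighbors keeps the aggregate at x_i = o.
agg = aggregate(o, [a, b], [0.5, 0.5])
print(np.allclose(agg, o))  # True
```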
  9. 8 2. Method HGCN architecture
  h^{l,H}_i = \big(W^l \otimes^{K_{l-1}} x^{l-1,H}_i\big) \oplus^{K_{l-1}} b^l   (hyperbolic feature transform)
  y^{l,H}_i = \operatorname{AGG}^{K_{l-1}}\big(h^{l,H}\big)_i   (attention-based neighborhood aggregation)
  x^{l,H}_i = \sigma^{\otimes^{K_{l-1}, K_l}}\big(y^{l,H}_i\big)   (non-linear activation with different curvatures)
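The curvature-changing activation in the third step is the least familiar piece: the point is pulled back to the tangent space at the origin under the old curvature, the Euclidean non-linearity is applied there, and the result is pushed onto the hyperboloid with the new curvature. A minimal sketch, assuming σ = ReLU (the helper names are mine):

```python
import numpy as np

def minkowski_inner(x, y):
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def exp_o(v_spatial, K):
    """exp at the origin o = (sqrt(K), 0, ..., 0) applied to tangent vector (0, v_spatial)."""
    n = np.linalg.norm(v_spatial)
    if n < 1e-12:
        return np.concatenate(([np.sqrt(K)], np.zeros_like(v_spatial)))
    return np.concatenate((
        [np.sqrt(K) * np.cosh(n / np.sqrt(K))],
        np.sqrt(K) * np.sinh(n / np.sqrt(K)) * v_spatial / n,
    ))

def log_o(x_h, K):
    """log at the origin; returns the spatial part of the tangent vector (0-th entry is 0)."""
    spatial = x_h[1:]
    n = np.linalg.norm(spatial)
    if n < 1e-12:
        return np.zeros_like(spatial)
    return np.sqrt(K) * np.arccosh(np.clip(x_h[0] / np.sqrt(K), 1.0, None)) * spatial / n

def hyperbolic_relu(x_h, K_prev, K_next):
    """sigma^{(K_{l-1}, K_l)}: log at o under curvature -1/K_{l-1}, apply ReLU
    in the tangent space, exp at o under the next layer's curvature -1/K_l."""
    return exp_o(np.maximum(log_o(x_h, K_prev), 0.0), K_next)

x_h = exp_o(np.array([0.3, -0.4]), 1.0)         # a point on H^{2,1}
y = hyperbolic_relu(x_h, 1.0, 2.0)
print(np.isclose(minkowski_inner(y, y), -2.0))  # True: output lies on H^{2,2}
```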
  10. 9 3. Experiment Experimental Setup • Datasets
  1. Citation networks
  2. Disease propagation trees
  3. Protein-protein interaction (PPI) networks
  4. Flight networks
  11. 10 3. Experiment Experimental Setup • Baselines
  1. Euclidean embeddings (EUC)
  2. Poincaré embeddings (HYP)
  3. EUC-MIXED and HYP-MIXED
  4. GCN
  5. GraphSAGE (SAGE)
  6. Graph Attention Networks (GAT)
  7. Simplified Graph Convolution (SGC)
  8. MLP and its hyperbolic variant (HNN)
  12. 11 3. Experiment Link Prediction & Node Classification (LP, NC)
  13. 12 3. Experiment Trainable Curvature
  14. 13 3. Experiment ROC AUC for link prediction
  15. 14 3. Experiment Visualization (DISEASE-M dataset)
  • In HGCN, the center node pays more attention to its (grand)parent.
  • In contrast to Euclidean GAT, aggregation with attention in hyperbolic space allows the model to pay more attention to nodes high in the hierarchy → such attention is crucial for good performance on the disease datasets, because only sick parents propagate the disease to their children
  16. 15 4. Conclusions
  • HGCN is a novel architecture that learns hyperbolic embeddings using graph convolutional networks.
  • In HGCN, Euclidean input features are successively mapped to embeddings in hyperbolic spaces with trainable curvatures at every layer.
  • HGCN achieves new state-of-the-art results in learning embeddings for real-world hierarchical and scale-free graphs.
  17. 16 Q&A