pointer_network [2018/09/08 11:45] admin
  
Current GNN methods are inherently flat and do not learn hierarchical representations of graphs—a limitation that is especially problematic for the task of graph classification, where the goal is to predict the label associated with an entire graph. Here we propose DIFFPOOL, a differentiable graph pooling module that can generate hierarchical representations of graphs and can be combined with various graph neural network architectures in an end-to-end fashion. DIFFPOOL learns a differentiable soft cluster assignment for nodes at each layer of a deep GNN, mapping nodes to a set of clusters, which then form the coarsened input for the next GNN layer.
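A minimal numpy sketch of the soft-assignment pooling step the abstract describes, under simplifying assumptions: the "GNN" at each layer is reduced to a single linear message-passing step `A @ X @ W`, and all weight matrices are random illustrative placeholders, not the paper's trained parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def diffpool(A, X, W_embed, W_assign):
    # Simplified one-step GNNs (illustrative, not the paper's exact architecture):
    Z = np.tanh(A @ X @ W_embed)           # embedding GNN: node features
    S = softmax(A @ X @ W_assign, axis=1)  # assignment GNN: soft clusters (rows sum to 1)
    X_coarse = S.T @ Z      # cluster features = assignment-weighted node embeddings
    A_coarse = S.T @ A @ S  # coarsened adjacency between clusters
    return A_coarse, X_coarse, S

rng = np.random.default_rng(0)
n, d, k = 6, 4, 2                      # 6 nodes pooled into 2 clusters
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.maximum(A, A.T)                 # symmetrize
np.fill_diagonal(A, 1.0)               # self-loops
X = rng.standard_normal((n, d))
A2, X2, S = diffpool(A, X,
                     rng.standard_normal((d, d)),
                     rng.standard_normal((d, k)))
print(A2.shape, X2.shape)              # coarsened graph: (2, 2) (2, 4)
```

Because `S` is a soft (differentiable) assignment rather than a hard clustering, gradients flow through the pooling step, which is what allows it to be stacked with GNN layers end-to-end.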
https://arxiv.org/abs/1809.01797 Narrating a Knowledge Base

We aim to automatically generate natural language narratives about an input structured knowledge base (KB). We build our generation framework based on a pointer network which can copy facts from the input KB, and add two attention mechanisms: (i) slot-aware attention to capture the association between a slot type and its corresponding slot value; and (ii) a new table position self-attention to capture the inter-dependencies among related slots.