dgl.ops.edge_softmax

dgl.ops.edge_softmax(graph, logits, eids='__ALL__', norm_by='dst')[source]

Compute edge softmax. For a node \(i\), edge softmax is an operation that computes

\[a_{ij} = \frac{\exp(z_{ij})}{\sum_{k\in\mathcal{N}(i)}\exp(z_{ik})}\]

where \(z_{ij}\) is the signal on edge \(j\rightarrow i\), also called a logit in the context of softmax, and \(\mathcal{N}(i)\) is the set of nodes that have an edge to \(i\).

By default, edge softmax is normalized by destination nodes (i.e., \(ij\) ranges over the incoming edges of \(i\) in the formula above). We also support edge softmax normalized by source nodes (i.e., \(ij\) ranges over the outgoing edges of \(i\)). The former corresponds to the softmax in GAT and Transformer, and the latter corresponds to the softmax in Capsule networks. Graph Attention Networks are a canonical example: their attention weights are computed with exactly this edge softmax operation.
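To make the formula concrete, here is a minimal pure-PyTorch sketch of edge softmax over an edge list. This is illustrative only, not DGL's fused implementation; the function name `edge_softmax_ref` and the `(src, dst)` edge-list representation are assumptions made for the example.

```python
import torch

def edge_softmax_ref(src, dst, logits, norm_by="dst", num_nodes=None):
    """Illustrative edge softmax over 1-D per-edge logits.

    Plain-PyTorch sketch, not DGL's fused kernel. Real implementations
    also subtract a per-node max before exponentiating, for numerical
    stability; that step is omitted here for clarity.
    """
    # Group edges by the endpoint we normalize over.
    idx = dst if norm_by == "dst" else src
    if num_nodes is None:
        num_nodes = int(idx.max()) + 1
    z = torch.exp(logits)                                 # exp(z_ij) per edge
    denom = torch.zeros(num_nodes).index_add_(0, idx, z)  # per-node sums
    return z / denom[idx]                                 # normalize per group
```

On the 3-node example graph used below, all-ones logits normalized by destination give `[1.0, 0.5, 0.3333, 0.5, 0.3333, 0.3333]`, matching the output of `dgl.ops.edge_softmax`.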

Parameters
  • graph (DGLGraph) – The graph to perform edge softmax on.

  • logits (torch.Tensor) – The input edge feature.

  • eids (torch.Tensor or ALL, optional) – A tensor of edge IDs over which to apply edge softmax. If ALL, apply edge softmax to all edges in the graph. Default: ALL.

  • norm_by (str, either 'src' or 'dst') – Whether to normalize by source nodes or destination nodes. Default: dst.

Returns

Softmax value.

Return type

Tensor

Notes

  • Input shape: \((E, *, 1)\) where * means any number of additional dimensions, and \(E\) equals the length of eids. If eids is ALL, \(E\) equals the number of edges in the graph.

  • Return shape: \((E, *, 1)\)
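The \((E, *, 1)\) shape means per-edge logits may carry extra dimensions, e.g. one score per attention head. The following plain-PyTorch sketch (illustrative, not DGL's implementation) shows that the normalization applies independently along those extra dimensions:

```python
import torch

# Toy 3-node graph with 6 edges and H = 4 hypothetical attention heads.
src = torch.tensor([0, 0, 0, 1, 1, 2])
dst = torch.tensor([0, 1, 2, 1, 2, 2])
N, H = 3, 4
logits = torch.randn(6, H, 1)                        # shape (E, *, 1)

z = torch.exp(logits)
denom = torch.zeros(N, H, 1).index_add_(0, dst, z)   # per-dst-node sums
attn = z / denom[dst]                                # shape (E, H, 1)

# Per head, the scores on each node's incoming edges sum to 1.
sums = torch.zeros(N, H, 1).index_add_(0, dst, attn)
```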

Examples

The following example uses PyTorch backend.

>>> from dgl.ops import edge_softmax
>>> import dgl
>>> import torch as th

Create a DGLGraph object g and initialize its edge features.

>>> g = dgl.graph((th.tensor([0, 0, 0, 1, 1, 2]), th.tensor([0, 1, 2, 1, 2, 2])))
>>> edata = th.ones(6, 1).float()
>>> edata
    tensor([[1.],
            [1.],
            [1.],
            [1.],
            [1.],
            [1.]])

Apply edge softmax on g:

>>> edge_softmax(g, edata)
    tensor([[1.0000],
            [0.5000],
            [0.3333],
            [0.5000],
            [0.3333],
            [0.3333]])

Apply edge softmax on g normalized by source nodes:

>>> edge_softmax(g, edata, norm_by='src')
    tensor([[0.3333],
            [0.3333],
            [0.3333],
            [0.5000],
            [0.5000],
            [1.0000]])

Apply edge softmax on the first 4 edges of g:

>>> edge_softmax(g, edata[:4], th.tensor([0, 1, 2, 3]))
    tensor([[1.0000],
            [0.5000],
            [1.0000],
            [0.5000]])