dgl.softmax_nodes
dgl.softmax_nodes(graph, feat, *, ntype=None)

Perform graph-wise softmax on the node features.
For each node \(v\in\mathcal{V}\) and its feature \(x_v\), calculate its normalized feature as follows:
\[z_v = \frac{\exp(x_v)}{\sum_{u\in\mathcal{V}}\exp(x_u)}\]

If the graph is a batch of multiple graphs, the softmax is computed independently within each graph. The result tensor has the same shape as the original node feature.
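The per-graph behavior on a batched graph can be sketched without DGL: the hypothetical helper `segment_softmax` below (not part of the DGL API) applies the softmax formula above separately over each graph's slice of the concatenated node-feature tensor, using each graph's node count as the segment size.

```python
import numpy as np

def segment_softmax(feat, seg_sizes):
    """Softmax over each contiguous segment of a 1-D feature array.

    feat      -- node features of a batched graph, concatenated in order
    seg_sizes -- number of nodes in each component graph
    """
    out = np.empty_like(feat, dtype=float)
    start = 0
    for n in seg_sizes:
        seg = feat[start:start + n]
        e = np.exp(seg - seg.max())   # subtract the max for numerical stability
        out[start:start + n] = e / e.sum()
        start += n
    return out

# Two graphs with 2 and 3 nodes, all features equal to 1,
# mirroring the batched example in this page.
print(segment_softmax(np.array([1., 1., 1., 1., 1.]), [2, 3]))
```

Subtracting the segment maximum before exponentiating leaves the result unchanged (the factor cancels in the ratio) but avoids overflow for large feature values.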
- Parameters
  - graph (DGLGraph) – The graph.
  - feat (str) – The node feature name.
  - ntype (str, optional) – The node type name. Can be omitted if there is only one node type in the graph.
- Returns
Result tensor.
- Return type
Tensor
Examples
>>> import dgl
>>> import torch as th
Create two DGLGraph objects and initialize their node features.

>>> g1 = dgl.graph(([0, 1], [1, 0]))  # Graph 1
>>> g1.ndata['h'] = th.tensor([1., 1.])
>>> g2 = dgl.graph(([0, 1], [1, 2]))  # Graph 2
>>> g2.ndata['h'] = th.tensor([1., 1., 1.])
Softmax over one graph:
>>> dgl.softmax_nodes(g1, 'h')
tensor([0.5000, 0.5000])
Softmax over a batched graph:
>>> bg = dgl.batch([g1, g2])
>>> dgl.softmax_nodes(bg, 'h')
tensor([0.5000, 0.5000, 0.3333, 0.3333, 0.3333])
See also

dgl.softmax_edges