# dgl.DGLGraph.push

DGLGraph.push(u, message_func='default', reduce_func='default', apply_node_func='default', inplace=False)

Send messages from the given node(s) to their successors and update the successors.

Optionally, apply a function to update the node features after the messages are received.

Parameters:

- **u** (int, iterable of int, or tensor) – The node(s) to push messages out from.
- **message_func** (callable, optional) – Message function on the edges. The function should be an Edge UDF.
- **reduce_func** (callable, optional) – Reduce function on the nodes. The function should be a Node UDF.
- **apply_node_func** (callable, optional) – Apply function on the nodes. The function should be a Node UDF.
- **inplace** (bool, optional) – If True, the update is done in place, but autograd will break.

Examples

Create a graph for demo.

Note

Here we use pytorch syntax for demo. The general idea applies to other frameworks with minor syntax change (e.g. replace torch.tensor with mxnet.ndarray).

>>> import dgl
>>> import torch as th
>>> g = dgl.DGLGraph()
>>> g.add_nodes(3)
>>> g.ndata['x'] = th.tensor([[1.], [2.], [3.]])


Use the built-in message function copy_src() for copying node features as the message.

>>> m_func = dgl.function.copy_src('x', 'm')
>>> g.register_message_func(m_func)


Use the built-in reduce function sum(), which sums the messages received and replaces the old node features with the result.

>>> m_reduce_func = dgl.function.sum('m', 'x')
>>> g.register_reduce_func(m_reduce_func)


As no edges exist, nothing happens.

>>> g.push(g.nodes())
>>> g.ndata['x']
tensor([[1.],
        [2.],
        [3.]])


Add edges 0 -> 1, 1 -> 2. Send messages from node 1 and update.

>>> g.add_edges([0, 1], [1, 2])
>>> g.push(1)
>>> g.ndata['x']
tensor([[1.],
        [2.],
        [2.]])


The feature of node 2 changes, but the feature of node 1 remains the same because we did not push() from node 0.
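The flow above can be sketched in plain Python without DGL, to make the push semantics concrete. Here `push` is a hypothetical stand-in (not part of the DGL API) that mimics the copy_src message function and the sum reduce function, with a dict of successor lists standing in for the graph:

```python
def push(u, succ, feats):
    """Sketch of DGLGraph.push with copy_src + sum semantics.

    u     -- nodes that push messages out
    succ  -- dict mapping each node to its list of successors
    feats -- list of scalar node features, mutated in place
    """
    # copy_src: each pushed node sends its own feature as the message
    messages = {}
    for src in u:
        for dst in succ.get(src, []):
            messages.setdefault(dst, []).append(feats[src])
    # sum reduce: each receiving node replaces its feature
    # with the sum of the messages it received
    for dst, msgs in messages.items():
        feats[dst] = sum(msgs)
    return feats

succ = {0: [1], 1: [2]}        # edges 0 -> 1, 1 -> 2
feats = [1.0, 2.0, 3.0]
push([1], succ, feats)         # only node 1 pushes
# feats is now [1.0, 2.0, 2.0]: node 2 received node 1's feature,
# node 1 is untouched because node 0 did not push
```

Nodes that receive no messages keep their old features, which is why pushing on a graph with no edges is a no-op.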