Chapter 2: Message Passing
Message Passing Paradigm
Let \(x_v\in\mathbb{R}^{d_1}\) be the feature of node \(v\), and \(w_{e}\in\mathbb{R}^{d_2}\) be the feature of edge \(e = ({u}, {v})\). The message passing paradigm defines the following edge-wise and node-wise computations at step \(t+1\):

\[\text{Edge-wise: } \quad m_{e}^{(t+1)} = \phi\left(x_v^{(t)}, x_u^{(t)}, w_{e}^{(t)}\right), \quad (u, v, e) \in \mathcal{E}.\]

\[\text{Node-wise: } \quad x_v^{(t+1)} = \psi\left(x_v^{(t)}, \rho\left(\left\{ m_{e}^{(t+1)} : (u, v, e) \in \mathcal{E} \right\}\right)\right).\]
In the above equations, \(\phi\) is a message function defined on each edge to generate a message by combining the edge feature with the features of its incident nodes; \(\psi\) is an update function defined on each node to update the node feature by aggregating its incoming messages using the reduce function \(\rho\).
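For concreteness, below is a minimal sketch of this paradigm with DGL's built-in functions on a toy graph. The feature names 'h', 'w', 'h_agg', the random data, and the choice of \(\phi\) as element-wise multiplication and \(\rho\) as summation are illustrative assumptions, not fixed by the paradigm.

```python
import torch
import dgl
import dgl.function as fn

# Illustrative toy graph: 3 nodes, edges 0->1, 1->2, 2->0.
g = dgl.graph((torch.tensor([0, 1, 2]), torch.tensor([1, 2, 0])))
g.ndata['h'] = torch.randn(3, 4)   # node features x_v (d1 = 4)
g.edata['w'] = torch.randn(3, 4)   # edge features w_e (d2 = 4)

# Message function phi: multiply the source node feature with the edge feature.
# Reduce function rho: sum the incoming messages at each destination node.
g.update_all(fn.u_mul_e('h', 'w', 'm'),   # edge-wise message
             fn.sum('m', 'h_agg'))        # node-wise aggregation

print(g.ndata['h_agg'].shape)  # torch.Size([3, 4])
```

Here `fn.u_mul_e` plays the role of the message function \(\phi\) and `fn.sum` the reduce function \(\rho\); an update function \(\psi\) can be passed as the optional third argument of `update_all` or applied to the aggregated features afterwards.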
Roadmap
This chapter introduces DGL’s message passing APIs and how to use them efficiently on both nodes and edges. The final section explains how to implement message passing on heterogeneous graphs.