# dgl.nn (PyTorch)

## Conv Layers

- **GraphConv**: Graph convolutional layer from *Semi-Supervised Classification with Graph Convolutional Networks*
- **EdgeWeightNorm**: Normalizes positive scalar edge weights on a graph, following the form in GCN
- **RelGraphConv**: Relational graph convolution layer from *Modeling Relational Data with Graph Convolutional Networks*
- **TAGConv**: Topology adaptive graph convolutional layer from *Topology Adaptive Graph Convolutional Networks*
- **GATConv**: Graph attention layer from *Graph Attention Networks*
- **GATv2Conv**: Graph attention layer from *How Attentive are Graph Attention Networks?*
- **EGATConv**: Graph attention layer that handles edge features, from the Rossmann-Toolbox (see supplementary data)
- **EdgeConv**: EdgeConv layer from *Dynamic Graph CNN for Learning on Point Clouds*
- **SAGEConv**: GraphSAGE layer from *Inductive Representation Learning on Large Graphs*
- **SGConv**: SGC layer from *Simplifying Graph Convolutional Networks*
- **APPNPConv**: Approximate personalized propagation of neural predictions layer from *Predict then Propagate: Graph Neural Networks meet Personalized PageRank*
- **GINConv**: Graph Isomorphism Network layer from *How Powerful are Graph Neural Networks?*
- **GINEConv**: Graph Isomorphism Network with edge features, introduced in *Strategies for Pre-training Graph Neural Networks*
- **GatedGraphConv**: Gated graph convolution layer from *Gated Graph Sequence Neural Networks*
- **GMMConv**: Gaussian mixture model convolution layer from *Geometric Deep Learning on Graphs and Manifolds using Mixture Model CNNs*
- **ChebConv**: Chebyshev spectral graph convolution layer from *Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering*
- **AGNNConv**: Attention-based graph neural network layer from *Attention-based Graph Neural Network for Semi-Supervised Learning*
- **NNConv**: Graph convolution layer from *Neural Message Passing for Quantum Chemistry*
- **AtomicConv**: Atomic convolution layer from *Atomic Convolutional Networks for Predicting Protein-Ligand Binding Affinity*
- **CFConv**: CFConv layer from *SchNet: A Continuous-filter Convolutional Neural Network for Modeling Quantum Interactions*
- **DotGatConv**: Dot-product version of the self-attention in *Graph Attention Networks*
- **TWIRLSConv**: Convolution with iteratively reweighted least squares from *Graph Neural Networks Inspired by Classical Iterative Algorithms*
- **TWIRLSUnfoldingAndAttention**: Unfolding and attention building blocks used by TWIRLSConv, from the same paper
- **GCN2Conv**: Graph convolutional network with initial residual and identity mapping (GCNII) from *Simple and Deep Graph Convolutional Networks*
- **HGTConv**: Heterogeneous graph transformer convolution from *Heterogeneous Graph Transformer*
- **GroupRevRes**: Grouped reversible residual connections for GNNs, introduced in *Training Graph Neural Networks with 1000 Layers*
- **EGNNConv**: Equivariant graph convolutional layer from *E(n) Equivariant Graph Neural Networks*
- **PNAConv**: Principal neighbourhood aggregation layer from *Principal Neighbourhood Aggregation for Graph Nets*
- **DGNConv**: Directional graph network layer from *Directional Graph Networks*

## Dense Conv Layers

- **DenseGraphConv**: Graph convolutional layer from *Semi-Supervised Classification with Graph Convolutional Networks*
- **DenseSAGEConv**: GraphSAGE layer from *Inductive Representation Learning on Large Graphs*
- **DenseChebConv**: Chebyshev spectral graph convolution layer from *Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering*

## Global Pooling Layers

- **SumPooling**: Apply sum pooling over the nodes in a graph
- **AvgPooling**: Apply average pooling over the nodes in a graph
- **MaxPooling**: Apply max pooling over the nodes in a graph
- **SortPooling**: Sort pooling from *An End-to-End Deep Learning Architecture for Graph Classification*
- **WeightAndSum**: Compute importance weights for atoms and perform a weighted sum
- **GlobalAttentionPooling**: Global attention pooling from *Gated Graph Sequence Neural Networks*
- **Set2Set**: Set2Set operator from *Order Matters: Sequence to Sequence for Sets*
- **SetTransformerEncoder**: The encoder module from *Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks*
- **SetTransformerDecoder**: The decoder module from *Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks*

## Heterogeneous Learning Modules

- **HeteroGraphConv**: A generic module for computing convolution on heterogeneous graphs
- **HeteroLinear**: Apply linear transformations on heterogeneous inputs
- **HeteroEmbedding**: Create a heterogeneous embedding table
- **TypedLinear**: Linear transformation according to types

## Utility Modules

- **Sequential**: A sequential container for stacking graph neural network modules
- **WeightBasis**: Basis decomposition from *Modeling Relational Data with Graph Convolutional Networks*
- **KNNGraph**: Layer that transforms one point set into a graph, or a batch of point sets with the same number of points into a union of those graphs
- **SegmentedKNNGraph**: Layer that transforms one point set into a graph, or a batch of point sets with different numbers of points into a union of those graphs
- **RadiusGraph**: Layer that transforms one point set into a bidirected graph whose edges connect neighbors within a given distance
- **JumpingKnowledge**: The jumping knowledge aggregation module from *Representation Learning on Graphs with Jumping Knowledge Networks*
- **NodeEmbedding**: Class for storing node embeddings
- **GNNExplainer**: GNNExplainer model from *GNNExplainer: Generating Explanations for Graph Neural Networks*
- **LabelPropagation**: Label propagation from *Learning from Labeled and Unlabeled Data with Label Propagation*