To scale GNNs to extremely large graphs, existing work can be grouped into the following two types.
Source: Node Dependent Local Smoothing for Scalable Graph Learning
- Simplifying Graph Convolutional Networks [ICML 2019] [paper] [code]
- Scalable Graph Neural Networks via Bidirectional Propagation [NeurIPS 2020] [paper] [code]
- SIGN: Scalable Inception Graph Neural Networks [ICML 2020] [paper] [code]
- Simple Spectral Graph Convolution [ICLR 2021] [paper] [code]
- Node Dependent Local Smoothing for Scalable Graph Learning [NeurIPS 2021] [paper] [code]
- Scalable and Adaptive Graph Neural Networks with Self-Label-Enhanced training [arXiv 2021] [paper] [code]
- Graph Attention Multi-Layer Perceptron [arXiv 2021] [paper] [code]
- NAFS: A Simple yet Tough-to-Beat Baseline for Graph Representation Learning [OpenReview 2022] [paper] [code]
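The papers above share one idea: decouple feature propagation from prediction, so the expensive graph smoothing is precomputed once and a plain (graph-free) model is trained on the result. A minimal sketch of this precomputation in the style of Simplifying Graph Convolutional Networks (SGC); the function name and toy graph are illustrative, not taken from any paper's code:

```python
import numpy as np

def sgc_features(adj: np.ndarray, x: np.ndarray, k: int = 2) -> np.ndarray:
    """Return S^k X, where S = D^{-1/2} (A + I) D^{-1/2} is the
    symmetrically normalized adjacency with self-loops (SGC-style)."""
    a_hat = adj + np.eye(adj.shape[0])         # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(1))   # diagonal of D^{-1/2}
    s = a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    for _ in range(k):                          # k smoothing steps, run once offline
        x = s @ x
    return x                                    # feed these into a linear model / MLP

# Toy 3-node path graph with scalar node features.
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
x = np.array([[1.0], [0.0], [0.0]])
smoothed = sgc_features(adj, x, k=2)
print(smoothed.shape)  # (3, 1)
```

Because the propagation no longer depends on trainable weights, it scales to graphs that do not fit in GPU memory; the later entries in this group (GBP, SIGN, S2GC, NDLS, GAMLP, NAFS) refine how the k smoothed views are computed or combined.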
Source: Inductive Representation Learning on Large Graphs
- Inductive Representation Learning on Large Graphs [NIPS 2017] [paper] [code]
- Scaling Graph Neural Networks with Approximate PageRank [KDD 2020] [paper] [code]
- Stochastic Training of Graph Convolutional Networks with Variance Reduction [ICML 2018] [paper] [code]
- GNNAutoScale: Scalable and Expressive Graph Neural Networks via Historical Embeddings [ICML 2021] [paper] [code]
- Graph Convolutional Neural Networks for Web-Scale Recommender Systems [KDD 2018] [paper]
- FastGCN: Fast Learning with Graph Convolutional Networks via Importance Sampling [ICLR 2018] [paper] [code]
- Accelerating Large Scale Real-Time GNN Inference using Channel Pruning [arXiv 2021] [paper] [code]
- Adaptive Sampling Towards Fast Graph Representation Learning [NeurIPS 2018] [paper] [code_pytorch] [code_tensorflow]
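This second group keeps the full GNN but bounds the cost per batch by sampling: rather than aggregating over every neighbor (whose count explodes with depth), each node aggregates over a fixed-size sampled set. A minimal sketch of one GraphSAGE-style mean-aggregation step; the helper names and toy graph are hypothetical, not from the official code:

```python
import random
import numpy as np

def sample_neighbors(neighbors, num_samples, rng):
    """Sample up to num_samples neighbors (with replacement if too few),
    so the per-node aggregation cost is constant regardless of degree."""
    if len(neighbors) >= num_samples:
        return rng.sample(neighbors, num_samples)
    return [rng.choice(neighbors) for _ in range(num_samples)]

def mean_aggregate(node, adj_list, features, num_samples=2, rng=None):
    """One GraphSAGE-style step: concat self features with the mean of
    sampled-neighbor features (a learned linear layer + ReLU would follow)."""
    rng = rng or random.Random(0)
    sampled = sample_neighbors(adj_list[node], num_samples, rng)
    neigh_mean = np.mean([features[v] for v in sampled], axis=0)
    return np.concatenate([features[node], neigh_mean])

adj_list = {0: [1, 2], 1: [0], 2: [0, 1]}
features = {0: np.array([1.0, 0.0]),
            1: np.array([0.0, 1.0]),
            2: np.array([1.0, 1.0])}
h = mean_aggregate(0, adj_list, features, num_samples=2)
print(h.shape)  # (4,)
```

The papers in this group differ mainly in *what* they sample: neighbors per node (GraphSAGE), nodes per layer with importance weights (FastGCN, AS-GCN), approximate PPR neighborhoods (PPRGo), or historical embeddings that stand in for out-of-batch neighbors (VR-GCN, GNNAutoScale).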