
*TensorFlow* or *PyTorch*, *both!*

GraphGallery is a gallery for benchmarking Graph Neural Networks (GNNs) and Graph Adversarial Learning with TensorFlow 2.x and PyTorch backends. In addition, PyTorch Geometric (PyG) and Deep Graph Library (DGL) backends are now available in GraphGallery.

- We have removed the TensorFlow dependency; PyTorch is now the default backend for GraphGallery.
<!-- - We have integrated the **Adversarial Attacks** in this project; for examples, please refer to the Graph Adversarial Learning examples. -->

Please make sure you have installed PyTorch. TensorFlow, PyTorch Geometric (PyG), and Deep Graph Library (DGL) are alternative choices.

```bash
pip install -U graphgallery
```

or

```bash
git clone https://github.com/EdisonLeeeee/GraphGallery.git && cd GraphGallery
pip install -e . --verbose
```

where `-e` means "editable" mode so you don't have to reinstall every time you make changes.

In detail, the following methods are currently implemented:

| Method | Author | Paper | PyTorch | TensorFlow | PyG | DGL |
| --- | --- | --- | --- | --- | --- | --- |
| **ChebyNet** | *Michaël Defferrard et al* | Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering (NeurIPS'16) | :heavy_check_mark: | :heavy_check_mark: | | |
| **GCN** | *Thomas N. Kipf et al* | Semi-Supervised Classification with Graph Convolutional Networks (ICLR'17) | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
| **GraphSAGE** | *William L. Hamilton et al* | Inductive Representation Learning on Large Graphs (NeurIPS'17) | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | |
| **FastGCN** | *Jie Chen et al* | FastGCN: Fast Learning with Graph Convolutional Networks via Importance Sampling (ICLR'18) | :heavy_check_mark: | :heavy_check_mark: | | |
| **LGCN** | *Hongyang Gao et al* | Large-Scale Learnable Graph Convolutional Networks (KDD'18) | | :heavy_check_mark: | | |
| **GAT** | *Petar Veličković et al* | Graph Attention Networks (ICLR'18) | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
| **SGC** | *Felix Wu et al* | Simplifying Graph Convolutional Networks (ICML'19) | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
| **GWNN** | *Bingbing Xu et al* | Graph Wavelet Neural Network (ICLR'19) | :heavy_check_mark: | :heavy_check_mark: | | |
| **GMNN** | *Meng Qu et al* | GMNN: Graph Markov Neural Networks (ICML'19) | | :heavy_check_mark: | | |
| **ClusterGCN** | *Wei-Lin Chiang et al* | Cluster-GCN: An Efficient Algorithm for Training Deep and Large Graph Convolutional Networks (KDD'19) | :heavy_check_mark: | :heavy_check_mark: | | |
| **DAGNN** | *Meng Liu et al* | Towards Deeper Graph Neural Networks (KDD'20) | :heavy_check_mark: | :heavy_check_mark: | | |
| **GDC** | *Johannes Klicpera et al* | Diffusion Improves Graph Learning (NeurIPS'19) | :heavy_check_mark: | :heavy_check_mark: | | |
| **TAGCN** | *Jian Du et al* | Topology Adaptive Graph Convolutional Networks (arXiv'17) | :heavy_check_mark: | :heavy_check_mark: | | |
| **APPNP, PPNP** | *Johannes Klicpera et al* | Predict then Propagate: Graph Neural Networks meet Personalized PageRank (ICLR'19) | :heavy_check_mark: | :heavy_check_mark: | | |
| **PDN** | *Benedek Rozemberczki et al* | Pathfinder Discovery Networks for Neural Message Passing (ICLR'21) | | | :heavy_check_mark: | |
| **SSGC** | *Zhu et al* | Simple Spectral Graph Convolution (ICLR'21) | :heavy_check_mark: | :heavy_check_mark: | | |
| **AGNN** | *Zhu et al* | Attention-based Graph Neural Network for Semi-supervised Learning (ICLR'18 openreview) | :heavy_check_mark: | :heavy_check_mark: | | |
| **ARMA** | *Bianchi et al* | Graph Neural Networks with Convolutional ARMA Filters (arXiv'19) | | :heavy_check_mark: | | |
| **Graph-MLP** | *Yang Hu et al* | Graph-MLP: Node Classification without Message Passing in Graph (arXiv'21) | :heavy_check_mark: | | | |
| **LGC, EGC, hLGC** | *Luca Pasa et al* | Simple Graph Convolutional Networks (arXiv'21) | | | | :heavy_check_mark: |
| **GRAND** | *Wenzheng Feng et al* | Graph Random Neural Network for Semi-Supervised Learning on Graphs (NeurIPS'20) | | | | :heavy_check_mark: |
| **AlaGCN, AlaGAT** | *Yiqing Xie et al* | When Do GNNs Work: Understanding and Improving Neighborhood Aggregation (IJCAI'20) | | | | :heavy_check_mark: |
| **JKNet** | *Keyulu Xu et al* | Representation Learning on Graphs with Jumping Knowledge Networks (ICML'18) | | | | :heavy_check_mark: |
| **MixHop** | *Sami Abu-El-Haija et al* | MixHop: Higher-Order Graph Convolutional Architectures via Sparsified Neighborhood Mixing (ICML'19) | | | | :heavy_check_mark: |
| **DropEdge** | *Yu Rong et al* | DropEdge: Towards Deep Graph Convolutional Networks on Node Classification (ICLR'20) | | | :heavy_check_mark: | |
| **Node2Grids** | *Dalong Yang et al* | Node2Grids: A Cost-Efficient Uncoupled Training Framework for Large-Scale Graph Learning (CIKM'21) | :heavy_check_mark: | | | |
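Most of the spectral models above build on the same GCN propagation rule, H' = σ(Â H W) with Â = D^(-1/2)(A + I)D^(-1/2). As a rough illustration only (plain Python, no weights or activation, and not GraphGallery's internals), a single propagation step looks like:

```python
import math

def gcn_propagate(adj, feats):
    """One GCN propagation step: D^-1/2 (A + I) D^-1/2 @ X."""
    n = len(adj)
    # Add self-loops: A + I
    a = [[adj[i][j] + (1 if i == j else 0) for j in range(n)] for i in range(n)]
    deg = [sum(row) for row in a]
    # Symmetric degree normalization of the adjacency matrix
    norm = [[a[i][j] / math.sqrt(deg[i] * deg[j]) for j in range(n)]
            for i in range(n)]
    # Multiply normalized adjacency by the feature matrix
    f = len(feats[0])
    return [[sum(norm[i][k] * feats[k][c] for k in range(n)) for c in range(f)]
            for i in range(n)]

# Two connected nodes with one-dimensional features:
out = gcn_propagate([[0, 1], [1, 0]], [[1.0], [3.0]])
```

Each node's new feature is a degree-normalized average over itself and its neighbors, which is why stacking such layers smooths features across the graph.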

| Method | Author | Paper | PyTorch | TensorFlow | PyG | DGL |
| --- | --- | --- | --- | --- | --- | --- |
| **RobustGCN** | *Dingyuan Zhu et al* | Robust Graph Convolutional Networks Against Adversarial Attacks (KDD'19) | :heavy_check_mark: | :heavy_check_mark: | | |
| **SBVAT, OBVAT** | *Zhijie Deng et al* | Batch Virtual Adversarial Training for Graph Convolutional Networks (ICML'19) | :heavy_check_mark: | :heavy_check_mark: | | |
| **SimPGCN** | *Wei Jin et al* | Node Similarity Preserving Graph Convolutional Networks (WSDM'21) | :heavy_check_mark: | | | |
| **GCN-VAT, GraphVAT** | *Fuli Feng et al* | Graph Adversarial Training: Dynamically Regularizing Based on Graph Structure (TKDE'19) | :heavy_check_mark: | | | |
| **LATGCN** | *Hongwei Jin et al* | Latent Adversarial Training of Graph Convolution Networks ([email protected]'19) | :heavy_check_mark: | | | |
| **DGAT** | *Weibo Hu et al* | Robust Graph Convolutional Networks with Directional Graph Adversarial Training (Applied Intelligence'19) | :heavy_check_mark: | | | |
| **MedianGCN, TrimmedGCN** | *Liang Chen et al* | Understanding Structural Vulnerability in Graph Convolutional Networks (IJCAI'21) | :heavy_check_mark: | | :heavy_check_mark: | |
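For intuition, the idea behind MedianGCN is to replace mean neighborhood aggregation with the robust median, so that a few adversarially inserted neighbors cannot shift the aggregate arbitrarily. A minimal, framework-free sketch of that aggregation step (illustrative only, not GraphGallery's implementation):

```python
import statistics

def median_aggregate(neighbor_feats):
    """Aggregate neighbor features coordinate-wise with the median
    instead of the mean, as in MedianGCN."""
    dims = zip(*neighbor_feats)  # group values per feature dimension
    return [statistics.median(col) for col in dims]

# Three "clean" neighbors plus one adversarial outlier:
clean = [[1.0, 0.0], [1.2, 0.1], [0.9, 0.0]]
attacked = clean + [[100.0, -50.0]]
agg = median_aggregate(attacked)  # the outlier barely moves the median
```

A mean aggregator would be dragged far toward the outlier; the median stays close to the clean neighborhood.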

The graph purification methods are universal for all models; simply specify

`graph_transform="purification_method"`

Here we only give examples of `GCN` with purification methods; other models should work as well.

| Method | Author | Paper | PyTorch | TensorFlow | PyG | DGL |
| --- | --- | --- | --- | --- | --- | --- |
| **GCN-Jaccard** | *Huijun Wu et al* | Adversarial Examples on Graph Data: Deep Insights into Attack and Defense (IJCAI'19) | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
| **GCN-SVD** | *Negin Entezari et al* | All You Need Is Low (Rank): Defending Against Adversarial Attacks on Graphs (WSDM'20) | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
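As a concrete example of purification, GCN-Jaccard removes edges between nodes whose binary features have low Jaccard similarity, since adversarial edges tend to connect dissimilar nodes. A toy, framework-free sketch of that preprocessing (names are illustrative, not GraphGallery's API):

```python
def jaccard(a, b):
    """Jaccard similarity of two binary feature vectors,
    each given as the set of its active indices."""
    union = len(a | b)
    return len(a & b) / union if union else 0.0

def purify_edges(edges, feats, threshold=0.1):
    """Drop edges whose endpoints are less similar than `threshold`."""
    return [(u, v) for u, v in edges if jaccard(feats[u], feats[v]) >= threshold]

feats = {0: {1, 2, 3}, 1: {1, 2, 4}, 2: {7, 8}}
edges = [(0, 1), (0, 2)]          # (0, 2) links two dissimilar nodes
clean = purify_edges(edges, feats)  # keeps only (0, 1)
```

Because this runs before training, it works with any downstream model, which is why the purification methods are model-agnostic.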

| Method | Author | Paper | PyTorch | TensorFlow | PyG | DGL |
| --- | --- | --- | --- | --- | --- | --- |
| **GAE, VGAE** | *Thomas N. Kipf et al* | Variational Graph Auto-Encoders (NeurIPS'16) | :heavy_check_mark: | | :heavy_check_mark: | |

The following methods are framework-agnostic.

| Method | Author | Paper | PyTorch | TensorFlow | PyG | DGL |
| --- | --- | --- | --- | --- | --- | --- |
| **Deepwalk** | *Bryan Perozzi et al* | DeepWalk: Online Learning of Social Representations (KDD'14) | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
| **Node2vec** | *Aditya Grover and Jure Leskovec* | node2vec: Scalable Feature Learning for Networks (KDD'16) | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
| **Node2vec+** | *Renming Liu et al* | Accurately Modeling Biased Random Walks on Weighted Graphs Using Node2vec+ | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
| **BANE** | *Hong Yang et al* | Binarized Attributed Network Embedding (ICDM'18) | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
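These embedding methods are framework-agnostic because they only need random walks plus a word2vec-style trainer, not a deep-learning backend. A minimal sketch of DeepWalk's walk generation (the skip-gram training step, e.g. via gensim, is omitted; this is not GraphGallery code):

```python
import random

def random_walk(adj, start, length, rng):
    """Generate one truncated random walk from `start`, as in DeepWalk."""
    walk = [start]
    for _ in range(length - 1):
        neighbors = adj[walk[-1]]
        if not neighbors:       # dead end: stop the walk early
            break
        walk.append(rng.choice(neighbors))
    return walk

# A tiny star graph: node 0 connected to nodes 1 and 2
adj = {0: [1, 2], 1: [0], 2: [0]}
rng = random.Random(42)
walks = [random_walk(adj, node, 5, rng) for node in adj for _ in range(2)]
```

The resulting walks are treated as "sentences" of node IDs; node2vec differs only in biasing the neighbor choice.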

For more details, please refer to GraphData.

It takes just a few lines of code.

```python
from graphgallery.gallery.nodeclas import GCN

trainer = GCN()
trainer.setup_graph(graph)
trainer.build()
history = trainer.fit(train_nodes, val_nodes)
results = trainer.evaluate(test_nodes)
print(f'Test loss {results.loss:.5}, Test accuracy {results.accuracy:.2%}')
```

Other models in the gallery work the same way.

If you run into any trouble, you can simply run `trainer.help()` for more messages.

```python
>>> import graphgallery
# Default: PyTorch backend
>>> graphgallery.backend()
PyTorch 1.9.0+cu111 Backend
# Switch to TensorFlow backend
>>> graphgallery.set_backend("tf")
# Switch to PyTorch backend
>>> graphgallery.set_backend("th")
# Switch to PyTorch Geometric backend
>>> graphgallery.set_backend("pyg")
# Switch to DGL PyTorch backend
>>> graphgallery.set_backend("dgl")
```

Your code does not even need to change.
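One way such transparent backend switching can be implemented (a simplified sketch with illustrative names, not GraphGallery's actual internals) is a module-level registry that trainers consult when they build their models:

```python
# Hypothetical backend registry; all names here are illustrative.
_ALIASES = {"th": "PyTorch", "torch": "PyTorch", "tf": "TensorFlow",
            "pyg": "PyG", "dgl": "DGL"}
_current = "PyTorch"  # default backend

def set_backend(name):
    """Switch the active backend by alias, e.g. 'tf', 'th', 'pyg', 'dgl'."""
    global _current
    try:
        _current = _ALIASES[name.lower()]
    except KeyError:
        raise ValueError(f"Unknown backend: {name!r}")
    return _current

def backend():
    """Return the currently active backend name."""
    return _current

set_backend("tf")  # user code switches once; trainers read backend() later
```

Because trainers look up the backend lazily, the same user-facing script runs unchanged under any backend.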

This is motivated by gnn-benchmark.

```python
from graphgallery.data import Graph

# Create a custom dataset
mydataset = Graph(adj_matrix=A, node_attr=X, node_label=y)

# Save the dataset to disk
mydataset.to_npz('path/to/mydataset.npz')

# Load the dataset from disk
mydataset = Graph.from_npz('path/to/mydataset.npz')
```
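The save/load round trip above can be mimicked with the standard library alone; here is a toy sketch of such a graph container (JSON instead of NumPy's `.npz`, and all names are illustrative, not GraphGallery's API):

```python
import json
import os
import tempfile

class ToyGraph:
    """Tiny stand-in for a graph container with save/load methods,
    analogous to Graph.to_npz / Graph.from_npz."""

    def __init__(self, adj_matrix, node_attr, node_label):
        self.adj_matrix = adj_matrix
        self.node_attr = node_attr
        self.node_label = node_label

    def save(self, path):
        with open(path, "w") as f:
            json.dump({"adj_matrix": self.adj_matrix,
                       "node_attr": self.node_attr,
                       "node_label": self.node_label}, f)

    @classmethod
    def load(cls, path):
        with open(path) as f:
            return cls(**json.load(f))

g = ToyGraph([[0, 1], [1, 0]], [[1.0], [2.0]], [0, 1])
path = os.path.join(tempfile.mkdtemp(), "toy.json")
g.save(path)
g2 = ToyGraph.load(path)  # round-trips the same graph
```

The real `.npz` format additionally stores sparse adjacency matrices compactly, which matters for large graphs.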

- [x] Add PyTorch trainers support
- [x] Add other frameworks (PyG and DGL) support
- [x] Set TensorFlow as an optional dependency of GraphGallery
- [ ] Add more GNN trainers (TF and Torch backend)
- [ ] Support for more tasks, e.g., graph classification and link prediction

- [x] Support for more types of graphs, e.g., Heterogeneous graph
- [ ] Add docstrings and documentation (in progress)
- [ ] Comprehensive tutorials

Please feel free to contact me if you have any trouble.

This project is motivated by PyTorch Geometric, TensorFlow Geometric, StellarGraph, DGL, etc., as well as the authors' original implementations. Thanks for their excellent work!

Please cite our paper (and the respective papers of the methods used) if you use this code in your own work:

```bibtex
@inproceedings{li2021graphgallery,
  author    = {Jintang Li and Kun Xu and Liang Chen and Zibin Zheng and Xiao Liu},
  booktitle = {2021 IEEE/ACM 43rd International Conference on Software Engineering: Companion Proceedings (ICSE-Companion)},
  title     = {GraphGallery: A Platform for Fast Benchmarking and Easy Development of Graph Neural Networks Based Intelligent Software},
  year      = {2021},
  pages     = {13-16},
  publisher = {IEEE Computer Society},
  address   = {Los Alamitos, CA, USA},
}
```