This code reproduces the experimental results obtained with the MinCutPool layer as presented in the ICML 2020 paper
Spectral Clustering with Graph Neural Networks for Graph Pooling
F. M. Bianchi*, D. Grattarola*, C. Alippi
The official TensorFlow implementation of the MinCutPool layer is in Spektral.
The PyTorch implementation of MinCutPool is in PyTorch Geometric.
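For a quick look at the layer outside of this repository's scripts, below is a minimal sketch of calling MinCutPool through Spektral. It assumes a recent Spektral (1.x) release running on TensorFlow 2, so the API may differ from the Spektral 0.1.2 pinned by this repository, and the graph is a random toy example.

import numpy as np
from spektral.layers import MinCutPool

# Toy batch with one graph: 10 nodes, 4 features, dense adjacency.
x = np.random.rand(1, 10, 4).astype("float32")
a = np.random.randint(0, 2, size=(1, 10, 10)).astype("float32")

# Pool the graph down to k=3 clusters; the layer adds the cut and
# orthogonality losses internally (exposed via model.losses in a Keras model).
pool = MinCutPool(k=3)
x_pool, a_pool = pool([x, a])
print(x_pool.shape, a_pool.shape)  # expected: (1, 3, 4), (1, 3, 3)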
The code is based on Python 3.5, TensorFlow 1.15, and Spektral 0.1.2.
All required libraries are listed in requirements.txt and can be installed with:

pip install -r requirements.txt
Run Segmentation.py to perform hyper-segmentation, generate a Region Adjacency Graph (RAG) from the resulting segments, and then cluster the nodes of the RAG with the MinCutPool layer.
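As a rough illustration of that pipeline (not the exact code in Segmentation.py), the sketch below builds an over-segmentation and its RAG with scikit-image; the image, number of segments, and choice of node features are arbitrary placeholders, and older scikit-image releases expose the RAG utilities under skimage.future.graph instead of skimage.graph.

import networkx as nx
import numpy as np
from skimage import data, graph, segmentation

img = data.astronaut()
labels = segmentation.slic(img, n_segments=500, compactness=10)  # over-segmentation
rag = graph.rag_mean_color(img, labels)                          # RAG as a networkx graph

A = nx.to_numpy_array(rag)                                       # dense adjacency of the RAG
X = np.array([rag.nodes[n]["mean color"] for n in rag.nodes])    # node features: mean color per segment
print(A.shape, X.shape)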
Run Clustering.py to cluster the nodes of a citation network. The datasets cora, citeseer, and pubmed can be selected. Results are provided in terms of homogeneity score, completeness score, and normalized mutual information (v-score).
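These scores can be computed with scikit-learn's clustering metrics; a small sketch with placeholder label arrays:

from sklearn.metrics import (completeness_score, homogeneity_score,
                             normalized_mutual_info_score, v_measure_score)

# y_true: ground-truth classes, y_pred: predicted cluster assignments (placeholders).
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 0, 1, 2, 2, 2]

print("Homogeneity:", homogeneity_score(y_true, y_pred))
print("Completeness:", completeness_score(y_true, y_pred))
print("NMI:", normalized_mutual_info_score(y_true, y_pred))
print("V-measure:", v_measure_score(y_true, y_pred))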
Clustering_pytorch.py contains a basic PyTorch implementation based on PyTorch Geometric.
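For reference, here is a minimal sketch of the pooling step with PyTorch Geometric's dense_mincut_pool; the tensors and the linear assignment layer below are toy placeholders, not the model used in Clustering_pytorch.py.

import torch
from torch_geometric.nn import dense_mincut_pool

n_nodes, n_feats, n_clusters = 100, 16, 5
x = torch.randn(1, n_nodes, n_feats)                        # dense, batched node features
adj = torch.randint(0, 2, (1, n_nodes, n_nodes)).float()    # dense adjacency matrix
s = torch.nn.Linear(n_feats, n_clusters)(x)                 # assignment logits (normally a GNN output)

# dense_mincut_pool returns the pooled features/adjacency and the two auxiliary losses.
x_pool, adj_pool, mincut_loss, ortho_loss = dense_mincut_pool(x, adj, s)
clusters = s.softmax(dim=-1).argmax(dim=-1)                 # hard cluster assignment per node
print(x_pool.shape, adj_pool.shape, mincut_loss.item(), ortho_loss.item())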
Run Autoencoder.py to train an autoencoder with a bottleneck and compute the reconstructed graph. It is possible to switch between the ring and grid graphs; any other point cloud from the PyGSP library is also supported. Results are provided in terms of mean squared error.
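A small sketch of how such graphs can be loaded from PyGSP (the graph sizes and the reconstructed adjacency below are placeholders):

import numpy as np
from pygsp import graphs

G_ring = graphs.Ring(N=64)           # ring graph with 64 nodes
G_grid = graphs.Grid2d(N1=8, N2=8)   # 8x8 grid graph

A = G_grid.W.toarray()               # adjacency matrix (SciPy sparse -> dense)
X = G_grid.coords                    # node coordinates, usable as node features
print(A.shape, X.shape)

# Mean squared error between the original and a (placeholder) reconstructed adjacency.
A_rec = A + np.random.normal(scale=0.01, size=A.shape)
mse = np.mean((A - A_rec) ** 2)
print(mse)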
Run Graph_Classification.py to train a graph classifier. Additional classification datasets are available here (drop them in data/classification/) and here (drop them in data/).
Results are provided in terms of classification accuracy averaged over 10 runs.
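The aggregation is a plain average over the runs; a tiny sketch with placeholder accuracy values (the standard deviation is only illustrative):

import numpy as np

# Placeholder accuracies from 10 independent runs.
accuracies = [0.81, 0.79, 0.83, 0.80, 0.82, 0.78, 0.84, 0.80, 0.81, 0.79]
print(f"Accuracy: {np.mean(accuracies):.3f} +/- {np.std(accuracies):.3f}")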
A basic PyTorch implementation of the graph classification task can be found in this example from PyTorch Geometric.
Please cite the original paper if you use MinCutPool in your research:
@inproceedings{bianchi2020mincutpool,
  title={Spectral Clustering with Graph Neural Networks for Graph Pooling},
  author={Bianchi, Filippo Maria and Grattarola, Daniele and Alippi, Cesare},
  booktitle={Proceedings of the 37th International Conference on Machine Learning},
  pages={2729-2738},
  year={2020},
  organization={ACM}
}
The code is released under the MIT License. See the attached LICENSE file.