DiffusionGraphRobustness

Graph Diffusion is proposed and implemented as a defense mechanism against adversarial attacks targeting deep graph learning models. This repo contains the artifacts for our project, Understanding the Impact of Graph Diffusion on Robust Graph Learning, completed as part of the CS6604 'Data Challenges in Machine Learning' course.

The project report can be accessed here.

Acknowledgment

Graph Diffusion is implemented in our project using the PyTorch Geometric Python library.

The attack model and base Graph Neural Networks are implemented in our project using the DeepRobust Python library.
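
For reference, the snippet below is a minimal sketch, modeled on DeepRobust's documented Metattack example, of how a poisoned adjacency matrix can be produced for one of the base GNNs; it is not the exact code path used by the scripts in this repo.

```python
import numpy as np
import torch
from deeprobust.graph.data import Dataset
from deeprobust.graph.defense import GCN
from deeprobust.graph.global_attack import Metattack

device = 'cuda' if torch.cuda.is_available() else 'cpu'

# Load a clean graph (dataset names follow DeepRobust's Dataset loader)
data = Dataset(root='/tmp/', name='cora')
adj, features, labels = data.adj, data.features, data.labels
idx_train, idx_val, idx_test = data.idx_train, data.idx_val, data.idx_test
idx_unlabeled = np.union1d(idx_val, idx_test)

# Surrogate GCN whose meta-gradients drive the structure perturbations
surrogate = GCN(nfeat=features.shape[1], nclass=labels.max().item() + 1,
                nhid=16, with_relu=False, device=device).to(device)
surrogate.fit(features, adj, labels, idx_train)

attacker = Metattack(model=surrogate, nnodes=adj.shape[0],
                     feature_shape=features.shape, device=device).to(device)
n_perturbations = int(0.05 * (adj.sum() // 2))  # 5% perturbation ratio
attacker.attack(features, adj, labels, idx_train, idx_unlabeled,
                n_perturbations, ll_constraint=False)
modified_adj = attacker.modified_adj  # poisoned adjacency fed to the GNNs
```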

Requirements

It is advised to create a new environment and install the packages below along with PyTorch.

torch-geometric==1.7.0 (Installation steps here)

deeprobust==0.2.1 (Installation steps here)

Versions of software/libraries/OS used to run experiments on the repo owner's system

CUDA: 10.1

PyTorch: 1.8.1

Python: 3.8

Ubuntu 18.04
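
If you want to confirm that your environment matches the versions above, a quick check (assuming a standard pip/conda install) is:

```python
# Print the installed versions of the key libraries
import torch
import torch_geometric
from importlib.metadata import version  # Python 3.8+

print("PyTorch:", torch.__version__)
print("torch-geometric:", torch_geometric.__version__)
print("deeprobust:", version("deeprobust"))
print("CUDA available:", torch.cuda.is_available(), "CUDA:", torch.version.cuda)
```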

Implemented Models


Attack method: Metattack

Type: Global attack; Poisoning; Domain: Node Classification; Paper

GNN: ChebNet

Domain: Node Classification; Paper

GNN: SGC

Domain: Node Classification; Paper

GNN: GAT

Domain: Node Classification; Paper

Diffusion: Graph Diffusion Convolution

Paper
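
For reference, graph diffusion can be applied through PyTorch Geometric's built-in GDC transform. The sketch below uses the Planetoid copy of Cora and illustrative PPR parameters; the exact settings used by our scripts may differ.

```python
import torch_geometric.transforms as T
from torch_geometric.datasets import Planetoid

# Graph Diffusion Convolution (GDC) with personalized PageRank (PPR);
# the parameters here are illustrative, not necessarily those used by run_all.sh.
gdc = T.GDC(self_loop_weight=1,
            normalization_in='sym',
            normalization_out='col',
            diffusion_kwargs=dict(method='ppr', alpha=0.05),
            sparsification_kwargs=dict(method='topk', k=128, dim=0),
            exact=True)

dataset = Planetoid(root='/tmp/Cora', name='Cora')
data = gdc(dataset[0])  # Data object with the diffused, re-sparsified adjacency
```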

Datasets


The 'cora', 'cora-ml', 'polblogs', and 'citeseer' datasets are downloaded from here, and 'pubmed' is downloaded from here.


Run Experiments

Pre-steps

To accommodate the change in data format after performing Graph Diffusion, we had to make a few additions to the Python files for the GNN architectures ChebNet, SGC, and GAT in the DeepRobust package. Please follow the steps below:

  1. Make sure you have successfully installed the DeepRobust library on your system and cloned this repo.

  2. Copy the following three Python files from the /examples/defense/ folder of this repo to the DeepRobust package on your system.

    chebnet.py | gat.py | sgc.py

    Destination folder: /home/[your system name]/anaconda3/envs/[your env name]/lib/python3.8/site-packages/deeprobust-0.2.0-py3.8/deeprobust/graph/defense

  3. Sample:

$ cp /home/satvik/PyCharmProjects/DiffusionGraphRobustness/examples/defense/chebnet.py /home/satvik/anaconda3/envs/graph/lib/python3.8/site-packages/deeprobust-0.2.0-py3.8/deeprobust/graph/defense/chebnet.py

Experiment

To run experiments, go to the /examples/graph/ folder and run
$ bash run_all.sh sgc cora 0.05

$1 = base GNN [Options: chebnet, sgc, gat]

$2 = Dataset [Options: cora, cora_ml, citeseer, polblogs, pubmed]

$3 = Perturbation Ratio [Options: 0.05, 0.1, 0.15, 0.2, 0.25]

The output is the test loss and accuracy for the clean graph, the perturbed graph, and the diffused graph with perturbation.

We faced issues running the diffusion experiments on the PubMed dataset due to GPU memory limits. Please be careful when running on that dataset, as the system might freeze.
