This repository provides a from-scratch implementation of the Relational (heterogeneous) Graph Attention (RGAT) operator. As the name suggests, this implementation is meant only for relational (simple/property/attributed) graphs. Two schemes have been implemented to compute the attention logits $\mathbf{a}^{(r)}_{i,j}$:
**Additive attention:**

$$\mathbf{a}^{(r)}_{i,j} = \mathrm{LeakyReLU}\left(\mathbf{q}^{(r)}_i + \mathbf{k}^{(r)}_j\right)$$

or **multiplicative attention:**

$$\mathbf{a}^{(r)}_{i,j} = \mathbf{q}^{(r)}_i \cdot \mathbf{k}^{(r)}_j$$

where the queries and keys are computed from the relation-specific transformed node features as

$$\mathbf{q}^{(r)}_i = \mathbf{W}^{(r)}_1 \mathbf{x}_i \cdot \mathbf{Q}^{(r)} \quad \textrm{and} \quad \mathbf{k}^{(r)}_i = \mathbf{W}^{(r)}_1 \mathbf{x}_i \cdot \mathbf{K}^{(r)}$$

Here, $\mathbf{W}^{(r)}_1$ is a relation-specific weight matrix, and $\mathbf{Q}^{(r)}$ and $\mathbf{K}^{(r)}$ are the query and key kernels of relation type $r$.
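As a reference, here is a minimal sketch of how these per-relation logits could be computed for a single relation type. The tensor names (`W1`, `Q`, `K`) mirror the symbols above and are illustrative; they are not the attribute names used in `rgat_conv.py`:

```python
import torch
import torch.nn.functional as F

def attention_logits(x, edge_index, W1, Q, K, mode="additive"):
    # x: [N, F_in] node features; edge_index: [2, E] edges (source j -> target i)
    h = x @ W1             # relation-specific transform W1 @ x: [N, F_out]
    q = h @ Q              # per-node query scores:              [N, heads]
    k = h @ K              # per-node key scores:                [N, heads]
    src, dst = edge_index  # j = source (sender), i = target (receiver)
    if mode == "additive":
        # a_{i,j} = LeakyReLU(q_i + k_j)
        return F.leaky_relu(q[dst] + k[src])
    # multiplicative: a_{i,j} = q_i * k_j (element-wise per head)
    return q[dst] * k[src]
```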
Two different attention mechanisms have also been provided:

- **Within-relation** attention mechanism:

$$\alpha^{(r)}_{i,j} = \frac{\exp\left(\mathbf{a}^{(r)}_{i,j}\right)}{\sum_{k \in \mathcal{N}_r(i)} \exp\left(\mathbf{a}^{(r)}_{i,k}\right)}$$

- **Across-relation** attention mechanism:

$$\alpha^{(r)}_{i,j} = \frac{\exp\left(\mathbf{a}^{(r)}_{i,j}\right)}{\sum_{r^{\prime} \in \mathcal{R}} \sum_{k \in \mathcal{N}_{r^{\prime}}(i)} \exp\left(\mathbf{a}^{(r^{\prime})}_{i,k}\right)}$$

where $\mathcal{N}_r(i)$ denotes the neighbours of node $i$ under relation $r$ and $\mathcal{R}$ is the set of all relation types.
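The two mechanisms differ only in which logits share a softmax normaliser. Below is a sketch of that grouping, assuming per-edge logits `a` of shape `[E, heads]`, a `[2, E]` `edge_index`, and per-edge relation labels `edge_type`; it uses PyTorch Geometric's grouped-softmax helper `torch_geometric.utils.softmax`, and the function name `normalize_attention` is illustrative:

```python
from torch_geometric.utils import softmax

def normalize_attention(a, edge_index, edge_type, num_nodes, num_relations,
                        mechanism="across-relation"):
    dst = edge_index[1]  # target node i of each edge
    if mechanism == "across-relation":
        # One softmax per target node, pooled over edges of ALL relation types.
        return softmax(a, dst, num_nodes=num_nodes)
    # Within-relation: a separate softmax per (target node, relation) pair.
    group = dst * num_relations + edge_type  # unique id per (i, r) group
    return softmax(a, group, num_nodes=num_nodes * num_relations)
```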
To ensure better discriminative power for RGATs, the following cardinality-preservation options have also been made available:

- **additive:**

$$\mathbf{x}^{\prime(r)}_i = \sum_{j \in \mathcal{N}_r(i)} \alpha^{(r)}_{i,j} \mathbf{x}^{(r)}_j + \mathcal{W} \odot \sum_{j \in \mathcal{N}_r(i)} \mathbf{x}^{(r)}_j$$

- **scaled:**

$$\mathbf{x}^{\prime(r)}_i = \psi(|\mathcal{N}_r(i)|) \odot \sum_{j \in \mathcal{N}_r(i)} \alpha^{(r)}_{i,j} \mathbf{x}^{(r)}_j$$

- **f-additive:**

$$\mathbf{x}^{\prime(r)}_i = \sum_{j \in \mathcal{N}_r(i)} \left(\alpha^{(r)}_{i,j} + 1\right) \cdot \mathbf{x}^{(r)}_j$$

- **f-scaled:**

$$\mathbf{x}^{\prime(r)}_i = |\mathcal{N}_r(i)| \odot \sum_{j \in \mathcal{N}_r(i)} \alpha^{(r)}_{i,j} \mathbf{x}^{(r)}_j$$

where $\mathbf{x}^{(r)}_j = \mathbf{W}^{(r)}_1 \mathbf{x}_j$ is the transformed feature of node $j$ under relation $r$, $\mathcal{W}$ is a learnable weight, and $\psi$ is a (learnable) function of the neighbourhood cardinality $|\mathcal{N}_r(i)|$.
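The two parameter-free options are easy to sketch with plain PyTorch scatter operations; `aggregate` below is a hypothetical helper rather than the repository's API, and the learnable `additive`/`scaled` variants are omitted because they require the extra parameters $\mathcal{W}$ and $\psi$:

```python
import torch

def aggregate(x_j, alpha, dst, num_nodes, mod=None):
    # x_j:   [E, F] transformed source-node features per edge
    # alpha: [E, 1] normalized attention coefficients
    # dst:   [E]    target node i of each edge
    if mod == "f-additive":
        alpha = alpha + 1.0                      # use (alpha_{i,j} + 1) instead
    out = torch.zeros(num_nodes, x_j.size(-1), dtype=x_j.dtype)
    out.index_add_(0, dst, alpha * x_j)          # sum over j in N_r(i)
    if mod == "f-scaled":
        ones = torch.ones(dst.size(0), dtype=x_j.dtype)
        deg = torch.zeros(num_nodes, dtype=x_j.dtype).index_add_(0, dst, ones)
        out = out * deg.unsqueeze(-1)            # multiply by |N_r(i)|
    return out
```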
Requirements:

- PyTorch
- PyTorch Geometric
Although `example.py` contains the path to one of the relational entities graphs (AIFB), this implementation also works for other heterogeneous graph datasets such as MUTAG, BGS, AM, etc. The AIFB dataset contains 8,285 nodes, 58,086 edges, and 4 classes.
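For reference, AIFB and the other entity graphs mentioned above are available through PyTorch Geometric's `Entities` dataset class, so loading them could look like the sketch below (the `root` path is arbitrary, and `example.py` may organise its data differently):

```python
from torch_geometric.datasets import Entities

# Swap 'AIFB' for 'MUTAG', 'BGS', or 'AM' to load the other datasets.
dataset = Entities(root='data/AIFB', name='AIFB')
data = dataset[0]

print(data.num_nodes)           # 8285 nodes
print(data.edge_index.size(1))  # 58086 typed edges
print(dataset.num_classes)      # 4 classes
```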
- The layer implementation can be found in `rgat_conv.py`.
- To train and test RGATs on heterogeneous graphs, run `example.py`; after every epoch, it prints the train as well as the test accuracy.
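To use the layer outside of `example.py`, a minimal two-layer model could look like the sketch below. The class name `RGATConv` and its `(in_channels, out_channels, num_relations)` constructor follow the interface of PyTorch Geometric's operator of the same name and are assumptions here; adjust them to whatever `rgat_conv.py` actually exposes:

```python
import torch
import torch.nn.functional as F

from rgat_conv import RGATConv  # assumed class name inside rgat_conv.py

class RGAT(torch.nn.Module):
    def __init__(self, in_channels, hidden_channels, num_classes, num_relations):
        super().__init__()
        self.conv1 = RGATConv(in_channels, hidden_channels, num_relations)
        self.conv2 = RGATConv(hidden_channels, hidden_channels, num_relations)
        self.lin = torch.nn.Linear(hidden_channels, num_classes)

    def forward(self, x, edge_index, edge_type):
        # edge_type holds the relation id of each edge in edge_index.
        x = F.relu(self.conv1(x, edge_index, edge_type))
        x = F.relu(self.conv2(x, edge_index, edge_type))
        return self.lin(x)
```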