Check hypergraphs #256
Conversation
This reverts commit c2e4132.
…gin layer for consistency with official implementation
…ams to unisage_layer
…at initialization
Check out this pull request on ReviewNB. See visual diffs & provide feedback on Jupyter Notebooks. Powered by ReviewNB.
Codecov Report: All modified and coverable lines are covered by tests ✅
Additional details and impacted files:
@@            Coverage Diff             @@
##             main     #256      +/-   ##
==========================================
+ Coverage   97.07%   97.36%   +0.29%
==========================================
  Files          58       58
  Lines        2014     2050      +36
==========================================
+ Hits         1955     1996      +41
+ Misses         59       54       -5
☔ View full report in Codecov by Sentry.
Very nice! One super minor comment about coding style (see PEP 8 and its standards on naming functions: https://peps.python.org/pep-0008/).
Once this is fixed, we can merge. Congrats on this fantastic contribution (and sorry for the delay in reviewing).
@@ -15,7 +15,13 @@ def UniGIN_layer(self):
         self.in_channels = 10
         return UniGINLayer(in_channels=self.in_channels)

-    def test_forward(self, UniGIN_layer):
+    @pytest.fixture
+    def UniGIN_layer2(self):
Avoid a mix of under_scores and CamelCase in names.
The Python guidelines ask for all-lowercase function names with underscores; see PEP 8. Can you adapt the naming of this function and the others that may have the same problem? Thanks.
+        self.in_channels = 10
+        return UniGINLayer(in_channels=self.in_channels, use_norm=True)
+
+    def test_forward(self, UniGIN_layer, UniGIN_layer2):
No mix of under_scores and CamelCase.
If you're passing a class --> CamelCase.
If you're passing an object or a function --> under_scores.
See the PEP 8 guidelines for international standards in code style.
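For reference, a minimal sketch of the renaming the reviewer asks for, assuming a pytest-style test class; the import path and the fixture names `unigin_layer` and `unigin_layer_with_norm` are made up for illustration. Fixtures are functions, so they take snake_case names, while the class UniGINLayer keeps its CamelCase name.

import pytest

# Import path assumed for illustration; use the module where UniGINLayer actually lives.
from topomodelx.nn.hypergraph.unigin_layer import UniGINLayer


class TestUniGINLayer:
    """Sketch of PEP 8-compliant fixture names (snake_case for functions)."""

    @pytest.fixture
    def unigin_layer(self):
        """Return a UniGIN layer without normalization."""
        self.in_channels = 10
        return UniGINLayer(in_channels=self.in_channels)

    @pytest.fixture
    def unigin_layer_with_norm(self):
        """Return a UniGIN layer with normalization enabled."""
        self.in_channels = 10
        return UniGINLayer(in_channels=self.in_channels, use_norm=True)

    def test_forward(self, unigin_layer, unigin_layer_with_norm):
        """Forward pass should run for both layer variants (body omitted)."""
        ...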
@@ -42,6 +45,7 @@ def __init__(
                 )
             )
         self.layers = torch.nn.ModuleList(layers)
+        self.layer_drop = torch.nn.Dropout(layer_drop)
👍
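As an aside, here is an illustrative sketch (not the library's actual forward pass) of how a layer-level dropout module like the one added above is typically used: the stack iterates over its ModuleList and applies the shared Dropout to each layer's output. The TinyStack class below is hypothetical.

import torch


class TinyStack(torch.nn.Module):
    """Toy stack of layers with a shared layer-level dropout."""

    def __init__(self, channels: int, n_layers: int, layer_drop: float = 0.2):
        super().__init__()
        self.layers = torch.nn.ModuleList(
            torch.nn.Linear(channels, channels) for _ in range(n_layers)
        )
        self.layer_drop = torch.nn.Dropout(layer_drop)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for layer in self.layers:
            x = layer(x)
            x = self.layer_drop(x)  # dropout applied after each layer
        return x


x = torch.randn(4, 8)
print(TinyStack(channels=8, n_layers=3)(x).shape)  # torch.Size([4, 8])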
@@ -51,6 +51,8 @@ class HyperSAGELayer(MessagePassing):
         Dimension of the input features.
     out_channels : int
         Dimension of the output features.
+    alpha: int, default=-1
NIT: there is a space before the ":" in numpy's docstring conventions (which are the ones we use in the library) for some reason.
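For concreteness, a small sketch of the numpydoc-style formatting the reviewer means; the parameter names mirror the diff above, but the function itself is made up.

def example(in_channels, out_channels, alpha=-1):
    """Illustrate numpydoc parameter formatting.

    Parameters
    ----------
    in_channels : int
        Dimension of the input features.
    out_channels : int
        Dimension of the output features.
    alpha : int, default=-1
        Written as ``alpha : int`` (space before the colon), not ``alpha: int``.
    """
    return in_channels, out_channels, alpha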
Thank you for the feedback! I resolved the mentioned issues.
Hello,
This pull request has conflicts with the main branch that must be resolved before merging. Please rebase or merge the current main branch and resolve these conflicts :)
@ffl096 Hi Florian, I have resolved the conflicts. Let me know if there is anything else! Thank you very much.
#247
Changed the hypergraph models to use the Conv class defined in this library. Due to how the attention is calculated, it was not always feasible to use the Conv operation; HyperGAT and HyperSAGE still use their custom convolution functions.
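To illustrate the design choice (purely hypothetical code, not TopoModelX's actual Conv API): a shared convolution block can be reused when the aggregation weights are fixed by the neighborhood structure, whereas attention-based layers compute feature-dependent weights and therefore keep a custom convolution. Both classes below are stand-ins.

import torch


class SharedConv(torch.nn.Module):
    """Stand-in for a reusable conv block: x' = neighborhood @ (x @ W)."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        self.weight = torch.nn.Parameter(torch.randn(in_channels, out_channels) * 0.1)

    def forward(self, x: torch.Tensor, neighborhood: torch.Tensor) -> torch.Tensor:
        return neighborhood @ (x @ self.weight)


class CustomAttentionConv(torch.nn.Module):
    """Stand-in for a layer whose aggregation weights depend on the features,
    so a generic conv block cannot be reused as-is."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        self.weight = torch.nn.Parameter(torch.randn(in_channels, out_channels) * 0.1)

    def forward(self, x: torch.Tensor, neighborhood: torch.Tensor) -> torch.Tensor:
        messages = x @ self.weight
        # Feature-dependent scores, masked to the neighborhood structure,
        # replace the fixed neighborhood weights used by SharedConv.
        scores = torch.softmax(messages @ messages.T, dim=-1) * (neighborhood > 0)
        return scores @ messages


x = torch.randn(5, 8)
neighborhood = (torch.rand(5, 5) > 0.5).float()
print(SharedConv(8, 4)(x, neighborhood).shape)           # torch.Size([5, 4])
print(CustomAttentionConv(8, 4)(x, neighborhood).shape)  # torch.Size([5, 4])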