Implementation of HOAN Mesh Classification model (model 1). #104

Closed
wants to merge 113 commits

Conversation

rballeba
Collaborator

@rballeba rballeba commented Jun 2, 2023

This is an official submission to the ICML 2023 Topological Deep Learning Challenge, hosted by the second annual Topology and Geometry in Machine Learning Workshop at ICML. In this submission, we implement the Higher-Order Attention Network (HOAN) model for mesh classification. See the first (leftmost) tensor diagram of Figure 11 in the article "Architectures of Topological Deep Learning: A Survey on Topological Neural Networks" for a graphical representation of the model [HOAN, Hajij22a]. The model was originally presented in the first version of the paper "Topological Deep Learning: Going Beyond Graph Data" [TDLGBGD], at that time titled "Higher-Order Attention Networks" [HOAN] (Hajij et al.).

We have contributed four novel implementations, which can be found in the following files:

  1. topomodelx/base/hbns.py,
  2. topomodelx/base/hbs.py,
  3. nn/combinatorial/hmc_layer.py,
  4. tutorials/combinatorial/hmc_layer.py.

(1. & 2.) The first two files contain our implementation of the higher-order attention message-passing step over a single combinatorial complex neighbourhood matrix, for the non-squared (HBNS) and squared (HBS) cases, respectively.
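
To make the mechanism concrete, here is a minimal, self-contained sketch of attention-weighted message passing over a single (possibly non-square) neighbourhood matrix. It is only an illustration: the class name, constructor arguments, and forward signature are our own assumptions, not the API of hbs.py or hbns.py, and the actual implementations may differ (e.g. in how the two directions of a non-squared matrix are handled).

```python
import torch
import torch.nn as nn


class NeighborhoodAttention(nn.Module):
    """GAT-style attention over one neighbourhood matrix of shape
    (n_target_cells, n_source_cells)."""

    def __init__(self, in_channels: int, out_channels: int) -> None:
        super().__init__()
        self.weight = nn.Parameter(torch.randn(in_channels, out_channels) * 0.1)
        self.att_target = nn.Parameter(torch.randn(out_channels) * 0.1)
        self.att_source = nn.Parameter(torch.randn(out_channels) * 0.1)
        self.leaky_relu = nn.LeakyReLU(0.2)

    def forward(self, x_source, x_target, neighborhood):
        msg_source = x_source @ self.weight                      # (n_source, out)
        msg_target = x_target @ self.weight                      # (n_target, out)
        # Additive attention logits for every (target, source) pair.
        logits = self.leaky_relu(
            (msg_target @ self.att_target).unsqueeze(1)          # (n_target, 1)
            + (msg_source @ self.att_source).unsqueeze(0)        # (1, n_source)
        )
        # Keep only pairs that are actually neighbours, then normalize each row.
        logits = logits.masked_fill(neighborhood == 0, float("-inf"))
        att = torch.softmax(logits, dim=1).nan_to_num()          # empty rows -> 0
        return att @ msg_source                                  # (n_target, out)


# Non-squared case (HBNS-like): e.g. a vertex-to-edge incidence of shape (n_edges, n_vertices).
# Squared case (HBS-like): x_source equals x_target and the matrix is a square adjacency.
block = NeighborhoodAttention(in_channels=4, out_channels=8)
x_vertices, x_edges = torch.randn(5, 4), torch.randn(7, 4)
incidence = (torch.rand(7, 5) > 0.5).float()                     # (n_edges, n_vertices)
x_edges_out = block(x_vertices, x_edges, incidence)              # (7, 8)
```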

(3.) In the ‘nn/combinatorial/hmc_layer.py’ file, we implement the basic building block of the model, which consists of two consecutive levels of higher-order attentional message-passing steps transforming the signal features over the cells of the zeroth, first, and second skeletons of the combinatorial complex (CC) representing the mesh. The first level comprises two non-squared attention blocks, between vertices and edges and between edges and faces, along with a squared attention block sending information from vertices to vertices. The second level consists of two non-squared attention blocks, propagating the signals from vertices to edges and from edges to vertices, alongside three squared attention blocks transforming the signals of the vertices, edges, and faces. The latter propagate information along the corresponding adjacency relations for vertices and edges, and along the coadjacency relation for faces. A simplified sketch of this two-level wiring is given below.
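
The following self-contained sketch only illustrates the wiring just described, not the actual hmc_layer.py code: every attention block is reduced to plain linear message passing, parallel incoming messages are simply summed, and the activation is an arbitrary choice.

```python
import torch
import torch.nn as nn


class TwoLevelWiringSketch(nn.Module):
    """Toy version of the two-level structure: each block is reduced to plain
    (non-attentional) linear message passing and parallel messages are summed."""

    def __init__(self, channels: int) -> None:
        super().__init__()

        def block() -> nn.Linear:
            return nn.Linear(channels, channels, bias=False)

        # Level 1: vertices -> vertices (squared), vertices -> edges, edges -> faces.
        self.l1_vv, self.l1_ve, self.l1_ef = block(), block(), block()
        # Level 2: edges -> vertices, vertices -> edges, plus one squared block per rank.
        self.l2_ev, self.l2_ve = block(), block()
        self.l2_vv, self.l2_ee, self.l2_ff = block(), block(), block()

    def forward(self, x_v, x_e, x_f, adj0, adj1, coadj2, b1, b2):
        # adj0: (n_v, n_v) vertex adjacency, adj1: (n_e, n_e) edge adjacency,
        # coadj2: (n_f, n_f) face coadjacency,
        # b1: (n_e, n_v) and b2: (n_f, n_e) incidence matrices.
        # x_f is unused in this simplification: per the description above, the face
        # signal is first produced by the edges -> faces block of level 1.
        h_v = torch.relu(adj0 @ self.l1_vv(x_v))                 # vertices -> vertices
        h_e = torch.relu(b1 @ self.l1_ve(x_v))                   # vertices -> edges
        h_f = torch.relu(b2 @ self.l1_ef(x_e))                   # edges -> faces
        out_v = torch.relu(adj0 @ self.l2_vv(h_v) + b1.T @ self.l2_ev(h_e))
        out_e = torch.relu(adj1 @ self.l2_ee(h_e) + b1 @ self.l2_ve(h_v))
        out_f = torch.relu(coadj2 @ self.l2_ff(h_f))
        return out_v, out_e, out_f
```

In the actual layer, each of these arrows is an attention block of the kind described in (1. & 2.) rather than a plain linear map.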

(4.) Lastly, in the fourth file, we implement and train the HOAN model for mesh classification on the SHREC16 dataset, as specified on the challenge webpage. The model consists of a sequence of stacked layers (implemented in file 3) followed by a final sum-pooling layer that maps the nodal, edge, and face features of each mesh into a shared N-dimensional Euclidean space, where N is the number of classes. We implement the dataset class and training procedure specifically for this problem.
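
A minimal sketch of such a readout stage, assuming per-rank linear heads whose sum-pooled outputs are added into a single logit vector; the precise way the three ranks are combined in the tutorial may differ.

```python
import torch
import torch.nn as nn


class SumPoolReadout(nn.Module):
    """Sum-pool the cell features of every rank and map them to class logits."""

    def __init__(self, channels: int, num_classes: int) -> None:
        super().__init__()
        self.head_v = nn.Linear(channels, num_classes)
        self.head_e = nn.Linear(channels, num_classes)
        self.head_f = nn.Linear(channels, num_classes)

    def forward(self, x_v, x_e, x_f):
        # Sum pooling over the cells of each rank gives one vector per mesh,
        # which each head maps into the shared N-dimensional class space.
        return (
            self.head_v(x_v.sum(dim=0))
            + self.head_e(x_e.sum(dim=0))
            + self.head_f(x_f.sum(dim=0))
        )  # shape: (num_classes,)
```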

Furthermore, we have carefully tested each of the implementations of the novel message-passing steps and of the HOAN basic building block in the following files:

  1. test/base/test_hbs.py,
  2. test/base/test_hbns.py,
  3. test/nn/combinatorial/test_hmc_layer.py.

(1. & 2.) In files 1 and 2, we test the attention mechanism and the forward pass of both higher-order attention blocks, respectively. We have not only checked the correctness of the output dimensions, but also verified the accuracy of the values by comparing them against hand-computed examples.

(3.) In file 3, we test the layer’s forward pass following the same methodology.
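
As an illustration of this comparison methodology (not one of the actual test cases, and using a plain non-attentional message-passing step for brevity), a hand-verifiable test might look as follows, with the learnable weight fixed so the expected values can be derived by hand.

```python
import torch


def test_forward_against_hand_computed_example():
    # Two target cells, three source cells, one feature channel.
    neighborhood = torch.tensor([[1.0, 1.0, 0.0],
                                 [0.0, 1.0, 1.0]])
    x_source = torch.tensor([[1.0], [2.0], [3.0]])
    weight = torch.eye(1)  # fixed weight so the expected output is easy to derive
    out = neighborhood @ (x_source @ weight)
    expected = torch.tensor([[3.0], [5.0]])  # rows: 1 + 2 and 2 + 3
    assert torch.allclose(out, expected)
```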

We have made a conscious effort to comment the code thoroughly, describing all the novel classes, attention mechanisms, forward passes, and auxiliary methods according to the formal definitions and notation introduced in [HOAN] and in the article "Architectures of Topological Deep Learning: A Survey on Topological Neural Networks" (Papillon et al.). We also updated the docs folder so that the generated documentation covers our implementations in the base and nn folders.

Additional comments:

  1. As depicted in [TDLGBGD], the formal definition of higher-order attentional message passing on a CC is not limited to using a LeakyReLU as the attention activation function. However, we decided to follow the [HOAN] implementation and keep LeakyReLU as the default, in line with the model's citation in Figure 11 of the survey, which refers to the [HOAN] article.
  2. We have chosen not to inherit from the MessagePassing class for the HBS and HBNS classes, making them inherit directly from torch.nn.Module instead. We believe that the implemented higher-order attention mechanisms for combinatorial complexes should receive messages in their method signatures as they operate over those transformed signals. To avoid overriding method signatures or making changes in the code that could compromise its understanding and readability, we have avoided MessagePassing inheritance for now. This decision is open for further discussion and modification if deemed necessary.
  3. We added a utility function, used in both attention mechanisms (utils/srn.py), for computing the row normalization of sparse matrices; a sketch of this operation is given after this list.
  4. All the implemented classes work on either GPU or CPU without the device having to be passed as a constructor parameter. To run a model on GPU, the user simply moves the model and its inputs to the desired device.
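
Regarding items 3 and 4, the following is a minimal sketch of what a sparse row normalization can look like in PyTorch; the function name, the handling of empty rows, and the example values are illustrative assumptions, not the contents of utils/srn.py.

```python
import torch


def sparse_row_norm(sparse: torch.Tensor) -> torch.Tensor:
    """Divide every row of a sparse COO matrix by that row's sum."""
    sparse = sparse.coalesce()
    row = sparse.indices()[0]
    values = sparse.values()
    row_sum = torch.zeros(sparse.shape[0], dtype=values.dtype, device=values.device)
    row_sum = row_sum.index_add_(0, row, values)
    safe = torch.where(row_sum == 0, torch.ones_like(row_sum), row_sum)  # avoid 0/0
    return torch.sparse_coo_tensor(sparse.indices(), values / safe[row], sparse.shape)


# Item 4: the modules run on CPU or GPU alike; moving the model and its inputs
# with `.to(device)` is all that is needed.
neighborhood = torch.tensor([[0.0, 2.0], [1.0, 3.0]]).to_sparse()
print(sparse_row_norm(neighborhood).to_dense())  # every non-empty row sums to 1
```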

@rballeba rballeba changed the title from "Adding files for the HOAN Mesh Classification model." to "First commit for the HOAN Mesh Classification model." Jun 2, 2023
@codecov

codecov bot commented Jun 2, 2023

Codecov Report

Merging #104 (569bd19) into main (c471ea2) will increase coverage by 1.02%.
Report is 69 commits behind head on main.
The diff coverage is 95.23%.

❗ Current head 569bd19 differs from pull request most recent head 4b94b78. Consider uploading reports for the commit 4b94b78 to get more accurate results

@@            Coverage Diff             @@
##             main     #104      +/-   ##
==========================================
+ Coverage   93.00%   94.02%   +1.02%     
==========================================
  Files           8       12       +4     
  Lines         200      368     +168     
==========================================
+ Hits          186      346     +160     
- Misses         14       22       +8     
Files Changed                              Coverage              Δ
topomodelx/base/hbns.py                     92.06%  <92.06%>     (ø)
topomodelx/base/hbs.py                      95.00%  <95.00%>     (ø)
topomodelx/nn/combinatorial/hmc_layer.py   100.00% <100.00%>     (ø)
topomodelx/utils/srn.py                    100.00% <100.00%>     (ø)

@ninamiolane ninamiolane changed the title from "First commit for the HOAN Mesh Classification model." to "First commit for the HOAN Mesh Classification model (model 1)." Jun 2, 2023
@mathildepapillon mathildepapillon added the combinatorial (Model implementations in the combinatorial domain) label Jul 14, 2023
@mhajij
Member

mhajij commented Oct 7, 2023

this has been handled elsewhere

@mhajij mhajij closed this Oct 7, 2023