Commit

update readme with code minimally (#287)
Co-authored-by: your.user.name <mustafa.hajij@gmail.com>
Co-authored-by: Florian Frantzen <florian.frantzen@rwth-aachen.de>
3 people authored Oct 21, 2024
1 parent 04d9437 commit 242dd60
Showing 6 changed files with 147 additions and 68 deletions.
96 changes: 87 additions & 9 deletions README.md
@@ -27,20 +27,73 @@
</div>



![tnns_network_with_layers](https://user-images.githubusercontent.com/8267869/234084036-f7d6585e-b7c2-4156-a825-cfa5b9658d71.png)

`TopoModelX` (TMX) is a Python module for topological deep learning. It offers simple and efficient tools to implement topological neural networks for science and engineering.

TMX's development follows the topological deep learning (TDL) blueprint laid out in:
- [Hajij et al. 2023. Topological Deep Learning: Going Beyond Graph Data](https://arxiv.org/abs/2206.00606).

TMX can reproduce and extend the topological neural networks (TNNs) surveyed in:
- [Papillon et al. 2023. Architectures of Topological Deep Learning: A Survey on Topological Neural Networks](https://arxiv.org/abs/2304.10031).

See [our graphical literature review](https://github.com/pyt-team/TopoModelX/blob/main/topomodelx.jpeg) with message-passing equations available at [https://github.com/awesome-tnns/awesome-tnns](https://github.com/awesome-tnns/awesome-tnns).

_**Note:** TMX is still under development._

## Quick Tour for New Users

In this quick tour, we highlight the ease of creating and training a TNN model with only a few lines of code.

### Train your own TNN model

Below is a minimal example of using TopoModelX to load a simplicial complex dataset, define a simplicial attention network (SAN), and perform a forward pass:


```python
import numpy as np
import toponetx as tnx
import torch
from topomodelx.nn.simplicial.san import SAN
from topomodelx.utils.sparse import from_sparse

# Step 1: Load the Karate Club dataset
dataset = tnx.karate_club(complex_type="simplicial")

# Step 2: Prepare Laplacians and node/edge features
laplacian_down = from_sparse(dataset.down_laplacian_matrix(rank=1))
laplacian_up = from_sparse(dataset.up_laplacian_matrix(rank=1))
incidence_0_1 = from_sparse(dataset.incidence_matrix(rank=1))

x_0 = torch.tensor(np.stack(list(dataset.get_simplex_attributes("node_feat").values())))
x_1 = torch.tensor(np.stack(list(dataset.get_simplex_attributes("edge_feat").values())))
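# Combine the edge features with node features lifted onto the edges via the incidence matrix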
x = x_1 + torch.sparse.mm(incidence_0_1.T, x_0)

# Step 3: Define the network
class Network(torch.nn.Module):
    def __init__(self, in_channels, hidden_channels, out_channels):
        super().__init__()
        self.base_model = SAN(in_channels, hidden_channels, n_layers=2)
        self.linear = torch.nn.Linear(hidden_channels, out_channels)

    def forward(self, x, laplacian_up, laplacian_down):
        x = self.base_model(x, laplacian_up, laplacian_down)
        return torch.sigmoid(self.linear(x))

# Step 4: Initialize the network and perform a forward pass
model = Network(in_channels=x.shape[-1], hidden_channels=16, out_channels=2)
y_hat_edge = model(x, laplacian_up=laplacian_up, laplacian_down=laplacian_down)
```
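
To go from a forward pass to training, all that is needed is a set of labels and a standard PyTorch loop. The snippet below is only an illustrative sketch, not part of the library: the binary edge labels `y_edge`, the learning rate, and the number of epochs are placeholders you would replace with your own task setup.

```python
# Step 5 (illustrative sketch): train the model with a standard PyTorch loop.
# `y_edge` is a placeholder tensor of binary edge labels; substitute your own targets.
y_edge = torch.randint(0, 2, (x.shape[0],))

optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = torch.nn.CrossEntropyLoss()

for epoch in range(10):
    model.train()
    optimizer.zero_grad()
    y_hat_edge = model(x, laplacian_up=laplacian_up, laplacian_down=laplacian_down)
    loss = loss_fn(y_hat_edge, y_edge)
    loss.backward()
    optimizer.step()
    print(f"Epoch {epoch + 1}: loss = {loss.item():.4f}")
```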

## 🤖 Installing TopoModelX

`TopoModelX` is available on PyPI and can be installed using `pip`.
Run the following command:

```bash
pip install topomodelx
```

Then install `torch`, `torch-scatter`, and `torch-sparse`, with or without CUDA support, depending on your needs.
```bash
pip install torch==2.0.1 --extra-index-url https://download.pytorch.org/whl/${CUDA}
pip install torch-scatter torch-sparse -f https://data.pyg.org/whl/torch-2.0.1+${CUDA}.html
pip install torch-cluster -f https://data.pyg.org/whl/torch-2.0.1+${CUDA}.html
```
where `${CUDA}` should be replaced by either `cpu`, `cu117`, or `cu118`, depending on your PyTorch installation (`torch.version.cuda`).
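
If you are unsure which build you have, you can check from Python (a minimal sketch; the printed value depends on your installation):

```python
import torch

# Prints the CUDA version the installed PyTorch build was compiled with,
# e.g. "11.8" for a cu118 build; prints None for a CPU-only build.
print(torch.version.cuda)
```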

_**Note:** TMX is still under development._

## 🦾 Contributing to TMX

@@ -98,9 +151,19 @@ Then:

## 🔍 References ##

TMX is part of TopoX, a suite of Python packages for machine learning on topological domains. If you find TMX useful, please consider citing our software paper:

- Hajij et al. 2024. [TopoX: a suite of Python packages for machine learning on topological domains](https://arxiv.org/abs/2402.02441).

To learn more about the topological deep learning blueprint that TopoModelX follows:

- Mustafa Hajij, Ghada Zamzmi, Theodore Papamarkou, Nina Miolane, Aldo Guzmán-Sáenz, Karthikeyan Natesan Ramamurthy, Tolga Birdal, Tamal K. Dey, Soham Mukherjee, Shreyas N. Samaga, Neal Livesay, Robin Walters, Paul Rosen, Michael T. Schaub.
  [Topological Deep Learning: Going Beyond Graph Data](https://arxiv.org/abs/2206.00606) (arXiv) • [Topological Deep Learning: A Book](https://tdlbook.org/)

The topological neural networks implemented in TMX are surveyed in:

- Papillon et al. 2023. [Architectures of Topological Deep Learning: A Survey on Topological Neural Networks](https://arxiv.org/abs/2304.10031).
```
@misc{hajij2023topological,
title={Topological Deep Learning: Going Beyond Graph Data},
@@ -110,6 +173,21 @@ To learn more about the topological deep learning blueprint:
archivePrefix={arXiv},
primaryClass={cs.LG}
}
@article{hajij2024topox,
title={TopoX: a suite of Python packages for machine learning on topological domains},
author={PYT-Team},
journal={arXiv preprint arXiv:2402.02441},
year={2024}
}
@article{papillon2023architectures,
title={Architectures of Topological Deep Learning: A Survey of Message-Passing Topological Neural Networks},
author={Papillon, Mathilde and Sanborn, Sophia and Hajij, Mustafa and Miolane, Nina},
journal={arXiv preprint arXiv:2304.10031},
year={2023}
}
```
## Funding

7 changes: 4 additions & 3 deletions pyproject.toml
@@ -26,7 +26,7 @@ classifiers = [
requires-python = ">= 3.10"
dependencies=[
"tqdm",
"numpy",
"numpy<2",
"scipy",
"requests",
"scikit-learn",
@@ -67,8 +67,9 @@ dev = ["TopoModelX[test, lint]"]
all = ["TopoModelX[dev, doc]"]

[project.urls]
homepage="https://github.com/pyt-team/TopoModelX"
repository="https://github.com/pyt-team/TopoModelX"
documentation = "https://pyt-team.github.io/TopoModelX"
source = "https://github.com/pyt-team/TopoModelX"
issues = "https://github.com/pyt-team/TopoModelX/issues"

[tool.ruff]
target-version = "py310"
28 changes: 14 additions & 14 deletions test/tutorials/test_cell.py
@@ -9,20 +9,20 @@

def _exec_tutorial(path):
"""Execute a tutorial notebook."""
file_name = tempfile.NamedTemporaryFile(suffix=".ipynb").name
args = [
"jupyter",
"nbconvert",
"--to",
"notebook",
"--execute",
"--ExecutePreprocessor.timeout=1000",
"--ExecutePreprocessor.kernel_name=python3",
"--output",
file_name,
path,
]
subprocess.check_call(args)
with tempfile.NamedTemporaryFile(suffix=".ipynb") as tmp_file:
args = [
"jupyter",
"nbconvert",
"--to",
"notebook",
"--execute",
"--ExecutePreprocessor.timeout=1000",
"--ExecutePreprocessor.kernel_name=python3",
"--output",
tmp_file.name,
path,
]
subprocess.check_call(args)


paths = sorted(glob.glob("tutorials/cell/*.ipynb"))
28 changes: 14 additions & 14 deletions test/tutorials/test_combinatorial.py
@@ -9,20 +9,20 @@

def _exec_tutorial(path):
"""Execute a tutorial notebook."""
file_name = tempfile.NamedTemporaryFile(suffix=".ipynb").name
args = [
"jupyter",
"nbconvert",
"--to",
"notebook",
"--execute",
"--ExecutePreprocessor.timeout=1000",
"--ExecutePreprocessor.kernel_name=python3",
"--output",
file_name,
path,
]
subprocess.check_call(args)
with tempfile.NamedTemporaryFile(suffix=".ipynb") as tmp_file:
args = [
"jupyter",
"nbconvert",
"--to",
"notebook",
"--execute",
"--ExecutePreprocessor.timeout=1000",
"--ExecutePreprocessor.kernel_name=python3",
"--output",
tmp_file.name,
path,
]
subprocess.check_call(args)


paths = sorted(glob.glob("tutorials/combinatorial/*.ipynb"))
28 changes: 14 additions & 14 deletions test/tutorials/test_hypergraph.py
@@ -9,20 +9,20 @@

def _exec_tutorial(path):
"""Execute a tutorial notebook."""
file_name = tempfile.NamedTemporaryFile(suffix=".ipynb").name
args = [
"jupyter",
"nbconvert",
"--to",
"notebook",
"--execute",
"--ExecutePreprocessor.timeout=1500",
"--ExecutePreprocessor.kernel_name=python3",
"--output",
file_name,
path,
]
subprocess.check_call(args)
with tempfile.NamedTemporaryFile(suffix=".ipynb") as tmp_file:
args = [
"jupyter",
"nbconvert",
"--to",
"notebook",
"--execute",
"--ExecutePreprocessor.timeout=1500",
"--ExecutePreprocessor.kernel_name=python3",
"--output",
tmp_file.name,
path,
]
subprocess.check_call(args)


paths = sorted(glob.glob("tutorials/hypergraph/*.ipynb"))
28 changes: 14 additions & 14 deletions test/tutorials/test_simplicial.py
@@ -9,20 +9,20 @@

def _exec_tutorial(path):
"""Execute a tutorial notebook."""
file_name = tempfile.NamedTemporaryFile(suffix=".ipynb").name
args = [
"jupyter",
"nbconvert",
"--to",
"notebook",
"--execute",
"--ExecutePreprocessor.timeout=1000",
"--ExecutePreprocessor.kernel_name=python3",
"--output",
file_name,
path,
]
subprocess.check_call(args)
with tempfile.NamedTemporaryFile(suffix=".ipynb") as tmp_file:
args = [
"jupyter",
"nbconvert",
"--to",
"notebook",
"--execute",
"--ExecutePreprocessor.timeout=1000",
"--ExecutePreprocessor.kernel_name=python3",
"--output",
tmp_file.name,
path,
]
subprocess.check_call(args)


paths = sorted(glob.glob("tutorials/simplicial/*.ipynb"))
