
Commit

Publish package to pypi
bwconrad committed Aug 22, 2023
1 parent 72560d0 commit 4a6482a
Showing 6 changed files with 50 additions and 13 deletions.
18 changes: 9 additions & 9 deletions README.md
@@ -11,11 +11,11 @@ This implementation extends the [`timm`](https://github.com/huggingface/pytorch-

## Installation

-<!-- ``` -->
-<!-- pip install soft-moe-pytorch -->
-<!-- ``` -->
+```
+pip install soft-moe
+```

-<!-- Or install the entire repo with: -->
+Or install the entire repo with:

```
git clone https://github.com/bwconrad/soft-moe
@@ -29,7 +29,7 @@ pip install -r requirements.txt

```python
import torch
-from soft_moe_pytorch import SoftMoEVisionTransformer
+from soft_moe import SoftMoEVisionTransformer

net = SoftMoEVisionTransformer(
    num_experts=128,
@@ -51,9 +51,9 @@ preds = net(img)
Functions are also available to initialize default network configurations:

```python
-from soft_moe_pytorch import (soft_moe_vit_base, soft_moe_vit_huge,
-                              soft_moe_vit_large, soft_moe_vit_small,
-                              soft_moe_vit_tiny)
+from soft_moe import (soft_moe_vit_base, soft_moe_vit_huge,
+                      soft_moe_vit_large, soft_moe_vit_small,
+                      soft_moe_vit_tiny)

net = soft_moe_vit_tiny()
net = soft_moe_vit_small()
@@ -95,7 +95,7 @@ The `SoftMoELayerWrapper` class can be used to make any network layer, that take
import torch
import torch.nn as nn

-from soft_moe_pytorch import SoftMoELayerWrapper
+from soft_moe import SoftMoELayerWrapper

x = torch.rand(1, 16, 128)

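The README's wrapper example is truncated in this diff. As a rough sketch of how it might continue — the constructor arguments below (`dim`, `num_experts`, `slots_per_expert`, `layer`, and the pass-through `nn.Linear` kwargs) are assumptions inferred from the surrounding README text and the `SoftMoEVisionTransformer` arguments shown above, not part of this commit:

```python
# Sketch only: argument names are assumptions, not taken from this diff.
import torch
import torch.nn as nn

from soft_moe import SoftMoELayerWrapper

x = torch.rand(1, 16, 128)

# Wrap an nn.Linear so that each expert is its own linear layer; extra
# keyword arguments are assumed to be forwarded to nn.Linear's constructor.
layer = SoftMoELayerWrapper(
    dim=128,
    num_experts=32,
    slots_per_expert=1,
    layer=nn.Linear,
    in_features=128,
    out_features=32,
)

y = layer(x)
print(y.shape)  # expected torch.Size([1, 16, 32]), assuming the token dim is preserved
```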
36 changes: 36 additions & 0 deletions setup.py
@@ -0,0 +1,36 @@
from setuptools import find_packages, setup

with open("README.md") as f:
    long_description = f.read()

setup(
    name="soft_moe",
    packages=find_packages(),
    version="0.0.1",
    license="Apache-2.0",
    description="PyTorch implementation of 'From Sparse to Soft Mixtures of Experts'",
    long_description=long_description,
    long_description_content_type="text/markdown",
    author="Ben Conrad",
    author_email="benwconrad@proton.me",
    url="https://github.com/bwconrad/soft-moe",
    keywords=[
        "transformers",
        "artificial intelligence",
        "computer vision",
        "deep learning",
    ],
    install_requires=[
        "timm >= 0.9.2",
        "torch >= 2.0.1",
    ],
    classifiers=[
        "Intended Audience :: Developers",
        "Intended Audience :: Science/Research",
        "Topic :: Scientific/Engineering :: Artificial Intelligence",
        "License :: OSI Approved :: Apache Software License",
        "Programming Language :: Python :: 3.10",
        "Programming Language :: Python :: 3.11",
    ],
    python_requires=">=3.10",
)
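The upload commands themselves are not part of this commit; with this `setup.py`, a typical release flow (assuming the standard `build` and `twine` tools) might look like:

```
pip install build twine
python -m build
twine upload dist/*
```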
5 changes: 5 additions & 0 deletions soft_moe/__init__.py
@@ -0,0 +1,5 @@
from soft_moe.soft_moe import SoftMoELayerWrapper
from soft_moe.vision_transformer import (SoftMoEVisionTransformer,
                                         soft_moe_vit_base, soft_moe_vit_huge,
                                         soft_moe_vit_large,
                                         soft_moe_vit_small, soft_moe_vit_tiny)
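These re-exports are what allow the updated README examples to import everything from the top-level `soft_moe` package, for example:

```python
# Public API re-exported by soft_moe/__init__.py above
from soft_moe import SoftMoELayerWrapper, SoftMoEVisionTransformer, soft_moe_vit_tiny

net = soft_moe_vit_tiny()
```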
File renamed without changes.
File renamed without changes.
4 changes: 0 additions & 4 deletions soft_moe_pytorch/__init__.py

This file was deleted.
