Update README
pbloem committed Aug 3, 2021
1 parent 9f8849b commit ce7af9d
Showing 1 changed file, README.md, with 7 additions and 22 deletions.
# former

Simple transformer implementation from scratch in pytorch. See http://peterbloem.nl/blog/transformers for an in-depth explanation.

# Limitations

The models implemented here are designed to show the simplicity of transformer models and self-attention. As such they will not scale as far as the bigger transformers. For that you'll need a number of tricks that complicate the code (see the blog post for details).
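
To make the simplicity concrete: stripped of the learned query/key/value projections and the multiple heads that full transformer blocks add, self-attention is just a softmax over dot products. A minimal sketch in pytorch (illustrative only, not the code from this repository):

```
import torch
import torch.nn.functional as F

def basic_self_attention(x):
    # x holds a batch of token vectors, shape (batch, sequence, embedding).
    # Each output vector is a weighted average over all input vectors,
    # with the weights derived from dot products between the inputs.
    raw_weights = torch.bmm(x, x.transpose(1, 2))    # (batch, seq, seq)
    weights = F.softmax(raw_weights, dim=2)          # each row sums to one
    return torch.bmm(weights, x)                     # (batch, seq, embedding)

x = torch.randn(4, 10, 32)                 # 4 sequences, 10 tokens, 32-dim embeddings
print(basic_self_attention(x).shape)       # torch.Size([4, 10, 32])
```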

All models in the repository consist of a single stack of transformer blocks (that is, no encoder/decoder structures). It turns out that this simple configuration often works best.
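
Schematically, such a model is token and position embeddings, followed by the stack of blocks, followed by a small output head. A rough sketch of that shape for classification, using pytorch's built-in encoder layer rather than the simpler blocks defined in this repository (all class and argument names below are illustrative):

```
import torch
from torch import nn

class ToyTransformer(nn.Module):
    # A single stack of transformer blocks; no separate encoder/decoder.
    def __init__(self, num_tokens, emb=128, heads=4, depth=6, num_classes=2, max_len=512):
        super().__init__()
        self.token_emb = nn.Embedding(num_tokens, emb)
        self.pos_emb = nn.Embedding(max_len, emb)
        block = nn.TransformerEncoderLayer(d_model=emb, nhead=heads, batch_first=True)
        self.blocks = nn.TransformerEncoder(block, num_layers=depth)
        self.to_logits = nn.Linear(emb, num_classes)

    def forward(self, x):                       # x: (batch, seq) of token indices
        positions = torch.arange(x.size(1), device=x.device)
        h = self.token_emb(x) + self.pos_emb(positions)[None, :, :]
        h = self.blocks(h)                      # the single stack of blocks
        return self.to_logits(h.mean(dim=1))    # mean-pool over time, then classify

model = ToyTransformer(num_tokens=10_000)
print(model(torch.randint(0, 10_000, (4, 128))).shape)   # torch.Size([4, 2])
```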

# Installation and use

First, download or clone the repository. Then, in the directory that contains setup.py, run

```
pip install -e .
```

The switch `-e` ensures that when you edit the code, the installed package changes as well. This means that you can, for instance, add print statements to the code to see how it works.
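
A quick way to check that the editable install points at your working copy (this assumes the package is importable as `former`, as the repository name suggests):

```
python -c "import former; print(former.__file__)"
```

The printed path should be inside the directory you cloned, not inside site-packages.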

Then, from the same directory, run:

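Assuming the classification experiment lives at `experiments/classify.py` (check the experiments directory for the actual script name), the command looks like:

```
python experiments/classify.py
```
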
This will run a simple classification experiment on the IMDb dataset.
Hyperparameters are passed as command line arguments. The defaults should work well. The classification data is
automatically downloaded, and the wikipedia data is included in the repository.
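
Hyperparameter names and their defaults can be inspected from the command line, assuming the script uses Python's standard argparse and the path above is correct:

```
python experiments/classify.py --help
```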

You should be able to install directly from github as a package as well, with
```
pip install git+https://github.com/pbloem/former
```
but I haven't tried this. It's probably easier to just copy over the code you need.

### Requirements

Python 3.6+ is required. The `pip install -e .` command above should install all required packages (torch, tb-nightly, tqdm, numpy and torchtext). You may also need
```pip install future```
depending on the exact python version.
