# BERT-Multi-Label

Try out BERT, because why not.

If only transformers had a longer sequence length, the world would be a better place.

`dataset.py` and `train.py` are fairly general for any kind of BERT fine-tuning task. Change the model and you are good to go.
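For a rough picture, here is a minimal sketch of what such a dataset/model pair might look like. The class names, label layout, and hyperparameters are all made up for illustration, and the `tez.Model` subclass API used here (a `forward` that returns `(output, loss, metrics)` plus a `fetch_optimizer` hook) is an assumption based on the early tez releases, so check it against the version you install:

```python
import torch
import torch.nn as nn
import tez
from transformers import AutoModel, AutoTokenizer

class BERTDataset(torch.utils.data.Dataset):
    """Pairs raw text with multi-hot label vectors, tokenized for BERT."""

    def __init__(self, texts, labels, max_len=512):
        self.texts = texts      # list of strings
        self.labels = labels    # list of multi-hot vectors, e.g. [1.0, 0.0, 1.0]
        self.max_len = max_len  # BERT tops out at 512 tokens
        self.tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

    def __len__(self):
        return len(self.texts)

    def __getitem__(self, idx):
        enc = self.tokenizer(
            self.texts[idx],
            truncation=True,
            padding="max_length",
            max_length=self.max_len,
            return_tensors="pt",
        )
        return {
            "ids": enc["input_ids"].squeeze(0),
            "mask": enc["attention_mask"].squeeze(0),
            "targets": torch.tensor(self.labels[idx], dtype=torch.float),
        }

class BERTMultiLabel(tez.Model):
    """BERT encoder + linear head; swap the encoder to reuse for other tasks."""

    def __init__(self, num_classes):
        super().__init__()
        self.bert = AutoModel.from_pretrained("bert-base-uncased")
        self.dropout = nn.Dropout(0.3)
        self.out = nn.Linear(self.bert.config.hidden_size, num_classes)

    def fetch_optimizer(self):
        return torch.optim.AdamW(self.parameters(), lr=3e-5)

    def forward(self, ids, mask, targets=None):
        pooled = self.bert(input_ids=ids, attention_mask=mask).pooler_output
        logits = self.out(self.dropout(pooled))
        loss = None
        if targets is not None:
            # BCEWithLogitsLoss scores each label independently --
            # this is what makes the head multi-label rather than multi-class.
            loss = nn.BCEWithLogitsLoss()(logits, targets)
        return logits, loss, {}  # assumed tez contract: (output, loss, metrics)
```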

`main.py` is hardcoded because I was lazy. Feel free to get inspired and change it as required.
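In that spirit, a hardcoded `main.py` can be very short. The CSV name, label columns, and batch size below are placeholders, and the `fit()` keyword arguments are an assumption about the tez version you install:

```python
import pandas as pd

# Hypothetical CSV with a "text" column and one 0/1 column per label.
df = pd.read_csv("train.csv")
dataset = BERTDataset(
    texts=df["text"].tolist(),
    labels=df[["label_0", "label_1", "label_2"]].values.tolist(),
)

model = BERTMultiLabel(num_classes=3)
model.fit(dataset, train_bs=32, epochs=10, device="cuda")
```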

## How to run?

  1. Change `main.py` as required.
  2. Make changes to the model in `train.py`.
  3. `pip install -U transformers tez`
  4. `python3 main.py` and watch the loss go brr (a quick inference sketch follows this list).
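Once the loss has gone brr, turning logits into predictions is just a sigmoid and a threshold. The 0.5 cutoff here is the usual default for multi-label classification, not something this repo prescribes:

```python
import torch

device = next(model.parameters()).device  # fit() may have moved the model to GPU
batch = dataset[0]
with torch.no_grad():
    logits, _, _ = model(
        batch["ids"].unsqueeze(0).to(device),
        batch["mask"].unsqueeze(0).to(device),
    )
preds = (torch.sigmoid(logits) > 0.5).int()  # multi-hot prediction vector
```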