- conll2003
- atis
- CharLSTM + WordLSTM + CRF: Lample et al., NAACL 2016
  - Make a CoNLL-2003 batcher (sketch below)
  - Implement trainer (sketch below)
  - Implement WordLSTM + softmax
  - Implement CharLSTM + WordLSTM + softmax (sketch below)
  - Implement CharLSTM + WordLSTM + CRF (sketch below)
  - Transformer encoder + CRF (sketch below)
  - BERT encoder + CRF
  - PyTorch JIT-compilable Viterbi decoder: https://github.com/atulkum/sequence_prediction/blob/master/NER_BERT/decoder.py#L9 (sketch below)
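The batcher item is the usual CoNLL-2003 plumbing: read the four-column files (word, POS, chunk, NER tag, with blank lines between sentences) and pad sentences into index tensors. A minimal sketch, assuming a word/tag vocabulary built elsewhere; `PAD`, `<unk>`, and the lower-casing are placeholder choices, not the repo's actual code.

```python
import torch

PAD = 0  # assumed padding index for both words and tags

def read_conll(path):
    """Yield (words, ner_tags) per sentence from a CoNLL-2003 file."""
    words, tags = [], []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("-DOCSTART-"):
                if words:
                    yield words, tags
                    words, tags = [], []
                continue
            cols = line.split()
            words.append(cols[0])
            tags.append(cols[-1])  # the NER tag is the last column
    if words:
        yield words, tags

def make_batch(sentences, word2idx, tag2idx):
    """Pad a list of (words, tags) pairs into LongTensors plus lengths."""
    lengths = torch.tensor([len(w) for w, _ in sentences])
    max_len = int(lengths.max())
    word_ids = torch.full((len(sentences), max_len), PAD, dtype=torch.long)
    tag_ids = torch.full((len(sentences), max_len), PAD, dtype=torch.long)
    for i, (words, tags) in enumerate(sentences):
        for j, (w, t) in enumerate(zip(words, tags)):
            word_ids[i, j] = word2idx.get(w.lower(), word2idx["<unk>"])
            tag_ids[i, j] = tag2idx[t]
    return word_ids, tag_ids, lengths
```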
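A sketch of the CharLSTM + WordLSTM + softmax tagger in the spirit of Lample et al.: a character BiLSTM yields a per-word character feature, it is concatenated with a word embedding, a word-level BiLSTM contextualizes the sentence, and a linear layer scores the tags. All sizes and the `char_ids` layout ([batch, seq_len, max_word_len]) are assumptions, not the repo's interface; dropping the character pieces gives the plain WordLSTM + softmax variant.

```python
import torch
import torch.nn as nn

class CharWordTagger(nn.Module):
    # Hypothetical sizes; the real model's hyperparameters may differ.
    def __init__(self, n_words, n_chars, n_tags,
                 word_dim=100, char_dim=25, char_hidden=25, word_hidden=100):
        super().__init__()
        self.word_emb = nn.Embedding(n_words, word_dim, padding_idx=0)
        self.char_emb = nn.Embedding(n_chars, char_dim, padding_idx=0)
        self.char_lstm = nn.LSTM(char_dim, char_hidden,
                                 bidirectional=True, batch_first=True)
        self.word_lstm = nn.LSTM(word_dim + 2 * char_hidden, word_hidden,
                                 bidirectional=True, batch_first=True)
        self.out = nn.Linear(2 * word_hidden, n_tags)

    def forward(self, word_ids, char_ids):
        # word_ids: [B, T], char_ids: [B, T, L]
        B, T, L = char_ids.shape
        chars = self.char_emb(char_ids.view(B * T, L))      # [B*T, L, char_dim]
        _, (h, _) = self.char_lstm(chars)                    # h: [2, B*T, char_hidden]
        char_feats = torch.cat([h[0], h[1]], dim=-1).view(B, T, -1)
        words = torch.cat([self.word_emb(word_ids), char_feats], dim=-1)
        ctx, _ = self.word_lstm(words)                       # [B, T, 2*word_hidden]
        return self.out(ctx)                                 # per-token tag logits
```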
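A minimal training-loop sketch for the softmax variants, assuming the batcher above and a model that maps `word_ids` to per-token logits; the optimizer, gradient-clipping value, and padding index are placeholder choices.

```python
import torch
import torch.nn as nn

def train_epoch(model, batches, optimizer, pad_tag: int = 0):
    """One epoch of per-token tag prediction with cross-entropy."""
    criterion = nn.CrossEntropyLoss(ignore_index=pad_tag)
    model.train()
    total = 0.0
    for word_ids, tag_ids, _lengths in batches:
        optimizer.zero_grad()
        logits = model(word_ids)                             # [B, T, n_tags]
        loss = criterion(logits.view(-1, logits.size(-1)), tag_ids.view(-1))
        loss.backward()
        torch.nn.utils.clip_grad_norm_(model.parameters(), 5.0)
        optimizer.step()
        total += float(loss)
    return total / max(len(batches), 1)
```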
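For the CRF variants, the cross-entropy loss is replaced by a linear-chain CRF negative log-likelihood over the emission scores. A compact single-sequence sketch (gold path score minus the log-partition from the forward algorithm); batching, masking, and explicit start/stop transitions are left out and would be needed in a full implementation.

```python
import torch
import torch.nn as nn

class LinearChainCRF(nn.Module):
    """Minimal CRF negative log-likelihood for a single unpadded sequence."""
    def __init__(self, n_tags):
        super().__init__()
        # transitions[i, j] = score of moving from tag i to tag j
        self.transitions = nn.Parameter(torch.randn(n_tags, n_tags) * 0.01)

    def neg_log_likelihood(self, emissions, tags):
        # emissions: [T, n_tags], tags: [T] gold tag indices
        gold = emissions[0, tags[0]]
        for t in range(1, emissions.size(0)):
            gold = gold + self.transitions[tags[t - 1], tags[t]] + emissions[t, tags[t]]

        # Forward algorithm: alpha[j] = log-sum of scores of all paths ending in tag j.
        alpha = emissions[0]
        for t in range(1, emissions.size(0)):
            alpha = torch.logsumexp(alpha.unsqueeze(1) + self.transitions, dim=0) + emissions[t]
        log_partition = torch.logsumexp(alpha, dim=0)
        return log_partition - gold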
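The linked decoder.py is the repo's TorchScript-compatible Viterbi decoder; the function below is an independent illustration of the same idea (typed arguments, explicit loops, backpointers) written so that torch.jit.script can compile it, not a copy of that file.

```python
import torch

@torch.jit.script
def viterbi_decode(emissions: torch.Tensor, transitions: torch.Tensor) -> torch.Tensor:
    """Return the best tag sequence for emissions [T, K] and transitions [K, K]."""
    T, K = emissions.size(0), emissions.size(1)
    score = emissions[0]                              # best score ending in each tag so far
    backptr = torch.zeros([T, K], dtype=torch.long)
    for t in range(1, T):
        # candidate[i, j] = score[i] + transitions[i, j]
        candidate = score.unsqueeze(1) + transitions
        best_score, best_prev = torch.max(candidate, dim=0)
        backptr[t] = best_prev
        score = best_score + emissions[t]
    # Follow backpointers from the best final tag.
    best = int(torch.argmax(score))
    tags = torch.zeros([T], dtype=torch.long)
    tags[T - 1] = best
    for t in range(T - 1, 0, -1):
        best = int(backptr[t, best])
        tags[t - 1] = best
    return tags
```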
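For the Transformer encoder + CRF and BERT encoder + CRF items, only the emission producer changes: a self-attention encoder (or a pretrained BERT's hidden states) replaces the BiLSTM, and the same CRF loss and Viterbi decoder sit on top. A sketch using nn.TransformerEncoder; positional encodings are omitted and all sizes are placeholders.

```python
import torch
import torch.nn as nn

class TransformerEmissions(nn.Module):
    """Self-attention encoder producing per-token tag emissions for a CRF."""
    def __init__(self, n_words, n_tags, d_model=128, n_heads=4, n_layers=2):
        super().__init__()
        self.emb = nn.Embedding(n_words, d_model, padding_idx=0)
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=256, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.out = nn.Linear(d_model, n_tags)

    def forward(self, word_ids):
        # NOTE: positional encodings are omitted for brevity; a real encoder needs them.
        # A pretrained BERT would replace self.emb + self.encoder here.
        mask = word_ids.eq(0)                          # True at padded positions
        h = self.encoder(self.emb(word_ids), src_key_padding_mask=mask)
        return self.out(h)                             # emissions for the CRF / Viterbi decoder
```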
- Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling: Liu and Lane, Interspeech 2016
  - Make an ATIS batcher (sketch below)
  - Implement trainer
  - Implement slot filler (sketch below)
  - Implement intent classifier (sketch below)
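A sketch of what the ATIS batcher could look like, assuming the examples have already been mapped to word-index lists with per-token slot labels and one intent label per utterance; the actual layout of the downloaded splits may differ.

```python
import torch

PAD = 0  # assumed padding index

def make_atis_batch(examples):
    """examples: list of (word_idx_list, slot_idx_list, intent_idx) triples."""
    lengths = torch.tensor([len(w) for w, _, _ in examples])
    max_len = int(lengths.max())
    words = torch.full((len(examples), max_len), PAD, dtype=torch.long)
    slots = torch.full((len(examples), max_len), PAD, dtype=torch.long)
    intents = torch.tensor([intent for _, _, intent in examples], dtype=torch.long)
    for i, (w, s, _) in enumerate(examples):
        words[i, : len(w)] = torch.tensor(w, dtype=torch.long)
        slots[i, : len(s)] = torch.tensor(s, dtype=torch.long)
    return words, slots, intents, lengths
```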
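A simplified sketch of the joint slot-filling + intent model: a BiLSTM encoder, per-token slot logits, and an intent prediction from an attention-weighted pooling of the encoder states. The pooling here is a single learned scoring layer, which simplifies the attention mechanisms described in the Liu and Lane paper; all sizes are placeholders.

```python
import torch
import torch.nn as nn

class JointSlotIntent(nn.Module):
    def __init__(self, n_words, n_slots, n_intents, emb_dim=100, hidden=128):
        super().__init__()
        self.emb = nn.Embedding(n_words, emb_dim, padding_idx=0)
        self.encoder = nn.LSTM(emb_dim, hidden, bidirectional=True, batch_first=True)
        self.slot_out = nn.Linear(2 * hidden, n_slots)      # per-token slot filler
        self.attn = nn.Linear(2 * hidden, 1)                 # pooling weights for intent
        self.intent_out = nn.Linear(2 * hidden, n_intents)

    def forward(self, word_ids):
        mask = word_ids.ne(0)                                 # [B, T], True on real tokens
        h, _ = self.encoder(self.emb(word_ids))               # [B, T, 2*hidden]
        slot_logits = self.slot_out(h)                        # [B, T, n_slots]
        scores = self.attn(h).squeeze(-1).masked_fill(~mask, float("-inf"))
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1) # [B, T, 1]
        context = (weights * h).sum(dim=1)                    # [B, 2*hidden]
        intent_logits = self.intent_out(context)              # [B, n_intents]
        return slot_logits, intent_logits
```

Training would typically sum a cross-entropy loss over the slot logits (ignoring padded positions) and a cross-entropy loss over the intent logits.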
Install PyTorch with `conda install pytorch -c pytorch`.

The CoNLL-2003 dataset can be downloaded from https://www.clips.uantwerpen.be/conll2003/ner/

The ATIS dataset can be downloaded from its five splits: split 0, split 1, split 2, split 3, split 4.