# NeuralClassifierService: A Service for Multi-label Text Classification Toolkit
## Introduction
NeuralClassifier is designed for quick implementation of neural models for hierarchical multi-label classification tasks, which are more challenging and common in real-world scenarios. A salient feature is that NeuralClassifier currently provides a variety of text encoders, including FastText, TextCNN, TextRNN, RCNN, VDCNN, DPCNN, DRNN, AttentiveConvNet, and the Transformer encoder. It also supports other text classification scenarios, including binary-class and multi-class classification. It is built on [PyTorch](https://pytorch.org/). Experiments show that models built in our toolkit achieve performance comparable with the results reported in the literature.
This service is for using the parent repository after it has been trained with your data sets; if you want to serve your trained model behind a deep learning backend, fork this web wrapper repository.
## Supported tasks

* Binary-class text classification
* Multi-class text classification
* Multi-label text classification
* Hierarchical (multi-label) text classification (HMC)

## Notice

* predict.json should be in JSON format, and each instance must carry a dummy label, e.g. "其他" ("Other") or any other label in the label map.
* eval.model\_dir is the path of the model used for prediction.
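For illustration, one instance in predict.json might be built like this. This is a sketch: the field names assume the parent NeuralClassifier schema, and the token values are placeholders.

```python
import json

# Sketch of one predict.json instance (field names assume the parent
# NeuralClassifier schema; token values are placeholders).
instance = {
    "doc_label": ["其他"],  # dummy label; any label from the label map works
    "doc_token": ["placeholder", "tokens"],
    "doc_keyword": [],      # optional
    "doc_topic": [],        # optional
}
line = json.dumps(instance, ensure_ascii=False)
```

Each such line is one instance; the dummy label is ignored at prediction time but must be present.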
The prediction output will be written to predict.txt.
"doc_keyword" and "doc_topic" are optional.
## Start the classification service
This service contains two modules: a deep learning backend and a Flask wrapper, which communicate over a socket on port 4444. On top of these sits a shell wrapper that manages the two processes: it starts them, keeps them running, and kills them together when a signal is received.
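A minimal sketch of how the Flask wrapper might talk to the backend over that socket. Port 4444 comes from the description above, but the newline-delimited request/response protocol and the `query_backend` helper are assumptions for illustration:

```python
import socket

BACKEND_PORT = 4444  # port the deep learning backend listens on

def query_backend(posts, host="127.0.0.1", port=BACKEND_PORT):
    # Hypothetical protocol sketch: send newline-separated posts,
    # then read the newline-separated predicted labels back.
    with socket.create_connection((host, port)) as conn:
        conn.sendall(("\n".join(posts) + "\n").encode("utf-8"))
        conn.shutdown(socket.SHUT_WR)  # signal end of request
        data = b""
        while chunk := conn.recv(4096):
            data += chunk
        return data.decode("utf-8").splitlines()
```

The real wire format lives in the backend code; this only illustrates the socket round trip between the two processes.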
```shell
./start.sh  # start the whole service
```
The web page is served on host 0.0.0.0, port 55555 by default; edit these in server.py.
The main page lets you test whether your config is valid: paste a post into the textarea, click `purge newline`, then click `Go` to get the prediction results. Posts are split by newlines, which is why newlines must be purged from a single post.
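Since posts are split on newlines, the wrapper's parsing step can be sketched as follows (the function name is hypothetical):

```python
def split_posts(text: str) -> list[str]:
    # Each non-empty line of the textarea is one post; an internal
    # newline would incorrectly split a single post in two, which is
    # why "purge newline" must be applied first.
    return [line.strip() for line in text.splitlines() if line.strip()]
```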
A headless JSON API is also provided: send JSON to `[host]:[port]/headless` and receive a JSON response with the results.
**send:**
```json
{"posts": ["blah", "blah blah", "blah blah blah"]}
```
**and get:**
```json
{"results": ["class of blah", "class of blah blah", "class of blah blah blah"]}
```
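The client-side handling of these payloads can be sketched with the standard library. The helper names are hypothetical; only the JSON shapes follow the examples above:

```python
import json

def build_request_body(posts):
    # Hypothetical helper: JSON payload expected by the /headless endpoint
    return json.dumps({"posts": posts})

def parse_results(raw_response):
    # Hypothetical helper: pull the per-post labels out of the JSON reply
    return json.loads(raw_response)["results"]
```

POST the body of `build_request_body` to `[host]:[port]/headless` with `Content-Type: application/json` and feed the response text to `parse_results`.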
## Demo environment
* Training dataset: Toutiao, 16 single labels, 280k posts, length 1536, eval precision 91%