_______ _______ ______ _______ _______ _______ ______ __ __
| _ || || _ | | | | || || _ | | | | |
| |_| || ___|| | || |_ _| ____ | _____|| ___|| | || | |_| |
| || |___ | |_||_ | | |____| | |_____ | |___ | |_||_ | |
| _ | | ___|| __ | | | |_____ || ___|| __ || |
| |_| || |___ | | | | | | _____| || |___ | | | | | |
|_______||_______||___| |_| |___| |_______||_______||___| |_| |___|
BERT-serv provides FinBERT sentiment analysis as a service. Send financial text via HTTP and get sentiment analysis back from the pretrained model.
❗ Only the FinBERT sentiment model is implemented at this time.
- FinBERT analysis made available on-demand to HTTP clients (no Python required!)
- Avoid adding PyTorch dependencies to other parts of your system
- Results are saved and can be queried
- Analysis requests are handled asynchronously
Install Docker Desktop if needed.
git clone https://github.com/daveminer/BERT-serv.git
cd BERT-serv
cp .env.dist .env
docker-compose up
BERT-serv is now running on local port 8000!
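As a quick smoke test (assuming the containers have finished starting), the sentiment listing endpoint documented below can be queried; it should respond with an empty list until some text has been analyzed:
curl --header 'Accept: application/json' http://localhost:8000/sentiment/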
Send a POST request to the /sentiment/new/ path. A callback_url may be specified in the query parameters for asynchronous use cases and long-running sentiment analyses. This callback will have a JSON object in the body with an array of the new sentiment record ids: {"ids": [95, 96]}
The body of the POST request must be a list of objects, each of which contains these fields:
- article_id: An identifier for the text; this ends up on the database record as well as in the callback payload.
- tags: The tags to be associated with the sentiment record; useful for filtering on existing sentiment records.
- text: The text to analyze.
curl --request POST \
--url 'http://localhost:8000/sentiment/new?callback_url=http://web:8000/callback/sentiment/new/' \
--header 'Content-Type: application/json' \
--data '[{"article_id": 1, "tags": ["stock"], "text": "year over year growth is increasing"}, {"article_id": 2, "tags": ["stock"], "text": "there is a shortage of capital, and we need extra financing"}]'
If a callback is specified, the following payload will be sent to the callback URL:
{
  "results": [
    {
      "article_id": 1,
      "sentiment": {
        "label": "Positive",
        "score": 0.9999837875366211,
        "tags": ["stock"]
      }
    }
  ]
}
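For local experimentation, one way to see the callback payload is to point callback_url at a throwaway listener on the host. The sketch below uses netcat on the hypothetical port 8001 and assumes Docker Desktop, where containers can reach the host as host.docker.internal; the listener only dumps the raw request and never responds, so the service may log a callback error afterwards.
# Terminal 1: dump the incoming callback request (GNU netcat uses `nc -l -p 8001`)
nc -l 8001
# Terminal 2: submit text, pointing the callback at the listener on the host
curl --request POST \
  --url 'http://localhost:8000/sentiment/new?callback_url=http://host.docker.internal:8001/' \
  --header 'Content-Type: application/json' \
  --data '[{"article_id": 3, "tags": ["stock"], "text": "profits exceeded expectations"}]'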
Responses default to HTML unless application/json is specified in the Accept header.
To list existing sentiment records, make a GET request to the /sentiment/ path.
curl --request GET \
--url http://localhost:8000/sentiment/ \
--header 'Accept: application/json'
The output will look like this:
[
  {
    "created_at": "2024-08-23T02:19:45.657",
    "label": "Negative",
    "score": 0.9966173768043518,
    "tags": ["stock"],
    "text": "there is a shortage of capital, and we need extra financing"
  },
  {
    "created_at": "2024-08-23T02:19:45.657",
    "label": "Positive",
    "score": 0.9999837875366211,
    "tags": ["stock"],
    "text": "year over year growth is increasing"
  }
]
To fetch a single record, add the index of the sentiment resource to the /sentiment/ path:
curl --request GET \
--url http://localhost:8000/sentiment/1/ \
--header 'Accept: application/json'
A comma-separated list of tags may be specified in the tags query parameter. The period query parameter accepts an integer that specifies how many days back to look for sentiments.
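For example, the following request (the tag value and seven-day window are illustrative) filters the listing to records tagged stock that were created in the last week:
curl --request GET \
  --url 'http://localhost:8000/sentiment/?tags=stock&period=7' \
  --header 'Accept: application/json'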
The sentiment index controller uses Django's built-in pagination. The /sentiment/ path accepts page_size and page query parameters.
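For example, a request for the second page with five records per page (both values are illustrative) would look like:
curl --request GET \
  --url 'http://localhost:8000/sentiment/?page=2&page_size=5' \
  --header 'Accept: application/json'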
The make services command will start all of the services besides the app. This allows the app to be started and stopped (with make app) in the terminal for convenience during development. Note that make services requires the Postgres database to be running; a database can be started with make db if one isn't running already.
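A typical startup sequence for the supporting services (using the make targets named above) looks like this:
# Start Postgres if it isn't already running
make db
# Start the remaining services (requires Docker Desktop)
make services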
Local development requires setting up a local Python environment alongside the containerized services:
sudo apt-get update
sudo apt install python3 python3-pip python3.12-venv -y
python3 -m venv .venv
source .venv/bin/activate
Notes:
- make services requires Docker Desktop.
- make deps will install dependencies via pip3 and must be run before make app. This can take a few minutes as the PyTorch dependencies are sizable.
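With the services running and the virtual environment from the previous section activated, a typical local run of the app looks like this:
# Install Python dependencies (the PyTorch download is large, so this may take a while)
make deps
# Start the app in the foreground; stop and restart it as needed during development
make app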
The docker-compose-services.yml file is intended to stand this service and its dependencies up against an external Postgres instance.
# To run:
docker-compose -f docker-compose-services.yml up