streaming with flask and nginx
frankling2020 committed Mar 23, 2024
1 parent 905091e commit be07e23
Showing 15 changed files with 199 additions and 20,387 deletions.
8 changes: 4 additions & 4 deletions .github/dependabot.yml
@@ -3,15 +3,15 @@ updates:
   # Enable version updates for npm
   - package-ecosystem: "npm"
     # Look for `package.json` and `lock` files in the `frontend` directory
-    directory: "/"
+    directory: "frontend"
     # Check the npm registry for updates every day (weekdays)
     schedule:
       interval: "weekly"

   # Enable version updates for pip
   - package-ecosystem: "pip"
     # Look for a `requirements.txt` in the `backend` directory
-    directory: "/"
+    directory: "backend"
     # Check for updates once a week
     schedule:
       interval: "weekly"
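For reference, the dependabot.yml produced by this change would look roughly like the sketch below. One caveat: Dependabot's `directory` key conventionally takes a path from the repository root with a leading slash (e.g. `/frontend`), so the bare values committed here are an assumption; adjust them if update PRs stop appearing.

```yaml
version: 2
updates:
  # Watch the React frontend's package.json and lock files
  - package-ecosystem: "npm"
    directory: "frontend"   # conventionally "/frontend"
    schedule:
      interval: "weekly"

  # Watch the Flask backend's requirements.txt
  - package-ecosystem: "pip"
    directory: "backend"    # conventionally "/backend"
    schedule:
      interval: "weekly"
```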
15 changes: 15 additions & 0 deletions README.md
@@ -7,6 +7,9 @@
</p>

[![CodeQL](https://github.com/frankling2020/react-chatgpt/actions/workflows/github-code-scanning/codeql/badge.svg)](https://github.com/frankling2020/react-chatgpt/actions/workflows/github-code-scanning/codeql)
![GitHub License](https://img.shields.io/github/license/frankling2020/react-chatgpt?cacheSeconds=86400)
![GitHub Release](https://img.shields.io/github/v/release/frankling2020/react-chatgpt?cacheSeconds=86400)


## Simple Summarizer
### Description
@@ -17,8 +20,20 @@
<b>Tools</b>: <em>Docker, OpenAI, ReactJS, Flask (gunicorn + gevent), Celery (Redis + MongoDB), Nginx, JMeter</em>
</p>

<div align="center">
<img src="examples/demo.gif" style="width:90%">
</div>

---
For different usages, the following snapshots are available:
- Frontend only: [frankling2020/react-chatgpt@3f264a8](https://github.com/frankling2020/react-chatgpt/tree/3f264a8)
- Frontend and Backend (Flask): [frankling2020/react-chatgpt@0288114](https://github.com/frankling2020/react-chatgpt/tree/0288114)
- With Celery and Redis: [frankling2020/react-chatgpt@a107670](https://github.com/frankling2020/react-chatgpt/tree/a107670)
- With Nginx: [frankling2020/react-chatgpt@11abd0b](https://github.com/frankling2020/react-chatgpt/tree/11abd0b)
- With JMeter: [frankling2020/react-chatgpt@9abe8ed](https://github.com/frankling2020/react-chatgpt/tree/9abe8ed)
- Streaming with Flask and Nginx (this commit, be07e23)

---
#### Why These Tools?
- **Docker**: Docker is widely used for deploying web applications. Many cloud providers, such as AWS, Google Cloud, and Azure, natively support and integrate with Docker, which makes it easier to deploy and manage containerized applications in their environments. Docker also has a mature set of tooling and orchestration solutions, such as Docker Compose, Docker Swarm, and integrations with Kubernetes, which simplify managing and scaling containerized applications.
- **ReactJS**: React is a popular choice for building modern web application frontends. React's component-based architecture promotes modularity, reusability, and encapsulation of UI elements. This makes it easier to build and maintain complex user interfaces by breaking them down into smaller, self-contained components.
44 changes: 25 additions & 19 deletions backend/celery_task.py
@@ -18,47 +18,53 @@
 celery_app.config_from_object("celeryconfig")


-@celery_app.task
-def fetch_summary_from_openai(api, query):
+def openai_response(api, query, stream=False):
     """Fetch a summary from OpenAI using the provided query.
     Args:
         api (str): The OpenAI API key.
         query (str): The query to send to the OpenAI model.
+        stream (bool): A boolean flag to indicate whether to stream the response.
     Returns:
-        dict: A dictionary containing the response content, a list of sorted keywords,
-        and the Jaccard similarity score.
+        dict: A dictionary containing the submitted Celery task ID.
     """
-    # Create OpenAI client and send the query to the model
+    # Create a Celery task and send the query to the task
     client = OpenAI(api_key=api)
     completion = client.chat.completions.create(
-        model="gpt-3.5-turbo",
+        model="gpt-4",
         messages=[
             {"role": "system", "content": instruction_text},
             {"role": "user", "content": prompt_text + query}
         ],
         temperature=0.5,
-        top_p=1
+        top_p=1,
+        stream=stream
     )
     return completion


+@celery_app.task
+def fetch_summary_from_openai(api, query):
+    """Fetch a summary from OpenAI using the provided query.
+    Args:
+        api (str): The OpenAI API key.
+        query (str): The query to send to the OpenAI model.
+    Returns:
+        dict: A dictionary containing the response content, a list of sorted keywords,
+        and the Jaccard similarity score.
+    """
+    # Create OpenAI client and send the query to the model
+    completion = openai_response(api, query)
     # Default response in case of an error
-    default_response = {"content": "Error: No response from OpenAI", "keywords": [], "jaccard": 0.0}
+    default_response = {"content": "Success"}
     try:
         # Extract response content
         response = completion.choices[0].message.content
         paragraphs = response.split("\n\n")
         raw_response = "\n\n".join(paragraphs[:-1])
         # Extract keywords from the response and sort them by length
         keywords = paragraphs[-1].split(": ")[1].split(", ")
         sorted_keywords = set(sorted(keywords, key=len, reverse=True))
         # Compute how many keywords are in the query and the response
         query_keywords = set(query.split()).intersection(sorted_keywords)
         response_keywords = set(raw_response.split()).intersection(query_keywords)
         jaccard = round(len(response_keywords) / len(sorted_keywords), 3)
         # Update the default response with the extracted data
         default_response["content"] = response
         default_response["keywords"] = list(sorted_keywords)
         default_response["jaccard"] = jaccard
     except Exception as e:
         # Handle any exceptions that may occur
         default_response.update({"content": f"Error: {e}"})
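The key change above is the `stream` flag on `openai_response`: with `stream=False` the client returns one finished completion object, while `stream=True` returns an iterator of chunks whose `delta.content` fragments arrive as the model generates them. A minimal sketch of the difference, assuming the v1 `openai` Python client used in this file (API key and prompt are placeholders):

```python
from openai import OpenAI

client = OpenAI(api_key="sk-...")  # placeholder key
messages = [{"role": "user", "content": "Summarize: ..."}]

# stream=False: block until the whole completion is ready
completion = client.chat.completions.create(model="gpt-4", messages=messages)
print(completion.choices[0].message.content)

# stream=True: iterate over chunks as they arrive; delta.content is None
# on bookkeeping chunks, hence the truthiness check
stream = client.chat.completions.create(model="gpt-4", messages=messages, stream=True)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
```

This is also why the streaming path cannot stay behind Celery: a task result is serialized once, when the task finishes, whereas the chunk iterator has to be consumed inside the request itself — which is what the new `/stream` route in server.py does by calling `openai_response` directly instead of submitting a task.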
24 changes: 21 additions & 3 deletions backend/server.py
@@ -7,10 +7,10 @@
     app (Flask): The Flask application instance.
 """

-from flask import Flask, request, jsonify
+from flask import Flask, request, jsonify, stream_with_context
 from flask_cors import CORS
 from celery.result import AsyncResult
-from celery_task import fetch_summary_from_openai
+from celery_task import fetch_summary_from_openai, openai_response
 import socket
@@ -32,6 +32,24 @@ def get_result_task(task_id):
     return jsonify(result)


+@app.route("/stream", methods=["POST"])
+def fetch_stream_summary():
+    """Fetch a streamed summary from OpenAI using the provided query.
+    Returns:
+        Response: A streamed response containing the summary.
+    """
+    api = request.json["api"]
+    query = request.json["query"]
+    stream = openai_response(api, query, stream=True)
+    def generate():
+        for chunk in stream:
+            chunk_data = chunk.choices[0].delta.content
+            if chunk_data:
+                yield chunk_data
+    return app.response_class(stream_with_context(generate()))


 @app.route("/submit", methods=["POST"])
 def fetch_summary():
     """Fetch a summary from OpenAI using the provided query.
@@ -48,4 +66,4 @@ def fetch_summary():


 if __name__ == "__main__":
-    app.run(debug=True)
+    app.run()
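Since `generate()` yields plain text fragments, any HTTP client that reads the response body incrementally can render the summary as it streams. A hypothetical smoke test, assuming the backend is reachable on localhost:8000 (the key and query are placeholders):

```python
import requests

# Post to the new /stream route and consume the body chunk by chunk
resp = requests.post(
    "http://localhost:8000/stream",
    json={"api": "sk-...", "query": "Text to summarize ..."},
    stream=True,  # don't buffer the whole body before returning
)
resp.raise_for_status()
for fragment in resp.iter_content(chunk_size=None, decode_unicode=True):
    print(fragment, end="", flush=True)
```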
12 changes: 6 additions & 6 deletions docker-compose.yaml
@@ -14,7 +14,7 @@ services:
     environment:
       - FLASK_APP=server.py
       - CELERY_BROKER_URL=redis://redis:6379/1
-      - CELERY_RESULT_BACKEND=mongodb://mongo:27017
+      - CELERY_RESULT_BACKEND=redis://redis:6379/2
     command: ["gunicorn", "server:app", "-c", "./gunicorn.conf.py"]
     # ports:
     #   - "8000:8000"
@@ -23,16 +23,16 @@
     environment:
       - FLASK_APP=server.py
       - CELERY_BROKER_URL=redis://redis:6379/1
-      - CELERY_RESULT_BACKEND=mongodb://mongo:27017
+      - CELERY_RESULT_BACKEND=redis://redis:6379/2
     command: ["celery", "-A", "celery_task", "worker", "--loglevel=info"]
   redis:
     image: redis
     # ports:
     #   - "6379:6379"
-  mongo:
-    image: mongo
-    # ports:
-    #   - "27017:27017"
+  # mongo:
+  #   image: mongo
+  #   # ports:
+  #   #   - "27017:27017"
   nginx:
     image: nginx
     volumes:
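The commit title routes this streaming through nginx, but the nginx.conf change is not among the diffs shown here. For chunked output to reach the browser as it is produced, the proxy in front of gunicorn typically has to disable response buffering. A sketch of what such a config might look like — the upstream name `backend`, port 8000, and paths are assumptions drawn from this compose file, not the repo's actual nginx.conf:

```nginx
server {
    listen 80;

    # Serve the built React frontend
    location / {
        root /usr/share/nginx/html;
        try_files $uri /index.html;
    }

    # Pass /stream straight through to gunicorn without buffering,
    # so each yielded fragment is flushed to the client immediately
    location /stream {
        proxy_pass http://backend:8000;
        proxy_http_version 1.1;
        proxy_buffering off;
        proxy_cache off;
        proxy_read_timeout 300s;  # allow long-running completions
    }
}
```

Note that streaming also holds a gunicorn worker open for the duration of each response, which is presumably why the stack pairs gunicorn with gevent workers (per the README's tool list).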
Binary file added examples/demo.gif
6 changes: 3 additions & 3 deletions frontend/Dockerfile
@@ -1,11 +1,11 @@
-FROM node:20.11.1-bullseye-slim
+FROM node:bullseye-slim

 WORKDIR /app
 COPY package*.json ./
 RUN npm install
 ENV PATH /app/node_modules/.bin:$PATH
 COPY . .
-RUN npm run build

 # No need to run the server here, it will be run by the nginx server
 # CMD [ "npm", "run", "start"]
+RUN npm run build
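As the comment notes, this image only produces the build; nginx serves it. This compose setup appears to hand the assets to the nginx service (e.g. via a volume), but a multi-stage build is a common alternative — a hypothetical sketch, not this repo's actual setup:

```dockerfile
# Hypothetical multi-stage variant: the nginx image copies the build output
FROM node:bullseye-slim AS build
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build   # CRA outputs to /app/build; adjust if the build dir differs

FROM nginx:alpine
COPY --from=build /app/build /usr/share/nginx/html
```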