- Overview
- Features
- Structure
- Installation
- Usage
- Hosting
- License
- Authors

## Overview

This repository provides a Minimum Viable Product (MVP) called "OpenAI-API-Python-Client". It offers a user-friendly Python backend API wrapper that simplifies the integration of OpenAI's powerful NLP capabilities into various projects. This MVP differentiates itself by focusing on simplicity and efficiency, making it ideal for developers of all skill levels.

## Features

| Feature | Description |
|---------|-------------|
| Architecture | Utilizes a microservices architecture, with the API wrapper running as a standalone service. This provides flexibility and allows for independent scaling of components. |
| Documentation | Provides detailed documentation, including API usage instructions, code examples, and tutorials, which streamlines onboarding and lets users quickly learn and leverage the API. |
| Dependencies | Leverages libraries such as `fastapi`, `uvicorn`, `pydantic`, `openai`, `sqlalchemy`, `psycopg2-binary`, `alembic`, `pyjwt`, `requests`, `logging`, and `prometheus_client` for essential functionality. |
| Modularity | The codebase is organized into modules for user management, API interaction, data validation, database interactions, and utility functions, promoting code reusability and maintainability. |
| Testing | Includes unit, integration, and end-to-end tests, ensuring the quality, stability, and reliability of the codebase. |
| Performance | Employs optimization techniques such as caching API responses, optimizing database queries, and asynchronous processing to ensure efficient operation. |
| Security | Implements security measures to protect user data and API keys, including secure storage of API keys, data encryption, and rate limiting. |
| Version Control | Uses Git for version control with a Gitflow branching model for a structured and collaborative development process. |
| Integrations | Integrates with cloud platforms such as AWS or Azure for hosting the database and server infrastructure. |
| Scalability | Designed for scalability, leveraging cloud-based solutions for automatic scaling and resource management. |

## Structure

```
openai-api-client/
├── api
│   ├── routes
│   │   ├── user.py
│   │   └── openai.py
│   └── schemas
│       ├── user.py
│       └── openai.py
├── dependencies
│   ├── auth.py
│   ├── database.py
│   ├── openai.py
│   └── utils.py
├── models
│   ├── base.py
│   ├── user.py
│   └── api_usage.py
├── services
│   ├── user.py
│   └── openai.py
├── startup.sh
├── commands.json
├── tests
│   ├── conftest.py
│   ├── unit
│   │   ├── test_openai.py
│   │   └── test_user.py
│   └── integration
│       ├── test_openai_routes.py
│       └── test_user_routes.py
├── migrations
│   └── versions
│       ├── ...
│       ├── ...
│       └── alembic_version.py
├── README.md
├── .env.example
├── .env
├── gunicorn.conf.py
├── Procfile
├── .gitignore
└── .flake8
```

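The tree above does not include the application entry point that the `uvicorn main:app` command in the Usage section refers to. Purely as a hedged sketch (module paths and router names are assumptions based on the layout above), a minimal `main.py` could look like this:

```python
# main.py -- hypothetical entry point; not part of the tree shown above.
from fastapi import FastAPI

# Routers assumed to be defined in api/routes/user.py and api/routes/openai.py
# (each exposing an APIRouter named `router`).
from api.routes import openai, user

app = FastAPI(title="OpenAI-API-Python-Client")

# Mount the user and OpenAI route modules under a versioned prefix,
# matching the /api/v1/... paths used in the examples below.
app.include_router(user.router, prefix="/api/v1/users", tags=["users"])
app.include_router(openai.router, prefix="/api/v1/openai", tags=["openai"])
```
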
## Installation

### Prerequisites

- Python 3.9 or higher
- PostgreSQL 14+
- `pip` (Python package manager)
- `alembic` (database migration tool)

### Setup Instructions

1. Clone the repository:

   ```bash
   git clone https://github.com/coslynx/OpenAI-API-Python-Client.git
   cd OpenAI-API-Python-Client
   ```

2. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Set up the database:
   - Create a PostgreSQL database and user if you don't already have one.
   - Update the `DATABASE_URL` in your `.env` file with your database connection string.
   - Run the database migrations:

     ```bash
     alembic upgrade head
     ```

4. Configure environment variables:
   - Create a `.env` file based on the `.env.example` file.
   - Replace the placeholder values with your actual API keys and database credentials (see the example below).

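For reference, a filled-in `.env` might look like the following; every value below is a placeholder, not a working credential:

```
OPENAI_API_KEY=sk-your-openai-api-key
DATABASE_URL=postgresql://db_user:db_password@localhost:5432/openai_api_client
SECRET_KEY=a-long-random-string-used-to-sign-jwts
```
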
## Usage

Start the application server:

```bash
uvicorn main:app --host 0.0.0.0 --port 8000
```

### Configuration

- `.env` file: contains environment variables such as `OPENAI_API_KEY`, `DATABASE_URL`, and `SECRET_KEY`.
- `gunicorn.conf.py`: configures the `gunicorn` web server for production deployment.

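The `gunicorn.conf.py` shipped with the repository is not reproduced in this README. For a FastAPI application served through `gunicorn`, a configuration along these lines is typical; all values here are illustrative, not the project's actual settings:

```python
# gunicorn.conf.py -- illustrative settings only; the repository's file may differ.
import multiprocessing

# Address and port the production server binds to.
bind = "0.0.0.0:8000"

# Scale worker processes with the available CPU cores.
workers = multiprocessing.cpu_count() * 2 + 1

# Use uvicorn's worker class so gunicorn can serve the ASGI (FastAPI) app.
worker_class = "uvicorn.workers.UvicornWorker"

# Log to stdout/stderr so platforms like Heroku can capture the output.
accesslog = "-"
errorlog = "-"
```
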
### Examples

User registration:

```bash
curl -X POST http://localhost:8000/api/v1/users/register \
  -H "Content-Type: application/json" \
  -d '{"username": "your_username", "email": "your_email@example.com", "password": "your_password"}'
```

User login:

```bash
curl -X POST http://localhost:8000/api/v1/users/login \
  -H "Content-Type: application/json" \
  -d '{"email": "your_email@example.com", "password": "your_password"}'
```

Text completion:

```bash
curl -X POST http://localhost:8000/api/v1/openai/complete \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your_access_token" \
  -d '{"text": "The quick brown fox jumps over the", "model": "text-davinci-003", "temperature": 0.7, "max_tokens": 256}'
```

## Hosting

### Deploying to Heroku

1. Install the Heroku CLI if you don't already have it, for example via npm:

   ```bash
   npm install -g heroku
   ```

2. Log in to Heroku:

   ```bash
   heroku login
   ```

3. Create a new Heroku app:

   ```bash
   heroku create openai-api-python-client-production
   ```

4. Set up environment variables:

   ```bash
   heroku config:set OPENAI_API_KEY=your_openai_api_key
   heroku config:set DATABASE_URL=your_database_url
   heroku config:set SECRET_KEY=your_secret_key
   ```

5. Deploy the code:

   ```bash
   git push heroku main
   ```

6. Run database migrations:

   ```bash
   heroku run alembic upgrade head
   ```

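The project tree includes a `Procfile`, whose contents are not shown in this README. For a `gunicorn`/`uvicorn` stack, a single `web` process declaration such as the following would be typical; the `main:app` target is an assumption:

```
web: gunicorn -c gunicorn.conf.py main:app
```
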
### Environment Variables

- `OPENAI_API_KEY`: your OpenAI API key.
- `DATABASE_URL`: your PostgreSQL database connection string.
- `SECRET_KEY`: a secret key used for JWT authentication.

## API Documentation

### Endpoints

- `POST /api/v1/users/register`: register a new user.
  - Request body:

    ```json
    { "username": "your_username", "email": "your_email@example.com", "password": "your_password" }
    ```

  - Response body:

    ```json
    { "id": 1, "username": "your_username", "email": "your_email@example.com", "api_key": "your_api_key" }
    ```

- `POST /api/v1/users/login`: log in an existing user and obtain an access token.
  - Request body:

    ```json
    { "email": "your_email@example.com", "password": "your_password" }
    ```

  - Response body:

    ```json
    { "access_token": "your_access_token", "token_type": "bearer" }
    ```

- `GET /api/v1/users/me`: get the current user's information.
  - Headers: `Authorization: Bearer your_access_token`
  - Response body:

    ```json
    { "id": 1, "username": "your_username", "email": "your_email@example.com", "api_key": "your_api_key" }
    ```

- `POST /api/v1/openai/complete`: complete a given text using OpenAI's text completion API.
  - Headers: `Authorization: Bearer your_access_token`
  - Request body:

    ```json
    { "text": "The quick brown fox jumps over the", "model": "text-davinci-003", "temperature": 0.7, "max_tokens": 256 }
    ```

  - Response body:

    ```json
    { "response": "lazy dog." }
    ```

- `POST /api/v1/openai/translate`: translate a given text using OpenAI's translation API.
  - Headers: `Authorization: Bearer your_access_token`
  - Request body:

    ```json
    { "text": "Hello world", "source_language": "en", "target_language": "fr" }
    ```

  - Response body:

    ```json
    { "response": "Bonjour le monde" }
    ```

- `POST /api/v1/openai/summarize`: summarize a given text using OpenAI's summarization API.
  - Headers: `Authorization: Bearer your_access_token`
  - Request body:

    ```json
    { "text": "The quick brown fox jumps over the lazy dog.", "model": "text-davinci-003" }
    ```

  - Response body:

    ```json
    { "response": "A brown fox jumps over a lazy dog." }
    ```

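The request and response bodies above map naturally onto Pydantic models. Purely as an illustration (not the repository's actual code), the completion schemas in `api/schemas/openai.py` might resemble the following, with field names taken from the examples and defaults assumed:

```python
# Illustrative Pydantic schemas for the /openai/complete endpoint.
# The repository's actual api/schemas/openai.py may differ.
from pydantic import BaseModel


class CompletionRequest(BaseModel):
    text: str
    model: str = "text-davinci-003"
    temperature: float = 0.7
    max_tokens: int = 256


class CompletionResponse(BaseModel):
    response: str
```
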
### Authentication

- Register a new user or log in to receive a JWT access token.
- Include the access token in the `Authorization` header for all protected routes, using the format `Authorization: Bearer your_access_token`.

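Token issuance and verification rely on `pyjwt` and the `SECRET_KEY` environment variable. The snippet below sketches the general mechanism only; claim names and expiry are assumptions, not the repository's implementation:

```python
# Rough sketch of JWT handling with pyjwt; not the project's actual code.
import datetime

import jwt  # provided by the pyjwt package

SECRET_KEY = "value-of-the-SECRET_KEY-environment-variable"

# Issue a token for a logged-in user (e.g. inside the /users/login handler).
token = jwt.encode(
    {
        "sub": "user_id",
        "exp": datetime.datetime.now(tz=datetime.timezone.utc) + datetime.timedelta(hours=1),
    },
    SECRET_KEY,
    algorithm="HS256",
)

# Verify the token presented in the Authorization: Bearer header.
claims = jwt.decode(token, SECRET_KEY, algorithms=["HS256"])
print(claims["sub"])
```
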
## License

This Minimum Viable Product (MVP) is licensed under the GNU AGPLv3 license.

## Authors

This MVP was entirely generated using artificial intelligence through CosLynx.com.

No human was directly involved in the coding process of the repository: OpenAI-API-Python-Client.

For any questions or concerns regarding this AI-generated MVP, please contact CosLynx at:

- Website: CosLynx.com
- Twitter: @CosLynxAI

Create Your Custom MVP in Minutes With CosLynxAI!