I love @versus, so I fine-tuned an LLM on versus' captions and built an end-to-end AI content tool around it. Low-cost deployment for projects featuring local LLMs is essentially non-existent, so I decided against deploying it.
- Backend: Python, Flask, Flask-JWT-Extended, Flask-Bcrypt
- Frontend: React (with Vite), Tailwind CSS, shadcn/ui
- Database: PostgreSQL (managed with Docker Compose)
- AI & APIs: Google Gemini, Instagrapi, Newspaper3k
- Containerization: Docker
- Python>=3.10
- Node.js and npm
- Docker and Docker Compose
- Kaggle account
- Create and activate a Python virtual environment:

  ```bash
  python3 -m venv .venv
  source .venv/bin/activate
  ```
- Install Python dependencies:

  ```bash
  pip install -r requirements.txt
  ```
- Create an Instagram session:

  ```bash
  python3 insta_login.py
  ```

  Tip: Log in to the same Instagram account in your browser as well, and browse around normally every once in a while. This keeps your session from being flagged as "abnormal".
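For reference, `insta_login.py` presumably logs in with instagrapi and persists the session so later runs can reuse it (which is also what keeps the session looking "normal"). A minimal sketch of that pattern, assuming a `session.json` filename and a `create_session` helper that are not taken from the repo:

```python
import os

SESSION_FILE = "session.json"  # assumed filename, not confirmed by the repo


def create_session(username: str, password: str, session_file: str = SESSION_FILE):
    """Log in with instagrapi and dump the session settings to disk."""
    # Imported lazily; requires `pip install instagrapi`.
    from instagrapi import Client

    cl = Client()
    # Reuse an existing session if present to avoid repeated fresh logins.
    if os.path.exists(session_file):
        cl.load_settings(session_file)
    cl.login(username, password)
    cl.dump_settings(session_file)
    return cl
```

The saved settings file is what the backend can load later without re-entering credentials.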
- Navigate to the frontend directory:

  ```bash
  cd frontend
  ```
- Install Node.js dependencies:

  ```bash
  npm install
  ```
- Return to the root directory:

  ```bash
  cd ..
  ```
- Create a `.env` file in the root of the project and add the following variables:

  ```env
  # --- Gemini API Key ---
  GEMINI=<your-gemini-api-key>

  # --- Instagram Credentials ---
  INSTA_USERNAME=<your_instagram_username>
  INSTA_PASSWORD=<your_instagram_password>

  # --- Kaggle Inference Server URL ---
  KAGGLE_INFERENCE_URL=https://your-ngrok-url.ngrok-free.app

  # --- PostgreSQL Database Credentials ---
  POSTGRES_USER=versusdb
  POSTGRES_PASSWORD=<your_secure_password>
  POSTGRES_DB=versus_db
  POSTGRES_PORT=5433

  # --- RSS Feed URL ---
  RSS_FEED=https://www.theguardian.com/football/rss

  # --- JWT Secret Key ---
  JWT_SECRET_KEY=<a_very_strong_and_secret_key>
  ```
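As a sanity check that these variables line up, here is a hedged sketch of how the backend might assemble a PostgreSQL connection URL from them. The URL shape and the `localhost` host are assumptions, not taken from `app_ig.py`:

```python
import os

# Demo defaults so the snippet runs standalone; in the real app these
# values come from the .env file (e.g. via python-dotenv or Docker Compose).
os.environ.setdefault("POSTGRES_USER", "versusdb")
os.environ.setdefault("POSTGRES_PASSWORD", "changeme")
os.environ.setdefault("POSTGRES_DB", "versus_db")
os.environ.setdefault("POSTGRES_PORT", "5433")

# Note: POSTGRES_PORT=5433 is the *host* port mapped by Docker Compose,
# so connections from the host machine use 5433, not the default 5432.
DATABASE_URL = (
    f"postgresql://{os.environ['POSTGRES_USER']}:{os.environ['POSTGRES_PASSWORD']}"
    f"@localhost:{os.environ['POSTGRES_PORT']}/{os.environ['POSTGRES_DB']}"
)
print(DATABASE_URL)
```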
- Kaggle Config:
  - T4/P100 GPU
  - Enable Internet
  - Secrets:

    ```env
    NGROK_AUTHTOKEN=<your-ngrok-auth-token>
    NGROK_DOMAIN=https://your-ngrok-url.ngrok-free.app
    HF_TOKEN=<your-huggingface-access-token>
    ```
- Sample Inference Server:

  ```python
  # ---------------------------------- Cell #1 ----------------------------------
  !pip uninstall -y torch torchvision torchaudio numpy
  # ---------------------------------- Cell #2 ----------------------------------
  !pip install numpy==1.26.4
  # ---------------------------------- Cell #3 ----------------------------------
  !pip install torch==2.1.0 torchvision==0.16.0 torchaudio==2.1.0 --index-url https://download.pytorch.org/whl/cu121
  # ---------------------------------- Cell #4 ----------------------------------
  !pip install --upgrade transformers peft accelerate bitsandbytes datasets trl flask pyngrok
  # ---------------------------------- Cell #5 ----------------------------------
  # Refer to inference_example.py
  ```

  Make sure to "Restart and Clear All Cell Outputs" after installing all the dependencies.
- Start the PostgreSQL Database:
  - Open a terminal in the project root and run:

    ```bash
    docker compose up -d
    ```
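The repo should already ship the compose file this command uses, but for orientation, a minimal `docker-compose.yml` consistent with the `.env` variables above might look like this (the image tag and volume name are assumptions):

```yaml
# Hypothetical docker-compose.yml matching the .env variables above.
services:
  db:
    image: postgres:16
    restart: unless-stopped
    environment:
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
      POSTGRES_DB: ${POSTGRES_DB}
    ports:
      # Host port 5433 -> container port 5432 (Postgres default).
      - "${POSTGRES_PORT}:5432"
    volumes:
      - pgdata:/var/lib/postgresql/data

volumes:
  pgdata:
```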
- Start the Backend Server:
  - In a new terminal, make sure your Python virtual environment is activated, and run the Flask application:

    ```bash
    python app_ig.py
    ```
- Start the Frontend Development Server:
  - In a third terminal, navigate to the `frontend` directory.
  - Run the Vite development server:

    ```bash
    npm run dev
    ```
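If the React app calls the Flask backend during development, the usual Vite pattern is a dev-server proxy. This is only a sketch, assuming the backend listens on port 5000 and API routes are prefixed with `/api` (neither is confirmed by this README):

```typescript
// Hypothetical vite.config.ts proxy entry for local development.
import { defineConfig } from "vite";
import react from "@vitejs/plugin-react";

export default defineConfig({
  plugins: [react()],
  server: {
    proxy: {
      // Forward /api/* requests to the Flask backend (assumed port).
      "/api": "http://localhost:5000",
    },
  },
});
```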