DevDeCode is a FastAPI-powered backend that integrates Microsoft's Phi-3 Mini model using Hugging Face Transformers and LangChain. It takes Python code as input and returns a step-by-step explanation. Designed for developers and learners, this API simplifies code understanding using LLMs.
- 🧠 Powered by Phi-3 Mini (4K Instruct)
- 🔗 Built with LangChain for structured LLM workflows
- 🌐 Hosted using FastAPI with auto-generated Swagger docs
- 🌍 CORS-enabled for easy frontend integration
- 🧪 Uses `StrOutputParser` for clean output formatting
- 🌩️ (Optional) Ngrok integration for public URL testing
| Technology | Description |
|---|---|
| FastAPI | Web framework for building the RESTful API |
| LangChain | Manages prompt templates, model pipeline, and parsing logic |
| Transformers | Hugging Face library for using and fine-tuning pretrained models |
| Phi-3 Mini | Lightweight instruction-tuned language model from Microsoft |
| Hugging Face Hub | Model access, authentication, and (optional) deployment to Spaces |
| Uvicorn | ASGI server to run the FastAPI app |
| PyTorch | Deep learning backend for model execution |
| Ngrok (optional) | Tunnels localhost for public access during development |
| CORS Middleware | Enables smooth frontend-to-backend communication |
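The chain itself lives in `app.py`. A minimal sketch of how these pieces fit together is shown below; the model-loading arguments, prompt wording, and the `langchain_huggingface` integration package are assumptions for illustration, not the exact source.

```python
# Minimal sketch of the explanation chain (assumed wiring, not the exact app.py).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
from langchain_huggingface import HuggingFacePipeline
from langchain_core.prompts import PromptTemplate
from langchain_core.output_parsers import StrOutputParser

MODEL_ID = "microsoft/Phi-3-mini-4k-instruct"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # switch to torch.float32 on CPU-only machines
    device_map="auto",           # or "cpu" without a GPU
    trust_remote_code=True,
)

# Wrap the Transformers text-generation pipeline so LangChain can call it as an LLM
llm = HuggingFacePipeline(
    pipeline=pipeline(
        "text-generation",
        model=model,
        tokenizer=tokenizer,
        max_new_tokens=512,
        return_full_text=False,
    )
)

prompt = PromptTemplate.from_template(
    "Explain the following Python code step by step:\n\n{code}"
)

# prompt -> Phi-3 Mini -> plain string
chain = prompt | llm | StrOutputParser()
```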
- Install dependencies:

  `pip install -r requirements.txt`

  Make sure your system supports CUDA, or fall back to CPU by adjusting `torch_dtype` and `device_map` in your code (see the snippet after this list).

- Run locally:

  `python app.py`
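If you want an explicit CPU fallback, one option is to pick the dtype and device map at startup. This is a hypothetical helper, not part of the repo:

```python
# Hypothetical CUDA/CPU selection for the model load in app.py.
import torch

use_cuda = torch.cuda.is_available()
model_kwargs = {
    "torch_dtype": torch.bfloat16 if use_cuda else torch.float32,
    "device_map": "auto" if use_cuda else "cpu",
}
# model = AutoModelForCausalLM.from_pretrained(MODEL_ID, **model_kwargs)
```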
Ensure your repo includes:

- `README.md`
- `requirements.txt`
- `app.py`
- `huggingface.yml` (optional but useful)

You can use the `huggingface_hub` Python SDK or upload via the UI.
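For the SDK route, a minimal sketch using `huggingface_hub` might look like this; the repo id and the `docker` Space SDK are placeholders/assumptions:

```python
# Hypothetical upload of this repo to a Hugging Face Space via the huggingface_hub SDK.
from huggingface_hub import HfApi

api = HfApi()  # uses the token from `huggingface-cli login` or the HF_TOKEN env var
repo_id = "your-username/devdecode"  # placeholder

api.create_repo(repo_id=repo_id, repo_type="space", space_sdk="docker", exist_ok=True)
api.upload_folder(folder_path=".", repo_id=repo_id, repo_type="space")
```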
- Endpoint: `/explain`
- Method: `POST`
- Input: `{ "code": "your_python_code_here" }`
- Output: `{ "output": "Step-by-step explanation of the code..." }`
You can test the API using the built-in Swagger UI:

- Swagger UI (API docs), auto-generated at `/docs` while the app is running
  (if running locally, replace the host with your public URL when using ngrok or Spaces)

Or use tools like Postman to send POST requests to the `/explain` endpoint.
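A quick programmatic test, assuming the default Uvicorn host and port (`localhost:8000`):

```python
# Send a sample snippet to the running API and print the explanation.
import requests

resp = requests.post(
    "http://localhost:8000/explain",
    json={"code": "print(sum(range(5)))"},
)
resp.raise_for_status()
print(resp.json()["output"])
```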
MIT License © 2025 [Your Name]
- Microsoft for Phi-3
- Hugging Face for their incredible ecosystem
- LangChain for making LLM orchestration simple