Blog Generation using LLaMA 2

This is an interactive LLM-based blog generation application built with Meta's LLaMA 2, LangChain, and Streamlit. Users enter a topic, choose a writing style and target length, and receive a generated blog article.


🔍 Project Overview

With the rise of LLMs in content generation, this project focuses on:

  • ✅ Structuring a local LLM pipeline using LLaMA 2
  • ✅ Leveraging LangChain for prompt templates
  • ✅ Providing an interactive UI with Streamlit
  • ✅ Creating personalized blog content with minimal input

I built this tool to be modular, lightweight, and easy to extend — whether for content marketing, education, or research.
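
A minimal sketch of the core pipeline is shown below. It assumes a quantized GGML model file at `models/llama-2-7b-chat.ggmlv3.q8_0.bin` and illustrative generation settings; the file path, prompt wording, and `generate_blog` function name are assumptions for demonstration, not exact code from this repository.

```python
from langchain.prompts import PromptTemplate
# On older LangChain versions this class lives under langchain.llms instead.
from langchain_community.llms import CTransformers


def generate_blog(topic: str, style: str, word_count: int) -> str:
    # Load the quantized LLaMA 2 model locally via CTransformers (CPU-only).
    llm = CTransformers(
        model="models/llama-2-7b-chat.ggmlv3.q8_0.bin",  # assumed local path
        model_type="llama",
        config={"max_new_tokens": 256, "temperature": 0.7},  # illustrative settings
    )

    # LangChain prompt template parameterized on topic, style, and length.
    prompt = PromptTemplate(
        input_variables=["topic", "style", "word_count"],
        template=(
            "Write a blog post of roughly {word_count} words on the topic "
            "'{topic}' in a {style} style."
        ),
    )

    # Format the prompt and run a single completion.
    return llm.invoke(prompt.format(topic=topic, style=style, word_count=word_count))
```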


⚙️ Key Features

  • 🔗 LLaMA 2 integration via CTransformers (runs locally)
  • 🧠 Prompt engineering using LangChain templates
  • 🖱️ Interactive UI built with Streamlit
  • ✍️ Dynamic blog content generation based on topic, tone, and word count
  • 🛠️ Minimal resource requirements (runs locally without a GPU or cloud services)
  • 💡 Future-ready for vector-based retrieval integration
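
The Streamlit front end can be wired to the pipeline in a few lines. The sketch below reuses the hypothetical `generate_blog` helper from the previous example; the widget labels and style options are illustrative.

```python
import streamlit as st

st.set_page_config(page_title="Blog Generator", layout="centered")
st.header("Generate a Blog 📝")

# Collect the topic, desired length, and writing style from the user.
topic = st.text_input("Enter the blog topic")
col1, col2 = st.columns(2)
with col1:
    word_count = st.number_input("Approximate word count",
                                 min_value=100, max_value=1000, value=300)
with col2:
    style = st.selectbox("Writing style",
                         ("Researchers", "Data Scientists", "General readers"))

if st.button("Generate"):
    with st.spinner("Generating blog..."):
        st.write(generate_blog(topic, style, int(word_count)))
```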

🧱 Tech Stack

  • Python 3.10+
  • Meta's LLaMA 2 (GGML quantized)
  • LangChain
  • CTransformers
  • Streamlit
  • Uvicorn (for API)
  • Python Box (for cleaner configuration handling)
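
Python Box exposes configuration values through attribute access rather than nested dictionaries. A small illustrative sketch, assuming a `config.yaml` with `model` and `generation` keys (neither the file name nor the keys are taken from this repository):

```python
import yaml
from box import Box

# Load settings (model path, generation parameters) from YAML and
# access them with dot notation instead of nested dict lookups.
with open("config.yaml") as f:
    config = Box(yaml.safe_load(f))

print(config.model.path)                 # e.g. "models/llama-2-7b-chat.ggmlv3.q8_0.bin"
print(config.generation.max_new_tokens)  # e.g. 256
```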

🪪 License

This project is for educational and experimental use. Ensure compliance with Meta's LLaMA 2 license when using the model.
