βš–οΈ LawGeeks v1.0

AI-Powered Legal Understanding Assistant (Prototype)

LawGeeks v1.0 is a lightweight AI legal assistant designed to help users understand legal text and ask questions in plain language using a locally hosted LLM.

🎯 Problem Statement

Legal documents are often:

❌ Filled with complex jargon

❌ Hard to interpret without legal expertise

❌ Time-consuming to read

❌ Risky to sign without understanding

Most individuals lack affordable access to legal consultation and often sign documents without fully understanding them.

💡 Solution: LawGeeks v1.0

LawGeeks v1.0 uses a local LLaMA 3 model via Ollama to:

Explain legal clauses in simple language

Answer user questions conversationally

Provide high-level legal understanding (not legal advice)

This version focuses on AI feasibility and user interaction, without external data sources.
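
For concreteness, here is a minimal sketch of how this interaction could work, assuming Ollama's default local HTTP chat endpoint (http://localhost:11434/api/chat). The function name explain_clause and the system prompt wording are illustrative, not taken from the repository.

```python
import json
import urllib.request

# Ollama's default local chat endpoint (assumption: default install, no auth).
OLLAMA_URL = "http://localhost:11434/api/chat"

# Illustrative system prompt; the project's real prompts live in prompts.py.
SYSTEM_PROMPT = (
    "You are a legal explanation assistant. Explain the given clause in plain "
    "language. Provide general understanding only, not legal advice."
)

def explain_clause(clause: str, model: str = "llama3") -> str:
    """Send a clause to the local LLaMA 3 model and return its explanation."""
    payload = {
        "model": model,
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": clause},
        ],
        "stream": False,  # ask for one complete reply instead of a token stream
    }
    request = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)["message"]["content"]

if __name__ == "__main__":
    print(explain_clause("The lessee shall indemnify and hold harmless the lessor..."))
```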

✨ Key Features

🤖 AI Legal Chat (LLaMA 3 via Ollama): Conversational explanations of legal text using a locally hosted model.

📄 Plain-Language Legal Explanations: Transforms complex clauses into easy-to-understand summaries.

💬 Interactive Q&A: Users can paste text or ask follow-up questions naturally (a minimal sketch follows this list).

🔐 Privacy-First Design: Runs completely locally, with no data storage and no cloud processing.

🧪 Proof-of-Concept Prototype: Ideal for hackathons, demos, and academic evaluation.
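
To make the interactive Q&A flow concrete, here is a minimal local chat loop that keeps the conversation history in memory only, so nothing leaves the machine. It is a sketch against the same assumed Ollama endpoint; the helper names ask and chat_loop are hypothetical, not the repository's actual implementation.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # assumed default Ollama endpoint

def ask(messages: list[dict], model: str = "llama3") -> str:
    """Send the running conversation to the local model and return its reply."""
    payload = {"model": model, "messages": messages, "stream": False}
    request = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)["message"]["content"]

def chat_loop() -> None:
    # Conversation history lives only in this list for the life of the process.
    history = [{
        "role": "system",
        "content": "Explain legal text in plain language. This is not legal advice.",
    }]
    print("Paste a clause or ask a question (Ctrl+C to quit).")
    while True:
        user_input = input("> ").strip()
        if not user_input:
            continue
        history.append({"role": "user", "content": user_input})
        reply = ask(history)
        history.append({"role": "assistant", "content": reply})
        print(reply)

if __name__ == "__main__":
    chat_loop()
```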

🖥 Local Setup & Execution

1️⃣ Run Ollama with LLaMA 3

Ensure Ollama is installed and running.

ollama run llama3

2️⃣ Start the Application

python server.py

That’s it ✅ No databases, no APIs, no embeddings.
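
If the application cannot reach the model, a quick sanity check is to query Ollama's /api/tags endpoint, which lists the models installed locally. A standard-library-only sketch, assuming the default port 11434:

```python
import json
import urllib.request

# Ollama's default local port; /api/tags lists the models installed locally.
TAGS_URL = "http://localhost:11434/api/tags"

try:
    with urllib.request.urlopen(TAGS_URL, timeout=5) as response:
        models = [m["name"] for m in json.load(response).get("models", [])]
    print("Ollama is running. Installed models:", ", ".join(models) or "none")
except OSError as exc:
    print("Could not reach Ollama at localhost:11434:", exc)
```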

🛠 Tech Stack

Language: Python

LLM: LLaMA 3 (via Ollama)

Backend: Python (minimal server / CLI-based)

Deployment: Local machine only

Focus: Privacy, simplicity, legal text explanation

📂 Project Structure (Simplified)

LawGeeks-v1/
├── run.py            # Application entry point
├── chat.py           # Ollama + LLaMA3 integration
├── prompts.py        # System prompts for legal explanation
├── requirements.txt
└── README.md
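
As an illustration of what prompts.py could contain, the sketch below defines a plain-language, non-advisory system prompt; the prompt wording and the helper build_messages are hypothetical, not copied from the repository.

```python
# prompts.py (illustrative sketch; not the repository's actual contents)

LEGAL_EXPLAINER_SYSTEM_PROMPT = """\
You are LawGeeks, an assistant that explains legal text in plain language.
- Rewrite clauses in simple, everyday words.
- Point out obligations, deadlines, and penalties the reader should notice.
- Never present the output as legal advice; recommend consulting a lawyer
  for any decision.
"""

def build_messages(clause: str) -> list[dict]:
    """Wrap a pasted clause in the system prompt for the chat endpoint."""
    return [
        {"role": "system", "content": LEGAL_EXPLAINER_SYSTEM_PROMPT},
        {"role": "user", "content": f"Explain this clause in plain language:\n\n{clause}"},
    ]
```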

⚠️ Disclaimer

LawGeeks v1.0 is not a substitute for professional legal advice. The system provides informational assistance only and may produce inaccuracies.

🚀 Future Enhancements (Planned)

Document upload support

Clause-level risk highlighting

RAG-based legal retrieval

Secure user history

Cloud deployment

🤝 Contributing

Feedback and contributions are welcome ❤️

Fork the repository

Create a feature branch

Commit changes

Open a Pull Request

πŸ“ License

MIT License

πŸ™ Acknowledgments

Ollama for local LLM execution

Meta LLaMA 3 model

Open-source AI community
