🧠 Laravel Local Code Assistant (Ollama + Gemma3 + Embeddings)

This project adds the ability to index any local Laravel/Vue/TS project and then ask natural-language questions about the codebase, fully offline, using:

  • Ollama (local LLM runtime)
  • Gemma3:4b (reasoning/chat model)
  • nomic-embed-text or embeddinggemma (embedding model)
  • cloudstudio/ollama-laravel
  • Custom artisan commands for indexing + Q&A

No OpenAI or Anthropic credits required.
Everything runs locally on your machine.


🚀 Features

  • 🔍 Indexes your entire project into vector embeddings
  • 🧠 Ask any question about your codebase (controllers, routes, models, services, Vue components, etc.)
  • 💬 Interactive terminal REPL mode
  • 🦺 Automatically skips useless directories (node_modules, vendor, .git, etc.)
  • 🧰 Works with Gemma3, Mistral, or any LLM supported by Ollama
  • 📄 Embeddings stored in DB for fast retrieval

📦 Requirements

1. Ollama installed

Download: https://ollama.com/download

2. Pull required models

Chat model:

ollama pull gemma3:4b

Embedding model (VERY IMPORTANT):

You must have one of these:

ollama pull nomic-embed-text
# or
ollama pull embeddinggemma

❗ If you skip this step, ALL embeddings will be empty and questions will fail with
Could not generate embedding for the question.

3. Laravel 10 or 11

4. PHP 8.2+


🛠️ Installation

1. Install Ollama Laravel package

composer require cloudstudio/ollama-laravel

Publish config:

php artisan vendor:publish --tag="ollama-laravel-config"

⚙️ Configuration

Add to your .env file:

# Chat model
OLLAMA_MODEL=gemma3:4b

# Embedding model (must be pulled in Ollama)
OLLAMA_EMBED_MODEL=nomic-embed-text
# or:
# OLLAMA_EMBED_MODEL=embeddinggemma

# Ollama server
OLLAMA_URL=http://127.0.0.1:11434

# Optional system prompt
OLLAMA_DEFAULT_PROMPT="You are a senior Laravel/PHP assistant."

# Timeout for slow starts
OLLAMA_CONNECTION_TIMEOUT=300

Verify installed models:

ollama list

📁 Code Indexing

Use the custom artisan command:

php artisan ollama:code-index {project_path}

Example:

php artisan ollama:code-index /Users/you/PhpstormProjects/my-project

The indexer will:

  • Scan extensions: .php, .ts, .js, .vue
  • Chunk files into 80-line windows
  • Embed each chunk using your embedding model
  • Store results in the code_chunks table
  • Skip noisy folders:
node_modules/
vendor/
storage/
bootstrap/
.git/
.idea/
.vscode/
dist/
public/build/

❓ Asking Questions About Your Code

1. Ask a one-shot question

php artisan ollama:code-ask /path/to/project "Where is authentication handled?"

2. Start interactive mode

php artisan ollama:code-ask /path/to/project

Example:

Interactive mode. Project: /Users/you/my-project
Type 'exit' to quit.

> Where do we register API routes?
Answer:
They are defined in routes/api.php …

> How is user authorization implemented?
Answer:
It uses Laravel's Gate system …

> exit
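
Under the hood, a question-answering pass like this typically: embeds the question, ranks stored chunks by cosine similarity, and feeds the best matches to the chat model as context. The sketch below assumes that flow and the package's prompt()/ask() chat call; it is not the actual ollama:code-ask source, and the code_chunks columns are the same assumed ones as in the indexing sketch.

<?php

// Sketch only — the real ollama:code-ask command may differ.

use Cloudstudio\Ollama\Facades\Ollama;
use Illuminate\Support\Facades\DB;

// Cosine similarity between two embedding vectors.
function cosine(array $a, array $b): float
{
    $dot = $na = $nb = 0.0;
    foreach ($a as $i => $v) {
        $w = $b[$i] ?? 0.0;
        $dot += $v * $w;
        $na  += $v * $v;
        $nb  += $w * $w;
    }
    return ($na > 0 && $nb > 0) ? $dot / (sqrt($na) * sqrt($nb)) : 0.0;
}

function askCodebase(string $question): string
{
    // 1. Embed the question with the same model used for indexing.
    $q = Ollama::model(env('OLLAMA_EMBED_MODEL'))->embeddings($question)['embedding'] ?? [];

    // 2. Rank stored chunks by similarity and keep the top matches as context.
    $context = DB::table('code_chunks')->get()
        ->sortByDesc(fn ($c) => cosine($q, json_decode($c->embedding, true) ?? []))
        ->take(5)
        ->pluck('content')
        ->implode("\n\n");

    // 3. Ask the chat model, passing the matching code as context.
    $response = Ollama::model(env('OLLAMA_MODEL'))
        ->prompt("Answer using this code:\n{$context}\n\nQuestion: {$question}")
        ->ask();

    return $response['response'] ?? '';
}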

🧪 Test Embeddings Manually

If something seems wrong, try:

php artisan tinker
use Cloudstudio\Ollama\Facades\Ollama;
Ollama::model(env('OLLAMA_EMBED_MODEL'))->embeddings("hello world");

Expected:

[
  "embedding" => [0.12, -0.03, ...]
]

If you see:

"error" => "model not found"

👉 You forgot to run:

ollama pull nomic-embed-text

🧩 Troubleshooting

❌ All files show “Empty embedding … skipping.”

Cause:
❗ You did not pull an embedding model.

Fix:

ollama pull nomic-embed-text

❌ “Could not generate embedding for the question”

Same cause: missing embedding model OR wrong name in .env.


❌ Guzzle error: “Malformed UTF-8”

Cause: some files contain binary or minified content that is not valid UTF-8.

Fix: such files are skipped automatically, but you can extend the excluded directories if needed.


❌ Indexer tries to embed huge .min.js files

These files are already skipped automatically, but you can add custom exclusions (a sketch follows below).
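
The exact exclusion mechanism depends on how the index command is written, but a simple approach is a flat list of directory prefixes and file suffixes checked before embedding. The snippet below is a hypothetical, standalone illustration; the property or option names in the real command may differ.

<?php

// Hypothetical sketch of a path filter — adjust to match the real command.
$excludedDirs = [
    'node_modules', 'vendor', 'storage', 'bootstrap',
    '.git', '.idea', '.vscode', 'dist', 'public/build',
    // custom additions, e.g. generated or minified assets:
    'coverage',
];
$excludedSuffixes = ['.min.js', '.min.css'];

function shouldSkip(string $relativePath, array $dirs, array $suffixes): bool
{
    foreach ($dirs as $dir) {
        if (str_starts_with($relativePath, $dir . '/')) {
            return true;
        }
    }
    foreach ($suffixes as $suffix) {
        if (str_ends_with($relativePath, $suffix)) {
            return true;
        }
    }
    return false;
}

var_dump(shouldSkip('node_modules/lodash/lodash.js', $excludedDirs, $excludedSuffixes));           // true
var_dump(shouldSkip('app/Http/Controllers/AuthController.php', $excludedDirs, $excludedSuffixes)); // false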


🧱 Future Improvements

  • pgvector support
  • Web UI
  • VSCode extension
  • Incremental indexing
  • Codebase profiles

✔️ Summary

Feature              Status
Local LLM (Gemma3)   ✅ Using Ollama
Local embeddings     ✅ nomic-embed-text / embeddinggemma
Code indexing        ✅ Works
Q&A CLI              ✅ One-shot + interactive
Fully offline        ✅ No APIs

This README contains everything needed to run the system locally.
