This project adds the ability to index any local Laravel/Vue/TS project and then ask natural-language questions about the codebase, fully offline, using:
- Ollama (local LLM runtime)
- Gemma3:4b (reasoning/chat model)
- nomic-embed-text or embeddinggemma (embedding model)
- cloudstudio/ollama-laravel
- Custom artisan commands for indexing + Q&A
No OpenAI or Anthropic credits required.
Everything runs locally on your machine.
- 🔍 Indexes your entire project into vector embeddings
- 🧠 Ask any question about your codebase (controllers, routes, models, services, Vue components, etc.)
- 💬 Interactive terminal REPL mode
- 🦺 Automatically skips useless directories (node_modules, vendor, .git, etc.)
- 🧰 Works with Gemma3, Mistral, or any LLM supported by Ollama
- 📄 Embeddings stored in DB for fast retrieval
Download: https://ollama.com/download
```
ollama pull gemma3:4b
```

You must have one of these:

```
ollama pull nomic-embed-text
# or
ollama pull embeddinggemma
```

❗ If you skip this step, ALL embeddings will be empty and questions will fail with:

```
Could not generate embedding for the question.
```
```
composer require cloudstudio/ollama-laravel
```

Publish config:

```
php artisan vendor:publish --tag="ollama-laravel-config"
```

Add to your .env file:
```
# Chat model
OLLAMA_MODEL=gemma3:4b

# Embedding model (must be pulled in Ollama)
OLLAMA_EMBED_MODEL=nomic-embed-text
# or:
# OLLAMA_EMBED_MODEL=embeddinggemma

# Ollama server
OLLAMA_URL=http://127.0.0.1:11434

# Optional system prompt
OLLAMA_DEFAULT_PROMPT="You are a senior Laravel/PHP assistant."

# Timeout for slow starts
OLLAMA_CONNECTION_TIMEOUT=300
```

Verify installed models:
```
ollama list
```

Use the custom artisan command:

```
php artisan ollama:code-index {project_path}
```

Example:

```
php artisan ollama:code-index /Users/you/PhpstormProjects/my-project
```

The indexer will:

- Scan extensions: .php, .ts, .js, .vue
- Chunk files into 80-line windows
- Embed each chunk using your embedding model
- Store results in the code_chunks table
- Skip noisy folders:

```
node_modules/
vendor/
storage/
bootstrap/
.git/
.idea/
.vscode/
dist/
public/build/
```
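The scan-and-chunk steps above can be sketched in shell (illustrative only — the actual indexer does this in PHP inside the artisan command; the sample tree and /tmp paths exist purely for demonstration):

```shell
# Illustrative sketch of the indexer's scan + chunk steps (the real work happens
# in PHP inside the artisan command). A tiny sample tree stands in for a project.
rm -rf /tmp/idx_demo
mkdir -p /tmp/idx_demo/app /tmp/idx_demo/vendor /tmp/idx_demo/chunks
seq 1 200 > /tmp/idx_demo/app/Foo.php     # 200-line source file -> 3 chunks
seq 1 50  > /tmp/idx_demo/vendor/Lib.php  # inside vendor/ -> skipped

cd /tmp/idx_demo

# 1. Scan for indexable extensions, pruning noisy folders.
find . \( -name node_modules -o -name vendor -o -name .git \) -prune -o \
       -type f \( -name '*.php' -o -name '*.ts' -o -name '*.js' -o -name '*.vue' \) -print |
while read -r f; do
  rel=${f#./}   # drop the leading ./ so chunk names are not hidden dotfiles
  # 2. Chunk into 80-line windows, one output file per window.
  split -l 80 "$f" "chunks/$(echo "$rel" | tr '/' '_')."
done

ls chunks   # app_Foo.php.aa  app_Foo.php.ab  app_Foo.php.ac
```

The vendored file never reaches the chunking step because find prunes the folder before matching extensions.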
One-shot question:

```
php artisan ollama:code-ask /path/to/project "Where is authentication handled?"
```

Interactive mode:

```
php artisan ollama:code-ask /path/to/project
```

Example:

```
Interactive mode. Project: /Users/you/my-project
Type 'exit' to quit.

> Where do we register API routes?
Answer:
They are defined in routes/api.php …

> How is user authorization implemented?
Answer:
It uses Laravel's Gate system …

> exit
```
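Under the hood, answering a question typically means embedding it and ranking the stored chunk embeddings by similarity. A toy sketch of that cosine-similarity comparison in shell/awk (illustrative only — the actual command does this in PHP against the code_chunks table):

```shell
# Toy cosine-similarity sketch of the retrieval step (illustrative only; the
# real command compares the question embedding against rows in code_chunks).
cosine() {
  awk -v a="$1" -v b="$2" 'BEGIN {
    n = split(a, x, ","); split(b, y, ",")
    for (i = 1; i <= n; i++) { dot += x[i]*y[i]; na += x[i]*x[i]; nb += y[i]*y[i] }
    printf "%.4f\n", dot / (sqrt(na) * sqrt(nb))
  }'
}

cosine "1,0,0" "1,0,0"    # identical direction -> 1.0000
cosine "1,0,0" "0,1,0"    # orthogonal         -> 0.0000
```

Chunks whose vectors score highest against the question vector are handed to the chat model as context.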
If something seems wrong, try:

```
php artisan tinker
```

```
use Cloudstudio\Ollama\Facades\Ollama;

Ollama::model(env('OLLAMA_EMBED_MODEL'))->embeddings("hello world");
```

Expected:

```
[
  "embedding" => [0.12, -0.03, ...]
]
```

If you see:

```
"error" => "model not found"
```

👉 You forgot to run:

```
ollama pull nomic-embed-text
```

Cause:
❗ You did not pull an embedding model.

Fix:

```
ollama pull nomic-embed-text
```

Same cause: missing embedding model OR wrong name in .env.
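To catch the wrong-name-in-.env case quickly, a small hypothetical helper (not part of the package) can compare the configured model against a saved copy of the `ollama list` output:

```shell
# Hypothetical helper (not part of the package): verifies that the embedding
# model named in .env appears in Ollama's local model list.
# Save the list first, e.g.:  ollama list > /tmp/models.txt
check_model() {
  # $1 = path to .env, $2 = path to saved `ollama list` output
  model=$(grep '^OLLAMA_EMBED_MODEL=' "$1" | cut -d= -f2)
  if grep -q "$model" "$2"; then
    echo "ok: $model is pulled"
  else
    echo "missing: run 'ollama pull $model'"
  fi
}

# Usage: check_model .env /tmp/models.txt
```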
Some files contain binary/minified garbage.
Fix: these are auto-skipped, but you can extend the excluded directories with custom exclusions.
- pgvector support
- Web UI
- VSCode extension
- Incremental indexing
- Codebase profiles
| Feature | Status |
|---|---|
| Local LLM (Gemma3) | ✅ Using Ollama |
| Local embeddings | ✅ nomic-embed-text / embeddinggemma |
| Code indexing | ✅ Works |
| Q&A CLI | ✅ One-shot + interactive |
| Fully offline | ✅ No APIs |
This README contains everything needed to run the system locally.