# helm-ollama

A Crossplane Configuration package that installs the Ollama Helm chart with a minimal, stable interface.
helm-ollama renders a single Helm release for Ollama. It exposes only the inputs needed
for chart values, namespace, and release name, keeping the interface stable while allowing full Helm overrides.
Ollama is an open-source tool for running large language models (LLMs) locally.
## Features

- Minimal Helm interface: `values` and `overrideAllValues` with stable defaults
- Predictable naming: defaults to `<clusterName>-ollama` in the `ollama` namespace
- GitOps friendly: ships a `.gitops/deploy` chart
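The two Helm inputs named above might be combined as follows; a minimal sketch, assuming `overrideAllValues` replaces the rendered chart values wholesale rather than merging (the field name comes from the feature list, but its exact semantics are not documented here):

```yaml
apiVersion: helm.hops.ops.com.ai/v1alpha1
kind: Ollama
metadata:
  name: ollama
  namespace: example-env
spec:
  clusterName: example-cluster
  # `values` is merged over the chart defaults, as in the usage examples below.
  values:
    ollama:
      models:
        pull:
          - llama3.2
  # Assumption: `overrideAllValues` bypasses the defaults entirely; verify
  # against the composition before relying on this behavior.
  overrideAllValues:
    ollama:
      fullnameOverride: my-ollama
```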
## Prerequisites

- Crossplane installed in the cluster
- Crossplane provider: `provider-helm` (>= v1.0.6)
- Crossplane function: `function-auto-ready` (>= v0.6.0)
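The provider and function prerequisites can be declared in-cluster before installing this configuration; a minimal sketch, assuming the `crossplane-contrib` packages published on `xpkg.upbound.io` (adjust registry and versions to your environment):

```yaml
apiVersion: pkg.crossplane.io/v1
kind: Provider
metadata:
  name: provider-helm
spec:
  # Assumed package location; substitute your preferred registry/version.
  package: xpkg.upbound.io/crossplane-contrib/provider-helm:v1.0.6
---
apiVersion: pkg.crossplane.io/v1
kind: Function
metadata:
  name: function-auto-ready
spec:
  package: xpkg.upbound.io/crossplane-contrib/function-auto-ready:v0.6.0
```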
## Installation

```yaml
apiVersion: pkg.crossplane.io/v1
kind: Configuration
metadata:
  name: helm-ollama
spec:
  package: ghcr.io/hops-ops/helm-ollama:latest
```

## Usage

```yaml
apiVersion: helm.hops.ops.com.ai/v1alpha1
kind: Ollama
metadata:
  name: ollama
  namespace: example-env
spec:
  clusterName: example-cluster
  values:
    ollama:
      models:
        pull:
          - llama3.2
```

### GPU acceleration

Ollama supports GPU acceleration for faster inference:
```yaml
apiVersion: helm.hops.ops.com.ai/v1alpha1
kind: Ollama
metadata:
  name: ollama
  namespace: example-env
spec:
  clusterName: example-cluster
  values:
    ollama:
      gpu:
        enabled: true
        type: nvidia
        number: 1
      models:
        pull:
          - llama3.2
```

## Development

```shell
make render
make validate
make test
```