47 changes: 46 additions & 1 deletion backend/python/common/libbackend.sh
@@ -237,7 +237,14 @@ function getBuildProfile() {
# Make the venv relocatable:
# - rewrite venv/bin/python{,3} to relative symlinks into $(_portable_dir)
# - normalize entrypoint shebangs to /usr/bin/env python3
# - optionally update pyvenv.cfg to point to the portable Python directory (only at runtime)
# Usage: _makeVenvPortable [--update-pyvenv-cfg]
_makeVenvPortable() {
    local update_pyvenv_cfg=false
    if [ "${1:-}" = "--update-pyvenv-cfg" ]; then
        update_pyvenv_cfg=true
    fi

    local venv_dir="${EDIR}/venv"
    local vbin="${venv_dir}/bin"

@@ -255,7 +262,39 @@ _makeVenvPortable() {
ln -s "${rel_py}" "${vbin}/python3"
ln -s "python3" "${vbin}/python"

# 2) Rewrite shebangs of entry points to use env, so the venv is relocatable
# 2) Update pyvenv.cfg to point to the portable Python directory (only at runtime)
# Use absolute path resolved at runtime so it works when the venv is copied
if [ "$update_pyvenv_cfg" = "true" ]; then
local pyvenv_cfg="${venv_dir}/pyvenv.cfg"
if [ -f "${pyvenv_cfg}" ]; then
local portable_dir="$(_portable_dir)"
# Resolve to absolute path - this ensures it works when the backend is copied
# Only resolve if the directory exists (it should if ensurePortablePython was called)
if [ -d "${portable_dir}" ]; then
portable_dir="$(cd "${portable_dir}" && pwd)"
else
# Fallback to relative path if directory doesn't exist yet
portable_dir="../python"
fi
local sed_i=(sed -i)
# macOS/BSD sed needs a backup suffix; GNU sed doesn't. Make it portable:
if sed --version >/dev/null 2>&1; then
sed_i=(sed -i)
else
sed_i=(sed -i '')
fi
# Update the home field in pyvenv.cfg
# Handle both absolute paths (starting with /) and relative paths
if grep -q "^home = " "${pyvenv_cfg}"; then
"${sed_i[@]}" "s|^home = .*|home = ${portable_dir}|" "${pyvenv_cfg}"
else
# If home field doesn't exist, add it
echo "home = ${portable_dir}" >> "${pyvenv_cfg}"
fi
fi
fi

# 3) Rewrite shebangs of entry points to use env, so the venv is relocatable
# Only touch text files that start with #! and reference the current venv.
local ve_abs="${vbin}/python"
local sed_i=(sed -i)
@@ -316,6 +355,7 @@ function ensureVenv() {
fi
fi
if [ "x${PORTABLE_PYTHON}" == "xtrue" ]; then
# During install, only update symlinks and shebangs, not pyvenv.cfg
_makeVenvPortable
fi
fi
@@ -420,6 +460,11 @@ function installRequirements() {
# - ${BACKEND_NAME}.py
function startBackend() {
    ensureVenv
    # Update pyvenv.cfg before running, to ensure paths are correct for the current location
    # This is critical when the backend's location is dynamic (e.g., copied from a container)
    if [ "x${PORTABLE_PYTHON}" == "xtrue" ] || [ -x "$(_portable_python)" ]; then
        _makeVenvPortable --update-pyvenv-cfg
    fi
    if [ ! -z "${BACKEND_FILE:-}" ]; then
        exec "${EDIR}/venv/bin/python" "${BACKEND_FILE}" "$@"
    elif [ -e "${MY_DIR}/server.py" ]; then
135 changes: 133 additions & 2 deletions backend/python/diffusers/README.md
@@ -1,5 +1,136 @@
# Creating a separate environment for the diffusers project
# LocalAI Diffusers Backend

This backend provides gRPC access to Hugging Face diffusers pipelines with dynamic pipeline loading.

## Creating a separate environment for the diffusers project

```bash
make diffusers
```

## Dynamic Pipeline Loader

The diffusers backend includes a dynamic pipeline loader (`diffusers_dynamic_loader.py`) that automatically discovers and loads diffusers pipelines at runtime. This eliminates the need for per-pipeline conditional statements - new pipelines added to diffusers become available automatically without code changes.

### How It Works

1. **Pipeline Discovery**: On first use, the loader scans the `diffusers` package to find all classes that inherit from `DiffusionPipeline` (see the sketch after this list).

2. **Registry Caching**: Discovery results are cached for the lifetime of the process to avoid repeated scanning.

3. **Task Aliases**: The loader automatically derives task aliases from class names (e.g., "text-to-image", "image-to-image", "inpainting") without hardcoding.

4. **Multiple Resolution Methods**: Pipelines can be resolved by:
- Exact class name (e.g., `StableDiffusionPipeline`)
- Task alias (e.g., `text-to-image`, `img2img`)
- Model ID (uses HuggingFace Hub to infer pipeline type)
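
For illustration, the scan in step 1 can be sketched roughly as follows. This is a minimal sketch, not the loader's actual implementation, and the helper name `scan_pipelines` is made up here:

```python
import inspect

import diffusers
from diffusers import DiffusionPipeline


def scan_pipelines() -> dict:
    """Collect the public DiffusionPipeline subclasses exported by the diffusers package."""
    registry = {}
    for name, obj in inspect.getmembers(diffusers, inspect.isclass):
        # Skip the DiffusionPipeline base class itself; keep concrete subclasses.
        if issubclass(obj, DiffusionPipeline) and obj is not DiffusionPipeline:
            registry[name] = obj
    return registry
```

The loader caches the result of this scan for the lifetime of the process (step 2), so the cost is paid only once.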

### Usage Examples

```python
import torch

from diffusers_dynamic_loader import (
    load_diffusers_pipeline,
    get_available_pipelines,
    get_available_tasks,
    resolve_pipeline_class,
    discover_diffusers_classes,
    get_available_classes,
)

# List all available pipelines
pipelines = get_available_pipelines()
print(f"Available pipelines: {pipelines[:10]}...")

# List all task aliases
tasks = get_available_tasks()
print(f"Available tasks: {tasks}")

# Resolve a pipeline class by name
cls = resolve_pipeline_class(class_name="StableDiffusionPipeline")

# Resolve by task alias
cls = resolve_pipeline_class(task="stable-diffusion")

# Load and instantiate a pipeline
pipe = load_diffusers_pipeline(
    class_name="StableDiffusionPipeline",
    model_id="runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16
)

# Load from single file
pipe = load_diffusers_pipeline(
    class_name="StableDiffusionPipeline",
    model_id="/path/to/model.safetensors",
    from_single_file=True,
    torch_dtype=torch.float16
)

# Discover other diffusers classes (schedulers, models, etc.)
schedulers = discover_diffusers_classes("SchedulerMixin")
print(f"Available schedulers: {list(schedulers.keys())[:5]}...")

# Get list of available scheduler classes
scheduler_list = get_available_classes("SchedulerMixin")
```

### Generic Class Discovery

The dynamic loader can discover not just pipelines but any class type from diffusers:

```python
# Discover all scheduler classes
schedulers = discover_diffusers_classes("SchedulerMixin")

# Discover all model classes
models = discover_diffusers_classes("ModelMixin")

# Get a sorted list of available classes
scheduler_names = get_available_classes("SchedulerMixin")
```

### Special Pipeline Handling

Most pipelines are loaded dynamically through `load_diffusers_pipeline()`. Only pipelines requiring truly custom initialization logic are handled explicitly:

- `FluxTransformer2DModel`: Requires quantization and custom transformer loading (cannot use dynamic loader)
- `WanPipeline` / `WanImageToVideoPipeline`: Use the dynamic loader with a special VAE (float32 dtype)
- `SanaPipeline`: Uses the dynamic loader with post-load dtype conversion for the VAE/text encoder
- `StableVideoDiffusionPipeline`: Uses the dynamic loader with CPU offload handling
- `VideoDiffusionPipeline`: Alias for DiffusionPipeline with video flags

All other pipelines (StableDiffusionPipeline, StableDiffusionXLPipeline, FluxPipeline, etc.) are loaded purely through the dynamic loader.
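
As a rough sketch of this pattern (illustrative only: the model ID below is a placeholder, and the backend's exact post-load handling may differ), a Wan pipeline would go through the dynamic loader and then have its VAE moved to float32:

```python
import torch

from diffusers_dynamic_loader import load_diffusers_pipeline

# "some-org/some-wan-model" is a hypothetical model ID used for illustration.
pipe = load_diffusers_pipeline(
    class_name="WanPipeline",
    model_id="some-org/some-wan-model",
    torch_dtype=torch.float16,
)
# Keep the Wan VAE in float32, as noted in the list above.
pipe.vae.to(torch.float32)
```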

### Error Handling

When a pipeline cannot be resolved, the loader provides helpful error messages listing available pipelines and tasks:

```
ValueError: Unknown pipeline class 'NonExistentPipeline'.
Available pipelines: AnimateDiffPipeline, AnimateDiffVideoToVideoPipeline, ...
```
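
A caller can surface that message directly. This minimal example assumes only the documented `ValueError` behaviour:

```python
from diffusers_dynamic_loader import resolve_pipeline_class

try:
    resolve_pipeline_class(class_name="NonExistentPipeline")
except ValueError as err:
    # The error text already lists the available pipelines and task aliases.
    print(f"Could not resolve pipeline: {err}")
```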

## Environment Variables

| Variable | Default | Description |
|----------|---------|-------------|
| `COMPEL` | `0` | Enable Compel for prompt weighting |
| `XPU` | `0` | Enable Intel XPU support |
| `CLIPSKIP` | `1` | Enable CLIP skip support |
| `SAFETENSORS` | `1` | Use safetensors format |
| `CHUNK_SIZE` | `8` | Decode chunk size for video |
| `FPS` | `7` | Video frames per second |
| `DISABLE_CPU_OFFLOAD` | `0` | Disable CPU offload |
| `FRAMES` | `64` | Number of video frames |
| `BFL_REPO` | `ChuckMcSneed/FLUX.1-dev` | Flux base repo |
| `PYTHON_GRPC_MAX_WORKERS` | `1` | Max gRPC workers |
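
As a rough sketch of how these settings are typically read (not the backend's exact code; the `"0"`/non-`"0"` boolean convention shown here is an assumption), the defaults mirror the table above:

```python
import os

# Flag-style variables: "0" disables, anything else enables (assumed convention).
COMPEL = os.environ.get("COMPEL", "0") != "0"
XPU = os.environ.get("XPU", "0") != "0"
CLIPSKIP = os.environ.get("CLIPSKIP", "1") != "0"
SAFETENSORS = os.environ.get("SAFETENSORS", "1") != "0"
DISABLE_CPU_OFFLOAD = os.environ.get("DISABLE_CPU_OFFLOAD", "0") != "0"

# Numeric settings for video decoding and the gRPC server.
CHUNK_SIZE = int(os.environ.get("CHUNK_SIZE", "8"))
FPS = int(os.environ.get("FPS", "7"))
FRAMES = int(os.environ.get("FRAMES", "64"))
MAX_WORKERS = int(os.environ.get("PYTHON_GRPC_MAX_WORKERS", "1"))

# Default Flux base repository.
BFL_REPO = os.environ.get("BFL_REPO", "ChuckMcSneed/FLUX.1-dev")
```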

## Running Tests

```bash
./test.sh
```

The test suite includes:
- Unit tests for the dynamic loader (`test_dynamic_loader.py`)
- Integration tests for the gRPC backend (`test.py`)