A DDEV add-on that runs a LiteLLM proxy as a DDEV service, enabling local AI model testing with Drupal's `ai_provider_litellm` module.
The add-on starts two containers via a single `docker-compose.litellm.yaml`:
| Container | Purpose |
|---|---|
| `ddev-<project>-litellm` | LiteLLM proxy (port 4000) |
| `ddev-<project>-litellm-postgres` | PostgreSQL database required by LiteLLM for key management |
LiteLLM uses Prisma as its ORM and requires PostgreSQL: the key management endpoints (`/key/info`, `/key/generate`) will not work without it.
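Internally, the proxy reaches PostgreSQL through a Prisma connection string. A sketch of the relevant compose wiring, with hypothetical service names and credentials (the real values live in `.ddev/docker-compose.litellm.yaml`):

```yaml
# Illustrative only: service name, credentials, and database name are
# assumptions; check .ddev/docker-compose.litellm.yaml for the real values.
services:
  litellm:
    environment:
      # Prisma reads this URL to run migrations and store virtual keys
      DATABASE_URL: "postgresql://litellm:litellm@litellm-postgres:5432/litellm"
    depends_on:
      litellm-postgres:
        condition: service_healthy
```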
The proxy is reachable from the web container at `http://ddev-<project>-litellm:4000`. On `ddev start` the add-on waits for LiteLLM readiness, creates a Drupal virtual key in the database, and auto-sets `ai_provider_litellm.settings.host` via drush.

Requires Ollama on the host with the default model pulled (`ollama pull llama3.2`). To install:

```
ddev add-on get credevator/ddev-litellm
ddev restart
```
| Context | URL |
|---|---|
| Drupal web container | http://ddev-<project>-litellm:4000 |
| Host browser | https://<project>.ddev.site:4001 |
```
ddev litellm          # Show service status
ddev litellm open     # Open LiteLLM UI in browser
ddev litellm logs     # View service logs
ddev litellm-models   # List available models
```
Edit `.ddev/litellm_config.yaml` to add or modify model backends, then run `ddev restart`.
Ollama (default, connects to host machine port 11434):
```yaml
- model_name: ollama/llama3.2
  litellm_params:
    model: ollama/llama3.2
    api_base: http://host.docker.internal:11434
```
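Other LiteLLM-supported backends can be added the same way. For example, a hosted OpenAI model, assuming an `OPENAI_API_KEY` environment variable is available to the service (this entry is not part of the shipped config):

```yaml
# Hypothetical extra entry for .ddev/litellm_config.yaml.
# The os.environ/NAME syntax tells LiteLLM to read the value
# from the environment instead of hard-coding it in the file.
- model_name: gpt-4o-mini
  litellm_params:
    model: openai/gpt-4o-mini
    api_key: os.environ/OPENAI_API_KEY
```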
vLLM / HuggingFace (uncomment in `litellm_config.yaml` and set `VLLM_API_BASE`):
```yaml
# In .ddev/config.yaml:
web_environment:
  - VLLM_API_BASE=http://host.docker.internal:8000
```
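The matching model entry in `litellm_config.yaml` might look like the following. The model name and path are illustrative, and the shipped file may already contain a commented template to uncomment instead:

```yaml
# Hypothetical entry: adjust the model path to whatever your vLLM server
# actually serves; VLLM_API_BASE comes from web_environment above.
- model_name: vllm-local
  litellm_params:
    model: hosted_vllm/meta-llama/Llama-3.2-3B-Instruct
    api_base: os.environ/VLLM_API_BASE
```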
On each `ddev start` the add-on automatically:

- creates the Drupal virtual key `sk-drupal-dev-key` in the LiteLLM database (safe to run repeatedly; a no-op if the key already exists)
- sets `ai_provider_litellm.settings.host` to `http://ddev-<project>-litellm:4000` via drush

You still need to wire up the API key in Drupal once:
1. Create a key named `litellm_key` with value `sk-drupal-dev-key` (`/admin/config/system/keys`)
2. In the LiteLLM provider settings (`/admin/config/ai/providers/litellm`), select `litellm_key` under API Key

| Key | Value | Purpose |
|---|---|---|
| Master key | `sk-ddev-litellm` | LiteLLM admin; used to generate virtual keys and call management endpoints |
| Drupal virtual key | `sk-drupal-dev-key` | Used by Drupal for all AI requests; stored in the PostgreSQL DB so `/key/info` resolves it |
To override the master key, add to `.ddev/config.yaml`:

```yaml
web_environment:
  - LITELLM_MASTER_KEY=sk-your-custom-key
```
Note: if you change the master key after first start, update the `/key/generate` call in `.ddev/config.ddev-litellm.yaml` to match.
LiteLLM takes a while to start; the image is ~2GB. PostgreSQL must be healthy first, then LiteLLM runs Prisma migrations before serving requests. The healthcheck allows up to about seven minutes (120s start_period + 30s interval × 10 retries = 420s). Check progress with `ddev litellm logs`.
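Those numbers come from the service healthcheck in the compose file. A sketch of what that stanza looks like; the probe command and endpoint are assumptions, while the timing values are the ones stated above:

```yaml
# Sketch: the probe command is an assumption; intervals match the text above.
healthcheck:
  test: ["CMD-SHELL", "curl -fsS http://localhost:4000/health/liveliness || exit 1"]
  interval: 30s
  retries: 10
  start_period: 120s
```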
Cannot reach Ollama — ensure Ollama is running on the host (`ollama serve`) and the model is pulled (`ollama pull llama3.2`).
Port 4000 or 4001 conflict — edit `HTTP_EXPOSE` / `HTTPS_EXPOSE` in `.ddev/docker-compose.litellm.yaml`.
Linux users — `host.docker.internal` is injected via `extra_hosts: host.docker.internal:host-gateway`. This is handled automatically; macOS and Windows Docker Desktop provide it natively.
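In compose terms, that injection looks roughly like this (a sketch; the actual stanza is in `.ddev/docker-compose.litellm.yaml`):

```yaml
# Sketch of the extra_hosts mapping described above: it lets the LiteLLM
# container resolve host.docker.internal to the host gateway on Linux.
services:
  litellm:
    extra_hosts:
      - "host.docker.internal:host-gateway"
```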
PostgreSQL data — key data persists in a Docker named volume (`litellm-postgres-data`). If you need a clean slate: `docker volume rm <project>_litellm-postgres-data`, then `ddev restart`.
```
ddev add-on remove ddev-litellm
ddev restart
```
Note: `.ddev/litellm_config.yaml` is preserved on removal; delete it manually if no longer needed. The `litellm-postgres-data` Docker volume is also preserved and can be removed with `docker volume rm <project>_litellm-postgres-data`.
License: Apache 2.0