This add-on integrates Ollama into your DDEV project.
Ollama allows developers to run LLMs locally.
```bash
ddev add-on get tyler36/ddev-ollama
ddev restart
```
After installation, make sure to commit the `.ddev` directory to version control.
`ddev ollama` is a helper command that allows developers to interact with the Ollama service. The table below lists some commonly-used commands. Type `ddev ollama --help` to see all available commands.
| Command | Description |
| --- | --- |
| `ddev ollama run <model>` | Run a model. Models will automatically download if not available locally. |
| `ddev ollama stop` | Stop a running model |
| `ddev ollama list` | List models |
| `ddev ollama ps` | List running models |
| `ddev ollama --help` | Show available commands |
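For example, a quick interactive session might look like the sketch below. The `llama3.2` model name is only an illustration (any model from the Ollama library works), and passing a model name to `stop` follows upstream Ollama CLI usage:

```bash
# Download the model on first use, then start an interactive chat.
ddev ollama run llama3.2

# When finished, unload the model to free memory.
ddev ollama stop llama3.2
```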
The following commands can help when debugging the service:

| Command | Description |
| --- | --- |
| `ddev describe` | View service status and used ports for Ollama |
| `ddev logs -s ollama` | Check Ollama logs |
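For instance, to watch the service while a model downloads, you can follow the logs (a minimal sketch using DDEV's standard `-f`/`--follow` flag):

```bash
# Stream Ollama's logs in real time; press Ctrl+C to stop following.
ddev logs -s ollama -f
```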
To change the Docker image:
```bash
ddev dotenv set .ddev/.env.ollama --ollama-docker-image="ollama/ollama:latest"
ddev add-on get tyler36/ddev-ollama
ddev restart
```
Make sure to commit the `.ddev/.env.ollama` file to version control.
All customization options (use with caution):
| Variable | Flag | Default |
| --- | --- | --- |
| `OLLAMA_DOCKER_IMAGE` | `--ollama-docker-image` | `ollama/ollama:latest` |
> [!TIP]
> If you do not require GPU support, it is recommended to use `alpine/ollama`.
> It is significantly smaller; however, it may be a week or two out of sync with the latest `ollama/ollama` image.
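A sketch of switching to the smaller image, reusing the commands from above (the `alpine/ollama:latest` tag is an assumption; check the image's available tags first):

```bash
# Point the add-on at the lighter Alpine-based image ...
ddev dotenv set .ddev/.env.ollama --ollama-docker-image="alpine/ollama:latest"
# ... then re-apply the add-on and restart so the change takes effect.
ddev add-on get tyler36/ddev-ollama
ddev restart
```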
When running Ollama locally, it is typically accessed at `http://localhost:11434`. However, when using DDEV and Docker, access the Ollama server via the container name (`ollama:11434`). For example:
```bash
$ ddev exec curl ollama:11434
Ollama is running
```
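The same hostname works for the rest of Ollama's HTTP API. For example, listing the locally installed models via the standard `/api/tags` endpoint:

```bash
# Ask the Ollama REST API which models are installed locally.
ddev exec curl -s http://ollama:11434/api/tags
```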
Many Ollama-compatible packages recommend setting this address directly in an `.env` file:

```env
OLLAMA_BASE_URL=ollama:11434
```
Contributed and maintained by @tyler36