
Commit b1d3b03

Fixes #536 (CS) - Added Ollama scenario for Docker compose
1 parent 48d7bd5 commit b1d3b03

File tree

1 file changed (+18, -0 lines)


install/docker-compose/docker-compose-scenarios.rst

Lines changed: 18 additions & 0 deletions
@@ -30,6 +30,7 @@ The following scenarios are supported and explained further below:
 - :ref:`Additional scenarios <additional-scenarios>`

   - Disable the backup service
+  - Add an Ollama instance to the stack

 You can find the files in the
 `Zammad-Docker-Compose repository <https://github.com/zammad/zammad-docker-compose>`_.
@@ -202,6 +203,23 @@ built in backup service in the stack to save resources.
 You can do so by just using the scenario file
 ``scenarios/disable-backup-service.yml`` for deployment.

+Add Ollama
+^^^^^^^^^^
+
+You can spin up an additional `Ollama <https://ollama.com/>`_ container to use
+:admin-docs:`Zammad's AI features </ai/features.html>` on your machine.
+
+.. hint:: This is intended for development or testing purposes, as running a
+   production LLM stack is complex.
+
+To deploy an Ollama container inside the Zammad stack, use the scenario file
+``scenarios/add-ollama.yml``. This creates an Ollama container which
+automatically pulls and serves ``Llama3.2``, so the AI features are ready to
+use and test out of the box.
+
+To use it in Zammad, add the service name and port (``http://ollama:11434``) to
+the :admin-docs:`provider configuration </ai/provider.html>`.
+
 Other Use Cases
 ^^^^^^^^^^^^^^^

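For reference, applying a scenario file uses Docker Compose's standard multi-file override mechanism. A minimal sketch, assuming you run it from the root of a zammad-docker-compose checkout (verify the exact invocation against the repository's README):

    # Sketch: overlay the Ollama scenario on top of the base stack.
    # Assumes the repository's standard file layout; adjust paths to your checkout.
    docker compose -f docker-compose.yml -f scenarios/add-ollama.yml up -d

Compose merges the files from left to right, so the scenario file only needs to declare the additional ``ollama`` service on top of the base definitions.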
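Before pointing the provider configuration at ``http://ollama:11434``, you can check that the automatic model pull has finished. A hedged sketch using the ``ollama`` CLI that ships in the official Ollama image (the service name ``ollama`` is taken from the diff above):

    # Sketch: list the models the Ollama container has pulled so far.
    # Assumes the scenario names the service "ollama"; llama3.2 should appear
    # once the automatic pull has completed.
    docker compose exec ollama ollama list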