1 file changed: 18 additions, 0 deletions
@@ -30,6 +30,7 @@ The following scenarios are supported and explained further below:
 - :ref:`Additional scenarios <additional-scenarios>`

   - Disable the backup service
+  - Add an Ollama instance to the stack

 You can find the files in the
 `Zammad-Docker-Compose repository <https://github.com/zammad/zammad-docker-compose>`_.
@@ -202,6 +203,23 @@ built in backup service in the stack to save resources.
 You can do so by just using the scenario file
 ``scenarios/disable-backup-service.yml`` for deployment.

+Add Ollama
+^^^^^^^^^^
+
+You can spin up an additional `Ollama <https://ollama.com/>`_ container to use
+:admin-docs:`Zammad's AI features </ai/features.html>` on your machine.
+
+.. hint:: This is intended for development or testing purposes, as running a
+   production-grade LLM stack is complex.
+
+To deploy an Ollama container inside the Zammad stack, use the scenario file
+``scenarios/add-ollama.yml``. This creates an Ollama container which
+automatically pulls and serves ``Llama3.2``, so the AI features are ready to
+use and test out of the box.
+
+To use it in Zammad, add the service name and port (``http://ollama:11434``) to
+the :admin-docs:`provider configuration </ai/provider.html>`.
+
 Other Use Cases
 ^^^^^^^^^^^^^^^

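For readers unfamiliar with how these scenario files are applied: they are ordinary Docker Compose override files, so enabling one of them is typically a matter of passing an extra ``-f`` flag at deployment time. The following is a sketch rather than the repository's documented command; the base file name and invocation are assumptions based on the paths mentioned above:

   # Sketch: merge the base Compose file with a scenario override.
   # Later -f files extend or override service definitions from earlier ones;
   # swap in scenarios/add-ollama.yml to enable the Ollama scenario instead.
   docker compose -f docker-compose.yml -f scenarios/disable-backup-service.yml up -d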
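Similarly, here is a minimal sketch of what an Ollama scenario override could look like. It is illustrative only and not the actual contents of ``scenarios/add-ollama.yml``: the service name ``ollama``, the port 11434, and the ``Llama3.2`` model come from the documentation change above, while the image tag, volume, and model-pull entrypoint are assumptions:

   # Illustrative sketch only -- not the real scenarios/add-ollama.yml.
   services:
     ollama:
       # Official Ollama image; the tag is an assumption.
       image: ollama/ollama:latest
       restart: always
       volumes:
         - ollama-data:/root/.ollama
       # Start the server, then pull llama3.2 so the model is served out of
       # the box. Ollama listens on port 11434 by default, so other services
       # in the stack can reach it as http://ollama:11434.
       entrypoint: ["/bin/sh", "-c", "ollama serve & sleep 5 && ollama pull llama3.2 && wait"]

   volumes:
     ollama-data: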