This guide explains how to install and run Ollama, a local runtime for large language models (LLMs), on an Android device using Termux. By following this tutorial, you can run LLMs directly on your mobile device without needing a desktop environment.
Before starting, ensure you have the following:
- An Android device with sufficient storage and RAM.
- A stable internet connection.
- Basic familiarity with Termux commands.
F-Droid is an open-source app store for Android. The Termux build on Google Play is no longer updated, so install F-Droid first. Download it from:
🔗 https://f-droid.org/
- Open F-Droid.
- Search for Termux and install it.
Open Termux and run the following command:
pkg update && pkg upgrade
Proot-Distro allows you to run Linux distributions inside Termux. Install it with:
pkg install proot-distro
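Before installing a distribution, you can check which ones proot-distro supports:

```shell
# Lists the installable distribution aliases; "debian" should be among them
proot-distro list
```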
Now, install Debian (pd is a short alias for the proot-distro command):
pd install debian
Start your Debian environment:
pd login debian
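Your shell prompt changes once you are inside the guest. To confirm you are in Debian rather than the Termux environment, you can inspect the OS release file:

```shell
# Inside the proot, this should report Debian rather than Android/Termux
head -n 2 /etc/os-release
```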
Tmux lets you run multiple terminal sessions in a single window. Inside Debian, update the package lists and install it:
apt update && apt upgrade
apt install tmux
Download and install Ollama inside Debian (if curl is not already present, install it first with apt install curl):
curl -fsSL https://ollama.com/install.sh | sh
Start a new session for running Ollama:
tmux new -s llm
Inside the Tmux session, run:
ollama serve
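To confirm the server is up, you can probe it from another pane. This assumes the default address of 127.0.0.1:11434, i.e. that you have not set OLLAMA_HOST to something else:

```shell
# Prints a short status line when the server is running,
# or a fallback message otherwise
curl -s http://127.0.0.1:11434/ || echo "server not reachable"
```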
Split the Tmux window by pressing Ctrl+b and then " (a double quote).
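A few tmux controls are handy in this workflow (Ctrl+b is tmux's default prefix key):

```shell
# Inside tmux, press Ctrl+b and then:
#   o   move to the other pane
#   d   detach, leaving `ollama serve` running in the background
tmux ls              # from a normal shell: list running sessions
tmux attach -t llm   # reattach to the session named "llm"
```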
To run the Gemma 2B model:
ollama run gemma:2b
To run the Phi-3 model:
ollama run phi3
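Besides the interactive prompt, models can be driven non-interactively or through Ollama's HTTP API. The port below is the documented default, and the prompt strings are just examples:

```shell
# One-shot prompt: prints the model's answer and exits
ollama run phi3 "Summarize what tmux does in one sentence."

# The same request through the HTTP API; "stream": false
# returns a single JSON object instead of a token stream
curl -s http://127.0.0.1:11434/api/generate \
  -d '{"model": "phi3", "prompt": "Hello", "stream": false}'
```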
Interact with the model by entering prompts directly in the terminal.
With this setup, you can now run Ollama on your Android device, unlocking the potential of on-device AI for development and experimentation.
This project is licensed under the MIT License.
Feel free to contribute or report issues in the repository. Your feedback and contributions are highly appreciated.
Enjoy exploring the capabilities of on-device AI with Ollama on your Android device. 🚀