Closed — roshanravan closed this issue 5 months ago
Firstly, hi 👋🏻 Can I consider this a feature request, or is this supposed to be a tutorial for other people? Adding Termux support is currently not planned and would likely require significant work. Maybe in the far future this could be a thing.
I guess both. I did this and it works very well; here is a screenshot of some of the LLMs I have: It's slow, but it answers. I'd suggest small LLMs like Phi on a phone, though, since they run better on mobile CPUs.
If you need a tutorial for Termux and for running the Ollama server on your Android device: you need Termux and also Termux:Widget to launch everything easily from the home screen. Open Termux and enter the following commands, pressing y and Enter wherever prompted:
pkg update && pkg upgrade -y
pkg install wget curl vim -y
Next, you need to install Ubuntu in Termux:
pkg install wget curl proot tar -y && wget https://raw.githubusercontent.com/tuanpham-dev/termux-ubuntu/master/ubuntu.sh && chmod +x ubuntu.sh && bash ubuntu.sh
Then, run Ubuntu with the command:
./start-ubuntu20.sh
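Once inside, a quick sanity check (not part of the original steps, just the standard os-release file) confirms you landed in the Ubuntu proot rather than plain Termux:

```shell
# Inside the proot this should print Ubuntu's NAME/PRETTY_NAME lines;
# plain Termux has no standard Linux /etc/os-release.
head -n 2 /etc/os-release 2>/dev/null || echo "no /etc/os-release found"
```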
Now that you are in Ubuntu, update the system and install curl and the other necessary tools by entering these commands one by one (if sudo is missing from the proot image, drop the sudo prefix; you are already root):
sudo apt update
sudo apt upgrade -y
sudo apt install curl -y
sudo apt install vim -y
To install Ollama in Ubuntu, use this command:
curl -fsSL https://ollama.com/install.sh | sh
To start the Ollama server inside Ubuntu under Termux, run it in the background with this command:
ollama serve > /dev/null 2>&1 &
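To confirm the server actually came up, Ollama's root endpoint answers with a plain status line on its default port 11434 (the fallback message below is my addition, printed only when the server is not reachable):

```shell
# The Ollama server replies "Ollama is running" on its root URL;
# if this prints the fallback instead, the server isn't up yet.
curl -s http://localhost:11434 || echo "Ollama is not reachable yet"
```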
Each time you enter Ubuntu, you need to run this command to start Ollama. To make this easier, we'll add it to Ubuntu's startup. Use the vim editor to modify the bashrc file:
vim ~/.bashrc
Press i to edit, paste the following line:
ollama serve > /dev/null 2>&1 &
Then press Esc, type :wq, and press enter to save.
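If you'd rather skip vim, the same autostart line can be appended from the shell; the grep guard is my addition (not in the original steps) so re-running it never duplicates the line, and RC is a hypothetical variable that just lets you point at a different file:

```shell
# Add the Ollama autostart line to the shell rc file, but only once.
RC="${RC:-$HOME/.bashrc}"                   # target file, ~/.bashrc by default
LINE='ollama serve > /dev/null 2>&1 &'
touch "$RC"                                 # make sure the file exists
grep -qxF "$LINE" "$RC" || echo "$LINE" >> "$RC"
```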
With Ollama running, go to the Ollama Library and choose your desired model. Copy its run command; for example, for llava-phi3 (a small vision-capable model built on the 3.8B Phi-3 Mini), run:
ollama run llava-phi3
To create a shortcut so you can launch it easily from a widget, do the following:
First, go back to Termux (not Ubuntu) and change into the shortcuts folder, creating it first if it does not exist yet:
mkdir -p ~/.shortcuts && cd ~/.shortcuts
Then create the shortcut:
vim start-ubuntu
Press i to edit, paste the following line (using the full path so it works regardless of the working directory):
~/start-ubuntu20.sh
Press Esc, type :wq, and press Enter to save. Then make the script executable so Termux:Widget can run it:
chmod +x start-ubuntu
Now, add the Termux:Widget widget to your Android launcher screen. You'll see start-ubuntu listed in it; tap it to launch Ubuntu (and with it the Ollama server), keep the session minimized while using the Ollama app, and http://localhost:11434 works perfectly as the host.
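For anyone who wants to verify that address without the app, here is a sketch of a request against the standard Ollama HTTP API from inside Ubuntu (the model name is just the llava-phi3 pulled above; the fallback message is my addition for when the server is down):

```shell
# Ask the local Ollama server for a non-streaming completion.
# Prints the JSON response, or a fallback line if nothing is listening.
curl -s http://localhost:11434/api/generate \
  -d '{"model": "llava-phi3", "prompt": "Say hi", "stream": false}' \
  || echo "(server not reachable)"
```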
@roshanravan
Thank you very much for your tutorial. I'm new to Linux installations on Android. Every time I run Ubuntu with the command below, the environment needs to be reconfigured. Is there any solution to this?
./start-ubuntu20.sh
@JHubi1 This may be good to pin for other users to find. Also, could you please consider adding this to the app natively? The llama3.2:1b model requires very few resources to run, and I'm sure more small models will be made.
Here is the tutorial: https://xdaforums.com/t/run-ai-mistral-7b-llm-on-xiaomi-13-ultra-ai-local-install-working-offline.4658021/