ollama / ollama-python

Ollama Python library
https://ollama.com
MIT License

GPU Selection & Device Specification #161

Open nycameraguy opened 4 months ago

nycameraguy commented 4 months ago

I would like to suggest a feature that would allow specifying which GPU or GPUs to run on directly within the Ollama Python library.

This feature is crucial in shared server environments with multiple GPUs and multiple users, as it would let each Jupyter notebook run on its assigned GPU without conflicts. Currently, specifying GPU usage in Ollama is somewhat complex. A streamlined way to assign work to specific GPUs directly inside the Python program would prevent conflicts and simplify the workflow. Implementing this feature would significantly improve usability and align Ollama with other machine-learning frameworks.
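As far as I can tell, the usual workaround today is to run one Ollama server per GPU, pinning each instance with CUDA_VISIBLE_DEVICES and a distinct OLLAMA_HOST port, and then pointing the Python client at the matching instance. Below is a minimal sketch of that setup; the GPU index (1), port (11435), and model name are illustrative choices, and the only library call involved is `ollama.Client(host=...)`.

```python
# Workaround sketch (not a library feature): one Ollama server per GPU.
# Each server is started beforehand from a shell, pinned to a device, e.g.:
#   CUDA_VISIBLE_DEVICES=1 OLLAMA_HOST=127.0.0.1:11435 ollama serve
# The GPU index and port above are illustrative, not defaults.
import ollama

# Point the Python client at the server instance that was bound to GPU 1.
client = ollama.Client(host="http://127.0.0.1:11435")

response = client.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Hello from the GPU-1 instance"}],
)
print(response["message"]["content"])
```

Each notebook then only differs in the host it connects to, but this still requires coordinating ports and environment variables outside of Python, which is exactly the complexity a GPU-selection option in the library would remove.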

Thank you for considering this suggestion. I would be happy to discuss further details if needed.

gileneusz commented 4 months ago

Would be great if this could be implemented ASAP.