eckartal opened 1 month ago
Yes please! This is my biggest blocker with Jan. I don't want multiple redundant model file locations; I'd like my Ollama models to be easily used.
I second this. I looked at the docs about "Ollama integration", but all that does is set up the server endpoint. You can't select an Ollama model that is already downloaded in Ollama's model store, and I don't think you can upload the model either. On my openSUSE Tumbleweed system, Ollama stores its models in /var/lib/ollama/.ollama/models/ rather than the default Ollama location, and the Import file selection dialog can't even see the directories below /var/lib/ollama.
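For context on the comment above: Ollama's model store varies by install type, and the `OLLAMA_MODELS` environment variable can override it. A minimal sketch for locating the store on Linux (the candidate paths are typical defaults and assumptions, not an exhaustive or authoritative list):

```python
import os
from pathlib import Path

def find_ollama_model_store():
    """Return the first existing Ollama model directory, or None.

    Checks the OLLAMA_MODELS override first, then common Linux
    locations; exact paths depend on how Ollama was installed.
    """
    candidates = [
        os.environ.get("OLLAMA_MODELS"),      # explicit override, if set
        Path.home() / ".ollama" / "models",   # per-user install
        "/var/lib/ollama/.ollama/models",     # e.g. openSUSE's service user
        "/usr/share/ollama/.ollama/models",   # official Linux installer's service user
    ]
    for c in candidates:
        if c and Path(c).is_dir():
            return Path(c)
    return None

print(find_ollama_model_store())
```

Any "use Ollama models directly" feature in Jan would presumably need a similar lookup rather than assuming a single fixed path.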
Problem

Integrating Ollama with Jan through the single OpenAI endpoint feels challenging. It's also a hassle to 'download' the model.
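For reference, the endpoint setup the docs describe points Jan at Ollama's OpenAI-compatible API on its default port. A minimal sketch of the request shape involved, assuming a default local install and a placeholder model name (`llama3` stands in for whatever is already pulled):

```python
import json

# Ollama's OpenAI-compatible base URL on a default local install;
# adjust host/port if your server runs elsewhere.
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str) -> tuple[str, bytes]:
    """Build the URL and JSON body for an OpenAI-style chat completion
    against a local Ollama server (the model must already be pulled)."""
    url = f"{OLLAMA_BASE_URL}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return url, body

url, body = build_chat_request("llama3", "Hello")
print(url)  # http://localhost:11434/v1/chat/completions
```

The friction in this issue is that even with this endpoint wired up, Jan's model list doesn't reflect what Ollama has already downloaded.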
Success Criteria
Additional context

Related Reddit comment to be updated: https://www.reddit.com/r/LocalLLaMA/comments/1d8n9wr/comment/l77ifd1/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button