containers / ramalama

The goal of RamaLama is to make working with AI boring.
MIT License

Incompatible Ollama paths #352

Closed vpavlin closed 1 month ago

vpavlin commented 1 month ago

Is there any particular reason why the path to ollama models contains https://?

/home/vpavlin/.local/share/ramalama/repos/ollama/manifests/https\:/registry.ollama.ai/library/

a) it feels weird to have `https://` in the path
b) it will break compatibility with Ollama and any tools relying on the paths
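For illustration, one way to avoid the scheme ending up on disk is to parse the registry URL and keep only the hostname when building the manifest path, which would line up with Ollama's `manifests/<registry-host>/<namespace>/` layout. This is a minimal sketch, not ramalama's actual code; the `manifest_dir` helper and its parameters are hypothetical.

```python
from urllib.parse import urlparse

def manifest_dir(base: str, registry_url: str, namespace: str = "library") -> str:
    # Hypothetical helper: keep only the bare hostname so the on-disk
    # layout matches Ollama's (no "https:" component in the path).
    host = urlparse(registry_url).netloc or registry_url
    return f"{base}/manifests/{host}/{namespace}"

# Works whether or not the registry is given with a scheme:
print(manifest_dir("/home/vpavlin/.local/share/ramalama/repos/ollama",
                   "https://registry.ollama.ai"))
# -> /home/vpavlin/.local/share/ramalama/repos/ollama/manifests/registry.ollama.ai/library
```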

rhatdan commented 1 month ago

@ericcurtin PTAL ^^

ericcurtin commented 1 month ago

We don't need to store the manifests at all; it makes more sense to just pull them fresh every time. @swarajpande5 can you take a look?

vpavlin commented 1 month ago

From my perspective it would make sense to store them, just make sure the paths match, so that if I set

OLLAMA_MODELS=/home/vpavlin/.local/share/ramalama/repos/ollama/

Ollama will actually work (there might still be issues with access to the files, since Ollama runs as the ollama user etc., but that is a separate story :) )

ericcurtin commented 1 month ago

Yeah, maybe... Interop with Ollama tooling hasn't really been a goal up to now, but if people want that, I guess why not.

vpavlin commented 1 month ago

To be specific, I think I mentioned I built this tool - https://github.com/vpavlin/ollama-codex - and I'd like to make sure it works for both Ollama and ramalama.

I believe the only issue right now is the paths being slightly different.

From the perspective of an Ollama user: if you allow me to pull an Ollama model, then store it as Ollama would (at least in terms of path structure :) ).

rhatdan commented 1 month ago

I agree. Since we probably want to work with tools built for Ollama, this seems like a good idea.

swarajpande5 commented 1 month ago

@ericcurtin As per our prior discussions, we should fix this for better compatibility with Ollama. I'll be happy to take this up.

rhatdan commented 1 month ago

You got it.