Closed by vpavlin 1 month ago
@ericcurtin PTAL ^^
We don't need to store the manifests at all; instead it makes sense to just pull them fresh every time. @swarajpande5 can you take a look?
From my perspective it would make sense to store them, just make sure the paths match, so that if I set
OLLAMA_MODELS=/home/vpavlin/.local/share/ramalama/repos/ollama/
Ollama will actually work (there might still be issues with access to the files, since Ollama runs as the ollama user etc., but that is a separate story :))
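For reference, here is a rough sketch of the layout I believe Ollama expects under OLLAMA_MODELS (based on my understanding of Ollama's storage format; the model name and digest below are just illustrative placeholders):

```python
import os

# Assumed layout, based on how Ollama stores pulled models (not verified against
# every Ollama version): manifests are keyed by registry/namespace/model/tag,
# blobs by their sha256 digest with the ':' replaced by '-'.
OLLAMA_MODELS = os.path.expanduser("~/.local/share/ramalama/repos/ollama")

def manifest_path(registry="registry.ollama.ai", namespace="library",
                  model="tinyllama", tag="latest"):
    # e.g. .../manifests/registry.ollama.ai/library/tinyllama/latest
    return os.path.join(OLLAMA_MODELS, "manifests", registry, namespace, model, tag)

def blob_path(digest="sha256:abc123"):
    # e.g. .../blobs/sha256-abc123
    return os.path.join(OLLAMA_MODELS, "blobs", digest.replace(":", "-"))

print(manifest_path())
print(blob_path())
```

If ramalama writes its manifests and blobs into that same structure, pointing OLLAMA_MODELS at the ramalama repo directory should be enough for path compatibility.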
Yeah maybe... Interop with Ollama tooling hasn't really been a goal up to now, but if people want that, I guess why not.
To be specific, I think I mentioned I built this tool - https://github.com/vpavlin/ollama-codex - and I'd like to make sure it works for both Ollama and ramalama.
I believe the only issue right now is the paths being slightly different.
From the perspective of an Ollama user: if you allow me to pull an Ollama model, then store it as Ollama would (at least in terms of the path structure :)).
I agree; since we probably want to work with tools built to work with Ollama, this seems like a good idea.
@ericcurtin As per our prior discussions, we should fix this for better compatibility with Ollama. I'll be happy to take this up.
You got it.
Is there any particular reason why the path to Ollama models contains https:// ? a) it feels weird to have https:// in the path, and b) it will break compatibility with Ollama and any tools relying on the paths.
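If the problem is that the registry URL is used verbatim as a path component, a minimal sketch of what I have in mind would be to strip the scheme before joining (a hypothetical helper for illustration, not ramalama's actual code):

```python
import os
from urllib.parse import urlparse

def registry_dir(base, registry_url):
    # Hypothetical helper: drop the scheme so "https://registry.ollama.ai" becomes
    # "registry.ollama.ai", matching the directory names Ollama itself uses.
    host = urlparse(registry_url).netloc or registry_url
    return os.path.join(base, "manifests", host)

print(registry_dir("/home/vpavlin/.local/share/ramalama/repos/ollama",
                   "https://registry.ollama.ai"))
# -> /home/vpavlin/.local/share/ramalama/repos/ollama/manifests/registry.ollama.ai
```

That would keep the on-disk paths free of https:// and line up with what Ollama-oriented tools expect.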