Closed mandroll closed 5 months ago
Hey, since the ollama server is a completely external program, I'd recommend using Docker for that.
You can create a Compose file that deploys the ollama server using the official image (https://hub.docker.com/r/ollama/ollama), and build your own image for the Rust app that interacts with ollama.
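A minimal sketch of what that Compose file could look like — service names, the app image tag, and the port mapping are assumptions, and the Rust app would reach the server at `http://ollama:11434` inside the Compose network:

```yaml
# docker-compose.yml (sketch — adjust names/paths to your project)
services:
  ollama:
    image: ollama/ollama          # official image from Docker Hub
    ports:
      - "11434:11434"             # Ollama's default API port
    volumes:
      - ollama-models:/root/.ollama  # persist pulled models between restarts

  app:
    build: .                      # assumes a Dockerfile for your Rust app in this directory
    depends_on:
      - ollama
    environment:
      # hypothetical env var your app could read to find the server
      - OLLAMA_HOST=http://ollama:11434

volumes:
  ollama-models:
```

With this setup, `docker compose up` starts both containers, and the Rust app can query the Ollama HTTP API over the internal network instead of managing the server process itself.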
Thanks!
Thanks for making this extension! Do you know if it's possible to set up the ollama server from within the Rust program, such that one could deploy a standalone application that runs llama2 and makes queries to it?
Feel free to close this issue once answered. Thanks again!